I have existing code that queries AWS for resources one after another, filtering based on the resource name. The current implementation is linear: it moves through one function per resource, each creating its own client and querying AWS with it. This of course consumes a lot of time. Is it possible to run each of these functions asynchronously? The code flow looks like the snippet below, with many more resource queries. Any inputs/suggestions would be helpful.
import boto3

def query_acm():
    client = boto3.client('acm', region_name=region)
    paginator = client.get_paginator('list_certificates')
    # filter and write to file

def query_asg():
    client = boto3.client('autoscaling', region_name=region)
    paginator = client.get_paginator('describe_auto_scaling_groups')
    # paginate, filter and write to file

def main():
    query_acm()
    query_asg()
It is possible to run them in parallel; for that I would suggest Python's built-in, easy-to-use concurrent.futures library.
from concurrent.futures import ThreadPoolExecutor, as_completed
import boto3

def query_acm():
    client = boto3.client('acm', region_name=region)
    paginator = client.get_paginator('list_certificates')
    # filter and write to file

def query_asg():
    client = boto3.client('autoscaling', region_name=region)
    paginator = client.get_paginator('describe_auto_scaling_groups')
    # paginate, filter and write to file

def main():
    with ThreadPoolExecutor() as executor:
        futures = [executor.submit(query_acm), executor.submit(query_asg)]
        for f in as_completed(futures):
            # Do what you want with f.result(), for example:
            print(f.result())
You can pass parameters to each function call and collect its return value as well. Read more or follow the examples in the concurrent.futures documentation.
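To illustrate passing parameters: extra arguments to executor.submit() are forwarded to the callable, and f.result() gives you each return value. The sketch below uses a stand-in query_service function (a hypothetical placeholder for your real boto3 query functions) so it runs without AWS credentials; mapping each future back to its service name is a common pattern for knowing which query produced which result.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Stand-in for a real boto3 query function; in your code this would
# create a client and paginate, taking the service and region as inputs.
def query_service(service, region):
    return f"queried {service} in {region}"

def run_queries():
    results = {}
    with ThreadPoolExecutor(max_workers=4) as executor:
        # Map each future back to the service it is querying
        futures = {
            executor.submit(query_service, svc, "us-east-1"): svc
            for svc in ["acm", "autoscaling", "ec2"]
        }
        for f in as_completed(futures):
            results[futures[f]] = f.result()
    return results

print(run_queries())
```

Note that as_completed yields futures in completion order, not submission order, which is why the future-to-name mapping is useful here.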