I am using facebook-python-ads-sdk to make async calls to the FB Insights API, as shown below:
import time

# Import paths assume the current facebook_business package name of the SDK.
from facebook_business.adobjects.adaccount import AdAccount
from facebook_business.adobjects.adreportrun import AdReportRun

TIMEOUT = 600  # number of 1-second polls before giving up (adjust as needed)

def wait_for_async_job(job):
    # Poll the async report run until Facebook marks it as completed.
    for _ in range(TIMEOUT):
        time.sleep(1)
        job = job.remote_read()
        status = job[AdReportRun.Field.async_status]
        if status == "Job Completed":
            return job.get_result()

params = {
    "time_increment": 1,
    "level": "ad",
    "date_preset": "last_28d",
    "breakdowns": "hourly_stats_aggregated_by_advertiser_time_zone",
    "limit": 1000
}

job = AdAccount("act_<AD_ACCOUNT_ID>").get_insights_async(params=params)
result_cursor = wait_for_async_job(job)
results = [item for item in result_cursor]
So the async job that retrieves insights for last_28d finishes in a few minutes; however, paginating over its results can take up to an hour!
Is this the right way to paginate over the results of an async job?
I am posting the answer so it can help other developers who had the same issue.
Modify:
return job.get_result()
to:
return job.get_result(params={"limit": 1000})
This paginates over the results in pages of 1,000 rows instead of the default page size of 25.
The above change saved us 30 minutes of runtime.
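For completeness, here is the polling helper with that change applied. This is a minimal sketch; the TIMEOUT value and the facebook_business import path are assumptions, not part of the original snippet:

import time

from facebook_business.adobjects.adreportrun import AdReportRun

TIMEOUT = 600  # assumed number of 1-second polls before giving up

def wait_for_async_job(job):
    # Poll the async report run until it completes, then return a cursor
    # that fetches 1000 rows per page instead of the default 25.
    for _ in range(TIMEOUT):
        time.sleep(1)
        job = job.remote_read()
        status = job[AdReportRun.Field.async_status]
        if status == "Job Completed":
            return job.get_result(params={"limit": 1000})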