Given a function process_list that takes a list of unique IDs and sends the list to an API endpoint for processing, where the endpoint accepts at most 100 elements per call:
If my list has more than 100 elements, how do I process the first 100, then the next 100, and so on until I reach the end of the list?
my_list = [232, 231, 932, 233, ... n]
# first 100
process_list(my_list[:100])
import json
import requests

def process_list(my_list):
    url = 'https://api.example.com'
    data = {'update_list': my_list}
    headers = {'auth': auth}
    r = requests.put(url, data=json.dumps(data), headers=headers)
I'll keep this simple, since I assume you are starting out with Python. Iterate over the list, increasing the index by a hundred each iteration:
my_list = [i for i in range(10123)]

for i in range(0, len(my_list), 100):
    process_list(my_list[i:i+100])
def process_list(my_list):
    url = 'https://api.example.com'
    data = {'update_list': my_list}
    headers = {'auth': auth}
    r = requests.put(url, data=json.dumps(data), headers=headers)
You have two options for calling range, per the docs:
range(start, stop[, step])
or
range(stop)
Using the first option, you iterate through the sequence of start indices 0, 100, 200, ... and slice a chunk of 100 elements at each one.
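To see the slicing in action, here is a small standalone sketch (using a chunk size of 5 on a 12-element list for readability; the final chunk is simply shorter, since slicing past the end of a list is safe):

```python
# Chunk a list with range(start, stop, step) plus slicing.
my_list = list(range(12))
chunks = [my_list[i:i + 5] for i in range(0, len(my_list), 5)]
print(chunks)  # [[0, 1, 2, 3, 4], [5, 6, 7, 8, 9], [10, 11]]
```

The same pattern with a step of 100 gives you the 100-element batches the API expects.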
Here is a recipe from the itertools docs that may help:
from itertools import zip_longest

def grouper(iterable, n, fillvalue=None):
    "Collect data into fixed-length chunks or blocks"
    # grouper('ABCDEFG', 3, 'x') --> ABC DEF Gxx
    args = [iter(iterable)] * n
    return zip_longest(*args, fillvalue=fillvalue)
Use it like this (note that grouper needs the chunk size as its second argument, and the last group is padded with the fillvalue, so the padding should be dropped before sending):

def process_list(my_list):
    url = 'https://api.example.com'
    for group in grouper(my_list, 100):
        # drop the None padding that zip_longest adds to the final group
        data = {'update_list': [x for x in group if x is not None]}
        headers = {'auth': auth}
        r = requests.put(url, data=json.dumps(data), headers=headers)
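Because zip_longest pads the final group out to length n with the fillvalue (None by default), that padding has to be filtered out before the batch is sent. A minimal standalone sketch showing the padding and the filter (chunk size 3 for illustration):

```python
from itertools import zip_longest

def grouper(iterable, n, fillvalue=None):
    "Collect data into fixed-length chunks or blocks"
    args = [iter(iterable)] * n
    return zip_longest(*args, fillvalue=fillvalue)

ids = [232, 231, 932, 233]
for group in grouper(ids, 3):
    batch = [x for x in group if x is not None]  # drop the padding
    print(batch)
# prints [232, 231, 932] then [233]
```

If your IDs could legitimately be None, pick a fillvalue that cannot occur in the data (a sentinel object) and filter on that instead.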