
Dealing with Large Data Sets in Google Apps Script

What is the best way to deal with a large data set in Google Apps Script? My data is located in a Google Spreadsheet (~10,000 - ~11,000 cells), and I am using it with a "form letter" (about 4-5 pages long) that I created as a Google Document. I would like to run a find and replace in the document for about 100 placeholders (which match up to columns) and then save the document. The last step is to merge all of these new documents (~100) into one document that is converted into a PDF.

Any thoughts on the best way to deal with the flow of all of this data? Is it better to merge the documents into one document as I create them or should I wait and merge them all at the end?
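For context, the per-row substitution I have in mind is roughly the following. This is a plain-JS sketch: the `%ColumnName%` placeholder convention and the `fillLetter` name are just my own convention, and in the real script the replacement would be done with DocumentApp's `body.replaceText()`:

```javascript
// Plain-JS sketch of one letter's find-and-replace.
// Assumed convention: each placeholder in the form letter is the
// spreadsheet column header wrapped in percent signs, e.g. %FirstName%.
// In Apps Script: body.replaceText('%' + header + '%', value);
function fillLetter(template, headers, row) {
  return headers.reduce(function (text, header, i) {
    return text.split('%' + header + '%').join(String(row[i]));
  }, template);
}
```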

asked Mar 26 '26 20:03 by will w

1 Answer

You will need to plan for exhausting quotas; DocsList calls and total execution time are the likely concerns. Creating 100 documents is fine, but will the job grow to exceed the quota? (250/500/1500 per day, depending on whether you are a consumer, free Apps, or business Apps user; see the Google Apps Script Dashboard for current limits.) Unless you really need the individual documents produced by the find/replace, build the final document as you go. Minimise API calls: for example, fetch the data from the spreadsheet once, not once per document. Have you tested converting a sample of your large final document to PDF? That can be a time- and resource-consuming operation and can fall foul of the script limits, so plan to separate that part of the script out.
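A minimal sketch of that single-pass approach, assuming a `%ColumnName%` placeholder convention. The merge helpers here are hypothetical plain-JS stand-ins; the Apps Script calls named in the comments (`getDataRange().getValues()`, `body.appendPageBreak()`) are the real APIs you would use:

```javascript
// Single-pass merge: read the spreadsheet data once, fill each letter,
// and append it straight into one output instead of ~100 intermediates.
// In Apps Script, the data fetch would be a single call:
//   var values = sheet.getDataRange().getValues();
// and each filled letter would be appended to the final doc's body.
function mergeAll(template, headers, rows) {
  return rows.map(function (row) {
    var text = template;
    headers.forEach(function (header, i) {
      // Assumed placeholder convention: %ColumnName%
      text = text.split('%' + header + '%').join(String(row[i]));
    });
    return text;
  }).join('\n\n'); // in DocumentApp, use body.appendPageBreak() here
}
```

Keep the PDF conversion out of this loop entirely; run it as a separate execution on the finished document so a conversion timeout does not cost you the whole merge.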

answered Mar 28 '26 08:03 by DavidF

