So I’m querying an API and building a Google Sheet with 30,000 rows.
I’m then looking to move this data into a DB. I’m using Send to API (and testing Enrich to use the response data), but my DB service limits bulk updates to 100 rows/records.
Iterating through the rows one at a time is an option, but it’s far too slow for 30k records.
Is there any way to build a loop to send rows in batches of 100 to the API?
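For reference, this is the kind of chunking I have in mind if I were scripting it outside the tool. It’s just a minimal sketch: the endpoint URL, the `records` payload shape, and the assumption that rows are already loaded as a list of dicts are all placeholders, not my actual setup:

```python
import requests

BATCH_SIZE = 100  # the DB service's bulk-update limit
API_URL = "https://example.com/api/bulk-insert"  # placeholder endpoint

def send_in_batches(rows):
    """POST rows to the API in chunks of BATCH_SIZE."""
    for start in range(0, len(rows), BATCH_SIZE):
        batch = rows[start:start + BATCH_SIZE]
        resp = requests.post(API_URL, json={"records": batch})
        resp.raise_for_status()  # stop on the first failed batch
```

Basically I’m hoping the tool can do the equivalent of that loop natively, without me running a script.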
I have built a working flow that could be scheduled to do this, but it would only run every 10 minutes, and it needs a second Sheets lookup table to filter the 30k original items against those that have already been added. It’s all a bit involved and slow.
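In case it helps to see what that lookup table is doing, the filtering step amounts to a set difference, sketched here with an assumed `id` key on each row (the key name is a placeholder):

```python
def not_yet_added(all_rows, added_rows):
    """Keep only rows whose id isn't already in the destination."""
    added_ids = {row["id"] for row in added_rows}  # ids from the lookup table
    return [row for row in all_rows if row["id"] not in added_ids]
```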
Any help would be great, thanks!