Send to API - batch process

Hello,

So I’m querying an API and building a Google Sheet with 30,000 rows.

I’m then looking to move this data into a DB. I’m using Send to API (and testing Enrich to use the response data), but my DB service limits bulk updates to 100 rows/records.

Iterating through the rows one at a time is an option, but it’s super slow for 30k records.

Is there any way to build a loop to send rows in batches of 100 to the API?

I have built a working flow that could be scheduled to do this, but it would only run every 10 minutes, and it relies on another Sheets lookup table to filter the original 30k items against those that have already been added. It’s all a bit involved and slow.

Any help would be great, thanks!

Hi @jim - Welcome to the community :wave:

We should be able to solve this by consolidating rows into a JSON array.

You can start by inserting an “Insert Row Numbers” step and repeating numbers for 100 rows (e.g. your first 100 rows would all be labeled 1, the next 100 labeled 2, and so on). This effectively gives every row a batch number.

Next we can add an “Insert Text Column” step called something like ‘arrayConstructor’, and set the column value to the JSON object you want to pass to your API for each row (these per-row objects will become the elements of the array).
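For example, if your sheet had columns called id and email (swap in your own column names), the column value might look something like this, with the curly braces pulling in each row’s values (quote them or not depending on whether your API expects strings or numbers):

```
{"id": {id}, "email": "{email}"}
```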

Note: If the bracket at the end is acting funny, add “}{x}” to the end of the column value, and then use a “Find and Replace” step to replace “{x” with a blank value.

You can then use the “Merge Duplicate Rows” step to consolidate each batch of 100 rows (the rows sharing the same row number) into a single row. Merge the column ‘arrayConstructor’ with a comma delimiter, ignoring blanks; the commas are what separate the individual JSON objects in the final array.

To send the array to your API, you can include [{arrayConstructor}] in your request.
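Each request body then expands into a JSON array of up to 100 records. Sticking with the hypothetical id/email columns above, your API would receive something like:

```
[{"id": 1, "email": "a@example.com"}, {"id": 2, "email": "b@example.com"}, ...]
```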

Let me know if this works for you or if you run into any roadblocks. Hope this helps!

Thanks @Adam_R, that sounds like a good approach and will cut out a lot of steps.

I’ll give it a go.