Large data set handling

  1. What is your feature/integration request? A way to split rows into chunks, e.g. split into 4 equal parts, split every 10,000 rows, etc.
  2. What problem would this feature/integration solve? Processing and exporting large data sets
  3. How do you solve/workaround this problem today? Large number of limit row nodes

This is what it looks like when you try to divide 400,000 rows into 20,000-row chunks!

Dealing with the same issue but maxing out on the sends into Shopify… they only allow 5k rows… So I had to set up multiple inventory feeds…

Also dealing with this. We have to process a CSV with 7k rows through Enrich with API, but due to the API and Parabola flow timeouts I had to break the file down into separate 500-row files. Would be nice to have a built-in solution.
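For anyone doing this splitting outside of Parabola in the meantime, here's a minimal sketch of the workaround in plain Python, assuming a local CSV. The `split_csv` helper and its file-naming scheme are hypothetical, not anything Parabola provides; it just writes numbered chunk files with the header repeated in each, like the manual 500-row files described above.

```python
import csv
import itertools


def split_csv(path, rows_per_file=500):
    """Split a CSV into numbered chunk files, repeating the header in each.

    Hypothetical helper illustrating the manual workaround described in
    this thread; Parabola itself has no such node yet.
    """
    with open(path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)  # keep the header for every chunk
        base = path.rsplit(".", 1)[0]
        for i in itertools.count(1):
            # islice pulls the next rows_per_file data rows lazily,
            # so the whole file never has to fit in memory at once
            chunk = list(itertools.islice(reader, rows_per_file))
            if not chunk:
                break
            out_path = f"{base}_part{i}.csv"
            with open(out_path, "w", newline="") as dst:
                writer = csv.writer(dst)
                writer.writerow(header)
                writer.writerows(chunk)
```

For example, a 7,000-row file split with `rows_per_file=500` would yield `data_part1.csv` through `data_part14.csv`, each small enough to get through the API timeouts.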