I’m trying to figure out whether this platform is what we need for syncing Google Sheets to Webflow. I have a dataset of roughly 4,000 rows and am hitting this issue:
“Exports larger than 1,000 are not allowed at this time”
I figure I could chunk the data into 4 parts, but wouldn’t that quadruple my price/credit usage? We’re still in the concept stage and can’t spend hundreds of dollars a month on a data connection for 4,000 rows.
Thanks for reaching out, and your guess is right: you can chunk the data into 4 parts (split it across four Send to Webflow export steps). More good news: this will not increase your price/credit usage. The only thing to note is that pushing this many rows through 4 export steps will take some time, so the flow may run slowly.
The number of credits a flow uses per run is based on the maximum number of rows present at any point in the flow. Since your flow already has a step holding the full 4,000-row dataset before the export, splitting those rows across four export steps doesn’t change the flow’s usage. In other words, a flow that imports 4,000 rows and sends 1,000 to each of four export steps is still billed on 4,000 rows, not 4 × 4,000. You can also use the “How many credits do I need?” table linked here as a guide.
@THOMAS_A_HUSMANN great question. The 1,000-row limit applies specifically to the Send to Webflow export step. It has always existed and is one of the lowest limits among Parabola steps.
Only import/export steps have row limits, and they’re based on how long flows take to process and run (limits vary per API step because APIs have different rules for receiving data; some allow batching, for example).
Are you still unable to get your Send to MySQL export step to work? That one is on the slower side and can experience delays.
Thank you for the info. As a MySQL newbie, I assumed Parabola created the tables and columns automatically. I now understand how to do it myself, and I was surprised at how easy it was using the MySQL command line in the Google Cloud SQL console. Thank you for all of your help; you guys have been so responsive and easy to work with.
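In case it helps other newcomers, setting up a destination table came down to a single statement in that console. Here’s a rough sketch of what I mean; the table and column names below are just placeholders, not the actual schema from my flow:

```sql
-- Hypothetical destination table for the exported rows.
-- Names and types are placeholders; match them to your sheet's columns.
CREATE TABLE webflow_items (
    id         INT AUTO_INCREMENT PRIMARY KEY,
    name       VARCHAR(255) NOT NULL,
    slug       VARCHAR(255),
    price      DECIMAL(10, 2),
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
```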
I should probably post this as a separate question, but does your export to MySQL feature support the JSON data type?
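To clarify what I’m asking about (this is just an illustration; the table and column names are made up), MySQL 5.7+ has a native JSON column type, e.g.:

```sql
-- Example of a column declared with MySQL's native JSON type (5.7+).
-- Table and column names here are placeholders.
CREATE TABLE webflow_items_meta (
    id         INT AUTO_INCREMENT PRIMARY KEY,
    attributes JSON
);

-- Inserting a JSON document into that column:
INSERT INTO webflow_items_meta (attributes)
VALUES ('{"color": "blue", "sizes": ["S", "M", "L"]}');
```

I’d like to know whether the Send to MySQL step can write into a column like that.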