How to split data into multiple steps (to keep from timing out)

I’m uploading a large CSV file to Bubble (9,679 rows, 56 columns). It keeps timing out.

I have tried adjusting the Max Requests per minute several times.

It never seems to help.

Should I try changing the Max Requests to 999? At this point I feel like I’m just burning money, since I don’t know how to fix the problem and am just guessing.

Hi @Nathan_Lively,

Great question, and I understand how that would be confusing. The Max Requests per minute setting in the Send to an API step should match Bubble’s documented API rate limit, seen here: https://bubble.io/reference#API.rate_limit

Changing Max Requests to 999 may be the solution, depending on how many Send to an API steps are in your flow. Since Bubble’s API documentation mentions a limit of 1,000 requests per minute, that number is the maximum you can send across the entire flow, combined over all of its steps. Try having your flow’s Send to an API step(s) make a combined total of up to 1,000 Max Requests per minute to resolve the timeout errors. 🙂

For example, if a Parabola flow has branches ending in two different Send to an API steps, each should be set to 500 max requests per minute (or another combination adding up to 1,000), rather than setting max requests to 1,000 within each step. Something to note is that raising the max requests past a service’s limit will likely cause their servers to tell ours to slow down, which makes the step take longer (and can lead to the 1-hour timeout error).
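If it helps to see that budget outside of Parabola, here is a rough Python sketch of the same idea for anyone calling Bubble’s Data API directly from a script. The app URL and data type are taken from this thread, the token is a placeholder, and the 1,000-per-minute cap is Bubble’s documented limit:

import time
import requests

# Placeholder values based on this thread -- substitute your own app URL, data type, and token.
BASE_URL = "https://app.subaligner.com/version-test/api/1.1/obj/Alignments"
API_TOKEN = "YOUR_BUBBLE_API_TOKEN"
MAX_REQUESTS_PER_MINUTE = 1000  # Bubble's documented per-app rate limit

def delete_objects(object_ids):
    """Delete Bubble objects one at a time while staying under the per-minute rate limit."""
    headers = {"Authorization": f"Bearer {API_TOKEN}"}
    window_start = time.monotonic()
    sent_in_window = 0

    for object_id in object_ids:
        # If we've hit the cap inside the current one-minute window, wait for it to reset.
        if sent_in_window >= MAX_REQUESTS_PER_MINUTE:
            elapsed = time.monotonic() - window_start
            if elapsed < 60:
                time.sleep(60 - elapsed)
            window_start = time.monotonic()
            sent_in_window = 0

        response = requests.delete(f"{BASE_URL}/{object_id}", headers=headers)
        sent_in_window += 1
        response.raise_for_status()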


Thanks so much Adeline. With my next transfer I’ll try this.


Hey Adeline, I want to post a follow-up problem here because I think it may be the same issue. I haven’t been able to test a solution to the original problem I posted about, because the first step in my flow is to download an object from Bubble and then delete it: I clear the database before I upload the new rows, and Parabola is unable to complete that step.


There are 12k rows in the database that I need to delete.

When I open my flow I don’t see the error, but it does seem to suggest that it is only going to try to delete 2,500 rows, right?

Here are the import settings. Again, it doesn’t seem like it’s going to bring in all 12k rows. I have tried a bunch of different settings for “Increment each page by” and it doesn’t seem to change the result.

Thanks for any insight into this problem. I swear I’m not trying to make you read the user manual for me. I read through it, but I guess I don’t understand it because I can’t see what I’m doing wrong here.

Hi @Nathan_Lively,

Ah I see. Thanks for following up and informing us of this. If I’m understanding correctly, it sounds like:

  • Currently your import step Pull from an API brings in 2,500 rows of data
  • You’d like to pull in 12,000 rows of data from Bubble
  • Bubble sends over 100 items per page (noting this for later Pagination settings)
  • 12,000 rows to import / 100 items per page = 120 max pages to fetch from Bubble, though I’d up this to 130 as a safety buffer

Given the above, in your import step’s Pagination settings, can you change the current number in the Maximum pages to fetch field to 130? That should fix the import step only bringing in 2,500 rows, since that behavior comes from the value currently in the Maximum pages to fetch field.
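If you’d like to see the same pagination math outside of Parabola, here is a rough Python sketch, assuming Bubble’s standard Data API pagination (cursor and limit query parameters, with results and remaining fields in the response). The URL and token below are placeholders:

import requests

# Placeholder values from this thread -- swap in your own app URL, data type, and token.
BASE_URL = "https://app.subaligner.com/version-test/api/1.1/obj/Alignments"
API_TOKEN = "YOUR_BUBBLE_API_TOKEN"
PAGE_SIZE = 100   # Bubble returns at most 100 items per page
MAX_PAGES = 130   # 12,000 rows / 100 per page = 120 pages, plus a small buffer

def fetch_all_rows():
    """Page through the Data API until no rows remain (or MAX_PAGES is reached)."""
    headers = {"Authorization": f"Bearer {API_TOKEN}"}
    rows = []
    cursor = 0

    for _ in range(MAX_PAGES):
        resp = requests.get(
            BASE_URL,
            headers=headers,
            params={"cursor": cursor, "limit": PAGE_SIZE},
        )
        resp.raise_for_status()
        payload = resp.json()["response"]
        rows.extend(payload["results"])

        # "remaining" reports how many rows are left after this page.
        if payload.get("remaining", 0) <= 0:
            break
        cursor += len(payload["results"])

    return rows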


Thanks @Adeline. This was my understanding of the pagination function as well, but for some reason I was putting 130 in the “Increment each page by” field.

Looks like updating the Maximum pages to fetch field did the trick, but then in the next step I got this error:

{
  "body": {
    "message": "Missing object of type Alignments: object with id 1605507061995x126428622598613980 does not exist",
    "status": "MISSING_DATA"
  },
  "statusCode": 404
}

Here’s the sent request:

{
  "headers": {
    "Authorization": "Bearer *****"
  },
  "method": "DELETE",
  "url": "https://app.subaligner.com/version-test/api/1.1/obj/Alignments/1605507061995x126428622598613980"
}

I searched my database in Bubble for that ID and it returned nothing. Now I’m really confused and again, burning through credits.

I think I tracked down my problem. I’m not sure how this happened, but after running Remove Duplicates I discovered that of the 11,600 rows imported from Bubble, only 214 of them were unique. With that many rows I couldn’t find any way to delete them in bulk, so I did it all manually, 1,000 at a time.

With everything deleted, I was able to start fresh. So far, the solution of limiting the rate so that all of the Bubble export steps in the flow don’t add up to more than 1,000 requests per minute seems to be working.
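In case anyone hitting the same duplicate issue is scripting the cleanup rather than deleting manually, this is a rough sketch of the deduplication I’d do before sending deletes. The _id field name is an assumption, so use whatever unique id column your import actually produces:

def unique_rows(rows, id_field="_id"):
    """Keep only the first occurrence of each id so every object is deleted exactly once."""
    seen = set()
    deduped = []
    for row in rows:
        row_id = row.get(id_field)
        if row_id in seen:
            continue  # already queued for deletion; a second DELETE would just return a 404
        seen.add(row_id)
        deduped.append(row)
    return deduped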


@Nathan_Lively awesome, glad to hear you tracked it down and resolved it! Feel free to reach out if any other questions arise.
