Sum total number of orders by type

Hi there,

I’m trying to make a website for a meal prep business that currently takes orders via text message. I’ve set up their store on Squarespace and need some help massaging the data so that every night at midnight, they can print out what they need to make in the early morning.

Details are first, and at the very bottom is my progress.

Some details:

  • There are 10+ meals (e.g. chicken teriyaki, turkey BBQ burger, salmon teriyaki, shepherd’s pie)
  • Each meal has the exact same product variants: amount of protein, type of carbs, amount of carbs, type of veggies, and amount of veggies
  • You can also order protein muffins (there’s ~8 different types)
  • You can also order meat in bulk (1lb, 2lb, 3lb, 4lb, 5lb)

Meal selection on website:

Meal customization on website:

Printouts / calculations required:

  • total number of meals to make, by type (e.g. teriyaki chicken, turkey BBQ burger, salmon teriyaki)
  • total amount of protein needed, overall and by meal (e.g. teriyaki chicken can be ordered with 6oz, 7oz, or 8oz of protein; this also needs to include the bulk meat orders)
  • total amount of carbs needed, overall and by type (there are 4 types)
  • total amount of veggies needed, overall and by type (there are 3 types)
  • total number of muffins to make, by type (there are ~8 types)

Here are some snapshots of the data I get from Squarespace:

This test customer ordered 2 variations of Teriyaki Chicken (3 in first variation combo and 2 in second variation combo):

Here’s what the variations look like fully expanded:

The printout for this 24-hr period should show:

  • Total teriyaki chicken meals: 5 meals
  • Total teriyaki chicken protein: 33oz
  • Total jasmine rice: 1 cup
  • Total brown rice: 3 cups
  • Total green beans: 3 orders (orders because we just fill the remaining space after protein and carbs)
  • Total broccoli: 2 orders (orders because we just fill the remaining space after protein and carbs)

Additionally, we wanted to figure out how to print four orders per page instead of Squarespace’s default of one order per person per page. This step is to make sure that at the end of bulk-preparing everything, the team can quickly assemble the meals appropriately.

Any help would be appreciated! I’m a novice with data manipulation, so I’m not very far along. So far I’ve been able to bring in the data and have been experimenting with how to make the initial de-duped counts for totals.

Hey Ashwin!

Ultimately, your calculations will all be done by Sum by Group or Count by Group steps. Those steps can sum a column, or count the number of rows, per unique value in another column. So you could sum the total pounds of carbs to make for each type, or you could count the total number of meals to produce by type (like the first calculation you listed).
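Outside of Parabola, the same two aggregations can be sketched in pandas (the column names here are hypothetical, not your actual Squarespace fields):

```python
import pandas as pd

# Hypothetical flattened order rows: one row per line-item variation.
orders = pd.DataFrame({
    "meal": ["Teriyaki Chicken", "Teriyaki Chicken", "Salmon Teriyaki"],
    "quantity": [3, 2, 4],
    "protein_oz": [7, 6, 8],  # protein per meal for that variation
})

# "Count by Group": how many distinct variation rows exist per meal.
variations_per_meal = orders.groupby("meal").size()

# "Sum by Group": total meals and total protein ounces per meal type.
orders["total_protein_oz"] = orders["quantity"] * orders["protein_oz"]
totals = orders.groupby("meal").agg(
    total_meals=("quantity", "sum"),
    total_protein_oz=("total_protein_oz", "sum"),
)
```

With the example numbers from this thread (3 meals at 7oz plus 2 meals at 6oz), `totals` shows 5 teriyaki chicken meals and 33oz of protein.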

Before doing that, you need to get your data into a shape those steps will work well on. Looking at your screenshots, I think you should use a Pivot Columns step after the Expand JSON step that is expanding the variations. Target the optionName column as the Pivot column and the Option Value column as the Value column. That will flip those options into columns so that meals are not repeated across rows. Make sure, though, that you have removed any old JSON columns that were already expanded, using a Remove Columns step between the Expand JSON step and the Pivot Columns step. Otherwise the rows will not de-duplicate properly.
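As a rough illustration of what that pivot does — sketched in pandas with made-up option names, not your real data:

```python
import pandas as pd

# Hypothetical output of an Expand JSON step: one row per
# (line item, option) pair, in long format.
expanded = pd.DataFrame({
    "line_item_id": [1, 1, 1, 2, 2, 2],
    "optionName": ["Protein Amount", "Carb Type", "Veggie Type"] * 2,
    "optionValue": ["7oz", "Brown Rice", "Green Beans",
                    "6oz", "Jasmine Rice", "Broccoli"],
})

# Equivalent of Pivot Columns: option names become columns, so each
# line item collapses back into a single row.
pivoted = (expanded
           .pivot(index="line_item_id",
                  columns="optionName",
                  values="optionValue")
           .reset_index())
```

After the pivot, the six long-format rows become two rows (one per line item) with one column per option, which is the shape the group-by steps need.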

Let me know if that works for you!


Hey Brian, thanks for the quick answer. This all makes sense. I’m having a tough time figuring out which are the “old JSON columns” to remove and how to do it. Any guidance?

Hi @Ashwin_Khurana, the old JSON column you’d want to remove with Remove Columns is “lineItems”. Expand JSON splits the JSON object text into separate columns, one per field in the object, while keeping the original text in its original column. That original column needs to go after it has been expanded, because the JSON blobs are unique values that would prevent the Pivot Columns step from collapsing duplicates. By removing the unique column, we ensure the pivot step works properly by combining duplicate rows.
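In pandas terms, with a made-up one-order example, the problem and the fix look like this:

```python
import json
import pandas as pd

# Hypothetical raw export row: the order still carries its JSON blob.
raw = pd.DataFrame({
    "orderId": ["A1"],
    "lineItems": ['[{"name": "Teriyaki Chicken", "qty": 3},'
                  ' {"name": "Teriyaki Chicken", "qty": 2}]'],
})

# Equivalent of Expand JSON: parse the blob into one row per item,
# keeping the original text column alongside the new columns.
items = raw["lineItems"].apply(json.loads).explode().apply(pd.Series)
expanded = raw.join(items).reset_index(drop=True)

# Equivalent of Remove Columns: drop the blob, since its unique text
# would keep every row distinct and defeat downstream de-duplication.
expanded = expanded.drop(columns=["lineItems"])
```

After the drop, only the expanded fields (order ID, item name, quantity) remain, so duplicate rows can actually be recognized as duplicates.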


Thanks for your help, everyone. I think I got the calculations down :smiley:

Need help with the last step of importing into Google Sheets.

  1. Am hoping to set up an automated push of the order data at 11:59pm on Sunday, Tuesday, and Thursday nights. Each pull would cover the orders placed since the previous pull (e.g. the Tuesday-night pull would include orders from 12:00am Monday morning through 11:59pm Tuesday night).

  2. Need to create a separate page within the exported sheet that simply pulls in individual orders with the following attributes: customer name plus their orders (with variant details). The calculations above give the business the aggregate totals they need to make across all orders; this step is about creating printouts with four orders per page so the business can distribute and package each individual order according to the variants and amounts it’s made up of.
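For the four-orders-per-page grouping, one simple sketch (again in pandas, with hypothetical data) is to assign each order a page number before formatting for print:

```python
import pandas as pd

# Hypothetical per-customer order summaries, one row per order.
orders = pd.DataFrame({"customer": [f"Customer {i}" for i in range(1, 10)]})

# Four orders per printed page: integer-divide the row position by 4.
orders["page"] = orders.index // 4 + 1
```

Grouping or sorting the printout by that `page` column then puts exactly four orders on each sheet.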

Any help here would be much appreciated. Almost over the finish line!

I’m particularly confused with:

  • Can I drop the exports into a specific shared folder? Multiple people need access
  • Can it create a new file (not sure how it’ll be named) every time we push to Google (every Sunday, Tuesday, and Thursday night)?
  • Can I name the file based on the dates of the orders included?

Hey @Ashwin_Khurana - Glad to hear you’re getting close! Answering your questions inline:

Can I drop the exports into a specific shared folder? Multiple people need access
Unfortunately, our current Send to Google Sheets step doesn’t support exporting to a specific folder. However, if the Sheets file that Parabola is appending to or overwriting is stored in a shared folder, it will stay in that shared folder as Parabola updates your data.
Can it create a new file (not sure how it’ll be named) every time we push to Google (every Sunday, Tuesday, and Thursday night)?
Yes, you’ll want to set up your Send to Google Sheets step like @David suggests in this thread. Make sure that you set it to Each time this flow is run it will create a new file, like it shows in David’s screenshot. This will automatically add a timestamp of the time your Google Sheets file was created.
Can I name the file based on the dates of the orders included?
By default, Google appends the timestamp of when the file was created. Unfortunately, in your use case that may show a later day than the orders themselves, since e.g. a Tuesday-night run contains Monday’s order data. Because that timestamp is set by Google, there isn’t currently a way to work around this.
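If you ever move the export outside Parabola (e.g. a small script of your own), naming the file after the order-date range it covers is straightforward. This is just a generic Python sketch, not a Parabola feature:

```python
from datetime import date

def export_filename(start: date, end: date) -> str:
    # Name the export after the order-date range it covers.
    return f"orders_{start.isoformat()}_to_{end.isoformat()}.csv"

name = export_filename(date(2021, 3, 1), date(2021, 3, 2))
```

A Tuesday-night run covering Monday and Tuesday would then produce a name like `orders_2021-03-01_to_2021-03-02.csv` regardless of when the file is actually written.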

Hope this helps! Let us know if you have any other questions.
