GeoPath API + Parabola OAuth Issue

I am new here and attempted to use Parabola to help extract data from the GeoPath API and transform it for my web app. I keep running into an OAuth issue. The GeoPath API requires that you supply the API key value in the headers object with the very specific, case-sensitive Geopath-API-Key key. For example:

"headers": {
  "Geopath-API-Key": "C44VqWZK7h5OeKzXKAZ1CF2q0xUVgiFw"
}

I keep trying that, and this is the error I keep receiving:

{
  "fault": {
    "detail": {
      "errorcode": "steps.oauth.v2.FailedToResolveAPIKey"
    },
    "faultstring": "Failed to resolve API Key variable request.header.Geopath-API-Key"
  }
}

Please advise on what I need to do to get through.

This is a GET request.
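For reference, here is how the same request could be made outside Parabola, sketched with Python's requests library. The endpoint and query parameters come from the GeoPath staging URL shared later in the thread; the key value below is a placeholder, not a real key.

```python
import requests

# Build (but do not send) the request, to show where the custom header goes.
url = "https://api-staging.geopath.org/v2.1/inventory/search"
params = {"operator_name_list": "outfront", "page": 1, "page_size": 1000}
headers = {"Geopath-API-Key": "YOUR_API_KEY"}  # header name is case-sensitive

req = requests.Request("GET", url, params=params, headers=headers).prepare()
print(req.headers["Geopath-API-Key"])
# To actually send it: requests.Session().send(req)
```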

Hey Ari! Welcome to Parabola and our Community!

I know the UI for configuring Header Keys is a bit confusing right now, but you can actually type in a custom Header Key; you don’t need to select from our dropdown.

So, try configuring your Header Key like so. Don’t forget to paste your API Key into Header Value.

Let me know how that goes!


@sachi Thank you so much. It worked like a charm. I am so glad I found Parabola, and I am looking forward to using it a lot more.


Thanks for the update! Glad to hear it. How’d you find us? What are some problems you’re looking to solve with Parabola?


I found Parabola on MakerPad. I tested it by transforming coordinates to addresses and I was blown away. It was about 120k coordinates, and it was done in minutes.

I will be using it primarily to transform data from different API sources, including data from my site going in/out.


Here is what I have:

https://api-staging.geopath.org/v2.1/inventory/search?operator_name_list=outfront&page=4&page_size=1000

Please advise.

Looks like I got it to work as a POST call. I will play with it some more to clean the data before exporting. Please advise on any tips for cleaning before the CSV export / API feedback into my site.


Hey Ari, what’s your question around how to clean your data?

@brian Thanks for your message. Currently, when I join data from 5 different sources, there seems to be an accurate count; however, the rows are blank upon exporting the CSV file. Also, duplicates are still present after adding in the Dedupe step. Please advise.

Also, is there a concatenate feature? I would like to add image URLs inside of: [https://i.imgur.com/KN3r5Ia.png]


Hey Ari,

To confirm, you are downloading the CSV files after running the flow, and they are blank?

For your dedupe question, can you show me which dedupe step doesn’t seem to be doing its job?

To concatenate, you can use the Column Merge or Text Merge steps.
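As a rough Python analogue of the Text Merge idea (a sketch only; the table and column names below are made up, not from the thread):

```python
import pandas as pd

# Hypothetical table; "image_name" is a made-up column for illustration.
df = pd.DataFrame({"image_name": ["KN3r5Ia", "aB3x9Yz"]})

# Text Merge equivalent: concatenate a fixed prefix, a column, and a suffix.
df["image_url"] = "https://i.imgur.com/" + df["image_name"] + ".png"
print(df["image_url"].tolist())
```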

Yes, they are blank before and after. Or I have to download each one individually, instead of 10k rows at once.

The dedupe is working fine on individual tables but not when they are joined. Column Merge is working fine now.

Also, when I join tables, all the columns are replicated, which I remove. Not sure if that causes blanks after I download the CSV, but I definitely see blanks before, like the image above.

Can you try joining your 5 sources before applying a single Dedupe step? Can you also send us a screenshot of how your Join step is configured and the results you see after you Join?

Does the CSV you downloaded after you ran your flow match the data you see in the “Results” tab of the CSV Export step you’ve highlighted in your screenshot above?
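The suggested order, join first and dedupe once afterward, can be sketched in pandas (the data is hypothetical; `drop_duplicates` plays the role of the Dedupe step):

```python
import pandas as pd

# Hypothetical joined result containing an exact duplicate row.
joined = pd.DataFrame({
    "frameid": [1, 1, 2],
    "operator": ["outfront", "outfront", "outfront"],
})

# Dedupe once, after the join, so duplicates introduced by joining are caught.
deduped = joined.drop_duplicates()
print(len(deduped))  # 2 rows remain
```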

I moved the dedupe further up in the process. It works fine now. Please see the images of the Join. It still produces blanks and creates duplicate records. Please advise.

Hey Ari - that is because you have the top setting set to “Keep all rows in all tables” which means, if a match is not found by the join, it will still keep the rows, and create blank sections of the table to make it work. You probably want to keep all rows from the primary table, and keep those from the other tables that match.

If you scroll to the right of those blanks, you will see the data present but shifted
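In relational terms, "Keep all rows in all tables" behaves like a full outer join, while keeping only the primary table's rows is a left join. A pandas sketch with made-up tables (only the `frameid` key comes from the thread; the other columns are hypothetical):

```python
import pandas as pd

frames = pd.DataFrame({"frameid": [1, 2, 3], "operator": ["a", "b", "c"]})
specs = pd.DataFrame({"frameid": [2, 3, 4], "size": ["s", "m", "l"]})

# "Keep all rows in all tables" = full outer join: unmatched rows from either
# side are kept, with blanks (NaN) filled in where there was no match.
outer = frames.merge(specs, on="frameid", how="outer")

# Keeping all rows from the primary table only = left join: unmatched rows
# from the secondary table are dropped.
left = frames.merge(specs, on="frameid", how="left")

print(len(outer), len(left))  # the outer join keeps frameid 4; the left join drops it
```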

I am not having any luck with the Join, either at the CSV level or with the JSON Flattener. We have done 70k rows and had to do this 70 times; 7 times would be even more efficient. Overall, the tool has saved us plenty of time. Downstream, we would want to use the API export, and this would require a Join, considering we would be at ~500k rows of data. Any help would be nice.

Can you provide more information on the data sets you’re trying to Join together?

  1. Do they all share the same column structure?
  2. I see you’re joining the tables based on the frameid field. Are there shared frameid values across these tables? I ask because if not, you could use the Table Merge step instead of the Join step and it’ll just stack these tables all together for you.
  3. There are no rows where frameid is blank, right?
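The distinction in point 2 can be sketched in pandas, where Table Merge corresponds to stacking rows with `concat` and Join corresponds to matching rows on a key with `merge` (the tables below are made up for illustration):

```python
import pandas as pd

# Two hypothetical result pages: same columns, no shared frameid values.
page1 = pd.DataFrame({"frameid": [1, 2], "operator": ["a", "b"]})
page2 = pd.DataFrame({"frameid": [3, 4], "operator": ["c", "d"]})

# Table Merge equivalent: stack the tables on top of each other.
stacked = pd.concat([page1, page2], ignore_index=True)
print(stacked["frameid"].tolist())  # [1, 2, 3, 4]

# A Join, by contrast, looks for matching frameid values across the tables;
# with no overlap there is nothing to match.
matched = page1.merge(page2, on="frameid", how="inner")
print(len(matched))  # 0
```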

I had better luck with the Table Merge. Thank you for your help. It has been a huge success!


Glad to hear that worked! If your column structures are the same across your tables and you don’t expect duplicate rows across your tables, the Table Merge step is simpler to use than the Join step!

Just in case it’s helpful in the future, here are our educational videos on the Join and Table Merge steps:

Using the Join step
Using the Table Merge step