Add batch row update/insert/upsert to Make.com integration

I have a table with ~8k products that needs to be updated at least once a week, since many of our workflows use it. I was planning to use Make.com for this (that’s what we normally use), but I realized the Make.com Coda integration does not support batch handling. I calculated that my Make.com workflow would burn ~16,000 extra operations every time I need to sync, because it has to go back to the repeater for each row. That’s bananas and would cost us a lot of money and hours per sync. It also adds an unnecessary number of network requests for something that can be done in one request.

As the rows in Coda are referenced in many of our docs and pages, I cannot just do a CSV import/export, as that opens the door to a lot of issues. For example, if one product is removed in our ERP, the rows in the exported CSV will no longer match the order of the table, and the previous references to the table will then point to the wrong product.

What I wish for is a way to send a batch upsert/update/insert/remove via the API and the Make.com integration. Any plans for this?

I don’t know whether Coda is responsible for developing those Make integrations, or if it’s someone from Make.

In any case, there’s the generic “Call Coda API” module where you can manually construct the payload to do a batch upsert. That said, I think you can still only batch insert 100 rows at once or something.
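For reference, a rough sketch (in Python, outside Make) of what that manually constructed payload could look like. The column names, the `SKU` key column, and the 100-row batch size are assumptions based on this thread, not confirmed limits; check the Coda API docs for the actual rows endpoint and its caps:

```python
# Sketch: build the JSON body for Coda's "insert/upsert rows" endpoint
# (POST /docs/{docId}/tables/{tableIdOrName}/rows). Column names and the
# 100-row batch size below are illustrative assumptions.

def chunk(items, size=100):
    """Split items into batches of at most `size` rows."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def build_upsert_payload(products, key_column="SKU"):
    """Map product dicts to the rows/cells shape the Coda API expects.

    `keyColumns` is what turns the insert into an upsert: rows whose key
    column matches an existing row get updated instead of inserted.
    """
    return {
        "rows": [
            {"cells": [{"column": col, "value": val} for col, val in p.items()]}
            for p in products
        ],
        "keyColumns": [key_column],
    }

products = [
    {"Product name": "Widget", "SKU": "W-001", "Published": True},
    {"Product name": "Gadget", "SKU": "G-002", "Published": False},
]

for batch in chunk(products, size=100):
    payload = build_upsert_payload(batch)
    # In the "Call Coda API" module you'd paste this payload (as JSON)
    # into the request body, with your API token in the headers.
    print(len(payload["rows"]))
```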

How about, instead of doing upserts, you just send the products to Coda through a single webhook request? The payload can be up to 4 MB, so if you pack your products nicely, it may fit.
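For a sense of scale, a quick back-of-the-envelope check (field names made up for illustration) suggests ~8k small products serialize to well under that 4 MB ceiling:

```python
import json

# Sketch: estimate the webhook payload size for ~8k products with
# three short fields each. The 4 MB figure comes from the post above.
products = [
    {"name": f"Product {i}", "sku": f"SKU-{i:05d}", "published": i % 2 == 0}
    for i in range(8000)
]

# Compact separators drop the whitespace json.dumps adds by default.
body = json.dumps(products, separators=(",", ":"))
size_mb = len(body.encode("utf-8")) / (1024 * 1024)

print(f"{size_mb:.2f} MB")
assert size_mb < 4
```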

Thank you for the response, very helpful. I will try the API call method; I was not aware of that. It’s still a lot of requests for this specific use case, but it might work for some other things. I really feel like the limits are weighing me down. We will hit 10k products soon, so I also have to tackle the row limit issue.

The webhook idea might work! The data is quite compact, so I will not hit that size limit. It’s just many rows, with three columns (product name, SKU, and publish status).

You got my brain working now… maybe I can just send the whole JSON to one field and trigger an automation in Coda that parses that field. Hmm… I might have to write a Pack for upserting JSON into a table, but that’s fine.

Anyway, my suggestion still stands: better batch handling for the Make integration. I’m also not sure whether it is Make or Coda that is responsible for the Make integration.

Thanks once again, I have more hope and ideas now on how to solve the issue.

You don’t need a Pack to upsert JSON into a table. You just receive the JSON in your automation and run a Coda formula on it, just like that.

Let me know if you need help with it. After you ParseJSON() the Step 1 Result, you’ll already get your objects and lists, which you can then ForEach() and AddOrUpdateRows() on.
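Roughly, the automation action could look like the sketch below. Treat it as pseudocode: the table name, column names, and JSON keys are placeholders, and the exact signatures of ParseJSON() and AddOrUpdateRows() should be checked against Coda’s formula reference.

```
// Automation action sketch — [Products], its columns, and the JSON
// keys ("sku", "name", "published") are all placeholders.
// "Step 1 Result" is the raw webhook body received by the automation.
[Step 1 Result].ParseJSON().ForEach(
  AddOrUpdateRows(
    [Products],
    [Products].SKU = CurrentValue.ParseJSON("sku"),   // match on SKU → upsert
    [Products].[Product name], CurrentValue.ParseJSON("name"),
    [Products].[Publish status], CurrentValue.ParseJSON("published")
  )
)
```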

P.S. 10k rows is only an issue for Cross-doc etc. It’s not a limit for the doc itself; you can have millions of rows if everything is built efficiently.

@Paul_Danyliuk You really are a treasure to this community, thank you. I will be able to solve it on my own with your tips. I’m very grateful!

