Question about bulk upserts with Coda API (Google Apps Script)

I’m working on syncing data from PostHog to a Coda table using Google Apps Script and want to check if anyone has run into issues with larger bulk operations. I’m looking at upserting around 12k-17k rows using the /rows endpoint with keyColumns for matching existing records.
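For context, here's roughly the shape of the request I'm planning to send, batched so no single payload gets too big. This is just a sketch: the token, doc ID, table ID, and column names below are placeholders, and the batch size is my own guess rather than a documented limit.

```javascript
// Placeholders — swap in your own token, doc ID, and table ID.
const CODA_TOKEN = 'your-api-token';
const DOC_ID = 'your-doc-id';
const TABLE_ID = 'grid-xxxxxxxxxx'; // table ID or table name

function upsertRows(rows) {
  const url = `https://coda.io/apis/v1/docs/${DOC_ID}/tables/${TABLE_ID}/rows`;
  const BATCH_SIZE = 500; // arbitrary; keeps individual request bodies small

  for (let i = 0; i < rows.length; i += BATCH_SIZE) {
    const batch = rows.slice(i, i + BATCH_SIZE);
    const payload = {
      rows: batch.map((r) => ({
        cells: [
          // Column names here are illustrative, not my real schema.
          { column: 'User', value: r.user },
          { column: 'School', value: r.school },
          { column: 'Engagement', value: r.engagement },
        ],
      })),
      keyColumns: ['User'], // column(s) Coda uses to match existing rows for the upsert
    };

    const response = UrlFetchApp.fetch(url, {
      method: 'post',
      contentType: 'application/json',
      headers: { Authorization: `Bearer ${CODA_TOKEN}` },
      payload: JSON.stringify(payload),
      muteHttpExceptions: true, // inspect status codes instead of throwing
    });
    // Coda applies row mutations asynchronously, so a 202 here means "accepted",
    // not "already written".
    Logger.log('Batch %s: HTTP %s', i / BATCH_SIZE + 1, response.getResponseCode());
    Utilities.sleep(1000); // crude pacing between batches
  }
}
```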

I’m going the Apps Script + API route because I hit the 10k row limit with Coda sync packs, so this seemed like the best workaround for getting all my data in.

I know about the 125MB doc size limit, but I’m not sure how to actually check my current doc size - I can only see attachment sizes in the interface. Does anyone know where to find the total doc size?

Beyond that limit, are there any other known issues or gotchas with bulk upserts of this size through Apps Script?

I’m familiar with the rate limits as well. The data itself isn’t huge per row - mostly text fields with user info, school names, engagement metrics, etc. Just want to make sure I’m not going to hit any walls before I start pushing thousands of rows through the API.

Thanks for any insights!

You can find info on doc size in the doc map in the settings. You can sort tables by size, row count, etc. Your 20k rows will barely scratch 10 MB if the rows are light on content. I had one doc with around 125k rows in one master table that was still under the 125 MB limit for the API. Just respect the Coda API rate limits, watch out for timeouts, and you should be good to go.
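To make the rate-limit/timeout point concrete, I wrap every call in something like this. It's a rough sketch, not official Coda guidance — the retry count and backoff delays are just what has worked for me:

```javascript
// Retry a UrlFetchApp call on HTTP 429 with exponential backoff.
// Assumes options.muteHttpExceptions is true, so a 429 doesn't throw.
function fetchWithBackoff(url, options) {
  const MAX_RETRIES = 5;
  for (let attempt = 0; attempt < MAX_RETRIES; attempt++) {
    const response = UrlFetchApp.fetch(url, options);
    if (response.getResponseCode() !== 429) {
      return response; // success, or a non-rate-limit error for the caller to handle
    }
    // Back off 2s, 4s, 8s, ... before retrying.
    Utilities.sleep(Math.pow(2, attempt + 1) * 1000);
  }
  throw new Error(`Still rate-limited after ${MAX_RETRIES} retries: ${url}`);
}
```

One Apps Script-specific gotcha on top of that: a single execution is capped at roughly six minutes on consumer accounts. A run of 17k rows in batches should fit comfortably, but if it doesn't, checkpoint your progress (e.g. with PropertiesService) and resume from a time-based trigger.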


Thank you, that makes sense to me! It’s a 10-column table, all text, so it should be pretty small overall. I’ll experiment with this a bit more, but it sounds like I should be ‘ok’ adding/updating these rows via the API.