I’m working on syncing data from PostHog to a Coda table using Google Apps Script and want to check if anyone has run into issues with larger bulk operations. I’m looking at upserting around 12k-17k rows through the `/rows` endpoint, using `keyColumns` to match existing records.
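For context, here's roughly the call pattern I have in mind (a minimal sketch, not a finished script; the doc ID, table ID, and column names are placeholders based on my data, and I'm assuming the API token lives in Script Properties):

```javascript
// Sketch of a single upsert request to the Coda API from Apps Script.
// Doc ID, table ID, and column names below are placeholders.
function upsertBatch(rows) {
  var docId = 'YOUR_DOC_ID';
  var tableId = 'grid-XXXXXXXXXX';
  var token = PropertiesService.getScriptProperties().getProperty('CODA_API_TOKEN');

  var payload = {
    rows: rows.map(function (r) {
      return {
        cells: [
          { column: 'User', value: r.user },
          { column: 'School', value: r.school },
          { column: 'Engagement', value: r.engagement }
        ]
      };
    }),
    keyColumns: ['User']  // match existing rows on this column instead of inserting duplicates
  };

  return UrlFetchApp.fetch(
    'https://coda.io/apis/v1/docs/' + docId + '/tables/' + tableId + '/rows',
    {
      method: 'post',
      contentType: 'application/json',
      headers: { Authorization: 'Bearer ' + token },
      payload: JSON.stringify(payload),
      muteHttpExceptions: true
    }
  );
}
```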
I’m going the Apps Script + API route because I hit the 10k row limit with Coda sync packs, so this seemed like the best workaround for getting all my data in.
I know about the 125MB doc size limit, but I’m not sure how to actually check my current doc size - I can only see attachment sizes in the interface. Does anyone know where to find the total doc size?
Beyond that limit, are there any other known issues or gotchas with bulk upserts of this size through Apps Script?
I’m familiar with the rate limits as well. The data itself isn’t huge per row - mostly text fields with user info, school names, engagement metrics, etc. Just want to make sure I’m not going to hit any walls before I start pushing thousands of rows through the API.
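In case it helps, here's roughly how I was planning to chunk the upserts and pause between batches so I don't blow past the rate limits. The batch size and sleep durations are my own guesses, not documented values, and `upsertBatch` is the sketch above:

```javascript
// Chunk the full dataset and pause between requests.
// 100 rows per batch and the sleep times are guesses, not documented limits.
function pushAllRows(allRows) {
  var BATCH_SIZE = 100;
  for (var i = 0; i < allRows.length; i += BATCH_SIZE) {
    var batch = allRows.slice(i, i + BATCH_SIZE);
    var response = upsertBatch(batch);
    if (response.getResponseCode() === 429) {
      // Back off and retry once if the request gets rate-limited.
      Utilities.sleep(10000);
      upsertBatch(batch);
    }
    Utilities.sleep(1000);
  }
}
```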
Thanks for any insights!