Export to CSV pack scalability

Hi Coda,

I have a question: would this pack still work at a very large doc size, say approximately 8 million rows?

When the table gets that large, download continuity is important. Would I still be able to sustain a continuous connection?

I’m guessing the workaround is to archive pieces as I go along so that the size doesn’t become unmanageable.

Just for reference, Coda itself is not going to be able to hold 8 million rows; it struggles to handle even 200,000 rows in a way that is usable.

So that will definitely be something to consider.

@Courtney_Milligan1 can tell you more about the limitations of the pack specifically, though.

Hey @Jake_Nguyen! Scott’s right about Coda’s row limitations; unfortunately, the pack is even more constrained. It can only fetch 500 rows at a time and times out after 1 minute. I’ve been able to export just over 7,000 rows at a time before running into errors, so I’d recommend doing what you said and exporting as you go, with no more than 5,000 rows at a time.
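
For anyone scripting around these limits outside the pack, one pattern is to pull rows down (for example via the Coda REST API) and write them out in chunks of at most 5,000 rows each. A minimal sketch of the chunking step in Python; the function and file names here are mine, not part of the pack:

```python
import csv
from pathlib import Path

def export_in_chunks(rows, fieldnames, out_dir, chunk_size=5000):
    """Write rows (a list of dicts) as a series of CSV files,
    each holding at most chunk_size rows, and return the file paths."""
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    paths = []
    for i in range(0, len(rows), chunk_size):
        path = out_dir / f"export_{i // chunk_size + 1:04d}.csv"
        with path.open("w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=fieldnames)
            writer.writeheader()
            writer.writerows(rows[i:i + chunk_size])
        paths.append(path)
    return paths
```

Each file stays comfortably under the ~7,000-row point where the pack starts erroring, and you can run the loop as many times as needed for 8 million rows.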

Since it sounds like you have a lot of data, here’s another suggestion. Because of the save dialog box that pops up when running this formula, it can’t be put in an automation. However, I have another pack (Export to Google Workplace) that can be used with this one to upload the exports automatically to Google Drive, which could be helpful if you have a lot of data to export (for example, an automation that runs once a week, exports, and then deletes the oldest X rows).
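
That rolling export-then-delete step boils down to splitting the table into the oldest X rows (to export and remove) and everything else. A tiny sketch, assuming a hypothetical `createdAt` timestamp field on each row:

```python
def rotate(rows, x, key="createdAt"):
    """Split rows into (oldest x rows to export and delete, remaining rows),
    ordered by the given timestamp key. The key name is illustrative."""
    ordered = sorted(rows, key=lambda r: r[key])
    return ordered[:x], ordered[x:]
```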

Thanks for looping me in here @Scott_Collier-Weir!

Hey Courtney. Thanks for the reply.

Yeah I figured - I decided to just push all the data to a SQL database so I can query that DB later for reporting.

This can be done by making an API and then writing a custom Pack in the Pack Studio to integrate it inside Coda.
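
As a rough illustration of the receiving end of that setup, here is a minimal sketch that loads exported rows into a local SQLite table. The table name, columns, and row shape are my assumptions, not anything Coda-specific; keying on the row id makes repeated loads idempotent:

```python
import json
import sqlite3

def push_rows(conn, rows):
    """Insert (or replace) exported rows into a local SQL table keyed by row id.
    Each row is a dict with at least an 'id'; the full dict is kept as JSON."""
    conn.execute(
        """CREATE TABLE IF NOT EXISTS coda_rows (
               id TEXT PRIMARY KEY,
               name TEXT,
               payload TEXT
           )"""
    )
    conn.executemany(
        "INSERT OR REPLACE INTO coda_rows (id, name, payload) VALUES (?, ?, ?)",
        [(r["id"], r.get("name", ""), json.dumps(r)) for r in rows],
    )
    conn.commit()
```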

Thanks Scott, appreciate the CC.

Good idea with that amount of data. Glad you found a solution!