Hi Coda Community,
I have a use case with multiple modules. As an example, say there are 3 modules:
- Issues
- Checklists
- Learning
Each module has its own sync tables pulled in from other docs. My current setup has all of the tables for all 3 modules in one doc. Naturally, I’m seeing performance issues on this doc as I cross 25,000 rows with tens of columns per row.
Now I have 2 options: A) continue with this doc and clear out space, or B) split the 3 modules into 3 separate docs by duplicating the doc and deleting the other modules from each copy. After that I can use “Sync Page” in one doc to bring all these modules back in.
Which of these 2 options will have better performance in the long term? What would be your recommendation?
I’d also love to hear if you know of a 3rd option.
Please note that it is important that I have one doc where all modules are accessible.
Thanks in advance for all suggestions🙏
Hello @IT_Software ,
I can’t answer your question(s) with 100% certainty, but as a long-time Coda user with experience with larger docs, I can share my gut feelings.
- 25K rows is at this point in time not really a problem. Coda is not a database (it is a doc with tables), and upon opening a doc it loads the complete doc. Therefore, as your tables (and/or your doc) grow, you will get performance issues, even if it is just because of the longer loading times.
- Performance issues are experienced when inserting new rows into (very) large tables, in particular when calculations are connected to these tables (like column totals, filtering, etc.). Whether that feels like an issue or not depends on the doc/application you are building: adding a couple of rows per minute or a couple of rows per day may result in a completely different experience.
- At some point, you will have to look at the debug calculations in your doc map and see where the bottlenecks are. Simple optimizations can do wonders.
- I think that splitting your doc and accessing the parts through yet another doc will complicate things, will result in redundancy (like duplicated base tables) and will not give you better performance.
- Coda can connect to external databases (which can grow to an almost indefinite number of rows), but due to the limits of the API it would not be my choice; the rough sketch after this list gives an idea of what working through the API looks like.
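To give a feel for what those API limits mean in practice, here is a minimal sketch (mine, not a recommended setup) of paging all rows out of a single table through the public Coda REST API (coda.io/apis/v1) in Python. The token, doc id and table id are hypothetical placeholders; the takeaway is that every page of rows costs another rate-limited HTTP round trip.

```python
# Minimal sketch: paging rows out of one Coda table via the public REST API.
# API_TOKEN, DOC_ID and TABLE_ID are hypothetical placeholders.
import requests

API_TOKEN = "your-api-token"      # personal API token from your Coda account settings
DOC_ID = "your-doc-id"            # hypothetical doc id
TABLE_ID = "grid-xxxxxxxxxx"      # hypothetical table id (or table name)

BASE = "https://coda.io/apis/v1"
HEADERS = {"Authorization": f"Bearer {API_TOKEN}"}


def fetch_all_rows():
    """Fetch every row of the table, one page per request."""
    url = f"{BASE}/docs/{DOC_ID}/tables/{TABLE_ID}/rows"
    params = {"limit": 200}       # requested page size; the API enforces its own maximum
    rows = []
    while True:
        resp = requests.get(url, headers=HEADERS, params=params)
        resp.raise_for_status()
        data = resp.json()
        rows.extend(data["items"])
        next_token = data.get("nextPageToken")
        if not next_token:
            break
        params = {"pageToken": next_token}  # fetch the next page
    return rows


if __name__ == "__main__":
    print(f"Fetched {len(fetch_all_rows())} rows")
```

At 25K+ rows and a couple of hundred rows per page, that loop is already on the order of a hundred sequential requests per full sync, and the same cost applies in the other direction when pushing rows in. That is why I would rather optimize inside one doc than sync everything in and out through the API.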
My guess is that your best option is to look critically at the structure of your data, formulas and relations and see if things can be optimized. If you know that you will end up with far more rows (say 100K) in a complex structure, I think Coda is (at this point in time) not the ideal choice.
The screenshot below shows the statistics for one of my larger docs. It is used by our team and still works fast (typically running on 10 or more devices, processing a couple of hundred (trans)actions per day). Loading time and activating formulas (like user()) after a fresh start is about 15 seconds (but not on iOS mobile devices - a known issue). We have had to implement optimizations in formulas, but that is to be expected.
I hope this helps,
Greetings, Joost