Document Size & Speed

At what point do you see documents getting too big? Is it rows, tables, media, formulas, etc.? At what point do I need to start thinking about archiving or deleting rows in order to prevent speed issues? We are trying to understand whether Coda can scale, so we don't have to worry about it breaking at a later date with no plan in place to mitigate the risks.
Coda allows flexibility that a lot of traditional software can't match, but we want to ensure it is scalable as we continue to grow.

Would you be able to share how you plan to use the platform? It all heavily depends on how you plan to structure your doc(s).

You can have an unoptimized table with 5K rows and many, many formulas and end up with an extremely slow doc, but you can also have a doc with over 100K rows that loads super fast!

There are going to be two main use cases, in different documents.

One is used as a logistics/delivery scheduling tool. This has delivery dates, images, tracking time, etc.

Here are the current statistics. I don't see it getting much bigger than this if I can figure out a way to archive, ideally automatically.

[image: current doc statistics]

The second table is going to be used as a manufacturing scheduling tool. This one is a little smaller, but essentially has due dates for specific tasks.
[image: manufacturing scheduling doc statistics]

Archiving, and an efficient way to scale, is the biggest factor. I don't see these doc sizes getting too much bigger, and thus far speed hasn't been an issue.

Ah, those seem like pretty light docs! You should be fine, and there are lots of solutions to archiving. Scott Weir has a nice archiving tutorial and there are other packs and CSV exports you could work with.
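To give a rough idea of what automatic archiving can look like: one common pattern is a button that moves old rows into a separate Archive table, triggered on a schedule by an automation. A minimal sketch, assuming a table named `Deliveries` with a `Delivered` date column and an `Archive` table with matching columns (all names here are placeholders; extend the column pairs to cover your real schema):

```
Deliveries
  .Filter(Delivered < Today() - 90)
  .FormulaMap(
    RunActions(
      AddRow(Archive,
        Archive.Order, CurrentValue.Order,
        Archive.Delivered, CurrentValue.Delivered
      ),
      DeleteRows(CurrentValue)
    )
  )
```

A time-based automation rule can then press this button nightly, so rows older than the cutoff are archived without anyone touching the doc.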

If you want to test performance out, you can duplicate your current docs and create a button to duplicate all rows in batches of 10,000. This will help you simulate how a really large doc would behave!
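The button's action formula could look something like this sketch, which adds 10,000 dummy rows per press (assuming a table named `Deliveries`; swap in your own table and column names):

```
Sequence(1, 10000)
  .FormulaMap(
    AddRow(Deliveries,
      Deliveries.Order, "Test order " + CurrentValue,
      Deliveries.Delivered, Today()
    )
  )
```

Press it a few times in the duplicated doc, then scroll, filter, and open row details to see how the doc feels at 50K+ rows.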

What would that button look like? I haven't tried that before, but I can definitely do so.