When I modify a large worksheet, Coda either crashes or takes forever to load. The page usually times out, which is quite frustrating.
For example, one sheet has approx. 10,000 rows and 12-15 columns, with almost no calculations (maybe one column has a mathematical formula). My internet speed is 70 Mbps.
I know it’s just me because Coda is too amazing to let this be a problem for everyone.
Hi @MK_2109, once you get to 10,000 rows of data we do indeed see some performance issues with docs. This is something our team has been actively working on! In the meantime, you can speed up your doc by hiding columns you don’t normally look at or by setting up filters on your big table. Most people put their master table in an “admin” folder of their doc and build heavily filtered views, so that each person only sees the rows that matter to them (see the sketch below). Let me know if these strategies speed up your doc!
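For instance, a view’s filter formula might look something like this. This is just a minimal sketch, and the Status and Owner columns are hypothetical:

```
AND(Status = "Open", Owner = User())
```

With a filter like that on each view, Coda only has to render the handful of rows that match, rather than the full 10,000-row master table.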
Hey, thanks for writing in. We’ve been making some improvements to performance and I was digging through old posts. How are things going with your doc? Based on your description, I should be able to help you get a doc of that size to run very smoothly. Please feel free to reach out on Intercom!
Thanks for the request, Jack. We’re working on something very exciting to help with just that, and hope to have an update in a week or two. Watch this space!
Meanwhile, if you can share your doc with us, we’d be happy to take a look and help you figure out how to make things faster. Please ping us on Intercom.
Hey @Pedro_Miguel_Rocha_Silveira, you shouldn’t really see any issues from having too many sections. It only becomes a problem if individual sections are very big (which we are working on fixing), or if having many sections leads you to create many different views of the same table.
Thanks @Angad, this is the single biggest thing that will sway me to Coda. Unfortunately, it wasn’t clear beforehand whether you had fully acknowledged how much of a limiting factor this was, but it’s clearly on your radar now.
Looking forward to pushing ahead with some new docs and developments.
Thank you. Search speed is definitely on our list, although admittedly it’s not at the top. We hope to get to it soon but unfortunately can’t promise a timeline.
I am working on building a system that will likely have 10-15K rows in the master database, along with several other smaller tables. I need access to all of these rows to run reports for clients, so deleting rows isn’t really an option. I will then likely need to create 50-100 “client hub” docs that aggregate just the rows relevant to each client.
I created an initial template that has 10K rows, and it is a big doc. I can filter the main table since it will not be looked at as a whole, but do you think that once I start building out all the client hubs, each of which will require its own view, I will break the doc and essentially make it unusable? I just want to get my approach right from the start. I have some pretty big docs, but this one will be the biggest I’ve made.
I was also wondering whether you think using your Google Scripts option would be better than cross-docs, given the number of views I will have to make.
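For anyone weighing the Apps Script route: below is a rough sketch of pulling one client’s rows from a master table via the Coda REST API from Google Apps Script. The doc ID, table name, “Client” column, and token are all placeholders for illustration; this is one way to do it under those assumptions, not an official recommendation.

```javascript
// Rough sketch (hypothetical IDs and column names): fetch one client's
// rows from a Coda master table using the Coda REST API from Apps Script.
var CODA_TOKEN = 'YOUR_API_TOKEN'; // generated in your Coda account settings
var DOC_ID = 'YOUR_DOC_ID';        // from the doc's URL
var TABLE = 'Master Table';        // table name or ID

function fetchClientRows(clientName) {
  // Filter server-side with the API's `query` parameter so we never
  // download all 10-15K rows; column names used in `query` must be quoted.
  var url = 'https://coda.io/apis/v1/docs/' + DOC_ID +
            '/tables/' + encodeURIComponent(TABLE) + '/rows' +
            '?useColumnNames=true' +
            '&query=' + encodeURIComponent('"Client":"' + clientName + '"');

  var response = UrlFetchApp.fetch(url, {
    headers: { Authorization: 'Bearer ' + CODA_TOKEN }
  });
  var data = JSON.parse(response.getContentText());

  // data.items is the array of matching rows; if data.nextPageLink is
  // set, there are more pages to fetch before writing to a client hub.
  Logger.log('%s rows found for %s', data.items.length, clientName);
  return data.items;
}
```

Cross-doc keeps everything inside Coda, while the API route gives you finer control over exactly which rows land in each client hub; either way, filtering server-side avoids loading the whole master table into every doc.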