I imported my contacts from across different “funnels” (e.g. LinkedIn, my personal contact list, etc.) to build a CRM in Coda. The problem is that once I hit anything over 70-100 rows, Coda is basically unusable. This prevents me from using Coda as a CRM (one of the advertised use cases).
I’d love to see performance prioritized in the next few dev sprints!
@419 sorry that you’re experiencing performance issues! Can you share a bit more about the structure of the doc you’re trying to build (number of tables, rows per table, lookups and calcs, etc.)? Performance issues at fewer than 100 rows seem unusual - we have plenty of docs in the high thousands of rows that are performant, though it does depend on a number of factors (for example, whether you have multiple joins / lookups traversing data sets every time a table is updated). Happy to dig into this!
Started noticing some slowdown after an import of ~97 rows. Decided to go for broke and add another 400 or so, then added another 2,000. After the 97 I noticed some hitching, but the 400 threw it over the top. Not too much of a difference between performance at 400 rows and performance at 2,000 rows.
I can try to recreate it if that’d be helpful. It’s been a few weeks since I last accessed the Doc. @evan
That would be great @419 - is this just a straight import into a new table, or is it a table with a bunch of calcs / lookups? If you can share the doc you’re testing with, I can run some checks on our side to see where the biggest performance burden is.
@419 I find it’s most effective to break calculations out into their component steps (calculating iteratively across columns) and then hide the columns whose results you don’t need. Our document has several ~400-row tables with a few joins and multiple collaborators reading/writing, and in the worst case we see a beat of delay before updates propagate. That usually happens when I have other programs open that eat up RAM.
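For example (made-up table and column names, just to show the shape): instead of one big column formula like

    Deals.Filter(Contact = thisRow).Filter(Status = "Won").Amount.Sum()

I’d split it into a hidden helper column plus a small final step:

    [My Deals] = Deals.Filter(Contact = thisRow)
    [Won Total] = thisRow.[My Deals].Filter(Status = "Won").Amount.Sum()

Then I hide [My Deals] so only the total shows.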
Are you connecting across a lot of tables? I find lookups tend to be the slowest (which makes sense!).
I ran into a similar problem, though mine is not (yet) with the number of rows but with formulas. After creating a good number of formulas using all sorts of capabilities in Coda, I started testing, and a button that triggers another 15 buttons can take up to 10 minutes to delete and then add 15 rows. I am quite afraid that all the time invested in developing the file would be wasted if it performs like this.
After a couple of hours of testing I realised that the things that slow things down the most are:
Conditional formatting
Formulas that include data from more than 1 table (example at the end of this post)
Cells that have an error (even when the “error” is just an empty cell)
Lookups
In that order.
Removing the first 3 changes the behaviour considerably, but it also makes the tables much less functional.
I don’t think that the local machine has much to do with the computing.
I hope that there is a way to improve that other than removing functionality.
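To make 2 and 4 concrete (the table and column names below are made up, not from my actual doc): a column formula that pulls from another table, like

    Invoices.Filter(Client = thisRow).Total.Sum()

or a formula that chains through a Lookup column, like

    thisRow.Client.[Account Owner]

seems to get re-evaluated whenever the other table changes, and that is where most of the waiting appears to happen.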
Same here. I spent about 3 days developing a complex doc that operates as a CRM, but my buttons lock the doc up for 5-10 minutes. Love what the docs are capable of, but the performance is making it a hard sell to others on my team who are considering this as an alternative to Google Docs.
Totally agree! I was hoping that the doc I created (which took a lot more than 3 days) would be something to introduce to the team, but I’m quite afraid that once a few users start clicking buttons, it will not only take forever but may also mess up the actual data because of the latency. Hopefully things improve considerably so I can actually start using what I’ve created.
A further observation:
I had a complex formula in one of the columns. After placing the same formula in the column’s “Value for new rows” setting instead, performance improved quite a lot. This works perfectly well for me, but only for this particular column and situation, so it isn’t a real solution; I’m sharing it in case it’s useful for the Codans.
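To spell out what I did (table and column names are made up; the exact formula doesn’t matter): the column originally had a live formula along the lines of

    Deals.Filter(Owner = thisRow).Amount.Sum()

I removed the column formula and put the same expression into the column’s “Value for new rows” option instead. As far as I can tell, it is then evaluated once when a row is created and stored as a plain value, rather than being recalculated every time something in the doc changes. The trade-off is that existing rows don’t update when the underlying data changes, which is why it only works for this one column in my doc.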
Hey @Justin_Rosales, can you ping me on Intercom? We’re currently working on improving the performance of Coda docs and have helped some customers speed up their docs significantly. Based on your description, your doc should be running smoothly, and the culprit might be an unoptimized formula that we can help you find and fix.