Increase performance when dealing with a Coda Doc that has ### rows

#1

I imported my contacts from across different “funnels” (i.e. LinkedIn, personal contact list, etc.) to build a CRM in Coda. The problem is that once I hit anything over 70-100 rows, Coda is basically unusable. This prevents me from using Coda as a CRM (one of the advertised use cases).

I’d love to see performance prioritized in the next few dev sprints!

1 Like

#2

@419 sorry that you’re experiencing performance issues! Can you share a bit more about the structure of the doc you’re trying to build (number of tables, rows per table, lookups and calcs, etc.)? Performance issues at fewer than 100 rows seem irregular; we have plenty of docs in the high thousands that are performant, though it does depend on a number of factors (for example, whether you have multiple joins/lookups traversing data sets every time a table is updated). Happy to dig into this!

0 Likes

#3

Started noticing some slowdown after an import of ~97 rows. Decided to go for broke and add another 400 or so, then added another 2,000. After the 97 I noticed some hitching, but the 400 threw it over the top. Not too much of a difference between performance at 400 rows and performance at 2,000 rows.

I can try to recreate it if that’d be helpful. It’s been a few weeks since I last accessed the Doc. @evan

0 Likes

#4

That would be great @419 - is this just a straight import into a new table, or a table with a bunch of calcs/lookups? If you can share the doc you’re testing with, I can run some checks on our side to see where the biggest performance burden is.

0 Likes

#5

@419 I find it’s most effective to break calculations out into their component steps (iteratively calculating across columns) and then hide the results you don’t need. Our document has several ~400-row tables with a few joins and multiple collaborators reading/writing, and in the worst case we see a beat of delay before updates propagate. That usually happens when I have other programs open that eat up RAM.

Are you connecting across a lot of tables? I find lookups tend to be the slowest (which makes sense!).
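
To make the “component steps” idea concrete, here’s a sketch using a hypothetical Contacts/Deals setup (the table and column names are made up, and the exact syntax may need adjusting for your doc):

```
One monolithic formula in a "Share of revenue" column
(the whole chain is re-evaluated on every relevant edit):

  Deals.Filter(Contact = thisRow).Amount.Sum() / Deals.Amount.Sum()

Split into component columns instead, hiding the intermediates:

  My Deals (hidden):   Deals.Filter(Contact = thisRow)
  My Total (hidden):   thisRow.[My Deals].Amount.Sum()
  Share of revenue:    thisRow.[My Total] / Deals.Amount.Sum()
```

The idea is that each intermediate column can be recalculated on its own, rather than re-running the whole chain every time.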

1 Like

#6

Nope, just raw data. I’m scared to add formulas to any of my rows haha

1 Like

#7

I ran into a similar problem, though mine is not (yet) with the number of rows but with formulas. After creating a good number of formulas using all sorts of capabilities in Coda, I started testing, and a button that triggers another 15 buttons can take up to 10 minutes to delete and then re-add 15 rows. I am quite afraid that all the time invested in developing the doc will be wasted if it performs like this.

After a couple of hours of testing I realised that the things that slow things down the most are:

  1. Conditional formatting
  2. Formulas that include data from more than 1 table
  3. Cells that have an error (even an empty cell that evaluates to an error)
  4. Lookups

In that order.

Removing the first three changes the behaviour considerably, but it also makes the tables much less functional.

I don’t think the local machine has much to do with the computation.

I hope that there is a way to improve that other than removing functionality.

0 Likes

#8

Same here. I spent about 3 days developing a complex doc that operates as a CRM, but my buttons lock the doc up for 5-10 minutes. Love what the docs are capable of, but the performance makes it a hard sell to others on my team who are considering this as an alternative to Google Docs.

1 Like

#9

Totally agree! I was hoping that the doc I created (I spent a lot more than 3 days) would be something to introduce to the team, but I am quite afraid that once a few users are clicking a button, it will not only take forever but will also mess up the actual data because of the latency. Hopefully something improves considerably so I can actually start using what I’ve created.

0 Likes

#10

A further observation:
I had a complex formula in one of the columns. After moving the same formula into the column’s “Value for new rows” setting instead, performance improved quite a lot. This works perfectly well for me, but only for this particular column and situation, so it is not a general solution; sharing in case it’s useful for Codans.
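
As a sketch of the difference (the table and column names here are hypothetical):

```
As a column formula, this stays live and recalculates whenever
any data it references changes:

  Deals.Filter(Contact = thisRow.Contact).Amount.Sum()

The same formula placed in the column's "Value for new rows"
setting is evaluated once when the row is created and then
stored as a plain value, so it no longer participates in
recalculation.
```

The trade-off, of course, is that the stored value goes stale if the underlying data changes later.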

1 Like