I was patient through numerous prompts from Chrome asking if I wanted to “Wait” due to an unresponsive site… and ultimately I was able to upload two large tables of data from Google Sheets, one about 9,900 records (3.9 MB) and the other about 32,000 records (12.6 MB).
I found that the wait time for a filter formula search on the larger table was just barely workable, though there may be some background indexing, as performance improved as the day went on.
But most notably, inserting a row hyperlink via the “@” feature on the canvas has slowed so much that it is entirely unusable anywhere in the document. I presume this is more affected than a filter formula search because it searches through all tables in the document.
I do realize these are pretty large tables. But that said, coda.io is, of course, a database! Files of this size are not an issue for Excel, Google Sheets, or Airtable (for paid customers).
Is the plan for coda.io to be able to handle tables of this size, or is the intent for the “database” feature to be for limited personal databases only?
If it can support larger databases, the usefulness goes up quite considerably.
******* Update – as of Monday morning, all of the searches in the document are effectively unusable: even after waiting 5-10 minutes they time out or fail to respond. Perhaps performance varies with server load. In any event, at present Coda simply does not scale to any workable degree with moderate “database”-sized tables.
I realize that in the live release there may need to be a paid tier for larger database/table sizes - that is reasonable. But the system really will not be usable, even for free, without some way to handle larger tables. What a disappointment to have such a capable database become unusable when “too much” data is added.