Yeah - 10,000 rows is a lot, and whenever I approach that number I start to rethink the data structure or even the tool.
But depending on your data structure, number of columns, and what you need to do with the data specifically, I’ve developed ways to store well into the 50,000-row range.
Been meaning to post the solution on the community - maybe this will get me to do it.
Can you tell me more about your context, use case, doc design, and what you need to do with your data?
Alternatively - come to my free GetUnstuck training and I’d be happy to take a look at your doc! Only 2 days away!
Hi @Anthony_Thong_Do, there is a difference between creating 10K rows in one go and the daily usage of such a table. As @Scott_Collier-Weir indicated, a lot is possible if your data architecture is well designed. I try to stay away from these heavy tables by splitting them into sleeping and active parts: the 2019 data you might need only once in a while, while the 2021 & 2022 data could be of interest in most of your manipulations.
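The sleeping/active split above can be sketched in plain Python, with tuples standing in for table rows (the client names, years, and values here are made up for illustration; a real Coda doc would do this with filtered views or buttons):

```python
# A minimal sketch of the active/sleeping split, assuming rows of
# (client, year, value) — all hypothetical data.
rows = [
    ("Acme", 2019, 120),
    ("Acme", 2021, 95),
    ("Acme", 2022, 110),
    ("Bolt", 2019, 80),
    ("Bolt", 2022, 60),
]

CUTOFF_YEAR = 2021  # years before this move to the "sleeping" archive

active = [r for r in rows if r[1] >= CUTOFF_YEAR]   # touched daily
sleeping = [r for r in rows if r[1] < CUTOFF_YEAR]  # consulted once in a while

print(len(active), len(sleeping))  # 3 2
```

The point is that daily manipulations only ever scan the small active part; the sleeping part exists but stays out of the way.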
I guess @Scott_Collier-Weir can tell you more about it, but one option is to store all of a client's yearly values in one row and unpack them only when you need to work for that client. If each client has 4 years of data, you reduce your rows from 10,000 to 2,500. Of course, you then have to push (via a button) the data into a single cell in a way you can easily retrieve afterwards, or you split it over 4 tables (one per year) and bring them together in a new table once you work for a specific client.
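A rough sketch of the pack/unpack idea, using JSON strings as the "single cell" (an assumption for illustration; in Coda the packing would be done with a button and formulas rather than Python):

```python
import json

# Hypothetical per-year rows for one client.
rows = [
    ("Acme", 2019, 120),
    ("Acme", 2020, 100),
    ("Acme", 2021, 95),
    ("Acme", 2022, 110),
]

# Pack: collapse the yearly rows into one row per client, with the
# values serialized into a single "cell" (here a JSON string).
packed = {}
for client, year, value in rows:
    packed.setdefault(client, {})[year] = value
cells = {client: json.dumps(years) for client, years in packed.items()}

# One row per client instead of one per client-year: 4 rows became 1.
print(len(cells))  # 1

# Unpack only when you actually work for that client.
acme = json.loads(cells["Acme"])
print(acme["2021"])  # 95  (JSON keys come back as strings)
```

The trade-off is exactly as described above: you save rows, but you need a reliable, reversible way to get the values back out of the cell.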
In general I would say I would follow the same approach you do: it is row based, not column based.