With the example you suggested I can keep track of historical costs (or, in my case, hourly rates). That’s good, but I would also like, for example, to get the total cost of a user’s activities after some months.
To get an accurate cost, I need every activity to keep the hourly rate that was in effect at the moment it was created.
The only way I’ve found is to rename the user (for example, mark_old) when the hourly rate changes, re-create the user (mark), and start using the new one from then on…
…yes, thanks, I already summarize and filter costs with external tables; but regarding historical data preservation…do you think that renaming the user when the hourly rate changes, as I did in my last example, could be the right pattern?
As mentioned before, you need to move the records to an external table (using buttons) so that the reference to the hourly rate is disconnected.
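A rough sketch of what such a button could do, assuming hypothetical Tasks and Archive tables, a User lookup column, and a [Hourly rate] column on the users table (all names here are made up for illustration):

```
RunActions(
  AddRow(Archive,
    Archive.Task, thisRow.Task,
    Archive.User, thisRow.User,
    Archive.Date, thisRow.Date,
    Archive.Rate, thisRow.User.[Hourly rate]
  ),
  DeleteRows(thisRow)
)
```

The AddRow step writes the current hourly rate into Archive as a plain number, and DeleteRows removes the original row, so later rate changes can no longer affect the archived record.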
1. Don’t use a formula column to look up the cost for each task; instead, use an input column and somehow populate it with the client’s current rate (e.g. make a select control to choose the current client and a dedicated button to create a new task, which fills in the rate from the selected client; a button sketch follows this list).
2. Use three columns: a formula column for the lookup, an input column for manual overrides, and a third one for the resulting rate (e.g. [Custom cost].IfBlank([Regular cost])). But this won’t work nicely if you want to update the rate once and have all new tasks pick it up.
3. Same as 2, but implement a mechanism that first fills all Custom cost cells with the existing rate, and only then updates the rate with the new value.
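For approach 1, the button action could look roughly like this. It’s only a sketch: [Selected client] is assumed to be a select control, and Tasks, Client, Rate, and [Hourly rate] are hypothetical table/column names:

```
AddRow(Tasks,
  Tasks.Client, [Selected client],
  Tasks.Rate, [Selected client].[Hourly rate],
  Tasks.Date, Today()
)
```

Because Rate is an input column, the value copied at creation time stays put even if the client’s [Hourly rate] changes later.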
All three approaches preserve the historical perspective, but this shows that to extract useful analytics from any data/transaction model, we need to think about data the way we think about geological time. Indeed, some data falls into the Cambrian period, and nothing we do today should be allowed to rewrite that history.
As such, this demonstrates some cases where we should carefully plan the flow of information from day-to-day operational activities to archival resources.
There are two key drivers: (i) analytics, and (ii) provenance. You want to understand today and the future by reflecting on the past, and you may need to prove, from time to time, what happened.
@Paul_Danyliuk takes a logical approach, whereas @Jean_Pierre_Traets's solution is to separate historical data physically from operational data. I have a hard time resisting either approach, and my schizophrenic Coda style is clear evidence that I am on the fence.
This is an admirable problem for which I have no clear advice.
…thank you all for your suggestions…I studied all your approaches and you put me on the right path.
I ended up using the formula you see in the following example. The operator just selects his name, and the formula compares the activity date with all the dates where an hourly rate change occurred for that user and picks the most recent one.
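Roughly, the idea is something like this (table and column names here are just placeholders: a [Rate changes] table that logs User, [Effective date], and [Hourly rate], plus Operator and [Activity date] columns on the activities table):

```
[Rate changes]
  .Filter(And(User = thisRow.Operator, [Effective date] <= thisRow.[Activity date]))
  .Sort(True, [Rate changes].[Effective date])
  .Last()
  .[Hourly rate]
```

It filters the rate-change log to the rows for that operator dated on or before the activity, sorts them by effective date, and takes the rate from the most recent one.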
@Bill_French …I think this is both the good and the bad of a completely dynamic table, which is quite different from an RDBMS or NoSQL table/collection.
So I ended up using a formula…just another dynamic action
Indeed, and isn’t it wonderful that in Coda we have so many choices? Love this platform.
To put a finer point on my comment, I don’t necessarily regard dynamic tables as good or bad in either an operational or archival context; they can be used for both static and dynamically changing information, and it’s nice to have a consistent canvas for building clever solutions. Like a good all-purpose cleaner.
And I am certainly guilty of racing into projects without any regard for the lifecycle of data. It increasingly seems that for every solution, there is a sort of “archive debt” that must be repaid [eventually]. As in life, we are free to finance these debts or race a little slower and not accumulate debt so quickly.
The test to determine whether a solution will accrue interest on this debt is pretty simple:
Must the data used to address operational tasks be preserved to meet (i) analytics requirements, or (ii) provenance requirements?
An affirmative answer to either question seems to suggest that there are interest charges, and that the data lifecycle must be attended to in some way.