Pro hack: Trigger cross-doc sync with an action (hidden formula)

@Paul_Danyliuk :pray: thank you for this. I was able to implement it, and after shallow testing I can report that it does function as desired: activating a cross-doc AddRow action button paired with the above pro-hack adds the row to the source table and then initiates a sync, which brings the added row back in through the sync table. As you said, I will need more testing to verify its fidelity, but early tests show it works! This is a huge step closer to two-way sync! :pray:

2 Likes

Thanks for reporting back and glad that it works (and glad to have my theory confirmed).

Please report back if at some point it fails.

Also please keep in mind that this is all undocumented functionality that may change / break at any moment. So while you can rely on it while it lasts, accept that at some point it won’t work anymore.

@Paul_Danyliuk thanks for this idea - it taught me a few things along the way and it is addressing something that has been a real problem since I started playing with crossdoc tables.

I’ve implemented your pro-hack as follows:

  • I have a button which opens a new row (add row) for editing, and it has a button column for “saving” the entry - this is how I’m trying to simulate a form entry in a modal dialog for adding some data.
  • The “save” button runs the Crossdoc::AddRow action to send the data to my remote table. I’ve wrapped it in a RunActions call so that it does that and also updates the timestamp in the Trigger table as per your solution, roughly as sketched below.
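
The combined “save” action looks roughly like this (TriggerTable and Timestamp are placeholder names for my single-row trigger table and its timestamp column; the Crossdoc::AddRow arguments are just my own column/value pairs):

RunActions(
  Crossdoc::AddRow(... my column/value pairs ...),
  ModifyRows(TriggerTable.First(), TriggerTable.Timestamp, Now())
)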

I’m finding that my data is sent across and my sync is triggered, but the sync fires before my remote data has been saved into the snapshot that gets synced.

Are there any ideas for how I can delay that sync trigger to give more time for the snapshot to be updated before the sync is triggered?

I was trying to see if I could leverage the timestamp: I would update the timestamp to Now() + Seconds(10) and then have the automation only trigger once Now() was later than or equal to the timestamp, but I haven’t found a way to make that happen.
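
Concretely, what I was trying (same placeholder names as above) was to set the trigger with

ModifyRows(TriggerTable.First(), TriggerTable.Timestamp, Now() + Seconds(10))

and then have the automation’s “if” condition check Now() >= TriggerTable.First().Timestamp.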

Oh. This means that my approach doesn’t work reliably then.

You can delay the sync trigger by wrapping your timestamp-updating ModifyRows action with a _Delay(ms), e.g.

ModifyRows(TriggerTable.First(), TriggerTable.Timestamp, Now())._Delay(10000)

for a 10 second delay.
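
Dropped into your “save” button, that would look roughly like this (keeping your placeholder names; the AddRow arguments stay whatever you already pass):

RunActions(
  Crossdoc::AddRow(... your column/value pairs ...),
  ModifyRows(TriggerTable.First(), TriggerTable.Timestamp, Now())._Delay(10000)
)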

This is not going to be much more reliable though. The time of snapshot creation is non-deterministic.

A reliable (but super-mundane) way to approach this would be to set up an automation on the receiving doc to crossdoc-update the timestamp remotely from that doc into this one :slight_smile: which would then trigger the automation in this doc to start the sync.

1 Like

I would like to test this, but I’m not seeing a way to use a Cross-doc action in an automation. Do I need to set up a layer of indirection to support that (e.g. when my receiving table is updated, update a trigger table and have an automation that pushes a button to run the Cross-doc action to update the timestamp on the remote Trigger table)?

Thanks!

Let’s say Doc A is a data source and Doc B is where you have the sync table from Doc A. I see the process like this:

  1. Doc B inserts a row into Doc A through a Cross-doc action.
  2. Doc A has an automation listening for new/updated rows in the table you just inserted a row into.
  3. When the automation triggers, Doc A updates that trigger cell in Doc B through a Cross-doc ModifyRow action.
  4. Doc B has an automation listening on that cell change, as per the trick. This triggers the startsync/synctable scenario, and the table eventually updates.

So the idea is to change who’s modifying that cell and triggering the sync. Previously it was Doc B that updated the value; now it should be Doc A, remotely, into Doc B. This should work because the automation in Doc A will only fire once the added row has actually landed in the snapshot.
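
As a rough sketch (not exact syntax - the action and its arguments are whatever the Cross-doc pack you set up against Doc B exposes, and the names here are illustrative), the button that Doc A’s automation pushes would do something like:

Crossdoc::ModifyRow(
  [Trigger table synced from Doc B].First(),
  Timestamp, Now()
)

i.e. Doc A writes the current time into Doc B’s trigger cell, and Doc B’s own automation takes it from there.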

Regarding setting this up, I personally always prefer to set up any actions in buttons, not in the automation settings directly. I think editing buttons is easier, partially because it preserves newlines.

3 Likes

Thanks for clarifying @Paul_Danyliuk.

I set this up and it worked. There’s still a bit of a delay before everything completes and the sync finishes, but it’s working more reliably now.

This is a clever approach - thanks for taking me through it and teaching me many things along the way.

That said, will this delay be a permanent barrier or are there plans to make it more “instantaneous”?

I ask because in my use case, I’m asking a user of the doc for input data, storing that in a remote doc, and then I want the user to see what they just input via the synced table. This is the equivalent of a web app presenting a modal form, the user providing some information and clicking “save”, and that information immediately showing up in a table of like information so the user has a sense of their input being “saved”.

If there’s not a more instantaneous sync solution, I may have to remove the “table of like information” so that the user just submits information and it goes off into the ether, but they’re not waiting for it to show up like a traditional web app would (by instantaneously making a call to a backend that pulls the updated data from a database, for example).

Huge thanks again for this clever pro-hack.

I wouldn’t know — I’m not working at Coda :slight_smile: But given how Coda works under the hood (which I had a chance to learn purely by reverse-engineering it myself), I doubt that any improvements to this particular use case will happen anytime soon.

And yeah, if you want to build the UX where the user immediately sees what they have submitted, you should probably build that UI around the “local” table, not the sync one. The user adds the row to that table (via a form-like layout, for example); it gets submitted, but they already see it. Then when it’s synced back (into a separate, sync table), it’s looked up into that local table and e.g. a “Saved” checkbox becomes true for that row as a confirmation. Something like that.
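
As a rough illustration (table and column names here are made up - match on whatever unique key you submit with the row), the “Saved” indicator could be a formula column on the local table along the lines of:

[Synced Table].Filter(SubmissionId = thisRow.SubmissionId).Count() > 0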

The problem with this approach though is that users can delete rows from this local table in Doc B, thinking it would also delete rows remotely on your Doc A.

(and yeah, I built a solution like that for a client a few months ago, and in the end chose to base the end user’s UI on a sync table and just warn them that the data would not appear immediately)

2 Likes

Thanks, at least I’m netting out in the same spot that you ended up in, even down to warning the users. :slight_smile:

I had the same concern about local edits to the table, and that’s why I wanted the source of truth to be the remote table (similar to a remote data store / database concept).

Thanks again for all of your insights!

1 Like

Paul what do you mean by this?

Ah, only that the line breaks in the formula (indented code formatting like I usually do) are gone in automation formulas, making them harder to comprehend and edit afterwards.

ahh ok yes! I agree. It would be nice if linebreaks would remain in code blocks here too :slight_smile:

You mean the community? You wrap a code block with three backticks before and three backticks after, on separate lines:
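
For example, one of the formulas from upthread would be posted like this (the three backticks are typed literally, each on their own line):

```
RunActions(
  Crossdoc::AddRow(...),
  ModifyRows(TriggerTable.First(), TriggerTable.Timestamp, Now())
)
```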

(also it was another reply to you :wink: )

Are there any problems you can’t solve, @Paul_Danyliuk?! :slight_smile: That little nuance has bothered me, as I like to give solutions to people and I know how difficult it is to parse/interpret new info when it reads as one hard-to-follow run-on line. Thanks again!

This isn’t a supported strategy and is strongly discouraged.

This may break at any time or your docs may have calculations turned off if they include this workaround.

2 Likes

Ben, what would be the supported way of having a Cross-doc action sync the table? This is imperative for making a UX that a doc user can intuitively understand. Many times a user doesn’t know about or have access to the table that would need to be synced after the Cross-doc action. Thanks!

There isn’t 2-way sync right now. We just don’t support it at the moment. Cross-doc is meant to pull in data on a particular cadence of hourly or daily. I’d work to design your schema around those parameters instead of trying to force a solution that can break.

I struggle with this response, as Cross-doc already feels like quite a poor solution to a fundamental weakness of Coda.

Serious users of Coda can have some fairly large datasets, and many different users need access to some or all of that data, but each set of users might operate in different silos… Imagine a large organisation that has sales teams needing access to thousands of clients and an accounts team also needing the same client data; however, they cannot share the same doc, as each has other private tables that cannot be exposed. The 10,000-row limit is already an issue, and waiting an hour for a sync is not practical for many systems.

In reality, many Coda users probably don’t want sync; they want live access to a shared table, but since they cannot have that, sync becomes a workaround.

There does not appear to be any schema that can overcome this.

2 Likes

I would say try not to read this as the only and permanent solution. Coda is a growing product, and part of what helps us grow as fast as we do is getting solutions out there, taking in feedback, iterating, and improving. After all, how can you build a great product for your customers if you don’t actually build it with your customers?

This isn’t something that can be solved overnight, but we do have our eye on things and the product will continue to improve.

Thank you for letting us know your use-case and needs.

4 Likes

I just tried this, but when the automation is triggered it gets flagged with an error that says “Syncing table from cross-doc failed due to an internal system error”. It does trigger the Cross-doc sync in my table, but then the table says “Loading Table from CrossDoc…” and never actually loads - it just gets stuck there. Any thoughts?