[Pack Alpha] Doc as API: Turn any doc into a mini server

Hiya!

Here’s a little thing I’m building: a pack/service that turns your doc into a REST API server.


With this, you’ll be able not only to receive requests through the webhook mechanism but also to reply with arbitrary responses. To the API client this looks like a single HTTP request/response cycle, and my server takes care of holding the connection until Coda queues, runs, and finally responds to the request (basically turning asynchronous calls into synchronous ones).

So yeah, no black magic: just a Coda webhook automation, a Pack action to send the response back, and my code in the middle connecting the two. But what possibilities!
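To make that concrete, here’s roughly the shape of the middle layer. This is a simplified sketch, not my actual code; the endpoint paths, environment variables, and the in-memory map are all stand-ins:

```typescript
// Sketch of the "code in the middle" (Express, Node 18+ so fetch is global).
import express from "express";
import { randomUUID } from "crypto";

const app = express();
app.use(express.json());

// Pending client requests, keyed by a correlation id and resolved when the
// Pack action calls back. (Assumes both requests land on the same instance;
// the real service does something more robust.)
const pending = new Map<string, (body: unknown) => void>();

// 1. Client hits the API: forward the request to the doc's webhook automation
//    and hold the connection until the doc answers.
app.all("/api/:docId/*", async (req, res) => {
  const requestId = randomUUID();
  const reply = new Promise<unknown>((resolve) => pending.set(requestId, resolve));

  // Trigger the Coda webhook automation, passing the request along.
  // CODA_WEBHOOK_URL and CODA_API_TOKEN are placeholders.
  await fetch(process.env.CODA_WEBHOOK_URL!, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.CODA_API_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ requestId, path: req.path, method: req.method, body: req.body }),
  });

  // The connection stays open here until step 2 resolves the promise.
  res.json(await reply);
});

// 2. The Pack action posts the doc's answer here; that releases the waiting client.
app.post("/respond/:requestId", (req, res) => {
  pending.get(req.params.requestId)?.(req.body);
  pending.delete(req.params.requestId);
  res.sendStatus(200);
});

// (On Cloud Functions you'd export the app instead of listening on a port.)
app.listen(8080);
```

The key bit is that the client’s connection stays open on the `await reply` line until the Pack action hits `/respond`.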

  • If you already have all your life/business on Coda, this will help you quickly hook it up to some other service, using your doc as a database. Previously you had to either query data from the doc through the Coda API (and build a connector to decode the raw /rows data) or build a pack to push data out of Coda with an action. Now you can trigger a webhook and have it return a live (not snapshotted!) piece of data, whatever you command it to, even with filtering and security if needed (see the sketch after this list).

  • You can set up a chatbot, such as a WhatsApp bot, directly in Coda. WhatsApp delivers messages through a webhook but requires custom responses, so previously you had to build an actual middleware backend. Now you can spin up integrations like that much quicker.

  • and a lot of other uses I’ve yet to discover.
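For the first scenario, calling the doc-backed endpoint could look roughly like this. Everything here (the host, path, query params and response shape) is hypothetical; it’s whatever you implement inside the doc’s automation:

```typescript
// Hypothetical client call: host, path, query params, and response shape are
// all placeholders for whatever the doc's automation is set up to handle.
const res = await fetch("https://<your-endpoint>/api/<doc-id>/tasks?status=open", {
  headers: { Authorization: "Bearer <your-api-key>" },
});
const tasks = await res.json(); // live rows, already filtered by the doc
console.log(tasks);
```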

Of course, this is not intended for heavy use. First, it already takes a few seconds on an empty doc, and the larger the doc gets, the longer it takes to process a request (sometimes a minute!). Second, the Coda API has its own quotas. So yes, the intended use is to quickly spin up a request/response server when you only need to query the doc occasionally, never to give the impression that Coda can replace an actual performant and scalable database and/or server.


Now, I’ve only just started working on this and got my first proof of concept running. While I’m building it, I’d like to learn how many of you would be interested. This will be a premium pack: operating costs are not zero, and Google will still bill me for idle time while my server waits on responses from Coda. Maybe I won’t use maker billing but rather quote/charge separately, or maybe I will, to reduce friction. Most likely the response-sending pack will be free and the admin pack (the one used to set up the API; you can install it on, e.g., your own one-person workspace) will be the paid one. In any case, if you’re interested, please fill out the form below:

Cheers!


4 AM but making good progress on this.

Here’s a sample To-do app. The API is coded entirely in Coda, in a single automation function (whose neat formatting got broken because of a bug).

Cool stuff, Paul!

I’m curious, why not use something like Replit or Val Town for the server component?


Why use Replit or Val Town, or any other premium-priced no/low-code wrapper over the usual lambdas, if I can just code my lambdas directly?

By “server” I colloquially meant Firebase / Google Cloud Functions, which is basically Google’s equivalent of AWS Lambda.

(I hadn’t even heard of Replit or Val Town.)

P.S. I looked them up. Fun stuff; Val Town looks interesting!

That said, I don’t think wall-time billing or quotas are avoidable with these services either. You cannot receive the HTTP request in one lambda execution and respond to it in another; you must finish the req/res cycle or reset the connection. So you still have to ‘block’ (synchronize) on receiving the response from Coda, and only then release the lambda to send the response and finish running.
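Which also means the only real lever is capping how long the function is allowed to wait. A sketch of that guard, building on the pending-promise idea from my first post (`WAIT_LIMIT_MS` is a made-up number, and `work` stands in for that pending-response promise):

```typescript
// Sketch: bound the wall time spent waiting on the doc, since the function is
// billed while it idles. The limit is arbitrary; tune it to your automation.
const WAIT_LIMIT_MS = 30_000;

async function withTimeout<T>(work: Promise<T>): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error("Doc did not respond in time")), WAIT_LIMIT_MS);
  });
  try {
    return await Promise.race([work, timeout]);
  } finally {
    clearTimeout(timer);
  }
}
```

In the handler that becomes `res.json(await withTimeout(reply))`, with a 504 fallback in the catch.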


P.P.S. I mean, my service is the Val Town for Coda: the purpose of both is just to cook up some interim API that isn’t really meant to be a production-grade one. But to power a service like that, you’d rather code it with something solid. Firebase, IMO, is the closest you can get to writing a proper backend without all the DevOps hell.


I think Firebase Cloud Functions are a fine choice. Are you using Google’s new IDX as the IDE, or something else? I heard a rumor that Replit may eventually be folded into IDX. Google and Replit are very tight, apparently.

Nah, good old VS Code and deploying from the command line. I’m pretty much used to it already, tbh. I’ve been doing a lot of lambdaing with Node and Firebase lately.

I guess I’m becoming one of those “back in my day…” old folks who reject modernity, lol. I still write my code myself, no copilots. Although I should try it sometime.


Here are some practical results already 🙂

I wrote a webhook for WhatsApp. I managed to authenticate with WhatsApp directly from Coda and also start receiving incoming messages. From here, replying to messages is a piece of cake (like with a regular pack that triggers some Facebook endpoints).
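For context, WhatsApp’s webhook setup requires echoing back a verification challenge, which is exactly the kind of custom response a plain webhook can’t give. Roughly, the doc’s automation has to produce the equivalent of this (shown here in plain Node terms rather than Coda formulas; the verify token is whatever you configure in Meta’s dashboard):

```typescript
// What WhatsApp (Meta) expects during webhook verification: echo hub.challenge
// back if the verify token matches. The doc's automation has to produce the
// equivalent response; this is just the plain-Node version for reference.
import express from "express";

const app = express();

app.get("/webhook/whatsapp", (req, res) => {
  const mode = req.query["hub.mode"];
  const token = req.query["hub.verify_token"];
  const challenge = req.query["hub.challenge"];

  if (mode === "subscribe" && token === process.env.WHATSAPP_VERIFY_TOKEN) {
    res.status(200).send(String(challenge)); // plain-text echo completes the handshake
  } else {
    res.sendStatus(403);
  }
});

app.listen(8080);
```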


I’m also on the lookout for the best name for this pack. Cast your votes or suggest your own options:

  • Doc as API
  • Backend / Backendify
  • Endpoint / Endpointify
  • Req/Res
  • Webhook Response
  • HTTP App