For those who have not seen the latest news, OpenAI now lets us build our own custom GPTs directly at chat.openai.com.
The idea is that we can set up an assistant capable of doing many things, like internet search and function calling.
We are able to add actions to our own GPT, like actions to get information from a website. Notice the screenshot below: they have a weather report example where they provide a schema so that the GPT understands how to call the GetCurrentWeather function when needed.
They also allow us to set up our own authentication.
Coda Webhook Automations
I noticed Coda automations follow the same structure: they provide webhook URLs and accept API tokens to authenticate POST requests with JSON payloads sent to our Coda documents.
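To make that concrete, here is a minimal sketch (untested, using only the standard library) of the POST request the GPT Action would effectively be making. The doc ID, rule ID, and token are placeholders, and the payload shape is just an illustration, not something Coda requires:

```python
# Sketch: POST a JSON payload to a Coda webhook-triggered automation.
# The URL and token below are placeholders, not real values.
import json
import urllib.request

CODA_WEBHOOK_URL = "https://coda.io/apis/v1/docs/DOC_ID/hooks/automation/RULE_ID"
CODA_API_TOKEN = "YOUR_CODA_API_TOKEN"  # generated in Coda account settings

def build_webhook_request(payload: dict) -> urllib.request.Request:
    """Build an authenticated POST request for a Coda webhook automation."""
    return urllib.request.Request(
        CODA_WEBHOOK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {CODA_API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Example payload: a list of items from the ChatGPT discussion.
req = build_webhook_request({"items": [{"title": "Task A", "content": "notes"}]})
print(req.get_method())  # POST
# To actually send it (needs a valid URL and token):
# urllib.request.urlopen(req)
```

The webhook URL format shown is my reading of Coda's API docs for webhook-invoked automations; double-check it against your own automation's URL.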
I’m not a developer, but I tried to follow the instructions and used AI tools to figure out how to set up a schema (like in their weather example) that enables the GPT to call a POST function with some data. It has proven to be a challenging task. I was wondering whether this only works if I make a GPT plugin, or whether a plain schema can actually do it.
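For reference, here is roughly the kind of OpenAPI schema I understand the GPT Action needs, modeled on their weather example. This is untested: the doc ID, rule ID, operation name, and payload fields are all placeholders to adapt to your own webhook.

```json
{
  "openapi": "3.1.0",
  "info": { "title": "Coda Webhook", "version": "1.0.0" },
  "servers": [{ "url": "https://coda.io" }],
  "paths": {
    "/apis/v1/docs/DOC_ID/hooks/automation/RULE_ID": {
      "post": {
        "operationId": "sendItemsToCoda",
        "summary": "Send a list of items to a Coda doc via a webhook automation",
        "requestBody": {
          "required": true,
          "content": {
            "application/json": {
              "schema": {
                "type": "object",
                "properties": {
                  "items": {
                    "type": "array",
                    "items": {
                      "type": "object",
                      "properties": {
                        "title": { "type": "string" },
                        "content": { "type": "string" }
                      }
                    }
                  }
                }
              }
            }
          }
        },
        "responses": {
          "202": { "description": "Webhook accepted" }
        }
      }
    }
  }
}
```

The authentication (the Coda API token as a Bearer token) would then be configured in the GPT builder’s authentication settings rather than in the schema itself.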
If there is someone in the community who has successfully set up a GPT able to send data to a Coda document through webhooks, I’d appreciate your help.
My goal is an assistant that can discuss our projects with us and then, when we’re done talking, send a list of items paired with content from our discussion to a Coda document, all without leaving the context of chat.openai.com.
Here are some useful links on the matter:
- Function Calling documentation
- Trigger WebHook in Coda
- Playground for Assistant testing
- OpenAPI Builder (a GPT that helps make schemas)
Thank you for your interest.