Launched: AI chat, your new virtual collaborator

Thank you for sharing your feedback! I love these examples :grinning:

2 Likes

Bill, seeing this time-saving quantified is :exploding_head: . I’ll be in touch, I want to send you some Coda swag for sharing this.

I’ll also send Coda swag to anyone else who would be willing to share in the thread how much time they estimate being able to save with Coda AI chat, and how they’ll accomplish that.

2 Likes

Posted to LinkedIn as well - Bill French on LinkedIn: Some say generative AI will not amount to much if anything. This new Coda…

Ideally, we’ll be able to [someday] log Chat AI activity and harvest analytics so that we can understand exactly what a team is saving.

The approach I’m thinking of is to register a Coda webhook with the chat prompt window to capture the user’s identity, the prompt, and the outcome. By analyzing the outcome, and whether the Insert option was selected, we can better understand productive outcomes.
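A minimal sketch of what such an analytics event might look like (the endpoint URL and payload fields are my own assumptions for illustration, not a documented Coda API):

```python
import json
from urllib import request

# Hypothetical analytics endpoint; replace with your own collector.
WEBHOOK_URL = "https://example.com/hooks/ai-chat-analytics"

def build_event(user_email, prompt, outcome, inserted):
    """Package one AI chat interaction: who asked, what they asked,
    what came back, and whether the Insert option was selected."""
    return {
        "user": user_email,
        "prompt": prompt,
        "outcome": outcome,
        "inserted": inserted,  # True if the user clicked Insert
    }

def post_event(event, url=WEBHOOK_URL):
    """POST the event as JSON (fire-and-forget analytics logging)."""
    req = request.Request(
        url,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return request.urlopen(req)
```

Counting events where `inserted` is true, per user, would give exactly the "time saved" signal described above.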

I am eager to develop some use cases for effective outbound campaign management and sales reporting. If you are bringing CRM data, call insights, and communications into one table, it should be easy to quickly identify the right customers to get in front of in the next few days. You could take it a step further and use AI to write the copy for personalized call scripts or emails, and automations to trigger workflows or communications. If you build it correctly, any return communication is then brought back into the table, where you could use it to identify which campaigns are gaining traction and which team members are seeing the best results. I can’t even wrap my head around how much time is saved in such a workflow.

watch this space for AI chat across all your docs :slight_smile:

3 Likes

Will the new Coda AI integration give us access to the Code Interpreter?

Yeah, it can help. On the other hand, I find that with CFL its mistake rate goes up astronomically. Enough so that it almost doesn’t seem worth using for Coda, for me. Which is quite sad, because it can be extremely helpful in other areas.

Is the AI chat only visible to doc makers? I just tried to share our company wiki with a colleague, however the AI wasn’t available for them to use. Do they have to be logged in?

What permissions need to be available for a doc? I have read-only access to a doc in our workspace and I do not see the Coda chat. I do see Coda chat in other docs where I have edit access.

Got access today :slightly_smiling_face: And it is wonderful. So far in my very brief testing, it seems really good with conversation revolving around semi-structured document-style text, but not so great with structured table data and crafting formulas. To be expected with LLMs, I suppose.

1 Like

That is amazing!! Thank you :slight_smile:

As many of us have experienced, generative AI often leaves us wishing for better performance. However, we need to keep a grounded perspective about AI. Here are my thoughts.

It’s Beta

Let’s not forget that Coda released this as a “beta” version which they don’t often do. In this case, the advantages of getting it early are great for what it does well, and good for Coda to get early feedback.

Structured Data

LLMs are not very good with data tables, for many reasons. Bear in mind that a data table larger than the model’s context window (roughly 8k tokens for some GPT models) is problematic. But more specifically, table data tends to be mostly numbers and less text, and LLMs are notoriously poor mathematicians. Why they are so terrible at math is understood by knowing a simple fact: “12” is a single token, whereas “1245” is actually two tokens, “12” and “45”. LLMs are good at predicting the next token - given “12”, a likely next guess is “123”. Imagine doing math with that mental model.
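To make the fragmentation point concrete, here is a toy greedy longest-match tokenizer. The vocabulary is invented for illustration; real models learn a BPE vocabulary from data, and actual splits vary by model:

```python
# Toy vocabulary: "12" and "45" happen to be merged pieces, so a
# four-digit number fragments instead of staying one atomic value.
TOY_VOCAB = {"12", "45", "0", "1", "2", "3", "4", "5", "6", "7", "8", "9"}

def toy_tokenize(text, vocab=TOY_VOCAB, max_len=3):
    """Greedy longest-match split, mimicking how multi-digit numbers
    break into several tokens rather than one number the model can
    do arithmetic on."""
    tokens = []
    i = 0
    while i < len(text):
        for size in range(max_len, 0, -1):  # try the longest piece first
            piece = text[i:i + size]
            if len(piece) == size and piece in vocab:
                tokens.append(piece)
                i += size
                break
        else:
            raise ValueError(f"no token for {text[i]!r}")
    return tokens
```

With this toy vocabulary, `toy_tokenize("1245")` yields `["12", "45"]` - exactly the split described above - so "adding" 1245 to anything means juggling two unrelated symbols.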

Coda AI will do better with structured data in the future, but a lot has to happen at OpenAI and at Coda. In the meantime, you can improve inferences with some data techniques, but not easily in a chat conversation.

Formulas

I’ve often observed that LLMs have little knowledge of CFL, and of almost every other no-code platform’s formula language. Airtable and many other systems struggle to use AI to generate formulas, and the cause is simple: there are almost zero instances of formula content that an LLM could learn from. There are no public GitHub projects about CFL. There are millions of public projects about Python.

Until Coda creates a fine-tuned model that fully understands CFL, it could be a while before you can rely on AI to build that perfect formula. In the meantime, you can craft few-shot learning prompts that perform pretty well; however, this is not practical in conversational AI.
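A few-shot prompt along these lines might look like the sketch below. The exemplar formulas and the template are illustrative assumptions on my part, not an official Coda recipe - verify any formula against the CFL docs before relying on it:

```python
# Worked question/answer pairs teach the model to imitate CFL syntax
# instead of guessing from Excel or Python. These exemplars are
# illustrative only.
EXAMPLES = [
    ("Count rows in the Tasks table", "Tasks.Count()"),
    ("Filter Tasks to incomplete rows", "Tasks.Filter(Done = false)"),
]

def build_fewshot_prompt(request_text, examples=EXAMPLES):
    """Prepend exemplars, then leave the final answer slot open for
    the model to complete."""
    lines = ["You write Coda Formula Language (CFL). Examples:"]
    for question, formula in examples:
        lines.append(f"Q: {question}")
        lines.append(f"A: {formula}")
    lines.append(f"Q: {request_text}")
    lines.append("A:")
    return "\n".join(lines)
```

The catch, as noted above, is that stuffing exemplars into every turn of a chat conversation is clumsy - which is why a fine-tuned model is the better long-term answer.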

4 Likes

Considering that Coda Chat can read/access data within a doc/page, and that it allows you to modify that data (through Insert/Replace), it does make some sense to me that it wouldn’t be available for view-only/comment-only access to a doc :thinking:

I mean, a maker somewhere took the time to give you, personally/privately, only an uneditable way to see a doc for some reason(s)…
I would expect Coda Chat to respect the choice made by the maker of the doc, just for privacy concerns :blush:.

In other words, I can only guess that the required level of access to be able to interact with Coda Chat/Coda AI within a doc would be Edit :innocent: .
(Maybe this will change though :woman_shrugging: … As Coda Chat/AI is improved in the future)

1 Like

One of the use cases is to have Coda AI tell you what page a particular bit of information is on. That use case only needs read-only access. If you have a long doc with lots of subpages of text, this could be very helpful. It is better than a search, because search requires you to know the exact terms.

Say you have a company handbook and you want to find out the policy for vacation days. But the official policy is for personal days off and doesn’t use the word vacation. Your search for “vacation” wouldn’t turn up anything, but I expect AI could quickly direct you to the correct page.

3 Likes

Yep. It’s why I keep shouting about embeddings. :slight_smile:

Similarity search is powerful, but not as powerful as mimicked reasoning. LLMs can’t reason, but they can surmise. Imagine, instead of a search, you describe your situation to the HR AI, and it helps you navigate to the right people, resources, and assistance.
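For the curious, similarity search over embeddings usually boils down to comparing vectors, e.g. with cosine similarity. A pure-Python sketch (real systems use an embedding model plus a vector index, not hand-rolled loops):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors: 1.0 means
    same direction (similar meaning), near 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def most_similar(query_vec, doc_vecs):
    """Index of the stored chunk vector closest to the query vector."""
    scores = [cosine_similarity(query_vec, v) for v in doc_vecs]
    return max(range(len(scores)), key=scores.__getitem__)
```

This is how a query about “vacation” can land on a “personal days off” page: the two phrases embed close together even though they share no keywords.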

HR copilots are quickly becoming a thing. But copilots are being developed for many functional business segments from learning to knowledge management.

1 Like

Maybe eventually, Coda AI will offer an option for calculating embeddings for a doc. (Or maybe they already do this.) There would need to be some way of identifying when embeddings should be recalculated and restrictions on how frequently they are recalculated. Maybe a manual button push, or maybe when a certain % of the text has changed?

1 Like

The very nature of embedding vectors is that they are constant unless the content itself is modified, and that’s the event trigger you would use to recalculate.

Yeah, but if the whole doc is the content, the content changes as soon as someone types a new character. And I don’t think that pressing a single key is the right trigger. But I’m probably missing something here. I feel like I’ve ventured into waters that are too deep for me right now.

1 Like

Vector embeddings are based on chunks of a large document, because you really want discrete, findable content. Since Coda keeps a diff between versions, it knows which chunks have changed, been added, or been removed.
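That chunk-level strategy can be sketched as follows. Chunking by paragraph and using a content hash to detect change are my assumptions about one reasonable implementation, not Coda’s actual internals:

```python
import hashlib

def chunk_doc(text):
    """Split a doc into paragraph-sized chunks - the unit of embedding."""
    return [p.strip() for p in text.split("\n\n") if p.strip()]

def chunk_key(chunk):
    """Stable content hash; unchanged text keeps its hash, so its
    cached embedding never needs recomputing."""
    return hashlib.sha256(chunk.encode("utf-8")).hexdigest()

def chunks_to_embed(new_text, embedding_cache):
    """Return only the chunks whose hash is absent from the cache,
    i.e. the added or edited paragraphs."""
    return [c for c in chunk_doc(new_text)
            if chunk_key(c) not in embedding_cache]
```

On each save, only `chunks_to_embed(...)` goes to the embedding model, so a single keystroke re-embeds at most one paragraph rather than the whole doc.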