From Your Lips to Coda's Ears

Voice notes and composition anywhere, anytime, from your phone.

Recently, I explained how I use Tana Capture to capture voice notes from iOS to Coda. This alchemy of tools lets me dash off a voice note and manually nudge it through a workflow so that my words appear as transcribed text in a Coda document.

There are many issues with this workflow:

  1. I need to push notes like this from Tana to Coda via a Tana-based Supertag and a designated webhook. This is extremely rigid and adds steps before the notes can be used further downstream.

  2. Voice notes are often jumbles of text that need refinement. I could use Tana AI to clean them up, or even Coda AI to shape them upon arrival. In either case, though, you rarely know the nature of the incoming notes or which prompts would work best. The context exists at the capture point (the mobile edge).
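For context, the webhook leg of that rigid workflow can be sketched as a single HTTP POST. This is a minimal, hypothetical example: the doc ID, rule ID, and token are placeholders, and it assumes Coda's webhook-invoked automation endpoint (`POST /docs/{docId}/hooks/automation/{ruleId}`) as the receiving end — check your own doc's automation settings for the real values.

```python
import json
import urllib.request

CODA_API = "https://coda.io/apis/v1"

def build_webhook_request(doc_id: str, rule_id: str, token: str, note_text: str):
    """Build the POST request that delivers one captured note to a
    webhook-triggered Coda automation rule."""
    url = f"{CODA_API}/docs/{doc_id}/hooks/automation/{rule_id}"
    body = json.dumps({"note": note_text}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Placeholder IDs and token -- substitute your own.
req = build_webhook_request("abc123", "grid-auto-xyz", "MY_TOKEN",
                            "Raw voice transcript...")
# urllib.request.urlopen(req)  # uncomment to actually fire the webhook
```

Every note arrives through the same fixed pipe regardless of its content, which is exactly the rigidity described above.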

I also remarked at the time:

I have a vision of Coda as the entirety of my note-taking/sense-making platform.

I’ve written a lot about note-taking and sense-making, wondering occasionally if there should be note-taking apps at all. PKM (personal knowledge management) is a big industry that never seems to hit the sweet spot for adoption. By and large, we are always looking for the right vehicle to end the note-taking app search.

Coda could serve as your second brain, thus ending the note-taking app madness. Connor McCormick has expressed similar visions in his post Session Somethings | An Executive Second Brain. I also wrote about it as far back as early 2020 here.

What if you could capture voice notes in Coda’s iOS app directly into any document and use AI to transcribe the notes into instantly useful texts?

Earlier today, Grammarly added Voice Composer to its feature set. As soon as I could reach for my phone, I tested it with Coda, and, not surprisingly, it works.

I have been using GrammarlyGo since beta and I’ve written about it often because it represented AI accessibility in a common framework across all your apps. I observed that it is a modern AI copilot for anyone who develops collections of words.

Every app on my iPhone and iPad, including Coda’s mobile app, has access to Grammarly’s features.

Voice Notes are Ugly

I tapped the mic button and spoke these words. The content is often not recognizable a few weeks later. While the transcription was accurate, it included a number of um’s and filler words that I typically remove in an editing pass. More specifically, it was a big ball of utterances that are difficult to follow. Voice notes are cool, but they aren’t really usable, especially by others, until you clean them up.

Grammarly transformed the text into something usable right inside my Coda document. No post-capture editing. No copying and pasting. No integration through Tana or webhooks.

In the time-tested words of the Mandalorian Creed,

“This is the way.”

But there’s more. Let your imagination run with this capability.

  • In your voice notes you can describe how to classify each note. Capturing voice notes directly to a table affords you the ability to perform all sorts of Coda AI processes on Grammarly-captured notes.
  • Notes can be shared to other documents based on their content and other key markers in the text, which are easily teased out using Coda AI.
  • You could conceivably list tasks in the voice note that Coda AI should perform upon receipt.
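The "capture directly to a table" idea above maps onto Coda's insert-rows API. A minimal sketch, assuming a notes table with hypothetical "Note" and "Category" columns — the payload shape targets Coda's `POST /docs/{docId}/tables/{tableIdOrName}/rows` endpoint, where Coda AI column formulas could then classify or summarize each arriving row:

```python
import json

def build_note_row(note_text: str, category: str) -> dict:
    """Shape one transcribed voice note as a row payload for Coda's
    insert-rows endpoint. Column names here are illustrative; match
    them to your own table's columns."""
    return {
        "rows": [
            {
                "cells": [
                    {"column": "Note", "value": note_text},
                    {"column": "Category", "value": category},
                ]
            }
        ]
    }

payload = build_note_row("Cleaned-up transcript...", "Meeting")
body = json.dumps(payload)  # POST this with an Authorization: Bearer header
```

Once notes land as rows rather than free-floating text, each of the bullets above becomes a per-row Coda AI operation.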



I think that different people use notes in different ways. For me, writing notes tends to serve one of two main purposes.

Jotting down facts to use later. These are things like grocery lists, to-do lists, luggage dimensions for different airline carriers, my daughter’s student ID number, etc. They tend to be small, discrete bits of well-defined info. I can write these things down fairly quickly—often a copy/paste from the source. These notes don’t have deep meaning. I refer back to them when doing the tasks, and then throw them away or otherwise ignore them afterwards. I’ve used a variety of systems over my lifetime, and if AI can give me a better system, yay. (But my current Coda bullet journal method works pretty well.)

Thinking. I write down notes to help me understand and synthesize information. Sometimes these notes take a long time to write, sometimes much longer than the time it took to ingest the initial information. Other times I take these notes in real-time, like when I am at orientation meetings, or in class back when I was a student. I almost never refer back to these notes. If I ever want specific info, I tend to go back to the original source—the book or handout that accompanied the lecture, or the actual thing that we were discussing. Even when I was a high school or college student, I found reviewing my notes to be pointless, so I stopped reviewing them. Every now and then I’d try to look back at my notes, because someone said that reviewing notes was a good study technique, but it was never worthwhile. On the other hand, taking those notes in the first place was vital—it really helped me comprehend the topics.

In a similar vein, I never found looking at anyone else’s notes of this type to be helpful if I had access to the original info. In fact, if I missed a class, I found looking at someone else’s notes for that class was never worth the effort—I would often be more confused or think the wrong things were important; I’d go read the textbook. (When I was in school we had physical textbooks.) Maybe high school students weren’t very good at distilling information into notes, or maybe those notes didn’t work for me because I wasn’t the one who wrote them. (Of course, when I don’t have access to the original info, I used whatever notes people were willing to share.)

I sometimes have a mix of these two approaches. I’ll use notes to think through something and come up with a few key bullet points that I want to use later. Those bullet points can also change while I am in the process of crafting them, because the act of crafting them gives me new thoughts. I’m guessing that your system is designed to use AI to bypass the note-taking aspect of my process—to jump from the transcript of the event to those key bullet points. While it sounds nifty, I have my doubts that it would work for me. I personally use that note-taking time to process the information so that I understand the key bullet points, even if I never go back and read anything but the bullet points.

On the other hand, I am a solo developer who values the time I spend rolling ideas around in my head. I am not an executive faced with information overload and a responsibility to deal with it all. If I am overwhelmed with data, I can tell my boss/client/friend that it is too much for me to process.

Processing audio information is also not my preference. I never leave myself voice memos. Even when I “think out loud”, I find the value in the thinking part which leads to my bullet points. If someone else (or AI) created bullet points from my ramblings, I would probably find them about as useful as other people’s class notes back in school.

So I have a hard time picturing your AI system fitting into my workflows.

On the other hand, this system may work better for other people. When I want a set of bullet points from someone else, I don’t care to see the messiness that went into their creation of those bullet points. If using AI gets them to those bullet points better, go for it.


Thanks, Bill - this article was really helpful and Grammarly has solved a major issue for me having to write up interview notes dictated into Otter. Some of the notes then get brought into Coda by cut and paste or PDF upload. How are you directly using Grammarly in Coda? You note that Grammarly “transformed the text into something usable right inside my Coda document.” Thanks again!


I was thinking about getting a Grammarly subscription after hearing Bill’s praise, but then I found some posts on this community that imply that Grammarly doesn’t play well with Coda. A bunch of copy/paste doesn’t appeal to me. But if it works seamlessly, especially for typos on my phone, I might give it a try.

This post is about using Grammarly in Coda on iOS - it works while editing in Coda on iPhone and iPad. But I also use it in Coda with Chrome - both the Grammarly spell and grammar checkers as well as GrammarlyGo - no issues.

Yes, in the Coda iOS app, the keyboard includes Grammarly options. When you use them, Grammarly can replace the highlighted text used for an inference. You can also use it while editing a document to capture voice, transform it, and insert the result at the Coda cursor.

It’s possible they were not installing the desktop version - perhaps?

There’s this as well - neither Grammarly desktop nor the Chrome extension works in this community, which is pretty dumb because it works fine in other communities.

Aha - I’ve only been testing Grammarly today on a MacBook Pro - still, it’s already saved me a ton of time cleaning up a dictated set of interview notes, for example. I have no problem with cut & paste and uploading, because we have to do that anyway with the copies of the original documents so users can review the original as well as the summary, when necessary. Thanks!


Even when it’s not me doing that, it bothers me. LOL. Are you sure there’s no way to streamline that process to avoid the copy/paste friction?

LOL. It’s no one’s preference actually. The data shows few people work like this because it’s too much trouble. What Grammarly has demonstrated is that maybe the trouble can be eliminated. Will it stick for everyday users? Who knows. It is sticking for me but I’m as much of an outlier as you are.

Yes, this is true in the present and was in the past. However, it will change in the near future as AI becomes your second-brain copilot. Too many advantages will be missed if your “notes” (however they may be captured and stored) are not wrapped in what ostensibly represents your knowledge base via an AI agent. The fact that you didn’t mention a single app from the note-taking/sense-making genre is a clear indication of my hypothesis:

Is one actually necessary, especially in the face of future AI-based knowledge agents that can work anywhere?

I don’t think I can reduce the friction right now. Many of the reports are created outside of Coda, usually in Word, and then brought in, or there are PDFs of actual documents that are uploaded. In fact, for us, that is one of the powers of Coda. Now we can have PDFs available in the Coda report on the device, rather than having to keep a separate folder or directory of the documents that has to be transmitted with the report for people to review. Maintaining that folder/directory was a huge problem…


Correction - I just got time to try the very simple step of dictating with Grammarly into a Coda text block. It works fabulously well! Now, if I could just find a way to make it put in paragraph separations and punctuation the way Notes does: “new paragraph” makes a break; “semi-colon” puts in a semi-colon…

I’m trying to find a way to dictate a memo from investment company presentation notes without the editing taking longer than the presentation did. Straight dictation plus ChatGPT bullet-pointing did a pretty good job - but I want it all in Coda to start with.

Thanks for this Grammarly idea, it’s been really helpful.

