The Chatbot is Dead. Long Live the Orchestrator

While the world was distracted by talking dolls, Grammarly quietly built an AI that could act. They didn’t use a better model. They used a better weapon: Coda.

Your Prompts Are a Prayer to an Amnesiac God

Let’s be brutally honest. The entire AI industry is captivated by a lie. We’ve anointed “prompt engineering” as a mystical art, a high priesthood for coaxing wisdom from silicon gods.
It’s a sham. We are meticulously polishing the conversational skills of a machine with terminal amnesia.

We were promised an omniscient partner, an AI co-pilot. What we got was a brilliant intern with no long-term memory, an entity we must re-brief from scratch every five minutes. This isn’t productivity. It’s digital babysitting for a machine that can recite Shakespeare but struggles to remember your name or perform precise calculations.

Today’s large language models are amnesiacs by design. We celebrate their ballooning context windows—a million, two million tokens—as if it were memory. It’s not. It’s a bigger notepad. A volatile transcript that dissolves into the ether the moment you close the tab. This architecture condemns us to a state of digital shrapnel: your project plan lives in Slack, your research is scattered across browser tabs, your decisions are buried in email, and your draft is in a doc. We ask our AI to be intelligent, but we force it to operate blindfolded, guessing the shape of our work by touching one disconnected piece at a time.

This makes you the AI’s external hard drive. You are the connective tissue. You perform the soul-crushing labor of copying, pasting, and re-explaining context, bridging the gap with every single query. This isn’t a feature. It’s a catastrophic, unforgivable design flaw.

Stop Describing. Start Commanding.

For a moment, I thought the answer was contexting—the architectural discipline of curating a rich data environment for an AI agent. It was the right instinct, but the wrong verb. It was a step away from the vacant art of prompting, but it was still just talking at the machine.

You cannot build a skyscraper by describing it to a pile of bricks, no matter how eloquently. You need an architectural plan, a crane, and a crew. Prompting is the description. Coda is the crane and the blueprint. It transforms your context from a static pile of information into a dynamic set of executable instructions.

The true frontier isn’t what an AI knows. It’s what it can do. This demands a new class of software: an Agentic Orchestration Framework. A system that doesn’t just talk, but commands, coordinates, and executes. It directs multiple specialized agents—AI, automation, and human—across complex, multi-step workflows with unwavering precision.

The archetype for this framework has been hiding in plain sight: Coda.

Forget the “all-in-one doc” marketing. That was the Trojan horse. Coda is a workflow engine for manufacturing bespoke, agentic software without writing a line of code. Its architecture is the very blueprint for orchestration:

  • Docs as the Command Center: The unified surface where human intent and AI execution converge.
  • Structured Tables as the Memory: A shared, persistent “brain” that provides unwavering context, rendering the amnesiac chat log obsolete.
  • Packs as the Limbs: The API-driven connectors that give agents power over the real world—to manipulate Google Calendar, create Jira tickets, or rewrite Salesforce records.
  • Automations as the Nervous System: The rule-based engine that triggers actions and executes entire workflows with inhuman speed and reliability.

This was never a document app. It’s a factory for building intelligent actors.
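To make that division of labor concrete, here is a minimal, purely illustrative Python sketch; every class and function name here is invented for this post and has nothing to do with Coda's actual Pack SDK or API. It models a table as persistent memory, a Pack as a limb, and an automation as the trigger that connects them:

```python
# Illustrative sketch only: these names are invented, not Coda's real API.

class Table:
    """Persistent, structured memory shared by every agent."""
    def __init__(self):
        self.rows = []

    def add_row(self, **fields):
        self.rows.append(fields)
        return fields

class Pack:
    """A 'limb': a connector that performs a real-world action."""
    def __init__(self, name, action):
        self.name = name
        self.action = action  # callable that does the external work

    def run(self, row):
        return self.action(row)

class Automation:
    """The 'nervous system': when a rule matches a new row, fire a Pack."""
    def __init__(self, rule, pack):
        self.rule = rule
        self.pack = pack

    def on_row_added(self, row, log):
        if self.rule(row):
            log.append(self.pack.run(row))

# Wire it up: the doc is the command center holding all three.
memory = Table()
jira = Pack("jira", lambda row: f"Created ticket: {row['title']}")
automation = Automation(lambda row: row.get("type") == "bug", jira)

log = []
row = memory.add_row(type="bug", title="Login fails on Safari")
automation.on_row_added(row, log)
print(log[0])  # -> Created ticket: Login fails on Safari
```

The point of the toy is the shape: the chat transcript disappears, but rows persist, and work advances through triggered state changes rather than conversation.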

The Grammarly Gambit: An Empire in Two Moves

While the market obsessed over chatbot demos and press releases, Grammarly executed a two-step strategic coup to build the world’s first true agentic productivity platform. Anyone who saw these as unrelated acquisitions wasn’t just missing the story; they were illiterate in the language of power.

Move 1 (Acquire the Brain): Seize the Orchestration Framework
In late 2024, Grammarly acquired Coda. They didn’t buy a popular doc app. They bought the operating system for their future AI agents. This was the foundational act of war. As undeniable proof, Coda founder Shishir Mehrotra wasn’t just given a board seat; he was installed as Grammarly’s new CEO. He isn’t running a company; he is performing a hostile takeover of its DNA, injecting Coda’s agentic framework into a platform with 40 million daily active users.

Move 2 (Conquer the Battlefield): Seize the Critical Interface
Months later, Grammarly acquired Superhuman. This was not about adding a slick email client. Superhuman is what Mehrotra calls the “perfect staging ground for orchestrating multiple AI agents simultaneously.” Email is the chaotic nexus where work, communication, and tasks collide. Grammarly didn’t buy Superhuman for its pathetic summarization features; they bought the most valuable turf in professional life to serve as the GUI for their Coda-powered agentic backend.
Imagine it: A sales agent, a support agent, and a scheduling agent collaborating within a single email draft, orchestrated by the Coda engine, pulling live context from connected Packs, executing tasks across a dozen SaaS apps. That is the gambit.
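As a toy illustration of that scenario (all names invented; this is not how Superhuman or Coda actually expose agents), the orchestration pattern reduces to several specialized agents writing into one shared draft under a single coordinator:

```python
# Toy sketch: every name here is invented for illustration, not a real API.

def sales_agent(context):
    return f"Quote attached for {context['product']}"

def scheduling_agent(context):
    return f"Proposed meeting: {context['slot']}"

def orchestrate(agents, context):
    """Run each agent against the shared context; merge results into one draft."""
    return "\n".join(agent(context) for agent in agents)

draft = orchestrate(
    [sales_agent, scheduling_agent],
    {"product": "Pro plan", "slot": "Tue 10:00"},
)
print(draft)
```

The shared `context` dict stands in for the live Pack data; the coordinator, not the human, decides which agent contributes what.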

Your AI Stack is a Museum Piece

The chasm between the dying paradigm and the emerging one is not an increment; it is a cliff. One is a toy, the other is a weapon. One talks, the other acts. The stack Grammarly is building does not compete with the old one. It renders it irrelevant.

The Moat Isn’t the Model; It’s the Machine

The winners of the AI war will not be the companies with the largest language model. That is a commodity race to the bottom. They will be the ones who own the orchestration framework that makes those models act. Building a better chatbot today is like perfecting the horse-drawn carriage in the age of the automobile. The real innovation was the assembly line and the highway system.
The only defensible moat is the machine: the framework that enables action, the integrations that give it reach, the structured data that serves as its memory, and the user workflows that become its territory.

Stop asking if your AI is smart. Start demanding that it act. Stop celebrating prompts. Start building engines of execution.

The future of work isn’t a conversation. It’s a command. The companies that understand this are building empires. The rest are polishing tombstones.

P.S. A warm welcome to Rahul and the Superhuman team.

Thanks for sharing your ideas; interesting and promising.
What do you believe would be the next tool to acquire, @Bill_French?

My $$ is on Reclaim or Motion… something along the lines of calendar/task management :sweat_smile:

Edit: oh, it can't be Reclaim, as they were bought by Dropbox.

I’m uncertain, but I have to believe someone at Grammarly (who is apparently really dialed in to the way I think) is watching Pieces, Flowith, and Dia.

Shishir keeps talking about ‘surfaces’.
Email is the most frequent surface for Grammarly usage (=>Superhuman)
Documents and sheets are another major surface (=>Coda)

So what's the next-biggest surface where we spend our time?

How about messaging and chats and forums?

So maybe Grammarly has its eye on one of those?
Not the major players, maybe, but a startup with oodles of innovation and AI expertise?

Another key criterion for these acquisitions is a shared vision.
It was highlighted during the Coda deal.
It’s highlighted again by Shishir and Rahul during the Superhuman deal.

So the next target will have a messaging surface, an AI focus, and a clear vision that matches that of Shishir, Rahul, and their teams.

I don't know the marketplace well enough to identify candidates, but perhaps someone reading this does?

Just a thought.

Max

As always, I’ve greatly enjoyed reading your perspective, @Bill_French, thanks for sharing!

I do remember from past conversations your concerns about Coda Packs' limited support for common integration patterns (e.g., handling incoming webhooks, running persistent background services, or responding to external events).

In your post above, you envision Coda as the blueprint for a new kind of orchestration framework.

Do you think what once seemed like a limitation (Packs’ closed and controlled architecture) is now less of a drawback?

Or do you even see it potentially becoming a competitive advantage as the focus shifts from open interoperability to orchestrated execution?

Thanks,
Nina

The reason I’m not yet bullish on Coda’s future is due to one key problem that LLMs still don’t solve particularly well: data collection. Specifically, I’ve been thinking a lot about the following limitations of Coda, as I understand its vision going forward:

  1. Collecting data by typing into Coda's cells is too slow and impractical. Instead, data should come from voice/video meetings and recordings, mobile/desktop screen activity, linked/shared content sources, mobile phone conversations, and visual/audible/touch perception input devices. Pushing data from hardware devices in real time becomes even more critical than pulling it from a structured database, because if data isn't captured quickly enough, the database can become unreliable and irrelevant even within a minute.
  2. Email is no longer a practical communication channel; it belongs to the past. As someone who has long supported email, it's hard for me to admit this, but emails are now like the written contracts of a previous era. Today, very little needs to be formally written when people prefer adaptable, relevant content in audio/video formats.
  3. The combined company's language support is essentially limited to English at the moment. In an era of LLMs, where tokenization of language is foundational to progress, this is concerning. Many LLMs still perform poorly in languages other than English. What does it mean to build a great app for the UK but a terrible one for France, simply because UX in the UK relies on quick chat-based communication, while in France it might depend on elaborate instructions, or worse, traditional filters and search?

My understanding of LLMs and Coda’s vision is limited, so I’d really appreciate the community’s thoughts on these points.

Hi @Stefan_Stoyanov,

Thanks for sharing your thoughts; those are valid concerns. Just so you know, I don't have access to any internal "cookbook" or strategic plans. These are simply my best guesses and interpretations based on what's publicly available and general market trends.

Regarding your first question about data collection, there are already many AI-powered tools out there that are really good at capturing voice from meetings, whether you’re in the room or online. These tools can pull out and organize key information, remarks, and action items from conversations. When we think about the “surfaces” and “orchestration” Grammarly is building, meeting platforms are a big communication area. It makes sense that Grammarly might look to acquire a company in this field. They’d likely be looking for strong AI features, existing connections to other tools, potential for their “agents” to expand, and solid support for many languages. Companies like Sembly AI, Fireflies.ai, Otter.ai, or MeetGeek come to mind.

As for your comment on email, that’s an interesting thought. While other ways of communicating have certainly grown, email is still a core, if not the core, communication hub for many businesses. That’s actually why Grammarly bought Superhuman. It’s not just about email itself; it’s also about the related areas Superhuman is working on, like chats, calendars, and tasks (direct overlap with Coda). The whole idea is that all these interactions can be managed and brought together within Coda, building out an AI-powered productivity system that covers all the main places where work happens.

Finally, on your point about language support for LLMs, Superhuman already includes AI-driven translation right within its email platform. This feature allows for easy translation of messages, which is super important for international teams and global operations. This capability directly tackles language barriers in a crucial communication space, showing that Grammarly is focused on making its AI tools truly effective for a wide range of users worldwide.

Welcoming further feedback.

Cheers, Christiaan

Indeed. It’s why I use Pieces. It captures everything, and I can use it as a long-term memory or a funnel into Coda and other tools. Capturing contexts is probably where Grammarly is heading.

For approximately 2.4 billion people, it remains practical in most cases. It’s not going anywhere, at least not in your lifetime. If you see ‘email’ when you see Superhuman, you need to recalibrate. The AI engine in this tool was perfected in one of the harshest environments, but it is applicable in many operational domains. They didn’t buy Superhuman to support email better.

This is not Grammarly’s problem to solve. I think some of the higher-level executives and engineers have seen solutions that will address the multilingual challenges.

Nina, thank you for the thoughtful words and for engaging so directly with these architectural questions.

To your point: Packs, as currently conceived, remain constrained by several limitations I’ve outlined over time—chief among them, their closed nature and limited support for dynamic integration patterns. These constraints have historically hampered Coda’s ability to serve as a true orchestration hub, especially when it comes to handling real-world, real-time data flows.

However, with Grammarly’s expanding reach across critical “surfaces”—and their likely trajectory toward integrating voice, vision, and other modalities—there’s a strong possibility that Packs will evolve. In this emerging context, Packs could become the ideal conduit for retrieving context-rich, multimodal data from the broader Grammarly ecosystem. Rather than being a bottleneck, Packs may soon serve as the connective tissue between Coda’s orchestration framework and the vast, real-time data streams generated by Grammarly’s “mothership.”

The challenge, and the opportunity, is for Packs to move beyond their current boundaries—adapting to new integration standards and supporting the kind of pervasive, intelligent data exchange that true agentic orchestration demands.
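For readers who haven't hit this wall: the "incoming webhook" pattern Nina mentions is a push model, and the gap looks roughly like this thought experiment (invented names, not Coda's real architecture) — the external system writes events in as they happen and listeners react immediately, instead of a Pack polling on a sync schedule:

```python
# Thought experiment only: invented names, not Coda's real architecture.
# Contrast: a pull-based Pack polls on a schedule; a push model reacts instantly.

class Doc:
    def __init__(self):
        self.rows = []
        self.listeners = []

    def push(self, event):
        """Push model: external systems write in the moment events occur."""
        self.rows.append(event)
        for listener in self.listeners:
            listener(event)

doc = Doc()
received = []
doc.listeners.append(lambda e: received.append(f"reacted to {e['type']}"))

# An incoming webhook delivers an event; the doc reacts immediately.
doc.push({"type": "meeting.ended", "transcript": "…"})
```

Whether Packs ever expose that second shape is exactly the evolution question above.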

Perhaps relevant - I can go from email to LLM to Coda without ever copying/pasting, ChatGPT, or any intermediate tool. Dia carries the weight of shortening the line from information to curation.
