Democratized AI: It's Time

Coda AI has given us a glimpse of the possibilities of using generative AI to achieve improved productivity and creativity. You may know it is built on OpenAI's APIs and is therefore constrained to GPT models, a limiting envelope in the march toward AGI.

Even Microsoft, which is deeply invested in OpenAI, has broadened its LLM support by embracing Meta’s new commercial Llama 2 in a very important partnership.

Meta and Microsoft share a commitment to democratizing AI and its benefits and we are excited that Meta is taking an open approach with Llama 2. We offer developers choice in the types of models they build on, supporting open and frontier models and are thrilled to be Meta’s preferred partner as they release their new version of Llama 2 to commercial customers for the first time.

There are many other seriously powerful commercial and open-source contenders that perform considerably better than OpenAI's models in some use cases. OPT, MPT-7B, GPT4All, and GPT-J each offer significant advantages depending on your AI objectives.

I have built Coda systems for internal use that lean on PaLM 2 (Google), MPT-7B, and GPT-J in a single solution. This was the ideal LLM mix to accomplish a few critical business requirements. Blending these models into a unified application requires Packs (of course), and this leads me to problem #1 for Coda's march toward democratized AI.
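A multi-model solution like this boils down to routing each business requirement to the model best suited for it. The sketch below is purely illustrative: the three backend functions are mocked stand-ins, not real provider APIs, and the task names are hypothetical.

```python
# Minimal sketch of blending multiple LLMs behind one interface.
# Each backend is a placeholder; a real version would call the
# provider's API (PaLM 2, MPT-7B, and GPT-J are the models named above).

def palm2_complete(prompt: str) -> str:
    # Placeholder for a Google PaLM 2 API call.
    return f"[palm2] {prompt}"

def mpt7b_complete(prompt: str) -> str:
    # Placeholder for a self-hosted MPT-7B call.
    return f"[mpt7b] {prompt}"

def gptj_complete(prompt: str) -> str:
    # Placeholder for a GPT-J endpoint call.
    return f"[gptj] {prompt}"

# Route each (hypothetical) task type to the model best suited for it.
ROUTES = {
    "summarize": palm2_complete,
    "classify": mpt7b_complete,
    "draft": gptj_complete,
}

def run_task(task: str, prompt: str) -> str:
    """Dispatch a prompt to the model assigned to this task type."""
    return ROUTES[task](prompt)
```

In a Coda context, each of those backends would live behind a Pack formula, which is exactly where the single-endpoint constraint begins to bite.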

One Pack, Multiple Endpoints

I complained about this constraint on day one of the Packs beta launch. I warned the Codans that the restriction binding a Pack to a single endpoint would need to be lifted. With the advent of a fractured and broadly diversified LLM landscape, lifting it is even more crucial.

Imagine a vector embedding process that begins in PaLM 2 and needs to trigger a dependent text completion process in Llama 2. This is impossible to achieve in real time within a single Coda Pack. Meeting the requirement with multiple Packs is complex and convoluted; some might say it's impossible.
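The cross-model pipeline described above can be sketched in three steps: embed with one model, retrieve context, and complete with another model. Everything here is mocked for illustration; the embedding, lookup, and completion functions are hypothetical stand-ins for real PaLM 2 and Llama 2 API calls.

```python
# Sketch of an embed-in-one-model, complete-in-another pipeline.
# All three functions are toy placeholders, not real provider APIs.

def palm2_embed(text: str) -> list[float]:
    # Placeholder embedding: word lengths stand in for a real vector.
    return [float(len(word)) for word in text.split()]

def nearest_context(vector: list[float], store) -> str:
    # Toy nearest-neighbor lookup over (vector, document) pairs.
    return min(store, key=lambda item: abs(sum(item[0]) - sum(vector)))[1]

def llama2_complete(prompt: str) -> str:
    # Placeholder for a Llama 2 completion call.
    return f"[llama2] {prompt}"

def answer(question: str, store) -> str:
    vec = palm2_embed(question)            # step 1: embed (PaLM 2)
    context = nearest_context(vec, store)  # step 2: retrieve context
    return llama2_complete(f"{context}\n\nQ: {question}")  # step 3: complete (Llama 2)
```

Because steps 1 and 3 hit different vendors' domains, a single-endpoint Pack cannot host this pipeline end to end.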

I don’t see how Coda can continue to stand on a Pack architecture whose sandbox depends on a single domain.

Coda AI Unbound from a Specific Model

When there was only one viable LLM for commercial use, this was a reasonable choice. That ship has sailed, and it's not coming back. The landscape of LLMs with domain-specific advantages will only fracture further, and soon there will be a vast array of options to choose from to accomplish our best AI work.

I’m not insensitive to the complexities and configuration issues that come along with this requirement, but I don’t see any other pathway for the future of Coda AI. It must allow no-code developers the choice of LLMs at a product, document, page, and component level.

Coda AI Primitives Exposed

From the first availability of the Coda AI alpha, I immediately wanted the ability to access inferencing from a Pack or a formula. Imagine a chained process where the output of each inference is the input to another inference. This might occur in a formula, an automation, or a Pack. But without access to primitive AI functions, it cannot be achieved without a lot of work and dependency timing that is difficult to control.
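The chaining pattern described above can be sketched as a pipeline where each step's output feeds the next step's prompt. The `infer` function here is a mocked stand-in for the kind of exposed inference primitive being argued for, and the prompt templates are hypothetical.

```python
# Sketch of chained inference: each step wraps the previous output.
# `infer` is a placeholder; a real primitive would call the document's LLM.

def infer(prompt: str) -> str:
    # Placeholder inference so the chain mechanics are visible.
    return prompt.upper()

def chain(seed: str, steps) -> str:
    """Run a pipeline of prompt templates, feeding each output forward."""
    out = seed
    for template in steps:
        out = infer(template.format(out))
    return out
```

With a formula-level primitive like this, the dependency ordering is explicit in the chain itself instead of being reconstructed through automation timing.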

The future of AGI is not one where a single prompt and response dominates the underlying AI solution. Increasingly, interchanges between users and LLMs will be conversational, and AI automation will almost certainly require autonomous agents that look inward (into the LLM) for their own guidance.

Unless the internal Coda AI capabilities are exposed, we will begin to see AI solutions emerge that don’t actually use any Coda AI features. AI Truth Monitoring and Fact Checking is an example of exactly this. I was going to submit it as a Coda AI At Work contender, but it was built with Google PaLM 2 because you can’t expect an LLM to check its own work and be unbiased about it.

Coda AI: Perfect Integration

The AI veneer the Codans built is among the best I’ve seen anywhere. I suspect they have even more plans to embrace the future of generative AI in ways that will create natural and seamless intelligence inside Coda solutions.
