One of the biggest wish-list items for generative AI projects is the ability to use current Internet content to shape inference results. No one wants to chat with a smarty-pants who just came out of a two-year coma.
OpenAI offers a web-browsing plugin for ChatGPT, but that pathway is not supported by Coda AI.
Until recently, I typically used Google’s PaLM 2 LLM to reach live web content when building Coda apps that required it. That approach dried up when Google cut off PaLM 2’s access to the live Internet; the API is still in beta and subject to abrupt changes like this. Bard still supports live Internet access without charge, but there’s no Bard-specific API [yet].
Live AI, it seems, is a distant and fleeting mirage for almost everyone using Coda. Unless… you get a little creative.
What we need is a Coda AI plugin for live Internet access.
Live Internet Inferences
If you ponder the nature of live Internet-driven AI responses, the objective is relatively simple.
Given a specific query, blend the power of generative AI with the data from a search for that same query.
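Before getting to the Coda-specific approach described later, here is a minimal sketch of that blending pattern in Python. It is not the implementation this article builds toward; the `SEARCH_API_URL` and `LLM_API_URL` endpoints, their JSON shapes, and the API key are hypothetical placeholders for whatever search and completion services you actually have access to.

```python
import requests

# Hypothetical endpoints -- substitute the search and LLM services you use.
SEARCH_API_URL = "https://example.com/search"
LLM_API_URL = "https://example.com/complete"


def live_answer(query: str, api_key: str) -> str:
    """Blend fresh search results for `query` with a generative AI completion."""
    # 1. Run a live web search for the same query the user asked the AI.
    search = requests.get(
        SEARCH_API_URL,
        params={"q": query, "count": 5},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=30,
    )
    search.raise_for_status()
    snippets = [
        f"- {hit['title']}: {hit['snippet']} ({hit['url']})"
        for hit in search.json().get("results", [])
    ]

    # 2. Fold the search results into the prompt so the model reasons over
    #    current data instead of its stale training set.
    context = "\n".join(snippets)
    prompt = (
        "Answer the question using the search results below. "
        "Cite the source links and include dates where available.\n\n"
        f"Search results:\n{context}\n\n"
        f"Question: {query}"
    )

    # 3. Ask the LLM to synthesize a grounded answer.
    completion = requests.post(
        LLM_API_URL,
        json={"prompt": prompt},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=60,
    )
    completion.raise_for_status()
    return completion.json()["text"]


if __name__ == "__main__":
    print(live_answer("What are some of the newest EV cars?", "YOUR_API_KEY"))
```

The essential move is step 2: the model never needs live access itself; it only needs the live material injected into its prompt at inference time.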
For example, a simple question like this one should be answerable in a generative AI context.
What are some of the newest EV cars?
The AI output should include links with hover previews that surface images, as well as videos that play in place. Most important, the content should be recent, since the query is literally about the newest EVs on the market.
Other queries that require up-to-the-minute knowledge, like this one, should also be possible.
Show me the closing price of $TSLA on 25-Aug-2023.
The Coda AI responses in these examples underscore the limitation of an LLM that stopped learning a few years ago. Its lack of short-term memory effectively rules AI out of thousands of use cases.
These comparative tests contrast Coda AI’s results with Coda AI Live results. The differences are as stark as they are exciting. In this test harness I deliberately asked the AI fields to include dates so that you can see the modern-day references, some literally less than 24 hours old.
Here’s how I did it. Enjoy…