One of the biggest wish-list items for all generative AI projects is the ability to use the latest Internet content to shape inferences. OpenAI has a plugin for this, but that pathway is not supported by Coda AI.
Until recently, I typically used Bard to gain access to live web content when building Coda apps that required it. That approach dried up when Google constrained PaLM 2 from accessing the live Internet; its status is beta and subject to these types of abrupt changes. Bard itself still supports live Internet access without charge, but there's no Bard-specific API [yet].
Live AI, it seems, is a distant and fleeting mirage for almost everyone who uses Coda. Unless... you get a little creative.
What we need is a Coda AI plugin for live Internet access.
Live Internet Inferences
If you ponder the nature of live Internet-driven AI responses, the objective is relatively simple.
Given a specific query, blend the power of generative AI with the data from a search for that same query.
For example, a simple question like this should be possible in an AI context.
What are some of the newest EV cars?
The AI output should include links with hover behaviors that include images, and videos that play in place. Most important, the content should be recent since the query is literally about the newest EVs on the market.
Other types of queries that require up-to-the-minute knowledge should also be possible, like this one.
Show me the closing price of $TSLA on 25-Aug-2023.
As you can see from these simple examples, live AI is possible in Coda AI today.
To achieve this, you need a Pack capable of making search requests, plus a prompt engineering process that shapes the search results into data Coda AI can use to build an inference output. So that's what I built for my own internal use.
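To make the shaping step concrete, here's a minimal TypeScript sketch of how search results might be folded into a prompt that grounds the model in fresh data. The `SearchResult` shape and the `buildLivePrompt` name are illustrative assumptions of mine, not the actual Pack's code.

```typescript
// Hypothetical shape of one result returned by a search API.
// A real search service's response fields may differ.
interface SearchResult {
  title: string;
  link: string;
  snippet: string;
}

// Fold live search results into a numbered context block, then append
// the user's query so the LLM grounds its answer in recent data.
function buildLivePrompt(query: string, results: SearchResult[]): string {
  const context = results
    .map((r, i) => `[${i + 1}] ${r.title}\n${r.snippet}\nSource: ${r.link}`)
    .join("\n\n");
  return `Using only the search results below, answer the question.\n\n${context}\n\nQuestion: ${query}`;
}
```

In a live Pack, the `results` array would come from the search service's API response; the point here is simply that the query and the search data are merged into one prompt before Coda AI ever sees it.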
To validate the approach, I created a number of tests that compare Coda AI with Coda AI Live. The contrasts are as stark as they are exciting. In this test harness, I deliberately asked the AI fields to include dates so that you can see the modern-day references.
Note: Both Coda AI and Coda AI Live columns use identical prompts with one exception - Coda AI Live includes data from the Pack formula.
The Coda AI responses in these examples underscore the limitation of an LLM that stopped learning a few years ago. Its lack of short-term memory wholly excludes AI from thousands of use cases.
How Did I Make This?
I created a simple Pack that I used in the examples and tests shown above, built around a single formula. The Pack provides a secure interface with the search platform. You'll need an account and an API key. The free tier provides 100 searches per month; the next rung on the ladder is $50/month for 5,000 searches. Many other search services offer similar capabilities, including direct API access and API recipes for gaining legitimate access to different search engine results.
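To sketch the wiring on the search side, here is one way such a Pack formula might assemble its request URL. The endpoint path and parameter names (`/search`, `q`, `api_key`, `num`) are stand-ins of my own; any real search API documents its own names and authentication scheme.

```typescript
// Build the GET URL for a hypothetical search endpoint.
// Treat the path and parameter names as placeholders, not a documented API.
function buildSearchUrl(baseUrl: string, query: string, apiKey: string): string {
  const url = new URL("/search", baseUrl);
  url.searchParams.set("q", query);        // the user's query
  url.searchParams.set("api_key", apiKey); // from your account's API key
  url.searchParams.set("num", "10");       // cap results to stay within quota
  return url.toString();
}
```

Inside a Coda Pack, a URL like this would be passed to the Pack's fetcher; capping the result count is one easy way to keep a monthly search quota from evaporating during testing.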
In the test harness shown above, I used the following prompt.
Source Code Openly Available
The Pack and its examples are the kind of thing that should be openly available to all Makers, to do with as they please to enhance and explore new Coda AI possibilities. The source code is amazingly simple; just another validation that the Codans really understand agile architectures.
I decided not to make this a commercial Pack in the Gallery. Sure, it has value, and I would have charged about twenty-five bucks for it, but this capability represents something much bigger. Besides, there are some very clever Makers in AI learning mode who deserve access to it without additional cost.
I’m happy to take your money, but I realize the economy is tight. I’m more interested in seeing this approach serve as inspiration for other Makers to build great solutions.
I’m also happy to consult on more complex use cases for Coda AI Live. Reach out anytime. I believe there is a horizon of opportunities to build more advanced applications based on this concept.