Why Prompts SERIOUSLY Matter

Coda AI is such a cool feature. You give it a prompt, and it does some magical stuff.

The big promise of AI is often delivered in anticipatory user experiences: apps that essentially live a few moments in the future (Matt Webb). AI can allow technologies to operate in the very near future, but that’s only our perception. More accurately, we are simply hacking the fact that our brains live in the recent past; our conscious experience runs somewhere between milliseconds and tenths of a second behind the unconscious sensory edge of our nervous systems.

This value, however much of a cerebral hack it may be, rests on the prompt: the string attached to this grand promise of artificial general intelligence.

Effective prompts are sometimes simple, but more often, they are not. Let’s take a question that looks simple but whose effective prompt falls into the latter category.

What is (1 + 2) plus (3 * 10) divided by 2?

Coda AI says this is the answer:

That’s wrong. Let’s refresh and see what happens.

Better? Nope. What’s happening here?

LLMs act before they think.

This is especially the case with mathematics and word problems. The trick is to get the LLM to think before it acts. The right answer, of course, is 16.5, and the way to encourage that thinking is to provide a worked example: an example that steps the model through the logic you want it to follow.
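
The example can be as simple as this (one possible wording; the exact phrasing matters less than spelling out the steps and the answer):

  Example:
  Question: What is (1 + 2) plus (3 * 10) divided by 2?
  First, compute (1 + 2) = 3.
  Next, compute (3 * 10) = 30.
  Next, add the results: 3 + 30 = 33.
  Finally, divide by 2: 33 / 2 = 16.5.
  Answer: 16.5

  Now answer: What is (1 + 2) plus (3 * 10) divided by 2?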

Your first reaction is probably this: it’s not a valid test because I literally gave the AI the answer in my example. Try it with different numbers.

Different equations can be supported if (and only if) you provide additional examples that express greater complexities.

While this is a simple math problem, the key takeaway is the prompt example; it literally tells the AI to step through the logic.

First, do this...
Next, do that...
Finally, do this...

This approach applies to many prompt scenarios. Here’s an example.

First, remember the original text.
Original text:

Next, create a list of key points from the original text.

  Example:
  Prompt: ‘I’m calling for three things - transparency, accountability, and leadership.’
  Key points:

  1. Transparency
  2. Accountability
  3. Leadership

Prompt: original text

Key points: 

Next, display the original text. Do not show the label ‘Original text:’.

Next, display the key points. Do not show the label ‘Key points:’.

AI Productivity Paradox

As early adopters of AI, we have grand visions of escalating our work output. There’s no shortage of media outlets and multi-part Twitter threads that have convinced the masses that AI makes digital work a breeze. Reality check: it doesn’t.

ChatGPT and Coda AI users typically experience poor results because successful prompts are not as easy to create as you first imagine. How hard can it be, right? It’s just words, eh? The reality is that it can be both hard and complex, depending on the AI objective.

Two obvious aspects of prompt development are working against us.

  1. Prompt Construction - most of us “wing” it when building prompts.
  2. Prompt Repeatability - most of us are inclined to build AI prompts from scratch every time (see the sketch just below this list).
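
One way to chip away at the repeatability problem is to keep the instructions and the worked example fixed and only swap the inputs. A minimal sketch in Python, outside Coda and purely for illustration (it reuses the key-points template from the earlier example):

# A reusable prompt template: the instructions and the worked example stay fixed;
# only the input text changes between runs.
PROMPT_TEMPLATE = """First, remember the original text.
Original text: {original_text}

Next, create a list of key points from the original text.

Example:
Prompt: ‘I’m calling for three things - transparency, accountability, and leadership.’
Key points:
1. Transparency
2. Accountability
3. Leadership

Next, display the original text. Do not show the label ‘Original text:’.

Next, display the key points. Do not show the label ‘Key points:’."""

def build_prompt(original_text: str) -> str:
    """Fill the fixed template with this run's input text."""
    return PROMPT_TEMPLATE.format(original_text=original_text)

print(build_prompt("We will focus on quality, speed, and trust this quarter."))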

Getting these two dimensions right for any Coda solution takes patience, new knowledge, and a little luck. I assert that …

The vast productivity benefits of AI are initially offset and possibly entirely overshadowed by the corrosive effects of learning how to construct prompts that work to your benefit.


Thanks for the interesting contribution, @Bill_French.

At the moment I am working on legal texts that follow a very strict structure, but the writing differs based on variables like family composition (no kids, one kid, two kids). We are testing AI prompts that reference examples to see if we can get very comparable results over and over again. Prompt writing is an art in itself, asking for a mix of good writing skills and quite a bit of CFL background, not least because we have to feed the prompt properly. We also noticed that the names of the columns matter and that you have to be explicit in telling the AI not to use the instructions in the output text. There is a lot to learn.
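
Roughly sketched (simplified placeholders here, not our real clauses), such a prompt looks something like this:

  Family data: names: ..., ages: ..., family composition: two kids

  Example (no kids): [approved clause for a couple without children]
  Example (one kid): [approved clause for a couple with one child]
  Example (two kids): [approved clause for a couple with two children]

  Write the clause for this family in the same strict structure as the matching example.
  Do not use these instructions in the text you write.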

Cheers, Christiaan


Nice writeup but I’m actively triggered by the fact that you say

(1 + 2) plus (3 * 10) divided by 2

is 16.5, while it is actually 18. Division comes first; the above sentence should be parsed as

(1 + 2)
  plus
(
  (3 * 10) divided by 2
)

But overall, maybe we just shouldn’t use the language model for what it’s not meant to be used for (calculations?)


In formulas, yes. This is not necessarily the case in word problems. Typically, word problems are described in ways that frame steps in the order the author hopes they will be resolved.

LOL. Yes, this is the challenge of teaching the LLM how you want it to parse. This is an ideal example that demonstrates the LLM can be bent to fit the real world, even though that world may not match the world of formulaic interpretation.


I’m still puzzled about how best to use tables in Coda AI. The window of data in a prompt is constrained, so it’s currently not possible to prime a prompt with a table reference whose data is of substantial size. It requires some level of data reorganization or compression. Including only the fields that matter in an inference helps, but even this doesn’t scale well.

I overcome the column name issue by using embeddings to determine a similarity match between the human query and the schema. This allows the solution to use actual schema names despite the user describing it differently.
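
The gist of that matching step, sketched minimally in Python (this is not the actual Pack code; embed() is a toy stand-in for whatever embedding model is available, and the schema names are made up):

import numpy as np

def embed(text: str) -> np.ndarray:
    # Toy stand-in for a real embedding model: hashed character trigrams, L2-normalized.
    vec = np.zeros(256)
    t = text.lower()
    for i in range(len(t) - 2):
        vec[hash(t[i:i + 3]) % 256] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def best_column(user_phrase: str, schema_columns: list[str]) -> str:
    # Pick the schema column whose embedding is most similar to the user's wording.
    query = embed(user_phrase)
    return max(schema_columns, key=lambda col: float(np.dot(query, embed(col))))

# Hypothetical schema: the user says "birthday", but the column is named "Date of Birth".
columns = ["Full Name", "Date of Birth", "Legal Status", "Municipality"]
print(best_column("the person's birthday", columns))  # typically prints "Date of Birth"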

Since embeddings are not supported in Coda AI, I use PaLM 2 in a Pack. I avoid OpenAI embeddings because they are slow compared to PaLM.

BTW, the new compose field is ideal for reshaping a schema for better AI outcomes.

Compose can be of help indeed. I used Format() to list data like the names, ages, birthdays, and the legal status (Dutch). Often I add additional words or word groups to help the AI better understand the information I want to get out of it. Such a list is also something the AI can count, and it knows that for couples with three kids the writing differs from that for two kids or one.

The column names help the AI interpret the data listed in them correctly, I guess.

In my opinion you need a decent understanding of the CFL before you can make use of the AI’s power.


Shouldn’t it be possible to entirely negate the issue of language in these types of solutions? Ask in Dutch, get answers in Dutch. Ask in English, get answers in English.

Yes, I do ask in Dutch; the supporting text is in Dutch and the outcome is in Dutch.

The main challenge is to show the prompt what you want; certainly in a legal context, certain terms have to be used.


And does it work in English as well?

Yes, you can ask in English: make a proper phrase in Dutch including the words ‘plaats’ and ‘gemeente’ to tell where the couple got married and under which conditions. It will output proper Dutch.


To be clear, CFL means so many things across the globe. Can you define this for everyone?

Yes @Bill_French, the Coda Formula Language :wink:


I use other websites like https://flowgpt.com/ that have great prompts. I’m a UI/UX designer: I paste my notes into a table and then I have 6 or 7 columns for UI/UX-related jobs, like copywriting, using Coda AI.

But yeah, the prompt does matter a lot, for Coda and for all other prompt related AI.


When I see the word “paste”, it triggers a flash to The Mandalorian’s creed - “This is the Way”.

I’m pretty sure copy/paste is not the way. :slight_smile:

But it’s still early. AI will become more automated soon enough.

Of course it’s not the way; I hope it changes soon, but it comes with being an early explorer. It is the best way, so far. I didn’t mention that my notes are already the output of another AI: I speak and it turns my words into organized notes, so audio to text (AudioPen.ai), then paste into Coda, and then I can trigger the other 6 or 7 AI columns.

My UI/UX Dream
It would be awesome if I could have a mic button in Coda that would listen, transcribe, and improve, and then the other 6 or 7 AI columns could fire as soon as the main column has content added. This would streamline the process, but I’m not sure about Coda’s plans, or about AI Packs adding voice recording and transcription using Whisper from OpenAI, for example.
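
In the meantime, a rough sketch of that flow stitched together outside Coda, assuming the OpenAI Whisper API and the Coda REST API (the doc ID, table name, and column name below are made up):

import os
import requests
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# 1. Transcribe a recorded voice note with Whisper.
with open("voice_note.m4a", "rb") as audio:
    transcript = client.audio.transcriptions.create(model="whisper-1", file=audio)

# 2. Add the transcript as a new row in a Coda table; the AI columns
#    in the doc can then act on the freshly added text.
coda_token = os.environ["CODA_API_TOKEN"]  # hypothetical environment variable
doc_id = "YOUR_DOC_ID"                     # hypothetical doc ID
resp = requests.post(
    f"https://coda.io/apis/v1/docs/{doc_id}/tables/Notes/rows",
    headers={"Authorization": f"Bearer {coda_token}"},
    json={"rows": [{"cells": [{"column": "Note", "value": transcript.text}]}]},
)
resp.raise_for_status()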


You just described exactly my process with Tana. Tana Capture, the mobile app (make sure you watch the brief video in the tweet), was designed with exactly this workflow in mind. I’ve had great success with it while also using Tana’s internal features to call out to Coda webhooks.

I’ve written a bit about Tana here and here, and I even built a few Tana Paste integrations that are automated from Coda.

The Codans (mobile team) are constantly pressured (by us) to build a mobile-suitable replica of Coda, but this may not be the right near-term goal. Coda is in a very good position to use its mobile app to capture and transcribe voice into content. They should do that now. Tana did it, and it paid off.

Separately, I loathe the fact that I need both Tana and Coda. I have a vision of Coda as the entirety of my note-taking/sense-making platform. When will they realize they can exist at the edge of the funnel and all the way down?


Thanks Bill for suggesting Tana. It seems to be a more refined version of the workflow I currently employ using AI tools like AudioPen and Coda. I appreciate how AudioPen automatically refines my notes into clear text, even mimicking my writing style with just a few paragraphs of my own as input. It’s efficient at transcribing thoughts to text rapidly, leading me to the next step of the process.

I typically paste these refined texts into a Coda document where various columns utilise different AI formulas based on individual prompts. This helps break down the clean text into actionable items depending on what needs doing with it. As a UI/UX designer, I have one column that generates copy material while another summarises my notes succinctly in TL;DR style.

However, returning to Tana: its integrated approach appeals to me in terms of connecting everything together through interlocking AIs. I have already signed up for early access and look forward to exploring more about Tana and the possible integrations you’ve mentioned.

This prospect, in the evolving landscape of AI tools and the integrations between them, seems promising indeed. Anticipating further advancements that could unlock new workflows is exciting.

Tana does the same - fully integrated with AI. I looked to see if they have added any new invites to my account - still zero.

What you’ll discover (with some effort) is that Tana is able to do all of that using its deep bench of automations and custom commands built on Supertags. I’m often torn between automation at the edge and automation deep inside Coda. It’s nice to have the choice, but I’d rather eliminate Tana altogether and use Coda from the top of the note-taking/sense-making funnel to the bottom.

In this example you didn’t just tell the AI to step through the logic. You told the AI what the logic is. In the past you’ve written about prompting AI to do something step by step, but not actually writing out all the steps. This feels different because you are providing more scaffolding in how to break down the logic.

It reminds me a lot of teaching kids. I might tell a teenager to stop and think things through before acting, and that is enough. For an elementary age kid, I might lead the kid through Socratic questioning. But a preschooler needs explicit instruction and safety guardrails.

I also think that figuring out the logic is the main hallmark of a developer. A developer figures out the logic given the requirements and constraints of the system, and tells the computer to go forth and continue to apply that logic.

Computers are really good at applying logic. I don’t know enough about AI to state how good AI is at coming up with complex logic to begin with.

Indeed. It’s not a reasoning engine despite the sensation that it reasons from time to time.

AGI is an adolescent; age doesn’t really matter. :wink: But, as you know, even an adolescent can be shown a repeatable pattern and they catch on pretty quick. Just ask the manufacturing line leads for Asia’s fashion industry.