Aurora - Easily Craft Prompts That SELL

Demo

Introduction

Hey there, fellow enthusiasts of Coda! I’d love to introduce you to a remarkable tool that I believe will revolutionize prompt engineering. Allow me to present Aurora: A Powerful Prompt Crafting Solution.

With Aurora, you can effortlessly generate clear prompts tailored for various AI models, including GPT-4, Midjourney, and more. Furthermore, you can seamlessly collaborate and share these prompts with your team.

GPT

Midjourney

The Crux of Aurora


Yeah, that’s kinda cool. I guess these generated prompts will work in Coda AI, but most will be unable to access the target models, right?

I assume Coda AI is GPT-3.5, but even when I chose that model, Coda AI says it can’t access it. I assumed it would be self-aware. Hmmm.

In any case, I gather that the purpose is to use Coda + Aurora to craft prompts and then copy/paste them to these various AI systems?

I also have a submission in the AI at Work-athon, so I compared your prompt for the travel guide with my own. Here are the results. They’re similar. Your interface is pretty simple too. Mine focuses only on Coda AI for inference outputs.


Yeah, that’s kinda cool. I guess these generated prompts will work in Coda AI, but most will be unable to access the target models, right?

Absolutely! It’s fascinating to note that GPT models can actually envision what the target model would be like, even when such information isn’t explicitly present within the dataset.

I assume Coda AI is GPT-3.5, but even when I chose that model, Coda AI says it can’t access it. I assumed it would be self-aware. Hmmm.

I think it’s more like GPT-3, but it could also be 3.5:
https://twitter.com/wiseaidev/status/1676848360190361602

In any case, I gather that the purpose is to use Coda + Aurora to craft prompts and then copy/paste them to these various AI systems?

Yup! Even using it to write prompts for Coda AI :slight_smile:

I also have a submission in the AI at Work-athon, so I compared your prompt for the travel guide with my own. Here are the results. They’re similar. Your interface is pretty simple too. Mine focuses only on Coda AI for inference outputs.

Interesting! I will have a look.

Edit: I had a look. I like the example you presented in the introductory section of the video. The way you contrasted prompting with the strategic use of precisely targeted keywords for search engines is on point :ok_hand: +1.

As for my project, you only need to input the topic, in this case “San Luis Obispo Trip”, and then watch the magic happen. It will generate different prompt variations.
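Roughly, the flow is: topic in, prompt variations out. Here is a minimal, hypothetical sketch of that idea. In Aurora the variations come from an LLM; here fixed templates stand in for the model so the example is self-contained, and none of these names are from Aurora’s actual code:

```python
# Hypothetical sketch of topic -> prompt variations (not Aurora's real code).
# Fixed templates stand in for the LLM that Aurora uses under the hood.

TEMPLATES = [
    "Write a detailed travel guide for {topic}, covering top sights and local food.",
    "Act as a seasoned travel writer and create an itinerary for {topic}.",
    "List practical tips, budget estimates, and hidden gems for {topic}.",
    "Compose an engaging blog post introducing first-time visitors to {topic}.",
    "Draft a day-by-day plan for {topic}, with morning, afternoon, and evening activities.",
]

def generate_variations(topic: str) -> list[str]:
    """Return one candidate prompt per template, filled with the user's topic."""
    return [t.format(topic=topic) for t in TEMPLATES]

if __name__ == "__main__":
    for i, prompt in enumerate(generate_variations("San Luis Obispo Trip"), 1):
        print(f"{i}. {prompt}")
```

Each generated prompt can then be copied into Coda AI (or any other model) to compare outputs.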

From there, you are free to choose among the 5 prompt variations in this case. Let’s try the first lengthy prompt on Coda AI:


Coda AI Result:

Thanks for noticing! I tend to think this approach is more about looking ahead into the abyss of search than it is about keywords per se. I recently wrote about the death of tags, which are not so different from our infatuation with keywords. Keywords may be how we map our mental models to our relationship with LLMs, but I believe that is a preoccupation that will soon end.

Bard already demonstrates the blend of semantic search with LLMs to deliver massively more value. Gemini promises to be explosive in this regard and ChatGPT may ultimately become an interesting footnote in the AI march. Exciting and turbulent times lay ahead for sure.

In an early research attempt I labeled “solve for (x)” using Coda with PaLM 2, I hypothesized that we could control the LLMs’ inclination to confabulate while also enhancing productivity. This was an experimental approach built in Coda that began before Coda AI was released. It used two custom Packs that interfaced with both PaLM 2 and GPT to make it seem like a unified solution. It was clearly the right approach because it leaned on embedding vectors to better understand the cone of user intent. Promptology is an outgrowth of that work, but it did help me understand that chained inferences are necessary if we care deeply about bending AI to our objectives. :wink:

LangChain and AutoGPT exist to mitigate the chat- and search-level thrashing so common among users trying to reach specific objectives. These tools compress time by letting the LLM dynamically determine what needs to be done next, and then doing it for you.
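That chaining pattern boils down to a simple loop: the model picks the next action, the controller executes it, and the result feeds back in. A minimal sketch, assuming a stand-in `fake_llm` planner since no real model is wired up here (this is the general pattern, not LangChain’s or AutoGPT’s actual API):

```python
# Minimal illustration of the LangChain/AutoGPT-style agent loop.
# `fake_llm` is a hypothetical stand-in for a real model call.

def fake_llm(history: list[str]) -> str:
    """Pretend planner: emits a fixed plan, then signals completion."""
    plan = ["research destination", "draft itinerary", "DONE"]
    return plan[min(len(history), len(plan) - 1)]

def run_agent(goal: str, llm=fake_llm, max_steps: int = 10) -> list[str]:
    """Loop: ask the model for the next action, execute, feed the result back."""
    history: list[str] = []
    for _ in range(max_steps):
        action = llm(history)      # the model decides what to do next
        if action == "DONE":       # the model signals the goal is reached
            break
        history.append(f"did: {action} (for goal: {goal})")
    return history

if __name__ == "__main__":
    for step in run_agent("plan a San Luis Obispo trip"):
        print(step)
```

A real implementation would replace `fake_llm` with an actual model call and execute real tools at each step; the `max_steps` cap is the usual guard against the loop never terminating.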

The trouble with this approach: you need to be a developer to leverage them. And despite a lot of noise about shaping these tools for everyday users, let’s just say that even that will be poorly implemented unless the UX is in a form that business users appreciate. Coda is one such form; Google Workspaces is another.

This is clearly the future. It is not unlike the long-in-the-tooth magic that happens behind Google Search. As users, we assume that there’s a one-to-one relationship between our search query and the search engine. This is far from the case. Google has been using AI-like machinery between the query bar and the search machinery for almost a decade.

And unlike ChatGPT (which is a one-to-one relationship), this is not the case with Bard. You can prove this by sending the same query to PaLM 2’s completion API and Bard. You’ll see the stark difference. Google is doing in Bard what it does in search - it’s leveraging its massive data set and 500,000 servers to generate responses that are far better than otherwise possible with LLMs alone.

  • Aurora uses AI itself to generate prompt variations.
  • Promptology asserts a structure to negate the need for prompt variations.

Both approaches have subtle productivity friction. But, this is also to say that soon, neither approach will matter. :wink: Under the covers, the friction will be eliminated.

