OpenAI Pack: Problem generating new variant answers to same prompt

I’m loving the OpenAI pack, but I have a couple of questions:

When I re-use a prompt it doesn’t generate a new response; it returns the old one as if it were cached, even with the temperature set to 1, which should produce varied output. An example prompt where I hit this is “Suggest a name for a company that makes widgets in New York”. I’d expect it to give me new options every time I repeat the API call. Can anyone explain what’s going on?

My second question: how are error messages shown if the API call fails for any reason?

Thanks.

By default, Packs cache their results for a period of time when they are called with identical inputs, since most endpoints return the same results for the same inputs at the same point in time.
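For context, here is a minimal sketch of where that behavior comes from on the Pack-author side, assuming the Packs SDK’s cacheTtlSecs option. The formula name, parameter, and model choice are illustrative, not the published OpenAI Pack’s actual code, and authentication setup is omitted. Setting the cache TTL to 0 is what would force a fresh API call for a repeated prompt; it also shows one way an author can surface API failures as a readable error in the doc.

```typescript
import * as coda from "@codahq/packs-sdk";

export const pack = coda.newPack();

// The OpenAI API lives on this domain; a Pack must declare it before fetching.
pack.addNetworkDomain("api.openai.com");

// NOTE: authentication setup (the API key) is omitted here for brevity.
pack.addFormula({
  name: "SuggestName", // illustrative name, not the published Pack's formula
  description: "Suggest a company name for a prompt.",
  parameters: [
    coda.makeParameter({
      type: coda.ParameterType.String,
      name: "prompt",
      description: "The prompt to send to the model.",
    }),
  ],
  resultType: coda.ValueType.String,
  // A TTL of 0 tells Coda not to reuse a previous result for identical
  // inputs, so re-running the same prompt calls the API again.
  cacheTtlSecs: 0,
  execute: async function ([prompt], context) {
    try {
      const response = await context.fetcher.fetch({
        method: "POST",
        url: "https://api.openai.com/v1/chat/completions",
        headers: {"Content-Type": "application/json"},
        body: JSON.stringify({
          model: "gpt-3.5-turbo", // illustrative model choice
          temperature: 1,
          messages: [{role: "user", content: prompt}],
        }),
        cacheTtlSecs: 0, // also skip the fetcher-level response cache
      });
      return response.body.choices[0].message.content;
    } catch (error) {
      // Throwing a UserVisibleError surfaces the failure message in the doc
      // cell instead of a generic error.
      throw new coda.UserVisibleError(`OpenAI request failed: ${error}`);
    }
  },
});
```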

Have you taken a look at our new AI Alpha? Coda AI: the magic of OpenAI and GPT in your Coda docs - Coda


I want to make statistical control charts (X and MR charts specifically). I’m confident I could complete this task in R. Is there a Coda R plugin or another method to accomplish this?

Thanks. This solves it for me, I think.


This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.