I’d love to be able to pass all the filtered data just by doing table.filter and have it still give me a summary.
I understand a workaround is to create a filtered table and then pass that to the AI, but that adds an extra step and creates a table that I don’t think is necessary.
That’s the thing… I was hoping I wouldn’t have to create a view and could skip that step for speed.
Your second part is a nice fix! I guess it’s just how the underlying Coda formula language works, so perhaps I’ll never be able to just pass tableName.filter(date=today) and get all of the data returned rather than just the display name.
I see why you want to do this, but it’s risky: it won’t scale. Eventually you will swamp the prompt window, and only errors will occur thereafter. My tests have confirmed that Coda AI doesn’t manage this, so nothing defends the inferencing requests from calls it cannot perform. I think this will change in the future: prompt windows will expand, and Coda AI will get smarter about filters and other aggregations.
For the foreseeable future, aggregations are what we should be throwing at Coda AI, not entire tables.
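To make the point concrete, here’s a minimal sketch of aggregating before prompting. The row data and column names are made up for illustration, standing in for what a Coda filter formula would return; the idea is that the prompt carries a small summary, not every row.

```python
from collections import defaultdict

def summarize(rows, group_key, value_key):
    """Collapse rows into per-group totals so the prompt stays small.

    `rows` stands in for a filtered table; the column names are hypothetical.
    """
    totals = defaultdict(float)
    for row in rows:
        totals[row[group_key]] += row[value_key]
    return dict(totals)

# Stand-in for the rows a filter like tableName.Filter(Date = Today()) would match:
rows = [
    {"owner": "Ana", "hours": 2.5},
    {"owner": "Ben", "hours": 1.0},
    {"owner": "Ana", "hours": 4.0},
]

summary = summarize(rows, "owner", "hours")
# The AI sees a few dozen characters instead of the whole table.
prompt = f"Summarize today's workload per owner: {summary}"
print(prompt)
```

The aggregation costs one pass over the data, and the prompt size no longer grows with the row count.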
To begin with, tables are typically not in Parquet format, and as such the data itself is 90% bigger than it needs to be. Columnar data structures, like those used throughout Python’s data-science stack, are arrays of columns, and there’s plenty of science showing they are massively more efficient for AI inferencing. Coda AI should do the same: regardless of the table format, it should transform the data into Parquet, because it’s faster, smaller, and generative AI understands it better. Until then, you should be transforming all data into compressed Parquet formats.
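Writing real Parquet needs a library such as pyarrow, but the size effect of a columnar, compressed layout, which is the core of what Parquet does, can be illustrated with the standard library alone. The table here is invented for the demonstration.

```python
import gzip
import json

# A small row-oriented table (made-up data), the shape Coda hands over.
rows = [{"project": f"Project {i % 3}", "status": "Open", "hours": i % 8}
        for i in range(200)]

# Row-oriented JSON repeats every column name on every row.
row_json = json.dumps(rows).encode()

# A columnar layout stores each column name once, with its values as an
# array -- the idea behind Parquet (which adds per-column encodings on top).
columns = {key: [row[key] for row in rows] for key in rows[0]}
col_json = json.dumps(columns).encode()

# Compression works well on both, and especially on columns, because
# similar values sit next to each other.
row_gz = gzip.compress(row_json)
col_gz = gzip.compress(col_json)

print(f"rows: {len(row_json)}B raw, {len(row_gz)}B gzipped")
print(f"cols: {len(col_json)}B raw, {len(col_gz)}B gzipped")
```

Running this shows the columnar form is smaller before compression, and compression shrinks both dramatically; real Parquet goes further with typed, per-column encodings.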
I don’t actually recall suggesting this prompt, and as I read yours, I get the sense you are defeating the intent of Parquet. Here’s why…
Returning the table as Parquet and then converting it into JSON puts all of the bloat back into the table data that you were trying to eliminate with a CSV or other compact table representation.
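That bloat is easy to see with the standard library: serializing the same made-up rows as JSON repeats every field name on every row, while CSV states the header exactly once.

```python
import csv
import io
import json

# Invented rows for illustration.
rows = [{"task": f"Task {i}", "owner": "Ana", "done": i % 2 == 0}
        for i in range(50)]

# JSON repeats "task", "owner", "done" on every single row.
as_json = json.dumps(rows)

# CSV writes the header once, then bare values.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["task", "owner", "done"])
writer.writeheader()
writer.writerows(rows)
as_csv = buf.getvalue()

print(f"JSON: {len(as_json)} chars, CSV: {len(as_csv)} chars")
```

So round-tripping Parquet through JSON hands the model the most verbose representation of the three.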
I would test the accuracy of outputs without converting to JSON, just to see if it’s more reliable.