If I submit an AI prompt that can produce results of indeterminate size, from a one-liner to a 50-page eBook with text, images, and formatting, what’s the best way to handle that?
Maybe it can be stuffed into a Zip file that’s then downloaded when it’s finished?
Can this be done asynchronously, with a callback or something when the results are complete?
It doesn’t matter how large the response is: whether it’s a single line or a large block of text with tens of thousands of characters, it will still be returned as a single string variable. So, essentially, you don’t need to do anything special on your side.
The output size is usually limited by the maximum number of output tokens supported by the AI model itself, but that’s a limitation of the model, not of our platform.
So, to answer your question: by default, regardless of the output size, everything will always be returned in the same variable. If you have any further questions, I’ll be happy to help.
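If you do want the downloadable-ZIP behavior you described, you can handle it entirely on your side once you have the result string. Here is a minimal sketch in Python; the function name `save_result_as_zip` and the entry name `result.txt` are my own illustrative choices, not part of any platform API:

```python
import zipfile
from pathlib import Path

def save_result_as_zip(result_text: str, out_path: str) -> Path:
    """Write an AI result string of any size into a downloadable ZIP archive.

    Since the result arrives as a single string regardless of size,
    one archive entry is enough whether it's a one-liner or a 50-page
    document's worth of text.
    """
    path = Path(out_path)
    with zipfile.ZipFile(path, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        # writestr accepts the whole string in one call; no chunking needed.
        zf.writestr("result.txt", result_text)
    return path

# Works the same for a short answer or tens of thousands of characters:
archive = save_result_as_zip("line of output\n" * 10_000, "result.zip")
```

The same function could be called from whatever completion handler or webhook your workflow uses, since by that point the full string is already in hand.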