Set the cut-off limit for GPT functions in GPT for Sheets

Set a cut-off limit for the responses of the GPT functions in the current spreadsheet. The limit does not shape the response: if a response exceeds the limit, it is truncated. This setting does not affect bulk AI tools.

  • Token: Tokens can be thought of as pieces of words. During processing, the language model breaks down both the input (prompt) and the output (result) texts into smaller units called tokens. A token generally corresponds to ~4 characters of common English text, so 100 tokens correspond to roughly 75 words. See how text is split into tokens.

  • Context window: The total number of tokens that the model can consider at one time, including input (prompt) and output (result). The context window size depends on the model used.

  • Max output: The maximum number of tokens that a given model can generate in the output. Max output is typically much lower than the context window.

  • Cut-off limit: The maximum size of the result in GPT for Sheets, measured in tokens. If the result is larger than this limit, it is truncated. This helps control cost and speed. The cut-off limit is set 200 tokens below max output.
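The rules of thumb above (~4 characters per token, and a cut-off limit 200 tokens below max output) can be sketched as rough arithmetic. The function names here are illustrative only, not part of GPT for Sheets:

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters of common English text per token."""
    return max(1, round(len(text) / 4))

def default_cutoff_limit(max_output_tokens: int) -> int:
    """The cut-off limit sits 200 tokens below a model's max output."""
    return max_output_tokens - 200

# ~100 tokens correspond to roughly 75 words of English text.
sample = "word " * 75                # 75 short words, 5 characters each
print(estimate_tokens(sample))       # close to the 100-token rule of thumb
print(default_cutoff_limit(4096))    # hypothetical 4096-token max output
```

This is only an estimate; the exact token count depends on the model's tokenizer.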
  1. In the sidebar, select GPT functions, and click Model settings.

    Model settings in GPT for Sheets
  2. Set Cut-off limit as follows:

    • If you expect short responses, lower the limit to get faster responses.

    • If you expect long responses, increase the limit to make sure they are not truncated.

    • If your response is truncated, increase the limit.

    Use the slider to set a cut-off limit
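The guidance in step 2 can be expressed as a rough sizing heuristic, assuming the ~100 tokens per 75 words ratio from the definitions above. `choose_cutoff` is a hypothetical helper for picking a slider value, not part of the add-on:

```python
def choose_cutoff(expected_words: int, headroom: float = 1.5) -> int:
    """Pick a cut-off limit from an expected response length in words.

    Converts words to tokens (~100 tokens per 75 words) and multiplies by a
    headroom factor so responses are not truncated if they run a bit long.
    """
    expected_tokens = expected_words * 100 / 75
    return round(expected_tokens * headroom)

# Short answers -> lower limit (faster); long answers -> higher limit.
print(choose_cutoff(50))    # a brief answer
print(choose_cutoff(500))   # a long-form answer
```

If responses still come back truncated, increase the headroom (or the limit directly) rather than trimming the prompt.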

GPT for Sheets now applies the cut-off limit to all GPT formula results.

info

When caching is enabled, re-executing existing GPT formulas does not apply the new cut-off limit, because their results are served from the cache.

To re-execute existing formulas with a different cut-off limit, you can either:

What's next