Whether or not to include usage data, such as token counts,
in the response. If set to true, two additional API calls are
made to fetch token counts after the model has responded. If
streaming is enabled, an additional chunk containing the token
usage is appended at the end of the stream.
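A minimal sketch of the streaming behavior described above; the function and field names (`stream_chat`, `include_usage`, `"usage"`) are assumptions for illustration, not a real library API.

```python
def stream_chat(prompt, include_usage=False):
    """Yield response chunks; if include_usage is true, append a
    final chunk carrying the token usage at the end of the stream."""
    # Placeholder response chunks standing in for the model output.
    chunks = [{"text": "Hello"}, {"text": ", world"}]
    for chunk in chunks:
        yield chunk
    if include_usage:
        # In a real client, the token counts here would come from the
        # additional API calls made after the model has responded.
        yield {"usage": {"input_tokens": 3, "output_tokens": 2}}

result = list(stream_chat("hi", include_usage=True))
```

With `include_usage=True`, the last element of `result` is the usage chunk; without it, only the text chunks are yielded.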