Input interface for ChatCohere

Optional apiKey: The API key to use.

Optional model: The name of the model to use.

Optional streamUsage: Whether or not to include token usage when streaming. This will include an extra chunk at the end of the stream with eventType: "stream-end" and the token usage in usage_metadata.

Optional streaming: Whether or not to stream the response.

Optional temperature: What sampling temperature to use, between 0.0 and 2.0. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic.