OpenAI API

ChatCompletionStreamOptions

object

Options for streaming response. Only set this when you set stream: true.

Default: null

include_usage (boolean)

If set, an additional chunk will be streamed before the data: [DONE] message. The usage field on this chunk shows the token usage statistics for the entire request, and the choices field will always be an empty array. All other chunks will also include a usage field, but with a null value.
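A minimal sketch of how a client might consume such a stream. The chunk dicts below are hand-built stand-ins that only mirror the documented shape (real chunks come from the API, and the token counts here are illustrative):

```python
# Simulated chunks as a client would receive them with
# stream_options={"include_usage": True}.
chunks = [
    {"choices": [{"delta": {"content": "Hel"}}], "usage": None},
    {"choices": [{"delta": {"content": "lo"}}], "usage": None},
    # The extra final chunk before data: [DONE]: empty choices, populated usage.
    {"choices": [], "usage": {"prompt_tokens": 5, "completion_tokens": 2, "total_tokens": 7}},
]

text_parts = []
usage = None
for chunk in chunks:
    if chunk["choices"]:
        text_parts.append(chunk["choices"][0]["delta"].get("content") or "")
    if chunk["usage"] is not None:
        usage = chunk["usage"]  # only the final chunk carries non-null usage

print("".join(text_parts))    # Hello
print(usage["total_tokens"])  # 7
```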


ChatCompletionStreamResponseDelta

object

A chat completion delta generated by streamed model responses.

content (string)

The contents of the chunk message.

function_call (object, deprecated)

Deprecated and replaced by tool_calls. The name and arguments of a function that should be called, as generated by the model.

tool_calls (array[object])
role (string)

The role of the author of this message.

Allowed values: developer, system, user, assistant, tool

refusal (string)

The refusal message generated by the model.
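A sketch of accumulating these deltas into a complete message. The delta dicts are illustrative: typically the first chunk carries the role and later chunks carry content fragments:

```python
# Hand-built deltas mirroring ChatCompletionStreamResponseDelta.
deltas = [
    {"role": "assistant"},
    {"content": "The answer "},
    {"content": "is 42."},
]

message = {"role": None, "content": ""}
for delta in deltas:
    if "role" in delta:
        message["role"] = delta["role"]       # role arrives once, in the first delta
    if delta.get("content"):
        message["content"] += delta["content"]  # content arrives as fragments

print(message)  # {'role': 'assistant', 'content': 'The answer is 42.'}
```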


ChatCompletionTokenLogprob

object
token (string, required)

The token.

logprob (number, required)

The log probability of this token, if it is within the top 20 most likely tokens. Otherwise, the value -9999.0 is used to signify that the token is very unlikely.

bytes (array[integer], required)

A list of integers representing the UTF-8 byte representation of the token. Useful when characters are represented by multiple tokens and their byte representations must be combined to generate the correct text representation. Can be null if there is no bytes representation for the token.

top_logprobs (array[object], required)

List of the most likely tokens and their log probabilities at this token position. In rare cases, fewer than the requested number of top_logprobs may be returned.
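A small sketch of why the bytes field matters: a multi-byte character can be split across tokens, so decoding each token's bytes alone fails, while decoding the concatenation succeeds. The two-token split of "é" below is a hypothetical example:

```python
# Two hypothetical tokens that each carry half of the UTF-8 bytes
# for "é" (0xC3 0xA9), as reported in their bytes fields.
token_bytes = [[0xC3], [0xA9]]

# Concatenate the byte lists first, then decode once.
combined = bytes(b for token in token_bytes for b in token)
print(combined.decode("utf-8"))  # é
```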


ChatCompletionTool

object
type (string, required)

The type of the tool. Currently, only function is supported.

Allowed values: function

function (object, required)
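A sketch of a ChatCompletionTool payload. The get_weather name, description, and parameter schema are made up for illustration; type must be function, per the allowed values above:

```python
# A tool definition as it would appear in the request's tools array.
weather_tool = {
    "type": "function",  # the only supported tool type
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {  # JSON Schema describing the function's arguments
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}
```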

ChatCompletionToolChoiceOption

Controls which (if any) tool is called by the model.
none means the model will not call any tool and instead generates a message.
auto means the model can pick between generating a message or calling one or more tools.
required means the model must call one or more tools.
Specifying a particular tool via {"type": "function", "function": {"name": "my_function"}} forces the model to call that tool.

none is the default when no tools are present. auto is the default if tools are present.
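The documented forms, side by side, with a sketch of client-side validation for them (my_function is a placeholder name, and is_valid_tool_choice is a hypothetical helper, not part of any SDK):

```python
# The shapes tool_choice can take, per the description above.
choices = [
    "none",      # never call a tool; always generate a message
    "auto",      # model decides between a message and tool calls
    "required",  # model must call one or more tools
    {"type": "function", "function": {"name": "my_function"}},  # force this tool
]

def is_valid_tool_choice(value):
    """Return True if value matches one of the documented shapes."""
    if value in ("none", "auto", "required"):
        return True
    return (
        isinstance(value, dict)
        and value.get("type") == "function"
        and isinstance(value.get("function", {}).get("name"), str)
    )

assert all(is_valid_tool_choice(c) for c in choices)
```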

One of: a string (none, auto, or required) or an object naming a specific tool.