# OpenAI

## OpenAIResource

`OpenAIResource(client=None)`

Base class for OpenAI resources.

## ChatCompletionResource

`ChatCompletionResource(*, client=None, model='gpt-3.5-turbo', stream=False, system=None, **create_kwargs)`

Bases: `OpenAIResource`

OpenAIResource class for chat completions.

Constructor method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `client` | `AsyncOpenAI` | An `AsyncOpenAI` instance. | `None` |
| `model` | `str` | The model to use for completions. | `'gpt-3.5-turbo'` |
| `stream` | `bool` | Whether to stream completions. | `False` |
| `system` | `str` | A system message to prepend to the messages. | `None` |
| `**create_kwargs` | `dict[str, Any]` | Keyword arguments to pass to the `chat.completions.create` method. | `{}` |
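Example (a minimal construction sketch; this reference does not show the package's module path, so `your_package.openai` is a placeholder, and `temperature` only illustrates a keyword forwarded through `**create_kwargs`):

```python
from openai import AsyncOpenAI

from your_package.openai import ChatCompletionResource  # placeholder import path

resource = ChatCompletionResource(
    client=AsyncOpenAI(),  # reads OPENAI_API_KEY from the environment
    model="gpt-3.5-turbo",
    stream=True,
    system="You are a helpful assistant.",
    temperature=0.2,  # forwarded to chat.completions.create via **create_kwargs
)
```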

### stream_response `async`

`stream_response(messages)`

Stream chat completions.

If the `stream` attribute is `False`, the generator yields a single completion; otherwise it yields completion chunks.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `messages` | `list[dict]` | A list of messages to use for the completion. Message format: `{"role": "user", "content": "Hello, world!"}` | *required* |
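Example (a hedged sketch of consuming the async generator; the import path is a placeholder, and the shape of the yielded objects follows the description above):

```python
import asyncio

from openai import AsyncOpenAI

from your_package.openai import ChatCompletionResource  # placeholder import path

async def main() -> None:
    resource = ChatCompletionResource(client=AsyncOpenAI(), stream=True)
    messages = [{"role": "user", "content": "Hello, world!"}]
    # stream=True: each yielded item is a completion chunk.
    # stream=False: the generator yields a single full completion.
    async for part in resource.stream_response(messages):
        print(part)

asyncio.run(main())
```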

## OpenAIAPIRoute

`OpenAIAPIRoute(path, endpoint, *, response_model=Default(None), **kwargs)`

Bases: `APIRoute`

APIRoute class for OpenAI resources.

Constructor method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `path` | `str` | The path for the route. | *required* |
| `endpoint` | `Callable[..., Any]` | The endpoint to call when the route is requested. | *required* |
| `response_model` | `Any` | The response model to use for the route. | `Default(None)` |
| `**kwargs` | `dict[str, Any]` | Keyword arguments to pass to the parent constructor. | `{}` |

## OpenAIAPIWebSocketRoute

`OpenAIAPIWebSocketRoute(path, endpoint, *, name=None, **kwargs)`

Bases: `APIWebSocketRoute`

APIWebSocketRoute class for OpenAI resources.

Constructor method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `path` | `str` | The path for the route. | *required* |
| `endpoint` | `Callable[..., Any]` | The endpoint to call when the route is requested. | *required* |
| `name` | `Optional[str]` | The name of the route. | `None` |
| `**kwargs` | `dict[str, Any]` | Keyword arguments to pass to the parent constructor. | `{}` |

## OpenAIAPIRouter

`OpenAIAPIRouter(*, route_class=OpenAIAPIRoute, **kwargs)`

Bases: `APIRouter`

APIRouter class for OpenAI resources.
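Example (a sketch of wiring the router into a FastAPI app; it assumes routes are declared with the usual `APIRouter` decorators and that the endpoint is an OpenAI resource factory, as suggested by `build_factory_api_endpoint` below, and the import path is a placeholder):

```python
from fastapi import FastAPI
from openai import AsyncOpenAI

from your_package.openai import ChatCompletionResource, OpenAIAPIRouter  # placeholder import path

router = OpenAIAPIRouter()  # uses OpenAIAPIRoute as the route class by default

@router.post("/chat")  # assumption: standard APIRouter decorators are supported
def chat_resource() -> ChatCompletionResource:
    # Resource factory: returns the OpenAI resource that backs this route.
    return ChatCompletionResource(client=AsyncOpenAI(), stream=True)

app = FastAPI()
app.include_router(router)
```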

## StreamingResponse

`StreamingResponse(resource, messages, *args, **kwargs)`

Bases: `StreamingResponse`

StreamingResponse class for OpenAI resources.

Constructor method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `resource` | `OpenAIResource` | An `OpenAIResource` instance. | *required* |
| `messages` | `list[Message]` | A list of `Message` instances. | *required* |
| `*args` | `Any` | Positional arguments to pass to the parent constructor. | `()` |
| `**kwargs` | `dict[str, Any]` | Keyword arguments to pass to the parent constructor. | `{}` |
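Example (a hedged sketch of returning the response class from a plain endpoint; the import path is a placeholder, and plain role/content dicts stand in for the `Message` model, which is not documented on this page):

```python
from fastapi import FastAPI
from openai import AsyncOpenAI

from your_package.openai import ChatCompletionResource, StreamingResponse  # placeholder import path

app = FastAPI()
resource = ChatCompletionResource(client=AsyncOpenAI(), stream=True)

@app.post("/chat/stream")
async def chat_stream(messages: list[dict]):
    # Streams the resource's completions to the client as they arrive.
    return StreamingResponse(resource, messages)
```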

### stream_response `async`

`stream_response(send)`

Stream chat completions.

If an exception occurs while iterating over the OpenAI resource, an internal server error is sent to the client.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `send` | `Send` | The ASGI send callable. | *required* |

## Depends

`Depends(dependency, *, dependency_kwargs={}, use_cache=True)`

Dependency injection for OpenAI resources.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `dependency` | `Optional[Callable[..., Any]]` | A "dependable" resource factory callable. | *required* |
| `dependency_kwargs` | `dict[str, Any]` | Keyword arguments to pass to the resource dependency. | `{}` |
| `use_cache` | `bool` | The `use_cache` parameter of `fastapi.Depends`. | `True` |
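Example (a hedged sketch of the OpenAI-aware `Depends`; the import path is a placeholder, and it assumes, based on the signature above, that `dependency_kwargs` is forwarded to the resource factory when the dependency is resolved):

```python
from fastapi import FastAPI
from openai import AsyncOpenAI

from your_package.openai import ChatCompletionResource, Depends  # placeholder import path

def chat_resource(model: str = "gpt-3.5-turbo") -> ChatCompletionResource:
    return ChatCompletionResource(client=AsyncOpenAI(), model=model)

app = FastAPI()

@app.post("/chat")
async def chat(
    resource: ChatCompletionResource = Depends(
        chat_resource,
        dependency_kwargs={"model": "gpt-4o-mini"},  # assumed to be passed to chat_resource
    ),
):
    # The resolved ChatCompletionResource is injected here, ready to use.
    ...
```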

## build_factory_api_endpoint

`build_factory_api_endpoint(path, endpoint)`

Build a factory endpoint for API routes.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `path` | `str` | The path for the route. | *required* |
| `endpoint` | `Callable[..., Any]` | An OpenAI resource factory function. | *required* |
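Example (a hedged sketch; the return value is assumed to be an endpoint callable derived from the factory, and the import path is a placeholder):

```python
from openai import AsyncOpenAI

from your_package.openai import ChatCompletionResource, build_factory_api_endpoint  # placeholder import path

def chat_resource() -> ChatCompletionResource:
    return ChatCompletionResource(client=AsyncOpenAI(), stream=True)

# Assumed to return an API endpoint callable built from the resource factory.
endpoint = build_factory_api_endpoint("/chat", chat_resource)
```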

## build_factory_websocket_endpoint

`build_factory_websocket_endpoint(path, endpoint)`

Build a factory endpoint for WebSocket routes.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `path` | `str` | The path for the route. | *required* |
| `endpoint` | `Callable[..., Any]` | An OpenAI resource factory function. | *required* |

## compile_openai_resource_factory

`compile_openai_resource_factory(endpoint)`

Compile an OpenAI resource factory function.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `endpoint` | `Callable[..., Any]` | An OpenAI resource factory function. | *required* |

Returns:

| Type | Description |
| --- | --- |
| `OpenAIResource` | An `OpenAIResource` instance. |
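Example (a hedged sketch based on the signature and return type above; the import path is a placeholder):

```python
from openai import AsyncOpenAI

from your_package.openai import (  # placeholder import path
    ChatCompletionResource,
    compile_openai_resource_factory,
)

def chat_resource() -> ChatCompletionResource:
    return ChatCompletionResource(client=AsyncOpenAI())

# Returns an OpenAIResource instance produced from the factory.
resource = compile_openai_resource_factory(chat_resource)
```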

## compile_model_prefix

`compile_model_prefix(path, resource)`

Compile a prefix for pydantic models.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `path` | `str` | The path for the route. | *required* |
| `resource` | `OpenAIResource` | An `OpenAIResource` instance. | *required* |

## create_request_model

`create_request_model(resource, prefix='')`

Create a pydantic model for incoming requests.

Note: support is limited to the ChatCompletion resource.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `resource` | `ChatCompletionResource` | A `ChatCompletionResource` instance. | *required* |
| `prefix` | `str` | A prefix for the model name. | `''` |

## create_response_model

`create_response_model(resource, prefix=None)`

Create a pydantic model for responses.

Note: support is limited to the ChatCompletion resource.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `resource` | `ChatCompletionResource` | A `ChatCompletionResource` instance. | *required* |
| `prefix` | `str` | A prefix for the model name. | `None` |
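Example (a hedged sketch covering both model helpers; the import path is a placeholder, and the field sets of the generated pydantic models are not documented on this page):

```python
from openai import AsyncOpenAI

from your_package.openai import (  # placeholder import path
    ChatCompletionResource,
    create_request_model,
    create_response_model,
)

resource = ChatCompletionResource(client=AsyncOpenAI())

# Generate per-route pydantic models; the prefix keeps model names unique
# (compare compile_model_prefix above).
RequestModel = create_request_model(resource, prefix="Chat")
ResponseModel = create_response_model(resource, prefix="Chat")
```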