
# LangChain

## LangchainAPIRoute

`LangchainAPIRoute(path, endpoint, *, response_model=Default(None), **kwargs)`

Bases: `APIRoute`

APIRoute class for LangChain.

Constructor method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `path` | `str` | The path for the route. | *required* |
| `endpoint` | `Callable[..., Any]` | The endpoint to call when the route is requested. | *required* |
| `response_model` | `Any` | The response model to use for the route. | `Default(None)` |
| `**kwargs` | `dict[str, Any]` | Keyword arguments to pass to the parent constructor. | `{}` |

## LangchainAPIWebSocketRoute

`LangchainAPIWebSocketRoute(path, endpoint, *, name=None, **kwargs)`

Bases: `APIWebSocketRoute`

APIWebSocketRoute class for LangChain.

Constructor method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `path` | `str` | The path for the route. | *required* |
| `endpoint` | `Callable[..., Any]` | The endpoint to call when the route is requested. | *required* |
| `name` | `Optional[str]` | The name of the route. | `None` |
| `**kwargs` | `dict[str, Any]` | Keyword arguments to pass to the parent constructor. | `{}` |

## LangchainAPIRouter

`LangchainAPIRouter(*, route_class=LangchainAPIRoute, **kwargs)`

Bases: `APIRouter`

APIRouter class for LangChain.

## ChainRunMode

Bases: `StrEnum`

Enum for LangChain run modes.

## StreamingResponse

`StreamingResponse(chain, config, run_mode=ChainRunMode.ASYNC, *args, **kwargs)`

Bases: `StreamingResponse`

StreamingResponse class for LangChain resources.

Constructor method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `chain` | `Chain` | A LangChain instance. | *required* |
| `config` | `dict[str, Any]` | A config dict. | *required* |
| `*args` | `Any` | Positional arguments to pass to the parent constructor. | `()` |
| `**kwargs` | `dict[str, Any]` | Keyword arguments to pass to the parent constructor. | `{}` |

### stream_response `async`

`stream_response(send)`

Stream LangChain outputs.

If an exception occurs while iterating over the chain's outputs, an internal server error is sent to the client.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `send` | `Send` | The ASGI send callable. | *required* |
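The behavior described above can be sketched as a plain ASGI-style send loop: stream chunks as response-body messages, and fall back to an error payload if iteration fails. The message shapes and the error payload below are simplified stand-ins, not Lanarky's actual implementation:

```python
import asyncio
import json


async def stream_response(chunks, send):
    # Sketch: stream chunks over an ASGI-style ``send`` callable; on error,
    # emit an error payload instead of crashing the connection.
    await send({"type": "http.response.start", "status": 200})
    try:
        async for chunk in chunks:
            await send(
                {"type": "http.response.body", "body": chunk.encode(), "more_body": True}
            )
    except Exception:
        # Hypothetical error payload; Lanarky's actual wire format may differ.
        body = json.dumps({"error": "Internal Server Error"}).encode()
        await send({"type": "http.response.body", "body": body, "more_body": True})
    await send({"type": "http.response.body", "body": b"", "more_body": False})


async def demo():
    async def tokens():
        for token in ("Hello", ", ", "world"):
            yield token

    sent = []

    async def send(message):
        sent.append(message)

    await stream_response(tokens(), send)
    return sent


messages = asyncio.run(demo())
body = b"".join(m.get("body", b"") for m in messages if m["type"] == "http.response.body")
print(body.decode())  # → Hello, world
```

The key design point mirrored here is that the error is delivered *in-band* as a body message: once streaming has started, the status line has already been sent, so the handler cannot switch to a 500 response.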

## LanarkyCallbackHandler

`LanarkyCallbackHandler(**kwargs)`

Bases: `AsyncCallbackHandler`

Base callback handler for Lanarky applications.

### always_verbose `property`

Verbose mode is always enabled for Lanarky applications.

## StreamingCallbackHandler

`StreamingCallbackHandler(*, send=None, **kwargs)`

Bases: `LanarkyCallbackHandler`

Callback handler for streaming responses.

Constructor method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `send` | `Send` | The ASGI send callable. | `None` |
| `**kwargs` | `dict[str, Any]` | Keyword arguments to pass to the parent constructor. | `{}` |

### always_verbose `property`

Verbose mode is always enabled for Lanarky applications.

## TokenEventData

Bases: `BaseModel`

Event data payload for tokens.

## TokenStreamingCallbackHandler

`TokenStreamingCallbackHandler(*, output_key, mode=TokenStreamMode.JSON, **kwargs)`

Bases: `StreamingCallbackHandler`

Callback handler for streaming tokens.

Constructor method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `output_key` | `str` | The chain output key. | *required* |
| `mode` | `TokenStreamMode` | The stream mode. | `JSON` |
| `**kwargs` | `dict[str, Any]` | Keyword arguments to pass to the parent constructor. | `{}` |

### always_verbose `property`

Verbose mode is always enabled for Lanarky applications.

### on_chain_start `async`

`on_chain_start(*args, **kwargs)`

Run when the chain starts running.

### on_llm_new_token `async`

`on_llm_new_token(token, **kwargs)`

Run on each new LLM token. Only available when streaming is enabled.

### on_chain_end `async`

`on_chain_end(outputs, **kwargs)`

Run when the chain ends running.

The final output is streamed only if the LLM cache is enabled.
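The token flow above — `on_llm_new_token` forwarding each token as it arrives, framed according to the stream mode — can be sketched with a stand-in handler. The class, the mode values, and the JSON payload shape here are assumptions for illustration, not Lanarky's actual implementation:

```python
import asyncio
import json


class TokenStreamingCallback:
    """Hypothetical stand-in for TokenStreamingCallbackHandler: forwards
    each new LLM token to a ``send`` callable, framed by stream mode."""

    def __init__(self, send, mode="json"):
        self.send = send
        self.mode = mode  # "json" or "text", mirroring TokenStreamMode

    async def on_llm_new_token(self, token, **kwargs):
        # JSON mode wraps the token in an event payload; text mode sends it raw.
        payload = json.dumps({"token": token}) if self.mode == "json" else token
        await self.send(payload)


async def demo():
    out = []

    async def send(data):
        out.append(data)

    handler = TokenStreamingCallback(send, mode="json")
    for token in ("Hi", "!"):
        await handler.on_llm_new_token(token)
    return out


print(asyncio.run(demo()))  # → ['{"token": "Hi"}', '{"token": "!"}']
```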

## SourceDocumentsEventData

Bases: `BaseModel`

Event data payload for source documents.

## SourceDocumentsStreamingCallbackHandler

`SourceDocumentsStreamingCallbackHandler(*, send=None, **kwargs)`

Bases: `StreamingCallbackHandler`

Callback handler for streaming source documents.

Constructor method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `send` | `Send` | The ASGI send callable. | `None` |
| `**kwargs` | `dict[str, Any]` | Keyword arguments to pass to the parent constructor. | `{}` |

### always_verbose `property`

Verbose mode is always enabled for Lanarky applications.

### on_chain_end `async`

`on_chain_end(outputs, **kwargs)`

Run when the chain ends running.

## FinalTokenStreamingCallbackHandler

`FinalTokenStreamingCallbackHandler(*, answer_prefix_tokens=None, strip_tokens=True, stream_prefix=False, **kwargs)`

Bases: `TokenStreamingCallbackHandler`, `FinalStreamingStdOutCallbackHandler`

Callback handler for streaming final answer tokens.

Useful for streaming responses from LangChain agents.

Constructor method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `answer_prefix_tokens` | `Optional[list[str]]` | The answer prefix tokens to use. | `None` |
| `strip_tokens` | `bool` | Whether to strip tokens. | `True` |
| `stream_prefix` | `bool` | Whether to stream the answer prefix. | `False` |
| `**kwargs` | `dict[str, Any]` | Keyword arguments to pass to the parent constructor. | `{}` |

### always_verbose `property`

Verbose mode is always enabled for Lanarky applications.

### on_chain_start `async`

`on_chain_start(*args, **kwargs)`

Run when the chain starts running.

### on_chain_end `async`

`on_chain_end(outputs, **kwargs)`

Run when the chain ends running.

The final output is streamed only if the LLM cache is enabled.

### on_llm_start `async`

`on_llm_start(*args, **kwargs)`

Run when the LLM starts running.

### on_llm_new_token `async`

`on_llm_new_token(token, **kwargs)`

Run on each new LLM token. Only available when streaming is enabled.

## WebSocketCallbackHandler

`WebSocketCallbackHandler(*, mode=TokenStreamMode.JSON, websocket=None, **kwargs)`

Bases: `LanarkyCallbackHandler`

Callback handler for websocket sessions.

Constructor method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `mode` | `TokenStreamMode` | The stream mode. | `JSON` |
| `websocket` | `WebSocket` | The websocket to use. | `None` |
| `**kwargs` | `dict[str, Any]` | Keyword arguments to pass to the parent constructor. | `{}` |

### always_verbose `property`

Verbose mode is always enabled for Lanarky applications.

## TokenWebSocketCallbackHandler

`TokenWebSocketCallbackHandler(*, output_key, **kwargs)`

Bases: `WebSocketCallbackHandler`

Callback handler for sending tokens in websocket sessions.

Constructor method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `output_key` | `str` | The chain output key. | *required* |
| `**kwargs` | `dict[str, Any]` | Keyword arguments to pass to the parent constructor. | `{}` |

### always_verbose `property`

Verbose mode is always enabled for Lanarky applications.

### on_chain_start `async`

`on_chain_start(*args, **kwargs)`

Run when the chain starts running.

### on_llm_new_token `async`

`on_llm_new_token(token, **kwargs)`

Run on each new LLM token. Only available when streaming is enabled.

### on_chain_end `async`

`on_chain_end(outputs, **kwargs)`

Run when the chain ends running.

The final output is streamed only if the LLM cache is enabled.

## SourceDocumentsWebSocketCallbackHandler

`SourceDocumentsWebSocketCallbackHandler(*, mode=TokenStreamMode.JSON, websocket=None, **kwargs)`

Bases: `WebSocketCallbackHandler`

Callback handler for sending source documents in websocket sessions.

Constructor method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `mode` | `TokenStreamMode` | The stream mode. | `JSON` |
| `websocket` | `WebSocket` | The websocket to use. | `None` |
| `**kwargs` | `dict[str, Any]` | Keyword arguments to pass to the parent constructor. | `{}` |

### always_verbose `property`

Verbose mode is always enabled for Lanarky applications.

### on_chain_end `async`

`on_chain_end(outputs, **kwargs)`

Run when the chain ends running.

## FinalTokenWebSocketCallbackHandler

`FinalTokenWebSocketCallbackHandler(*, answer_prefix_tokens=None, strip_tokens=True, stream_prefix=False, **kwargs)`

Bases: `TokenWebSocketCallbackHandler`, `FinalStreamingStdOutCallbackHandler`

Callback handler for sending final answer tokens in websocket sessions.

Useful for streaming responses from LangChain agents.

Constructor method.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `answer_prefix_tokens` | `Optional[list[str]]` | The answer prefix tokens to use. | `None` |
| `strip_tokens` | `bool` | Whether to strip tokens. | `True` |
| `stream_prefix` | `bool` | Whether to stream the answer prefix. | `False` |
| `**kwargs` | `dict[str, Any]` | Keyword arguments to pass to the parent constructor. | `{}` |

### always_verbose `property`

Verbose mode is always enabled for Lanarky applications.

### on_chain_start `async`

`on_chain_start(*args, **kwargs)`

Run when the chain starts running.

### on_chain_end `async`

`on_chain_end(outputs, **kwargs)`

Run when the chain ends running.

The final output is streamed only if the LLM cache is enabled.

### on_llm_start `async`

`on_llm_start(*args, **kwargs)`

Run when the LLM starts running.

### on_llm_new_token `async`

`on_llm_new_token(token, **kwargs)`

Run on each new LLM token. Only available when streaming is enabled.

## get_token_data

`get_token_data(token, mode)`

Get token data based on the stream mode.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `token` | `str` | The token to use. | *required* |
| `mode` | `TokenStreamMode` | The stream mode. | *required* |
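A minimal sketch of what the mode-dependent payloads could look like: the raw token in text mode, a JSON event payload in JSON mode. The `TokenStreamMode` members and the JSON shape below are assumptions for illustration, not Lanarky's actual output format:

```python
import json
from enum import Enum


class TokenStreamMode(str, Enum):
    """Stand-in for Lanarky's TokenStreamMode enum (assumed members)."""

    TEXT = "text"
    JSON = "json"


def get_token_data(token: str, mode: TokenStreamMode) -> str:
    # TEXT mode passes the token through untouched; JSON mode wraps it in
    # a structured event payload that clients can parse incrementally.
    if mode == TokenStreamMode.TEXT:
        return token
    return json.dumps({"token": token})


print(get_token_data("hello", TokenStreamMode.TEXT))  # → hello
print(get_token_data("hello", TokenStreamMode.JSON))  # → {"token": "hello"}
```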

## Depends

`Depends(dependency, *, dependency_kwargs={}, use_cache=True)`

Dependency injection for LangChain.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `dependency` | `Optional[Callable[..., Any]]` | A "dependable" chain factory callable. | *required* |
| `dependency_kwargs` | `dict[str, Any]` | Keyword arguments to pass to the chain dependency. | `{}` |
| `use_cache` | `bool` | The `use_cache` parameter of `fastapi.Depends`. | `True` |
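One plausible reading of `dependency_kwargs` is that the factory's keyword arguments are pre-bound before the callable is handed to `fastapi.Depends`. The sketch below shows that pattern with `functools.partial`; the helper name and the factory are hypothetical, and this is a guess at the mechanism, not Lanarky's confirmed implementation:

```python
from functools import partial


def make_dependency(dependency, dependency_kwargs=None):
    """Sketch: pre-bind keyword arguments to a chain factory so the result
    can be registered as a zero-argument dependency."""
    return partial(dependency, **(dependency_kwargs or {}))


def chain_factory(temperature=0.0, streaming=False):
    # Stand-in for a real chain factory; returns its config for illustration.
    return {"temperature": temperature, "streaming": streaming}


dependable = make_dependency(chain_factory, {"temperature": 0.7})
print(dependable())  # → {'temperature': 0.7, 'streaming': False}
```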

## build_factory_api_endpoint

`build_factory_api_endpoint(path, endpoint)`

Build a factory endpoint for API routes.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `path` | `str` | The path for the route. | *required* |
| `endpoint` | `Callable[..., Any]` | LangChain instance factory function. | *required* |
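Conceptually, a factory endpoint wraps the chain factory so that each request resolves a chain and invokes it with the request body. The stub below illustrates that shape with a fake chain; the stub class, its `acall` method, and the endpoint body are all stand-ins, not Lanarky's real routing code:

```python
import asyncio


class FakeChain:
    """Minimal stand-in for a LangChain chain with input/output keys."""

    input_keys = ["question"]
    output_keys = ["answer"]

    async def acall(self, inputs):
        # Pretend "inference": echo the question uppercased.
        return {"answer": inputs["question"].upper()}


def build_factory_api_endpoint(factory):
    """Sketch: turn a chain factory into a request handler that builds the
    chain and runs it against the (already validated) request body."""

    async def endpoint(request_body: dict) -> dict:
        chain = factory()
        return await chain.acall(request_body)

    return endpoint


endpoint = build_factory_api_endpoint(FakeChain)
print(asyncio.run(endpoint({"question": "hi"})))  # → {'answer': 'HI'}
```

The factory pattern is what lets the router accept a plain function returning a chain: the route machinery, not the user, decides when and how often the factory runs.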

## build_factory_websocket_endpoint

`build_factory_websocket_endpoint(path, endpoint)`

Build a factory endpoint for WebSocket routes.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `path` | `str` | The path for the route. | *required* |
| `endpoint` | `Callable[..., Any]` | LangChain instance factory function. | *required* |

## compile_chain_factory

`compile_chain_factory(endpoint)`

Compile a LangChain instance factory function.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `endpoint` | `Callable[..., Any]` | LangChain instance factory function. | *required* |

## create_request_model

`create_request_model(chain, prefix='')`

Create a pydantic request model for a LangChain instance.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `chain` | `Chain` | A LangChain instance. | *required* |
| `prefix` | `str` | A prefix for the model name. | `''` |
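The natural source for such a model is the chain's declared input keys. The sketch below derives a schema from `input_keys`, using a plain dict instead of pydantic to stay dependency-free; the stub chain, the model name scheme, and the all-`str` field types are illustrative assumptions:

```python
class FakeChain:
    """Minimal stand-in exposing LangChain's ``input_keys`` attribute."""

    input_keys = ["question", "context"]


def create_request_model(chain, prefix=""):
    # Sketch: one string field per chain input key, with an optionally
    # prefixed model name (Lanarky builds a real pydantic model instead).
    name = f"{prefix}ChainRequest" if prefix else "ChainRequest"
    return {"title": name, "fields": {key: "str" for key in chain.input_keys}}


model = create_request_model(FakeChain(), prefix="Chat")
print(model)
```

Generating the request model from the chain is what gives the auto-built endpoints typed request validation and OpenAPI schemas without any per-route boilerplate.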

## create_response_model

`create_response_model(chain, prefix=None)`

Create a pydantic response model for a LangChain instance.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `chain` | `Chain` | A LangChain instance. | *required* |
| `prefix` | `str` | A prefix for the model name. | `None` |

## compile_model_prefix

`compile_model_prefix(path, chain)`

Compile a prefix for pydantic models.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `path` | `str` | The path for the route. | *required* |
| `chain` | `Chain` | A LangChain instance. | *required* |
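A prefix of this kind typically combines the route path with the chain's class name so that models from different routes never collide. The exact scheme below is an assumption, shown only to make the idea concrete:

```python
def compile_model_prefix(path: str, chain_cls_name: str) -> str:
    """Hypothetical sketch: ("/chat", "ConversationChain") ->
    "ChatConversationChain". Lanarky's real scheme may differ."""
    # Capitalize each non-empty path segment and append the chain class name.
    path_part = "".join(part.capitalize() for part in path.strip("/").split("/") if part)
    return f"{path_part}{chain_cls_name}"


print(compile_model_prefix("/chat", "ConversationChain"))  # → ChatConversationChain
```

Unique prefixes matter because pydantic model names become OpenAPI schema component names, which must be distinct across the whole app.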

## get_streaming_callbacks

`get_streaming_callbacks(chain)`

Get streaming callbacks for a LangChain instance.

Note: This function might not support all LangChain chain and agent types. Please open an issue on GitHub to request support for a specific type.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `chain` | `Chain` | A LangChain instance. | *required* |

## get_websocket_callbacks

`get_websocket_callbacks(chain, websocket)`

Get websocket callbacks for a LangChain instance.

Note: This function might not support all LangChain chain and agent types. Please open an issue on GitHub to request support for a specific type.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `chain` | `Chain` | A LangChain instance. | *required* |
| `websocket` | `WebSocket` | A WebSocket instance. | *required* |