docs: clean up init_chat_model #26551

Merged 2 commits on Sep 16, 2024
34 changes: 3 additions & 31 deletions docs/docs/how_to/chat_models_universal_init.ipynb
@@ -15,43 +15,15 @@
"\n",
"Make sure you have the integration packages installed for any model providers you want to support. E.g. you should have `langchain-openai` installed to init an OpenAI model.\n",
"\n",
":::\n",
"\n",
":::info Requires ``langchain >= 0.2.8``\n",
"\n",
"This functionality was added in ``langchain-core == 0.2.8``. Please make sure your package is up to date.\n",
"\n",
":::"
]
},
{
"cell_type": "code",
"execution_count": 1,
"execution_count": null,
"id": "165b0de6-9ae3-4e3d-aa98-4fc8a97c4a06",
"metadata": {
"execution": {
"iopub.execute_input": "2024-09-10T20:22:32.858670Z",
"iopub.status.busy": "2024-09-10T20:22:32.858278Z",
"iopub.status.idle": "2024-09-10T20:22:33.009452Z",
"shell.execute_reply": "2024-09-10T20:22:33.007022Z"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"zsh:1: 0.2.8 not found\r\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Note: you may need to restart the kernel to use updated packages.\n"
]
}
],
"metadata": {},
"outputs": [],
"source": [
"%pip install -qU langchain>=0.2.8 langchain-openai langchain-anthropic langchain-google-vertexai"
]
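
The notebook output removed above, `zsh:1: 0.2.8 not found`, comes from the shell interpreting the unquoted `>=0.2.8` specifier rather than from a packaging problem. A minimal sketch of the same cell with the specifier quoted (package list unchanged, version pin taken from the text above):

```python
# Quoting keeps zsh from interpreting ">=0.2.8" itself, which is what produced the
# "zsh:1: 0.2.8 not found" output that this commit deletes.
%pip install -qU "langchain>=0.2.8" langchain-openai langchain-anthropic langchain-google-vertexai
```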
83 changes: 46 additions & 37 deletions libs/langchain/langchain/chat_models/base.py
@@ -98,48 +98,40 @@ def init_chat_model(

Must have the integration package corresponding to the model provider installed.

.. versionadded:: 0.2.7

.. versionchanged:: 0.2.8

Support for ``configurable_fields`` and ``config_prefix`` added.

.. versionchanged:: 0.2.12

Support for Ollama via langchain-ollama package added. Previously
langchain-community version of Ollama (now deprecated) was installed by default.

Args:
model: The name of the model, e.g. "gpt-4o", "claude-3-opus-20240229".
model_provider: The model provider. Supported model_provider values and the
corresponding integration package:
- openai (langchain-openai)
- anthropic (langchain-anthropic)
- azure_openai (langchain-openai)
- google_vertexai (langchain-google-vertexai)
- google_genai (langchain-google-genai)
- bedrock (langchain-aws)
- cohere (langchain-cohere)
- fireworks (langchain-fireworks)
- together (langchain-together)
- mistralai (langchain-mistralai)
- huggingface (langchain-huggingface)
- groq (langchain-groq)
- ollama (langchain-ollama) [support added in langchain==0.2.12]

- openai (langchain-openai)
- anthropic (langchain-anthropic)
- azure_openai (langchain-openai)
- google_vertexai (langchain-google-vertexai)
- google_genai (langchain-google-genai)
- bedrock (langchain-aws)
- cohere (langchain-cohere)
- fireworks (langchain-fireworks)
- together (langchain-together)
- mistralai (langchain-mistralai)
- huggingface (langchain-huggingface)
- groq (langchain-groq)
- ollama (langchain-ollama) [support added in langchain==0.2.12]

Will attempt to infer model_provider from model if not specified. The
following providers will be inferred based on these model prefixes:
- gpt-3... or gpt-4... -> openai
- claude... -> anthropic
- amazon.... -> bedrock
- gemini... -> google_vertexai
- command... -> cohere
- accounts/fireworks... -> fireworks

- gpt-3... or gpt-4... -> openai
- claude... -> anthropic
- amazon.... -> bedrock
- gemini... -> google_vertexai
- command... -> cohere
- accounts/fireworks... -> fireworks
configurable_fields: Which model parameters are
configurable:
- None: No configurable fields.
- "any": All fields are configurable. *See Security Note below.*
- Union[List[str], Tuple[str, ...]]: Specified fields are configurable.

- None: No configurable fields.
- "any": All fields are configurable. *See Security Note below.*
- Union[List[str], Tuple[str, ...]]: Specified fields are configurable.

Fields are assumed to have config_prefix stripped if there is a
config_prefix. If model is specified, then defaults to None. If model is
@@ -168,7 +160,9 @@ def init_chat_model(
ValueError: If model_provider cannot be inferred or isn't supported.
ImportError: If the model provider integration package is not installed.

Initialize non-configurable models:
.. dropdown:: Init non-configurable model
:open:

.. code-block:: python

# pip install langchain langchain-openai langchain-anthropic langchain-google-vertexai
@@ -183,7 +177,8 @@
gemini_15.invoke("what's your name")


Create a partially configurable model with no default model:
.. dropdown:: Partially configurable model with no default

.. code-block:: python

# pip install langchain langchain-openai langchain-anthropic
@@ -204,7 +199,8 @@
)
# claude-3.5 sonnet response

Create a fully configurable model with a default model and a config prefix:
.. dropdown:: Fully configurable model with a default

.. code-block:: python

# pip install langchain langchain-openai langchain-anthropic
@@ -233,7 +229,8 @@
)
# Claude-3.5 sonnet response with temperature 0.6

Bind tools to a configurable model:
.. dropdown:: Bind tools to a configurable model

You can call any ChatModel declarative methods on a configurable model in the
same way that you would with a normal model.

@@ -270,6 +267,18 @@ class GetPopulation(BaseModel):
config={"configurable": {"model": "claude-3-5-sonnet-20240620"}}
)
# Claude-3.5 sonnet response with tools

.. versionadded:: 0.2.7

.. versionchanged:: 0.2.8

Support for ``configurable_fields`` and ``config_prefix`` added.

.. versionchanged:: 0.2.12

Support for Ollama via langchain-ollama package added. Previously
langchain-community version of Ollama (now deprecated) was installed by default.

""" # noqa: E501
if not model and not configurable_fields:
configurable_fields = ("model", "model_provider")
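
For reference, a minimal sketch of the provider inference described in the docstring above, assuming `langchain-openai` and `langchain-anthropic` are installed and their API keys are set in the environment:

```python
from langchain.chat_models import init_chat_model

# "gpt-4o" matches the gpt-3.../gpt-4... prefixes, so model_provider="openai" is inferred.
gpt_4o = init_chat_model("gpt-4o", temperature=0)

# "claude-3-5-sonnet-20240620" matches the claude... prefix, inferring anthropic.
claude = init_chat_model("claude-3-5-sonnet-20240620", temperature=0)

# The explicit form works for any supported provider, recognized prefix or not.
gpt_4o_explicit = init_chat_model("gpt-4o", model_provider="openai", temperature=0)
```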
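
And a sketch of the `config_prefix` behavior from the "fully configurable" dropdown, with the `foo` prefix and the runtime override values taken as illustrative:

```python
from langchain.chat_models import init_chat_model

configurable_model_with_default = init_chat_model(
    "gpt-4o",
    model_provider="openai",
    configurable_fields="any",  # every field can be overridden at runtime
    config_prefix="foo",        # runtime keys become foo_model, foo_temperature, ...
    temperature=0,
)

# Default: GPT-4o at temperature 0.
configurable_model_with_default.invoke("what's your name")

# Override model, provider, and temperature through the prefixed config keys.
configurable_model_with_default.invoke(
    "what's your name",
    config={
        "configurable": {
            "foo_model": "claude-3-5-sonnet-20240620",
            "foo_model_provider": "anthropic",
            "foo_temperature": 0.6,
        }
    },
)
```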