docs: make dynamic clients discoverable #837

Open · wants to merge 8 commits into base: canary
4 changes: 1 addition & 3 deletions .gitignore
@@ -100,14 +100,12 @@ $RECYCLE.BIN/
.stylelintcache
.temp
.tern-port
.turbo!*
.turbo
.venv
.venv/# Created by pytest automatically.
.vercel
.vscode-test
.vscode-test/
.vscode/
.vscode/*
.vuepress/dist
.yarn-integrity
.yarn/*
20 changes: 14 additions & 6 deletions docs/docs.yml
@@ -10,6 +10,10 @@ instances:

title: BAML Documentation

redirects:
- source: /docs/snippets/clients/providers/anthropic
destination: docs/snippets/clients/providers/overview#anthropic

navigation:
- section: Get started
contents:
@@ -18,6 +22,7 @@ navigation:
- page: Interactive Demos
path: docs/get-started/interactive-demos.mdx
- section: Quickstart
collapsed: false
contents:
- page: Python
path: docs/get-started/quickstart/python.mdx
@@ -30,6 +35,7 @@ navigation:
- page: Other Editors
path: docs/get-started/quickstart/editors-other.mdx
- section: Debugging
collapsed: false
contents:
- page: VSCode Playground
path: docs/get-started/debugging/vscode-playground.mdx
@@ -61,6 +67,8 @@ navigation:
contents:
- page: Overview
path: docs/snippets/clients/overview.mdx
- page: Accordion
path: docs/snippets/clients/accordion.mdx
- section: providers
contents:
- page: anthropic
@@ -71,22 +79,22 @@ navigation:
path: docs/snippets/clients/providers/azure.mdx
- page: google-ai
path: docs/snippets/clients/providers/gemini.mdx
- page: groq
path: docs/snippets/clients/providers/groq.mdx
- page: huggingface
path: docs/snippets/clients/providers/huggingface.mdx
- page: ollama
path: docs/snippets/clients/providers/ollama.mdx
- page: openai
path: docs/snippets/clients/providers/openai.mdx
- page: vertex-ai
path: docs/snippets/clients/providers/vertex.mdx
- page: openrouter
path: docs/snippets/clients/providers/openrouter.mdx
- page: together-ai
path: docs/snippets/clients/providers/together.mdx
- page: groq
path: docs/snippets/clients/providers/groq.mdx
- page: vertex-ai
path: docs/snippets/clients/providers/vertex.mdx
- page: vllm
path: docs/snippets/clients/providers/vllm.mdx
- page: huggingface
path: docs/snippets/clients/providers/huggingface.mdx
- section: provider strategies
contents:
- page: fallback
2 changes: 1 addition & 1 deletion docs/docs/get-started/quickstart/python.mdx
@@ -43,7 +43,7 @@ To set up BAML in python do the following:
you must re-run this command, and regenerate the `baml_client` folder.

<Tip>
If you download our [VSCode extension](https://marketplace.visualstudio.com/items?itemName=Boundary.baml-extension), it will automatically generate `baml_client` on save!
Our [VSCode extension](https://marketplace.visualstudio.com/items?itemName=Boundary.baml-extension) automatically runs this command when you save a BAML file.
</Tip>

```bash
2 changes: 1 addition & 1 deletion docs/docs/get-started/quickstart/ruby.mdx
@@ -35,7 +35,7 @@ To set up BAML in ruby do the following:
you must re-run this command, and regenerate the `baml_client` folder.

<Tip>
If you download our [VSCode extension](https://marketplace.visualstudio.com/items?itemName=Boundary.baml-extension), it will automatically generate `baml_client` on save!
Our [VSCode extension](https://marketplace.visualstudio.com/items?itemName=Boundary.baml-extension) automatically runs this command when you save a BAML file.
</Tip>

```bash
2 changes: 1 addition & 1 deletion docs/docs/get-started/quickstart/typescript.mdx
@@ -52,7 +52,7 @@ To set up BAML in typescript do the following:
you must re-run this command, and regenerate the `baml_client` folder.

<Tip>
If you download our [VSCode extension](https://marketplace.visualstudio.com/items?itemName=Boundary.baml-extension), it will automatically generate `baml_client` on save!
Our [VSCode extension](https://marketplace.visualstudio.com/items?itemName=Boundary.baml-extension) automatically runs this command when you save a BAML file.
</Tip>

```json package.json
12 changes: 8 additions & 4 deletions docs/docs/snippets/client-constructor.mdx
@@ -5,13 +5,17 @@ The configuration modifies the URL request BAML runtime makes.

| Provider Name | Docs | Notes |
| -------------- | -------------------------------- | ---------------------------------------------------------- |
| `openai` | [OpenAI](/docs/snippets/clients/providers/openai) | Anything that follows openai's API exactly |
| `ollama` | [Ollama](/docs/snippets/clients/providers/ollama) | Alias for an openai client but with default ollama options |
| `azure-openai` | [Azure OpenAI](/docs/snippets/clients/providers/azure) | |
| `anthropic` | [Anthropic](/docs/snippets/clients/providers/anthropic) | |
| `aws-bedrock` | [AWS Bedrock](/docs/snippets/clients/providers/aws-bedrock) | |
| `azure-openai` | [Azure OpenAI](/docs/snippets/clients/providers/azure) | |
| `google-ai` | [Google AI](/docs/snippets/clients/providers/gemini) | |
| `ollama` | [Ollama](/docs/snippets/clients/providers/ollama) | Alias for an OpenAI client but with default ollama options |
| `openai` | [OpenAI](/docs/snippets/clients/providers/openai) | Anything that follows OpenAI's API exactly |
| `vertex-ai` | [Vertex AI](/docs/snippets/clients/providers/vertex) | |
| `aws-bedrock` | [AWS Bedrock](/docs/snippets/clients/providers/aws-bedrock) | |

We also have some special providers that allow composing clients together:
| Provider Name | Docs | Notes |
| -------------- | -------------------------------- | ---------------------------------------------------------- |
| `fallback` | [Fallback](/docs/snippets/clients/fallback) | Used to chain models conditional on failures |
| `round-robin` | [Round Robin](/docs/snippets/clients/round-robin) | Used to load balance |
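
A minimal sketch of composing clients with these strategies, assuming two previously defined clients named `ClientA` and `ClientB` (hypothetical names):

```rust BAML
client<llm> MyFallback {
  provider fallback
  options {
    // Try ClientA first; retry with ClientB if the request fails
    strategy [ClientA, ClientB]
  }
}

client<llm> MyRoundRobin {
  provider round-robin
  options {
    // Alternate requests between ClientA and ClientB
    strategy [ClientA, ClientB]
  }
}
```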

46 changes: 46 additions & 0 deletions docs/docs/snippets/clients/accordion.mdx
@@ -0,0 +1,46 @@
---
slug: docs/snippets/clients/accordion
---

This is a placeholder page: do not link to it, and do not use it. I'm using it to try out AccordionGroup for our providers.

## Providers

<AccordionGroup>
<Accordion title="Anthropic">
<Markdown src="providers/anthropic.mdx" />
</Accordion>
<Accordion title="aws-bedrock">
<Markdown src="providers/aws-bedrock.mdx" />
</Accordion>
<Accordion title="azure-openai">
<Markdown src="providers/azure.mdx" />
</Accordion>
<Accordion title="google-ai">
<Markdown src="providers/gemini.mdx" />
</Accordion>
<Accordion title="groq">
<Markdown src="providers/groq.mdx" />
</Accordion>
<Accordion title="huggingface">
<Markdown src="providers/huggingface.mdx" />
</Accordion>
<Accordion title="ollama">
<Markdown src="providers/ollama.mdx" />
</Accordion>
<Accordion title="openai">
<Markdown src="providers/openai.mdx" />
</Accordion>
<Accordion title="openrouter">
<Markdown src="providers/openrouter.mdx" />
</Accordion>
<Accordion title="together-ai">
<Markdown src="providers/together.mdx" />
</Accordion>
<Accordion title="vertex-ai">
<Markdown src="providers/vertex.mdx" />
</Accordion>
<Accordion title="vllm">
<Markdown src="providers/vllm.mdx" />
</Accordion>
</AccordionGroup>
40 changes: 30 additions & 10 deletions docs/docs/snippets/clients/overview.mdx
@@ -2,23 +2,28 @@
slug: docs/snippets/clients/overview
---

Clients are used to configure how LLMs are called.
Clients are used to configure how LLMs are called, like so:

Here's an example of a client configuration:
```rust BAML
function MakeHaiku(topic: string) -> string {
client "openai/gpt-4o"
prompt #"
Write a haiku about {{ topic }}.
"#
}
```

This is `<provider>/<model>` shorthand for:

```baml BAML
```rust BAML
client<llm> MyClient {
provider openai
provider "openai"
options {
model gpt-4o // Configure which model is used
temperature 0.7 // Pass additional options to the model
model "gpt-4o"
// api_key defaults to env.OPENAI_API_KEY
}
}
```

Usage:

```rust BAML
function MakeHaiku(topic: string) -> string {
client MyClient
prompt #"
@@ -27,6 +32,21 @@ function MakeHaiku(topic: string) -> string {
}
```

Consult the [provider documentation](#fields) for a list of supported providers
and models, and the default options.

If you want to override options like `api_key` to use a different environment
variable, or you want to point `base_url` to a different endpoint, you should use
the latter form.
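
For example, a sketch of the latter form that reads the key from a non-default variable and targets an OpenAI-compatible proxy (`OPENAI_PROXY_KEY` and the URL are hypothetical):

```rust BAML
client<llm> MyProxiedClient {
  provider openai
  options {
    model "gpt-4o"
    // Hypothetical environment variable; otherwise defaults to env.OPENAI_API_KEY
    api_key env.OPENAI_PROXY_KEY
    // Hypothetical OpenAI-compatible endpoint
    base_url "https://llm-proxy.example.com/v1"
  }
}
```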

<Tip>
If you want to specify which client to use at runtime, in your Python/TS/Ruby code,
you can use the [client registry](/docs/calling-baml/client-registry) to do so.

This can come in handy if you're trying to, say, send 10% of your requests to a
different model.
</Tip>

## Fields

<Markdown src="../client-constructor.mdx" />
2 changes: 1 addition & 1 deletion docs/docs/snippets/clients/providers/huggingface.mdx
@@ -6,7 +6,7 @@ See https://huggingface.co/docs/inference-endpoints/index for more information o

```baml BAML
client<llm> MyClient {
provider openai
provider openai-generic
options {
base_url "https://api-inference.huggingface.co/v1"
api_key env.HUGGINGFACE_API_KEY
1 change: 0 additions & 1 deletion docs/docs/snippets/clients/providers/ollama.mdx
@@ -3,7 +3,6 @@ title: ollama
slug: docs/snippets/clients/providers/ollama
---


For `ollama`, we provide a client that can be used to interact with [ollama](https://ollama.com/) `/chat/completions` endpoint.

<Info>What is ollama? Ollama is an easy way to run LLMs locally!</Info>
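
A minimal sketch, assuming a local Ollama server on its default port with a pulled `llama3` model:

```baml BAML
client<llm> MyOllamaClient {
  provider ollama
  options {
    // Default local Ollama endpoint (assumed)
    base_url "http://localhost:11434/v1"
    model "llama3"
  }
}
```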
9 changes: 6 additions & 3 deletions docs/docs/snippets/clients/providers/together.mdx
@@ -1,14 +1,17 @@

---
title: together
slug: docs/snippets/clients/providers/together
---

https://www.together.ai/ - The fastest cloud platform for building and running generative AI.

Together AI supports the OpenAI client, allowing you to use the [openai](/docs/snippets/clients/providers/openai) provider with an overriden `base_url`
Together AI supports the OpenAI client, allowing you to use the [openai-generic](/docs/snippets/clients/providers/openai-generic) provider with an overridden `base_url`.

See https://docs.together.ai/docs/openai-api-compatibility for more information.

```baml BAML
client<llm> MyClient {
provider openai
provider openai-generi
options {
Review comment: Spelling: typo in provider name `openai-generi`.

Suggested change:
provider openai-generi
provider openai-generic

base_url "https://api.together.ai/v1"
api_key env.TOGETHER_API_KEY
2 changes: 1 addition & 1 deletion docs/docs/snippets/clients/providers/vllm.mdx
@@ -7,7 +7,7 @@ See https://docs.vllm.ai/en/latest/serving/openai_compatible_server.html for mor

```baml BAML
client<llm> MyClient {
provider openai
provider openai-generic
options {
base_url "http://localhost:8000/v1"
api_key "token-abc123"
Expand Down
5 changes: 2 additions & 3 deletions engine/Cargo.lock

Some generated files are not rendered by default.

1 change: 1 addition & 0 deletions engine/Cargo.toml
@@ -57,6 +57,7 @@ baml-types = { path = "baml-lib/baml-types" }
internal-baml-codegen = { path = "language_client_codegen" }
internal-baml-core = { path = "baml-lib/baml-core" }
internal-baml-jinja = { path = "baml-lib/jinja" }
internal-baml-schema-ast = { path = "baml-lib/schema-ast" }

[workspace.package]
version = "0.52.1"
8 changes: 4 additions & 4 deletions engine/baml-fmt/Cargo.toml
@@ -8,13 +8,13 @@ description.workspace = true
license-file.workspace = true

[dependencies]
colored = "2"
baml-lib = { path = "../baml-lib/baml" }
internal-baml-jinja = { path = "../baml-lib/jinja" }
anyhow.workspace = true

indoc.workspace = true
internal-baml-core.workspace = true
internal-baml-schema-ast.workspace = true
serde_json.workspace = true
serde.workspace = true
indoc.workspace = true
lsp-types = "0.91.1"
log = "0.4.14"
enumflags2 = "0.7"
16 changes: 0 additions & 16 deletions engine/baml-fmt/build.rs

This file was deleted.
