Commit e6fecc0: Docs update

ddebowczyk committed Aug 29, 2024 (parent: 1a5b931)
Showing 12 changed files with 198 additions and 22 deletions.
58 changes: 57 additions & 1 deletion README.md
@@ -62,7 +62,7 @@ Here's a simple CLI demo app using Instructor to extract structured data from te
### Other capabilities

- Developer friendly LLM context caching for reduced costs and faster inference (for Anthropic models)
- - Developer friendly image processing (for OpenAI, Anthropic and Gemini models)
+ - Developer friendly data extraction from images (for OpenAI, Anthropic and Gemini models)

### Documentation and examples

@@ -163,6 +163,62 @@ var_dump($person);
>

### Connecting to various LLM API providers

Instructor allows you to define multiple API connections in the `instructor.php` file.
This is useful when you want to use different LLMs or API providers in your application.

The default configuration file is located at `/config/instructor.php` in the root directory
of the Instructor codebase and contains a set of predefined connections to all LLM APIs
supported out of the box by Instructor.

The config file defines connections to LLM APIs and their parameters. It also specifies
the default connection to be used when Instructor is called without an explicit
client connection.

```php
/* This is a fragment of the /config/instructor.php file */
'defaultConnection' => 'openai',
//...
'connections' => [
    'anthropic' => [ ... ],
    'cohere' => [ ... ],
    'gemini' => [ ... ],
    'ollama' => [
        'clientType' => ClientType::Ollama->value,
        'apiUrl' => Env::get('OLLAMA_API_URL', 'http://localhost:11434/v1'),
        'apiKey' => Env::get('OLLAMA_API_KEY', ''),
        'defaultModel' => Env::get('OLLAMA_DEFAULT_MODEL', 'gemma2:2b'),
        'defaultMaxTokens' => Env::get('OLLAMA_DEFAULT_MAX_TOKENS', 1024),
        'connectTimeout' => Env::get('OLLAMA_CONNECT_TIMEOUT', 3),
        'requestTimeout' => Env::get('OLLAMA_REQUEST_TIMEOUT', 30),
    ],
    // ...
```
To customize the available connections, you can either modify existing entries or
add your own.
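
For example, adding a second connection could look like the sketch below. It reuses
the shape of the built-in `ollama` entry; the `ollama-remote` key, the `OLLAMA_REMOTE_*`
environment variables, and the model name are hypothetical placeholders, not part of
the default config.

```php
/* A hypothetical custom entry for /config/instructor.php; the key name, env vars
   and model below are placeholders mirroring the built-in 'ollama' entry. */
'ollama-remote' => [
    'clientType' => ClientType::Ollama->value,
    'apiUrl' => Env::get('OLLAMA_REMOTE_API_URL', 'http://192.168.1.20:11434/v1'),
    'apiKey' => Env::get('OLLAMA_REMOTE_API_KEY', ''),
    'defaultModel' => Env::get('OLLAMA_REMOTE_DEFAULT_MODEL', 'llama3.1:8b'),
    'defaultMaxTokens' => Env::get('OLLAMA_REMOTE_DEFAULT_MAX_TOKENS', 1024),
    'connectTimeout' => Env::get('OLLAMA_REMOTE_CONNECT_TIMEOUT', 3),
    'requestTimeout' => Env::get('OLLAMA_REMOTE_REQUEST_TIMEOUT', 30),
],
```

Once defined, the entry is available by name, e.g. `(new Instructor)->withClient('ollama-remote')`.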

Connecting to an LLM API via a predefined connection is as simple as calling the `withClient`
method with the connection name.

```php
<?php
// ...
$user = (new Instructor)
->withClient('ollama')
->respond(
messages: "His name is Jason and he is 28 years old.",
responseModel: Person::class,
);
// ...
```

You can change the location of the configuration file Instructor uses via the
`INSTRUCTOR_CONFIG_PATH` environment variable. A copy of the default configuration
file is a good starting point.
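
A minimal sketch, assuming the variable is read from the process environment when
Instructor boots (the path below is a placeholder):

```php
<?php
use Cognesy\Instructor\Instructor;

// Hypothetical path: point this at your own copy of instructor.php.
putenv('INSTRUCTOR_CONFIG_PATH=/path/to/your/config/instructor.php');

$user = (new Instructor)
    ->withClient('ollama')
    ->respond(
        messages: "His name is Jason and he is 28 years old.",
        responseModel: Person::class,
    );
?>
```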



### Structured-to-structured processing

Instructor offers a way to use structured data as an input. This is
3 changes: 2 additions & 1 deletion docs/advanced/modules.mdx
@@ -1,6 +1,7 @@
## Modular processing with LLMs

- > NOTE: This is a work in progress. The documentation is not complete yet.
+ > NOTE: This is a work in progress. The API is changing; do not use it for high-stakes work yet.
+ > The documentation is not complete yet.
Modules are a way to encapsulate structured processing logic and data flows. They are inspired by DSPy and TensorFlow modules.

2 changes: 2 additions & 0 deletions docs/cookbook/examples/advanced/caching.mdx
@@ -59,6 +59,8 @@ $user2 = $instructor->request(
responseModel: User::class,
)->withRequestCache()->get();

eval(\Psy\sh());

$delta = Profiler::mark('cache 1st call')->mili();
echo "Time elapsed (cache on, 1st call): $delta msec\n\n";

2 changes: 1 addition & 1 deletion docs/cookbook/examples/api_support/fireworks.mdx
@@ -42,7 +42,7 @@ class User {
}

// FireworksAI instance params
- $yourApiKey = Env::get('FIREWORKSAI_API_KEY'); // set your own API key
+ $yourApiKey = Env::get('FIREWORKS_API_KEY'); // set your own API key

// Create instance of client initialized with custom parameters
$client = new FireworksAIClient(
2 changes: 1 addition & 1 deletion docs/cookbook/examples/api_support/togetherai.mdx
@@ -47,7 +47,7 @@ class User {

// Create instance of client initialized with custom parameters
$client = new TogetherAIClient(
- apiKey: Env::get('TOGETHERAI_API_KEY'),
+ apiKey: Env::get('TOGETHER_API_KEY'),
);

// Get Instructor with the default client component overridden with your own
54 changes: 54 additions & 0 deletions docs/cookbook/examples/basics/using_config.mdx
@@ -0,0 +1,54 @@
---
title: 'Using LLM API connections from config file'
docname: 'using_config'
---

## Overview

Instructor allows you to define multiple API connections in the `instructor.php` file.
This is useful when you want to use different LLMs or API providers in your application.

Connecting to an LLM API via a predefined connection is as simple as calling the `withClient`
method with the connection name.

### Configuration file

The default configuration file is located at `/config/instructor.php` in the root directory
of the Instructor codebase.

You can set the location of the configuration file via the `INSTRUCTOR_CONFIG_PATH`
environment variable. A copy of the default configuration file is a good starting point.

The config file defines connections to LLM APIs and their parameters. It also specifies
the default connection to be used when Instructor is called without an explicit
client connection.

## Example

```php
<?php
$loader = require 'vendor/autoload.php';
$loader->add('Cognesy\\Instructor\\', __DIR__ . '/../../src/');

use Cognesy\Instructor\Instructor;

class User {
public int $age;
public string $name;
}

// Get Instructor object with the client defined in instructor.php under the 'connections/openai' key
$instructor = (new Instructor)->withClient('openai');

// Extract structured data using the configured connection
$user = $instructor->respond(
messages: "Our user Jason is 25 years old.",
responseModel: User::class,
);

// Use the results of LLM inference
dump($user);
assert(isset($user->name));
assert(isset($user->age));
?>
```
8 changes: 6 additions & 2 deletions docs/essentials/usage.mdx
@@ -42,7 +42,8 @@ var_dump($person);
// Person {
// name: "Jason",
// age: 28
// }
?>
```

!!! note
@@ -56,12 +57,14 @@ var_dump($person);
You can provide a string instead of an array of messages. This is useful when you want to extract data from a single block of text and want to keep your code simple.

```php
<?php
use Cognesy\Instructor\Instructor;

$value = (new Instructor)->respond(
messages: "His name is Jason, he is 28 years old.",
responseModel: Person::class,
);
?>
```


@@ -71,6 +74,7 @@ You can call the `request()` method to initiate Instructor with request data
and then call `get()` to get the response.

```php
<?php
use Cognesy\Instructor\Instructor;

$instructor = (new Instructor)->request(
@@ -79,6 +83,7 @@ $instructor = (new Instructor)->request(
);

$person = $instructor->get();
?>
```


@@ -161,4 +166,3 @@ See [Streaming and partial updates](partials.md) for more information on how to
## Extracting arguments for function call

See [FunctionCall helper class](function_calls.md) for more information on how to extract arguments for callable objects.
- ```
30 changes: 16 additions & 14 deletions docs/mint.json
@@ -45,23 +45,23 @@
"url": "https://cognesy.com/blog"
}
],
"navigation": [
{
"navigation": {
"0": {
"group": "Get Started",
"pages": [
"introduction",
"quickstart",
"llm_providers"
]
},
- {
+ "1": {
"group": "Concepts",
"pages": [
"concepts/overview",
"concepts/why"
]
},
- {
+ "2": {
"group": "Essentials",
"pages": [
"essentials/installation",
@@ -74,18 +74,19 @@
"essentials/demonstrations"
]
},
- {
+ "4": {
"group": "Practical techniques",
"pages": [
"techniques/prompting",
"techniques/classification",
"techniques/search"
]
},
- {
+ "5": {
"group": "Internals",
"pages": [
"internals/instructor",
"internals/configuration",
"internals/lifecycle",
"internals/response_models",
"internals/debugging",
@@ -94,22 +95,22 @@
"internals/configuration"
]
},
- {
+ "6": {
"group": "More",
"pages": [
"misc/philosophy",
"misc/contributing",
"misc/help"
]
},
- {
+ "7": {
"group": "Instructor Hub",
"pages": [
"cookbook/introduction",
"cookbook/contributing"
]
},
- {
+ "8": {
"group": "Basics",
"pages": [
"cookbook/examples/basics/basic_use",
@@ -122,11 +123,12 @@
"cookbook/examples/basics/public_vs_private",
"cookbook/examples/basics/self_correction",
"cookbook/examples/basics/attributes",
"cookbook/examples/basics/using_config",
"cookbook/examples/basics/validation",
"cookbook/examples/basics/validation_multifield"
]
},
- {
+ "9": {
"group": "Advanced",
"pages": [
"cookbook/examples/advanced/caching",
@@ -144,7 +146,7 @@
"cookbook/examples/advanced/structures"
]
},
- {
+ "10": {
"group": "Techniques",
"pages": [
"cookbook/examples/techniques/arbitrary_properties",
@@ -168,7 +170,7 @@
"cookbook/examples/techniques/translate_ui_fields"
]
},
- {
+ "11": {
"group": "Troubleshooting",
"pages": [
"cookbook/examples/troubleshooting/debugging",
@@ -178,7 +180,7 @@
"cookbook/examples/troubleshooting/wiretap"
]
},
- {
+ "12": {
"group": "LLM API Support",
"pages": [
"cookbook/examples/api_support/anthropic",
@@ -194,7 +196,7 @@
"cookbook/examples/api_support/togetherai"
]
}
- ],
+ },
"footerSocials": {
"x": "https://x.com/ddebowczyk",
"github": "https://github.com/cognesy/instructor-php",
54 changes: 54 additions & 0 deletions examples/01_Basics/UsingConfig/run.php
@@ -0,0 +1,54 @@
---
title: 'Using LLM API connections from config file'
docname: 'using_config'
---

## Overview

Instructor allows you to define multiple API connections in the `instructor.php` file.
This is useful when you want to use different LLMs or API providers in your application.

Connecting to an LLM API via a predefined connection is as simple as calling the `withClient`
method with the connection name.

### Configuration file

The default configuration file is located at `/config/instructor.php` in the root directory
of the Instructor codebase.

You can set the location of the configuration file via the `INSTRUCTOR_CONFIG_PATH`
environment variable. A copy of the default configuration file is a good starting point.

The config file defines connections to LLM APIs and their parameters. It also specifies
the default connection to be used when Instructor is called without an explicit
client connection.

## Example

```php
<?php
$loader = require 'vendor/autoload.php';
$loader->add('Cognesy\\Instructor\\', __DIR__ . '/../../src/');

use Cognesy\Instructor\Instructor;

class User {
public int $age;
public string $name;
}

// Get Instructor object with the client defined in instructor.php under the 'connections/openai' key
$instructor = (new Instructor)->withClient('openai');

// Extract structured data using the configured connection
$user = $instructor->respond(
messages: "Our user Jason is 25 years old.",
responseModel: User::class,
);

// Use the results of LLM inference
dump($user);
assert(isset($user->name));
assert(isset($user->age));
?>
```
1 change: 1 addition & 0 deletions notes/NOTES.md
@@ -20,6 +20,7 @@ Catch up with the latest additions.
- Multiple tools with tool selection
- Parallel tool calls
- Generate unstructured, then format to structured - to improve reasoning
- Evals!!!

### API Client
