Docs: update for the new API release
raycastbot committed Aug 14, 2024
1 parent ad5e627 commit 3c138ed
Showing 4 changed files with 42 additions and 14 deletions.
2 changes: 1 addition & 1 deletion docs/.config.json
@@ -1,3 +1,3 @@
{
"version": "1.80.0"
"version": "1.81.0"
}
33 changes: 20 additions & 13 deletions docs/api-reference/ai.md
@@ -152,25 +152,32 @@ The AI model to use to answer the prompt. Defaults to `AI.Model["OpenAI_GPT3.5-turbo"]`.

#### Enumeration members

| Name | Description |
| :---------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| OpenAI_GPT3.5-turbo | GPT-3.5 Turbo is OpenAI’s fastest model, making it ideal for tasks that require quick response times with basic language processing capabilities. |
| OpenAI_GPT4 | GPT-4 is OpenAI’s most capable model with broad general knowledge, allowing it to follow complex instructions and solve difficult problems. |
| OpenAI_GPT4-turbo | GPT-4 Turbo from OpenAI has a big context window that fits hundreds of pages of text, making it a great choice for workloads that involve longer prompts. |
| OpenAI_GPT4o | GPT-4o is the most advanced and fastest model from OpenAI, making it a great choice for complex everyday problems and deeper conversations. |
| Anthropic_Claude_Haiku | Claude 3 Haiku is Anthropic's fastest model, with a large context window that makes it ideal for analyzing code, documents, or large amounts of text. |
| Anthropic_Claude_Sonnet | Claude 3.5 Sonnet from Anthropic has enhanced intelligence with increased speed. It excels at complex tasks like visual reasoning or workflow orchestrations. |
| Anthropic_Claude_Opus | Claude 3 Opus is Anthropic's most intelligent model, with best-in-market performance on highly complex tasks. It stands out for remarkable fluency. |
| Perplexity_Llama3_Sonar_Small | Perplexity's Llama 3 Sonar Small is built for speed. It quickly gives you helpful answers using the latest internet knowledge while minimizing hallucinations. |
| Perplexity_Llama3_Sonar_Large | Perplexity's most advanced model, Llama 3 Sonar Large, can handle complex questions. It considers current web knowledge to provide well-reasoned, in-depth answers. |
| Llama3_70B | Llama 3 70B from Meta is the most capable openly available LLM which can serve as a tool for various text-related tasks. Powered by Groq. |
| MixtraL_8x7B | Mixtral 8x7B from Mistral is an open-source model that demonstrates high performance in generating code and text at an impressive speed. Powered by Groq. |
| Name | Description |
| :---------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| OpenAI_GPT3.5-turbo | GPT-3.5 Turbo is OpenAI’s fastest model, making it ideal for tasks that require quick response times with basic language processing capabilities. |
| OpenAI_GPT4 | GPT-4 is OpenAI’s most capable model with broad general knowledge, allowing it to follow complex instructions and solve difficult problems. |
| OpenAI_GPT4-turbo | GPT-4 Turbo from OpenAI has a big context window that fits hundreds of pages of text, making it a great choice for workloads that involve longer prompts. |
| OpenAI_GPT4o | GPT-4o is the most advanced and fastest model from OpenAI, making it a great choice for complex everyday problems and deeper conversations. |
| OpenAI_GPT4o-mini | GPT-4o mini is a highly intelligent and fast model that is ideal for a variety of everyday tasks. |
| Anthropic_Claude_Haiku | Claude 3 Haiku is Anthropic's fastest model, with a large context window that makes it ideal for analyzing code, documents, or large amounts of text. |
| Anthropic_Claude_Sonnet | Claude 3.5 Sonnet from Anthropic has enhanced intelligence with increased speed. It excels at complex tasks like visual reasoning or workflow orchestrations. |
| Anthropic_Claude_Opus | Claude 3 Opus is Anthropic's most intelligent model, with best-in-market performance on highly complex tasks. It stands out for remarkable fluency. |
| Perplexity_Llama3_Sonar_Small | Perplexity's Llama 3 Sonar Small is built for speed. It quickly gives you helpful answers using the latest internet knowledge while minimizing hallucinations. |
| Perplexity_Llama3_Sonar_Large | Perplexity's most advanced model, Llama 3 Sonar Large, can handle complex questions. It considers current web knowledge to provide well-reasoned, in-depth answers. |
| Llama3.1_70B | Llama 3.1 70B is a versatile open-source model from Meta suitable for complex reasoning tasks, multilingual interactions, and extensive text analysis. Powered by Groq. |
| Llama3.1_8B | Llama 3.1 8B is an open-source model from Meta, optimized for instruction following and high-speed performance. Powered by Groq. |
| Llama3_70B | Llama 3 70B from Meta is a highly capable open-source LLM that can serve as a tool for various text-related tasks. Powered by Groq. |
| Llama3.1_405B | Llama 3.1 405B is Meta's flagship open-source model, offering unparalleled capabilities in general knowledge, steerability, math, tool use, and multilingual translation. |
| MixtraL_8x7B | Mixtral 8x7B from Mistral is an open-source model that demonstrates high performance in generating code and text at an impressive speed. Powered by Groq. |
| Mistral_Nemo | Mistral Nemo is a small model built in collaboration with NVIDIA, and released under the Apache 2.0 license. |
| Mistral_Large2 | Mistral Large is Mistral's flagship model, capable of code generation, mathematics, and reasoning, with stronger multilingual support. |

If a model isn't available to the user, Raycast will fall back to a similar one:

- `AI.Model.Anthropic_Claude_Opus` and `AI.Model.Anthropic_Claude_Sonnet` -> `AI.Model.Anthropic_Claude_Haiku`
- `AI.Model.OpenAI_GPT4` and `AI.Model["OpenAI_GPT4-turbo"]` -> `AI.Model["OpenAI_GPT3.5-turbo"]`
- `AI.Model.Perplexity_Llama3_Sonar_Large` -> `AI.Model.Perplexity_Llama3_Sonar_Small`
- `AI.Model.Mistral_Large2` -> `AI.Model.Mistral_Nemo`
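
A minimal sketch of choosing a model explicitly via `AI.ask` (the prompt, command shape, and toast copy are illustrative, not from these docs; if the model isn't available to the user, Raycast falls back as described in the list above):

```typescript
import { AI, showToast, Toast } from "@raycast/api";

// No-view command: ask the AI with an explicit model.
// If the user's plan doesn't include this model, Raycast falls back
// to a similar one as described above.
export default async function Command() {
  const answer = await AI.ask("Summarize Raycast's AI API in one sentence.", {
    model: AI.Model["OpenAI_GPT4o-mini"],
    creativity: "low",
  });

  await showToast({ style: Toast.Style.Success, title: "AI answer", message: answer });
}
```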

### AI.AskOptions

8 changes: 8 additions & 0 deletions docs/api-reference/user-interface/detail.md
@@ -48,6 +48,14 @@ You can specify custom image dimensions by adding a `raycast-width` and `raycast-height` query string.
You can also specify a tint color to apply to a markdown image by adding a `raycast-tint-color` query string. For example: `![Image Title](example.png?raycast-tint-color=blue)`
{% endhint %}

{% hint style="info" %}
You can now render [LaTeX](https://www.latex-project.org) in the markdown. We support the following delimiters:

- Inline math: `\(...\)` and `\begin{math}...\end{math}`
- Display math: `\[...\]`, `$$...$$` and `\begin{equation}...\end{equation}`

{% endhint %}
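
A minimal sketch of rendering LaTeX in a `Detail` view (the markdown content below is illustrative):

```typescript
import { Detail } from "@raycast/api";

// Illustrative markdown mixing inline and display math using the
// delimiters listed above. Backslashes are doubled because this is
// a JavaScript template literal.
const markdown = `
# Quadratic formula

For \\(ax^2 + bx + c = 0\\), the roots are:

$$x = \\frac{-b \\pm \\sqrt{b^2 - 4ac}}{2a}$$
`;

export default function Command() {
  return <Detail markdown={markdown} />;
}
```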

### Detail.Metadata

A Metadata view that will be shown on the right-hand side of the `Detail`.
13 changes: 13 additions & 0 deletions docs/changelog.md
@@ -1,5 +1,18 @@
# Changelog

## 1.81.0 - 2024-08-13

### ✨ New

- **Detail:** You can now render LaTeX in the Detail views. We support the following delimiters:
- Inline math: `\(...\)` and `\begin{math}...\end{math}`
- Display math: `\[...\]`, `$$...$$` and `\begin{equation}...\end{equation}`

### 💎 Improvements

- You can now pick a different command template for each command that you add in the `Create Extension` command’s form.
- Added a new `Add Command` action for local extensions in the `Manage Extensions` command.

## 1.80.0 - 2024-07-31

### ✨ New
