In models.ts I changed the anthropic and openai providers' baseURL as shown below:
```ts
export function getModelClient(model: LLMModel, config: LLMModelConfig) {
  const { id: modelNameString, providerId } = model
  const { apiKey, baseURL } = config

  const providerConfigs = {
    anthropic: () =>
      createOpenAI({
        apiKey: apiKey || process.env.ANTHROPIC_API_KEY,
        baseURL: 'https://api.xhub.chat/v1',
      })(modelNameString),
    openai: () =>
      createOpenAI({
        apiKey: apiKey || process.env.OPENAI_API_KEY,
        baseURL: 'https://api.xhub.chat/v1',
      })(modelNameString),
    // ...
```
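Note that `baseURL` is destructured from `config` but then shadowed by the hardcoded string. As a minimal sketch (not the repo's actual code; the helper name is hypothetical), the override could stay configurable instead, assuming the `createOpenAI` factory from `@ai-sdk/openai`:

```ts
import { createOpenAI } from '@ai-sdk/openai'

// Hypothetical helper: prefer a baseURL supplied through the model config
// and fall back to the proxy endpoint, rather than pinning it in models.ts.
function anthropicViaProxy(modelNameString: string, apiKey?: string, baseURL?: string) {
  return createOpenAI({
    apiKey: apiKey || process.env.ANTHROPIC_API_KEY,
    baseURL: baseURL || 'https://api.xhub.chat/v1',
  })(modelNameString)
}
```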
The application runs and the LLM response is visible, but it never returns the result of running the code, and the preview doesn't load. The log shows:
```
model {
  id: 'claude-3-5-sonnet-20240620',
  provider: 'Anthropic',
  providerId: 'anthropic',
  name: 'Claude 3.5 Sonnet',
  multiModal: true
}
config { model: 'claude-3-5-sonnet-20240620' }
POST /api/chat 200 in 31541ms
```
I don't have a default Anthropic API key, so what can I do to solve this problem?
E2B-661: custom Anthropic baseURL can't return the output result
Have you set your E2B_API_KEY as an environment variable?
Yes, E2B_API_KEY has already been set.
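If the key is set but the execution result still never arrives, a standalone check can isolate the sandbox step. This is a minimal sketch assuming the `@e2b/code-interpreter` SDK (the script and sample code are illustrative):

```ts
import { Sandbox } from '@e2b/code-interpreter'

// Run outside the app: if this fails or hangs, the problem is the E2B
// sandbox call itself, not the custom LLM baseURL.
async function main() {
  const sandbox = await Sandbox.create({ apiKey: process.env.E2B_API_KEY })
  const execution = await sandbox.runCode('print("hello from the sandbox")')
  console.log(execution.logs)
  await sandbox.kill()
}

main().catch(console.error)
```

If this prints the logs, the E2B side is working, and the failure is more likely in the response coming back through the custom baseURL (for example, the proxy not returning the structured output the app expects).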