
Commit

update version
uripeled2 committed Jul 19, 2023
1 parent 57f8207 commit 980f72e
Showing 3 changed files with 5 additions and 5 deletions.
README.md (6 changes: 3 additions & 3 deletions)

@@ -13,8 +13,7 @@ more details below in Usage section.
 
 ## Base Interface
 The package exposes two simple interfaces for communicating with LLMs (In the future, we
-will expand the interface to support more tasks like embeddings, list models, edits, etc.
-and we will add a standardized for LLMs param like max_tokens, temperature, etc.):
+will expand the interface to support more tasks like list models, edits, etc.):
 ```python
 from abc import ABC, abstractmethod
 from dataclasses import dataclass, field

@@ -47,7 +46,7 @@ class BaseLLMAPIClient(BaseLLMClient, ABC):
 
     @abstractmethod
     async def text_completion(self, prompt: str, model: Optional[str] = None, max_tokens: int | None = None,
-                              temperature: Optional[float] = None, **kwargs) -> list[str]:
+                              temperature: Optional[float] = None, top_p: Optional[float] = None, **kwargs) -> list[str]:
         raise NotImplementedError()
 
     async def embedding(self, text: str, model: Optional[str] = None, **kwargs) -> list[float]:

@@ -200,6 +199,7 @@ Contributions are welcome! Please check out the todos below, and feel free to op
 - [x] Convert common models parameter
   - [x] temperature
   - [x] max_tokens
+  - [x] top_p
   - [ ] more
 
 ### Development
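For readers skimming the interface above, here is a minimal, self-contained sketch that mirrors the text_completion signature from the README hunk, including the newly standardized top_p parameter, together with a toy subclass that satisfies it. TextCompletionInterfaceSketch and EchoLLMClient are hypothetical names written for this example only; they are not classes shipped by llm-client.

```python
import asyncio
from abc import ABC, abstractmethod
from typing import Optional


class TextCompletionInterfaceSketch(ABC):
    """Stand-alone stand-in that mirrors the signature shown above (not the real BaseLLMAPIClient)."""

    @abstractmethod
    async def text_completion(self, prompt: str, model: Optional[str] = None, max_tokens: Optional[int] = None,
                              temperature: Optional[float] = None, top_p: Optional[float] = None,
                              **kwargs) -> list[str]:
        raise NotImplementedError()


class EchoLLMClient(TextCompletionInterfaceSketch):
    """Hypothetical client that echoes the prompt, truncated to max_tokens whitespace-separated words."""

    async def text_completion(self, prompt: str, model: Optional[str] = None, max_tokens: Optional[int] = None,
                              temperature: Optional[float] = None, top_p: Optional[float] = None,
                              **kwargs) -> list[str]:
        # temperature and top_p are accepted here only to demonstrate the standardized
        # parameters; a real client would forward them to its provider's API call.
        words = prompt.split()
        if max_tokens is not None:
            words = words[:max_tokens]
        return [" ".join(words)]


if __name__ == "__main__":
    client = EchoLLMClient()
    result = asyncio.run(client.text_completion("a tiny usage example", max_tokens=2, temperature=0.7, top_p=0.95))
    print(result)  # ['a tiny']
```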
llm_client/__init__.py (2 changes: 1 addition & 1 deletion)

@@ -1,4 +1,4 @@
-__version__ = "0.6.2"
+__version__ = "0.7.0"
 
 from llm_client.base_llm_client import BaseLLMClient
 
llm_client/llm_api_client/base_llm_api_client.py (2 changes: 1 addition & 1 deletion)

@@ -30,7 +30,7 @@ def __init__(self, config: LLMAPIClientConfig):
 
     @abstractmethod
     async def text_completion(self, prompt: str, model: Optional[str] = None, max_tokens: Optional[int] = None,
-                              temperature: Optional[float] = None,top_p : Optional[float] = None, **kwargs) -> list[str]:
+                              temperature: Optional[float] = None, top_p: Optional[float] = None, **kwargs) -> list[str]:
         raise NotImplementedError()
 
     async def embedding(self, text: str, model: Optional[str] = None, **kwargs) -> list[float]:
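As a rough illustration of what the signature above implies for concrete clients, the helper below shows one way optional sampling parameters such as top_p could be folded into a provider request body. The function name, the payload key names, and the "skip None values" policy are assumptions made for this sketch, not llm-client's actual implementation.

```python
from typing import Any, Optional


def build_completion_payload(prompt: str, model: Optional[str] = None, max_tokens: Optional[int] = None,
                             temperature: Optional[float] = None, top_p: Optional[float] = None,
                             **kwargs: Any) -> dict[str, Any]:
    """Assemble a JSON body for a generic completion endpoint (key names are illustrative)."""
    payload: dict[str, Any] = {"prompt": prompt, **kwargs}
    optional_params = {"model": model, "max_tokens": max_tokens, "temperature": temperature, "top_p": top_p}
    # Include only the parameters the caller actually set, so the provider's defaults
    # apply whenever a value is left as None.
    payload.update({name: value for name, value in optional_params.items() if value is not None})
    return payload


print(build_completion_payload("Say hi", model="some-model", temperature=0.2, top_p=0.9))
# -> {'prompt': 'Say hi', 'model': 'some-model', 'temperature': 0.2, 'top_p': 0.9}
```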
