
Support full parameter set for Ollama integrations #1129

Closed
emso-c opened this issue Oct 10, 2024 · 2 comments · Fixed by #1131
Labels
feature request Ideas to improve an integration integration:ollama P2

Comments

emso-c (Contributor) commented Oct 10, 2024

Is your feature request related to a problem? Please describe.

Ollama integrations do not support the full capabilities of the underlying generation methods, namely OllamaGenerator.generate and OllamaChatGenerator.chat. I was trying to add the keep_alive option to generation using the OllamaGenerator, but when I inspected the code, I saw that the method only passes down the model, prompt, stream, and options parameters, even though the Ollama client can accept more:

@overload
def generate(
    self,
    model: str = '',
    prompt: str = '',
    suffix: str = '',
    system: str = '',
    template: str = '',
    context: Optional[Sequence[int]] = None,
    stream: Literal[True] = True,
    raw: bool = False,
    format: Literal['', 'json'] = '',
    images: Optional[Sequence[AnyStr]] = None,
    options: Optional[Options] = None,
    keep_alive: Optional[Union[float, str]] = None,
) -> Iterator[Mapping[str, Any]]: ...

Describe the solution you'd like
Add all the top-level parameters that the Ollama client accepts to the OllamaGenerator and OllamaChatGenerator constructors, respectively.
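To make the proposal concrete, here is a minimal, self-contained sketch of what exposing keep_alive as a first-class constructor parameter could look like. The class and the _build_generate_args helper are hypothetical illustrations, not the actual integration code; only the argument names (model, prompt, options, keep_alive) mirror the Ollama client signature shown above.

```python
from typing import Any, Dict, Optional, Union


class OllamaGenerator:
    """Hypothetical sketch: accept keep_alive in the constructor and
    forward it to the underlying client call."""

    def __init__(
        self,
        model: str = "orca-mini",
        keep_alive: Optional[Union[float, str]] = None,  # proposed addition
        generation_kwargs: Optional[Dict[str, Any]] = None,
    ):
        self.model = model
        self.keep_alive = keep_alive
        self.generation_kwargs = generation_kwargs or {}

    def _build_generate_args(self, prompt: str) -> Dict[str, Any]:
        # Only include keep_alive when it was set, so the client's own
        # default applies otherwise.
        args: Dict[str, Any] = {
            "model": self.model,
            "prompt": prompt,
            "options": self.generation_kwargs,
        }
        if self.keep_alive is not None:
            args["keep_alive"] = self.keep_alive
        return args


gen = OllamaGenerator(model="orca-mini", keep_alive="5m")
print(gen._build_generate_args("hi")["keep_alive"])  # → 5m
```

The same pattern would apply to any other top-level client parameter (suffix, system, template, etc.): store it on the component and forward it only when explicitly set.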

Describe alternatives you've considered
Using kwargs to pass the parameters down manually:

class OllamaGenerator:
    def __init__(
        self,
        ...,
        run_kwargs: Optional[Dict[str, Any]] = None,  # added line
    ):
        ...
        self.run_kwargs = run_kwargs or {}  # added line

    def run(self, ...):
        ...
        response = self._client.generate(
            model=self.model,
            prompt=prompt,
            stream=stream,
            options=generation_kwargs,
            **self.run_kwargs,  # added line
        )
        ...

# usage
generator = OllamaGenerator(
    run_kwargs={"keep_alive": "5m"},
)

@emso-c emso-c added the feature request Ideas to improve an integration label Oct 10, 2024
anakin87 (Member) commented:
Some of the available parameters are already supported.

For now, I would only add support for keep_alive, which seems important.

Feel free to create a PR if you want, otherwise we will take care of it in the future.

anakin87 (Member) commented:
Support for keep_alive was added in #1131.

Support for images will be added when we focus more on enabling multimodal use cases throughout the framework.
