
Added Top_p https://github.com/uripeled2/llm-client-sdk/issues/28 #29

Merged · 9 commits into uripeled2:main · Jul 14, 2023

Conversation

@EyalPazz (Contributor) commented Jul 1, 2023

No description provided.

@uripeled2 (Owner) commented:

You didn't update the BaseLLMAPIClient.text_completion function.
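For context, extending the base class means adding the same parameter to its abstract method. A minimal sketch, assuming a simplified signature (the real BaseLLMAPIClient has more parameters and members than shown here):

```python
from abc import ABC, abstractmethod
from typing import List, Optional


class BaseLLMAPIClient(ABC):
    """Simplified stand-in for the project's base client (illustration only)."""

    @abstractmethod
    async def text_completion(self, prompt: str, model: Optional[str] = None,
                              max_tokens: Optional[int] = None,
                              temperature: float = 0,
                              top_p: Optional[float] = None,
                              **kwargs) -> List[str]:
        """Subclasses must accept top_p and forward it to their provider's API."""
        raise NotImplementedError()
```

Keeping the parameter on the abstract method ensures every concrete client (OpenAI, AI21, Anthropic, etc.) exposes a consistent interface.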

Resolved review threads on:

  • llm_client/llm_api_client/ai21_client.py
  • llm_client/llm_api_client/aleph_alpha_client.py
  • llm_client/llm_api_client/anthropic_client.py
  • tests/resources/openai/chat_completion.json
  • tests/resources/openai/text_completion.json
@uripeled2 (Owner) commented:

@CodiumAI-Agent

@CodiumAI-Agent commented:

PR Analysis

  • 🎯 Main theme: Adding 'top_p' parameter to text completion methods
  • 🔍 Description and title: No, the PR provides no description
  • 📌 Type of PR: Enhancement
  • 🧪 Relevant tests added: Yes
  • Minimal and focused: Yes, the PR is minimal and focused as it only adds a new parameter to the text completion methods and updates the corresponding tests.
  • 🔒 Security concerns: No, the PR does not introduce any obvious security concerns. It only adds a new parameter to existing methods and does not involve any sensitive data or security-critical operations.

PR Feedback

  • 💡 General PR suggestions: The PR lacks a description which is important to understand the context and reason for the changes. It would be helpful to add a description explaining why the 'top_p' parameter was added and how it affects the text completion methods.

  • 🤖 Code suggestions:

    • relevant file: llm_client/llm_api_client/base_llm_api_client.py
      suggestion content: Consider adding a docstring to the 'text_completion' method to explain the new 'top_p' parameter. This will help other developers understand its purpose and how it should be used. [important]

    • relevant file: tests/llm_api_client/openai_client/test_openai.py
      suggestion content: It would be beneficial to add some tests that specifically check the behavior of the 'top_p' parameter. For instance, you could test how the method behaves with different 'top_p' values. [medium]

    • relevant file: llm_client/llm_api_client/ai21_client.py
      suggestion content: In the 'text_completion' method, the 'top_p' parameter is set to None by default. It would be better to set a default value that reflects the typical use case, or if None is a valid value, explain what it means in the method's docstring. [medium]

    • relevant file: llm_client/llm_api_client/anthropic_client.py
      suggestion content: In the 'text_completion' method, the 'top_p' parameter is set to -1 by default. It would be better to set a default value that reflects the typical use case, or if -1 is a valid value, explain what it means in the method's docstring. [medium]
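The two default-value suggestions come down to the same idea: send top_p to the provider only when the caller sets it, rather than passing a sentinel such as -1. A sketch under that assumption (ExampleClient and _build_payload are hypothetical names for illustration, not the repository's actual code):

```python
from typing import Any, Dict, Optional


class ExampleClient:
    """Hypothetical client illustrating the review suggestions above."""

    def _build_payload(self, prompt: str, top_p: Optional[float] = None) -> Dict[str, Any]:
        """Build the request body for a text completion call.

        :param top_p: nucleus-sampling cutoff in (0, 1]; the model samples from
            the smallest token set whose cumulative probability exceeds top_p.
            None omits the field so the provider's own default applies.
        """
        payload: Dict[str, Any] = {"prompt": prompt}
        if top_p is not None:  # never forward a sentinel like -1 to the API
            payload["top_p"] = top_p
        return payload
```

A test in the spirit of the second suggestion can then assert on the payload for different top_p values, e.g. that the field appears when top_p=0.9 is passed and is absent when it is left unset.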

How to use

Tag me in a comment '@CodiumAI-Agent' to ask for a new review after you update the PR.
You can also tag me and ask any question, for example '@CodiumAI-Agent is the PR ready for merge?'

@uripeled2 uripeled2 linked an issue Jul 7, 2023 that may be closed by this pull request
@uripeled2 uripeled2 assigned uripeled2 and unassigned uripeled2 Jul 7, 2023
@uripeled2 uripeled2 merged commit 57f8207 into uripeled2:main Jul 14, 2023
22 checks passed
Development

Successfully merging this pull request may close these issues.

Add p-top to common models parameter
3 participants