openai: embeddings: supported chunk_size when check_embedding_ctx_length is disabled #23767
Merged
Conversation
dosubot bot added the size:S (This PR changes 10-29 lines, ignoring generated files), Ɑ: embeddings, 🔌: openai, and 🤖:improvement labels on Jul 2, 2024
adubovik force-pushed the master branch 2 times, most recently from 215c34e to 03f6842 on July 2, 2024 14:36
@efriis is there something I could do to improve the PR? Thanks
reassigning to @baskaryan who recently refactored this!
Hi, @baskaryan! Do you have any suggestions?
dosubot bot added the size:M label and removed the size:S label on Sep 20, 2024
efriis approved these changes on Sep 20, 2024
dosubot bot added the lgtm label on Sep 20, 2024
sfc-gh-nmoiseyev pushed a commit to sfc-gh-nmoiseyev/langchain that referenced this pull request on Sep 21, 2024:
…gth is disabled (langchain-ai#23767): Chunking of the input array, controlled by `self.chunk_size`, is being ignored when `self.check_embedding_ctx_length` is disabled. Effectively, the chunk size is assumed to be 1 in such a case, which is surprising. The PR takes the user-supplied `self.chunk_size` into account. Co-authored-by: Erick Friis <[email protected]>
Labels
Ɑ: embeddings (related to text embedding models module)
🤖:improvement (medium size change to existing code to handle new use-cases)
lgtm (PR looks good; use to confirm that a PR is ready for merging)
🔌: openai (primarily related to OpenAI integrations)
partner
size:M (this PR changes 30-99 lines, ignoring generated files)
Chunking of the input array, controlled by `self.chunk_size`, is being ignored when `self.check_embedding_ctx_length` is disabled. Effectively, the chunk size is assumed to be 1 in such a case, which is surprising. The PR takes the user-supplied `self.chunk_size` into account.
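To illustrate the intended behavior, here is a minimal sketch (not the actual langchain implementation) of batching inputs by `chunk_size` instead of sending them one at a time; the helper name `embed_in_chunks` is hypothetical:

```python
from typing import List


def embed_in_chunks(texts: List[str], chunk_size: int) -> List[List[str]]:
    """Group inputs into batches of at most `chunk_size` items.

    Before the fix, the effective batch size was 1 whenever
    check_embedding_ctx_length was disabled; after the fix, the
    user-supplied chunk_size controls batching, as sketched here.
    """
    return [texts[i : i + chunk_size] for i in range(0, len(texts), chunk_size)]


batches = embed_in_chunks(["a", "b", "c", "d", "e"], chunk_size=2)
print(batches)  # [['a', 'b'], ['c', 'd'], ['e']]
```

Each inner list would correspond to one request to the embeddings endpoint, so honoring `chunk_size` reduces the number of API round trips.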