
Update to the latest google-cloud-aiplatform dependency and properly handle function calls in gemini #16793

Draft
wants to merge 13 commits into base: main

Conversation

@stfines-clgx commented Nov 2, 2024

Description

This fixes the usage of function and non-function tools with Google's Gemini models. The prior implementation did not handle the new ModelRole of `tool` when passing tool requests to the agent, which caused the Gemini message to fail.
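As a rough illustration of the kind of role mapping involved (a hedged sketch, not the PR's actual code; the exact role names the integration uses are an assumption here): llama-index chat messages carry a `tool` role, while Gemini content accepts only a small set of roles, so tool messages must be remapped before being sent to the model.

```python
# Hypothetical sketch of remapping llama-index message roles onto roles
# a Gemini Content object accepts. The "function" target for tool
# results is an assumption; depending on the API version, tool output
# may instead be sent as a function_response part on a "user" turn.
ROLE_MAP = {
    "user": "user",
    "assistant": "model",
    "system": "user",      # Gemini has no separate system role in content
    "tool": "function",    # the role the prior implementation dropped
}

def to_gemini_role(message_role: str) -> str:
    """Map a llama-index message role string to a Gemini content role."""
    try:
        return ROLE_MAP[message_role]
    except KeyError:
        raise ValueError(f"Unsupported message role: {message_role!r}")
```

The key point is that `tool` must map to *something* Gemini recognizes; previously it fell through unmapped and the request failed.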

Additionally, the non-public imports and beta imports that have been superseded by release versions have been corrected so that the integration now uses publicly supported APIs where they are available.

It also works around the issues google-cloud-aiplatform has with serialization of JSON Schema.
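One common shape of that workaround (a hedged sketch under assumptions, not this PR's exact code): pydantic's `model_json_schema()` emits keys such as `title`, `default`, and `additionalProperties` that the protobuf-backed tool schema in google-cloud-aiplatform cannot represent, so the schema is recursively cleaned before being handed to the client. The key set below is illustrative.

```python
# Hypothetical helper: recursively drop JSON Schema keys that the
# protobuf Schema used for Gemini tool declarations rejects.
# The exact set of offending keys is an assumption for illustration.
UNSUPPORTED_KEYS = {"title", "default", "$defs", "additionalProperties"}

def clean_schema(node):
    """Return a copy of a JSON Schema with unsupported keys removed."""
    if isinstance(node, dict):
        return {
            key: clean_schema(value)
            for key, value in node.items()
            if key not in UNSUPPORTED_KEYS
        }
    if isinstance(node, list):
        return [clean_schema(item) for item in node]
    return node
```

Cleaning at the boundary keeps the pydantic models untouched while still producing something protobuf can swallow.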

Fixes #16678 #16625

New Package?

Did I fill in the tool.llamahub section in the pyproject.toml and provide a detailed README.md for my new integration or package?

  • Yes
  • No

Version Bump?

Did I bump the version in the pyproject.toml file of the package I am updating? (Except for the llama-index-core package)

  • Yes
  • No

Type of Change

Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

How Has This Been Tested?

Your pull-request will likely not be merged unless it is covered by some form of impactful unit testing.

  • I added new unit tests to cover this change
  • I believe this change is already covered by existing unit tests

Suggested Checklist:

  • I have performed a self-review of my own code
  • [-] I have commented my code, particularly in hard-to-understand areas
  • [-] I have made corresponding changes to the documentation
  • [-] I have added Google Colab support for the newly added notebooks.
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • I ran make format; make lint to appease the lint gods

stfines-clgx and others added 3 commits November 1, 2024 16:51
Google updated the cloud-aiplatform dependencies,
so in a probably futile attempt to keep them
somewhat current I have updated them.
@logan-markewich
Collaborator

(Appreciate this @stfines-clgx -- I don't have access to test this 🙏🏻 )

@@ -495,7 +495,7 @@ def get_tool_calls_from_response(

         tool_selections = []
         for tool_call in tool_calls:
-            response_dict = MessageToDict(tool_call._pb)
+            response_dict = MessageToDict(tool_call._raw_message._pb)
Collaborator

no way to do this without accessing private attributes hey? 😅 Oh google

Author

Well, in another branch, I did it by just creating an entirely new object by hand. I kept it this way to minimize the change surface area. I can move over to the other style if it is an issue, but yeah, their MessageToDict function doesn't work without private access. Kind of silly; but then again, I'm not entirely sure that Google makes APIs for use outside of Google.
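The by-hand alternative mentioned above might look something like this (a hedged sketch: the helper and the stand-in class are hypothetical, and the field names merely follow the shape of a Gemini function call rather than the branch's actual code):

```python
# Hypothetical alternative to MessageToDict(fc._raw_message._pb):
# build the response dict directly from the object's public attributes.
def function_call_to_dict(fc) -> dict:
    """Construct the tool-call dict by hand, avoiding private protos."""
    return {"name": fc.name, "args": dict(fc.args)}

# Illustrative stand-in for a Gemini FunctionCall-like object.
class FakeFunctionCall:
    def __init__(self, name, args):
        self.name = name
        self.args = args
```

The trade-off is exactly the one described in the comment: the manual version touches only public attributes, but it is more code to maintain if Google adds fields.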

stfines-clgx and others added 10 commits November 4, 2024 09:44
The BaseTool object serializes things using pydantic,
but these are JSON Schema objects, which do not currently
play well with protobuf. This version is an attempt
to handle that issue.
… Protobuf

This changes how plans are serialized so that they are compatible with
protobuf. It's a workaround until Google addresses the issue in
google-cloud-aiplatform.
Development

Successfully merging this pull request may close these issues.

[Bug]: Vertex LLM does not Handle FunctionCall tools
2 participants