
ainvoke / astream of ChatGoogleGenerativeAI raising TypeError when running in event loop #520

Open
Chengdyc opened this issue Sep 29, 2024 · 1 comment


@Chengdyc

Hi, I'm getting the following error when calling ainvoke or astream on ChatGoogleGenerativeAI: "TypeError: object GenerateContentResponse can't be used in 'await' expression".

Code to reproduce:

import asyncio
import os

import dotenv
from langchain_google_genai import ChatGoogleGenerativeAI
from pydantic import SecretStr


def create_llm() -> ChatGoogleGenerativeAI:
    llm = ChatGoogleGenerativeAI(
        model="gemini-1.5-flash-latest",
        google_api_key=SecretStr(os.getenv("GOOGLE_API_KEY")),
        client_options=None,
        transport="grpc",
        additional_headers=None,
        client=None,
        async_client=None,
        temperature=0.5,
    )
    return llm


async def main():
    dotenv.load_dotenv()
    print("verify that event loop is running")
    asyncio.get_event_loop()
    llm = create_llm()
    print("calling gemini")
    res = await llm.ainvoke("how are you")
    print(res)


if __name__ == "__main__":
    asyncio.run(main())
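astream raises the same TypeError with this setup; a minimal streaming variant of the script above (an untested sketch reusing the same create_llm) would look like:

async def main_stream():
    dotenv.load_dotenv()
    llm = create_llm()
    # each chunk is an AIMessageChunk; the same TypeError surfaces from this call
    async for chunk in llm.astream("how are you"):
        print(chunk.content, end="", flush=True)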

The output of the ainvoke script above, with exception trace:

verify that event loop is running
calling gemini
Traceback (most recent call last):
  File "/home/xxx/workspace/yyy/app/repro_genai_bug.py", line 34, in <module>
    asyncio.run(main())
  File "/usr/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/asyncio/base_events.py", line 654, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/home/xxx/workspace/yyy/app/repro_genai_bug.py", line 29, in main
    res = await llm.ainvoke("how are you")
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/xxx/workspace/yyy/venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 305, in ainvoke
    llm_result = await self.agenerate_prompt(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/xxx/workspace/yyy/venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 794, in agenerate_prompt
    return await self.agenerate(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/xxx/workspace/yyy/venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 754, in agenerate
    raise exceptions[0]
  File "/home/xxx/workspace/yyy/venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 930, in _agenerate_with_cache
    result = await self._agenerate(
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/xxx/workspace/yyy/venv/lib/python3.11/site-packages/langchain_google_genai/chat_models.py", line 1025, in _agenerate
    response: GenerateContentResponse = await _achat_with_retry(
                                        ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/xxx/workspace/yyy/venv/lib/python3.11/site-packages/langchain_google_genai/chat_models.py", line 238, in _achat_with_retry
    return await _achat_with_retry(**kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/xxx/workspace/yyy/venv/lib/python3.11/site-packages/tenacity/asyncio/__init__.py", line 185, in async_wrapped
    return await fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/xxx/workspace/yyy/venv/lib/python3.11/site-packages/tenacity/asyncio/__init__.py", line 111, in __call__
    do = await self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/xxx/workspace/yyy/venv/lib/python3.11/site-packages/tenacity/asyncio/__init__.py", line 153, in iter
    result = await action(retry_state)
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/xxx/workspace/yyy/venv/lib/python3.11/site-packages/tenacity/_utils.py", line 99, in inner
    return call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/xxx/workspace/yyy/venv/lib/python3.11/site-packages/tenacity/__init__.py", line 392, in <lambda>
    self._add_action_func(lambda rs: rs.outcome.result())
                                     ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/home/xxx/workspace/yyy/venv/lib/python3.11/site-packages/tenacity/asyncio/__init__.py", line 114, in __call__
    result = await fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/xxx/workspace/yyy/venv/lib/python3.11/site-packages/langchain_google_genai/chat_models.py", line 236, in _achat_with_retry
    raise e
  File "/home/xxx/workspace/yyy/venv/lib/python3.11/site-packages/langchain_google_genai/chat_models.py", line 229, in _achat_with_retry
    return await generation_method(**kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/xxx/workspace/yyy/venv/lib/python3.11/site-packages/google/ai/generativelanguage_v1beta/services/generative_service/async_client.py", line 406, in generate_content
    response = await rpc(
               ^^^^^^^^^^
TypeError: object GenerateContentResponse can't be used in 'await' expression

This was previously reported in #357, but the fix that was made (7467f84) only covers the case where async_client is None, which happens when no event loop is running. In my case an event loop is running, so that fix doesn't apply.
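A possible workaround (untested, and the link between the pinned transport and the error is an assumption on my part, not confirmed) is to stop pinning transport="grpc", so the async code path is free to pick an async-capable transport, or to switch to REST:

def create_llm_workaround() -> ChatGoogleGenerativeAI:
    # Hypothetical workaround sketch: omit the explicit "grpc" transport, or
    # use "rest", to avoid the non-awaitable GenerateContentResponse on the async path.
    return ChatGoogleGenerativeAI(
        model="gemini-1.5-flash-latest",
        google_api_key=SecretStr(os.getenv("GOOGLE_API_KEY")),
        transport="rest",  # assumption: REST sidesteps the grpc transport issue
        temperature=0.5,
    )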

Versions:

langchain-google-genai : 2.0.0
langchain              : 0.3.1
langchain-core         : 0.3.6
The langcarl bot added the investigate label on Sep 29, 2024.
@LukeSamkharadze

Bump
