
bug: azure settings no longer respected due to pydantic-settings==2.2.0 #857

Closed

Conversation

@lostmygithubaccount (Contributor)

I am beyond confused and would appreciate any insights here. This bug shows up in one project I have, but not another, on the same computer. Others using Marvin have had a similar experience: it works just fine in one environment but fails in another. I'm not sure what is going on.

I managed to fix it by inspecting the marvin.settings object and changing a few lines in the OpenAI provider, as shown in the PR.

This code doesn't seem to have changed in the last month, and downgrading to older versions of Marvin (I've tested 2.1.2 through 2.1.5) does not fix the bug in the one project where it's present. I really can't figure it out.

@lostmygithubaccount (Contributor Author)

Feel free to do something more proper and close this PR; I'm really looking to understand why this is happening in some projects and not others.

@lostmygithubaccount (Contributor Author)

OK, I was wrong on two fronts: in the project where it was working, I had marvin==2.1.2, and the marvin.settings are different.

@lostmygithubaccount (Contributor Author)

So at some point it seems marvin.settings.azure_openai_api_key became marvin.settings.marvin_azure_openai_api_key, and perhaps the Azure OpenAI client util code was not updated to match.

@lostmygithubaccount (Contributor Author)

Still confused, in that I can't find any commit where this seems to have changed.

@zzstoatzz (Collaborator) commented Feb 18, 2024

hey @lostmygithubaccount

Hmm, I've noticed something fishy about pydantic-settings==2.2.0, which was just released recently, but I haven't been able to make an MRE (minimal reproducible example).

can you try pip install pydantic-settings==2.1.0?

In case it's helpful while sleuthing: https://github.com/pydantic/pydantic-settings/releases/tag/v2.2.0

@lostmygithubaccount (Contributor Author) commented Feb 18, 2024

I think you're on it -- this is the project where it works:

(venv) cody@dkdcsot gtc-demo-2024 % pip list | grep pydantic
pydantic                  2.6.1
pydantic_core             2.16.2
pydantic-settings         2.1.0

this is the project where it doesn't work:

(venv) cody@dkdcsot ibis % pip list | grep pydantic
pydantic                      2.6.1
pydantic_core                 2.16.2
pydantic-settings             2.2.0

edit: downgrading in the latter project fixes it. Looking through pydantic-settings' commits, I don't see anything obvious 🤷 but I'm very glad we have a workaround

This is also consistent with what others have seen: things worked in their old environments but failed in one created today (which picked up the latest version).

@zzstoatzz (Collaborator) commented Feb 18, 2024

@lostmygithubaccount I'd like to figure this out

can you be more specific about what you're seeing when things go wrong?

I think I've noticed that values set in .env files seem to be ignoring their corresponding settings object's env_prefix and therefore "leaking" out of the settings object they should be in

either that or the "bootlegged" settings (i.e. the ones that are accessible via getattr on marvin.settings because we say extra="allow" on our BaseSettings subclass) are now being handled differently
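As a minimal sketch of that extra="allow" behavior (using a plain pydantic model, not Marvin's actual Settings class), undeclared fields are kept on the instance and become readable via getattr:

```python
from pydantic import BaseModel, ConfigDict

# Sketch only (not Marvin's real Settings class): with extra="allow",
# fields not declared on the model are stored and exposed as attributes.
class SketchSettings(BaseModel):
    model_config = ConfigDict(extra="allow")
    log_level: str = "INFO"

s = SketchSettings(azure_openai_api_key="sk-...")
print(getattr(s, "azure_openai_api_key", None))  # the "bootlegged" extra field
```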

zzstoatzz added the "bug" label Feb 18, 2024
@lostmygithubaccount (Contributor Author) commented Feb 18, 2024

Yep, so I have this in ~/.marvin/.env:

MARVIN_LOG_LEVEL=INFO

## Azure OpenAI
MARVIN_PROVIDER="azure_openai"
MARVIN_AZURE_OPENAI_API_KEY="..."
MARVIN_AZURE_OPENAI_ENDPOINT="https://birdbrain-eh.openai.azure.com"
MARVIN_AZURE_OPENAI_API_VERSION="2023-12-01-preview"
MARVIN_CHAT_COMPLETIONS_MODEL="gpt-4-turbo"

then I import marvin and run a simple example:

import marvin

result = marvin.classify(
    "Marvin is so easy to use!",
    labels=["positive", "negative"],
)
result

and get this traceback:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[1], line 3
      1 import marvin
----> 3 result = marvin.classify(
      4     "Marvin is so easy to use!",
      5     labels=["positive", "negative"],
      6 )
      7 result

File ~/repos/ibis/venv/lib/python3.11/site-packages/marvin/ai/text.py:350, in classify(data, labels, instructions, model_kwargs, client)
    328 """
    329 Classifies the provided data based on the provided labels.
    330 
   (...)
    346     T: The label that the data was classified into.
    347 """
    349 model_kwargs = model_kwargs or {}
--> 350 return _generate_typed_llm_response_with_logit_bias(
    351     prompt_template=CLASSIFY_PROMPT,
    352     prompt_kwargs=dict(data=data, labels=labels, instructions=instructions),
    353     model_kwargs=model_kwargs | dict(temperature=0),
    354     client=client,
    355 )

File ~/repos/ibis/venv/lib/python3.11/site-packages/marvin/ai/text.py:209, in _generate_typed_llm_response_with_logit_bias(prompt_template, prompt_kwargs, encoder, max_tokens, model_kwargs, client)
    205 grammar = cast_labels_to_grammar(
    206     labels=label_strings, encoder=encoder, max_tokens=max_tokens
    207 )
    208 model_kwargs.update(grammar.model_dump())
--> 209 response = generate_llm_response(
    210     prompt_template=prompt_template,
    211     prompt_kwargs=(prompt_kwargs or {}) | dict(labels=label_strings),
    212     model_kwargs=model_kwargs | dict(temperature=0),
    213     client=client,
    214 )
    216 # the response contains a single number representing the index of the chosen
    217 label_index = int(response.response.choices[0].message.content)

File ~/repos/ibis/venv/lib/python3.11/site-packages/marvin/ai/text.py:79, in generate_llm_response(prompt_template, prompt_kwargs, model_kwargs, client)
     59 def generate_llm_response(
     60     prompt_template: str,
     61     prompt_kwargs: Optional[dict] = None,
     62     model_kwargs: Optional[dict] = None,
     63     client: Optional[MarvinClient] = None,
     64 ) -> ChatResponse:
     65     """
     66     Generates a language model response based on a provided prompt template.
     67 
   (...)
     77         ChatResponse: The generated response from the language model.
     78     """
---> 79     client = client or MarvinClient()
     80     model_kwargs = model_kwargs or {}
     81     prompt_kwargs = prompt_kwargs or {}

    [... skipping hidden 1 frame]

File ~/repos/ibis/venv/lib/python3.11/site-packages/marvin/client/openai.py:152, in MarvinClient.<lambda>()
    146 class MarvinClient(pydantic.BaseModel):
    147     model_config = pydantic.ConfigDict(
    148         arbitrary_types_allowed=True, protected_namespaces=()
    149     )
    151     client: Client = pydantic.Field(
--> 152         default_factory=lambda: get_openai_client(is_async=False)
    153     )
    155     @classmethod
    156     def wrap(cls, client: Client) -> "Client":
    157         client.chat.completions.create = partial(
    158             cls(client=client).generate_chat, completion=client.chat.completions.create
    159         )  # type: ignore #noqa

File ~/repos/ibis/venv/lib/python3.11/site-packages/marvin/utilities/openai.py:71, in get_openai_client(is_async)
     68 azure_endpoint = getattr(marvin.settings, "azure_openai_endpoint", None)
     70 if any(k is None for k in [api_key, api_version, azure_endpoint]):
---> 71     raise ValueError(
     72         inspect.cleandoc(
     73             """
     74         Azure OpenAI configuration is missing. Marvin will not work properly without it.
     75         
     76         Please make sure to set the following environment variables:
     77             - MARVIN_AZURE_OPENAI_API_KEY
     78             - MARVIN_AZURE_OPENAI_API_VERSION
     79             - MARVIN_AZURE_OPENAI_ENDPOINT
     80             
     81         In addition, you must set the LLM model name to your Azure OpenAI deployment name, e.g.
     82             - MARVIN_CHAT_COMPLETIONS_MODEL = <your Azure OpenAI deployment name>
     83         """
     84         )
     85     )
     86 client_class = AsyncAzureOpenAI if is_async else AzureOpenAI
     87 kwargs.update(
     88     api_key=api_key, api_version=api_version, azure_endpoint=azure_endpoint
     89 )

ValueError: Azure OpenAI configuration is missing. Marvin will not work properly without it.

Please make sure to set the following environment variables:
    - MARVIN_AZURE_OPENAI_API_KEY
    - MARVIN_AZURE_OPENAI_API_VERSION
    - MARVIN_AZURE_OPENAI_ENDPOINT
    
In addition, you must set the LLM model name to your Azure OpenAI deployment name, e.g.
    - MARVIN_CHAT_COMPLETIONS_MODEL = <your Azure OpenAI deployment name>

This is happening because all the Azure OpenAI entries in marvin.settings now have marvin_ prefixed to them. Printing out marvin.settings:

from rich import print

print(marvin.settings)

gives (removing a bunch of stuff):

Settings(
    provider='azure_openai',
    ....
    log_level='INFO',
    log_verbose=False,
    marvin_azure_openai_api_key='...',
    marvin_azure_openai_endpoint='https://birdbrain-eh.openai.azure.com',
    marvin_azure_openai_api_version='2023-12-01-preview',
    marvin_chat_completions_model='gpt-4-turbo'

while in utilities/openai.py it's checking for:

    # --- Azure OpenAI
    elif marvin.settings.provider == "azure_openai":
        api_key = getattr(marvin.settings, "azure_openai_api_key", None)
        api_version = getattr(marvin.settings, "azure_openai_api_version", None)
        azure_endpoint = getattr(marvin.settings, "azure_openai_endpoint", None)

and if I downgrade pydantic-settings, the marvin_ prefix is no longer there and everything works fine
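One way that lookup could be made resilient to both behaviors is a fallback that also checks the prefixed name. This is a hypothetical sketch (the get_setting helper and the SimpleNamespace stand-in are mine, not Marvin's API):

```python
from types import SimpleNamespace

# Stand-in for marvin.settings under pydantic-settings 2.2.0, where extra
# env vars keep the "marvin_" prefix as part of the attribute name.
settings = SimpleNamespace(marvin_azure_openai_api_key="...")

def get_setting(settings, name, default=None):
    """Try the unprefixed attribute first, then the prefixed variant."""
    value = getattr(settings, name, None)
    if value is not None:
        return value
    return getattr(settings, f"marvin_{name}", default)

print(get_setting(settings, "azure_openai_api_key"))  # found under either naming
```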

@zzstoatzz (Collaborator)

@lostmygithubaccount thank you! I'll see what I can figure out

zzstoatzz changed the title from "bug: fix azure openai" to "bug: azure settings no longer respected due to pydantic-settings==2.2.0" Feb 19, 2024
@zzstoatzz (Collaborator) commented Feb 19, 2024

Okay, I think I have an MRE; now it's just a matter of why 🙂

settings.py

from pydantic_settings import BaseSettings, SettingsConfigDict

class FooSettings(BaseSettings):
    model_config = SettingsConfigDict(
        extra="allow",
        env_file=".env",
        env_prefix="foo_"
    )
    
    x: str

settings = FooSettings()

print(f'{getattr(settings, "y", None)=}')
print(f'{getattr(settings, "foo_y", None)=}')

.env

FOO_X=something
FOO_Y=something_else

on 2.2.0

(pydantic-settings-2dot2) pad-2 :: src/open-source/mre
» python settings.py
getattr(settings, "y", None)=None
getattr(settings, "foo_y", None)='something_else'

» pip list | rg settings
pydantic-settings 2.2.0

on 2.1.0

(pydantic-settings-2dot1) pad-2 :: src/open-source/mre
» python settings.py
getattr(settings, "y", None)='something_else'
getattr(settings, "foo_y", None)=None

» pip list | rg settings
pydantic-settings 2.1.0
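The difference can be paraphrased in plain Python (an assumption about what changed, not pydantic-settings' actual implementation): 2.1.0 appears to strip the env_prefix from extra env vars before attaching them, while 2.2.0 keeps the prefix on extras.

```python
ENV = {"FOO_X": "something", "FOO_Y": "something_else"}
DECLARED = {"x"}  # fields declared on FooSettings
PREFIX = "foo_"

def load(env, declared, prefix, strip_prefix_from_extras):
    # Paraphrase of the env-var mapping, not the real pydantic-settings code.
    out = {}
    for key, value in env.items():
        name = key.lower()
        if not name.startswith(prefix):
            continue
        stripped = name[len(prefix):]
        if stripped in declared or strip_prefix_from_extras:
            out[stripped] = value  # 2.1.0-like: prefix stripped everywhere
        else:
            out[name] = value      # 2.2.0-like: extras keep the prefix
    return out

print(load(ENV, DECLARED, PREFIX, strip_prefix_from_extras=True))   # {'x': ..., 'y': ...}
print(load(ENV, DECLARED, PREFIX, strip_prefix_from_extras=False))  # {'x': ..., 'foo_y': ...}
```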

@zzstoatzz (Collaborator) commented Feb 19, 2024

Opened a PR to fix this on pydantic-settings.

edit: this has been merged upstream and will likely go out with the next release

@zzstoatzz (Collaborator)

Closing, as the fix has been released upstream.

zzstoatzz closed this Feb 19, 2024