fix(llm): address an issue where saved AI API keys are not carried over to the next sessions #76
Comments
I have noticed some possible issues with the proposal of using the env variables: […]
@dakshpokar Please investigate more reliable solutions. Ideally, we want to make the […]
Greetings Professor @jooyoungseo,

Upon investigating the issue, I discovered that the changing port number with each run in interactive mode causes the loss of the local storage variables. This is because the local storage where we store the OPENAI and GEMINI keys is scoped to the origin, which includes the port.

As you discussed in the previous comment, I also thought of passing the keys from the Python binder to the browser frontend, but that could potentially be a security risk.

Another solution I explored was storing the keys upon initial entry and passing them to subsequent runs. However, this is not feasible due to the Same Origin Policy.

I also found that there is a way to share local storage data across domains (and ports), but for that we need to know in advance which domain the local storage data should be injected into. In our case we cannot know this, since every run uses a unique port, so this option is eliminated.

Lastly, I considered using cookies, as their Same Origin Policy is based on the domain name, not the port. Since our domain remains 'localhost', this could work. However, there is a risk that any web application could access the cookies and retrieve the keys, making this approach unsuitable for our needs.

I am thinking of some other solutions as well and will keep you posted, but this is what I have found so far. I will take tomorrow to find a reliable fix.

Best regards,
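A minimal sketch (illustrative only, not py-maidr code) of why the saved keys disappear: the interactive server binds a fresh ephemeral port on each run, and `localStorage` is scoped to the full origin (scheme + host + port), so values saved under one port are invisible under the next.

```python
import socket

def pick_free_port() -> int:
    # Port 0 asks the OS for any currently free port, which is what gives
    # each interactive run a different origin.
    with socket.socket() as s:
        s.bind(("localhost", 0))
        return s.getsockname()[1]

for run in range(1, 3):
    # Each run prints a different origin, so localStorage saved under the
    # previous origin is never seen again.
    print(f"run {run}: origin = http://localhost:{pick_free_port()}")
```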
@dakshpokar Can we pin down the port number to make it stable? What's the tradeoff?
Greetings Professor @jooyoungseo, […]
Professor @jooyoungseo, the following snippet sets a key/value pair in the maidr iframe's local storage from the parent page:

```js
function addKeyValueLocalStorage(iframeId, key, value) {
  const iframe = document.getElementById(iframeId);
  if (iframe && iframe.contentWindow) {
    try {
      iframe.contentWindow.localStorage.setItem(key, value);
    } catch (error) {
      console.error('Error accessing iframe localStorage:', error);
    }
  } else {
    console.error('Iframe not found or inaccessible.');
  }
}

addKeyValueLocalStorage('myIframe', 'openAIKey', <<fetch securely from python binder>>);
```

Within the iframe this does work perfectly; we just have to ask the user in the Python binder for the OpenAI and/or Gemini keys. Once that is done, we can store these keys in an encrypted manner and fetch them on the go whenever an instance is run.

cc: @SaaiVenkat
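A rough sketch of how the Python binder could fill in the `<<fetch securely from python binder>>` placeholder, under the assumption that the keys are available as environment variables; the helper name `build_llm_key_script` and the localStorage key names are illustrative, not the actual py-maidr API:

```python
import json
import os

def build_llm_key_script(iframe_id: str = "myIframe") -> str:
    """Return a <script> block that calls addKeyValueLocalStorage() for each
    API key found in the host environment (hypothetical helper)."""
    calls = []
    for env_var, storage_key in (("OPENAI_API_KEY", "openAIKey"),
                                 ("GOOGLE_API_KEY", "geminiKey")):
        value = os.getenv(env_var)
        if value:
            # json.dumps() quotes and escapes the strings so they embed safely in JS.
            calls.append(
                f"addKeyValueLocalStorage({json.dumps(iframe_id)}, "
                f"{json.dumps(storage_key)}, {json.dumps(value)});"
            )
    return "<script>\n" + "\n".join(calls) + "\n</script>"
```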
@dakshpokar Why do we have to ask users to enter the keys each time? Could we just fetch the keys from the user's env variables for both OpenAI and Gemini?
Yes, Professor @jooyoungseo, we will store the keys in environment variables and will not ask for them each time. I would like clarification on when to request the keys: if a user denies the request initially, should we ask again? I am considering the best approach to ensure a positive experience for our target audience.
@dakshpokar -- I would rather include the instructions on how to add the env variables in the documentation. Just fetch the keys from the user's environment.
Sure, Professor @jooyoungseo, that works!
## Description

This pull request fixes the handling of API keys for LLMs in the code. It adds a JavaScript script to handle the API keys for LLMs and initializes the LLM secrets in the MAIDR instance. The script injects the LLM API keys into the MAIDR instance and sets the appropriate settings based on the presence of the Gemini and OpenAI API keys. This ensures that the LLM functionality works correctly with the updated API key handling.

Closes #76

## Type of Change

- [x] Bug fix
- [ ] New feature
- [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
- [ ] Documentation update

## Checklist

- [x] My code follows the style guidelines of this project
- [x] I have performed a self-review of my code
- [x] I have commented my code, particularly in hard-to-understand areas
- [x] I have made corresponding changes to the documentation
- [x] My changes generate no new warnings
- [x] Any dependent changes have been merged and published in downstream modules

# Pull Request

## Description

1. Added a new method called `initialize_llm_secrets()` in environment.py which fetches the keys from the environment variables.
2. Injected the script when the maidr iframe loads initially.

## Checklist

- [x] I have read the [Contributor Guidelines](../CONTRIBUTING.md).
- [x] I have performed a self-review of my own code and ensured it follows the project's coding standards.
- [x] I have tested the changes locally following `ManualTestingProcess.md`, and all tests related to this pull request pass.
- [x] I have commented my code, particularly in hard-to-understand areas.
- [x] I have updated the documentation, if applicable.
- [x] I have added appropriate unit tests, if applicable.
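As a hedged sketch of the first change described above (fetch the keys from the environment and decide which LLM settings to enable), the function body and return shape below are assumptions, not the merged environment.py code:

```python
import os

def initialize_llm_secrets() -> dict:
    """Sketch: read the LLM API keys from the environment and report which
    LLM option the settings should enable based on which keys are present."""
    openai_key = os.getenv("OPENAI_API_KEY")
    gemini_key = os.getenv("GOOGLE_API_KEY")

    if openai_key and gemini_key:
        llm_option = "both"
    elif openai_key:
        llm_option = "openai"
    elif gemini_key:
        llm_option = "gemini"
    else:
        llm_option = None  # no keys found; leave the LLM features disabled

    return {
        "openai_api_key": openai_key or "",
        "gemini_api_key": gemini_key or "",
        "llm_option": llm_option,
    }
```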
Reproducible Steps

1. Create any plot and display it via `maidr.show()` (a minimal script is sketched after these steps)
2. Hit the H key and provide API keys for both OpenAI and Google Gemini
3. Click Save and Close
4. If prompted, save the password in your browser
5. Open the LLM from the interactive plot area via Ctrl+Shift+/ (on Windows) or Alt+Shift+/ (on Mac)
6. Make sure the AI responses are working
7. Close the maidr browser tab and exit the current Python REPL
8. Repeat the steps above and see whether the AI API keys are preserved in the next session
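A minimal reproduction script for step 1, assuming matplotlib and py-maidr are installed; the exact plot type does not matter:

```python
import matplotlib.pyplot as plt
import maidr

# Any supported plot works; a bar plot is used here for brevity.
bar_plot = plt.bar(["a", "b", "c"], [3, 1, 2])

# Assumption: maidr.show() takes the plot object, as in the py-maidr examples.
maidr.show(bar_plot)
```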
Current Behavior
API keys are not carried over to the next session even when they are saved in the browser
Suggested Solution
When `maidr.show` is executed, search for the following env variables via `os.getenv()`:
- `OPENAI_API_KEY`
- `GOOGLE_API_KEY`
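A minimal sketch of that lookup (the wiring into `maidr.show` itself is omitted):

```python
import os

openai_key = os.getenv("OPENAI_API_KEY")
google_key = os.getenv("GOOGLE_API_KEY")

if not (openai_key or google_key):
    # Neither key is set; the AI features would stay disabled for this session.
    print("No LLM API keys found in the environment.")
```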