Combining ReplGPT.jl and Term.jl to get styled error messages and AI-generated help in the REPL.
The help messages are generated using GPT-3, a state-of-the-art language model from OpenAI. The code for interacting with the OpenAI API is based on ReplGPT.jl.
You will need to obtain an OpenAI API key from openai.com and pass it to Julia. HelpGPT.jl will look for an API key in the module's settings and in the OPENAI_API_KEY environment variable as a fallback.
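If you prefer the environment-variable fallback, one option is to set OPENAI_API_KEY before loading the package; a minimal sketch (the key value is a placeholder):

```julia
# Set OPENAI_API_KEY for the current Julia session only.
# (You could also export it from your shell before starting Julia.)
ENV["OPENAI_API_KEY"] = "<YOUR KEY HERE>"

using HelpGPT
```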
The recommended approach is to save the API key in the module's settings by running:
```julia
julia> using HelpGPT

julia> HelpGPT.setAPIkey("<YOUR KEY HERE>")
```
The API key can later be cleared with HelpGPT.clearAPIkey().
Note: with this approach your API key will be stored in plaintext in a LocalPreferences.toml file in your environment directory. For example, on a Linux computer running Julia 1.8, the key is stored in ~/.julia/environments/v1.8/LocalPreferences.toml.
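If you want to confirm whether a key has been saved, a rough sketch using Preferences.jl is shown below; the preference name "openai_api_key" is an assumption, so check the LocalPreferences.toml file itself for the exact key HelpGPT writes:

```julia
using Preferences
using HelpGPT

# Hypothetical: the preference name "openai_api_key" is an assumption; open
# LocalPreferences.toml to see the exact key HelpGPT actually uses.
if load_preference(HelpGPT, "openai_api_key") === nothing
    @info "No stored API key found in LocalPreferences.toml"
else
    @info "An API key is stored (in plaintext) in LocalPreferences.toml"
end
```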
After that, you just need to:
```julia
using HelpGPT
HelpGPT.install_help_stacktrace()
```
From then on, all errors will be styled and the help messages will be generated by GPT-3.
--- WARNING --- This overrides Julia's error printing and stacktrace printing, which means you probably shouldn't use this in a production environment. See here for more details.
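If you want this enabled automatically in interactive sessions only, one option is a guarded block in your ~/.julia/config/startup.jl; a minimal sketch, assuming HelpGPT is installed in your default environment:

```julia
# ~/.julia/config/startup.jl
# Enable HelpGPT only when a REPL starts, never for scripts.
atreplinit() do _
    try
        @eval using HelpGPT
        @eval HelpGPT.install_help_stacktrace()
    catch err
        @warn "HelpGPT could not be loaded" exception = err
    end
end
```

Keeping this behind atreplinit means non-interactive runs (e.g. julia myscript.jl) are unaffected.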