
llm chat -c fails for API key models #247

Closed
simonw opened this issue Sep 6, 2023 · 2 comments
Labels
bug Something isn't working

simonw commented Sep 6, 2023

Running this:

llm -m 4 'hello'

And then:

llm chat -c

Produces this error:

Chatting with gpt-4
Type 'exit' or 'quit' to exit
> hi
Traceback (most recent call last):
  File "/opt/homebrew/Caskroom/miniconda/base/bin/llm", line 8, in <module>
    sys.exit(cli())
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.10/site-packages/llm/cli.py", line 382, in chat
    for chunk in response:
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.10/site-packages/llm/models.py", line 91, in __iter__
    for chunk in self.model.execute(
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.10/site-packages/llm/default_plugins/openai_models.py", line 271, in execute
    completion = openai.ChatCompletion.create(
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.10/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 149, in create
    ) = cls.__prepare_create_request(
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 106, in __prepare_create_request
    requestor = api_requestor.APIRequestor(
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.10/site-packages/openai/api_requestor.py", line 138, in __init__
    self.api_key = key or util.default_api_key()
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.10/site-packages/openai/util.py", line 186, in default_api_key
    raise openai.error.AuthenticationError(
openai.error.AuthenticationError: No API key provided. You can set your API key in code using 'openai.api_key = <API-KEY>', or you can set the environment variable OPENAI_API_KEY=<API-KEY>). If your API key is stored in a file, you can point the openai module at it with 'openai.api_key_path = <PATH>'. You can generate API keys in the OpenAI web interface. See https://platform.openai.com/account/api-keys for details.
@simonw simonw added the bug Something isn't working label Sep 6, 2023
simonw commented Sep 6, 2023

Relevant code:

llm/llm/cli.py

Lines 330 to 366 in b2fc0a1

if conversation_id or _continue:
    # Load the conversation - loads most recent if no ID provided
    try:
        conversation = load_conversation(conversation_id)
    except UnknownModelError as ex:
        raise click.ClickException(str(ex))
template_obj = None
if template:
    params = dict(param)
    # Cannot be used with system
    if system:
        raise click.ClickException("Cannot use -t/--template and --system together")
    template_obj = load_template(template)
    if model_id is None and template_obj.model:
        model_id = template_obj.model
# Figure out which model we are using
if model_id is None:
    if conversation:
        model_id = conversation.model.model_id
    else:
        model_id = get_default_model()
# Now resolve the model
try:
    model = get_model(model_id)
except KeyError:
    raise click.ClickException("'{}' is not a known model".format(model_id))
# Provide the API key, if one is needed and has been provided
if model.needs_key:
    model.key = get_key(key, model.needs_key, model.key_env_var)
if conversation is None:
    # Start a fresh conversation for this chat
    conversation = Conversation(model=model)

For some reason, when the conversation is loaded it doesn't end up with the key, whereas a conversation created fresh by this code DOES get the key:

llm/llm/cli.py

Lines 364 to 366 in b2fc0a1

if conversation is None:
    # Start a fresh conversation for this chat
    conversation = Conversation(model=model)

simonw commented Sep 6, 2023

Here's why:

llm/llm/cli.py

Lines 330 to 335 in b2fc0a1

if conversation_id or _continue:
    # Load the conversation - loads most recent if no ID provided
    try:
        conversation = load_conversation(conversation_id)
    except UnknownModelError as ex:
        raise click.ClickException(str(ex))

That code doesn't ensure that the model object with the working API key attached is the one made available to the loaded conversation - the conversation comes back with its own key-less model instead.
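The bug pattern can be sketched with a minimal, self-contained stand-in (the Model and Conversation classes here are simplified stand-ins for llm's internals, not its real API - the names and structure are assumptions for illustration):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Model:
    model_id: str
    key: Optional[str] = None  # API key attached after resolution


@dataclass
class Conversation:
    model: Model


def load_conversation() -> Conversation:
    # Simulates load_conversation(): the model is rebuilt from the
    # logged model_id, so the new Model instance has no API key.
    return Conversation(model=Model(model_id="gpt-4"))


# The CLI resolves the model and attaches the key to it...
model = Model(model_id="gpt-4")
model.key = "sk-example"

# ...but the loaded conversation still holds its own key-less model:
conversation = load_conversation()
assert conversation.model.key is None  # this is the bug

# One possible fix: hand the keyed model object to the loaded conversation
# so the chat loop executes against it.
conversation.model = model
assert conversation.model.key == "sk-example"
```

Under this sketch, the fresh-conversation path never hits the bug because `Conversation(model=model)` is constructed with the already-keyed model object.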

@simonw simonw closed this as completed in 17e6402 Sep 6, 2023
@simonw simonw added this to the 0.10 milestone Sep 10, 2023
simonw added a commit that referenced this issue Sep 12, 2023