litellm generator input validation breaks on default cases #755

Open
leondz opened this issue Jun 25, 2024 · 0 comments
Labels: bug (Something isn't working), generators (Interfaces with LLMs)

leondz commented Jun 25, 2024

litellm can be invoked with or without an explicit provider; when none is given, litellm infers one from the model name

from https://docs.litellm.ai/docs/#litellm-python-sdk (example 1):

from litellm import completion
import os

## set ENV variables
os.environ["OPENAI_API_KEY"] = "your-api-key"

response = completion(
  model="gpt-3.5-turbo",
  messages=[{ "content": "Hello, how are you?","role": "user"}]
)
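
for contrast, the provider can also be named explicitly using litellm's provider/model prefix convention. a minimal sketch reusing the key set above (the exact call is assumed, not copied from the docs):

# same request, with the provider given explicitly in the model string;
# litellm routes on the "openai/" prefix
response = completion(
  model="openai/gpt-3.5-turbo",
  messages=[{ "content": "Hello, how are you?","role": "user"}]
)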

the garak LiteLLMGenerator constructor, however, demands a provider:

        if self.provider is None:
            raise ValueError(
                "litellm generator needs to have a provider value configured - see docs"
            )

this breaks the garak litellm test, which now fails (the test runs when OPENAI_API_KEY is set):

tests/generators/test_litellm.py:16: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <garak.generators.litellm.LiteLLMGenerator object at 0x7a2378d0e250>, name = 'gpt-3.5-turbo', generations = 10, config_root = <module 'garak._config' from '/home/lderczynski/dev/garak/garak/_config.py'>

...
E               ValueError: litellm generator needs to have a provider value configured - see docs

garak/generators/litellm.py:131: ValueError

resolution: the provider constraint should be relaxed, perhaps removed entirely, and any exceptions litellm raises for non-existent models should be allowed to bubble up
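
as a sketch, the relaxed behaviour could look like the hypothetical helper below (not garak's actual code; the function name and signature are assumptions, the provider/model prefix convention is litellm's own):

from typing import Optional

import litellm

def generate(model_name: str, prompt: str, provider: Optional[str] = None) -> str:
    # hypothetical helper: skip the up-front provider check; when provider is
    # None, litellm infers one from the model name (e.g. "gpt-3.5-turbo" -> openai)
    model = f"{provider}/{model_name}" if provider is not None else model_name
    # if the model doesn't exist, litellm raises its own exception here;
    # letting it propagate replaces garak's pre-validation
    response = litellm.completion(
        model=model,
        messages=[{"content": prompt, "role": "user"}],
    )
    return response.choices[0].message.content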

leondz added the bug (Something isn't working) and generators (Interfaces with LLMs) labels on Jun 25, 2024