
Gemini client #1953

Open · wants to merge 11 commits into base: main
Conversation

marouanetalaa

Description

Implement a Gemini LLMClient and a test for it.

Related Issue

Implement a Gemini LLMClient #1901

Type of Change

  • 📚 Examples / docs / tutorials / dependencies update
  • 🔧 Bug fix (non-breaking change which fixes an issue)
  • 🥂 Improvement (non-breaking change which improves an existing feature)
  • 🚀 New feature (non-breaking change which adds functionality)
  • 💥 Breaking change (fix or feature that would cause existing functionality to change)
  • 🔐 Security fix

Checklist

  • I've read the CODE_OF_CONDUCT.md document.
  • I've read the CONTRIBUTING.md guide.
  • I've written tests for all new methods and classes that I created.
  • I've written the docstring in Google format for all the methods and classes that I used.
  • I've updated pdm.lock by running pdm update-lock (only applicable when pyproject.toml has been
    modified)

@kevinmessiaen kevinmessiaen self-requested a review June 11, 2024 02:05
@kevinmessiaen
Member

Thanks for the contribution 👍 I'll take a look at the PR!

Member

@kevinmessiaen kevinmessiaen left a comment


The client is working properly; however, it doesn't integrate with the Giskard features (scan, RAGET, ...).

The ChatMessage objects that are sent as input to the LLM client can have 3 roles (system, assistant and user).

In our case, assistant should be mapped to model. For system, we will need to map it to user with a custom format so that the model understands the importance of the system prompt: https://www.googlecloudcommunity.com/gc/AI-ML/Implementing-System-Prompts-in-Gemini-Pro-for-Chatbot-Creation/m-p/715501

Let me know if you need help on this. You can take a look at our Bedrock client implementation.

PS: ideally, this code snippet should work:

client = GeminiClient()
response = client.complete(messages=[
    ChatMessage(role='system', content='You are a "ping" service that always replies with "IT WORKS!"'),
    ChatMessage(role='user', content='Hello, does it work?'),
    ChatMessage(role='assistant', content='IT WORKS!'),
    ChatMessage(role='user', content='What is your goal?'),
])

assert response.role == 'assistant'  # model should be mapped back to assistant in the response too
assert response.content == 'IT WORKS!'
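One way to implement the role mapping described above is sketched below as a standalone helper. The `to_gemini_content` name, the `[SYSTEM INSTRUCTION]` marker, and the minimal `ChatMessage` dataclass are illustrative assumptions, not the actual Giskard types:

```python
from dataclasses import dataclass


@dataclass
class ChatMessage:
    role: str
    content: str


# Gemini only accepts "user" and "model" as authors in its content dicts
_AUTHOR_MAP = {"user": "user", "assistant": "model"}


def to_gemini_content(messages):
    """Convert ChatMessage objects to Gemini-style content dicts."""
    contents = []
    for msg in messages:
        if msg.role == "system":
            # Gemini has no system role: fold it into a user turn with an
            # explicit marker so the model treats it as an instruction.
            contents.append(
                {"role": "user", "parts": [f"[SYSTEM INSTRUCTION]\n{msg.content}"]}
            )
        else:
            contents.append({"role": _AUTHOR_MAP[msg.role], "parts": [msg.content]})
    return contents
```

In the response direction, the client would map the model author back to assistant before returning a ChatMessage. Note that Gemini's chat API may also require strictly alternating user/model turns, so consecutive same-author entries might need to be merged.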


from logging import warning

from google.generativeai.types import ContentDict
Member


This should also be moved into the try clause at line 12 in order to generate the proper error message.
