
Gemini client #1953

Merged
merged 21 commits into from
Jul 3, 2024
Conversation

marouanetalaa
Contributor

Description

Implement a Gemini LLMClient and test for it.

Related Issue

Implement a Gemini LLMClient #1901

Type of Change

  • 📚 Examples / docs / tutorials / dependencies update
  • 🔧 Bug fix (non-breaking change which fixes an issue)
  • 🥂 Improvement (non-breaking change which improves an existing feature)
  • 🚀 New feature (non-breaking change which adds functionality)
  • 💥 Breaking change (fix or feature that would cause existing functionality to change)
  • 🔐 Security fix

Checklist

  • I've read the CODE_OF_CONDUCT.md document.
  • I've read the CONTRIBUTING.md guide.
  • I've written tests for all new methods and classes that I created.
  • I've written the docstring in Google format for all the methods and classes that I used.
  • I've updated pdm.lock by running pdm update-lock (only applicable when pyproject.toml has been
    modified)

@kevinmessiaen kevinmessiaen self-requested a review June 11, 2024 02:05
@kevinmessiaen
Member

Thanks for the contribution 👍 I'll take a look at the PR!

Member

@kevinmessiaen kevinmessiaen left a comment


The client is working properly; however, it doesn't integrate with the Giskard features (scan, RAGET, ...).

The ChatMessage objects that are sent as input to the LLM client can have three roles (system, assistant, and user).

In our case assistant should be mapped to model. For system we will need to map it to user with a custom format so that the model understands the importance of the system prompt: https://www.googlecloudcommunity.com/gc/AI-ML/Implementing-System-Prompts-in-Gemini-Pro-for-Chatbot-Creation/m-p/715501

Let me know if you need help on this. You can take a look at our Bedrock client implementation.

PS: ideally this code snippet should be working:

client = GeminiClient()
response = client.complete(messages=[
    ChatMessage(role='system', content='You are a "ping" service that always reply with "IT WORKS!"'),
    ChatMessage(role='user', content='Hello, does it work?'),
    ChatMessage(role='assistant', content='IT WORKS!'),
    ChatMessage(role='user', content='What is your goal?'),
])

assert response.role == 'assistant'  # model should be mapped to assistant in the response too
assert response.content == 'IT WORKS!'
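
The role mapping described above could be sketched as follows. This is a hypothetical helper, not the actual Giskard implementation: the to_gemini_messages name and the "System instruction:" prefix are assumptions, and the dict shape mirrors the {"role": ..., "parts": [...]} message format used by the Gemini chat API.

```python
def to_gemini_messages(messages):
    """Map (role, content) pairs to Gemini-style message dicts.

    Gemini's chat API only accepts "user" and "model" roles, so:
    - "assistant" is mapped to "model";
    - "system" is folded into a "user" message with an explicit
      prefix so the model treats it as instructions.
    """
    converted = []
    for role, content in messages:
        if role == "assistant":
            converted.append({"role": "model", "parts": [content]})
        elif role == "system":
            converted.append(
                {"role": "user", "parts": [f"System instruction: {content}"]}
            )
        else:  # "user" (and any unknown role) passes through as user
            converted.append({"role": "user", "parts": [content]})
    return converted
```

On the way back, a client would apply the inverse mapping to the response, translating "model" to "assistant" so the snippet above holds.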

giskard/llm/client/gemini.py (review thread, outdated, resolved)
@kevinmessiaen kevinmessiaen self-assigned this Jun 24, 2024
@kevinmessiaen kevinmessiaen merged commit 565f58c into Giskard-AI:main Jul 3, 2024
14 of 15 checks passed