
[Feat/Fix] Refactoring Llava models into single file #475

Merged
merged 2 commits into from
May 26, 2024

Conversation

Luodian
Contributor

@Luodian Luodian commented May 26, 2024

This PR includes the addition of new model classes for handling different configurations.

Key Changes

  1. Introduced the LlavaQwenForCausalLM and LlavaMistralForCausalLM classes, extending LlavaLlamaForCausalLM with configurations specific to the Qwen2 and Mistral models, respectively.

  2. Modified model_runner.py to refactor the entry-class definition so that a single module can export multiple model classes (the Llava variants built on different LLMs now live in one module, which cleans up the code).

Fix

  1. Fixed small template issues in the example usages of the llava_qwen and llava_llama3 models.
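A minimal sketch of the structure the two key changes describe: the Qwen2 and Mistral variants subclass LlavaLlamaForCausalLM and swap in a different language-model backbone, and the module exports a list of entry classes so the model runner can match any of them. The backbone stand-in classes and the `language_model_cls` hook are illustrative assumptions, not the actual sglang implementation.

```python
# Stand-ins for the real language-model backbones (illustrative only).
class LlamaForCausalLM:
    def __init__(self, config):
        self.config = config

class Qwen2ForCausalLM(LlamaForCausalLM):
    pass

class MistralForCausalLM(LlamaForCausalLM):
    pass


class LlavaLlamaForCausalLM:
    """Shared Llava logic (vision tower, projector, generation),
    paired with a Llama backbone by default."""
    language_model_cls = LlamaForCausalLM

    def __init__(self, config):
        self.config = config
        # The subclass's class attribute selects the backbone.
        self.language_model = self.language_model_cls(config)


class LlavaQwenForCausalLM(LlavaLlamaForCausalLM):
    """Same Llava logic, Qwen2 backbone."""
    language_model_cls = Qwen2ForCausalLM


class LlavaMistralForCausalLM(LlavaLlamaForCausalLM):
    """Same Llava logic, Mistral backbone."""
    language_model_cls = MistralForCausalLM


# With the model_runner.py change, one module can export several entry
# classes; the loader matches the architecture name from the model's
# config against this list instead of expecting a single class.
EntryClass = [LlavaLlamaForCausalLM, LlavaQwenForCausalLM, LlavaMistralForCausalLM]
```

Keeping the variants as thin subclasses means the vision-side code exists once, and adding another backbone is a two-line class definition plus an entry in `EntryClass`.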

@Luodian
Contributor Author

Luodian commented May 26, 2024

This is my test: after the refactoring, llava_qwen and llava_llama3 work correctly. (Please ignore the template message in the output; it was for debugging and is removed in the commit.)

[Screenshots: test outputs for llava_qwen and llava_llama3]

@Luodian
Contributor Author

Luodian commented May 26, 2024

@merrymercy @BabyChouSr @Qubitium

Thanks for maintaining such a wonderful project. Please review this PR and feel free to give more suggestions, thanks!

@Luodian Luodian changed the title [Fix and Feat] Refactoring Llava models into single file [Feat/Fix] Refactoring Llava models into single file May 26, 2024
@merrymercy merrymercy merged commit 2b605ab into sgl-project:main May 26, 2024