[Version] Bump 0.2.0 #136

Merged
Harold-lkk merged 1 commit into InternLM:main from v0.2.0 on Jan 31, 2024

Conversation

Harold-lkk (Collaborator)

  • Stream Output: Provides the stream_chat interface for streaming output, enabling streaming demos in your local setup.

  • Interfaces are unified, with a comprehensive design upgrade for enhanced extensibility (see the sketch after this list), including:

    • Model: Whether it's the OpenAI API, Transformers, or LMDeploy inference acceleration framework, you can seamlessly switch between models.
    • Action: Simple inheritance and decoration allow you to create your own personal toolkit, adaptable to both InternLM and GPT.
    • Agent: Consistent with the Model's input interface, the transformation from model to intelligent agent takes only one step, making it easy to explore and implement various agents.
  • Documentation has been thoroughly upgraded, with full API coverage.
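
To make the unified design above more concrete, here is a minimal, self-contained sketch of the pattern the changelog describes: a model exposing a `stream_chat` generator, an action built by inheritance plus a decorator, and an agent that keeps the model's input interface. Only the name `stream_chat` comes from these release notes; every other class and helper (`BaseModel`, `EchoModel`, `BaseAction`, `register_tool`, `Agent`, `TOOLKIT`) is a hypothetical stand-in for illustration, not Lagent's actual API.

```python
from typing import Dict, Generator, List

# A chat message, e.g. {"role": "user", "content": "..."}
Message = Dict[str, str]

# Global registry that the (hypothetical) decorator fills in.
TOOLKIT: Dict[str, "BaseAction"] = {}


class BaseModel:
    """Hypothetical unified model interface: any backend (OpenAI API,
    Transformers, LMDeploy, ...) would expose the same two methods."""

    def chat(self, messages: List[Message]) -> str:
        raise NotImplementedError

    def stream_chat(self, messages: List[Message]) -> Generator[str, None, None]:
        raise NotImplementedError


class EchoModel(BaseModel):
    """Toy backend used only to make the sketch runnable."""

    def chat(self, messages: List[Message]) -> str:
        return "echo: " + messages[-1]["content"]

    def stream_chat(self, messages: List[Message]) -> Generator[str, None, None]:
        # Yield the reply piece by piece, so a demo can render output
        # incrementally instead of waiting for the full reply.
        for token in self.chat(messages).split():
            yield token + " "


class BaseAction:
    """Hypothetical action base class: subclass it to add a tool."""

    name = "base"

    def run(self, query: str) -> str:
        raise NotImplementedError


def register_tool(cls):
    """Hypothetical decorator that drops an action into the shared toolkit."""
    TOOLKIT[cls.name] = cls()
    return cls


@register_tool
class Calculator(BaseAction):
    """Minimal tool created via inheritance plus decoration."""

    name = "calculator"

    def run(self, query: str) -> str:
        return str(eval(query, {"__builtins__": {}}, {}))  # demo only


class Agent:
    """Turning a model into an agent is one step: wrap it with a toolkit."""

    def __init__(self, model: BaseModel, toolkit: Dict[str, BaseAction]):
        self.model = model
        self.toolkit = toolkit

    def stream_chat(self, messages: List[Message]) -> Generator[str, None, None]:
        # Same input interface as the model, so callers do not change.
        yield from self.model.stream_chat(messages)


if __name__ == "__main__":
    agent = Agent(EchoModel(), TOOLKIT)
    for chunk in agent.stream_chat([{"role": "user", "content": "hello streaming"}]):
        print(chunk, end="", flush=True)
    print()
```

Running the script prints the reply chunk by chunk, which is the kind of loop a streaming demo would drive; and because the agent accepts the same message list as the model, swapping one for the other requires no changes on the caller's side.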

Harold-lkk merged commit 990828c into InternLM:main on Jan 31, 2024
1 check passed
Harold-lkk deleted the v0.2.0 branch on August 1, 2024 at 06:13