
Will multi-turn conversation for ChatGLM3 be supported? #419

Open
chenyangjun45 opened this issue Feb 22, 2024 · 2 comments

Comments

@chenyangjun45

I took a look at the fastllm source code, and it doesn't seem to support multi-turn conversation for ChatGLM3 yet. Are there plans to add ChatGLM3 multi-turn support?

@TylunasLi
Contributor

Unless I'm misunderstanding something, from a look at the current source code, the C++ API already supports multi-turn conversation for ChatGLM3.
If anything needs changing, code contributions are welcome.

@chenyangjun45
Author

chenyangjun45 commented Feb 23, 2024

class ChatglmModel(BaseModel):
    def process_response(self, response):
        response = response.strip()
        response = response.replace("[[训练时间]]", "2023年")
        return response

    def is_stop(self, token_id):
        return token_id <= 2

    def build_input(self, query, history=None):
        # Builds the ChatGLM/ChatGLM2-style "[Round N]" multi-turn prompt.
        if not history:
            history = []
        prompt = ""
        for i, (old_query, response) in enumerate(history):
            prompt += "[Round {}]\n问:{}\n答:{}\n".format(i, old_query, response)
        prompt += "[Round {}]\n问:{}\n答:".format(len(history), query)
        return prompt

In pyfastllm----fastllm----models.py, this is still the ChatGLM2 multi-turn prompt format; I don't see any ChatGLM3 multi-turn handling.
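For context, ChatGLM3 moved away from the `[Round N]\n问:…\n答:…` layout to role tags (`<|user|>`, `<|assistant|>`, and so on). The helper below is a hypothetical sketch of what a ChatGLM3-style `build_input` could look like, with the role tags written as plain text; it is not part of pyfastllm, and a real implementation would map these tags to the model's special token ids rather than concatenate strings.

```python
def build_chatglm3_input(query, history=None):
    """Hypothetical ChatGLM3 multi-turn prompt builder (role-tag style).

    Each turn is rendered as a <|user|> block followed by an
    <|assistant|> block, and the final <|assistant|> tag is left open
    so the model continues from there.
    """
    if not history:
        history = []
    prompt = ""
    for old_query, response in history:
        prompt += "<|user|>\n{}\n<|assistant|>\n{}\n".format(old_query, response)
    # Current query: open an assistant block for the model to fill in.
    prompt += "<|user|>\n{}\n<|assistant|>\n".format(query)
    return prompt
```

A `Chatglm3Model` subclass could override `build_input` with this logic while reusing `process_response` and `is_stop` from `ChatglmModel`.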
@TylunasLi
