
Create new release and ship to PyPI #1560

Closed
veekaybee opened this issue Mar 11, 2024 · 9 comments · Fixed by #1573

@veekaybee
Contributor

There have been a number of changes since the January release; would it be possible to cut a release, 0.4.2, and ship it to PyPI? Right now we're using this dependency in our own stack and it would be great to unpin it from a git hash. Happy to do this myself if it's something I have privileges for as a contributor. Thanks!

@haileyschoelkopf
Collaborator

Definitely, I was thinking we should do this; will release sometime this week!

cc @lintangsutawika

@artemorloff
Contributor

Please consider merging #1571 and #1569 before the version bump.

@haileyschoelkopf haileyschoelkopf linked a pull request Mar 13, 2024 that will close this issue
@LSinev
Contributor

LSinev commented Mar 14, 2024

Hopefully this implementation, #1578, will find its way into the new release too (it addresses the long-awaited #1543).

@haileyschoelkopf
Collaborator

Hi @artemorloff ! Thank you very much for the PRs, I will review #1571 today or tomorrow. However, I don't think we'll include them in the new version release, in the interest of not introducing large changes that haven't had time to settle.

Likewise, regarding #1578 -- chat templating is still something both requested and desired, but we haven't merged it yet because we've been seeing inconsistent results including decreased scores when applying a chat template. Hope to get it added soon nevertheless, but likewise would prefer to put out v0.4.2 now and follow up with a v0.4.3 relatively soon if some other major changes get landed.

@LSinev
Contributor

LSinev commented Mar 14, 2024

May I suggest using the GitHub milestones feature, as it seems it was used before? These PRs with big changes appeared now, I think, because this announcement issue was opened just 3 days ago, so people with requested and desired implementations are rushing to get them done quickly to be included on PyPI. With milestones, some planning and announcing could be done in advance.

@haileyschoelkopf
Collaborator

Thank you @LSinev for the feedback, well taken and appreciated! Will work on getting things better-documented for the future for certain, and set up milestones for visibility.

I'm still leaning toward putting out v0.4.2 ASAP since it's been a while and subsequently having much more frequent minor releases to keep PyPI more up-to-date than it has been, which I hope might be a good middle ground here, but if this is significantly disappointing to contributors then would like to know.

@kcz358

kcz358 commented Mar 15, 2024

Hi, I asked about the chat template issue in #1543, and @LSinev kindly guided me to this PR and some other PRs and issues that might solve my problem. He also suggested I leave some thoughts on certain PRs.

I read some of the core changes in #1578 that might help solve this chat templating issue, and here are my thoughts:

Pros:
This PR supports chat templating by revising the tok_batch_encode function. To enable the apply_chat_template function, we simply need to pass --model_args "is_chat_model=True,apply_template=True". It conveniently wraps the HF chat template function for use.
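Conceptually, the wrapping described above turns each flat prompt string into a one-turn chat before tokenization. A minimal self-contained sketch of that idea (the helper names below are hypothetical, and a simple ChatML-style template is emulated in plain Python so nothing needs to be downloaded; in HF transformers this job is done by tokenizer.apply_chat_template):

```python
def apply_simple_chat_template(messages):
    """Render a list of {role, content} dicts into a ChatML-style string.
    This emulates what a tokenizer's chat template would produce."""
    rendered = ""
    for msg in messages:
        rendered += f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n"
    # Append the generation prompt so the model continues as the assistant.
    rendered += "<|im_start|>assistant\n"
    return rendered


def wrap_prompt(prompt, apply_template=True):
    """Mimic the PR's behavior: treat every input string as a user turn."""
    if not apply_template:
        return prompt
    return apply_simple_chat_template([{"role": "user", "content": prompt}])


print(wrap_prompt("What is 2+2?"))
```

This also makes the limitation below concrete: because the prompt is always wrapped as a single "user" message, there is no hook for system messages or multi-turn few-shot examples.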

Cons:
I am somewhat concerned about how this revision would apply across different cases, since it assumes every input string is a user-role message. Say I want to add a different system message to ensure best performance, or a user/assistant few-shot dialogue as in-context examples; I am not sure how this can be achieved through this small revision.

Lastly, I feel this feature needs more changes before merging or releasing. One possible solution is allowing the user to pass in a system prompt, say --system "You are a helpful assistant", so a custom system prompt is applied to every request. For assistant roles that could appear in the context, the few-shot context function might need to be revised.
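To illustrate the suggestion above: supporting a system prompt and few-shot turns means assembling a full message list rather than a single user turn. A hedged sketch (the `--system` flag and the helper below are hypothetical, not part of the PR):

```python
def build_messages(question, system=None, fewshot=()):
    """Assemble a chat message list: an optional system prompt, then
    alternating user/assistant few-shot pairs, then the actual question."""
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    for user_turn, assistant_turn in fewshot:
        messages.append({"role": "user", "content": user_turn})
        messages.append({"role": "assistant", "content": assistant_turn})
    messages.append({"role": "user", "content": question})
    return messages


msgs = build_messages(
    "What is 3+3?",
    system="You are a helpful assistant",
    fewshot=[("What is 2+2?", "4")],
)
# The resulting list could then be handed to a chat-template renderer
# such as tokenizer.apply_chat_template in HF transformers.
```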

And again, thank you for your great work!

@haileyschoelkopf
Collaborator

Thank you for the feedback @kcz358 !

If you'd like to also check out #1287, that PR allows system prompts to be passed in. The problem of dividing context appropriately between user / assistant content is something we will certainly need to fix before merging in support.

@artemorloff
Contributor

Hi @artemorloff ! Thank you very much for the PRs, I will review #1571 today or tomorrow. However, I don't think we'll include them in the new version release, in the interest of not introducing large changes that haven't had time to settle.

Hi @haileyschoelkopf ! I was just wondering if there was any follow-up to my change request. Would it be possible to provide an estimate of when you might be able to review the changes? I'm eager to incorporate any feedback you may have and move forward with the development process.
