Wish to convert flan-t5 model to GGUF format #3393
Comments
No one has converted the flan-t5 architecture to ggml yet, but if that happens it will be in the ggml repository; see ggerganov/ggml#12
This issue was closed because it has been inactive for 14 days since being marked as stale.
Prerequisites
Please answer the following questions for yourself before submitting an issue.
Expected Behavior
I was trying to convert the google/flan-t5-large model to GGUF format using this colab.
I am importing the model this way
Current Behavior
I know that the current convert.py execution fails because this type of model isn't supported.
current error:
Environment and Context
I am currently running all of this in a google colab notebook
I request help to accomplish this conversion. Can someone please suggest a method to convert this flan model to GGUF?
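For readers landing here later: one possible approach is llama.cpp's newer `convert_hf_to_gguf.py` script, which gained T5-family support well after this issue was filed. A sketch only; the path and output name below are placeholders, and flags should be checked against the release you use:

```shell
# Sketch: assumes a llama.cpp checkout recent enough that
# convert_hf_to_gguf.py supports T5-family architectures.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
pip install -r requirements.txt

# Convert a locally downloaded copy of google/flan-t5-large.
python convert_hf_to_gguf.py /path/to/flan-t5-large \
    --outfile flan-t5-large.gguf
```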