Support AlphaFold2 #265
AlphaFold2 is heavily used by the life science community to predict protein structures. AlphaFold is written in JAX and already runs on CPUs. However, it would be amazing to have a ggml-based CPU version with all the speed-ups achieved through quantization.
DeepMind has recently implemented bfloat16 inference with essentially equivalent prediction quality. I think further quantization should be possible and beneficial.
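To illustrate why further quantization is plausible: ggml's simplest integer format stores weights in small blocks, each with one float scale plus int8 values, which roughly quarters memory traffic versus float32. The sketch below (plain numpy, purely illustrative; it mimics the idea behind ggml's Q8_0 format but is not ggml's actual layout or code) shows the round-trip and the bounded reconstruction error:

```python
import numpy as np

def quantize_q8_0(x: np.ndarray, block: int = 32):
    """Per-block symmetric int8 quantization (Q8_0-style sketch)."""
    blocks = x.reshape(-1, block)
    # one scale per block so outliers only affect their own block
    scale = np.abs(blocks).max(axis=1, keepdims=True) / 127.0
    scale = np.where(scale == 0, 1.0, scale)  # avoid division by zero
    q = np.round(blocks / scale).astype(np.int8)
    return q, scale.astype(np.float32)

def dequantize_q8_0(q: np.ndarray, scale: np.ndarray) -> np.ndarray:
    return q.astype(np.float32) * scale

# hypothetical weight matrix standing in for a model tensor
rng = np.random.default_rng(0)
weights = rng.standard_normal((4, 64)).astype(np.float32)

q, s = quantize_q8_0(weights)
recon = dequantize_q8_0(q, s).reshape(weights.shape)

# rounding error is bounded by half a quantization step per element
max_err = np.abs(weights - recon).max()
```

Whether AlphaFold2's Evoformer and structure-module weights tolerate this as gracefully as large language models do is an open question, but the bfloat16 result above suggests there is headroom.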
Additionally, having a dependency-free (non-Python) implementation would help us a lot in ColabFold.