
block-sparse flash attention support #851

Open
jordiclive opened this issue Mar 22, 2023 · 3 comments
Labels
feature request New feature or request good first issue Good for newcomers

Comments

@jordiclive

I saw flash attention was recently merged.

It would be cool to have this approximate (block-sparse) variant as well, for training on very large sequence lengths. https://github.com/HazyResearch/flash-attention/blob/main/flash_attn/flash_blocksparse_attention.py
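For context on what the linked file computes: block-sparse attention restricts each query block to a subset of key/value blocks via a coarse block-level mask, so whole blocks of the score matrix are skipped. Below is a minimal NumPy sketch of just the semantics (the function and `blockmask` argument names are my own, loosely following the linked file; the real implementation is a fused CUDA kernel that never materializes the full mask, which is where the memory/speed win comes from):

```python
import numpy as np

def block_sparse_attention(q, k, v, block_size, blockmask):
    """Naive dense reference for block-sparse attention semantics.

    Positions inside blocks where `blockmask` is False are excluded from
    the softmax. Illustrative only; the fused kernel skips masked blocks
    entirely instead of building the full (seq, seq) mask.
    """
    d = q.shape[-1]
    scores = (q @ k.T) / np.sqrt(d)
    # Expand the coarse (n_blocks, n_blocks) mask to a full (seq, seq) mask.
    full_mask = np.kron(
        blockmask.astype(np.int8),
        np.ones((block_size, block_size), dtype=np.int8),
    ).astype(bool)
    scores = np.where(full_mask, scores, -np.inf)
    # Numerically stable softmax; masked entries contribute exp(-inf) = 0.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Tiny usage example: 8 tokens split into 2x2 blocks of size 4, with a
# block-causal pattern (each block attends to itself and earlier blocks).
rng = np.random.default_rng(0)
seq_len, head_dim, bs = 8, 4, 4
q, k, v = (rng.standard_normal((seq_len, head_dim)) for _ in range(3))
blockmask = np.array([[True, False],
                      [True, True]])
out = block_sparse_attention(q, k, v, bs, blockmask)  # shape (8, 4)
```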

@jordiclive jordiclive added the feature request New feature or request label Mar 22, 2023
@StellaAthena StellaAthena added the good first issue Good for newcomers label Apr 30, 2023
@natek-1

natek-1 commented May 8, 2023

Hello,
I am new to the EleutherAI team and I think this would be a good issue to try to solve. May I be assigned this task, please?

@StellaAthena
Member

@natek-1 Welcome! Thank you for your contribution.

@dashstander
Contributor

Hey @natek-1 , do you have any updates on this? It's totally alright if you haven't gotten a chance to look at it. Would it be alright if we assigned it to someone else?
