Ronsor/rwkv-simple
rwkv_simple

rwkv_simple is an easy-to-use implementation of RWKV-6 (x060). It supports three WKV kernels: a Triton-based kernel from Flash Linear Attention, the official CUDA kernel, and a pure-PyTorch fallback.
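For intuition about what a WKV kernel computes, here is a minimal single-head reference sketch of the RWKV-6 recurrence in NumPy: the state is a key-value outer-product accumulator with a data-dependent per-channel decay `w_t` and a "bonus" term `u` applied to the current token. This is an illustrative sketch only (function name and argument layout are our own, not this library's API), and a real kernel would batch over heads and sequences.

```python
import numpy as np

def wkv6_recurrent(r, k, v, w, u):
    """Naive per-timestep WKV6 recurrence for a single head (reference sketch).

    r, k, w: (T, K) arrays; v: (T, V); u: (K,) per-channel bonus.
    w holds per-token decay factors in (0, 1].
    Returns outputs o of shape (T, V).
    """
    T, K = k.shape
    V = v.shape[1]
    S = np.zeros((K, V))  # running key-value state
    o = np.zeros((T, V))
    for t in range(T):
        kv = np.outer(k[t], v[t])            # rank-1 update for this token
        o[t] = r[t] @ (S + u[:, None] * kv)  # read past state + bonus-weighted current token
        S = w[t][:, None] * S + kv           # per-channel decay, then accumulate
    return o
```

The optimized kernels (Triton, CUDA) compute the same quantity in parallel chunks rather than one timestep at a time, which is why installing FLA matters for throughput.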

Flash Linear Attention

We use the WKV6 kernel from the Flash Linear Attention (FLA) project by default, so for best performance, please install FLA directly from its repository:

pip install -U git+https://github.com/sustcsonglin/flash-linear-attention


License

Copyright © 2024 Ronsor Labs. Licensed under the Apache License, version 2.0.
