Raphael-Hao (Mogic)
Starred repositories

18 stars written in CUDA

LLM training in simple, raw C/CUDA

CUDA · 21,652 stars · 2,356 forks · Updated Jul 12, 2024

Tile primitives for speedy kernels

CUDA · 1,374 stars · 47 forks · Updated Jul 12, 2024

How to optimize some algorithms in CUDA.

CUDA · 1,200 stars · 99 forks · Updated Jul 12, 2024

GPU database engine

CUDA · 1,171 stars · 120 forks · Updated Jan 30, 2017

[MICRO'23, MLSys'22] TorchSparse: Efficient Training and Inference Framework for Sparse Convolution on GPUs.

CUDA · 1,156 stars · 131 forks · Updated Jul 11, 2024

FlashInfer: Kernel Library for LLM Serving

CUDA · 786 stars · 69 forks · Updated Jul 12, 2024

Examples demonstrating available options to program multiple GPUs in a single node or a cluster

CUDA · 490 stars · 101 forks · Updated Mar 14, 2024

Yinghan's Code Sample

CUDA · 259 stars · 47 forks · Updated Jul 25, 2022

Flash-LLM: Enabling Cost-Effective and Highly-Efficient Large Generative Model Inference with Unstructured Sparsity

CUDA · 158 stars · 12 forks · Updated Sep 24, 2023

[ICML 2024] Quest: Query-Aware Sparsity for Efficient Long-Context LLM Inference

CUDA · 83 stars · 5 forks · Updated Jul 3, 2024

Playing with GEMM using TVM

CUDA · 79 stars · 10 forks · Updated Jul 22, 2023

[EuroSys'24] Minuet: Accelerating 3D Sparse Convolutions on GPUs

CUDA · 68 stars · 2 forks · Updated Jun 7, 2024

CUDA · 22 stars · 4 forks · Updated Jun 20, 2024

CUDA · 18 stars · 5 forks · Updated Apr 10, 2024

CUDA · 7 stars · 1 fork · Updated Aug 9, 2023

Gallatin is a general-purpose memory manager for CUDA that allows threads to quickly malloc and free memory of arbitrary size inside kernels (see the sketch after this list).

CUDA · 4 stars · Updated Mar 4, 2024

CUDA · 3 stars · Updated Feb 7, 2023
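
To illustrate the pattern Gallatin accelerates, here is a minimal sketch of in-kernel dynamic allocation using CUDA's built-in device-side malloc/free rather than Gallatin's own API; the kernel name, buffer size, and heap limit are illustrative assumptions, not part of the library.

```cuda
#include <cuda_runtime.h>

// Each thread allocates a scratch buffer on the device heap inside the
// kernel, uses it, and frees it before returning. Gallatin replaces this
// built-in allocator with a faster one; the calling pattern is the same.
__global__ void scratch_kernel(int n) {
    int tid = blockIdx.x * blockDim.x + threadIdx.x;
    if (tid >= n) return;

    // Per-thread allocation of arbitrary size inside the kernel.
    int *buf = static_cast<int *>(malloc(16 * sizeof(int)));
    if (buf == nullptr) return;  // device heap exhausted

    for (int i = 0; i < 16; ++i) buf[i] = tid + i;
    // ... use buf ...
    free(buf);
}

int main() {
    // The default device heap is small; enlarge it before launching any
    // kernel that allocates heavily (64 MB here is an arbitrary choice).
    cudaDeviceSetLimit(cudaLimitMallocHeapSize, 64ull * 1024 * 1024);
    scratch_kernel<<<4, 256>>>(1024);
    cudaDeviceSynchronize();
    return 0;
}
```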