KEK · Tsukuba, Japan · (UTC +04:00) · in/deshiyer
Starred repositories
Free and Open Source, Distributed, RESTful Search Engine
A simple and efficient Mamba implementation in pure PyTorch and MLX.
Simple, minimal implementation of the Mamba SSM in one file of PyTorch.
A repository for compiling graduate application materials for prospective computer science graduate students (Masters & PhD).
Online Cyberpunk 2077 hacking minigame solver.
Bayesian Modeling and Probabilistic Programming in Python
A good looking terminal emulator which mimics the old cathode display...
Approximate Nearest Neighbors in C++/Python optimized for memory usage and loading/saving to disk
Finetune Llama 3.2, Mistral, Phi & Gemma LLMs 2-5x faster with 80% less memory
A massively parallel, high-level programming language
A library of quantum algorithms for Qiskit.
This code implements a Radial Basis Function (RBF) based Kolmogorov-Arnold Network (KAN) for function approximation.
Qiskit Nature is an open-source quantum computing framework for solving quantum mechanical natural science problems.
An improved implementation of the Kolmogorov-Arnold Network
A massively parallel, optimal functional runtime in Rust
A comprehensive collection of KAN(Kolmogorov-Arnold Network)-related resources, including libraries, projects, tutorials, papers, and more, for researchers and developers in the Kolmogorov-Arnold N…
[EMNLP'23, ACL'24] To speed up LLMs' inference and enhance their perception of key information, compress the prompt and KV-Cache, achieving up to 20x compression with minimal performance loss.
TensorFlow code for the neural network presented in the paper: "Structural Language Models of Code" (ICML'2020)
A curated list of awesome Mojo 🔥 frameworks, libraries, software and resources
This repository contains demos I made with the Transformers library by HuggingFace.
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
Course to get into Large Language Models (LLMs) with roadmaps and Colab notebooks.
Home of StarCoder: fine-tuning & inference!
Default set of Data Lab notebooks, by DL team and contributed by users