Bournemouth University
- Bournemouth / London, United Kingdom
- nicolay-r.github.io
- in/nicolay-rusnachenko-b98635193
- @nicolayr_
- https://arekit.io
Stars (language: Python, sorted by most stars)
- ChatGLM-6B: An Open Bilingual Dialogue Language Model
- TensorFlow code and pre-trained models for BERT
- OpenAssistant is a chat-based assistant that understands tasks, can interact with third-party systems, and retrieve information dynamically to do so.
- The simplest, fastest repository for training/finetuning medium-sized GPTs.
- PyTorch Tutorial for Deep Learning Researchers
- Code and documentation to train Stanford's Alpaca models, and generate the data.
- Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
- The ChatGPT Retrieval Plugin lets you easily find personal or work documents by asking questions in natural language.
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
- 💬 Open-source machine learning framework to automate text- and voice-based conversations: NLU, dialogue management, connections to Slack, Facebook, and more - create chatbots and voice assistants
- Multilingual Sentence & Image Embeddings with BERT
- Finetune Llama 3, Mistral, Phi & Gemma LLMs 2-5x faster with 80% less memory
- The official repo of Qwen (通义千问), the chat & pretrained large language model proposed by Alibaba Cloud.
- Fast and memory-efficient exact attention
- A framework for training and evaluating AI models on a variety of openly available dialogue datasets.
- The standard data-centric AI package for data quality and machine learning with messy, real-world data and labels.
- Train transformer language models with reinforcement learning.
- Minimal, clean code for the Byte Pair Encoding (BPE) algorithm commonly used in LLM tokenization.
- Large Language Model Text Generation Inference
- Reverse-engineered API of Microsoft's Bing Chat AI
- The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
- BertViz: Visualize Attention in NLP Models (BERT, GPT-2, BART, etc.)
- An open-source library for deep learning end-to-end dialog systems and chatbots.
- Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer"
- Leveraging BERT and c-TF-IDF to create easily interpretable topics.
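The Byte Pair Encoding algorithm referenced in the list above can be sketched minimally: training repeatedly fuses the most frequent adjacent token pair into a single new token. This is an illustrative sketch under my own naming (`train_bpe`), not the code from the starred repository, which operates on byte sequences rather than characters.

```python
from collections import Counter

def train_bpe(text, num_merges):
    """Toy BPE trainer: start from characters, learn `num_merges` merges."""
    tokens = list(text)  # real tokenizers start from raw bytes
    merges = []
    for _ in range(num_merges):
        # Count all adjacent token pairs in the current sequence.
        pairs = Counter(zip(tokens, tokens[1:]))
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Replace every occurrence of the best pair with the fused token.
        merged, i = [], 0
        while i < len(tokens):
            if i < len(tokens) - 1 and (tokens[i], tokens[i + 1]) == best:
                merged.append(tokens[i] + tokens[i + 1])
                i += 2
            else:
                merged.append(tokens[i])
                i += 1
        tokens = merged
    return tokens, merges

tokens, merges = train_bpe("aaabdaaabac", 2)
```

Encoding a new string then simply replays the learned merges in order, which is why the merge list (not the final vocabulary alone) is what LLM tokenizers ship.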