- LMU Munich
- Munich, Germany
- shengqiang-zhang.github.io
Starred repositories (Language: Python, sorted by most stars)
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Models and examples built with TensorFlow
Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
An open-source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyperparameter tuning.
A very simple framework for state-of-the-art Natural Language Processing (NLP)
Python Implementation of Reinforcement Learning: An Introduction
An open-source NLP research library, built on PyTorch.
100+ Chinese Word Vectors (over a hundred pre-trained Chinese word embeddings)
A framework for training and evaluating AI models on a variety of openly available dialogue datasets.
This repository contains code examples for the Stanford course: TensorFlow for Deep Learning Research.
Train transformer language models with reinforcement learning.
Distributed Asynchronous Hyperparameter Optimization in Python
A natural language modeling framework based on PyTorch
TensorFlow Neural Machine Translation Tutorial
Translate Darknet to TensorFlow. Load trained weights, retrain/fine-tune using TensorFlow, export a constant graph def to mobile devices.
MMdnn is a set of tools to help users inter-operate among different deep learning frameworks. E.g. model conversion and visualization. Convert models between Caffe, Keras, MXNet, Tensorflow, CNTK, …
A Python implementation of LightFM, a hybrid recommendation algorithm.
A repo for distributed training of language models with Reinforcement Learning from Human Feedback (RLHF)
A TensorFlow Implementation of the Transformer: Attention Is All You Need
micronet, a model compression and deployment lib. Compression: 1. quantization: quantization-aware training (QAT), High-Bit (>2b) (DoReFa/Quantization and Training of Neural Networks for Efficient Integer-…
A modular RL library to fine-tune language models to human preferences
Named Entity Recognition (LSTM + CRF) - Tensorflow
NCRF++, a Neural Sequence Labeling Toolkit. Easy to use for any sequence labeling task (e.g. NER, POS, segmentation). It includes character LSTM/CNN, word LSTM/CNN, and softmax/CRF components.
Chinese NER using Lattice LSTM. Code for ACL 2018 paper.
Graph4nlp is the library for the easy use of Graph Neural Networks for NLP. Welcome to visit our DLG4NLP website (https://dlg4nlp.github.io/index.html) for various learning resources!
Tensorflow implementation of contextualized word representations from bi-directional language models