- Stability.ai, Eleuther.ai
- Seattle, WA
- https://dmarx.github.io
- @DigThatData
NLP
[ACL-IJCNLP 2021] Automated Concatenation of Embeddings for Structured Prediction
[AAAI 2019] A Unified Model for Opinion Target Extraction and Target Sentiment Prediction
Code for our NAACL 2019 paper: "BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis"
Source code for "UniRE: A Unified Label Space for Entity Relation Extraction.", ACL2021. It is based on our NERE toolkit (https://github.com/Receiling/NERE).
🗣️ NALP is a library that covers Natural Adversarial Language Processing.
Ongoing research training transformer models at scale
An efficient implementation of the popular sequence models for text generation, summarization, and translation tasks. https://arxiv.org/pdf/2106.04718.pdf
Graph4nlp is the library for the easy use of Graph Neural Networks for NLP. Welcome to visit our DLG4NLP website (https://dlg4nlp.github.io/index.html) for various learning resources!
CRFsuite: a fast implementation of Conditional Random Fields (CRFs)
🔍 LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your d…
Kashgari is a production-level NLP Transfer learning framework built on top of tf.keras for text-labeling and text-classification, includes Word2Vec, BERT, and GPT2 Language Embedding.
Trankit is a Light-Weight Transformer-based Python Toolkit for Multilingual Natural Language Processing
Multi-Task Deep Neural Networks for Natural Language Understanding
multi_task_NLP is a utility toolkit enabling NLP developers to easily train and infer a single model for multiple tasks.
A scalable Gensim implementation of "Learning Role-based Graph Embeddings" (IJCAI 2018).
Stanford NLP Python library for tokenization, sentence segmentation, NER, and parsing of many human languages
Implementation of RETRO, Deepmind's Retrieval based Attention net, in Pytorch
Code for ALBEF: a new vision-language pre-training method
PyTorch code for BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation
Multilingual Sentence & Image Embeddings with BERT
RWKV is an RNN with transformer-level LLM performance. It can be directly trained like a GPT (parallelizable). So it's combining the best of RNN and transformer - great performance, fast inference,…
💫 Industrial-strength Natural Language Processing (NLP) in Python
Official implementations for various pre-training models of ERNIE-family, covering topics of Language Understanding & Generation, Multimodal Understanding & Generation, and beyond.
Code and model for the paper "Improving Language Understanding by Generative Pre-Training"
Topic Evolution Analysis - an algorithm for analyzing knowledge flow in text based corpora
Tree-structured Long Short-Term Memory networks (https://arxiv.org/abs/1503.00075)
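The CRFsuite entry above relies on Viterbi decoding to pick the best tag sequence for a sentence. As a rough illustration of that idea only — the scores and tag names below are made-up toy values, and this is not CRFsuite's API — a minimal pure-Python sketch:

```python
# Toy linear-chain Viterbi decoding, in the spirit of the CRF entry above.
# All scores here are illustrative numbers, not output from CRFsuite.

def viterbi(emissions, transitions, tags):
    """Return the highest-scoring tag sequence for one sentence.

    emissions:   list of {tag: score} dicts, one per token
    transitions: {(prev_tag, tag): score} for tag-to-tag moves
    tags:        iterable of possible tags
    """
    # best[t] = (score of the best path ending in tag t, that path)
    best = {t: (emissions[0].get(t, 0.0), [t]) for t in tags}
    for emit in emissions[1:]:
        new_best = {}
        for t in tags:
            # Extend the best previous path into tag t.
            score, path = max(
                (prev_score + transitions.get((p, t), 0.0), prev_path)
                for p, (prev_score, prev_path) in best.items()
            )
            new_best[t] = (score + emit.get(t, 0.0), path + [t])
        best = new_best
    return max(best.values())[1]


# Hypothetical BIO-style example: "B" then "I" wins because the
# B->I transition bonus outweighs the slightly higher "O" emission.
tags = ["B", "I", "O"]
transitions = {("O", "I"): -2.0, ("B", "I"): 1.0}
emissions = [{"B": 2.0, "O": 1.0}, {"I": 1.5, "O": 1.4}]
print(viterbi(emissions, transitions, tags))  # ['B', 'I']
```

Real CRF toolkits decode in log-space over learned feature weights; the dynamic program is the same shape as this sketch.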