torch_random_fields is a library for building Markov random fields (MRFs) with complex topology [1] [2] with PyTorch. It is optimized for batch training on GPUs.
The key features include:
- Easy to plug into your research code
- Support for batch acceleration of any random field with arbitrary binary or ternary connections on the GPU
- Fast training/inference with top-K logits, so you do not need to worry about overly large label spaces
- Support for context-aware transition matrices and low-rank factorization (see the sketch after this list)
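To give a rough feel for the low-rank idea (a minimal sketch, not the library's actual API; all tensor names here are hypothetical), a transition matrix over a large label set can be stored as two thin factors instead of one huge table:

```python
import torch

num_labels, rank = 10000, 64

# A full transition matrix would be num_labels x num_labels (10^8 entries).
# A low-rank factorization stores two thin factor matrices instead.
U = torch.randn(num_labels, rank)  # hypothetical factor matrices
V = torch.randn(num_labels, rank)

# Transition score between labels i and j is dot(U[i], V[j]).
# Scores for a batch of label pairs, without materializing U @ V.T:
i = torch.tensor([3, 42, 7])
j = torch.tensor([8, 0, 9999])
scores = (U[i] * V[j]).sum(-1)  # shape (3,)
```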
You may cite this project as:
@inproceedings{wang2022regularized,
  title={Regularized Molecular Conformation Fields},
  author={Lihao Wang and Yi Zhou and Yiqun Wang and Xiaoqing Zheng and Xuanjing Huang and Hao Zhou},
  booktitle={Advances in Neural Information Processing Systems},
  editor={Alice H. Oh and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho},
  year={2022},
  url={https://openreview.net/forum?id=7XCFxnG8nGS}
}
Check out the tutorial.
The well-known linear-chain CRF, which is widely adopted in sequence labeling (POS tagging, chunking, NER, etc.), is supported.
Check out the tutorial.
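For intuition, here is a minimal, self-contained PyTorch sketch of the linear-chain CRF log-likelihood: the gold-path score minus the log-partition computed by the forward algorithm. This is illustrative only, not the library's API:

```python
import torch

def crf_log_likelihood(emissions, transitions, tags):
    # emissions:   (batch, seq_len, num_labels) unary scores
    # transitions: (num_labels, num_labels), score of moving from label i to label j
    # tags:        (batch, seq_len) gold label indices
    batch, seq_len, num_labels = emissions.shape
    idx = torch.arange(batch)

    # Score of the gold path: unary scores plus pairwise transition scores.
    gold = emissions[idx, 0, tags[:, 0]]
    for t in range(1, seq_len):
        gold = gold + emissions[idx, t, tags[:, t]] + transitions[tags[:, t - 1], tags[:, t]]

    # Log-partition function via the forward algorithm.
    alpha = emissions[:, 0]                       # (batch, num_labels)
    for t in range(1, seq_len):
        # alpha'[b, j] = logsumexp_i(alpha[b, i] + transitions[i, j]) + emissions[b, t, j]
        alpha = torch.logsumexp(alpha.unsqueeze(2) + transitions, dim=1) + emissions[:, t]
    log_z = torch.logsumexp(alpha, dim=1)         # (batch,)

    return gold - log_z                           # (batch,) gold log-likelihoods

# Usage: maximize the likelihood, i.e. minimize its negation.
emissions = torch.randn(2, 7, 5, requires_grad=True)
transitions = torch.randn(5, 5, requires_grad=True)
tags = torch.randint(5, (2, 7))
loss = -crf_log_likelihood(emissions, transitions, tags).mean()
loss.backward()
```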
In torch_random_fields, any random field with arbitrary topology is supported. To be more precise, we require binary connections, although in some cases ternary connections are also supported (yes, I am lazy).
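Concretely, a random field with binary (pairwise) connections can be described by an edge list, so any topology fits the same representation. A toy sketch, with all names hypothetical and not the library's API:

```python
import torch

# Toy factor graph: 5 nodes, pairwise connections given as an edge list,
# so any topology (chain, skip links, grid, fully connected) fits.
num_nodes, num_labels = 5, 3
unary = torch.randn(num_nodes, num_labels)                       # node potentials
edges = torch.tensor([[0, 1], [1, 2], [2, 3], [3, 4], [0, 3]])   # last edge skips the chain
pairwise = torch.randn(len(edges), num_labels, num_labels)       # one score table per edge

def score(labels):
    """Score of a full labeling under unary + pairwise potentials."""
    s = unary[torch.arange(num_nodes), labels].sum()
    li, lj = labels[edges[:, 0]], labels[edges[:, 1]]
    s = s + pairwise[torch.arange(len(edges)), li, lj].sum()
    return s

print(score(torch.tensor([0, 1, 2, 0, 1])))
```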
Here we show a case of Dynamic Skip-Chain CRF, where:
- Some nodes (e.g., two nodes carrying the same word) are connected, which looks like skipping over the linear-chain connections [3]
- Only the top-3 labels for each node are kept, greatly speeding up training and inference [4] (see the sketch below)
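A minimal sketch of the top-K pruning idea with torch.topk (illustrative only; the tensor names are made up):

```python
import torch

batch, seq_len, num_labels, K = 2, 6, 50000, 3
logits = torch.randn(batch, seq_len, num_labels)
transitions = torch.randn(num_labels, num_labels)

# Keep only the K highest-scoring labels per node; downstream message
# passing then touches K x K transition entries per edge instead of the
# full num_labels x num_labels table.
topk_scores, topk_labels = logits.topk(K, dim=-1)  # both (batch, seq_len, K)

# Restrict the transition matrix to the kept labels of adjacent nodes:
sub = transitions[topk_labels[:, :-1].unsqueeze(-1),
                  topk_labels[:, 1:].unsqueeze(-2)]
# sub: (batch, seq_len - 1, K, K) transition scores between kept labels
```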
The Ising model (or Potts model) is widely used in statistical physics and computational biology [5]. In this case, the random variables form a grid, but they can also be fully connected.
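For example, the 4-neighborhood edge list of an H x W grid can be built with a few tensor ops (a sketch, not the library's API; the coupling J is a made-up constant):

```python
import torch

H, W, num_states = 4, 4, 2  # 2 states -> Ising; more than 2 -> Potts

# Build the 4-neighborhood edge list of an H x W grid.
idx = torch.arange(H * W).view(H, W)
right = torch.stack([idx[:, :-1].reshape(-1), idx[:, 1:].reshape(-1)], dim=1)
down = torch.stack([idx[:-1, :].reshape(-1), idx[1:, :].reshape(-1)], dim=1)
edges = torch.cat([right, down])  # (num_edges, 2)

# Potts energy: -J for each pair of agreeing neighbors, 0 otherwise.
J = 1.0
spins = torch.randint(num_states, (H * W,))
energy = -J * (spins[edges[:, 0]] == spins[edges[:, 1]]).sum()
```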
Supported training objectives:
- Linear-Chain CRF:
  - maximum likelihood estimation
  - structured perceptron
  - piecewise training
  - pseudo-likelihood
- General CRF:
  - structured perceptron
  - piecewise training
  - pseudo-likelihood (see the sketch after this list)
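As a reference point, pseudo-likelihood replaces the global partition function with per-node conditionals given the gold labels of the neighbors. A minimal sketch over the edge-list representation shown earlier (illustrative, not the library's implementation):

```python
import torch

def pseudo_log_likelihood(unary, edges, pairwise, labels):
    # unary:    (num_nodes, num_labels) node potentials
    # edges:    (num_edges, 2) node index pairs
    # pairwise: (num_edges, num_labels, num_labels) edge score tables
    # labels:   (num_nodes,) gold labels
    num_nodes, num_labels = unary.shape
    logits = unary.clone()
    # Each edge (i, j) contributes pairwise[e, :, y_j] to node i's
    # conditional logits and pairwise[e, y_i, :] to node j's.
    for e, (i, j) in enumerate(edges.tolist()):
        logits[i] += pairwise[e, :, labels[j]]
        logits[j] += pairwise[e, labels[i], :]
    log_probs = logits.log_softmax(dim=-1)
    # sum_i log P(y_i | y_neighbors(i))
    return log_probs[torch.arange(num_nodes), labels].sum()
```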
Supported inference methods:
- Linear-Chain CRF:
  - Viterbi decoding
  - batch loopy belief propagation
  - batch mean field variational inference
- General CRF:
  - batch loopy belief propagation
  - naive mean field variational inference (see the sketch after this list)
  - batch naive mean field inference
  - Gibbs sampling
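For intuition, naive mean field keeps an independent distribution q_i per node and iteratively refits it against the expected pairwise scores of its neighbors. A minimal sketch over the same hypothetical edge-list representation (illustrative, not the library's implementation):

```python
import torch

def mean_field(unary, edges, pairwise, num_iters=10):
    """Naive mean-field inference; returns per-node marginals (num_nodes, num_labels)."""
    num_nodes, num_labels = unary.shape
    q = unary.softmax(dim=-1)
    for _ in range(num_iters):
        logits = unary.clone()
        # Each edge sends the pairwise score expected under the
        # current marginal of the other endpoint.
        for e, (i, j) in enumerate(edges.tolist()):
            logits[i] += pairwise[e] @ q[j]   # E_{q_j}[pairwise[e, y_i, y_j]]
            logits[j] += q[i] @ pairwise[e]   # E_{q_i}[pairwise[e, y_i, y_j]]
        q = logits.softmax(dim=-1)            # parallel (batch-friendly) update
    return q
```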
Some implementations borrow from these great projects, with modifications:
- linear-chain crf: pytorch-crf, fairseq
- loopy belief propagation: pystruct
[1] An Introduction to Conditional Random Fields (Sutton and McCallum, 2010)
[2] Graphical Models, Exponential Families, and Variational Inference (Wainwright and Jordan, 2008)
[3] A Skip-Chain Conditional Random Field for Ranking Meeting Utterances by Importance (Galley, 2006)
[4] Fast Structured Decoding for Sequence Models (Sun, 2020)
[5] Improved contact prediction in proteins: Using pseudolikelihoods to infer Potts models (Ekeberg, 2013)