Commit

init
JingfengYang committed Mar 1, 2022
1 parent 55ba806 commit ca5a75f
Showing 1 changed file with 15 additions and 0 deletions.
README.md: 15 additions & 0 deletions
@@ -25,6 +25,9 @@ There are many advances of using unified models (e.g. Transformer) to create rep
- [Table](#table)
- [Knowledge Graph](#knowledge-graph)
- [Retrieval Paragraphs as Knowledge](#retrieval-paragraphs-as-knowledge)
- [Biology/Chemistry](#biologychemistry)
- [Protein](#protein)
- [Molecular](#molecular)
- [Modality Fusion](#modality-fusion)
- [Vision and Natural Language](#vision-and-natural-language)

@@ -88,6 +91,8 @@ There are many advances of using unified models (e.g. Transformer) to create rep

* [GraphCodeBERT: Pre-training Code Representations with Data Flow](https://arxiv.org/pdf/2009.08366.pdf), ICLR 2021.

* [Transformer Embeddings of Irregularly Spaced Events and Their Participants](https://arxiv.org/abs/2201.00044), ICLR 2022.

* [Competition-Level Code Generation with AlphaCode](https://storage.googleapis.com/deepmind-media/AlphaCode/competition_level_code_generation_with_alphacode.pdf).

## Structured Knowledge
@@ -132,6 +137,16 @@ There are many advances of using unified models (e.g. Transformer) to create rep

* [Spider: Learning to Retrieve Passages without Supervision](https://www.cs.tau.ac.il/~oriram/spider.pdf), arXiv, Dec 2021.

## Biology/Chemistry

### Protein

* [Transformer protein language models are unsupervised structure learners](https://openreview.net/pdf?id=fylclEqgvgd), ICLR 2021.

### Molecular

* [Graphormer: Do Transformers Really Perform Bad for Graph Representation?](https://arxiv.org/pdf/2106.05234.pdf), NeurIPS 2021.

## Modality Fusion

### Vision and Natural Language
