tuananhfrtk/Combining-Contextual-words-and-KG-embeddings

Combining-Contextual-words-and-KG-embeddings

About

Contextual embeddings can encode word meaning and polysemy to some degree. Richer semantic information, however, requires representations beyond plain text, such as knowledge graphs (KGs). The goal of this project is to design a model that combines contextual and KG embeddings.
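
The description above does not spell out how the two embedding types are fused in this repository. As a rough illustration only, the sketch below shows one common approach: concatenating a contextual token vector (e.g., from BERT) with the embedding of its linked KG entity (e.g., from TransE) and projecting the result into a joint space. The use of PyTorch, the class name, and all dimensions are assumptions for illustration, not this project's actual code.

```python
import torch
import torch.nn as nn

class FusionLayer(nn.Module):
    """Hypothetical sketch: fuse a contextual word embedding with a KG entity
    embedding by concatenation followed by a linear projection."""

    def __init__(self, ctx_dim: int = 768, kg_dim: int = 200, out_dim: int = 300):
        super().__init__()
        # Project the concatenated vector into a shared output space.
        self.proj = nn.Linear(ctx_dim + kg_dim, out_dim)

    def forward(self, ctx_emb: torch.Tensor, kg_emb: torch.Tensor) -> torch.Tensor:
        # ctx_emb: (batch, ctx_dim) contextual embedding of a token or mention
        # kg_emb:  (batch, kg_dim) embedding of the KG entity linked to that mention
        fused = torch.cat([ctx_emb, kg_emb], dim=-1)
        return torch.tanh(self.proj(fused))

# Usage with random tensors standing in for real embeddings.
layer = FusionLayer()
ctx = torch.randn(4, 768)
kg = torch.randn(4, 200)
print(layer(ctx, kg).shape)  # torch.Size([4, 300])
```

Concatenation plus projection is only the simplest baseline; gating or attention-based fusion are common alternatives, and the model in this repository may differ.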
