- Bologna, Italy
- @loretoparisi
BERT
ACL 2022 paper: Imputing Out-of-Vocabulary Embeddings with LOVE Makes Language Models Robust with Little Cost
UmBERTo: an Italian Language Model trained with Whole Word Masking.
Repository for experimenting with text generation using BERT models
Repository for developing prompt-engineering techniques for BERT models
Suite of tools for deploying and training deep learning models on the JVM. Highlights include model import for Keras, TensorFlow, and ONNX/PyTorch, and a modular and tiny C++ library for running mat…
The repository for the code of the UltraFastBERT paper
Easily use and train state-of-the-art late-interaction retrieval methods (ColBERT) in any RAG pipeline. Designed for modularity and ease of use, backed by research.