Code for "Language Model Knowledge Distillation for Efficient Question Answering in Spanish" (ICLR 2024 Tiny Papers)

tinyroberta-distillation-qa-es

Knowledge distillation of large models for question answering in Spanish
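
For extractive QA, the distillation objective is typically a mix of a soft-target loss against the teacher's start/end logits and the standard span cross-entropy on the gold answer positions. Below is a minimal PyTorch sketch of such a loss; the temperature, weighting factor, and function names are illustrative assumptions, not necessarily the exact setup used in this repository.

```python
# Sketch of a teacher-student distillation loss for extractive QA.
# All hyperparameters (temperature, alpha) and names are illustrative assumptions.
import torch
import torch.nn.functional as F

def qa_distillation_loss(student_start, student_end,
                         teacher_start, teacher_end,
                         start_positions, end_positions,
                         temperature=2.0, alpha=0.5):
    """Combine soft-target KL divergence with hard-label cross-entropy."""
    T = temperature

    def kd(student_logits, teacher_logits):
        # KL divergence on temperature-softened distributions,
        # scaled by T^2 as in Hinton et al. (2015).
        return F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)

    # Soft loss: match the teacher's start/end distributions.
    soft_loss = (kd(student_start, teacher_start) + kd(student_end, teacher_end)) / 2
    # Hard loss: usual cross-entropy against the gold answer span.
    hard_loss = (F.cross_entropy(student_start, start_positions)
                 + F.cross_entropy(student_end, end_positions)) / 2
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

In practice the student and teacher logits would come from two `AutoModelForQuestionAnswering` instances (e.g. a tinyRoBERTa-sized student and a larger Spanish RoBERTa teacher), with the teacher run under `torch.no_grad()`.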