Stars
An introductory tutorial on recommender systems; read online at: https://datawhalechina.github.io/fun-rec/
Repository for "Self-Distillation for Model Stacking Unlocks Cross-Lingual NLU in 200+ Languages"
A one-stop data processing system to make data higher-quality, juicier, and more digestible for (multimodal) LLMs! 🍎 🍋 🌽 ➡️ ➡️ 🍸 🍹 🍷
A Chinese version of CLIP that enables Chinese cross-modal retrieval and representation generation.