☄️ Parallel and distributed training with spaCy and Ray
Topics: training, machine-learning, natural-language-processing, distributed-computing, spacy, ray, parallel-training
Updated Jul 31, 2023 - Python
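The spacy-ray package plugs into spaCy's command-line interface and adds a `ray` subcommand for distributed training. A minimal usage sketch, assuming spaCy v3+, a standard spaCy training `config.cfg`, and Ray available on the machine (flag names follow the plugin's documented defaults; check `python -m spacy ray train --help` for your installed version):

```shell
# Install the plugin; this registers the `ray` command with spaCy's CLI
pip install spacy-ray

# Launch a parallel training run across 2 workers.
# config.cfg is an assumed spaCy training config for your pipeline.
python -m spacy ray train config.cfg --n-workers 2
```

With `--n-workers` greater than 1, gradient computation is spread across Ray workers instead of running in a single process, which is the point of pairing spaCy with Ray.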
Machine learning library, Distributed training, Deep learning, Reinforcement learning, Models
Cross-lingual Language Model (XLM) pretraining and Model-Agnostic Meta-Learning (MAML) for fast adaptation of deep networks
This repository is a tutorial on how to train deep neural network models more efficiently. It focuses on two main frameworks: Keras and TensorFlow.