Machine learning library, Distributed training, Deep learning, Reinforcement learning, Models, TensorFlow, PyTorch
☄️ Parallel and distributed training with spaCy and Ray
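The project above builds on Ray's task-parallel primitives. As a rough illustration only (this is not the spacy-ray API, and `train_shard` is a hypothetical placeholder for per-worker training logic), the basic Ray pattern it relies on looks like this:

```python
import ray

ray.init()

@ray.remote
def train_shard(shard_id, examples):
    # Hypothetical placeholder: each worker would train on its own data shard.
    return {"shard": shard_id, "n_examples": len(examples)}

# Split the data into shards and process them in parallel Ray tasks.
batches = [list(range(100)) for _ in range(4)]
results = ray.get([train_shard.remote(i, b) for i, b in enumerate(batches)])
print(results)
```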
Cross-lingual Language Model (XLM) pretraining and Model-Agnostic Meta-Learning (MAML) for fast adaptation of deep networks
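For the MAML side, the core idea is a nested optimization: an inner gradient step adapts the model to each task, and the outer step updates the initial weights so that this adaptation works well. Below is a minimal PyTorch sketch of that loop on a toy sine-regression problem; the architecture, learning rates, and task sampler are illustrative assumptions, not the repository's configuration.

```python
import torch
import torch.nn as nn

# Toy model and meta-optimizer (assumed sizes, not from the repo).
model = nn.Sequential(nn.Linear(1, 40), nn.ReLU(), nn.Linear(40, 1))
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)
inner_lr = 0.01
loss_fn = nn.MSELoss()

def sample_task():
    # Each "task" is a sine wave with a random amplitude (toy stand-in).
    amp = torch.rand(1) * 4 + 0.1
    def draw(n):
        x = torch.rand(n, 1) * 10 - 5
        return x, amp * torch.sin(x)
    return draw

for step in range(1000):
    meta_opt.zero_grad()
    for _ in range(4):  # tasks per meta-batch
        draw = sample_task()
        x_s, y_s = draw(10)   # support set for adaptation
        x_q, y_q = draw(10)   # query set for the meta-update

        # Inner loop: one gradient step on the support set, keeping the graph
        # so the meta-gradient can flow through the adaptation step.
        params = list(model.parameters())
        support_loss = loss_fn(model(x_s), y_s)
        grads = torch.autograd.grad(support_loss, params, create_graph=True)
        adapted = [p - inner_lr * g for p, g in zip(params, grads)]

        # Outer loop: evaluate the adapted parameters on the query set.
        h = torch.relu(x_q @ adapted[0].t() + adapted[1])
        query_loss = loss_fn(h @ adapted[2].t() + adapted[3], y_q)
        query_loss.backward()
    meta_opt.step()
```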
A simple package for distributed model training using Distributed Data Parallel (DDP) in PyTorch.
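A minimal sketch of what DDP training looks like in plain PyTorch (one process per worker, gradients all-reduced automatically during `backward()`); the model, data, and two-process CPU setup here are illustrative assumptions rather than this package's interface:

```python
import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

def worker(rank, world_size):
    # One process per worker; the gloo backend keeps the sketch CPU-friendly.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    model = DDP(nn.Linear(10, 1))
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = nn.MSELoss()

    for _ in range(5):
        x, y = torch.randn(32, 10), torch.randn(32, 1)
        opt.zero_grad()
        loss_fn(model(x), y).backward()  # gradients are averaged across ranks
        opt.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    world_size = 2
    mp.spawn(worker, args=(world_size,), nprocs=world_size)
```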
This repository is a tutorial on training deep neural network models more efficiently. It focuses on two main frameworks: Keras and TensorFlow.
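One common way to speed up Keras/TensorFlow training on a multi-GPU machine is `tf.distribute.MirroredStrategy`, which replicates the model across local devices and averages gradients automatically. A minimal sketch with synthetic data (the model and dataset here are assumptions for illustration, not the tutorial's own example):

```python
import numpy as np
import tensorflow as tf

# MirroredStrategy uses all local GPUs if available, otherwise the CPU.
strategy = tf.distribute.MirroredStrategy()

with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")

# Synthetic data as a stand-in for a real dataset.
x = np.random.rand(1024, 20).astype("float32")
y = np.random.rand(1024, 1).astype("float32")
model.fit(x, y, batch_size=64, epochs=2)
```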