DeepEP
Efficient MoE communication library
An open-source communication library for Mixture-of-Experts (MoE) models, providing high-throughput, low-latency all-to-all GPU kernels for expert-parallel dispatch and combine.
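The core pattern behind expert-parallel communication is a token dispatch followed by a combine: each token is sent to the rank hosting its routed expert, processed there, and returned. Below is a minimal sketch of that pattern using plain torch.distributed collectives; the function name, tensor layout, and router interface are hypothetical illustrations of the pattern such a library accelerates, not DeepEP's actual API.

```python
import torch
import torch.distributed as dist

def dispatch_combine(tokens: torch.Tensor,
                     dest_rank: torch.Tensor,
                     group: dist.ProcessGroup | None = None) -> torch.Tensor:
    """Send each token to the rank hosting its expert, then bring results back.

    tokens:    [num_tokens, hidden] tokens local to this rank
    dest_rank: [num_tokens] destination rank chosen by the router (hypothetical)
    """
    world_size = dist.get_world_size(group)
    # Sort tokens by destination rank so each rank's slice is contiguous.
    order = torch.argsort(dest_rank)
    sorted_tokens = tokens[order].contiguous()
    send_counts = torch.bincount(dest_rank, minlength=world_size)
    # Exchange per-rank counts so every rank knows how much it will receive.
    recv_counts = torch.empty_like(send_counts)
    dist.all_to_all_single(recv_counts, send_counts, group=group)
    send_splits = send_counts.tolist()
    recv_splits = recv_counts.tolist()
    # Dispatch: one all-to-all moves every token to its expert's rank.
    recv_tokens = sorted_tokens.new_empty((sum(recv_splits), tokens.size(1)))
    dist.all_to_all_single(recv_tokens, sorted_tokens,
                           recv_splits, send_splits, group=group)
    # ... the local experts would process recv_tokens here ...
    # Combine: the reverse all-to-all returns expert outputs to their owners.
    combined = torch.empty_like(sorted_tokens)
    dist.all_to_all_single(combined, recv_tokens,
                           send_splits, recv_splits, group=group)
    # Undo the sort so outputs line up with the original token order.
    out = torch.empty_like(tokens)
    out[order] = combined
    return out
```

The two all-to-all exchanges are exact mirrors of each other, which is why dispatch and combine are usually implemented as a matched pair; a dedicated library replaces these generic collectives with fused GPU kernels.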
