Mixture of Experts (MoE) models are an emerging class of sparsely activated deep learning models whose compute cost grows sublinearly with their parameter count. In contrast with dense models, ...
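
As a minimal illustration of sparse activation, the PyTorch sketch below routes each token to its top-k experts, so per-token compute scales with k rather than with the total number of experts. The layer sizes, expert count, and gating scheme are assumptions chosen for demonstration, not the Uni-MoE-2.0-Omni implementation.

```python
# Illustrative sketch of a sparsely activated MoE layer with top-k routing.
# Not the Uni-MoE implementation: dimensions and gating are assumed for demo.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoE(nn.Module):
    def __init__(self, d_model=512, d_hidden=2048, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router produces one logit per expert for each token.
        self.router = nn.Linear(d_model, num_experts)
        # Each expert is an independent feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.GELU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(num_experts)
        )

    def forward(self, x):                      # x: (batch, seq, d_model)
        logits = self.router(x)                # (batch, seq, num_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)   # renormalize over the selected experts
        out = torch.zeros_like(x)
        # Only the selected top-k experts run per token: compute grows with k,
        # not with the total number of experts (hence the sublinear cost).
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[..., slot] == e  # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return out


if __name__ == "__main__":
    layer = SparseMoE()
    tokens = torch.randn(2, 16, 512)
    print(layer(tokens).shape)  # torch.Size([2, 16, 512])
```
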
[2025/11/24] 🔥 We have integrated our model Uni-MoE-2.0-Omni for evaluation within the Lmms-eval framework; see here.

[2025/11/13] 🔥 We release the second version of Uni-MoE-2.0-Omni. It achieves a ...