UNITES Lab
University of North Carolina: AI Trustworthiness, Efficiency, and for-Science (UNITES) Group
Repositories
13 public repositories in total; two are listed below.
- MoE-RBench
  [ICML 2024] Code for the paper "MoE-RBench: Towards Building Reliable Language Models with Sparse Mixture-of-Experts"
- moe-quantization
  Official code for the paper "Examining Post-Training Quantization for Mixture-of-Experts: A Benchmark"
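Both pinned projects study sparse Mixture-of-Experts (MoE) models. For readers unfamiliar with the architecture, below is a minimal top-k routing sketch in PyTorch. It is an illustrative toy, not code from either repository; every name in it (`SparseMoE`, `n_experts`, `top_k`) is invented for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    """Toy sparse MoE layer: each token is routed to its top-k experts."""

    def __init__(self, d_model: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)  # gating network
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, 4 * d_model),
                nn.GELU(),
                nn.Linear(4 * d_model, d_model),
            )
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). Pick top-k experts per token and mix
        # their outputs by the softmax-normalized router weights.
        logits = self.router(x)                         # (tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # per-token expert choice
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                   # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, k, None] * expert(x[mask])
        return out

# Usage: a batch of 16 token embeddings of width 64.
layer = SparseMoE(d_model=64)
y = layer(torch.randn(16, 64))
print(y.shape)  # torch.Size([16, 64])
```

Only `top_k` of the experts run per token, which is what makes the layer sparse: parameter count grows with `n_experts` while per-token compute stays roughly constant.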
People
This organization has no public members.