Repositories list
23 repositories
DOGe (Public)
keylogging (Public)
Flex-MoE (Public): [NeurIPS 2024 Spotlight] Code for the paper "Flex-MoE: Modeling Arbitrary Modality Combination via the Flexible Mixture-of-Experts"
Spatial2Sentence (Public)
Distillation-Bench (Public)
CryoNeRF (Public)
Occult (Public)
AgentSymbiotic (Public)
HEXA-MoE (Public): Official code for the paper "HEXA-MoE: Efficient and Heterogeneous-Aware MoE Acceleration with Zero Computation Redundancy"
glider (Public)
moe-retriever (Public)
UQ-Merge (Public)
C2R-MoE (Public)
FlowTS (Public)
scMoE (Public)
Mew (Public)
MoE-RBench (Public): [ICML 2024] Code for the paper "MoE-RBench: Towards Building Reliable Language Models with Sparse Mixture-of-Experts"
q-newton (Public): Official code for the paper "Hybrid Quantum-Classical Scheduling for Accelerating Neural Network Training with Newton's Gradient Descent"
moe-quantization (Public)
MC-SMoE (Public): [ICLR 2024 Spotlight] Code for the paper "Merge, Then Compress: Demystify Efficient SMoE with Hints from Its Routing Policy"
Lingual-SMoE (Public)
VPNs (Public)
LLM_Toolbox (Public)