HMoE: Heterogeneous Mixture of Experts for Language Modeling