Researcher in deep learning.
Pinned
- pjlab-sys4nlp/llama-moe: ⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024)
- OpenSparseLLMs/LLaMA-MoE-v2: 🚀 LLaMA-MoE v2: Exploring Sparsity of LLaMA from Perspective of Mixture-of-Experts with Post-Training
- A4Bio/GraphsGPT: The official implementation of the ICML'24 paper "A Graph is Worth K Words: Euclideanizing Graph using Pure Transformer"
- CASE-Lab-UMD/Unified-MoE-Compression: The official implementation of the TMLR paper "Towards Efficient Mixture of Experts: A Holistic Study of Compression Techniques"
- ChatGPT-ArXiv-Paper-Assistant: A ChatGPT/Gemini/DeepSeek-based personalized arXiv paper assistant bot for automatic paper filtering. Powerful, free, and easy to use. (Python)