Welcome to the InternLM (Intern Large Models) organization. The Intern series of large models (Chinese name: 书生) is developed by Shanghai AI Laboratory, and we continuously open-source high-quality LLMs/MLLMs along with toolchains for their development and application.
- Intern-S1: a multimodal reasoning model with strong general capabilities and state-of-the-art performance on a wide range of scientific tasks.
- InternVL: an advanced multimodal large language model (MLLM) series with strong general performance.
- InternLM: a series of multilingual foundation and chat models (see the usage sketch after this list).
- InternLM-Math: state-of-the-art bilingual math reasoning LLMs.
- InternLM-XComposer: a vision-language large model based on InternLM for advanced text-image comprehension and composition.
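
As a quick illustration, the InternLM chat models can be used directly through Hugging Face Transformers. This is a minimal sketch, assuming the `internlm/internlm2_5-7b-chat` checkpoint and the `.chat()` helper documented on its hub model card; substitute whichever checkpoint you actually use.

```python
# Minimal sketch: chatting with an InternLM model via Transformers.
# The model ID and the .chat() convenience method are assumptions
# taken from the checkpoint's Hugging Face model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "internlm/internlm2_5-7b-chat"  # example checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, trust_remote_code=True
).cuda().eval()

# trust_remote_code exposes a .chat() helper that applies the chat
# template and runs generation in one call.
response, history = model.chat(tokenizer, "Hello! Who are you?", history=[])
print(response)
```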
 
- XTuner: a next-generation training engine built for ultra-large MoE models, with superior efficiency.
- LMDeploy: a toolkit for compressing, deploying, and serving LLMs (see the sketch after this list).
- OpenCompass: a platform for large-model evaluation, providing a fair, open, and reproducible benchmark.
- Lagent: a lightweight framework for efficiently building LLM-based agents.
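
For example, LMDeploy provides a high-level `pipeline` API for offline batched inference. This is a minimal sketch under assumptions: the model ID below is an example, and the response fields shown may vary across LMDeploy versions.

```python
# Minimal sketch: serving a chat model with LMDeploy's pipeline API.
# The checkpoint name is an example; response objects expose .text in
# recent LMDeploy releases (verify against your installed version).
from lmdeploy import pipeline

# Build an inference pipeline backed by LMDeploy's optimized engine.
pipe = pipeline("internlm/internlm2_5-7b-chat")

# Batched generation over a list of prompts.
responses = pipe(["Hi, please introduce yourself.", "What can LMDeploy do?"])
for r in responses:
    print(r.text)
```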
 
- HuixiangDou: a domain-specific assistant based on LLMs that can handle complex technical questions in group chats.
- MindSearch: an LLM-based multi-agent framework for web search.