DeepThink is an AI-driven research team under Southern Cross AI, focused on building a modular, containerized AI application development platform. Our system is based on Ollama (for running large language models), LangChain (for orchestration), and Gradio (for web-based interaction). We aim to deliver a flexible, cross-platform development pipeline using Docker, enabling users to rapidly prototype and deploy offline AI apps.
- Develop a custom Docker image combining Ollama, LangChain, and Gradio.
- Enable local execution of LLMs with modular backend/frontend separation.
- Ensure the image is compatible across platforms (x86_64 and ARM64).
- Implement a prompt → chain → response workflow using LangChain.
- Use Gradio to build an interactive frontend that starts automatically.
- Support optional modules like document-based Q&A using local knowledge.
- Create a multi-stage Dockerfile with Ollama.
- Integrate model caching, environment variables, and readiness probes.
- Support both CPU and GPU acceleration.
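A multi-stage image along these lines could look as follows — a sketch only: the base image tag, the model name `llama3`, the cache path, and the availability of `curl` in the base image are all assumptions, not settled project decisions.

```dockerfile
# Stage 1: pre-pull a model so containers start without re-downloading it.
# (Model name and base tag are placeholders.)
FROM ollama/ollama:latest AS model-stage
RUN ollama serve & sleep 5 && ollama pull llama3

# Stage 2: runtime image with the cached model copied in.
FROM ollama/ollama:latest
COPY --from=model-stage /root/.ollama /root/.ollama
ENV OLLAMA_HOST=0.0.0.0:11434
EXPOSE 11434
# Readiness probe: Ollama answers on its root endpoint once it is up.
HEALTHCHECK --interval=10s --timeout=3s \
  CMD curl -f http://localhost:11434/ || exit 1
```

GPU support would typically come from running the same image with `--gpus all` rather than from a separate Dockerfile.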
- Install LangChain and implement basic LLMChain logic.
- Use LCEL and Jinja2 templates for structured chaining.
- Add support for conversation memory, structured outputs, and asynchronous tasks.
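The prompt → chain → response flow behind these items can be illustrated in plain Python. This is not LangChain code — it is a minimal stand-in showing the data flow of the LCEL `prompt | llm | parser` pipe pattern, with a fake model in place of Ollama.

```python
# Plain-Python illustration of the LCEL pipe pattern.
# Real code would use LangChain Runnables; this stub only shows the data flow.

class Runnable:
    """Minimal stand-in for a LangChain Runnable supporting the | operator."""
    def __init__(self, fn):
        self.fn = fn

    def invoke(self, value):
        return self.fn(value)

    def __or__(self, other):
        # Chaining: the output of this step feeds the next step.
        return Runnable(lambda value: other.invoke(self.invoke(value)))

# Prompt step: format a template (LangChain would use a prompt template here).
prompt = Runnable(lambda q: f"Answer concisely: {q}")
# Model step: a fake LLM that echoes its input (Ollama would go here).
llm = Runnable(lambda p: f"[model reply to: {p}]")
# Parser step: post-process the raw model output.
parser = Runnable(lambda text: text.strip("[]"))

chain = prompt | llm | parser
print(chain.invoke("What is LCEL?"))
# → model reply to: Answer concisely: What is LCEL?
```

Memory and async support then become additional steps (or `ainvoke`-style variants) composed into the same pipeline.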
- Build a Gradio Blocks-based UI with real-time chat interaction.
- Include features like KaTeX rendering, session persistence, and light/dark modes.
- Automatically launch frontend on container startup.
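The chat callback a Gradio Blocks app registers can be sketched separately from the UI wiring. Everything below is illustrative: `respond` and `answer_fn` are hypothetical names, and `answer_fn` stands in for the real LangChain chain.

```python
# Chat-state update in the shape Gradio Blocks callbacks expect:
# take (message, history), return (cleared_textbox, new_history).
# `answer_fn` stands in for the real chain; the default here just echoes.

def respond(message, history, answer_fn=lambda m: f"echo: {m}"):
    history = history + [(message, answer_fn(message))]
    return "", history  # empty string clears the input textbox

# In the real app this would be wired roughly as:
#   msg.submit(respond, [msg, chatbot], [msg, chatbot])

cleared, hist = respond("hello", [])
print(hist)  # → [('hello', 'echo: hello')]
```

Keeping the callback free of UI objects makes it easy to unit-test and to swap the backing chain.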
- Provide Docker Compose and Kubernetes deployment scripts.
- Integrate monitoring tools like Prometheus and OpenTelemetry.
- Document APIs, configs, and troubleshooting.
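A Compose file for this stack might look like the sketch below; the service name, image tag, and port choices are assumptions (7860 is Gradio's default, 11434 Ollama's).

```yaml
# Sketch only — names and tags are placeholders, not project decisions.
services:
  deepthink:
    image: deepthink:latest          # hypothetical image from the Dockerfile
    ports:
      - "7860:7860"                  # Gradio UI
      - "11434:11434"                # Ollama API
    environment:
      - OLLAMA_HOST=0.0.0.0:11434
    volumes:
      - ollama-models:/root/.ollama  # persist pulled models across restarts
volumes:
  ollama-models:
```

A Kubernetes manifest would map the same pieces onto a Deployment, a Service, and a PersistentVolumeClaim for the model cache.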
- Implement hybrid retrieval using BM25 + vector similarity.
- Add RAG pipelines with semantic chunking and query rewriting.
- Support multi-format documents and persistent embeddings.
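One common way to fuse a BM25 ranking with a vector-similarity ranking is reciprocal-rank fusion (RRF); the roadmap only says "BM25 + vector similarity", so RRF is an assumption about the fusion method, and the document ids below are toy data.

```python
# Reciprocal-rank fusion: each list contributes 1 / (k + rank + 1) per doc,
# so documents ranked highly by several retrievers float to the top.

def rrf_merge(rankings, k=60):
    """Merge several ranked lists of doc ids into one fused ranking."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank + 1)
    return sorted(scores, key=scores.get, reverse=True)

bm25_hits   = ["doc3", "doc1", "doc7"]  # toy keyword (BM25) ranking
vector_hits = ["doc1", "doc7", "doc9"]  # toy vector-similarity ranking
print(rrf_merge([bm25_hits, vector_hits]))
# → ['doc1', 'doc7', 'doc3', 'doc9']
```

`doc1` wins because both retrievers rank it highly, which is exactly the behaviour hybrid retrieval is after; `k` damps the influence of any single list's top hit.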
By combining cutting-edge open-source tools in a unified platform, DeepThink enables developers to:
- Run LLMs locally with privacy and speed.
- Rapidly prototype AI workflows with reusable components.
- Easily adapt the stack for future extensions like voice input/output and mobile deployment.
🚀 Join us as we shape the future of modular, local-first AI development! 🎮
🎬 Click the thumbnail above or watch on YouTube to see our team introduce the DeepThink project, demonstrate key features, and walk through our development process.
You can find our supporting files and documentation here:
- 🧩 Document (Google Site)
- 🧩 Statement of Work (Docs)
- 🧩 Landing Page (Website)
- 🧩 Weekly Meeting Record (Docs)
- 🧩 Reflection Log (Docs)
- 🧩 Scrum Log (Docs)
- 🧩 Risk Log (Docs)
- 🧩 Issues (GitHub Page)
| Name | Role | Email |
|---|---|---|
| Xiangyu Tan | System Integration and Logic Control | u7779491@anu.edu.au |
| Diming Xu | Frontend Interaction and UI | u7705332@anu.edu.au |
| Zhuiqi Lin | Tech Lead | u7733924@anu.edu.au |
| Boyang Zhang | Documentation and Testing | u7760642@anu.edu.au |
| Qingchuan Rui | LangChain Engineer | u7776331@anu.edu.au |
| Dongze Yu | Extension Function and Future Planning | u7775416@anu.edu.au |

