# Home
Matthew Altenburg edited this page May 16, 2025
DeepThink is an AI-driven research team under Southern Cross AI, focused on building a modular, containerized AI application development platform. Our system is based on Ollama (for running large language models), LangChain (for orchestration), and Gradio (for web-based interaction). We aim to deliver a flexible, cross-platform development pipeline using Docker, enabling users to rapidly prototype and deploy offline AI apps.
**Project Goals**
- Develop a custom Docker image combining Ollama, LangChain, and Gradio.
- Enable local execution of LLMs with modular backend/frontend separation.
- Ensure the image is compatible across platforms (x86 and ARM).
- Implement a prompt → chain → response workflow using LangChain.
- Use Gradio to build an interactive frontend that starts automatically.
- Support optional modules like document-based Q&A using local knowledge.
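The backend/frontend separation above rests on Ollama's plain HTTP API: the frontend never touches the model directly, only a local endpoint. A minimal sketch of driving a locally running server from Python — the URL is Ollama's default; `build_request`, `generate`, and the model name are illustrative, not project code:

```python
import json
import urllib.request

# Ollama's default local endpoint for non-streaming generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> bytes:
    """Serialise a non-streaming generate call for Ollama's /api/generate."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return its reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because everything goes through one URL, swapping the model, moving Ollama into its own container, or pointing at a remote host is a configuration change rather than a code change.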
**Docker Image**
- Create a multi-stage Dockerfile with Ollama.
- Integrate model caching, environment variables, and readiness probes.
- Support both CPU and GPU acceleration.
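The bullets above might translate into a Dockerfile along these lines — a sketch only: the base image tag, the model name (`llama3`), and the probe command are assumptions, not the team's final build:

```dockerfile
# Stage 1: pre-pull a model so the weights land in the image's layer cache.
FROM ollama/ollama:latest AS models
RUN ollama serve & sleep 5 && ollama pull llama3

# Stage 2: the runtime image reuses the cached model directory from stage 1.
FROM ollama/ollama:latest
COPY --from=models /root/.ollama /root/.ollama

# Listen on all interfaces inside the container.
ENV OLLAMA_HOST=0.0.0.0:11434
EXPOSE 11434

# Readiness probe: `ollama list` only succeeds once the server answers.
HEALTHCHECK --interval=10s --timeout=3s CMD ollama list || exit 1
```

GPU acceleration would come from the runtime (e.g. `docker run --gpus=all`) rather than from the image itself, so the same image can serve both CPU and GPU hosts.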
**LangChain Backend**
- Install LangChain and implement basic LLMChain logic.
- Use LCEL and Jinja2 templates for structured chaining.
- Add support for conversation memory, structured outputs, and asynchronous tasks.
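LCEL expresses the prompt → chain → response workflow by composing steps with the `|` operator. A dependency-free sketch of the same idea — the `Runnable` class and `fake_llm` below are stand-ins of our own, not LangChain's types:

```python
# Minimal imitation of LCEL-style composition: each step wraps a function,
# and `a | b` builds a new step that runs a, then feeds its output to b.
class Runnable:
    def __init__(self, fn):
        self.fn = fn

    def __or__(self, other):
        return Runnable(lambda x: other.fn(self.fn(x)))

    def invoke(self, x):
        return self.fn(x)

prompt = Runnable(lambda q: f"Answer briefly: {q}")      # templating step
fake_llm = Runnable(lambda p: f"[model reply to: {p}]")  # stand-in for the LLM
parse = Runnable(lambda s: s.strip())                    # output parsing step

chain = prompt | fake_llm | parse
print(chain.invoke("What is Ollama?"))
# → [model reply to: Answer briefly: What is Ollama?]
```

Memory, structured outputs, and async execution slot into the same shape: each is just another step (or a stateful wrapper around a step) in the pipeline.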
**Gradio Frontend**
- Build a Gradio Blocks-based UI with real-time chat interaction.
- Include features like KaTeX rendering, session persistence, and light/dark modes.
- Automatically launch frontend on container startup.
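A sketch of the chat loop behind such a UI. The handler is kept pure so it can be tested without a browser; the `LAUNCH_UI` flag, the echo reply, and the widget layout are placeholders for the real app:

```python
def chat_step(message, history):
    """Append one user turn plus a (placeholder) bot reply to the chat history."""
    reply = f"Echo: {message}"  # the real app would invoke the LangChain chain here
    return "", history + [(message, reply)]

LAUNCH_UI = False  # set True in the container so the frontend starts automatically
if LAUNCH_UI:
    import gradio as gr
    with gr.Blocks() as demo:
        chatbot = gr.Chatbot()
        box = gr.Textbox(placeholder="Ask something...")
        box.submit(chat_step, inputs=[box, chatbot], outputs=[box, chatbot])
    demo.launch(server_name="0.0.0.0")  # bind to all interfaces inside Docker
```

Returning `""` as the first output clears the textbox after each turn, while the growing history list is what gives the UI its session persistence.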
**Deployment & Monitoring**
- Provide Docker Compose and Kubernetes deployment scripts.
- Integrate monitoring tools like Prometheus and OpenTelemetry.
- Document APIs, configs, and troubleshooting.
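A Docker Compose file for this stack might look roughly like the following — service names, image tags, and ports are assumptions for illustration, not the project's actual deployment scripts:

```yaml
services:
  ollama:
    image: ollama/ollama:latest
    ports: ["11434:11434"]
    volumes:
      - ollama-models:/root/.ollama   # persist the model cache across restarts
  app:
    build: .
    ports: ["7860:7860"]              # Gradio frontend
    environment:
      OLLAMA_BASE_URL: http://ollama:11434
    depends_on:
      - ollama
volumes:
  ollama-models:
```

Keeping the model server and the app in separate services means each can be scaled, probed, and monitored independently, which is also the shape a Kubernetes translation would take.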
**Retrieval-Augmented Generation**
- Implement hybrid retrieval using BM25 + vector similarity.
- Add RAG pipelines with semantic chunking and query rewriting.
- Support multi-format documents and persistent embeddings.
By combining cutting-edge open-source tools in a unified platform, DeepThink enables developers to:
- Run LLMs locally with privacy and speed.
- Rapidly prototype AI workflows with reusable components.
- Easily adapt the stack for future extensions like voice input/output and mobile deployment.
🚀 Join us as we shape the future of modular, local-first AI development! 🔮
You can find our supporting files and documentation here:
- 🧩 Document (Google Site)
- 🧩 Statement of Work (Docs)
- 🧩 Landing Page (Website)
- 🧩 Weekly Meeting Record (Docs)
- 🧩 Reflection Log (Docs)
- 🧩 Scrum Log (Docs)
- 🧩 Risk Log (Docs)
- 🧩 Issues (GitHub Page)
| Name | Role | Email |
|---|---|---|
| Xiangyu Tan | Team member | u7779491@anu.edu.au |
| Diming Xu | Team member | u7705332@anu.edu.au |
| Zhuiqi Lin | Team member | u7733924@anu.edu.au |
| Boyang Zhang | Team member | u7760642@anu.edu.au |
| Qingchuan Rui | Team member | u7776331@anu.edu.au |
| Dongze Yu | Team member | u7775416@anu.edu.au |