Unlock the power of AI-driven learning with our open-source platform
Explore the docs »
View Demo
·
Report Bug
·
Request Feature
"OpenBookLM is a game-changer in the education sector, providing an open-source platform for AI-driven learning experiences." ππ
OpenBookLM is designed to bridge the gap between traditional learning methods and modern AI-driven approaches. Our platform empowers users to create and share interactive, audio-based courses, while leveraging the power of AI for enhanced learning experiences.
- **Students**
  - High school and university students
  - Graduate researchers
  - Academic professionals
- **Lifelong Learners**
  - Self-directed learners
  - Professional development enthusiasts
  - Knowledge seekers
- **AI-Powered Learning**
  - Integration with various AI models
  - Flexible and customizable architecture
  - Community-driven development
- **Audio Courses**
  - Create and share educational podcasts
  - Multilingual text-to-audio generation using Suno Bark
  - High-quality audio content management
- **Community**
  - Forum-like community system
  - Course rating and refinement
  - Knowledge sharing platform
- **Multilingual Support**
  - Overcome English-only limitations
  - Support for multiple languages
  - Inclusive learning environment
```mermaid
graph TD
    subgraph Client
        UI[Next.js Frontend]
        Auth[Better Auth]
    end
    subgraph Server
        API[Next.js API Routes]
        LLM[LLM Python Service]
        Cache[Redis Cache]
    end
    subgraph Database
        PG[(PostgreSQL)]
        Prisma[Prisma ORM]
    end
    subgraph External
        Cerebras[Cerebras API]
        Sources[External Sources]
    end
    UI --> Auth
    UI --> API
    API --> LLM
    API --> Cache
    API --> Prisma
    Prisma --> PG
    LLM --> Cerebras
    LLM --> Sources
```
OpenBookLM operates on a freemium model. We believe education should be accessible to everyone immediately. Users can access the site and utilize "Guest Mode" without the friction of signing up. Guest users are assigned temporary identities using cookies and receive a limited pool of credits for audio generation, document processing, and context tokens. Once they see the value, they can easily sign in via GitHub to unlock higher limits and persistent storage via Better Auth.
As OpenBookLM scales, there are several architectural paths we plan to explore to maximize speed and lower compute costs:
Currently, our heavy text-to-audio and PDF parsing tasks are handled by a Python backend. Rewriting these specific, CPU-intensive microservices in Rust or Go could drastically reduce our memory footprint and execution time, allowing us to serve more concurrent users on cheaper hardware.
Certain parsing tasks, document vectorization, and local token counting can be shifted directly to the user's browser using WebAssembly. By compiling Rust/C++ libraries into Wasm, we can offload compute from our servers to the client, making the application feel instantaneous and significantly reducing our cloud hosting costs.
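As a rough illustration of the local token counting mentioned above, a heuristic estimator like the one below could run entirely in the browser. This is not a real tokenizer (a production version would compile an actual tokenizer to Wasm); the four-characters-per-token ratio is only a common approximation for English text.

```typescript
// Rough client-side token estimator: an illustrative stand-in for a
// Wasm-compiled tokenizer. The 4-chars-per-token heuristic is a common
// approximation for English text, not an exact count.
function estimateTokens(text: string): number {
  const trimmed = text.trim();
  if (trimmed.length === 0) return 0;
  // Roughly one token per 4 characters, never fewer than the word count.
  const byChars = Math.ceil(trimmed.length / 4);
  const byWords = trimmed.split(/\s+/).length;
  return Math.max(byChars, byWords);
}
```

Even a cheap estimate like this lets the UI warn about context-window limits without a server round trip.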
Leveraging Redis at the edge and pushing static LLM summaries to CDNs can ensure that repeated queries (like summarizing popular open-source books) are served in single-digit milliseconds worldwide.
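The caching idea above amounts to a cache-aside pattern. The sketch below uses an in-memory `Map` as a stand-in for Redis; the key scheme, the one-hour TTL, and the `summarize` callback are all assumptions for illustration, not the project's actual API.

```typescript
// Cache-aside sketch for repeated summary queries. A Map stands in for
// Redis here; keys, TTL, and the summarize function are illustrative.
type Entry = { value: string; expiresAt: number };

const cache = new Map<string, Entry>();
const TTL_MS = 60 * 60 * 1000; // assumed 1-hour freshness window

async function cachedSummary(
  bookId: string,
  summarize: (id: string) => Promise<string>
): Promise<string> {
  const key = `summary:${bookId}`;
  const hit = cache.get(key);
  if (hit && hit.expiresAt > Date.now()) return hit.value; // cache hit
  const value = await summarize(bookId); // miss: compute once, then store
  cache.set(key, { value, expiresAt: Date.now() + TTL_MS });
  return value;
}
```

With Redis at the edge, the same read-through logic applies; only the `Map` calls would become `GET`/`SET` commands with a TTL.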
To get a local copy up and running, follow these steps.
- Node.js (v20 or later)
- Bun

  ```sh
  curl -fsSL https://bun.sh/install | bash
  ```

- Python (3.12 or later)
- Docker & Docker Compose (for DB and Redis)
- Clone the repository

  ```sh
  git clone https://github.com/open-biz/OpenBookLM.git
  ```

- Install dependencies

  ```sh
  bun install
  ```

- Set up the Python environment

  ```sh
  ./setup/create_venv.sh
  source venv/bin/activate
  pip install -r requirements.txt
  ```

- Set up environment variables

  ```sh
  cp .env.example .env
  ```

- Start local services (Postgres & Redis)

  ```sh
  docker compose up -d db redis
  ```

- Sync your database schema

  ```sh
  bunx prisma db push
  ```

- Start the development server

  ```sh
  bun dev
  ```
- Create a Notebook: Start by creating a new notebook for your study topic (Guests can do this instantly!)
- Add Sources: Upload URLs, documents, or other study materials
- Take Notes: Use the AI-powered interface to take and organize notes
- Study & Review: Engage with your materials through interactive features
- Share & Collaborate: Join the community and share your knowledge
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
Please see our CONTRIBUTING.md file for detailed guidelines on how to fork the repository, structure your branches, format your pull requests, and start building!
Distributed under the MIT License. See LICENSE for more information.
Project Link: https://github.com/open-biz/OpenBookLM
