A chatbot API built with Flask, integrating SambaNova OpenAI models for dynamic conversational AI experiences.
- Dynamic Model Selection: Choose from multiple AI models such as Meta-Llama-3.1-405B, 70B, and 8B for different use cases.
- Chat History Management: Maintain context with persistent chat history for a natural conversational flow.
- System Prompt Customization: Configure the chatbot's personality and responses using customizable system prompts.
- Thinking Time Analysis: Monitor AI response times with a `thinking_budget` feature (see the sketch after this list).
- Error Handling: Provides robust error handling and logging for a seamless user experience.
- Static File Hosting: Serves an interactive frontend from the `static` directory for easy deployment.
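To make the model-selection and thinking-time features concrete, here is a minimal sketch of the kind of call the app makes under the hood. It assumes the `openai` Python client pointed at SambaNova's OpenAI-compatible endpoint; the base URL, variable names, and flow are illustrative assumptions, not the project's actual app.py code.

```python
# Illustrative sketch only: dynamic model selection and thinking-time measurement.
# Assumes the `openai` package and SambaNova's OpenAI-compatible API; the base URL
# below is an assumption (check SambaNova's documentation for the current value).
import os
import time

from openai import OpenAI

client = OpenAI(
    api_key=os.getenv("API_KEY"),             # SambaNova API key from the environment
    base_url="https://api.sambanova.ai/v1",   # assumed OpenAI-compatible endpoint
)

start = time.perf_counter()
completion = client.chat.completions.create(
    model="Meta-Llama-3.1-70B-Instruct",      # dynamic model selection
    messages=[
        {"role": "system", "content": "You are a funny assistant."},  # system prompt
        {"role": "user", "content": "Tell me a joke."},
    ],
)
thinking_time = round(time.perf_counter() - start, 3)  # seconds spent generating

print(completion.choices[0].message.content)
print(f"thinking_time: {thinking_time}s")
```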
- Python 3.8 or later
- Pip
- SambaNova OpenAI API Key
- Clone the repository:

  ```bash
  git clone https://github.com/your-username/ripple-chatbot.git
  cd ripple-chatbot
  ```

- Create a virtual environment:

  ```bash
  python3 -m venv venv
  source venv/bin/activate  # On Windows, use `venv\Scripts\activate`
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Set up environment variables:
  - Create a `.env` file in the project root:

    ```
    API_KEY=your_sambanova_api_key
    PORT=5000
    ```

  - Replace `your_sambanova_api_key` with your actual API key.

- Run the application:

  ```bash
  python app.py
  ```

- Access the application: Open your browser and navigate to `http://localhost:5000`.
Run the Flask server with:

```bash
python app.py
```

- The primary endpoint for generating responses is `/api/generate`.
- Use an HTTP client like Postman, curl, or your own frontend to send POST requests.

Example request:

```http
POST /api/generate
Content-Type: application/json

{
  "message": "Tell me a joke.",
  "chat_history": [
    {"user": "Who are you?", "assistant": "I am your chatbot."}
  ],
  "model": "Meta-Llama-3.1-70B-Instruct",
  "system_prompt": "You are a funny assistant.",
  "thinking_budget": 15
}
```

Example response:

```json
{
  "response": "Why did the scarecrow win an award? Because he was outstanding in his field!",
  "thinking_time": 2.345
}
```

Endpoints:

- `/`: Serves the static HTML frontend (`static/ripple.html`).
- `/api/generate` (POST): Generates a chatbot response based on the user input and chat history. It accepts the following parameters:
| Parameter | Type | Description |
|---|---|---|
| `message` | string | The user’s message for the chatbot. |
| `chat_history` | array | List of prior messages in the chat. |
| `model` | string | The model to use. Defaults to `Meta-Llama-3.1-405B-Instruct`. |
| `system_prompt` | string | A customizable prompt that sets the chatbot’s personality. |
| `thinking_budget` | number | Maximum response time for the AI (in seconds). Default: 10. |
| `api_key` | string | API key for SambaNova authentication (optional; defaults to the key in `.env`). |
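For a quick test from Python, a minimal client could look like the following; it assumes the server is running locally on the default port and that the `requests` package is installed.

```python
# Minimal test client for POST /api/generate.
# Assumes a local server on the default port (5000) and the `requests` package.
import requests

payload = {
    "message": "Tell me a joke.",
    "chat_history": [
        {"user": "Who are you?", "assistant": "I am your chatbot."}
    ],
    "model": "Meta-Llama-3.1-70B-Instruct",
    "system_prompt": "You are a funny assistant.",
    "thinking_budget": 15,
}

resp = requests.post("http://localhost:5000/api/generate", json=payload, timeout=60)
resp.raise_for_status()
data = resp.json()

print(data["response"])       # the chatbot's reply
print(data["thinking_time"])  # generation time in seconds
```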
| Variable | Description |
|---|---|
| `API_KEY` | Your SambaNova OpenAI API key. |
| `PORT` | Port to run the Flask server (default: 5000). |
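app.py is expected to read these values at startup; a rough sketch of that pattern, assuming the `python-dotenv` package, is shown below (the project's actual variable handling may differ).

```python
# Rough sketch of loading .env values; assumes the python-dotenv package.
# The project's actual app.py may handle configuration differently.
import os

from dotenv import load_dotenv
from flask import Flask

load_dotenv()  # reads API_KEY and PORT from the .env file in the project root

app = Flask(__name__)
API_KEY = os.getenv("API_KEY")          # SambaNova OpenAI API key
PORT = int(os.getenv("PORT", "5000"))   # falls back to 5000 when PORT is unset

if __name__ == "__main__":
    app.run(port=PORT)
```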
```
ripple-chatbot/
│
├── static/
│   ├── ripple.html        # Frontend HTML file
│   ├── styles.css         # Stylesheet for the frontend
│   └── ...                # Additional frontend assets
│
├── app.py                 # Main Flask application
├── requirements.txt       # Python dependencies
├── .env                   # Environment variables
└── README.md              # Project documentation
```
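The `static/` layout above pairs with the root route mentioned earlier; serving `ripple.html` from Flask can be as simple as the sketch below (the project's actual handler may differ).

```python
# Sketch of the static-frontend route; the real app.py may differ.
from flask import Flask

app = Flask(__name__)  # Flask serves files from ./static by default

@app.route("/")
def index():
    # Return static/ripple.html as the interactive frontend
    return app.send_static_file("ripple.html")
```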
We welcome contributions! To contribute:
- Fork the repository.
- Create a feature branch:

  ```bash
  git checkout -b feature-name
  ```

- Commit your changes:

  ```bash
  git commit -m "Add feature-name"
  ```

- Push to the branch:

  ```bash
  git push origin feature-name
  ```
- Create a pull request.
This project is licensed under the MIT License.
Feel free to reach out if you have questions or feature suggestions. Happy coding! 🚀