A robust, embeddable chatbot designed for eCommerce websites to enhance customer engagement and optimize customer acquisition. This chatbot uses a Retrieval-Augmented Generation (RAG) pipeline to answer user queries based on uploaded files such as product catalogs, FAQs, and other resources.
- Accepts file uploads in PDF, TXT, or CSV format.
- Automatically processes uploaded files and integrates them into the chatbot's knowledge base.
- Answers user queries from that knowledge base.
- Supports follow-up questions by retaining conversational context.
- Returns concise, accurate answers.
- Provides a user-friendly chat interface for any eCommerce website.
- Embeds easily via a React-based frontend.
- Streamlit Admin Panel: Streamlit App
- React Chatbot Interface: Vercel Deployment
- Backend API Test: Render Deployment
git clone https://github.com/karthik738/e-commerce-chatbot
cd e-commerce-chatbot
- Python 3.8+
- Pinecone API Key
pip install -r requirements.txt
- Pinecone Setup: Set your Pinecone API key and environment in the backend/main.py file or as environment variables (a minimal initialization sketch follows this list):
  export PINECONE_API_KEY=<your_api_key>
  export PINECONE_ENVIRONMENT=<your_environment>
- Run the Backend:
  cd backend
  uvicorn main:app --host 0.0.0.0 --port 8000
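The exact Pinecone setup in backend/main.py is not shown here; the snippet below is only a minimal sketch of how the backend might read these variables and open an index. It assumes the classic pinecone-client (pre-3.0) initialization API, and the index name is a placeholder.

```python
# Minimal sketch (not the project's actual code): read Pinecone settings from
# the environment and open an index handle.
import os
import pinecone  # assumes the classic pinecone-client (< 3.0) API

pinecone.init(
    api_key=os.environ["PINECONE_API_KEY"],
    environment=os.environ["PINECONE_ENVIRONMENT"],
)
index = pinecone.Index("ecommerce-chatbot")  # index name is a placeholder
```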
- Install Streamlit:
  pip install streamlit
- Run the admin panel:
  cd streamlit_app_final
  streamlit run app.py
- Node.js and npm/yarn installed.
- Navigate to the chatbot_ui directory:
  cd chatbot_ui
- Install dependencies:
  npm install
- Configure API Endpoint: Update BASE_URL in api.js to point to your backend's deployment URL:
  const BASE_URL = "https://e-commerce-chatbot-1-ydcl.onrender.com";
- Start the Development Server:
  npm start
- Deploy the Chatbot UI: Use Vercel for deployment:
  npm install -g vercel
  vercel deploy
- File Upload: Files uploaded via the Streamlit admin panel are processed and indexed into Pinecone.
- Query Handling (a minimal sketch of this flow follows the list):
  - When a user asks a question, the chatbot retrieves the most relevant chunks from Pinecone.
  - A language model (LLM) then generates an answer based on the retrieved context.
- Answer Delivery: The chatbot sends back a concise, relevant answer.
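The sketch below illustrates this retrieve-then-generate flow. It is not the project's actual implementation: the OpenAI client, model names, index name, and the "text" metadata field are all assumptions made for illustration.

```python
# Illustrative RAG sketch: embed the question, retrieve context from Pinecone,
# then ask an LLM to answer using only that context.
import os
import pinecone              # assumes the classic pinecone-client (< 3.0) API
from openai import OpenAI    # assumed embedding/LLM provider

pinecone.init(
    api_key=os.environ["PINECONE_API_KEY"],
    environment=os.environ["PINECONE_ENVIRONMENT"],
)
index = pinecone.Index("ecommerce-chatbot")  # placeholder index name
client = OpenAI()                            # reads OPENAI_API_KEY from the environment

def answer(question: str) -> str:
    # Embed the question and fetch the most relevant chunks.
    vector = client.embeddings.create(
        model="text-embedding-3-small", input=question
    ).data[0].embedding
    results = index.query(vector=vector, top_k=5, include_metadata=True)
    # "text" is assumed to be the metadata field holding each chunk's content.
    context = "\n\n".join(m["metadata"]["text"] for m in results["matches"])

    # Generate an answer grounded in the retrieved context.
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return completion.choices[0].message.content
```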
- Deploy the chatbot to a service like Vercel.
- Copy the provided embed snippet:
<iframe src="https://e-commerce-chatbot-black.vercel.app/" width="400" height="600" style="border:none;" ></iframe>
- Add the iframe to your eCommerce website's HTML.
ecommerce-chatbot/
│
├── backend/ # Backend API with FastAPI
│ ├── main.py # Entry point for the backend
│ ├── services/ # File processing and embedding logic
│ └── requirements.txt # Backend dependencies
│
├── streamlit_app_final/ # Streamlit admin interface
│ ├── app.py # Entry point for Streamlit app
│ └── utils/ # Utility functions for file upload and query
│
├── chatbot_ui/ # React-based chatbot interface
│ ├── src/
│ ├── public/
│ └── package.json # React dependencies
│
└── README.md # Documentation
- Can the chatbot handle follow-up queries? Yes, it retains conversational history for seamless follow-ups (see the sketch after this list).
- How is the knowledge base updated? Upload updated files via the admin panel to refresh the knowledge base.
- What file formats are supported? PDF, TXT, and CSV.
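The snippet below is a generic sketch of how conversational history can be retained across turns; it is not the project's actual code, and answer_fn stands in for whatever retrieve-and-generate function the backend uses.

```python
# Illustrative sketch only: keep prior turns and pass them along with each new
# question so follow-ups can be resolved against earlier context.
from typing import Callable, Dict, List

def make_chat(answer_fn: Callable[[str, List[Dict[str, str]]], str]) -> Callable[[str], str]:
    history: List[Dict[str, str]] = []  # alternating user/assistant turns

    def chat(question: str) -> str:
        history.append({"role": "user", "content": question})
        reply = answer_fn(question, history)  # history supplies follow-up context
        history.append({"role": "assistant", "content": reply})
        return reply

    return chat
```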
We welcome contributions! Please open an issue to report bugs or suggest improvements, or submit a pull request.
This project is licensed under the MIT License. See the LICENSE file for details.