This project implements the PPO, DQN, and A2C reinforcement learning algorithms in Python and compares their performance on the CartPole-v1 environment.
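The repository's own training scripts define the actual experiment setup; purely as an illustration of this kind of comparison, the sketch below trains each of the three agents briefly on CartPole-v1 and reports mean episode reward. It assumes the stable-baselines3 and gymnasium packages, which are not confirmed by this README and may differ from the repo's own implementation.

```python
# Hypothetical sketch of a PPO/DQN/A2C comparison on CartPole-v1.
# Assumes stable-baselines3; the repository may implement the
# algorithms from scratch instead.
from stable_baselines3 import A2C, DQN, PPO
from stable_baselines3.common.evaluation import evaluate_policy

for algo in (PPO, DQN, A2C):
    # Train each agent with default hyperparameters.
    model = algo("MlpPolicy", "CartPole-v1", verbose=0)
    model.learn(total_timesteps=50_000)
    # Evaluate over 20 episodes; CartPole-v1 caps episode reward at 500.
    mean_reward, std_reward = evaluate_policy(
        model, model.get_env(), n_eval_episodes=20
    )
    print(f"{algo.__name__}: {mean_reward:.1f} +/- {std_reward:.1f}")
```

A mean reward near 500 indicates the agent has effectively solved the environment. To set up the project locally: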
- Clone the repository:

  ```bash
  git clone https://github.com/namrapatel/PPO
  cd PPO
  ```
- Install dependencies: ensure you have Python 3.8+ installed, then install the required packages (a quick smoke test to verify the install follows this list):

  ```bash
  pip install -r requirements.txt
  ```
- Run the training scripts to reproduce the comparison (see the repository for the exact entry-point commands).
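Once the dependencies are installed, a minimal random-policy rollout can confirm that the environment loads. This sketch assumes the Gymnasium API, which may differ from the gym version pinned in requirements.txt.

```python
# Smoke test: run one random-policy episode on CartPole-v1.
# Assumes the gymnasium package; the repo's requirements.txt
# defines the actual dependency versions.
import gymnasium as gym

env = gym.make("CartPole-v1")
obs, info = env.reset(seed=0)
done = False
total_reward = 0.0
while not done:
    # Sample random actions until the pole falls or the time limit is hit.
    obs, reward, terminated, truncated, info = env.step(env.action_space.sample())
    total_reward += reward
    done = terminated or truncated
env.close()
print(f"Random-policy episode reward: {total_reward}")
```

If this prints a small reward (random play typically survives only a few dozen steps), the installation is working.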