Custom PPO vs. DQN and A2C on CartPole-v1

This project implements a custom PPO agent in Python and compares its performance against DQN and A2C on the CartPole-v1 environment.
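The headline feature is the custom PPO implementation, whose defining update is the clipped surrogate objective. Below is a minimal sketch of that loss, assuming PyTorch; the function name and tensor layout are illustrative and not taken from this repository (the 0.2 clip range is the default suggested in the PPO paper).

```python
# Minimal sketch of PPO's clipped surrogate loss; illustrative only,
# not this repository's actual implementation.
import torch

def ppo_clip_loss(log_probs_new: torch.Tensor,
                  log_probs_old: torch.Tensor,
                  advantages: torch.Tensor,
                  clip_eps: float = 0.2) -> torch.Tensor:
    # Probability ratio r_t(theta) = pi_theta(a_t|s_t) / pi_theta_old(a_t|s_t)
    ratio = torch.exp(log_probs_new - log_probs_old)
    # Unclipped and clipped surrogate terms
    unclipped = ratio * advantages
    clipped = torch.clamp(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantages
    # Pessimistic bound: elementwise minimum, negated for gradient descent
    return -torch.min(unclipped, clipped).mean()
```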

Setup Instructions

  1. Clone the Repository:
     git clone https://github.com/namrapatel/PPO
     cd PPO
  2. Install Dependencies: Ensure you have Python 3.8+ installed, then install the required packages:
     pip install -r requirements.txt
  3. Run: Train the PPO, DQN, and A2C agents and compare their results (see the sketch after this list for what a comparison run might look like).
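For a sense of what a run comparing the three algorithms might look like, here is a hedged sketch using stable-baselines3's off-the-shelf PPO, DQN, and A2C as stand-ins. The repository trains its own custom PPO, so this is not the project's actual entry point, and the 50,000-step budget and 20 evaluation episodes are arbitrary illustrative choices.

```python
# Illustrative comparison using stable-baselines3 baselines; the repository's
# custom PPO would replace the off-the-shelf PPO here.
import gymnasium as gym
from stable_baselines3 import A2C, DQN, PPO
from stable_baselines3.common.evaluation import evaluate_policy

for algo in (PPO, DQN, A2C):
    env = gym.make("CartPole-v1")
    model = algo("MlpPolicy", env, verbose=0)
    model.learn(total_timesteps=50_000)  # arbitrary training budget
    mean_reward, std_reward = evaluate_policy(model, env, n_eval_episodes=20)
    print(f"{algo.__name__}: {mean_reward:.1f} +/- {std_reward:.1f}")
    env.close()
```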
