
VERL Setup for Dojo Integration

This repository contains the files needed to set up and run VERL with Dojo integration.

Prerequisites

  • Machine with 8xA100 GPUs
  • Docker with NVIDIA runtime support
  • Git

Quick Start

  1. Clone this repository:
git clone <this-repo-url>
cd verl-files
  2. Choose your backend and start the Docker container:
# For the SGLang backend
./scripts/start-slang.sh

# OR for the VLLM backend
./scripts/start-vllm.sh
  3. Connect to the container, either with the helper script or manually:
# Helper script (execs into the container and runs setup automatically, installing dependencies)
./scripts/exec.sh

# OR exec in manually:
# For the SGLang backend
docker exec -it verl bash

# OR for the VLLM backend
docker exec -it verl_vllm bash
  4. If you connected manually, run setup inside the container:
./scripts/setup.sh
  5. Prepare the data (if not already done by train.sh):
python dojo-src/prepare.py --local_dir ~/data/dojo
  6. Start training (see the sanity checks after this list):
./scripts/train.sh
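Once training is running, a few quick checks inside the container can confirm that the setup worked. This is a minimal sketch; it assumes the prepared data landed in ~/data/dojo (step 5) and that verl_dojo_grpo.log is written to the current working directory (see Notes below):

# Confirm the prepared dataset exists
ls ~/data/dojo

# Confirm all 8 GPUs are visible inside the container
nvidia-smi

# Follow training progress
tail -f verl_dojo_grpo.log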

Configuration

  • dojo-config/tools.yaml: Defines the computer use tool schema
  • dojo-config/agent.yaml: Agent loop configuration
  • dojo-src/reward.py: Custom reward function implementation
  • dojo-src/tool.py: Computer use tool implementation
  • dojo-src/loop.py: Agent loop implementation
  • dojo-src/prepare.py: Data preparation script
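As a rough illustration of how these files typically come together, the sketch below shows the kind of Hydra overrides a verl GRPO run might use to reference them. The exact keys, file names, and entry point are assumptions rather than excerpts from this repo; scripts/train.sh contains the actual invocation.

# Hypothetical sketch of wiring the config files into a verl run (keys are assumptions)
python3 -m verl.trainer.main_ppo \
    algorithm.adv_estimator=grpo \
    data.train_files=$HOME/data/dojo/train.parquet \
    data.val_files=$HOME/data/dojo/test.parquet \
    custom_reward_function.path=dojo-src/reward.py \
    actor_rollout_ref.rollout.multi_turn.tool_config_path=dojo-config/tools.yaml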

Environment Variables

Set these before training (or modify in scripts/train.sh):

  • DOJO_API_KEY: Your Dojo API key
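For example, to set the key for a single run (placeholder value; replace with your actual key):

export DOJO_API_KEY=<your-dojo-api-key>
./scripts/train.sh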

Scripts

  • scripts/setup.sh: Installs VERL and its dependencies
  • scripts/train.sh: Prepares the data (if needed) and starts GRPO training
  • scripts/start-slang.sh: Starts the Docker container with the SGLang backend
  • scripts/start-vllm.sh: Starts the Docker container with the VLLM backend
  • scripts/exec.sh: Helper that execs into the container and runs setup

Notes

  • Files are preserved between Docker runs via a volume mount at /workspace/verl
  • The configuration uses the GRPO algorithm with multi-turn support
  • Training logs are saved to verl_dojo_grpo.log
  • The SGLang container is named verl; the VLLM container is named verl_vllm
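Two host-side commands that follow from these notes; the in-container log path is an assumption based on the /workspace/verl mount, so adjust it if train.sh writes the log elsewhere:

# List whichever backend container is running
docker ps --filter "name=verl"

# Follow the training log from the host
docker exec verl tail -f /workspace/verl/verl_dojo_grpo.log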
