
Running Pose Sampler

Pose Sampler

This page describes how to train a pose sampling network using the costs calculated by PERCH 2.0.

  1. Follow the steps under "Using Docker Image" in the Running-With-Docker wiki to set up PERCH 2.0 and MaskRCNN with Docker.

  2. Clone the sampling CNN repo:

git clone https://github.com/SBPL-Cruz/perch_pose_sampler
  3. Start the Docker container and mount the cloned folder into it at /pose_sampler (a sketch of one way to do this is shown below).
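The exact docker run invocation comes from the Running-With-Docker wiki; the sketch below only illustrates the extra volume mount for the cloned repo. The image name <perch_image> and the host paths are placeholders you should replace, and depending on your Docker/NVIDIA setup you may need --runtime=nvidia instead of --gpus all.
# Sketch only: replace <perch_image> and the /path/to/... host paths.
# The dataset mount matches the /data/YCB_Video_Dataset path used by the
# training command in step 5.
docker run -it --gpus all \
  -v /path/to/perch_pose_sampler:/pose_sampler \
  -v /path/to/YCB_Video_Dataset:/data/YCB_Video_Dataset \
  <perch_image> bash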
  4. After getting into the Docker shell, run the following to make sure all required Python modules are on the PYTHONPATH:
export PYTHONPATH=$PYTHONPATH:/pose_sampler/
export PYTHONPATH=$PYTHONPATH:/ros_python3_ws/src/perception/sbpl_perception/src/scripts/tools/fat_dataset
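As an optional sanity check (not part of the original instructions), you can print Python's module search path to confirm both directories were picked up:
# Both /pose_sampler and the fat_dataset scripts directory should appear in the output.
python -c "import sys; print('\n'.join(sys.path))"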
  5. Run the training code:
# For visualizing poses during training; skip this if not using the CPU version:
Xvfb :5 -screen 0 800x600x24 &
export DISPLAY=:5
cd /pose_sampler/utils
python train_classification.py \
  --dataset /data/YCB_Video_Dataset \
  --dataset_type ycb \
  --dataset_annotation /data/YCB_Video_Dataset/instances_train_bbox_pose_sampler.json \
  --test_dataset_annotation /data/YCB_Video_Dataset/instances_keyframe_bbox_pose_sampler.json \
  --batchsize 40 --nepoch 50 --render_poses
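Before launching a long run, it can help to confirm that a GPU is visible inside the container. The check below assumes the trainer is PyTorch-based, which this page does not state, so treat it as a hypothetical, optional step:
# Hypothetical check, assuming a PyTorch-based trainer; prints True if CUDA is usable.
python -c "import torch; print(torch.cuda.is_available())"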
  6. Run TensorBoard outside Docker to visualize training progress in the browser:
cd perch_pose_sampler/utils
tensorboard --logdir experiments
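TensorBoard serves on port 6006 by default, so the dashboard should be reachable at http://localhost:6006 once it starts. The experiments directory is assumed here to be where the training run from step 5 writes its logs.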