
Running Pose Sampler


Creating Training and Testing Data

The data is created by running the PERCH 2.0 6-DoF flow on the YCB Video Dataset. For now, it works with one object at a time only:

  1. Follow the steps in the Running-With-Docker Wiki under "Using Docker Image" to set up PERCH 2.0 and MaskRCNN with Docker (make sure you are able to run Step 12 before going further).

  2. Make a new folder outside Docker for storing sampler data:

mkdir -p pose_sampler_data/sugar/test
mkdir -p pose_sampler_data/sugar/train
  3. Mount the above folder to /data/pose_sampler_data while running Docker (a minimal example of the mount follows the points below).

    • If you are creating training data, config_docker.yaml should point to the training COCO annotation file
    • If you are creating test data, config_docker.yaml should point to the test COCO annotation file
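For reference, a minimal sketch of such a mount, assuming the image is tagged perch:latest (substitute the actual image name and flags from the Running-With-Docker Wiki):

docker run -it \
    -v $(pwd)/pose_sampler_data:/data/pose_sampler_data \
    perch:latest bash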
  4. Next, check the run_ycb_6d function in fat_pose_image.py (an illustrative sketch of these settings follows the points below):

    • It should be set to run a single required object
    • If creating training data, the scene range should be 0 to 93
    • If creating test data, the scene range should be 48 to 60
    • The image range can be set as per requirement
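As a rough illustration, the settings to check might look like this (the variable names here are hypothetical, not the actual ones in fat_pose_image.py):

# Hypothetical sketch -- check the actual variable names in run_ycb_6d()
required_objects = ["004_sugar_box"]   # a single required object
scene_range = range(0, 93)             # 0 to 93 for training data
# scene_range = range(48, 60)          # 48 to 60 for test data
image_range = range(0, 100)            # set as per requirement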
  5. Run the code from inside Docker:

python fat_pose_image.py --config config_docker.yaml
  6. The PERCH C++ code will dump its outputs, as well as the network data, in JSON files in the `perch_outputs` folder.
  7. Once you are done running the code, copy the folders corresponding to the images from `perch_outputs` to the required train or test folder. **Make sure there are no stray run folders in `perch_outputs` if you are copying everything in the folder:**
cp -r perch_outputs/* pose_sampler_data/sugar/train
  8. Once the folders for both train and test are copied, you can run the convert_fat_coco.py script to convert the data to COCO format, which can be used for training the network. The script will go through each pose in each scene and assign a score out of 1 using the cost computed by PERCH. It will also discretize the pose using viewpoints and in-plane rotations (see the sketch after the configuration points below).

Look for the code section on DATASET_TYPE = "ycb_sampler":

  • Make sure this is the only section set to True and everything else is set to False
  • For testing, the testing section should be uncommented (output file: instances_keyframe_bbox_pose_sampler)
  • For training, the training section should be uncommented (output file: instances_train_bbox_pose_sampler)
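As an illustration of the scoring and discretization, a minimal sketch (the function names, the normalization, and the bin count are assumptions, not the actual convert_fat_coco.py code):

# Illustrative sketch only -- see convert_fat_coco.py for the real logic
def cost_to_score(cost, max_cost):
    # Lower PERCH cost maps to a higher score in [0, 1]
    return 1.0 - min(cost / max_cost, 1.0)

def discretize_pose(viewpoint_id, inplane_deg, num_inplane_bins=18):
    # Bin the in-plane rotation and combine it with the viewpoint index
    # into a single discrete pose label
    inplane_bin = int(inplane_deg // (360.0 / num_inplane_bins)) % num_inplane_bins
    return viewpoint_id * num_inplane_bins + inplane_bin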
  9. Run the convert script to create the JSON files containing the annotations in the YCB_Video_Dataset folder:
python convert_fat_coco.py
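To sanity-check the generated annotations, you can load the JSON with the standard json module (this assumes the usual COCO layout with images and annotations keys):

import json
with open("/data/YCB_Video_Dataset/instances_train_bbox_pose_sampler.json") as f:
    coco = json.load(f)
print(len(coco["images"]), "images,", len(coco["annotations"]), "annotations")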

Training Network

Training a pose sampling network using the costs calculated by PERCH 2.0.

  1. Follow the steps in the Running-With-Docker Wiki under "Using Docker Image" to set up PERCH 2.0 and MaskRCNN with Docker.

  2. Clone the sampling CNN repo:

git clone https://github.com/SBPL-Cruz/perch_pose_sampler
  3. Start the Docker image and mount the cloned folder into Docker at /pose_sampler.
  4. After getting into the Docker shell, run the following to make sure all Python modules are on the PYTHONPATH:
export PYTHONPATH=$PYTHONPATH:/pose_sampler/
export PYTHONPATH=$PYTHONPATH:/ros_python3_ws/src/perception/sbpl_perception/src/scripts/tools/fat_dataset
  5. Run the training code:
# For visualizing poses during training:
Xvfb :5 -screen 0 800x600x24 & export DISPLAY=:5;  # skip this if not using the CPU version
cd /pose_sampler/utils
python train_classification.py \
  --dataset /data/YCB_Video_Dataset \
  --dataset_type ycb \
  --dataset_annotation /data/YCB_Video_Dataset/instances_train_bbox_pose_sampler.json \
  --test_dataset_annotation /data/YCB_Video_Dataset/instances_keyframe_bbox_pose_sampler.json \
  --batchsize 40 --nepoch 50 --render_poses
  6. Run TensorBoard outside Docker to visualize the training in the browser:
cd perch_pose_sampler/utils
tensorboard --logdir experiments
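TensorBoard serves on port 6006 by default, so the plots should be available at http://localhost:6006 in your browser.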