An object recognition system implemented with ROS and PCL, using point cloud data from a Kinect sensor.
The system follows the ROS node architecture to implement its functionality. Nodes implemented in the system:
- ransac (floor removal)
- downsample (point cloud reduction using a VoxelGrid filter)
- clustering (object segmentation using Euclidean distance)
- association (temporal object association using KNN)
- vfh-tracker (object classification using VFH and KNN)
- snapshot (model creation)
- viewer (object viewer)
- box-tracker (bounding box viewer)
This work requires ROS; to install ROS, follow these steps. The openni package is needed to read Kinect data; its installation tutorial can be found on the package page.
To run openni:
roslaunch openni_launch openni.launch
This launcher initializes the roscore and the openni nodes. For data visualization, the rviz package can be used:
rosrun rviz rviz
The ransac node applies the RANSAC approach to remove the points belonging to the floor from the cloud.
command: rosrun kinect ransac [-i]
The optional -i parameter is an inverter: it returns only the floor points.
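As a rough illustration, the floor removal step could look like the following PCL sketch; the removeFloor helper, its 2 cm distance threshold, and the inversion handling are assumptions for illustration, not the node's actual code.

```cpp
// Minimal sketch of RANSAC floor removal with PCL (parameters are assumptions).
#include <pcl/point_types.h>
#include <pcl/ModelCoefficients.h>
#include <pcl/PointIndices.h>
#include <pcl/segmentation/sac_segmentation.h>
#include <pcl/filters/extract_indices.h>

pcl::PointCloud<pcl::PointXYZ>::Ptr
removeFloor(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr &cloud, bool invert)
{
  pcl::ModelCoefficients::Ptr coefficients(new pcl::ModelCoefficients);
  pcl::PointIndices::Ptr inliers(new pcl::PointIndices);

  // Fit the dominant plane (assumed to be the floor) with RANSAC.
  pcl::SACSegmentation<pcl::PointXYZ> seg;
  seg.setModelType(pcl::SACMODEL_PLANE);
  seg.setMethodType(pcl::SAC_RANSAC);
  seg.setDistanceThreshold(0.02);        // 2 cm tolerance (illustrative value)
  seg.setInputCloud(cloud);
  seg.segment(*inliers, *coefficients);

  // Remove the plane inliers; with -i (invert == true) keep only the floor.
  pcl::PointCloud<pcl::PointXYZ>::Ptr output(new pcl::PointCloud<pcl::PointXYZ>);
  pcl::ExtractIndices<pcl::PointXYZ> extract;
  extract.setInputCloud(cloud);
  extract.setIndices(inliers);
  extract.setNegative(!invert);          // true = drop the floor, keep the rest
  extract.filter(*output);
  return output;
}
```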
The downsample node uses the VoxelGrid filter implemented in PCL, with a 3 cm voxel size.
command: rosrun kinect downsample
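A minimal sketch of this step with PCL's VoxelGrid filter, using the 3 cm leaf size mentioned above (the downsample helper itself is illustrative):

```cpp
// Minimal VoxelGrid downsampling sketch (3 cm leaf size as described above).
#include <pcl/point_types.h>
#include <pcl/filters/voxel_grid.h>

pcl::PointCloud<pcl::PointXYZ>::Ptr
downsample(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr &cloud)
{
  pcl::PointCloud<pcl::PointXYZ>::Ptr output(new pcl::PointCloud<pcl::PointXYZ>);
  pcl::VoxelGrid<pcl::PointXYZ> grid;
  grid.setInputCloud(cloud);
  grid.setLeafSize(0.03f, 0.03f, 0.03f); // 3 cm voxels in x, y, z
  grid.filter(*output);
  return output;
}
```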
The clustering node uses the Euclidean distance approach, also implemented in PCL. It receives a downsampled cloud and returns each object segmented from the scene in a different channel. The number of clusters is limited to 10, each cluster must contain between 50 and 1000 points, and the distance threshold is 15 cm.
command: rosrun kinect clustering
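A sketch of this segmentation with PCL's EuclideanClusterExtraction, using the limits listed above; truncating the result to 10 clusters is an assumption about how that limit is enforced.

```cpp
// Euclidean clustering sketch using the limits described above.
#include <vector>
#include <pcl/point_types.h>
#include <pcl/search/kdtree.h>
#include <pcl/segmentation/extract_clusters.h>

std::vector<pcl::PointIndices>
clusterObjects(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr &cloud)
{
  pcl::search::KdTree<pcl::PointXYZ>::Ptr tree(new pcl::search::KdTree<pcl::PointXYZ>);
  tree->setInputCloud(cloud);

  std::vector<pcl::PointIndices> clusters;
  pcl::EuclideanClusterExtraction<pcl::PointXYZ> ec;
  ec.setClusterTolerance(0.15);   // 15 cm distance threshold
  ec.setMinClusterSize(50);       // ignore clusters smaller than 50 points
  ec.setMaxClusterSize(1000);     // ignore clusters larger than 1000 points
  ec.setSearchMethod(tree);
  ec.setInputCloud(cloud);
  ec.extract(clusters);           // one PointIndices entry per segmented object

  // The node keeps at most 10 clusters (one output channel per object).
  if (clusters.size() > 10)
    clusters.resize(10);
  return clusters;
}
```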
The association node uses the k-nearest-neighbors approach: each segment found in the clustering step is associated over time.
command: rosrun kinect association
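One way to picture the temporal association is 1-nearest-neighbor matching of cluster centroids between consecutive frames; the centroid representation, the kd-tree search, and the 30 cm gate below are illustrative assumptions, not necessarily how the node is implemented.

```cpp
// Illustrative sketch: associate current cluster centroids with the previous
// frame's centroids via a 1-nearest-neighbor kd-tree search (assumed scheme).
#include <vector>
#include <pcl/point_types.h>
#include <pcl/kdtree/kdtree_flann.h>

// Returns, for each current centroid, the index of the closest previous
// centroid, or -1 when no previous centroid is close enough.
std::vector<int>
associate(const pcl::PointCloud<pcl::PointXYZ>::Ptr &previous,
          const pcl::PointCloud<pcl::PointXYZ>::Ptr &current,
          float max_distance = 0.3f)   // 30 cm gate (illustrative value)
{
  std::vector<int> matches(current->size(), -1);
  if (previous->empty())
    return matches;

  pcl::KdTreeFLANN<pcl::PointXYZ> tree;
  tree.setInputCloud(previous);

  std::vector<int> index(1);
  std::vector<float> sq_dist(1);
  for (std::size_t i = 0; i < current->size(); ++i)
  {
    if (tree.nearestKSearch(current->points[i], 1, index, sq_dist) > 0 &&
        sq_dist[0] <= max_distance * max_distance)
      matches[i] = index[0];
  }
  return matches;
}
```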
The vfh-tracker node uses the VFH descriptor from PCL to create a feature histogram for each associated object, and a KNN approach to classify the histogram; the chi-square distance is used to measure the distance between histograms. For this node to work, a training set with models of the object you want to track is needed. As a result, a red bounding box is drawn on the scene to show the tracked object.
command: rosrun kinect vfh-tracker '/model/path/folder'
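The per-object descriptor can be sketched with PCL's VFHEstimation, which requires surface normals as input; the computeVFH helper and the 3 cm normal-estimation radius are illustrative assumptions.

```cpp
// Sketch of computing a VFH histogram (308 bins) for one segmented object.
#include <pcl/point_types.h>
#include <pcl/features/normal_3d.h>
#include <pcl/features/vfh.h>
#include <pcl/search/kdtree.h>

pcl::PointCloud<pcl::VFHSignature308>::Ptr
computeVFH(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr &object)
{
  pcl::search::KdTree<pcl::PointXYZ>::Ptr tree(new pcl::search::KdTree<pcl::PointXYZ>);

  // Estimate surface normals, required by the VFH descriptor.
  pcl::PointCloud<pcl::Normal>::Ptr normals(new pcl::PointCloud<pcl::Normal>);
  pcl::NormalEstimation<pcl::PointXYZ, pcl::Normal> ne;
  ne.setInputCloud(object);
  ne.setSearchMethod(tree);
  ne.setRadiusSearch(0.03);   // 3 cm neighborhood (illustrative value)
  ne.compute(*normals);

  // Compute one global VFH histogram for the whole object cluster.
  pcl::PointCloud<pcl::VFHSignature308>::Ptr vfh(new pcl::PointCloud<pcl::VFHSignature308>);
  pcl::VFHEstimation<pcl::PointXYZ, pcl::Normal, pcl::VFHSignature308> vfh_est;
  vfh_est.setInputCloud(object);
  vfh_est.setInputNormals(normals);
  vfh_est.setSearchMethod(tree);
  vfh_est.compute(*vfh);      // result: a single VFHSignature308 point
  return vfh;
}
```

For the classification step, FLANN's chi-square distance (flann::ChiSquareDistance) is a common choice for comparing VFH histograms against a training set, as in PCL's cluster recognition tutorial.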