
Teleoperation Hub

A general project combining different methods for teleoperating Reachy2.

You can find the article on the development of the project and the technical details on Medium.

Methods available

There are currently 5 methods available:

  • the custom controller with the Vive tracker
  • the custom controller with the ArUco cube
  • the RGBD camera
  • the SOARM-100 robotic arm
  • the gamepad

(Images of the five setups: Vive tracker, ArUco cube, RGBD camera, SOARM-100 arm, gamepad)

They all have their pros and cons, which you can read about in the article, or find out for yourself by testing them.

How to install this repository?

  1. Clone the repository

     git clone https://github.com/pollen-robotics/teleop_hub.git
    
  2. Install the dependencies, according to which modalities and robot you want to use (we recommend doing this in a virtual environment):

     pip install -e ".[_modalities_]"
    

    For example, if you want to install the required libraries for Reachy2, the Vive tracker and the RGBD camera: pip install -e ".[reachy2,vive,rgbd]"

    The options are:

    • reachy2
    • vive
    • aruco
    • rgbd
    • arm
    • gamepad
    • all
  3. Apply the udev rules:

     setup-udev
    

Be careful: if you're using the RGBD modality with the supplied Orbbec class, you need to install the pyorbbecsdk library manually. The instructions are in the RGBD Camera section below.

How to use it?

The project configuration file is used to select a particular teleoperation mode. You can therefore modify it according to which modality you want to use.

To do this, go to the project's config.yaml file:

cd src/noVR_teleoperation
nano config.yaml 

The first four parameters are essential to set up the project:

  • robot_ip:

    You can replace localhost with the IP address of your robot - if you don't know how to find your Reachy2's IP address, refer to the Reachy2 documentation.

  • tracker_type: the modality you want to use (among 'aruco', 'vive', 'rgbd', 'arm', 'gamepad')

  • control_mode: some of the methods (Vive tracker, ArUco cube, SOARM-100) can control either a single arm or both arms: specify 'dual_arm', 'l_arm' (for the left arm) or 'r_arm' (for the right arm).

  • mirror_mode: if set to true, you control the robot in mirror mode, i.e. face to face: your right arm controls its left arm and vice versa. Otherwise, the robot is controlled normally.
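The mirror_mode mapping described above can be sketched in a few lines of Python (the names Pose and map_operator_to_robot are illustrative, not the project's actual identifiers; the assumption is that mirroring swaps arms and flips the lateral axis):

```python
# Illustrative sketch of the mirror_mode arm mapping -- not the project's code.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float  # forward/backward
    y: float  # left/right (lateral axis)
    z: float  # up/down

def map_operator_to_robot(side: str, pose: Pose, mirror_mode: bool):
    """Return (robot_arm, pose) for one operator arm reading."""
    if not mirror_mode:
        return side, pose  # right arm drives right arm, etc.
    # Face-to-face: swap arms and flip the lateral axis.
    opposite = "l_arm" if side == "r_arm" else "r_arm"
    return opposite, Pose(pose.x, -pose.y, pose.z)

arm, p = map_operator_to_robot("r_arm", Pose(0.3, -0.1, 0.2), mirror_mode=True)
```

With mirror_mode enabled, the operator's right arm here drives the robot's left arm, with the lateral component negated.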

The rest of the parameters are specific to each teleoperation technique; we describe the set-up of each one in detail below.
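For reference, a minimal sketch of what these first parameters might look like in config.yaml (example values only; check the actual file for the exact layout):

```yaml
# Example values only -- adjust to your setup.
robot_ip: localhost      # or your Reachy2's IP address
tracker_type: vive       # 'aruco', 'vive', 'rgbd', 'arm' or 'gamepad'
control_mode: dual_arm   # 'dual_arm', 'l_arm' or 'r_arm'
mirror_mode: false       # true = face-to-face control
```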

Don't forget to press CTRL+X then Y to save config.yaml and exit.

Then you can launch the teleoperation program: python -m teleoperation

Be careful: the robot will bend its arms to 90° when the teleoperation is launched, so make sure there are no obstacles in its path (if it is too close to a table, for example).

What about teleoperating other robots?

For the moment, only Reachy2 is available, as this project was developed around it, but there are plans to add other robots.

If you'd like to add your own robot, it is possible! Add a child class of the Robot class (in the robot folder), adjusting the various methods for controlling the robot's parts, and change the initialization of the Teleoperation class in the teleoperation.py file, at the self.robot level. Feel free to test it and to suggest your additions!
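As a rough sketch of the idea, a new robot subclass could look like the following. The Robot base class below is a self-contained stand-in with made-up method names; check the actual class in the robot folder for the real interface:

```python
# Illustrative only: this stand-in base mimics the kind of interface a
# teleoperation Robot class might expose; the real method names may differ.
from abc import ABC, abstractmethod

class Robot(ABC):
    @abstractmethod
    def connect(self, ip: str) -> None: ...
    @abstractmethod
    def set_arm_pose(self, side: str, pose) -> None: ...
    @abstractmethod
    def set_gripper(self, side: str, opening: float) -> None: ...

class MyRobot(Robot):
    """Hypothetical child class for a new robot."""
    def __init__(self):
        self.connected = False
        self.last_command = None

    def connect(self, ip: str) -> None:
        # Replace with your robot SDK's connection call.
        self.connected = True

    def set_arm_pose(self, side: str, pose) -> None:
        # Forward the target pose to your robot's controller here.
        self.last_command = ("pose", side, pose)

    def set_gripper(self, side: str, opening: float) -> None:
        # Forward the gripper opening to your robot's controller here.
        self.last_command = ("gripper", side, opening)
```

In teleoperation.py you would then initialize self.robot with your class instead of the Reachy2 one.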

How to set up each modality

Vive Tracker & ArUco cube

These two techniques use the SOARM-100 leader joystick, which we have slightly modified:

  • a support added on top, to switch easily between the Vive tracker (screwed onto it) and the ArUco cube.
  • a joystick and buttons added, so you can control several parts and switch modes with an Arduino system.
  • a tunable trigger, so you can use either a Feetech motor or a potentiometer.

You can find the tutorial for creating the controller yourself below.

How to make the custom controller

There are two types of controllers depending on the gripper type you want to use:

Potentiometer controller (work in progress)

This is a DIY controller. Two variants are possible:

  • A version using a Vive Tracker
  • A low-cost version using an ArUco cube

BOM

ArUco Controller (~25 €)

| Component          | Quantity | Unit Price (€) | Total (€) |
|--------------------|----------|----------------|-----------|
| Arduino Nano Every | 1        | 14.00          | 14.00     |
| Potentiometer      | 1        | 1.00           | 1.00      |
| Push Button        | 2        | 0.25           | 0.50      |
| Joystick Module    | 1        | 3.00           | 3.00      |
| Micro USB Cable    | 1        | 2.00           | 2.00      |
| PLA Filament       | ~130 g   | -              | 4.00      |
| Estimated Total    |          |                | 24.50     |

Vive Controller (~266 €)

| Component          | Quantity | Unit Price (€) | Total (€) |
|--------------------|----------|----------------|-----------|
| Vive Tracker       | 1        | 100.00         | 100.00    |
| Vive Lighthouse    | 1        | 140.00         | 140.00    |
| Arduino Nano Every | 1        | 14.00          | 14.00     |
| Push Button        | 2        | 0.25           | 0.50      |
| Joystick Module    | 1        | 3.00           | 3.00      |
| Micro USB Cable    | 1        | 2.00           | 2.00      |
| PLA Filament       | ~80 g    | -              | 2.50      |
| Estimated Total    |          |                | 266.00    |

💡 One Lighthouse can track two Vive Trackers → ~432 € for two controllers.


3D Printed Parts

You can find all the STL files in the project_resources folder

For each controller, print the following:

Core parts:

  • side_potentiometer_controller1 ×1
  • side_potentiometer_controller2 ×1
  • side_potentiometer_controller3 ×1
  • side_trigger ×1
  • button ×2
  • button_support ×1

Tracking-specific top:

  • ArUco version: side_aruco_controller
  • Vive version: side_vive_controller

Replace side with left or right depending on which controller side you want.


Assembly Instructions

1. Print the controller parts

Use your preferred slicer and printer to produce the components listed above.

2. Assemble the controller

This step currently requires some soldering.

  • Assemble the push buttons

  • Attach the buttons to the controller handle

  • Mount the joystick

  • Mount the potentiometer

  • Connect the components according to the provided wiring diagram (see below)

  • Mount the Arduino board

  • Close the handle & add the trigger

3. Upload the code

  • Open the PlatformIO project
  • Connect the Arduino Nano Every to your computer via USB
  • Upload the firmware to the board

Feetech controller (coming soon)

Work in progress.

Vive Tracker

This modality requires one Vive tracker per controller used (you can teleoperate one or both arms), a Vive base station for the tracker to be detected, and SteamVR to get the data.

Download Steam & SteamVR and enable the headset-free mode:

  1. Install Steam, as well as the Python and OpenVR dependencies (we recommend using a virtual environment):

     sudo apt-get install steam libsdl2-dev libvulkan-dev libudev-dev libssl-dev zlib1g-dev python-pip

  2. Create a Steam account and log in.

  3. Install SteamVR: click on Library > VR > Tools > SteamVR

  4. Make a symbolic link so that SteamVR finds libudev.so.0 (pointing to libudev.so.1):

     sudo ln -s /lib/x86_64-linux-gnu/libudev.so.1 /lib/x86_64-linux-gnu/libudev.so.0

  5. Install pyopenvr: python -m pip install openvr

  6. Disable the headset requirement: there are 2 files to modify, using these commands in a terminal:

    a. gedit ~/.steam/steam/steamapps/common/SteamVR/resources/settings/default.vrsettings

    Change the value of "requireHmd" to false, "forcedDriver" to null, and "activateMultipleDrivers" to true.

    b. gedit ~/.steam/steam/steamapps/common/SteamVR/drivers/null/resources/settings/default.vrsettings

    Change the value of "enable" to true.

Be careful that SteamVR updates can sometimes overwrite changes to files. If the base station and trackers are no longer detected, don't hesitate to check that the changes are still there. If not, edit again and restart SteamVR.

To launch the teleoperation, you need:

  • SteamVR running
  • the Vive base station plugged in, at least 1 m away from the trackers, with no obstacles in the way (avoid being too close to a computer, which can interfere with the signal)
  • detected trackers - you can find out more about pairing trackers on the Vive website

You need to find the names of your Vive trackers to put them in the config file:

  1. Check that your tracker is detected in SteamVR.

  2. If you have two trackers, put them in front of the base station, the left tracker on the left side and the right tracker on the right side.

  3. Execute the triad_openvr script to get the names:

cd src/noVR_teleoperation/trackers/vive_trackers
python3 triad_openvr.py

The names will be printed in your terminal.

  4. Copy/paste them into the config file:

cd ../..
nano config.yaml 

In the vive_trackers section, put the left tracker in l_arm and the right tracker in r_arm, and save.

ArUco cube

What you need is:

  • an ArUco cube
  • a camera

1. The cube

You need to print the provided cube (left or right_aruco_controller in the project_resources folder), in white for better detection. Then, you have to print the ArUco markers: the PDFs for the markers are in the project_resources folder too. They are arranged so that the middle face goes on the top face of the cube when the joystick handle is facing you, and the other faces are folded down on either side of the top face.

You can customise your ArUco cubes yourself. Use an online ArUco marker generator to produce the sheet with the markers (we use the ArUco 6x6 dictionary). Then, paste them at the center of each face, as shown in the diagram below, with the cross representing the top left corner of the marker.

cube back cube front

You can adjust the marker ids (order is: back, up, front, left, right, down) and their size directly in the config file, in the ARUCO section.

2. The camera

You can use a camera or your smartphone. To select it, you need to change the config file in the ARUCO section.

If you're using a camera:

  • set usb_mode to true
  • set camera_id: the integrated camera is '0'; for an external camera, the number is its index in the order of USB-connected devices.

If you're using your smartphone:

  • install the app "IP Webcam" on your phone, then tap the 3 dots and "Start server"
  • set usb_mode to false
  • set camera_id: use the IPv4 address shown in the app (between http:// and :8080, not included - it should be something like '172.16.0.10')
  • make sure your phone and computer are on the same network, and that your phone stays unlocked.
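The two cases above come down to building a different capture source for OpenCV; a minimal sketch (the function name is illustrative, and the http://<ip>:8080/video URL is the usual IP Webcam stream endpoint - verify it in the app):

```python
def capture_source(usb_mode: bool, camera_id: str):
    """Build the argument you would pass to cv2.VideoCapture().

    usb_mode=True  -> camera_id is a device index ('0' = integrated camera,
                      '1', '2', ... in USB connection order).
    usb_mode=False -> camera_id is the phone's IPv4 address; the /video
    suffix is IP Webcam's usual stream endpoint (check it in the app).
    """
    if usb_mode:
        return int(camera_id)
    return f"http://{camera_id}:8080/video"

src = capture_source(False, "172.16.0.10")
```

You would then open the stream with cv2.VideoCapture(src) in either case.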

By default, the with_calibration parameter is set to false in the config file. This means the intrinsic camera parameters are estimated manually (sufficient for cameras with little distortion, such as integrated computer cameras). But you can use a specific camera matrix and distortion coefficients by setting with_calibration to true and replacing camera_matrix and dist_coeffs with your own values. We also provide a script to perform your own calibration with a ChArUco board.

Steps to calibrate your camera:
  1. Go to the camera_calibration folder: cd src/camera_calibration

  2. Generate your ChArUco board: python3 charuco_generator.py

  3. Print it and paste it on a rigid surface

  4. Launch the calibration script: python3 camera_calibration.py

By default, the camera taken into account is the one built into the computer, but you can select the one you want to calibrate with the --usb_mode (true or false) and --camera_id (index or IP address) arguments: for example, python3 camera_calibration.py --usb_mode false --camera_id "172.16.0.10".

Move the board in all directions: the important thing is to capture different orientations, close-up shots, distant shots and angled shots. The script takes 20 images, and the board should be moved a fair distance between images.

The script then saves the parameters in the camera_parameters folder and also updates the teleoperation config file.

Both

1. Set up your config file:

You need to modify the config file to adjust your controller settings, according to the custom controller you made.

For all controllers:

  • gripper_type: set to 'feetech' or 'potentiometer', depending on what you have chosen for the joystick trigger
  • arduino_ports: you don't normally need to change this, as the name indicated is set up in the udev rules, but if necessary you can replace it with the port of your Arduino (such as 'ttyACM0').

    If the pre-recorded name doesn't work, you can find the port name by running ls /dev in your terminal and looking for the ttyACM[0-10] linked to it.

For controllers with Feetech motors:

  • feetech_ports: you don't normally need to change this, as the name indicated is set up in the udev rules, but if necessary you can replace it with the corresponding port.
  • feetech_gripper_joints_limit: you can adapt the range of gripper values to your own trigger movement amplitude if required.

For controllers with potentiometers:

  • potentiometer_gripper_joints_limit: you can adapt the range of gripper values to your own trigger movement amplitude if required.
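Conceptually, these *_gripper_joints_limit values define a linear mapping from the raw trigger reading to the gripper opening. A sketch, assuming a clamped linear interpolation (the function name and the 0-100 gripper range are illustrative):

```python
def trigger_to_gripper(raw, trigger_limits, gripper_limits=(0.0, 100.0)):
    """Linearly map a raw trigger value into the gripper range, clamped.

    trigger_limits: (reading with trigger released, reading fully pressed),
    i.e. the amplitude you would tune in the config file.
    """
    lo, hi = trigger_limits
    t = (raw - lo) / (hi - lo)        # normalize to 0..1 over the trigger travel
    t = min(1.0, max(0.0, t))         # clamp out-of-range readings
    g_lo, g_hi = gripper_limits
    return g_lo + t * (g_hi - g_lo)

opening = trigger_to_gripper(512, trigger_limits=(200, 800))
```

Widening or narrowing trigger_limits is what adapting the joints limit to your own trigger amplitude amounts to.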

2. Set up your environment:

Once your config file and your controller(s) are ready (with a Vive tracker or an ArUco cube), you can launch the main script. Take your controller(s): the pose held during initialization becomes the reference pose, with the robot's arm bent at 90°.

To understand how to use the system once it’s running, you can refer to the image below.

For one tracker only : one tracker

For two trackers : two trackers

RGBD Camera

The posture detection model is Mediapipe.

We use an Orbbec Femto Bolt, but you are free to adapt the code to use your own RGBD camera. To use an Orbbec camera, you need the pyorbbecsdk package.

Download pyorbbecsdk:
  1. Clone the repository (virtual environment recommended): git clone https://github.com/orbbec/pyorbbecsdk.git

  2. Make sure you have the needed dependencies: sudo apt-get install python3-dev python3-pip python3-opencv

  3. Install the requirements :

    cd pyorbbecsdk
    pip3 install -r requirements.txt 
    
  4. Create a folder for the build :

     mkdir build
     cd build
     cmake -Dpybind11_DIR=`pybind11-config --cmakedir` ..
    
  5. Build the wrapper :

     make -j4
     make install
    
  6. Add the library directory to your PYTHONPATH (to find your Python path, run which python in your terminal):

    export PYTHONPATH=$PYTHONPATH:$(pwd)/install/lib/

  7. Install and apply the udev rules:

    sudo bash ./scripts/install_udev_rules.sh
    sudo udevadm control --reload-rules && sudo udevadm trigger
    
  8. Install the library :

    cd ..
    pip install -e .
    

To set up your environment:

  • Position the camera high up, to avoid occlusions. A calibration is performed when the script is launched, to compute the camera orientation and adapt the pose calculations.

  • Position yourself with your head straight and your arms bent at 90°.

Note that the script will wait until your hands are correctly positioned before launching the teleoperation, to avoid any sudden movements by the robot.

  • You can move your head and your two arms, keeping your torso static. You can open and close the grippers by moving your index finger and thumb towards or away from each other. The orientation of the robot's hand is defined by the orientation of the elbow-wrist vector.

  • To stop the teleoperation, tilt your head downwards for a prolonged period, until the streaming window closes (the pitch needs to be > 25° for a few seconds).
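The stop gesture above amounts to a debounced threshold check: fire only once the pitch has stayed above the threshold continuously for a few seconds. A minimal sketch (the class and its defaults are illustrative, with the threshold taken from the description above):

```python
import time

class StopGestureDetector:
    """Fires once head pitch stays above a threshold for `hold_s` seconds."""

    def __init__(self, pitch_threshold=25.0, hold_s=2.0):
        self.pitch_threshold = pitch_threshold
        self.hold_s = hold_s
        self._since = None  # timestamp when the pitch first exceeded the threshold

    def update(self, pitch_deg, now=None):
        """Feed one pitch sample; return True when the stop gesture is held."""
        now = time.monotonic() if now is None else now
        if pitch_deg > self.pitch_threshold:
            if self._since is None:
                self._since = now
            return (now - self._since) >= self.hold_s
        self._since = None  # head came back up: reset the timer
        return False
```

Calling update() every frame with the current pitch would close the stream only after a sustained downward tilt, ignoring brief nods.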

SOARM-100

It needs a leader SOARM-100: you can either build it or buy it (all the info is on GitHub).

To launch the teleoperation, you need:

  • the arm powered and plugged into the computer
  • the port specified in the config file (in the 'Arm' section)

    You can find the port name by running ls /dev in your terminal and looking for the ttyACM[0-10] linked to it.

When the teleoperation starts, the arm will move into its initial position on its own: accompany it without restraining it, because once it is in position it becomes compliant and will fall if unsupported.

You then have control of the right arm, which you can change using the keyboard commands shown in the image below. There are also commands for gripping from above, for increasing/decreasing the ratio between arm and robot movement in position and rotation, for pausing teleoperation and for changing reference points.

Take a look at the summary image, but don't forget to try it out straight away - it'll be much easier to get the hang of!

so100 commands

To quit teleoperation, you only have to ctrl + c the script.

Gamepad

You only need a PS4 gamepad, plugged into your computer.

gamepad commands

The left joystick is used for forward/backward and left/right translation of the left arm, with L1/L2 for up/down translation; the right joystick and R1/R2 do the same for the right arm.

Orientation is controlled with the face buttons (triangle and cross bend the left wrist upward and downward, square and circle bend it inward and outward, and the arrow keys do the same for the right wrist): you can therefore control the orientation and position of an end effector at the same time, although it takes a bit of practice.

You can open and close the grippers with the 'share' and 'options' buttons.

You can tune the ratios of movements on the various axes, in position and rotation, in the config file, in the "gamepad" section at the end of the file (lower a coefficient to slow down the movement).
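The translation part of the mapping above can be sketched as follows (axis names, button fields and the ratio key are illustrative stand-ins, not the project's actual identifiers):

```python
def arm_deltas(state, ratios):
    """Convert one frame of (hypothetical) PS4 inputs into per-arm
    translation deltas, scaled by a config ratio.

    `state` holds stick axes in -1..1 and shoulder buttons as 0/1;
    `ratios` mimics the tunable coefficients from the config file.
    """
    r = ratios["translation"]
    return {
        "l_arm": {
            "x": state["left_stick_y"] * r,          # forward/backward
            "y": state["left_stick_x"] * r,          # left/right
            "z": (state["l1"] - state["l2"]) * r,    # up/down via L1/L2
        },
        "r_arm": {
            "x": state["right_stick_y"] * r,
            "y": state["right_stick_x"] * r,
            "z": (state["r1"] - state["r2"]) * r,    # up/down via R1/R2
        },
    }

deltas = arm_deltas(
    {"left_stick_x": 0.5, "left_stick_y": 1.0, "l1": 1, "l2": 0,
     "right_stick_x": 0.0, "right_stick_y": 0.0, "r1": 0, "r2": 1},
    {"translation": 0.01},
)
```

Lowering the ratio scales every delta down, which is exactly the "slow down movement" effect of lowering a coefficient in the config.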

How is it organised? (under construction)

If you want to find out more about the project and contribute:

The project is structured as follows:

  • src : The Python source code

    • noVR_teleoperation : The main Python code for teleoperation.

    • camera_calibration : The camera calibration script.

    • setup : Contains script and configuration files for system setup (e.g., udev rules).

  • project_resources: All project resources, such as:

    • PDFs of the ArUco markers

    • STL files and the Arduino script for the custom controller (used for the Vive tracker and the ArUco cube)

  • images: Images for the documentation
