demo_motion_demonstration

Description

This package launches the nodes necessary for motion demonstration and replication by a UR3 cobot.

Human Position Tracking

The motion demonstration is based on visual input. An RGB-D camera is used for human tracking. OpenPose is used for recognising 2D human joint positions. The 3D position of the wrist is then acquired via the associated point cloud information. The 3D coordinates are expressed in a static reference frame and used as potential trajectory points for controlling the robot's end-effector (EE) position. This part of the pipeline is implemented in openpose_3D_localization.
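As an illustration only (not the package's actual implementation), the sketch below shows one way a 2D OpenPose wrist keypoint can be lifted to 3D using the registered point cloud and then expressed in a static reference frame. The target frame name and the helper function are assumptions.

import rospy
import tf2_ros
import tf2_geometry_msgs
import sensor_msgs.point_cloud2 as pc2
from geometry_msgs.msg import PointStamped

def wrist_3d(cloud_msg, u, v, tf_buffer, target_frame="base_link"):
    """Return the wrist position expressed in target_frame, or None if depth is missing."""
    # Read the single point of the organized cloud at pixel (u, v).
    gen = pc2.read_points(cloud_msg, field_names=("x", "y", "z"),
                          skip_nans=False, uvs=[(u, v)])
    x, y, z = next(gen)
    if any(c != c for c in (x, y, z)):  # NaN => no valid depth at this pixel
        return None
    p = PointStamped()
    p.header = cloud_msg.header
    p.point.x, p.point.y, p.point.z = x, y, z
    # Express the point in a static reference frame (e.g. the robot base).
    tf = tf_buffer.lookup_transform(target_frame, cloud_msg.header.frame_id,
                                    rospy.Time(0), rospy.Duration(0.5))
    return tf2_geometry_msgs.do_transform_point(p, tf)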

Robot Trajectory Generation

Once the human movement onset is detected, every (right) wrist 3D position that becomes available is checked in order to filter out outliers. After the end of the movement has been detected, the entire human movement trajectory can be smoothed using Bezier curves, if requested by the user.
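A minimal sketch of the smoothing step, assuming the raw 3D samples are used as the control points of a single Bezier curve (the package's actual formulation may differ):

import numpy as np
from scipy.special import comb

def bezier_curve(points, n_samples=100):
    """Evaluate the Bezier curve defined by the control points (N x 3) at n_samples values of t."""
    points = np.asarray(points, dtype=float)
    n = len(points) - 1
    t = np.linspace(0.0, 1.0, n_samples)
    # Bernstein basis: B_{i,n}(t) = C(n, i) * t^i * (1 - t)^(n - i)
    basis = np.array([comb(n, i) * t**i * (1 - t)**(n - i) for i in range(n + 1)])
    return basis.T @ points  # (n_samples x 3) smoothed trajectory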

Raw human wrist positions, or the corresponding Bezier curve points, are then sent to an action server that publishes them at an appropriate rate depending on their type (raw or Bezier points). Each point is then translated to an appropriate EE position and checked against the robot limits in order to avoid self-collisions or over-extensions of the robot arm. Valid EE positions are treated as desired positions and used for the calculation of the commanded robot velocities (see below).
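A hedged sketch of such a limit check, using a simple bounding box around the robot base; the bounds below are illustrative, not the package's actual limits:

# Conservative workspace box (metres) around the UR3 base; values are assumptions.
WORKSPACE = {"x": (-0.40, 0.40), "y": (-0.40, 0.40), "z": (0.05, 0.50)}

def within_limits(p):
    """Return True if the desired EE position p = (x, y, z) lies inside the allowed box."""
    return all(lo <= c <= hi for c, (lo, hi) in zip(p, WORKSPACE.values()))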

The movement onset and end detection, the translation to the robot workspace, and the checking of the robot limits are implemented in the trajectory_process_utils package. The action server is implemented in the trajectory_replication package.

Robot Motion Generation

The desired robot positions are used as input to the cartesian_trajectory_tracking package, which generates the commanded EE velocities published to the CVC. The output of the CVC is fed to the UR3 robot driver.
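For illustration, a simple proportional velocity law of the kind commonly used for Cartesian trajectory tracking is sketched below; the gain and velocity cap are assumptions, not the package's actual parameters:

import numpy as np

K_P = 2.0    # proportional gain [1/s], illustrative
V_MAX = 0.25 # linear velocity limit [m/s], illustrative

def commanded_velocity(p_current, p_desired):
    """Return the commanded EE linear velocity driving the EE toward p_desired."""
    v = K_P * (np.asarray(p_desired, dtype=float) - np.asarray(p_current, dtype=float))
    norm = np.linalg.norm(v)
    if norm > V_MAX:
        v *= V_MAX / norm  # saturate to respect the velocity limit
    return v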

Pipeline demonstrations

Run

roslaunch demo_motion_demonstration demo_motion_demonstration.launch

Arguments

Input modality

  • visual_input: True if using visual input to produce the 3D keypoints, either from the real camera or from a rosbag; False if using already obtained 3D keypoints.
  • sim: True if using visual input from a rosbag.
  • live_camera: True if frames are generated by an RGB-D camera, False if they are generated by rosbags. Note: the sim and live_camera arguments only need to be set when visual_input is set to true.

Trajectory preprocessing

  • smooth: True if smoothing the trajectory using Bezier curves (an example invocation combining the arguments is shown below).
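For example, to run the pipeline from a rosbag with smoothing enabled, the arguments can be passed with the standard roslaunch name:=value syntax (the values shown are illustrative):

roslaunch demo_motion_demonstration demo_motion_demonstration.launch visual_input:=true sim:=true live_camera:=false smooth:=true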

Citation

If you want to cite this work, please use the following BibTeX entry:

@inproceedings{dagioglou2021smoothing,
  title={Smoothing of human movements recorded by a single RGB-D camera for robot demonstrations},
  author={Dagioglou, Maria and Tsitos, Athanasios C and Smarnakis, Aristeidis and Karkaletsis, Vangelis},
  booktitle={The 14th PErvasive Technologies Related to Assistive Environments Conference},
  pages={496--501},
  year={2021}
}
