A metapackage for simulating tactile data from BioTac sensors mounted on a Shadow Dexterous Hand.
To get started, first clone the necessary dependency (`ros_utils`, described below), then:

- Clone this repository:

  ```bash
  git clone git@github.com:vmstavens/biotac_sim_plugin.git
  ```

- Source your catkin workspace:

  ```bash
  source <catkin-workspace>/devel/setup.bash
  ```

- Build the package:

  ```bash
  catkin build biotac_sim_plugin
  ```

- Run the provided demo:

  ```bash
  roslaunch biotac_sim_plugin biotac_sim_demo.launch
  ```
The project consists of two packages: `biotac_sim_lib` and `biotac_sim_demo`.
The `biotac_sim_lib` package is responsible for generating the Gazebo model plugin that simulates the BioTac tactile sensor data. The package includes the following header files:

- `biotac_sim_lib.hpp`, the header file containing the overall plugin structure (a generic sketch of such a plugin is shown after this list).
- `neural_network.hpp`, which contains the `NeuralNetwork` class responsible for the deep learning part, including loading and running the model.
- `helpers.hpp`, which contains the necessary template functions, classes and structs used in `neural_network.hpp` and `biotac_sim_lib.hpp`.
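For context, a Gazebo model plugin of this kind hooks into the physics update loop of the simulator. The following is a minimal, generic sketch of that structure, not the actual contents of `biotac_sim_lib.hpp`; the class and member names are illustrative assumptions.

```cpp
// Generic Gazebo model plugin skeleton; names are illustrative, not from this repo.
#include <functional>

#include <gazebo/gazebo.hh>
#include <gazebo/common/common.hh>
#include <gazebo/physics/physics.hh>

namespace gazebo
{
class TactileSimPlugin : public ModelPlugin
{
public:
  void Load(physics::ModelPtr model, sdf::ElementPtr /*sdf*/) override
  {
    model_ = model;
    // Call OnUpdate() at the start of every physics step.
    update_conn_ = event::Events::ConnectWorldUpdateBegin(
        std::bind(&TactileSimPlugin::OnUpdate, this));
  }

private:
  void OnUpdate()
  {
    // A plugin like biotac_sim_lib would gather contact/joint state here
    // and run it through the neural network to produce BioTac signals.
  }

  physics::ModelPtr model_;
  event::ConnectionPtr update_conn_;
};

GZ_REGISTER_MODEL_PLUGIN(TactileSimPlugin)
}  // namespace gazebo
```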
Furthermore, the package contains `config/model.yaml`, which holds the model architecture and trained parameters; `scripts/set_default_model`, which sets the default model to the one mentioned above; and finally `src/*`, which contains the source files for the above headers. This package is not meant to be run directly and therefore contains no launch or example files.
The `biotac_sim_demo` package is responsible for running a demo of `biotac_sim_lib`. This can only be done in the Shadow Dexterous Hand docker container environment, which can be installed as instructed here. The demonstration can be run using the following command:

```bash
roslaunch biotac_sim_plugin biotac_sim_demo.launch
```
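Once the demo is running, the simulated tactile data can be inspected from any ROS node. The following is a minimal sketch that assumes the plugin publishes `sr_robot_msgs/BiotacAll` messages on `/rh/tactile`, mirroring the real Shadow Hand driver; both the topic name and the message type are assumptions here and should be verified with `rostopic list`.

```cpp
// Minimal sketch: read simulated BioTac data.
// Topic name and message type are assumptions (check with `rostopic list`).
#include <ros/ros.h>
#include <sr_robot_msgs/BiotacAll.h>

void onTactile(const sr_robot_msgs::BiotacAll::ConstPtr& msg)
{
  // Each BiotacAll message carries one Biotac reading per fingertip.
  for (std::size_t i = 0; i < msg->tactiles.size(); ++i)
    ROS_INFO("finger %zu: pdc=%d", i, msg->tactiles[i].pdc);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "biotac_listener");
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("/rh/tactile", 10, onTactile);
  ros::spin();
}
```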
This package contains several folders:

- `examples/`, which contains `biotac_sim_demo.py`, the example node run by the command above.
- `launch/`, which contains `biotac_sim_demo.launch`, the launch file that runs the node in `biotac_sim_demo.py`.
- `models/`, which contains a 3D model of a black pen, used as a prop for the demonstration.
- `robots/`, which contains wrapper `launch` files and `xacro` files for loading the plugin from `biotac_sim_lib` (see the sketch after this list).
- `worlds/`, which contains two worlds: one with and one without gravity. The desired world can be set in the launch file, or selected on the command line with

  ```bash
  roslaunch biotac_sim_plugin biotac_sim_demo.launch gravity:=true
  ```

  or `gravity:=false` in case you want gravity disabled.
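For reference, attaching a Gazebo model plugin to a robot description via xacro/URDF typically looks like the snippet below. This is a generic sketch; the plugin name and library filename are assumptions rather than values taken from this repository's actual `xacro` files.

```xml
<!-- Generic example of loading a model plugin from a URDF/xacro. -->
<!-- The plugin name and filename here are illustrative assumptions. -->
<gazebo>
  <plugin name="biotac_sim" filename="libbiotac_sim_lib.so"/>
</gazebo>
```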
Outside of standard ROS and Gazebo packages, the following dependency is needed: `ros_utils`, which is a utilities package for ROS used for `base64` encoding and decoding.
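If `ros_utils` is not already present in your workspace, it can be cloned next to this package before building. The repository path below is an assumption (it presumes the package is hosted under the same GitHub account as this plugin); adjust it if the package lives elsewhere.

```bash
# Assumed repository location; adjust if ros_utils is hosted elsewhere.
git clone git@github.com:vmstavens/ros_utils.git
```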
This work is based on the `biotac_gazebo_plugin` by Philipp Ruppel, Yannick Jonetzko, Michael Görner, Norman Hendrich and Jianwei Zhang: "Simulation of the SynTouch BioTac Sensor", The 15th International Conference on Intelligent Autonomous Systems (IAS-15), 2018, Baden-Baden, Germany.
The data set used to train the deep learning model can be found here.
The 3D model used for demonstration is part of the dataset: A. Rasouli, J.K. Tsotsos. "The Effect of Color Space Selection on Detectability and Discriminability of Colored Objects."