A hardware module for detecting and differentiating arm interactions on the Raven-II surgical robot platform. This package enables accurate labeling of datasets for training force prediction machine learning models.
Author: Mai Bui (bui23m@mtholyoke.edu)
For a clearer demonstration, see this video: Video Description
Raven-II is a research platform for minimally invasive robotically assisted surgery, developed by the Biorobotics Laboratory at the University of Washington, Seattle.
An Arduino-based hardware solution that addresses a critical challenge in surgical robotics: accurately labeling robot-tissue interaction datasets. By detecting and differentiating between arm interactions on the Raven-II surgical robot platform, this package produces accurately labeled datasets for:
- Training force prediction models to develop haptic feedback systems
- Validating robot-tissue interaction detection
Key features:
- Real-time interaction detection
- Distinct identification of individual robot arms
- Automated dataset labeling
- Integration with the ROS ecosystem
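The arm-differentiation idea can be sketched in a few lines of Python. This is a minimal illustration, not the package's actual code: it assumes each arm's conductive-cloth pad closes a circuit on its own Arduino digital input, and the pin assignments and arm labels below are hypothetical.

```python
# Hypothetical mapping from arm label to the Arduino digital pin its
# conductive-cloth pad is wired to (pin numbers are assumptions).
ARM_PINS = {"left_arm": 2, "right_arm": 3}

def classify_interaction(pin_states):
    """Return the labels of all arms whose pads currently read contact.

    pin_states maps an Arduino digital pin number to 0/1
    (1 = circuit closed, i.e. the pad is touching something).
    """
    return sorted(arm for arm, pin in ARM_PINS.items()
                  if pin_states.get(pin) == 1)

# Example: only the pad on pin 3 reads contact.
print(classify_interaction({2: 0, 3: 1}))  # ['right_arm']
```

Because each arm has its own input, simultaneous contacts are reported distinctly rather than as a single aggregate signal, which is what makes per-arm dataset labeling possible.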
Hardware materials:
- Woven copper- and nickel-plated polyester conductive cloth
- Loctite Clear Silicone Waterproof Sealant
Requirements:
- Ubuntu 20.04
- Python 3.8
- ROS Noetic
- Arduino Nano
- telemetrix

Install the telemetrix Python client:
pip install telemetrix
- Upload the Telemetrix4Arduino sketch to the Arduino using the Arduino IDE, following the instructions from MrYsLab: Telemetrix4Arduino
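Once Telemetrix4Arduino is running on the board, the PC side talks to it through the telemetrix client. The snippet below is a hedged sketch of that readout: the contact pin number is an assumption, and the report layout unpacked in the callback follows telemetrix's digital-report convention (`[pin_type, pin, value, timestamp]`). Contact polarity depends on how the cloth pads are wired.

```python
import time

CONTACT_PIN = 2  # assumed digital input wired to a conductive-cloth pad

def contact_callback(data):
    """Convert a telemetrix digital report into a labeled event dict."""
    _pin_type, pin, value, timestamp = data
    # With a pull-up, a pressed/contacted pad may read 0; adjust for wiring.
    return {"pin": pin, "in_contact": bool(value), "time": timestamp}

if __name__ == "__main__":
    from telemetrix import telemetrix  # requires a connected Arduino

    board = telemetrix.Telemetrix()
    # telemetrix invokes the callback on every edge of the monitored pin.
    board.set_pin_mode_digital_input_pullup(
        CONTACT_PIN, lambda data: print(contact_callback(data)))
    try:
        while True:
            time.sleep(0.1)
    except KeyboardInterrupt:
        board.shutdown()
```

The hardware-facing part is kept under the `__main__` guard so the event-conversion logic can be exercised without an Arduino attached.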
Clone the repository:
cd ~
git clone https://github.com/MHC-RobotSimulators-Research/Interaction_detection_package.git
Build in your local catkin workspace:
cd catkin_workspace && mkdir build
cd build
cmake ..
make
Launch the interaction detection node:
rosrun interaction publisher_node
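As a rough picture of what such a node publishes, the sketch below emits one message per contact-state update. The topic name, message type (`std_msgs/String`), label format, and publish rate are all assumptions for illustration, not the package's actual interface.

```python
# Hypothetical publisher sketch: encode per-arm contact states into a
# compact label string and publish it for downstream dataset labeling.

def make_label(arm_states):
    """Encode {arm_name: bool} contact states as e.g. 'left=0,right=1'."""
    return ",".join(f"{arm}={int(state)}"
                    for arm, state in sorted(arm_states.items()))

if __name__ == "__main__":
    import rospy
    from std_msgs.msg import String

    rospy.init_node("publisher_node")
    pub = rospy.Publisher("interaction/contact", String, queue_size=10)
    rate = rospy.Rate(100)  # assumed 100 Hz update rate
    while not rospy.is_shutdown():
        # In the real node these states would come from the Arduino readout.
        states = {"left": False, "right": True}
        pub.publish(String(data=make_label(states)))
        rate.sleep()
```

Subscribers recording Raven-II trajectories can then attach this label stream to each sample, which is the automated dataset labeling described above.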