Integrates vision, touch, and common-sense information from foundation models, customized to the agent's perceptual needs.
Visuo-tactile dataset with GelSight and depth camera for YCB objects.
Tactile perception dataset comprising DIGIT sensor recordings sliding over YCB objects, with ground-truth pose.
An official implementation of Touch100k: A Large-Scale Touch-Language-Vision Dataset for Touch-Centric Multimodal Representation
(In progress: see roadmap) Gaussian process implicit surface generation from manipulator contact measurements, for object modeling
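The core idea behind a Gaussian process implicit surface (GPIS) can be sketched in a few lines of NumPy: contact points are constrained to the zero level set of a GP, while one interior and one exterior anchor point fix the sign convention. This is an illustrative sketch, not the linked repository's code; the squared-exponential kernel, the length scale, and the anchor-point scheme are all assumptions for demonstration.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=0.4):
    """Squared-exponential kernel between two sets of 2-D points."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

def gpis_mean(contacts, queries, interior, exterior, noise=1e-4):
    """Posterior mean of a GP implicit surface.

    Contact measurements are constrained to 0 (on the surface); one
    interior point (-1) and one exterior point (+1) fix the sign of
    the implicit function. Anchors are illustrative assumptions.
    """
    X = np.vstack([contacts, interior, exterior])
    y = np.concatenate([np.zeros(len(contacts)), [-1.0], [1.0]])
    K = rbf_kernel(X, X) + noise * np.eye(len(X))  # regularized Gram matrix
    alpha = np.linalg.solve(K, y)                  # alpha = K^{-1} y
    return rbf_kernel(queries, X) @ alpha
```

Evaluating `gpis_mean` on a grid and extracting the zero level set (e.g. with marching squares) recovers an estimate of the object contour from sparse touches.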
PyBullet simulator for object shape exploration, supporting shape and pose recovery work (in progress).
This repository contains the code and the appendix for the paper Optimizing BioTac Simulation for Realistic Tactile Perception by Wadhah Zai El Amri and Nicolás Navarro-Guerrero.
Naive Bayes classifier implemented with NumPy, applied to recognize stimuli from haptic data.
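A Gaussian Naive Bayes classifier of the kind described above fits in a few lines of NumPy: estimate a per-class mean, variance, and prior, then pick the class with the highest log-posterior. This is a generic sketch rather than the repository's implementation; the class name and the synthetic-feature usage are assumptions.

```python
import numpy as np

class GaussianNaiveBayes:
    """Minimal Gaussian Naive Bayes classifier using only NumPy."""

    def fit(self, X, y):
        self.classes = np.unique(y)
        # Per-class mean, variance (with a small floor), and prior.
        self.means = np.array([X[y == c].mean(axis=0) for c in self.classes])
        self.vars = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes])
        self.priors = np.array([np.mean(y == c) for c in self.classes])
        return self

    def predict(self, X):
        # Log-likelihood of each sample under each class-conditional Gaussian,
        # summed over (assumed independent) feature dimensions.
        ll = -0.5 * (np.log(2 * np.pi * self.vars)[None, :, :]
                     + (X[:, None, :] - self.means[None, :, :]) ** 2
                     / self.vars[None, :, :]).sum(axis=2)
        return self.classes[np.argmax(ll + np.log(self.priors)[None, :], axis=1)]
```

For haptic stimulus recognition, each row of `X` would be a feature vector extracted from a touch recording (e.g. pressure statistics) and `y` the stimulus label.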
A scalable and freely configurable function generator in VHDL