Project for the Social Robotics 2018/19 course of Robotics Engineering.
MiRo is a small pet-like robot intended to be a companion. In this project, MiRo behaves as a companion, standing on the desk while the user is working (for example, at the PC).
- Task A: Awakening. MiRo wakes up at random to look for caresses from the user. The more time passes, the lonelier MiRo feels, so the probability that it wakes up increases. Also, if the user touches MiRo, it wakes up immediately (a rough sketch of this behaviour is given after this task list).
- Task Aa: Face detection. Once awake, MiRo looks for the user's face, turning on the spot until the face is detected by both cameras (so that the face is in front of the robot).
- Task B: Approaching. MiRo then moves towards the user. The user must put a hand close to MiRo so that the sonar can detect it. When the hand is close enough to the muzzle (where the sonar is located), MiRo stops and the interaction begins.
- Task C: Interaction. The user can stroke MiRo on the back to make it happy. After some time, MiRo is satisfied and goes back to sleep. If the user does not want to interact, they can pat MiRo on the head and MiRo goes to sleep immediately.
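The behaviour above boils down to a small state sequence: sleep until woken (by a touch or by a loneliness-driven random wake-up), then approach until the sonar detects a nearby hand. The following is a minimal, hypothetical Python sketch of that logic only; it is not the package's Coordinator code, and all names, thresholds, the exponential probability curve and the stubbed sensor readers are assumptions standing in for the real MiRo topics.

```python
#!/usr/bin/env python
# Illustrative sketch of the behaviour described above (NOT the project's
# Coordinator node). All names, thresholds, the exponential probability curve
# and the stubbed sensor readers are assumptions.
import math
import random
import time

WAKE_RATE = 0.05          # how fast "loneliness" builds up while asleep (1/s)
SONAR_STOP_DIST = 0.05    # stop approaching when the hand is this close (m)

def is_touched():
    # Stub: on the real robot this would come from MiRo's touch sensors.
    return random.random() < 0.02

def read_sonar():
    # Stub: on the real robot this would come from the sonar in MiRo's muzzle.
    return random.uniform(0.0, 0.3)

def wake_probability(asleep_for):
    # Task A: grows from 0 towards 1 the longer MiRo has been asleep.
    return 1.0 - math.exp(-WAKE_RATE * asleep_for)

def sleep_until_awake():
    start = time.time()
    while True:
        # A touch wakes MiRo immediately; otherwise wake up at random,
        # with a probability that increases with the time spent asleep.
        if is_touched() or random.random() < wake_probability(time.time() - start):
            return
        time.sleep(1.0)

def approach_user():
    # Task B: keep approaching until the sonar detects a hand near the muzzle.
    while read_sonar() > SONAR_STOP_DIST:
        time.sleep(0.2)   # here the real node would publish velocity commands

if __name__ == "__main__":
    sleep_until_awake()
    print("MiRo is awake and looking for the user's face")
    approach_user()
    print("Hand detected close to the muzzle: interaction begins")
```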
More details are given in the report.
- ROS (code tested only with Kinetic, on an Ubuntu 16 machine)
- OpenCV 3.4
- Run in terminal:
sudo apt install ros-kinetic-opencv3   # should already be installed by the previous point
sudo apt install ros-kinetic-opencv-apps
- Run in terminal:
sudo pip install --ignore-installed tensorflow
sudo apt install python-sklearn
sudo pip install keras
Be sure to have correctly set up your machine and MiRo, as described here: MIRO_setup
- Download the folder and compile it with
catkin_make
- Run the support nodes and the main node separately, each in its own terminal:
- Face Detection
roslaunch look_caresses_pkg face_detect_double.launch
This opens two windows, one for each camera (a minimal sketch of the two-camera face check is given after these run steps).
- DataInput.py for pattern recognition
cd [YOUR_PATH]/Looking-for-Caresses/src/look_caresses_pkg/src
./DataInput.py robot=rob01
- Coordinator (the actual main node)
rosrun look_caresses_pkg Coordinator
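For reference, below is a minimal, hypothetical sketch of the "face detected by both cameras" condition used in Task Aa. It is not the node started by face_detect_double.launch (which presumably relies on the ros-kinetic-opencv-apps detectors installed above); the camera topic names, the CompressedImage message type and the Haar cascade path are assumptions that may need adjusting to your MiRo and OpenCV installation.

```python
#!/usr/bin/env python
# Hypothetical sketch of the "face seen by both cameras" check (Task Aa).
# Topic names, message type and cascade path are assumptions.
import rospy
import cv2
from cv_bridge import CvBridge
from sensor_msgs.msg import CompressedImage

# Adjust this path to wherever your OpenCV installation keeps the cascades.
CASCADE = "/usr/share/opencv/haarcascades/haarcascade_frontalface_default.xml"

class DoubleFaceCheck(object):
    def __init__(self):
        self.bridge = CvBridge()
        self.detector = cv2.CascadeClassifier(CASCADE)
        self.seen = {"left": False, "right": False}
        # Assumed MiRo camera topics; change "rob01" to your robot's name.
        rospy.Subscriber("/miro/rob01/platform/caml", CompressedImage,
                         self.callback, callback_args="left")
        rospy.Subscriber("/miro/rob01/platform/camr", CompressedImage,
                         self.callback, callback_args="right")

    def callback(self, msg, side):
        # Convert the ROS image, then run the Haar cascade face detector.
        frame = self.bridge.compressed_imgmsg_to_cv2(msg, "bgr8")
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = self.detector.detectMultiScale(gray, 1.3, 5)
        self.seen[side] = len(faces) > 0
        if all(self.seen.values()):
            rospy.loginfo("Face detected by both cameras: user is in front of MiRo")

if __name__ == "__main__":
    rospy.init_node("double_face_check")
    DoubleFaceCheck()
    rospy.spin()
```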
- Face Detection
A demonstration video is visible here
- The Pattern recognition node (DataInput.py) is taken from here
- MIRO website