Software & Data

With the rapid growth of collaborative robots in manufacturing applications, physical interaction between humans and robots plays a vital role in performing tasks collaboratively. Most studies to date have concentrated on robot motion planning and control during task execution. However, effective task distribution and allocation also require knowledge of the human's physical and psychological state. This GitHub repository presents a hardware setup and supporting software for a set of wearable sensors, together with a data acquisition framework, which can be used to develop more efficient human-robot collaboration strategies.

The video below describes the ROS Wearable Sensors package and provides a demonstration of using muscle activity to interact with a robot.


Wearable sensors in human-robot collaboration can improve interaction and make it safer. For instance, wearable sensors can help recognise the human's mental state and physical activities, so that a robot can perform the given task with the human effectively and naturally. Moreover, the proposed setup enables live data collection during human-robot interaction, which allows the human's behaviour to be modelled and fed into adaptive robot control strategies that take that behaviour into consideration. Consequently, robots can adapt to the human efficiently and guarantee the human's safety while performing the intended task.

Figure: Blue helmet, Raspberry Pi 3 and headset

The figure below shows all the sensors, which have been integrated into one ROS network so that data from the different sensors can be collected during human-robot interaction experiments.

Hardware Setup: Integration with ROS network

The blue box can be carried on the operator's belt and connects to the workstation via WiFi. The box is wired, via shielded cables, to four muscle-activity sensors (https://www.sparkfun.com/products/13723), a digital nose-temperature sensor and a Phidgets IMU (https://www.phidgets.com/) for head movements, while it connects to a Muse 2 headband (https://choosemuse.com/muse-2-guided-bundle/) via Bluetooth. These sensors are interfaced with the Raspberry Pi using two interface circuits and a USB Bluetooth dongle.

Figure: PI Hat (Interface circuit between Raspberry-Pi 3 and sensors)

Figure: Muscle-activity sensor interface circuit


Biomechanical ROS Messages

Heartrate

The presented hardware is supported by three ROS messages, structured as shown in the table below. These messages carry the raw sensory data, the inter-beat interval (IBI) and beats per minute (BPM).
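To illustrate how the heart-rate fields relate to each other, here is a minimal Python sketch. The function names and the smoothing window are illustrative only, not part of the package:

```python
def bpm_from_ibi(ibi_ms):
    """Convert one inter-beat interval (milliseconds) to beats per minute."""
    return 60000.0 / ibi_ms

def smoothed_bpm(ibi_series_ms, window=5):
    """Average BPM over the last `window` inter-beat intervals,
    which is less jumpy than converting each interval on its own."""
    recent = ibi_series_ms[-window:]
    return 60000.0 * len(recent) / sum(recent)
```

For example, a steady 750 ms inter-beat interval corresponds to 80 BPM.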

Brain-Wave Signals
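Raw EEG samples are often summarised as power in the classical frequency bands (e.g. alpha, 8-13 Hz). The sketch below is a naive stdlib-only DFT, not the package's implementation; the sampling rate and band edges in any caller are assumptions to be checked against the headband's specification:

```python
import cmath

def band_power(samples, fs, f_lo, f_hi):
    """Mean squared DFT magnitude over [f_lo, f_hi) Hz, via a naive DFT.
    Adequate for short windows; a real node would use an FFT library."""
    n = len(samples)
    powers = []
    for k in range(n // 2):                      # positive-frequency bins only
        if f_lo <= k * fs / n < f_hi:
            x = sum(s * cmath.exp(-2j * cmath.pi * k * i / n)
                    for i, s in enumerate(samples))
            powers.append(abs(x) ** 2 / n)
    return sum(powers) / len(powers) if powers else 0.0
```

Relative power between bands (e.g. alpha versus beta, 13-30 Hz) is a common indicator of relaxation versus engagement.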

Muscle-Activity Message
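A raw muscle-activity reading is typically rectified and smoothed into an envelope before being used for interaction. The following Python sketch shows one common approach; the ADC baseline and window length are illustrative values, not taken from the package:

```python
from collections import deque

class EmgEnvelope:
    """Rectify raw EMG ADC samples and smooth with a moving average.
    baseline: assumed ADC mid-rail offset (10-bit ADC assumed here).
    window:   illustrative moving-average length in samples."""

    def __init__(self, baseline=512, window=50):
        self.baseline = baseline
        self.samples = deque(maxlen=window)

    def update(self, raw):
        self.samples.append(abs(raw - self.baseline))   # full-wave rectification
        return sum(self.samples) / len(self.samples)    # moving-average envelope
```

The envelope can then be thresholded to detect muscle contractions, as in the robot-interaction demonstration in the video above.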

Nose Temperature

Other ROS Packages

This package can be used alongside Force/Torque sensors (e.g. http://wiki.ros.org/ati_force_torque), motion tracking systems (e.g. VICON: https://github.com/ethz-asl/vicon_bridge), and IMUs (https://github.com/ros-drivers/phidgets_drivers). The idea is that messages from the surrounding environment (the human-robot workspace) can be synchronised on the same ROS network.
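In a ROS node this synchronisation would typically be done with `message_filters.ApproximateTimeSynchronizer`; the stdlib-only sketch below shows the underlying idea of pairing messages from two streams by nearest timestamp. The stream contents and the `slop` tolerance are illustrative:

```python
def pair_by_timestamp(stream_a, stream_b, slop=0.05):
    """Pair (stamp, msg) tuples from two streams whose stamps differ by at
    most `slop` seconds. A simplified stand-in for ROS message_filters'
    ApproximateTimeSynchronizer; both streams must be sorted by stamp."""
    pairs, j = [], 0
    for ta, ma in stream_a:
        # advance j while the next stamp in stream_b is at least as close to ta
        while j + 1 < len(stream_b) and \
                abs(stream_b[j + 1][0] - ta) <= abs(stream_b[j][0] - ta):
            j += 1
        if stream_b and abs(stream_b[j][0] - ta) <= slop:
            pairs.append((ma, stream_b[j][1]))
    return pairs
```

Pairing by nearest stamp, rather than requiring exact equality, matters because the wearable sensors and the external systems sample at different rates and are not hardware-synchronised.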

How to Install

git clone https://github.com/Intelligent-Automation-Centre/bluebox.git

Contact: for any questions related to this package, please email Ali Al-Yacoub: a.al-yacoub@lboro.ac.uk

Al-Yacoub, A., Buerkle, A., Flanagan, M., Ferreira, P., Hubbard, E.M. and Lohse, N., 2020, September. Effective human-robot collaboration through wearable sensors. In 2020 25th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA) (Vol. 1, pp. 651-658). IEEE.