How my research and knowledge become innovation in specific technological areas

Healthcare Innovation

Studies on HRI are used to provide acceptable technology that persuades end-users and supports carers

Automotive Innovation

The interaction between the vehicle and humans replicates the dynamics of Human-Robot Interaction

Space Exploration

Space exploration requires trustworthy interaction between exploration robots and astronauts

Industry 4.0

Video: View Invariant Robot Adaptation to Human Action Timing

The humanoid robot iCub perceives the auditory landscape through microphones in its head. The iCub infers the location of speakers and optimises the estimate with specific head motor actions.
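As a rough illustration of this kind of active sound localization, the sketch below estimates a source azimuth from the interaural time difference (ITD) between two ear microphones and turns the head toward the source to refine the estimate. All constants, function names, and the rotation gain are illustrative assumptions, not the actual iCub implementation.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s
MIC_DISTANCE = 0.14      # m, assumed spacing of the two ear microphones

def itd_to_azimuth(itd_s):
    """Convert an interaural time difference (seconds) to an azimuth (degrees)."""
    s = max(-1.0, min(1.0, itd_s * SPEED_OF_SOUND / MIC_DISTANCE))
    return math.degrees(math.asin(s))

def refine_by_rotation(measure_itd, head_yaw=0.0, steps=5, gain=0.5):
    """Iteratively turn the head toward the source until the ITD vanishes.

    measure_itd(yaw) is assumed to return the ITD observed at that head yaw.
    """
    for _ in range(steps):
        azimuth = itd_to_azimuth(measure_itd(head_yaw))
        head_yaw += gain * azimuth   # rotate part-way toward the estimate
    return head_yaw
```

The point of the motor loop is that each head rotation shrinks the residual azimuth, so the estimate improves even though a single ITD reading is ambiguous and noisy.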


Video: Can a Humanoid Robot Spot a Liar?

This work investigates the possibility of detecting deception in a human-humanoid interaction by monitoring behavioural cues proven to be significantly affected by lying in the presence of a human interviewer.


Publication Selection

Main publications in this research field, resulting from state-of-the-art research and technological breakthroughs

2019 Can a Humanoid Robot Spot a Liar?

Aroyo A.M., Gonzalez-Billandon J., Tonelli A., Sciutti A., Gori M., Sandini G., Rea F. 2019, in IEEE-RAS International Conference on Humanoid Robots


An integrated model for the coordination of whole-body movements of a humanoid robot with a compliant ankle, similar to the human case, is described. It includes a synergy formation part, which takes into account the motor redundancy of the body model, and an intermittent controller, which robustly stabilizes postural sway movements, thus combining the hip strategy with the ankle strategy.
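The intermittent-control idea mentioned above can be caricatured in a few lines: corrective torque is applied only when the predicted sway leaves a dead zone, and the controller stays "off" otherwise. Gains, the dead-zone width, and the prediction horizon below are invented for illustration and are not the paper's actual parameters.

```python
def intermittent_torque(angle, velocity, kp=8.0, kd=2.0, deadzone=0.02):
    """Corrective ankle torque, or 0.0 while predicted sway stays small.

    angle/velocity: body sway state (rad, rad/s). The 0.3 s lookahead is an
    assumed prediction horizon; kp/kd are illustrative PD gains.
    """
    if abs(angle + 0.3 * velocity) < deadzone:   # predicted sway still inside dead zone
        return 0.0                               # controller switched off
    return -kp * angle - kd * velocity           # PD correction when switched on
```

Switching the controller off inside the dead zone is what makes the control "intermittent" rather than continuous, which is the robustness property the abstract refers to.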



Fast reaction to sudden and potentially interesting stimuli is a crucial feature for safe and reliable interaction with the environment. Here we present a biologically inspired attention system developed for the humanoid robot iCub. It is based on input from unconventional event-driven vision sensors and an efficient computational method. The resulting system shows low latency and fast determination of the location of the focus of attention. The performance is benchmarked against a state-of-the-art artificial attention system used in robotics. Results show that the proposed system is two orders of magnitude faster than the benchmark in selecting a new stimulus to attend to.
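A minimal sketch of event-driven attention, under assumed parameters: each sensor event increments a decaying activation map, and the focus of attention is simply the map's peak. The grid size and decay constant are illustrative; the actual iCub system uses a far more sophisticated computation.

```python
import numpy as np

class EventAttention:
    """Toy event-driven saliency map; focus of attention = peak activation."""

    def __init__(self, height=64, width=64, decay=0.95):
        self.salience = np.zeros((height, width))
        self.decay = decay

    def process(self, events):
        """events: iterable of (x, y) pixel coordinates of sensor events.

        Returns the (row, col) of the current focus of attention.
        """
        self.salience *= self.decay          # older activity fades away
        for x, y in events:
            self.salience[y, x] += 1.0       # each event votes for its location
        return np.unravel_index(np.argmax(self.salience), self.salience.shape)
```

Because only the pixels that actually fire are touched, the per-update cost scales with the number of events rather than with the full frame, which is the source of the latency advantage of event-driven sensing.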



Nowadays, robots need to be able to interact with humans and objects in a flexible way and should share the same knowledge (physical and social) as their human counterparts. There is therefore a need for a framework for expressing and sharing knowledge in a meaningful way by building a world model. In this paper, we propose a new framework for human-robot interaction that uses ontologies as a powerful way of representing information and promoting the sharing of meaningful knowledge between different agents. Furthermore, ontologies are able to conceptualise the world in which an agent such as a robot is situated. In this research, the ontology is considered an improved solution to the grounding problem and enables interoperability between human and robot. The proposed system has been evaluated on a large number of test cases; results were very promising and support the implementation of the solution.
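To make the ontology idea concrete, here is a toy world model as subject-predicate-object triples with transitive "is_a" lookup, in the spirit of RDF-style knowledge representation. The concepts and relation names are invented for illustration and are not taken from the paper.

```python
class WorldModel:
    """Minimal triple store: facts as (subject, predicate, object)."""

    def __init__(self):
        self.triples = set()

    def add(self, subj, pred, obj):
        self.triples.add((subj, pred, obj))

    def ancestors(self, concept):
        """Follow is_a links transitively to ground a concept in the ontology."""
        found = set()
        frontier = {concept}
        while frontier:
            node = frontier.pop()
            for s, p, o in self.triples:
                if s == node and p == "is_a" and o not in found:
                    found.add(o)
                    frontier.add(o)
        return found

# Illustrative facts a robot and a human could both query
wm = WorldModel()
wm.add("icub", "is_a", "robot")
wm.add("robot", "is_a", "agent")
wm.add("icub", "located_in", "lab")
```

The grounding benefit is that both parties resolve "icub" to the same shared concepts ("robot", "agent") instead of each holding a private, incompatible representation.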



Tantalizing evidence derived from psychophysics and developmental psychology experiments has shown that attention is task-dependent. Two characteristics of human control of attention are very relevant for humanoid robots, namely the ability to predict the context (task dependence) from the observed stimuli and the ability to learn an appropriate movement strategy, perhaps over developmental time scales. In this paper we aim at implementing these features to control attention in a humanoid robot by including a set of trajectory predictors in the simple but effective form of Kalman filters and, more importantly, a reinforcement-learning-based process that uses the predictors and the robot's complete action repertoire to generate a suitably optimal action sequence. Preliminary experiments show that the system indeed works correctly.
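The two ingredients combined in that work can be sketched separately: a one-dimensional constant-velocity Kalman-style predictor for a target coordinate, and a tabular Q-learning update for choosing attention actions. All gains, noise values, and action names below are invented for illustration, not taken from the paper.

```python
class KalmanPredictor:
    """Track a target coordinate and predict where it will be next."""

    def __init__(self, q=1e-3, r=1e-2):
        self.x, self.v, self.p = 0.0, 0.0, 1.0   # position, velocity, variance
        self.q, self.r = q, r                    # process / measurement noise

    def step(self, z, dt=1.0):
        # predict forward one step
        self.x += self.v * dt
        self.p += self.q
        # correct with measurement z
        k = self.p / (self.p + self.r)           # Kalman gain
        err = z - self.x
        self.x += k * err
        self.v += k * err / dt                   # crude velocity correction
        self.p *= 1.0 - k
        return self.x + self.v * dt              # one-step-ahead prediction

def q_update(Q, s, a, reward, s_next, actions, alpha=0.1, gamma=0.9):
    """One tabular Q-learning step: move Q(s, a) toward the bootstrapped target."""
    best_next = max(Q.get((s_next, b), 0.0) for b in actions)
    Q[(s, a)] = Q.get((s, a), 0.0) + alpha * (reward + gamma * best_next - Q.get((s, a), 0.0))
```

In the paper's scheme the predictors supply the context the learner conditions on; here they are deliberately decoupled so each piece can be read on its own.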



Seeing the world through the eyes of a child is always difficult. Designing a robot that might be liked and accepted by young users is therefore particularly complicated. We have investigated children's opinions on which features are most important in an interactive robot during a popular scientific event where we exhibited the iCub humanoid robot to a mixed public of various ages. From the observation of the participants' reactions to various robot demonstrations and from a dedicated ranking game, we found that children's requirements for a robot companion change markedly with age. Before 9 years of age, children give more relevance to a human-like appearance, while older kids and adults pay more attention to the robot's action skills. Additionally, the possibility to see and interact with a robot has an impact on children's judgments, especially convincing the youngest also to consider the perceptual and motor abilities of a robot, rather than just its shape. These results suggest that robot design needs to take into account the different prior beliefs that children and adults might have when they see a robot with a human-like shape.


Collaborations in this research field

Collaborations activated thanks to the research carried out in this field

Products and Draft Patents


The invention is a new process to register the existence of sound objects, localize them, prioritize them, and direct the behaviour of an agent (e.g. a robot) towards them.


The product endows an intelligent system with a single camera able to infer gaze direction from head orientation and iris position.






Prof. Samia Nefti-Meziani / Robotics and Automation, University of Salford


Prof. Davide Brugali / Robotics, Università degli Studi di Bergamo


In the quest to bring functional robots into our lives, trusted partners have shared the road towards impactful solutions.


t: +39 010 71781420