eMorph: Event-Driven Morphological Computation for Embodied Systems

The mainstream computational paradigm in embodied intelligence is digital, yet conventional digital systems struggle to perform robustly even in the most mundane perception tasks. They require vast resources to extract relevant information, but still fail to produce appropriate responses for interacting with the real world in real time. Moreover, the data acquired from sensors in perception tasks is typically noisy and ambiguous, and the “frame-based” time sampling and quantization artifacts of conventional sensors are particularly damaging to robust, reliable performance.
The situation is clearly different in biological systems: biological neural systems vastly outperform conventional digital machines in almost all aspects of sensory perception. Despite its dramatic progress, information technology has not yet delivered artificial systems that can compare with biology; there are limitations at both the technological and the theoretical/computational level.
Analog computation, free from the limits of sampling, provides a solution. Analog devices are fast: their time constants are on the order of the rise time of transistor currents. Event-driven computation intrinsically adapts the sensor response to the time constants of the real world; because the response is automatically regulated to match the incoming signal range, it is robust, and because only salient events are coded, it is also efficient. The eMorph project therefore aims to design novel, data-driven, biologically inspired analog sensory devices while also developing new asynchronous, event-driven computational paradigms for them.
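The contrast-event principle behind such sensors can be illustrated with a minimal sketch of a single event-driven pixel: an event is emitted only when the log-intensity has changed by more than a fixed threshold since the last event, so a constant input produces no data at all. The function name, threshold value, and signal are illustrative assumptions, not part of the eMorph design:

```python
import math

def dvs_pixel_events(samples, threshold=0.15):
    """Simulate one event-driven pixel.

    `samples` is a list of (timestamp, intensity) pairs. An event
    (timestamp, polarity) is emitted each time the log-intensity
    moves by `threshold` relative to the stored reference level;
    steady input generates no events.
    """
    events = []
    ref = math.log(samples[0][1])          # reference log-intensity
    for t, intensity in samples[1:]:
        delta = math.log(intensity) - ref
        while abs(delta) >= threshold:     # a large step fires a burst of events
            polarity = 1 if delta > 0 else -1
            events.append((t, polarity))
            ref += polarity * threshold    # reset reference toward the input
            delta = math.log(intensity) - ref
    return events

# A brightness step at t=2 produces a burst of ON events (+1);
# the steady phases before and after produce nothing.
signal = [(0, 1.0), (1, 1.0), (2, 2.0), (3, 2.0), (4, 2.0)]
print(dvs_pixel_events(signal))  # [(2, 1), (2, 1), (2, 1), (2, 1)]
```

The contrast with frame-based acquisition is direct: a frame camera would transmit five full readings here, four of them redundant, while the event-driven pixel transmits only the four events that encode the single change.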
eMorph aims to adapt the computational engine of the cognitive system (its morphology with respect to computation) to the dynamics of the real world, rather than furiously sampling the physical sensory signals in an attempt to obtain adequate bandwidth. Structure and morphology will be matched to the requirements of the robot’s body and its application domain, with testing carried out on the advanced humanoid robotic platform iCub (from the RobotCub project).

Interfacing with the real world: Neuromorphic Sensor (asynchronous, event-driven, space-variant)

Interfacing with conventional digital systems: Embedded Low-Level Vision Algorithms

Extracting behaviourally relevant information from the asynchronous, event-driven sensory system: High-Level Vision Algorithms

Interacting with the real world: Advanced Robotics System Integration


Partners participating in the project

Robotics, Brain and Cognitive Sciences (RBCS), Istituto Italiano di Tecnologia

Safety & Security Department, Unit New Sensor Technologies, AIT Austrian Institute of Technology

Institute of Neuroinformatics (INI), University of Zurich and ETH Zurich

Laboratory of Integrated Advanced Robotics (LiraLab), University of Genova