
Monday, August 15, 2011

Neuroimaging of Brain Shows Who Spoke to a Person and What Was Said


Scientists from Maastricht University have developed a method to look into the brain of a person and read out who has spoken to him or her and what was said. With the help of neuroimaging and data mining techniques the researchers mapped the brain activity associated with the recognition of speech sounds and voices.
 In their Science article "'Who' is Saying 'What'? Brain-Based Decoding of Human Voice and Speech," the four authors demonstrate that speech sounds and voices can be identified by means of a unique 'neural fingerprint' in the listener's brain. In the future this new knowledge could be used to improve computer systems for automatic speech and speaker recognition.
Seven study subjects listened to three different speech sounds (the vowels /a/, /i/ and /u/), spoken by three different people, while their brain activity was mapped using neuroimaging techniques (fMRI). With the help of data mining methods the researchers developed an algorithm to translate this brain activity into unique patterns that determine the identity of a speech sound or a voice. The various acoustic characteristics of the vocal cord vibrations were found to underlie these neural patterns of brain activity.
Just like real fingerprints, these neural patterns are both unique and specific: the neural fingerprint of a speech sound does not change if uttered by somebody else and a speaker's fingerprint remains the same, even if this person says something different.
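To make the decoding idea concrete, here is a minimal sketch of how such a "brain-reading" classifier can be trained and tested. It is not the authors' actual data-mining pipeline: it uses synthetic voxel patterns and an off-the-shelf linear classifier (scikit-learn's LinearSVC) as stand-ins, and it simply checks that vowel identity can still be decoded from trials of a speaker the classifier has never seen, mirroring the claim that a speech sound's neural fingerprint does not change when somebody else utters it.

```python
# Minimal sketch of brain-based decoding of speech content: train a linear
# classifier on SYNTHETIC fMRI-like voxel patterns for trials of /a/, /i/, /u/
# spoken by three speakers, then test on trials from a held-out speaker.
# This is not the authors' pipeline; it only illustrates the idea that a
# vowel's "neural fingerprint" should generalize across speakers.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_voxels, trials_per_cond = 500, 30
vowels, speakers = [0, 1, 2], [0, 1, 2]

# Each vowel gets a fixed spatial pattern; each speaker adds its own
# (vowel-independent) pattern plus trial noise -- a toy stand-in for real data.
vowel_maps = rng.normal(0, 1.0, (3, n_voxels))
speaker_maps = rng.normal(0, 1.0, (3, n_voxels))

X, y_vowel, y_speaker = [], [], []
for v in vowels:
    for s in speakers:
        trials = (vowel_maps[v] + speaker_maps[s]
                  + rng.normal(0, 2.0, (trials_per_cond, n_voxels)))
        X.append(trials)
        y_vowel += [v] * trials_per_cond
        y_speaker += [s] * trials_per_cond
X = np.vstack(X)
y_vowel, y_speaker = np.array(y_vowel), np.array(y_speaker)

# Train on speakers 0 and 1, then decode the vowel from the unseen speaker 2.
train, test = y_speaker != 2, y_speaker == 2
clf = LinearSVC(C=1.0, max_iter=10000).fit(X[train], y_vowel[train])
print("vowel accuracy on held-out speaker:",
      accuracy_score(y_vowel[test], clf.predict(X[test])))
```

The same setup can be flipped around (train on two vowels, test speaker identity on the third) to illustrate the complementary claim that a speaker's fingerprint stays the same whatever is being said.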
Moreover, this study revealed that part of the complex sound-decoding process takes place in areas of the brain previously associated only with the early stages of sound processing. Existing neurocognitive models assume that sounds are processed by different regions of the brain according to a certain hierarchy: after simple processing in the auditory cortex, the more complex analysis (speech sounds into words) takes place in specialised regions of the brain. However, the findings from this study imply a less hierarchical processing of speech that is spread out more across the brain.
The research was partly funded by the Netherlands Organisation for Scientific Research (NWO): two of the four authors, Elia Formisano and Milene Bonte, carried out their research with NWO grants (Vidi and Veni). The data mining methods were developed during the PhD research of Federico De Martino (doctoral thesis defended at Maastricht University).

Scientists Developing Robotic Hand of the Future

 Researchers at Carlos III University of Madrid's (UC3M) Robotics lab are participating in the international research project known as HANDLE. The objective of the project is to create a robotic hand that can reproduce the abilities and movements of a human hand in order to achieve the optimal manipulation of objects.

HANDLE is a large-scale "Integrated Project" within the Seventh European Framework Programme (FP7), with Spain among the participants. Its goal is to understand how humans manipulate objects and to replicate those grasping and movement abilities in an artificial, anthropomorphic articulated hand, endowing it with greater autonomy and producing natural and effective movements. "In addition to the desired technological advances, we are working with basic aspects of multidisciplinary research in order to give the robotic hand system advanced perception capabilities, high level information control and elements of intelligence that would allow it to recognize objects and the context of actions," explains the head researcher on the UC3M team working on this project, Mohamed Abderrahim, of the Madrid university's Department of Systems Engineering and Automation.
In his view, his team has already obtained very good results in the areas of visual perception and of kinematic and dynamic systems, which allow the system to recognize an object in its surroundings and pass the information on to the robotic hand's planning and movement system.
The robotic hand that these researchers are working with is mostly made up of numerous high-precision machined aluminum and plastic parts, as well as sensor and movement systems. In all, it has 20 actuators and can make 24 movements, the same as a human hand. Its size is the same as that of an average adult male's hand and it weighs approximately 4 kilograms. According to the partner in the project who manufactures the hand, the approximate cost of the version currently in development at UC3M comes to about 115,000 euros.
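To give a flavour of the kinematic side of such a system, here is a small sketch of the forward kinematics of a single finger modelled as a planar three-joint chain. The link lengths and joint layout are illustrative assumptions, not the HANDLE hand's actual geometry; the point is only to show how a planner turns actuator/joint angles into a fingertip position it can reason about.

```python
# Hedged sketch: forward kinematics of one finger treated as a planar
# 3-joint chain (knuckle, middle, distal joints). Phalanx lengths below are
# invented for the example; they are not measurements of the HANDLE hand.
import math

LINKS_MM = [45.0, 25.0, 20.0]  # assumed phalanx lengths, base to tip

def fingertip_position(joint_angles_deg):
    """Return the (x, y) fingertip position in mm for the given joint angles."""
    x = y = 0.0
    cumulative = 0.0
    for length, angle in zip(LINKS_MM, joint_angles_deg):
        cumulative += math.radians(angle)   # joint angles accumulate along the chain
        x += length * math.cos(cumulative)
        y += length * math.sin(cumulative)
    return x, y

if __name__ == "__main__":
    print(fingertip_position([0, 0, 0]))     # fully extended finger
    print(fingertip_position([40, 55, 30]))  # curled toward the palm
```

A real hand controller works the other way round as well (inverse kinematics: find joint angles that reach a desired contact point) and does it for every finger at once, which is part of why integrating perception, planning and control is the hard step described below.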
The problems involved in imitating a hand
When trying to recreate the movements of a human hand with a robotic system, there are several complex problems that must be resolved. In the first place, there is a lack of space. This is because "a human hand is incredibly complete, which makes it a challenge to try to put all of the necessary pieces into the robotic hand and to integrate all of the actuators that allow for mobility similar to that of a human hand," comments Professor Mohamed Abderrahim. Second, there are currently no sensors on the market that are small enough to be integrated into the device so that it can have sensitivity similar to that of a human hand and, thus, be able to make precise movements. Finally, even if the researchers manage to make a perfect robot from the mechanical and sensory point of view, without elements of intelligence the device will not be able to function autonomously or adapt its movements and control to the characteristics of the objects, such as their geometry, texture, weight or use.
"It is not the same to take hold of a screwdriver to pass it to someone, or to put it away, as it is to use it, because in the last situation, it has to be reoriented in the hand until it is in the right position to be used. This position has to be decided by the intelligence part of the robotic hand," the researchers say. "A robotic hand that is able to perform this seemingly simple task autonomously," they say "only exists in science fiction movies." "My personal estimation is that it will take around 15 years of research into these areas to build a robotic hand that is able to perform certain complex tasks with a level of precision, autonomy and dexterity that is similar to that of a human hand," predicts Professor Abderrahim.
The research carried out by the HANDLE project's partners has produced very interesting results in the areas of visual perception, motion planning, new sensors, the acquisition of motor skills using artificial intelligence techniques, and more. Nevertheless, important challenges remain when it comes to integrating the results obtained by all of the partners into a single system, a task that will occupy the next two years of work.
HANDLE (Developmental pathway towards autonomy and dexterity in robot in-hand manipulation) is a Large Scale "Integrated Project" funded by the European Union within The Seventh Framework Programme FP7, in which nine European institutions, coordinated by the Pierre and Marie Curie University of Paris (France), participate.
