In a paper published today in the Journal of Neuroscience, researchers at The Ohio State University report that they used functional magnetic resonance imaging (fMRI) to identify a region of the pSTS as the part of the brain activated when test subjects looked at images of people making different facial expressions. Further, the researchers discovered that neural patterns within the pSTS are specialized for recognizing movement in specific parts of the face. One pattern is tuned to detect a furrowed brow, another is tuned to detect the upturn of the lips into a smile, and so on. Using this fMRI data, Martinez and his team developed a machine learning algorithm that identifies which facial expression a person is looking at based solely on the fMRI signal, with about a 60 percent success rate regardless of the expression shown and regardless of the person viewing it.
Small region of brain recognizes facial expressions
Researchers pinpoint part of the brain that recognizes facial expressions
Raised eyebrows? Wrinkled nose? Curled-up corners of the lips? Most people looking at such expressions would immediately recognize surprise, disgust or happiness. Now researchers at Ohio State University in Columbus have identified which part of the brain accomplishes this feat. Aleix Martinez led the study.
Brain Responses to Dynamic Facial Expressions: A Normative Meta-Analysis
A facial expression[1] is one or more motions or positions of the muscles beneath the skin of the face. According to one set of controversial theories, these movements convey the emotional state of an individual to observers. Facial expressions are a form of nonverbal communication. They are a primary means of conveying social information between humans, but they also occur in most other mammals and some other animal species. Humans can adopt a facial expression voluntarily or involuntarily, and the neural mechanisms responsible for controlling the expression differ in each case.
Identifying facial expressions is crucial for social interactions. Functional neuroimaging studies show that a set of brain areas, such as the fusiform gyrus and amygdala, become active when viewing emotional facial expressions. The majority of functional magnetic resonance imaging (fMRI) studies investigating face perception employ static images of faces; however, some studies instead use dynamic facial expressions. By using quantitative fMRI meta-analysis, the present study examined the concordance of brain regions associated with viewing dynamic facial expressions.