A common difficulty for people with autism is interpreting facial expressions. This can make it hard to read social cues in their personal lives, at school, at work, and even in media such as movies and TV shows. Now, researchers at MIT have developed an AI model that helps shed light on exactly why that is.
A paper published on Wednesday in The Journal of Neuroscience found that neurotypical adults (those who don’t have autistic traits) and adults with autism may have crucial differences in a brain region called the inferior temporal (IT) cortex. These differences could determine whether or not they can detect emotions in facial expressions.
“The study suggests that, for this visual behavior, [the IT cortex] plays a strong role,” Kohitij Kar, a neuroscientist at MIT and author of the study, told The Daily Beast. “But it may not be the only region. Other regions such as the amygdala are also heavily involved. But these studies show how having good [AI models] of the brain will also be key to identifying those regions.”
Kar’s neural network actually builds on an experiment done previously by other researchers. In that study, AI-generated images of faces expressing emotions ranging from fearful to happy were shown to autistic adults and neurotypical adults. The volunteers judged whether each face was happy, and the autistic adults needed a much clearer indication of happiness, e.g. a wider smile, than the neurotypical participants did before labeling a face as happy.
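The behavioral difference described above, needing a stronger smile before judging a face happy, amounts to a shift in the 50% point of a psychometric curve. The sketch below is purely illustrative and uses synthetic data (the group thresholds, slope, and trial counts are assumptions, not values from the study); it shows how such a threshold shift can be estimated from happy/not-happy judgments.

```python
import numpy as np

rng = np.random.default_rng(1)

# Smile-intensity levels of the (synthetic) face images, from 0 (fearful) to 1 (happy).
intensities = np.linspace(0.0, 1.0, 11)

def simulate_judgments(threshold, slope=12.0, trials=200):
    """Simulate the fraction of 'happy' responses at each intensity level,
    assuming a logistic psychometric function (an illustrative assumption)."""
    p_happy = 1.0 / (1.0 + np.exp(-slope * (intensities - threshold)))
    return rng.binomial(trials, p_happy) / trials

def threshold_50(rates):
    """Intensity at which half the faces are judged happy (linear interpolation)."""
    i = int(np.argmax(rates >= 0.5))      # first level at or above 50%
    x0, x1 = intensities[i - 1], intensities[i]
    y0, y1 = rates[i - 1], rates[i]
    return x0 + (0.5 - y0) * (x1 - x0) / (y1 - y0)

# Hypothetical group thresholds: the autistic group needs a clearer smile,
# mirroring the direction of the reported behavioral result.
nt_rates = simulate_judgments(threshold=0.40)   # neurotypical group
asd_rates = simulate_judgments(threshold=0.60)  # autistic group
```

Comparing `threshold_50(asd_rates)` with `threshold_50(nt_rates)` then quantifies how much clearer the expression must be for the autistic group in this toy setup.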
Kar then fed the data from that experiment into an AI designed to roughly mimic the layers of the human brain’s visual processing system. First, he found that the neural network recognized the facial emotions about as well as the neurotypical participants did. He then removed layers one at a time and retested it, until he reached the final layer, which previous research suggests roughly mimics the IT cortex. At that point, he found that the AI had trouble keeping up with the neurotypical adults and instead tended to mirror the autistic participants’ judgments.
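The layer-by-layer procedure described above can be sketched in miniature. The code below is not the network from the study; it is a toy stand-in (random tanh layers over synthetic feature vectors) that illustrates the general technique of fitting a linear decoder to each layer's activations to see which stage of the hierarchy carries the behavioral signal. The labels are generated from the deepest layer by construction, so decodability should peak there.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, weights):
    """Run input through a stack of tanh layers, keeping every layer's activations."""
    acts, h = [], x
    for W in weights:
        h = np.tanh(h @ W)
        acts.append(h)
    return acts

def linear_readout_accuracy(feats, labels):
    """Fit a least-squares linear decoder and score sign agreement with the labels."""
    w, *_ = np.linalg.lstsq(feats, labels, rcond=None)
    return float((np.sign(feats @ w) == labels).mean())

n, d, depth = 600, 20, 4
x = rng.standard_normal((n, d))                     # synthetic "face" features
weights = [rng.standard_normal((d, d)) * (3.0 / np.sqrt(d)) for _ in range(depth)]
acts = forward(x, weights)

# Illustrative assumption: happy/not-happy labels are a linear function of the
# deepest layer, loosely analogous to the behavioral signal living in the
# model's IT-like final stage.
v = rng.standard_normal(d)
labels = np.sign(acts[-1] @ v)

accs = [linear_readout_accuracy(a, labels) for a in acts]
```

Plotting `accs` against layer depth in a setup like this shows where decodable information about the judgment emerges, which is the logic behind probing (or removing) layers one at a time.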
This suggests that this part of the brain, located at the end of the visual processing pipeline, could be responsible for reading emotions in faces. The study could lay the groundwork for better ways to diagnose autism, and Kar adds that it could also help in developing engaging media and educational tools for autistic children.
“Autistic kids sometimes rely heavily on visual cues for learning and for directions,” explains Kar. “Having an accurate model where you can put images in and the model tells you, ‘This is going to work best, and this isn’t going to work,’ can be very useful for that purpose. Any visual content such as movies, cartoons and educational content can be optimized with such models to maximally communicate with, help and encourage autistic individuals.”
Source: The Daily Beast, “MIT AI Bot Discovers Why Some Autistic Adults Can’t Detect Emotion” — https://www.thedailybeast.com/mit-ai-bot-discovers-why-some-autistic-adults-cant-detect-emotion?source=articles&via=rss