New software capable of detecting intricate details of emotions that remain hidden to the human eye has been developed by researchers – and could change how mood or mental health disorders such as postnatal depression are spotted and treated.

The software, which uses a ‘digital mask’ to detect key features and expressions of the face, can evaluate the intensities of multiple different facial expressions simultaneously.

It can pick up the complicated emotions humans experience – with the algorithms able to show that someone is 5% sad or 10% happy, for example.
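To illustrate how a system could report blended emotions as percentages, here is a minimal sketch: raw per-expression scores from a hypothetical facial-analysis model are converted to intensities that sum to 100%. The expression names, scores, and softmax conversion are assumptions for illustration, not the study's actual method or output.

```python
import math

# Hypothetical per-expression scores from a facial-analysis model.
# Names and values are illustrative only.
logits = {"happy": 0.4, "sad": 0.1, "neutral": 2.0, "surprised": -0.5}

def intensities(scores):
    """Convert raw scores to percentage intensities via softmax."""
    exps = {k: math.exp(v) for k, v in scores.items()}
    total = sum(exps.values())
    return {k: round(100 * v / total, 1) for k, v in exps.items()}

print(intensities(logits))
```

A representation like this lets a single frame carry several simultaneous intensities (e.g. mostly neutral with a small share of sadness) rather than one hard label.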

Funded by the European Research Council (ERC), the Manchester Met and University of Bristol team investigated how well computers could capture authentic human emotions in everyday family life. This included the use of videos taken at home, captured by headcams worn by babies during interactions with their parents.

The findings, published in Frontiers, show that scientists can use artificial intelligence to accurately predict human judgements of parents’ facial expressions based on the computers’ decisions.

Rebecca Pearson, Professor of Psychology at Manchester Metropolitan University and co-author and principal investigator of the project, said: “We found that mood or mental health conditions, such as postnatal depression, could be better understood through measuring subtle nuances in parents’ facial expressions. These results provide early intervention opportunities that were once unimaginable.

“For example, most parents will try to ‘mask’ their own distress and appear ‘okay’ to those around them. More subtle combinations can be picked up by the software, including expressions that are a mix of sadness and joy, or that change quickly.”

The team used data from the ‘Children of the 90s’ health study, also known as the Avon Longitudinal Study of Parents and Children (ALSPAC). Parents were invited to attend a clinic at the University of Bristol when their babies were 6 months old, where they were provided with two wearable headcams to take home and use during interactions with their babies. Parents and infants both wore the headcams during feeding and play interactions.


The researchers then used automated software to analyse parents’ facial expressions in the videos, and checked the software’s output against human coding of the same footage.

The team looked at how frequently the software was able to detect the face in the video and checked how often the humans and the software agreed on facial expressions. Finally, they used artificial intelligence to predict human judgements based on the computer’s decisions.
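The two checks described above, how often the software detects a face and how often its labels agree with human coders, can be sketched with toy data. Everything here is illustrative: the frame labels are invented, and Cohen's kappa is shown as one standard chance-corrected agreement measure, not necessarily the one the study used.

```python
from collections import Counter

# Toy frame-by-frame expression labels (illustrative only).
# None means the software failed to detect a face in that frame.
software = ["happy", "sad", None, "happy", "neutral", "happy", None, "sad"]
human    = ["happy", "sad", None, "neutral", "neutral", "happy", None, "sad"]

# How frequently the software detected a face
detection_rate = sum(s is not None for s in software) / len(software)

# Raw agreement on frames where both sources gave a label
pairs = [(s, h) for s, h in zip(software, human) if s and h]
agreement = sum(s == h for s, h in pairs) / len(pairs)

def cohens_kappa(pairs):
    """Chance-corrected agreement between two raters."""
    n = len(pairs)
    po = sum(a == b for a, b in pairs) / n            # observed agreement
    ca = Counter(a for a, _ in pairs)                 # rater A label counts
    cb = Counter(b for _, b in pairs)                 # rater B label counts
    pe = sum(ca[k] * cb.get(k, 0) for k in ca) / (n * n)  # chance agreement
    return (po - pe) / (1 - pe)

print(detection_rate, agreement, cohens_kappa(pairs))
```

With agreement established, the final step the researchers describe, using AI to predict human judgements from the software's decisions, amounts to training a model on pairs like these.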

Lead author Romana Burgess, PhD student on the EPSRC Digital Health and Care CDT in the School of Electrical, Electronic and Mechanical Engineering at the University of Bristol, explained: “Deploying automated facial analysis in the parents’ home environment could change how we detect early signs of mood or mental health disorders, such as postnatal depression.”

Now the team plan to explore the use of automated facial coding in the home environment as a tool to understand mood and mental health disorders and interactions. This will help to pioneer a new era of health monitoring, bringing innovative science directly into the home.

Burgess added: “Our research used wearable headcams to capture genuine, unscripted emotions in everyday parent-infant interactions. Together with the use of cutting-edge computational techniques, this means we can uncover hidden details that were previously unattainable by the human eye, changing how we understand parents’ real emotions during interactions with their babies.”

As an extension to the ERC project, the Manchester Met team is now collecting headcam data from teenagers, with the plan to use the same methods to understand complex teen emotions at home. Data has so far been collected from 70 parents and teenagers, and collection is ongoing.

Professor Pearson added: “Teenagers’ emotions are even more complex, so the potential for our extended project is huge. To help us understand the meaning behind the data we’re holding interactive workshops in November, giving teenagers and parents the chance to have a go at facial coding and give us feedback on the technology.”
