Experts are wary of Apple’s AI research to detect mood

While Apple is reportedly working on AI technology capable of detecting mental health states and emotions, some are skeptical.

Jorge Barraza, assistant professor of the practice of psychology at the University of Southern California and technical director of Immersion, a neuroscience technology vendor, is among the skeptics.

“When we infer things from emotional AI at the macro level – which means we tend to see patterns at the macro level – at the individual level, it starts to get a little more questionable,” said Barraza.

Outside of a social context, “we don’t know how much meaning [emotion] has to help us understand people’s psychological experiences,” he added. “Different types of expressions or emotions can have different meanings depending on the social context.”

The effort apparently grew out of an Apple-sponsored joint research project with UCLA that the university first made public in 2020, according to The Wall Street Journal.

Apple and UCLA researchers are looking to create algorithms that can use digital signals to detect depression or anxiety. The data points they use include facial recognition, sleep patterns, typing behavior, and vital signs.

Mental health research

Researchers are using Apple devices, including the iPhone and Apple Watch, along with a Beddit sleep monitor. The project started with 150 participants in 2020 and is expected to involve around 3,000 people by the end of 2023.

Neither Apple nor UCLA responded to requests for comment on the research project.

Researchers are not only trying to understand a person’s overall mental health; they are also seeking to determine whether a person is suffering from anxiety or depression.

Research based on personal devices, while unproven, could yield useful tools, Barraza said.


“I see this technology as very promising,” he said. “Not in terms of diagnosing things like depression or anxiety, but at least serving in a directional way to give people an awareness of their day-to-day lives.”

Apple’s interest in emotional AI

Apple’s interest in emotional AI began in 2016 when it purchased Emotient, a vendor that uses AI to read emotions.

Emotient is one of a growing number of vendors in the emotion AI field. Companies are already using similar AI and machine learning systems to gauge employee engagement and assess job candidates.

Apple’s use of the technology differs from previous efforts in that researchers are drawing on multiple data points, Barraza said. AI researchers typically focus on either facial recognition (capturing expressions such as smiles and frowns) or voice analysis (tone and word choice). The researchers working with Apple and UCLA are looking at both facial recognition and voice analysis, as well as heart rate, sleep patterns, and more.

“We are talking about a big data set,” he said. “We’re not relying on just one piece of information to tell us how people experience [their emotions].”
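Neither Apple nor UCLA has published the algorithms, but the multi-signal approach Barraza describes can be sketched in a few lines: each device reading becomes one normalized feature, and the features are combined into a single vector for a downstream model. All field names, units, and normalization ranges below are illustrative assumptions, not details of the actual study.

```python
# Hypothetical sketch of multimodal feature fusion: combine several
# device signals into one feature vector instead of relying on one.
# Field names, units, and ranges are invented for illustration.

def normalize(value, lo, hi):
    """Scale a raw reading into [0, 1], clamping out-of-range values."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def fuse_signals(sample):
    """Turn one day's raw readings into a single feature vector."""
    return [
        normalize(sample["resting_heart_rate"], 40, 120),  # beats/min
        normalize(sample["sleep_hours"], 0, 12),           # hours/night
        normalize(sample["typing_speed"], 0, 600),         # chars/min
        normalize(sample["smile_ratio"], 0.0, 1.0),        # fraction of frames
    ]

day = {
    "resting_heart_rate": 64,
    "sleep_hours": 7.5,
    "typing_speed": 240,
    "smile_ratio": 0.3,
}
features = fuse_signals(day)
print(features)
```

The point of the sketch is the design choice Barraza highlights: a model fed this vector sees heart rate, sleep, typing, and facial data together, so no single noisy signal decides the outcome.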

Emotions differ depending on the social context

While technology can be useful in educating people about their emotional well-being, Barraza said the approach should always be viewed with skepticism, especially if the data is used to predict how one feels.

Whatever the intentions of Apple or whoever owns the technology, it may not end up being used as intended. Instead, it could be used in ways detrimental to an employee, or the data could be misinterpreted.

Culture and emotions

Another challenge of emotional AI is how to deal with the way different emotions are viewed or perceived in different cultures.

“What might be different in certain cultures or certain subgroups or certain ages… this nuance is so difficult to detect,” said R “Ray” Wang, founder and senior analyst of Constellation Research.

Wang said the challenge for any business trying to develop emotional AI is knowing when the data is good enough.

Researchers need to determine the level of accuracy they want in order to limit false positives and false negatives, he said. They should investigate where bias might exist and where spurious patterns might appear in the dataset. That could mean accounting for cultural differences, accents, or even racial differences that affect how emotions are expressed.
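Wang’s trade-off between false positives and false negatives can be made concrete with standard precision and recall arithmetic. The counts below are invented purely for illustration; they are not from the Apple/UCLA study.

```python
# Toy illustration of the precision/recall trade-off Wang describes.
# All counts are invented, not drawn from any real screening tool.

def precision_recall(tp, fp, fn):
    """Precision: of the cases flagged, how many were real.
    Recall: of the real cases, how many were flagged."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

# A sensitive screen misses few real cases (low fn) but raises
# many false alarms (high fp): precision 0.40, recall 0.80.
p1, r1 = precision_recall(tp=80, fp=120, fn=20)

# A conservative screen raises few false alarms but misses more
# real cases: precision ~0.83, recall 0.50.
p2, r2 = precision_recall(tp=50, fp=10, fn=50)

print(p1, r1, p2, r2)
```

For a mental health tool, each error type carries a different cost: a false positive may alarm a healthy user, while a false negative may leave someone struggling without a prompt to seek help, which is why Wang argues the acceptable balance must be decided before release.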

However, as one of the world’s largest makers of mobile devices, Apple may stand the best chance of making the technology work, thanks to its vast base of users.

“We are at the beginning of emotional AI,” Wang said. “It’s going to take off over time. But if you release it too soon and lose people’s trust, that’s the risk.”
