Apple Working to Launch Feature to Detect Depression, Cognitive Decline

Apple is aiming to launch a feature on its iPhones that can detect depression, anxiety and cognitive decline using various digital clues, such as how people type, move or sleep, among other things.

According to The Wall Street Journal, Apple is trying to expand the scope of its growing health portfolio. The data that may be used for detection of impairment includes analysis of participants’ facial expressions, how they speak, the pace and frequency of their walks, sleep patterns, and heart and respiration rates. 

It may also measure the speed of their typing, frequency of their typos and content of what they type, among other data points.

For privacy protection, the diagnostic work is supposed to be carried out on the device itself, with no data sent to Apple's servers.

The efforts are part of research partnerships Apple is conducting with the University of California, Los Angeles (UCLA), which is studying stress, anxiety and depression, as well as with the pharmaceutical company Biogen, which is studying mild cognitive impairment.

The code name for the UCLA project is “Seabreeze” while “Pi” is the code name for the Biogen project.

The UCLA study will track data for 3,000 volunteers from this year onward, while the Biogen one aims to recruit around 20,000 people to participate over the next two years.

It should be noted that this work is at a very early stage and may not result in any changes to Apple's products.

However, The Wall Street Journal reports that Apple executives remain optimistic about the possibility. Chief Operating Officer Jeff Williams, who oversees Apple's health unit, has spoken enthusiastically about the company's potential to address rising rates of depression and anxiety, as well as other brain disorders, according to people who have heard him discuss the efforts.

Privacy concerns

On the face of it, these efforts look like a massive privacy intrusion: every detail of a user's physical behaviour would be scrutinised to flag a potential disorder or illness, whether present or future. To address these concerns, Apple is aiming for algorithms that run on users' devices and do not send the data to Apple's servers. This, however, does little to reduce the concerns around privacy, as users would still be vulnerable to a diagnosis, or a misdiagnosis, by a device that analyses every small, potentially meaningless movement.

While there are usually early signs of major cognitive impairments or mental health issues, and these signs may result in distinctive smartphone usage, the challenge is creating algorithms that are sufficiently reliable and accurate to diagnose specific conditions.

These privacy concerns around algorithmic detection come after Apple recently announced another feature that it said would help combat child abuse. Apple announced a new detection tool called NeuralHash, which can identify child sexual abuse material (CSAM) stored on an iPhone without decrypting the image.

Apple said it has implemented multiple checks to reduce the chance of errors before images are passed to the National Center for Missing and Exploited Children (NCMEC) and then to law enforcement. The feature will roll out in the United States this year.

Critics were not against Apple's aim to report child abuse, but rather had concerns about the software it was using to scan and analyse private photos. The announcement led to questions about whether Apple's new scanning technology would enable even broader surveillance around the world, and whether governments would start demanding that Apple scan for all sorts of forbidden content on the iPhones in their countries.

However, Apple refuted these concerns in a FAQ response document, where it said that its “CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups”.

“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” Apple said.

Romessa Nadeem is a Project Coordinator at Media Matters for Democracy, which runs the Digital Rights Monitor.
