
Is Apple developing tech to monitor mental health?

How would you feel if a tech company could track your mood swings or diagnose you with mental health issues? It sounds bizarre, but this is exactly what Apple’s developers are working on right now. Diagnosing depression and cognitive impairment from iPhone data alone would be revolutionary, but it also raises ethical questions.

Carlos Martinez

Oct 13, 2021 · 3 min read


How is Apple planning to track your mental health?

Apple, together with the University of California, Los Angeles (UCLA) and the pharmaceutical company Biogen, is working on technology to diagnose depression, anxiety, and cognitive decline. Special algorithms would analyze your mobility, sleep patterns, physical activity, heart rate, and even your typing behavior to determine whether you should be concerned about your physical and mental health.

Apple claims that this data would be processed only on your device, not on Apple’s servers.
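Apple hasn’t published any of this research code, but the claim is easier to picture with a short example. Below is a minimal Swift sketch, assuming nothing more than the standard HealthKit permission flow, of how an app reads heart-rate samples locally; the readings never leave the device unless the app explicitly uploads them.

```swift
import HealthKit

// Minimal on-device Health data access. This is NOT Apple's research code,
// only the standard HealthKit flow any app goes through once the user grants
// permission. (Requires the NSHealthShareUsageDescription key in Info.plist.)
let healthStore = HKHealthStore()

if let heartRateType = HKQuantityType.quantityType(forIdentifier: .heartRate) {
    healthStore.requestAuthorization(toShare: nil, read: [heartRateType]) { granted, _ in
        guard granted else { return }

        let query = HKSampleQuery(sampleType: heartRateType,
                                  predicate: nil,
                                  limit: 10,
                                  sortDescriptors: nil) { _, samples, _ in
            // Samples come from the local Health database and are processed here;
            // nothing in this code sends them to a server.
            let bpm = HKUnit(from: "count/min")
            samples?.compactMap { $0 as? HKQuantitySample }
                .forEach { print($0.quantity.doubleValue(for: bpm)) }
        }
        healthStore.execute(query)
    }
}
```

The same permission system would gate any future mood or cognition features: the analysis can run on the device, but whether the results stay there depends entirely on what the app does next.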

Last year, UCLA collected Apple Watch and iPhone data from 150 volunteers and will track another 3,000 people this year to study stress, anxiety, and depression. Biogen uses similar data to identify mild cognitive impairment that could develop into Alzheimer’s. The results of these two studies might be used to create new features for Apple’s users.

While it’s still hard to predict whether UCLA’s and Biogen’s research will develop into actual iPhone and Apple Watch features, it already raises a lot of privacy concerns.

Why is it bad for your privacy?

We can all agree that Apple has noble intentions and wants to help those in need. However, many people don’t feel comfortable having their iPhone or smartwatch track everything they do. You can never be sure how this highly sensitive data will be handled and who will have access to it.

What could possibly go wrong?

  • Your smartphone could be hacked.
  • A malicious employee might snoop on your data.
  • You could lose your device, and a stranger might find out about your mental health issues and try to blackmail you.
  • Apple could fail to protect your privacy and accidentally expose your data, something that happens even to the biggest tech companies.

To gather enough information to diagnose cognitive impairment or depression, Apple would need access to your camera, location, exercise data, and typing speed. Handing so much data to a tech company is a privacy nightmare, no matter how noble its intentions are.

Depression and other mental health issues are stigmatized in society, and many patients might not feel safe sharing this data with Apple.

How much of your privacy are you willing to share?

Since the launch of the Apple Watch in 2015, the tech behemoth has shown a growing interest in health studies. Earlier this year, Apple announced plans to monitor your glucose level, body temperature, and blood pressure with a smartwatch. While it’s hard to tell whether these features will ever see the light of day, Apple’s vision is loud and clear.

Apple’s Health app is already a powerful piece of software that can collect a lot of different data. Let’s have a look at some of its features:

  • Allows users to share their Apple Watch data with doctors.
  • Allows you to store and access your health records on your watch, such as medications, immunizations, and lab results (including a digital COVID-19 vaccination card and test results).
  • Allows you to create an emergency Medical ID card, which lets first responders access your critical medical information from the Lock Screen.
  • Allows you to incorporate data from third-party meditation, fitness, or nutrition apps.
  • Allows you to share your Health app data with family members or any other people.

5 steps to enhance your online privacy and security

  • Don’t overshare on the Health app. While Apple allows you to merge data from third parties with Health and even import medical records, think twice before doing this. The more data you share, the more vulnerable you become.
  • Use strong passwords. Whether it’s your Health account or a third-party fitness app, always use strong and unique passwords. Combine lower-case and upper-case letters with numbers and special characters to create solid passwords (see the sketch after this list for one way to generate them).
  • Update your apps on time. While Apple adds new features to the Health app with every update, it also patches known software vulnerabilities. When an app contains so much sensitive information, postponing updates could cause you trouble one day.
  • Check app permissions. If a third-party meditation app asks to access your contacts or calendar, this should raise your suspicion. You can find malicious apps even in the App Store, so always check reviews and read about the developers.
  • Use a VPN. A virtual private network encrypts your internet traffic and hides your IP, thus enhancing your online privacy. With one NordVPN account you can protect up to six different devices: laptops, smartphones, tablets, routers, and more.

    A VPN also shields your connection with encryption on public Wi-Fi, a common vector for cyberattacks.
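To make the password advice concrete, here is a small, hypothetical Swift helper (the function name and character sets are illustrative, not from any particular library) that mixes all four character classes and shuffles the result.

```swift
import Foundation

// Hypothetical helper for generating strong, unique passwords.
// Swift's default random generator draws from the system's secure random source.
func makeStrongPassword(length: Int = 16) -> String {
    let pools = [
        "abcdefghijklmnopqrstuvwxyz",   // lower-case letters
        "ABCDEFGHIJKLMNOPQRSTUVWXYZ",   // upper-case letters
        "0123456789",                   // numbers
        "!@#$%^&*()-_=+"                // special characters
    ]
    // Take one character from each pool so every class is represented...
    var characters = pools.compactMap { $0.randomElement() }
    let allCharacters = pools.joined()
    while characters.count < length {
        if let next = allCharacters.randomElement() {
            characters.append(next)
        }
    }
    // ...then shuffle so the mandatory characters don't sit in a fixed order.
    return String(characters.shuffled())
}

print(makeStrongPassword())   // e.g. "q7R+xV2p!Lk9mW_a"
```

Pair a generator like this with a password manager, so each account gets its own password that you never have to memorize.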

Online security starts with a click.

Stay safe with the world’s leading VPN