Khaleej Times

Why privacy is first casualty of health apps

This app knows your feelings; who else does it tell?

- Caroline Chen

A Facebook message pops up on my phone screen. “What’s going on in your world?” It’s from a robot named Woebot, the brainchild of Stanford University psychologist Alison Darcy. Woebot seems to care about me. The app asks me for a list of my strengths, and remembers my response so it can encourage me later. It helps me set a goal for the week — being more productive at work. It asks me about my moods and my energy levels and makes charts of them.

“I’ll help you recognise patterns because... (no offence) humans aren’t great at that,” Woebot tells me with a smirking smile emoji.

So Woebot knows that I felt anxious on Wednesday and happy on Thursday. But who else might know? Unlike a pedometer, which tracks something as impersonal as footsteps, many mental-health apps in development rely on gathering and analysing information about a user’s intimate feelings and social life.

“Mental-health data is some of the most intimate data there can be,” said Adam Tanner, a fellow at Harvard University’s Institute for Quantitative Social Science.

Chatbots have existed since the 1960s — one was named after Pygmalion heroine Eliza Doolittle — but advances such as machine learning have made the robots savvier. Woebot is one of an emerging group of technological interventions that aim to detect and treat mental-health disorders. They’re not for everyone.

Some people may prefer unburdening themselves to a human, and many apps are hindered by bugs and dogged by privacy concerns. Still, the new technologies may fill gaps in current treatment options by detecting symptoms earlier and acting as coaches for individuals who might otherwise never seek counselling.

Warning signs

Clinicians and privacy experts are welcoming these inventions with one hand while holding up warning signs with the other. Technology might be a powerful tool to improve treatment, but an emotional problem, if it becomes known, can affect insurance coverage, ruin chances of landing a job or colour an employer’s perception.

With possible changes coming to healthcare law, it’s unclear if pre-existing mental-health conditions could once again be used to charge people more for insurance or deny them coverage.

Privacy concerns aside, the promise of collecting data is the ability to render a holistic picture of a person’s mental state that’s more accurate than infrequent assessments conducted in a doctor’s office.

Digital biomarkers

“Our approach is to ask, how can we measure in an unobtrusive and passive way?” said Tom Insel, former director of the National Institute of Mental Health.

Insel teamed up in May with Paul Dagum, a former cyber-security expert, to create a startup that mines the information on consumers’ phones, building “digital biomarkers” that may predict depression, anxiety and schizophrenia.

Called Mindstrong, the company tracks users’ every tap, swipe and keystroke, then keeps an eye out for patterns such as reaction speeds. It looks at locations and frequency of texts and calls. It also tracks word use. Without reading people’s e-mails, Mindstrong can look at “word histograms” that show how frequently certain words are used. When people become depressed, “there’s a shift in pronouns, instead of saying ‘we, you, they,’ it turns into ‘I, I, I,’” Insel said.
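To make the word-histogram idea concrete, here is a minimal Python sketch of the general technique, not Mindstrong’s actual code: it counts word frequencies and then compares first-person-singular pronouns against other pronouns. The function names and pronoun lists are illustrative assumptions.

```python
from collections import Counter
import re

# Illustrative assumption: which pronouns to track on each side.
I_PRONOUNS = {"i", "me", "my", "mine", "myself"}
OTHER_PRONOUNS = {"we", "us", "our", "you", "your", "they", "them", "their"}

def word_histogram(text: str) -> Counter:
    # Keep only word counts, not the text itself, mirroring the article's
    # point that the messages need not be read or stored.
    return Counter(re.findall(r"[a-z']+", text.lower()))

def self_focus_ratio(hist: Counter) -> float:
    # Share of tracked pronouns that are first-person singular.
    i_count = sum(hist[w] for w in I_PRONOUNS)
    other_count = sum(hist[w] for w in OTHER_PRONOUNS)
    total = i_count + other_count
    return i_count / total if total else 0.0

hist = word_histogram("I think I should call them, but I feel my energy is low.")
print(round(self_focus_ratio(hist), 2))  # 0.8: mostly 'I'-type pronouns
```

On that sample sentence the ratio is high because “I” and “my” dominate the tracked pronouns; a real system would presumably watch how such ratios drift over time rather than score a single message.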

Phone behaviour

Early evidence shows Mindstrong may be onto something. Dagum said they’ve found strong correlations between phone behaviour and traditional cognitive measures.

Woebot, too, has data that suggest a benefit. In a study of 70 people aged 18 to 28, scores measuring depression decreased significantly in the group that chatted with Woebot compared with those who read a National Institute of Mental Health e-book.

Yet the technology can be buggy, leading Woebot to misinterpret responses. When it prompts me to rewrite a negative thought “so it’s more positive,” I ask, “How?” and Woebot, following its script, cheers, “NICE!”

Despite occasional miscues, it’s hard to be annoyed with the cheery Woebot, whose personality Darcy said she modelled after Kermit the Frog. After two weeks of chatting, the robot has heard more about my daily moods than any of my friends.

Should I be concerned about how much these apps know about me? Mindstrong said it protects customers by not using behavioural data to sell products.

“We’re a health company and we need to build a brand of complete trust,” said Richard Klausner, Mindstrong’s executive chairman and a former director of the National Cancer Institute.

Darcy promises Woebot won’t sell customer information and that its employees view only anonymised responses. But the app works on Facebook Messenger, and Darcy concedes that she can’t vouch for how Facebook will use the data.

Facebook says it collects information including when users “message or communicate with others” in order to “provide, improve and develop services”. Spokeswoman Jennifer Hakes said Messenger abides by Facebook’s data policy, but “we do not read the content of messages between people or people and businesses.” — Bloomberg
