Houston Chronicle Sunday

Use of smartphones to check health is a work in progress

By Hannah Norman

This report is a product of Kaiser Health News, a nonprofit news service covering health issues. It is not affiliated with Kaiser Permanente.

The same devices used to take selfies are being repurposed and commercialized for quick access to information needed for monitoring patient health. A fingertip pressed against a phone's camera lens can measure a heart rate. The microphone, kept by the bedside, can screen for sleep apnea.

In the best of this new world, the data is conveyed remotely to a medical professional for the convenience and comfort of the patient — all without the need for costly hardware.

But using smartphones as diagnostic tools is a work in progress. Although doctors and their patients have found some real-world success, experts said their overall potential is unfulfilled and uncertain.

Smartphones come packed with sensors capable of monitoring a patient's vital signs. They can help assess people for concussions, watch for atrial fibrillation and conduct mental health wellness checks, to name a few nascent applications.

Eager companies and researchers are tapping into phones' built-in cameras and light sensors; microphones; accelerometers, which detect body movements; gyroscopes; and even speakers. The apps then use artificial intelligence software to analyze the collected sights and sounds to create an easy connection between patients and physicians. In 2021, more than 350,000 digital health products were available in app stores, according to a report by Grand View Research.

“It's very hard to put devices into the patient home or in the hospital, but everybody is just walking around with a cellphone that has a network connection,” said Andrew Gostine, a physician and CEO of the sensor network company Artisight. Most Americans own a smartphone, including more than 60 percent of people 65 and over, according to the Pew Research Center. The pandemic has also made people more comfortable with virtual care.

The makers of some of these products have sought clearance from the Food and Drug Administration to market them as medical devices. Others have been designated as exempt from the regulatory process, placed in the same clinical classification as a Band-Aid. But how the FDA handles AI- and machine learning-based medical devices is still being adjusted to reflect software's adaptive nature.

Ensuring accuracy and clinical validation is crucial to securing buy-in from health care providers. And many tools still need fine-tuning, said Eugene Yang, a clinical professor of medicine at the University of Washington.

Judging these new technologies is difficult because they rely on algorithms built by machine learning and AI to collect data, rather than the physical tools typically used in hospitals. So researchers cannot “compare apples to apples” with medical industry standards, Yang said. Failure to build in such assurances can undermine the technology's goals of easing costs and access because a doctor still must verify results, he added.

Big tech companies such as Google have heavily invested in the area, catering to clinicians and in-home caregivers, as well as consumers. Currently, Google Fit app users can check their heart rate by placing their finger on the rear-facing camera lens or track their breathing rate using the front-facing camera.

Google's research uses machine learning and computer vision, a field within AI based on information from visual inputs such as videos or images. So instead of using a blood pressure cuff, for example, the algorithm can interpret slight visual changes to the body that serve as proxies and biosignals for blood pressure, said Shwetak Patel, director of health technologies at Google and a professor of electrical and computer engineering at the University of Washington.
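The fingertip heart-rate trick mentioned above generally rests on photoplethysmography: blood pulsing through the finger subtly changes how much light reaches the lens, and the dominant frequency of that brightness signal is the pulse. The sketch below illustrates the general idea only; it is not Google's or any vendor's actual algorithm, and the function name, frame format and frequency band are assumptions chosen for illustration.

```python
import numpy as np

def estimate_heart_rate_bpm(frames: np.ndarray, fps: float) -> float:
    """Illustrative sketch only, not a vendor algorithm.

    frames: array of shape (n_frames, height, width) holding red-channel
    intensities captured over several seconds while a finger covers the lens.
    fps: camera frame rate in frames per second.
    """
    # Average brightness per frame gives a rough photoplethysmography signal.
    signal = frames.reshape(len(frames), -1).mean(axis=1)
    signal = signal - signal.mean()              # remove the constant offset

    # Pick the strongest frequency in a plausible heart-rate band (40-180 bpm).
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 40 / 60) & (freqs <= 180 / 60)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0                        # convert Hz to beats per minute
```

In practice, commercial apps layer motion-artifact rejection, signal-quality checks and clinical validation on top of this basic idea before reporting a number to the user.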

Google is also investigating the effectiveness of its smartphone's built-in microphone for detecting heartbeats and murmurs and using the camera to preserve eyesight by screening for diabetic eye disease, according to information the company published last year.

The tech giant recently bought Sound Life Sciences, a Seattle startup with an FDA-cleared sonar technology app. It uses a smart device's speaker to bounce inaudible pulses off a patient's body to identify movement and monitor breathing.
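As a rough sketch of how sonar-style sensing can work (not Sound Life Sciences' cleared method; the tone frequency, chunk size and breathing band below are assumptions): the phone plays a steady near-ultrasonic tone through its speaker, the microphone records the room, and slow rises and falls in the strength of the reflected tone track chest movement.

```python
import numpy as np

def estimate_breathing_rate_bpm(mic: np.ndarray, sample_rate: int,
                                tone_hz: float = 18_000.0) -> float:
    """Illustrative sketch only. mic: mono microphone samples (a minute or
    more) recorded while the speaker plays a steady tone at tone_hz."""
    # Mix the recording down to baseband so energy near tone_hz sits near 0 Hz.
    t = np.arange(len(mic)) / sample_rate
    baseband = mic * np.exp(-2j * np.pi * tone_hz * t)

    # Averaging the complex baseband over 50 ms chunks acts as a crude low-pass
    # filter; the magnitude of each chunk is the strength of the reflected tone.
    chunk = int(0.05 * sample_rate)
    n_chunks = len(baseband) // chunk
    envelope = np.abs(baseband[:n_chunks * chunk].reshape(n_chunks, chunk).mean(axis=1))
    envelope -= envelope.mean()

    # Dominant frequency in a plausible breathing band (6-30 breaths per minute).
    env_rate = sample_rate / chunk               # envelope samples per second
    spectrum = np.abs(np.fft.rfft(envelope))
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / env_rate)
    band = (freqs >= 6 / 60) & (freqs <= 30 / 60)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0                        # breaths per minute
```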

Binah.ai, based in Israel, is also using the smartphone camera to calculate vital signs. Its software studies the region around the eyes and analyzes the light reflecting off blood vessels back to the lens, company spokesperson Mona Popilian-Yona said.

Applications even reach into disciplines such as optometry and mental health:

With the microphone, Canary Speech uses the same underlying technology as Amazon's Alexa to analyze patients' voices for mental health conditions. The software can integrate with telemedicine appointments and allow clinicians to screen for anxiety and depression using a library of vocal biomarkers and predictive analytics, said Henry O'Connell, the company's CEO.

Australia-based ResApp Health got FDA clearance last year for an iPhone app that screens for moderate to severe obstructive sleep apnea by listening to breathing and snoring. SleepCheckRx, which will require a prescription, is minimally invasive when compared with sleep studies now used to diagnose sleep apnea.

Brightlamp's Reflex app is a clinical decision support tool for helping manage concussions and vision rehabilitation, among other things. Using an iPad's or iPhone's camera, the mobile app measures how a person's pupils react to changes in light. Through machine learning analysis, the imagery gives practitioners data points for evaluating patients. Brightlamp sells directly to health care providers and is being used in more than 230 clinics. Clinicians pay a $400 standard annual fee per account, which is not covered by insurance. The Defense Department has an ongoing clinical trial using Reflex.

In some cases, such as with the Reflex app, data is processed directly on the phone rather than in the cloud, Brightlamp CEO Kurtis Sluss said. By processing everything on the device, the app avoids running into privacy issues, as streaming data elsewhere requires patient consent.

But algorithms need to be trained and tested by collecting reams of data, and that is an ongoing process.

Researchers, for example, have found that some computer vision applications, including some for heart rate and blood pressure monitoring, can be less accurate for people with darker skin. Studies are underway to find better solutions.

“We're not there yet,” Yang said. “That's the bottom line.”
