Modern Healthcare

OPINIONS/EDITORIALS

Strong federal role needed to organize productive use of patient data


The era of big data is approaching. It's hard to overstate its potential for revolutionizing the practice of medicine, developing and improving technologies and upgrading the delivery system's overall efficacy and efficiency. Yet none of that will happen unless the government plays a prominent role in organizing the collection and use of the massive amounts of patient information that will soon be within reach.

To gauge the importance of the federal role, look no further than the $22.5 billion the government will spend on physicians and hospitals to adopt electronic health records. It accelerated a process that, sadly, had lagged behind other industries. But now, digitization of medical records is irreversible—adoption officially passed the halfway point last month. That creates a golden opportunity for researchers and analysts to gather privacy-protected data and put it to productive use.

Within hospitals, system officials will have the capacity to quickly analyze variations in practice patterns, determine what works best, and encourage practitioners to make adjustments that improve patient outcomes. Officials can pinpoint the causes of high readmission rates, spot medication errors and identify and better manage high-cost patients.

More broadly, the era of big data allows providers and payers to merge their de-identified patient records into huge cloud-based databases. Given access to this trove of information, researchers could run retrospective clinical trials, conduct comparative effectiveness research and analyze the real-world outcomes of highly touted technologies to see if they actually live up to their promise.

No area will benefit more from the era of big data than oncology. Nearly 200 forms of cancer strike more than 1.2 million Americans every year and cause more than 500,000 deaths. Yet no one collects or analyzes the outcomes from the varying treatment regimens that people receive.

Earlier this month, Public Health England announced it would begin collecting and genetically analyzing tumor samples while tracking treatment regimens and outcomes for every single cancer patient in its National Health Service. The database will allow researchers to analyze how every cancer sub-type responds to different regimens, and adjust future treatments accordingly. So what is happening here in the U.S.? A few years ago, the National Cancer Institute formed the Cancer Biomedical Informatics Grid (caBIG) to collect similar information. But the effort fell apart because few institutions chose to participate, and the agency lost interest.

The American Society of Clinical Oncology, the oncologists' professional society, recently announced its own data-mining effort for breast cancer with 150,000 medical records. That's a pittance compared with the quarter million American women newly diagnosed with breast cancer every year.

Not every federal agency has fumbled the ball. The Food and Drug Administration launched its Sentinel project, which is collecting more than 125 million patient drug records to look for safety problems in approved drugs. Project managers are already churning out studies. But creating a cloud-based database for drug research is easy compared with integrating the complex payment and patient records from the broader healthcare system. That will require collaboration among practicing physicians, hospitals and post-acute-care settings on the care side, along with claims data from public and private payers.

With hundreds of consultants and companies now offering big data analytics to their clients, healthcare faces the specter of a cacophony of competing, non-communicating "solutions." It will take the strong hand of government to ensure the era of big data in healthcare doesn't take as long to arrive as the adoption of EHRs did.

MERRILL GOOZNER
Editor
