Jon Zimmerman, VP and General Manager, GE Healthcare (Seattle)
DIGITAL ISN’T ALL BRAND NEW; many of us have been at this for nearly 40 years. I call us ‘digital natives’, and we’re people who have spent many years working with digital technology, but who remain passionate about it and eager to solve the latest challenges in the industries we serve.
What is new today is the ability to do advanced processing, thanks to newly ubiquitous connectivity, storage and computing power. The biggest evolution I’ve seen in my time is Cloud computing; it is a major game changer. Those of us who have worked with digital technology for a long time are starting to figure out how to embrace the power of the new technologies in combination with what is already out there.
There is incredible digital power in healthcare: The ability to take a picture inside of somebody’s heart, from the outside — without disturbing them — is an incredible accomplishment in and of itself. The question is, could that idea be applied to delivering better care in more places or to developing better therapies through applied intelligence? These are the kinds of questions that keep digital natives engaged.
The outcomes that our customers want us to work with them on fall into four main categories: clinical quality, operational efficiency, financial performance and research. We work a lot on the financial side, because the U.S. has a very complex payment system. All too often, doctors do great work but aren’t paid for it: if the appropriate information about a patient and provider encounter is not received, the payer has the right to deny payment. Denied claims add up to $2 billion per year.
To address this, we developed DenialsIQ, which contains algorithms that turn the payers’ confusing codes into plain English, so providers can understand the root cause of the denial. The hope is that the system can eventually take outputs from DenialsIQ and change how claims are created in the first place, to avoid problems — creating what I call a ‘self-healing revenue cycle’. One customer recently told me, ‘We would spend 90 per cent of the time discovering the cause and 10 per cent fixing it; now, the percentages are reversed.’
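The core idea — translating opaque payer codes into root causes, then triaging the most common ones so they can be fixed upstream — can be sketched in a few lines. This is purely illustrative: the codes, descriptions and field names below are invented for the example, not taken from any GE Healthcare product.

```python
# Hypothetical sketch of a denial-code translator and triage step.
# Codes and mappings here are invented placeholders.

DENIAL_REASONS = {
    "CO-16": "Claim lacks information needed for adjudication",
    "CO-97": "Service already included in a previously paid bundle",
    "PR-204": "Service not covered under the patient's current plan",
}

def explain_denial(code: str) -> str:
    """Return a plain-English root cause for a payer denial code."""
    return DENIAL_REASONS.get(code, f"Unrecognized denial code: {code}")

def triage(denied_claims):
    """Count denied claims by root cause, most frequent first, so staff
    can fix the biggest upstream problems — the 'self-healing' loop."""
    counts = {}
    for claim in denied_claims:
        reason = explain_denial(claim["code"])
        counts[reason] = counts.get(reason, 0) + 1
    return sorted(counts.items(), key=lambda kv: -kv[1])
```

The triage step is what flips the 90/10 split the customer described: instead of investigating denials one at a time, the ranked list points directly at the most common root cause.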
One of the biggest healthcare challenges, operationally speaking, is that demand far outstrips supply in terms of clinical professionals, and that will only increase as the population gets older and sicker. How do you keep up with that? Well, you better have some really good information systems in place, to facilitate the best possible flow of information and care. We need tools to help doctors and nurses be more efficient in their daily work, and that’s a big part of what we do.
I don’t really believe in artificial intelligence. First, there is nothing artificial about the intelligence generated from computer algorithms. It is real intelligence, and it can have an impact on all the things we just talked about — clinical, financial, operational and research outcomes. Secondly, I also try to help our colleagues and customers realize that if intelligence is not applied, it doesn’t really matter. If there is a discovery in terms of efficiency, quality or finances, our customers expect us to put it to work to create an outcome for them. In my time, I’ve seen way too much innovation for the sake of innovating — without really moving the ball forward to achieve an outcome. That’s why we try to maintain a discipline whereby all the ‘intelligence’ we can generate with these new computing capabilities and networks is applied to a problem. That way, it’s sustainable and valuable, and it meets customer needs.
One of the biggest challenges for any company is getting to a place where you have consistent data across the organization. Before you can start working to make your data consistent — or to ‘normalize’ it, as we say — you first have to understand its current state: what you are capturing right now, where it comes into the system, how it gets there, and so on. The reason data varies so much across organizations is simple: we’ve had 40 years of computer systems that have basically been run out of context with one another. That’s why data is so diffuse.
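What normalization looks like in practice is mapping each legacy system’s field names and formats onto one common schema. The sketch below assumes two invented source systems with different conventions for the same facts — every name and format here is hypothetical.

```python
# Illustrative normalization across systems that grew up out of
# context with one another. Field names and formats are invented.

from datetime import datetime

# Per-source mapping from local field names to a common schema.
FIELD_MAPS = {
    "system_a": {"pt_name": "patient_name", "dob": "birth_date"},
    "system_b": {"PatientName": "patient_name", "DOB": "birth_date"},
}

DATE_FORMATS = ["%m/%d/%Y", "%Y-%m-%d"]

def normalize_date(value: str) -> str:
    """Coerce any known date format to ISO 8601."""
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(value, fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"Unknown date format: {value}")

def normalize(record: dict, source: str) -> dict:
    """Map a source record onto the common schema."""
    mapping = FIELD_MAPS[source]
    out = {mapping[k]: v for k, v in record.items() if k in mapping}
    if "birth_date" in out:
        out["birth_date"] = normalize_date(out["birth_date"])
    return out
```

The point of the exercise: once two systems emit identical records for identical facts, their data can finally be compared.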
Once your data becomes normalized, you can start to discover patterns across your various systems — and it is those patterns that will help you identify inefficiencies or quality disparities.
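Once records share a schema, even simple aggregation can surface the kinds of patterns described here — for example, a site with an unusually high denial rate. A minimal sketch, with all data and the threshold invented for illustration:

```python
# Hypothetical pattern discovery over normalized claim records:
# compute per-site denial rates and flag outliers.

from collections import defaultdict

def denial_rates(claims):
    """Compute the fraction of claims denied at each site."""
    totals = defaultdict(int)
    denied = defaultdict(int)
    for c in claims:
        totals[c["site"]] += 1
        if c["denied"]:
            denied[c["site"]] += 1
    return {site: denied[site] / totals[site] for site in totals}

def outliers(rates, threshold=0.2):
    """Flag sites whose denial rate exceeds a chosen threshold."""
    return [site for site, r in rates.items() if r > threshold]
```

A flagged site is exactly the kind of inefficiency or quality disparity that only becomes visible after normalization makes sites comparable.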