When assessing medical devices, government entities like the Office for Civil Rights (OCR) are too focused on the privacy of information rather than medical device safety and security, according to Kevin Fu, director of the Archimedes Center for Medical Device Security.
What Fu, who is also an associate professor of computer science and engineering at the University of Michigan, means is that medical device initiatives focus on complying with HIPAA and keeping patients' health information private. The focus has not, however, been on making sure those devices themselves are safe and secure, he said at the HIT Summit in Boston.
“To me we’ve been focused on privacy and HIPAA,” Fu said. “But we’ve forgotten about safety.”
This is in part due to medical device vendors not putting the proper security protocols into their technology, he said.
“It’s more important to point out these were risks that were built into devices from the get-go and if you’re an HDO it’s sort of too late,” Fu said. “You’re basically trying to retroactively compensate for the flaws that were baked in from the very beginning … but it’s only been in the last year or two that we’ve started to see … the effects.”
Lack of medical device safety
To illustrate the lack of medical device safety, Fu used implantable defibrillators as an example.
“Built into every defibrillator that I’m aware of is a wireless [unauthenticated] debugging command to [induce] fatal heart rhythms,” Fu said. “It’s there for a safety reason. The reason it’s there is because when the patient is given their implant and then their body is sewn up, we need to test if the defibrillator is working safely and effectively.”
He explained that built into every defibrillator is a feature that detects when a patient’s heart is in trouble and will automatically deliver a lifesaving shock to restore the heart to its normal rhythm.
With his students, Fu experimented to see how difficult it would be to make a medical device, like a defibrillator, malfunction. His students were able to connect to the defibrillator’s network and then deliver malformed network packets. Two seconds later, the device’s Windows operating system failed and a power line anomaly was detected, which Fu said indicates that the medical device isn’t working properly.
“[Medical devices] just weren’t designed to withstand this kind of stuff,” Fu said. “You’re always going to have your devices fail if somebody wants to try to cause harm.”
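Fu did not publish the tooling his students used, but the attack he describes is a classic mutation-fuzzing technique: start from a valid protocol message and send deliberately corrupted variants until the device's software mishandles one. A minimal sketch of that idea, with a hypothetical packet template standing in for the real device protocol:

```python
import random

def mutate_packet(template: bytes, rng: random.Random) -> bytes:
    """Return a malformed variant of a valid packet template.

    Three simple mutation strategies: flip a random byte, truncate
    the packet, or append junk bytes past the expected length.
    """
    packet = bytearray(template)
    strategy = rng.choice(["flip", "truncate", "extend"])
    if strategy == "flip":
        i = rng.randrange(len(packet))
        packet[i] ^= rng.randrange(1, 256)  # XOR with nonzero value always changes the byte
    elif strategy == "truncate":
        packet = packet[: rng.randrange(len(packet))]  # cut short, possibly to zero bytes
    else:
        packet += bytes(rng.randrange(256) for _ in range(rng.randrange(1, 16)))
    return bytes(packet)

if __name__ == "__main__":
    rng = random.Random(0)
    template = b"\x01\x02HELLO\x00\x10"  # hypothetical valid protocol message
    for _ in range(5):
        print(mutate_packet(template, rng).hex())
```

A device designed with security in mind would reject or safely discard every one of these malformed inputs; the failure Fu describes suggests the software crashed instead of validating them.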
Where medical device security problems are born
The truth, Fu said, is that it’s difficult to verify that a software update is secure; his students were even able to create a very convincing fake software update.
“A lot of folks will deploy technology not realizing that even the technology itself has risk and it’s just because, again, all these security flaws were baked in from the beginning,” he said.
Fu added that not only is it difficult to verify that a software update is secure, but if a vendor does its due diligence in engineering a safe software update, it could take up to a month. “How is that going to work when you have WannaCry or a piece of ransomware come out?” he said.
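The fake-update problem Fu describes is usually countered by cryptographically verifying an update before installing it. Production devices would use asymmetric code signing with a vendor's private key; as a simplified stand-in, the sketch below uses an HMAC tag with a hypothetical shared key to show the verify-before-install pattern:

```python
import hashlib
import hmac

def verify_update(payload: bytes, signature: bytes, key: bytes) -> bool:
    """Check a firmware image against its authentication tag before installing.

    Uses a constant-time comparison so the check itself doesn't leak
    timing information about the expected tag.
    """
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

if __name__ == "__main__":
    key = b"vendor-signing-key"            # hypothetical key for illustration
    update = b"firmware v2.1 image bytes"  # hypothetical update payload
    good_tag = hmac.new(key, update, hashlib.sha256).digest()

    print(verify_update(update, good_tag, key))                 # True: untampered
    print(verify_update(update + b"tampered", good_tag, key))   # False: modified image
```

A device that skips this step, or whose vendor never built signing into the update channel from the start, has no way to tell a legitimate patch from the kind of convincing fake Fu's students produced.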
The FDA’s regulatory guidance
One common claim vendors make, Fu said, is that “FDA rules prevent software updates.” “I’ll just say that’s a bunch of bull-ony.”
Fu said that the FDA’s most recent 2014 guidance document is a big improvement over its 2009 version, which basically said, “everybody get along.”
In the FDA’s 2014 guidance document, the agency set out expectations for what needs to be built into medical devices from a security standpoint before that device hits the market, Fu said.
“It’s really, to me, a people problem,” Fu said. “The tech is the easy part and the hard part, mind you, is the culture.”