A couple of weeks ago we wrote about the Supreme Court of Canada’s decision to permit class action lawsuits and its impact on the state of privacy across the nation. The article discussed three ways to mitigate the risks arising from a lack of auditing of EMR access.
Last week, charges were laid against a former hospital staff member in Ontario, Canada. The ruling was issued under the Personal Health Information Protection Act (PHIPA) for snooping. An Ontario judge sentenced the former clerk to a $36,000 fine, two years’ probation, and 300 hours of community service for breach of trust.
The ruling sets a precedent: no PHIPA case had previously resulted in a conviction. Health Minister Eric Hoskins had promised sweeping changes to the act in September, including doubling the maximum fine for individuals from $50,000 to $100,000. The hospital where the breach took place remains the subject of a $412-million class action lawsuit on behalf of thousands of patients whose personal information was leaked.
My comment on today’s ruling was: “PHIPA and other Canadian privacy laws need enforcement.” The U.S. is often criticized for having weak sectoral privacy laws. However, as an FTC director put it, “Well-intended laws without enforcement aren’t helpful.” This landmark case should set the tone for a change in privacy law enforcement. Canada needs to put its money where its mouth is.
This particular breach was addressed by Order HO-13 from Ontario’s data protection authority, the Information and Privacy Commissioner (IPC). The order raises a question for hospital CIOs: how can we prevent this from happening, given that vendors in the EMR space haven’t caught up with the times on predictive auditing?
A solution for automated EMR auditing is sorely needed, because this can happen to any hospital using outdated technology. CIOs are concerned because these technical deficiencies in audit systems and processes are pervasive across Canada; in fact, similar breaches could be going undetected at other hospitals right now.
Existing industry solutions in this area are outdated: they rely heavily on triggers to detect unauthorized access.
A trigger might be: “check whether a user requests records of patients with the same last name as the user,” or “check whether a user requests more than X records per day.” Coming up with these rules is cumbersome, error-prone, and labor-intensive. Such solutions often generate false positives, which cause audit fatigue (when a system administrator receives so many false alarms that real ones get ignored).
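To make the trigger approach concrete, here is a minimal sketch of the two rules just described. The access-log record format, field names, and the `flag_accesses` helper are all hypothetical, invented for illustration; a real EMR audit system would work against its vendor’s log schema.

```python
from collections import Counter

def flag_accesses(log, max_per_day=25):
    """Apply two simple trigger rules to one day's EMR access log.

    Each record is a dict with 'user', 'user_last_name', and
    'patient_last_name' keys (a hypothetical schema). Returns the
    indices of flagged records, for manual review.
    """
    flagged = set()

    # Rule 1: user looked up a patient who shares the user's last name
    # (a crude proxy for snooping on one's own family).
    for i, rec in enumerate(log):
        if rec["user_last_name"].lower() == rec["patient_last_name"].lower():
            flagged.add(i)

    # Rule 2: user exceeded the daily record-count threshold.
    counts = Counter(rec["user"] for rec in log)
    for i, rec in enumerate(log):
        if counts[rec["user"]] > max_per_day:
            flagged.add(i)

    return sorted(flagged)
```

Note how blunt these rules are: Rule 1 flags every clinician who legitimately treats a relative of the same surname, which is exactly the kind of false positive that produces audit fatigue.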
Predictive audit analytics can automate the reporting and filtering capabilities that allow healthcare privacy officers to more quickly identify suspicious activities, such as snooping and identity theft, and distinguish them from legitimate record accesses. The result is a faster and easier auditing process.
Dr. Daniel Fabbri, a security researcher at Vanderbilt University and founder of Maize Analytics, has been developing security systems for several years to better protect sensitive data, including personal health information. He said:
Using automated analysis, a privacy office can audit 95% of the accesses automatically, leaving 5% or less left for manual review.
Dr. Daniel Fabbri — Maize Analytics
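The idea behind automating most of the audit can be sketched as follows: automatically clear accesses that have a plausible clinical explanation (for example, an appointment linking the user to the patient), and send only the unexplained remainder to the privacy office. This is an illustrative sketch only, with hypothetical record formats and the made-up `partition_accesses` helper; it is not Maize Analytics’ actual algorithm.

```python
def partition_accesses(accesses, appointments):
    """Split EMR accesses into 'explained' (auto-cleared) and
    'unexplained' (sent to manual review).

    An access is considered explained if the accessing user had an
    appointment with that patient. Both inputs are lists of dicts with
    'user' and 'patient' keys (a hypothetical schema).
    """
    # Build a lookup of (user, patient) pairs that have a clinical link.
    linked = {(a["user"], a["patient"]) for a in appointments}

    explained, unexplained = [], []
    for acc in accesses:
        if (acc["user"], acc["patient"]) in linked:
            explained.append(acc)
        else:
            unexplained.append(acc)
    return explained, unexplained
```

In a deployment along these lines, the fraction landing in `unexplained` is what the privacy office actually reviews by hand, which is how the manual workload can shrink to a small percentage of all accesses.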
Protecting health privacy requires active participation on both the human and the technical side: an active, well-informed staff combined with advanced technology to support privacy operations.
Contact and Credits
If you are interested in learning more, contact email@example.com.
Posted from Ki DESIGN Magazine
The publication (ISSN 2367-9980) is registered with Library and Archives Canada, 395 Wellington St, Ottawa, ON. All rights reserved. Copyright ©2014. Reproduction without permission is prohibited. Phone: +1 800.221.3286.