Striking Gold or Striking Out?
By Annie Macios
For The Record
Vol. 20 No. 21 P. 12
Tapping into prescription data to evaluate patient records helps insurers but raises privacy concerns.
The information highway just got bigger, but the newest route isn’t published on any map or atlas. Personal health information is the latest traveler on the digital superhighway. The cost to ride? Possibly your privacy.
With the recent focus on corporate data mining of personal health information for various uses, privacy concerns have come to the forefront with both technology vendors and privacy watchdog groups weighing in on the issue.
Deborah C. Peel, MD, founder of Patient Privacy Rights, says it’s a tremendous secret that the nation’s treasure trove of personal electronic health information is used by corporations without patient consent. Although it’s unclear how many companies take health data obtained for one purpose and use it for other purposes, many start out simply trying to automate business processes and then later realize how valuable the information is, so they sell it.
“It’s disturbing that a private for-profit corporation can make billions of dollars selling health information and not a dime goes to help sick people. This is the most sensitive data on earth and includes things like genetics, mental health information, and information on other sensitive illnesses. It should never be used in any way without consent unless otherwise required by law,” says Peel.
How Data Mining Works
Minnesota-based Ingenix offers Medpoint on a subscription basis to insurance companies seeking the most up-to-date pharmacy history information about a prospective client. The database is compiled from data electronically retrieved from multiple sources, which creates a profile of the patient’s prescription history and even includes information such as an individual’s possible diagnoses and predictive risk assessments.
In this usage, the technology comes into play when someone applies for insurance and authorizes the company to retrieve the appropriate information to help it make sound decisions regarding coverage. In the past, insurance companies relied heavily on medical information supplied by prospective subscribers, which hasn't always painted an accurate picture of their health status.
“With Medpoint, once they have the authorization, insurance companies are able to access up to five years of prescription history,” says Ingenix Senior Vice President John Stenson.
This information, known as a prescription profile, is used as part of the underwriting process to determine items such as rates or whether to offer someone insurance at all. “The underwriting process involves trying to assess risk. This sort of information has been used in the process for a long time and is always used to determine risk,” Stenson says.
By using Medpoint’s fast, secure electronic delivery system, insurance companies can act on the information contained in the prescription profile immediately, which reduces the time, costs, and risk associated with underwriting new policies. In short, according to the company, Medpoint helps consumers get coverage at a fair price and in a timelier manner than in the past.
Stenson says privacy issues associated with this sort of information sharing are important. Medpoint complies with HIPAA standards because insurers can only access information if they have authorization from the individual being searched, and the authorization must meet the criteria set forth for maintaining privacy.
“The prescription profile comes in a standardized format. When you log into the database, you must have an authorization for an individual or small group. The search is specific, and the information provided is standardized and for a specific purpose,” says Stenson.
To use Medpoint, a requester first logs into the system with the proper authorization to access an individual’s or group’s medical profile, then enters the specific identifiers on the request screen. Within minutes, Medpoint returns the results in a secure electronic health document.
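The lookup flow described here—authenticate, supply an individual’s identifiers, receive a prescription profile—can be sketched in general terms. The class and field names below are illustrative assumptions for the sake of the example, not Ingenix’s actual system or API:

```python
from dataclasses import dataclass


@dataclass
class Authorization:
    """Hypothetical stand-in for the applicant's signed consent."""
    applicant_id: str
    purpose: str  # e.g., "underwriting"


@dataclass
class PrescriptionProfile:
    applicant_id: str
    fills: list  # (drug, fill date) pairs, covering up to five years


class PharmacyHistoryService:
    """Illustrative sketch of a Medpoint-style lookup service."""

    def __init__(self, records):
        self._records = records  # applicant_id -> list of fills

    def lookup(self, auth, applicant_id):
        # The search is specific: the authorization must name the same
        # individual being searched, mirroring the access rule Stenson
        # describes, and the profile is tied to a stated purpose.
        if auth.applicant_id != applicant_id:
            raise PermissionError("authorization does not cover this individual")
        if auth.purpose != "underwriting":
            raise PermissionError("profile limited to the authorized purpose")
        return PrescriptionProfile(applicant_id, self._records.get(applicant_id, []))
```

A request with a mismatched authorization is refused outright rather than returning an empty profile, which is one way the "search is specific" constraint could be enforced.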
Medpoint’s design doesn’t permit inappropriate use of the technology, according to Stenson. For example, he notes that it’s not designed to allow an individual to go in and search the system for the 10,000 healthiest individuals and then market directly to them. “Someone is actually applying for coverage, and it can only be used for individual and small group coverage. It is not to research data,” Stenson says.
As in the past, checking medical information is inherent in underwriting applications.
“Seldom or never is it the only part of the underwriting process. Once you have all of the information to do underwriting, the process unfolds with steps such as the acceptance of the rate, enrollment, eligibility, and proceeding to being set up in billing. Medpoint is just part of the whole process—from underwriting to administering the program. It’s one tool in conjunction with other underwriting tools,” says Stenson.
When looking at additional applications for this technology, Stenson points to the value of using it in a hospital or for populating health records. “Hospitals might want to use Medpoint for specific needs in the future. With the proper authorization, hospitals could retrieve prescription information in areas of the hospital such as the [emergency department], where it’s imperative to have accurate information to treat someone quickly. It’s important in the care decisions,” he says.
While many industries use data servers for information retrieval, few are providing it for this specific purpose. Proponents of the practice say the bottom line is that technology such as Medpoint accelerates the underwriting process and, when used in combination with other tools and data retrieval techniques, helps consumers get coverage more quickly and affordably.
However, the technology may come at a cost, according to privacy experts. Deven McGraw, director of the Health Privacy Project in Washington, D.C., and Peel agree that data mining can erode public trust and undermine the value of health IT in improving healthcare delivery. They find the authorizations obtained for use by insurance companies problematic on several levels.
“There is a lot of debate in Washington saying people must be given the right to consent, but that puts all the burden on the individual,” says McGraw. “Just the fact that the insurance companies had to get authorization shows you the limits of consent protecting privacy, and patients can’t say no because they need insurance. Ultimately, we need more clear rules on the use of this data.”
Peel says these authorizations are actually illegal because a person can’t meaningfully give consent to disclose information that doesn’t yet exist; advance blanket consent is not contemporaneous. “The consents they get are, frankly, illegal. Let’s face it, with advance consent, nobody knows what’s going to be done with their information. Just ask them. That’s why [Patient Privacy Rights] is fighting for obtaining electronic consent at the time someone wants to use specific health information that you already know about, such as seeing a lab test result,” she says.
Gaps in the System
The fact remains, however, that Ingenix, like the many other companies providing this sort of information, hasn’t violated any laws. “The interesting part of the equation is that these databases occur in the first place. There are gaps in HIPAA that leave people vulnerable, and that can be fixed in a policy context,” says McGraw.
With data mining so prevalent, the question is how this sensitive information leaks through the healthcare system. Initially, the information is used properly for purposes such as payment, which is administered through a pharmacy benefit manager (PBM). In turn, because PBMs are not under HIPAA’s jurisdiction, they can sell the information to companies such as Ingenix to populate the databases.
“When you go to a pharmacy, use of personal information to fill your prescription and seek payment from your health plan are fine, but it’s the secondary uses where data continues to flow for other purposes that creates the problem,” says McGraw. “One way to fix the problem is to put constraints on how insurance companies can use this information, but it’s not so easy. On one hand, the companies providing the information can say, ‘We’re allowed to do this, and there is nothing illegal about it, even though it might make us look bad.’ But on the other hand, people making profit on the exchange of this sort of information is not good.”
A major change to the HIPAA privacy rule in 2002 eliminated an individual’s right to consent, giving covered entities the power to decide when to use and disclose health information for treatment, payment, and healthcare operations.
“What is different is the inclusion of use for healthcare operations,” says Peel. “On the face of it, it sounds innocuous, but it is not a small number of ‘trusted’ entities that are using the information. At last count, there are more than 4 million corporations, government agencies, hospitals, self-insured employers, doctors, and others who have access to personal health information without giving you notice or the opportunity to object.”
The worst part, according to Peel, is that these entities are not required to keep an audit trail, so there is no way to know how far the data travels and to what extent. Plus, she says, there is no breach reporting requirement.
What kind of personal health information may be misused? Peel says everything from lab tests and genetic records to medical and claims records is subject to data mining because businesses that originally store or transfer the information later find they can make money from selling it. For example, information is sold to medical device manufacturers, marketers, drug companies, and even to employers who want it when making decisions regarding hiring and promotion.
She also points out that data mining creates the perfect scenario for identity theft because health records include all the information necessary (Social Security number, birth date, address, etc.) to open a bank account.
Sleeping Watchdog
While all 50 states have strong laws requiring consent for the sharing of information, Peel says HIPAA has enabled them all to be ignored by the billion-dollar data-mining industry.
McGraw points out that “downstream” uses and users are not covered and are out of reach of federal governance. Because millions of dollars are at stake, she says putting a policy in place to govern these uses is not an easy task. “We need the information to be liquid for legitimate purposes like claims payments or to provide information for the best quality of care, but how do you create a system that works and protects privacy?” she questions. “It’s enormously challenging work because we are at a moment of great opportunity.”
“There is no monitoring of the vast data-mining industry whatsoever. There is no way to monitor the theft and misuse of data because there is no audit trail of every disclosure,” says Peel.
Putting health records in an electronic format makes accessing information easier and faster, and there are many obvious benefits to doing so. Technology also has far greater capacity to restrict how that information is used, but those capabilities must actually be employed.
“We have an opportunity to examine the rules we have on the books and to go in and make changes to protect privacy and realize the benefits of EMRs [electronic medical records]. We won’t have rapid adoption of EMRs if people are worried that they can’t trust that their personal information will be confidential and private,” says McGraw. The system is not transparent enough, she adds, and the unfortunate by-product is that patients, once they learn how their data can be used, are afraid to get prescriptions filled because they don’t want their information in the system.
To illustrate this, Peel cites a Health and Human Services finding showing that 600,000 patients did not seek early treatment for cancer because they were afraid they would have to choose between their jobs and treatment. “Building a system without privacy has the unintended consequences of death, suffering, higher costs, and bad or absent data. Having health data opens up the possibility of doing new kinds of research that could never be done before, but we won’t have the data if millions won’t even walk through the door,” says Peel.
To create a more transparent and privacy-protective system, Peel believes there must be audit trails and a safe database—think Fort Knox—where people can collect and control their own information without it being secretly data mined. She also believes everyone should have an independent consent management tool to set preferences for sharing their health records. With one place to set all consents—granting specific permissions for data use based on personal preferences—people could instantly change who can see and use their records, block out data, and keep complete audit trails of every disclosure.
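Peel’s proposal—per-purpose consents that can change instantly, plus a complete audit trail of every disclosure—amounts to a consent registry. The sketch below is a hypothetical illustration of that idea only; the names and structure are this example’s assumptions, not any existing product:

```python
from datetime import datetime, timezone


class ConsentRegistry:
    """Hypothetical sketch of an independent consent-management tool:
    patients set per-purpose sharing preferences, and every disclosure
    attempt (allowed or denied) is logged to an audit trail."""

    def __init__(self):
        self._consents = {}    # (patient_id, purpose) -> bool
        self.audit_trail = []  # every request is recorded, never just the grants

    def set_consent(self, patient_id, purpose, allowed):
        # Patients can grant or revoke sharing for a purpose at any time,
        # taking effect on the very next request.
        self._consents[(patient_id, purpose)] = allowed

    def request_disclosure(self, patient_id, purpose, requester):
        # Default-deny: a purpose with no recorded consent is refused.
        allowed = self._consents.get((patient_id, purpose), False)
        self.audit_trail.append({
            "when": datetime.now(timezone.utc).isoformat(),
            "patient": patient_id,
            "purpose": purpose,
            "requester": requester,
            "allowed": allowed,
        })
        return allowed
```

The default-deny lookup and the unconditional audit append are the two properties Peel argues the current system lacks: nothing flows without an affirmative choice, and every access attempt leaves a record.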
“It’s critically important that we get absolute privacy. The great news about health IT is that we can completely change the way we do consents. Smart technology means we no longer have to have one-size-fits-all blanket, coerced consents,” says Peel. “Technology can, for the first time, mean that we can have far better, cheaper, interactive, and clearer consents than were ever possible with pen and paper.”
— Annie Macios is a freelance writer based in Doylestown, Pa.