Given recent headlines, it would be easy to believe that for a "data breach" to occur a hacker must be involved. While this may be the most commonly reported type of data breach, employers are exposed to the risk of many other forms of data breaches each day.
Take, for example, the employee who leaves on an empty train seat an unencrypted company laptop containing the confidential medical records of 100 patients of the physician's office where the employee works. Or the employee who accidentally e-mails an unencrypted spreadsheet listing the names and medical conditions of 75 employees enrolled in the company's wellness program to the wrong e-mail address—not his supervisor john.smith@acme.com but an unknown john.smith@aol.com.
These are also data breaches. What's more, because both involve entities subject to the Health Insurance Portability and Accountability Act (HIPAA), employers must comply with HIPAA's breach notification rule, which, among other things, requires that each employee affected by the breach be notified within a specified time frame.
Until August of this year, the incidents described above may not have led to an investigation by the U.S. Department of Health & Human Services (HHS) Office for Civil Rights (OCR), the federal agency that enforces HIPAA, because each breach involved fewer than 500 people. That changed on Aug. 18, 2016, when OCR announced that it will now investigate data breaches of all sizes.
If the threat of an investigation into a company's data security practices is not enough to cause concern, consider this: HIPAA permits a state's attorney general to investigate and bring civil actions against entities that violate HIPAA's breach notification rule. Therefore, an employer that suffers a data breach could now be investigated by either a state attorney general's office or OCR.
So, is your company required to comply with HIPAA's breach notification rule? And, if so, what do you need to do if you suffer a security incident to comply with the rule? And what if you are not covered by HIPAA but still retain medical information? Read on.
Are You Required to Comply with HIPAA?
If an employer sponsors an employee group health plan, the plan is required to comply with HIPAA.
One of the biggest misconceptions about HIPAA is that it applies only to health care providers, such as doctors or hospitals. This is not true. In addition to health care providers and health care clearinghouses, HIPAA applies to health plans, such as the group health plans offered by many employers and, if certain criteria are met, even the flexible spending accounts that many employers provide. As a result, many employers are under the false impression that they do not have to comply with HIPAA or its breach notification rule.
With that said, an important distinction must be made: if an employer does have a HIPAA-qualifying health plan, only the plan is subject to HIPAA, not the employer's entire business. So, for example, if a retail company sponsors a group health plan that pays the cost of its employees' medical care, the company itself—or "plan sponsor"—is not a HIPAA-covered entity, but the group health plan is.
Cloud Storage Service
If an employer stores health information using a cloud storage service, the cloud storage provider must comply with HIPAA and must enter into a business associate contract with the employer that governs how employees' health information will be used and safeguarded.
On Oct. 6, 2016, HHS released guidance for cloud service providers (CSPs) that store electronic health information for HIPAA-covered entities—for example, a group health plan that electronically stores employees' health information using Google cloud storage.
According to HHS's guidance, even if a CSP is unable to open or access health information it is storing because the health information is encrypted and the CSP does not have the decryption key (or password), the CSP is a "business associate" and therefore must comply with HIPAA.
A business associate is an entity that creates, receives, maintains or transmits health information on behalf of a covered entity for the purpose of claims processing or administration, data analysis, benefit management or billing. An entity that receives health information from a HIPAA-covered entity, and provides legal, actuarial, accounting, consulting, data aggregation, management, administrative, accreditation or financial services is also a business associate.
HIPAA requires a business associate that suffers a breach to notify the covered entity and to provide the name of each individual whose health information has been compromised as a result of the breach.
However, because HIPAA also requires HIPAA-covered entities and business associates to enter into business associate contracts, a HIPAA-covered entity can contractually expand the obligations a business associate must meet in the event of a breach. So, for example, a group health plan can require that if the claims processor it works with suffers a breach, the processor must not only notify the group health plan of the breach, but also must notify each employee affected by the breach and pay the cost of credit-monitoring services for each employee.
Have You Actually Suffered a Breach?
To trigger HIPAA's breach notification rule, an entity must suffer a breach of "unsecured" health information.
HIPAA does not limit the definition of a breach to security incidents involving electronically stored records; a security incident involving paper records can also constitute a breach. However, in order for HIPAA's breach notification rule to be triggered, an entity must suffer an unauthorized disclosure of health information "that is not rendered unusable, unreadable or indecipherable to unauthorized persons," referred to as "unsecured" health information.
Electronic health information that is encrypted, and health information in paper form that has been shredded, are two examples of "secured" health information.
There are three situations that HIPAA specifically excludes from the definition of a breach:
- An employee who works for a HIPAA-covered entity or business associate unintentionally obtains health information in good faith, and does not use or disclose the health information. For example, a doctor mistakenly receives a FedEx package containing medical information about an individual who is not a patient and immediately shreds the documents.
- An employee who works for a HIPAA-covered entity or business associate and has access to health information inadvertently discloses health information to another employee and the receiving employee does not share the information with anyone. For example, an employee of Acme, a company's third-party benefits administrator, sends an e-mail containing medical information about a health plan participant to a colleague who does not do any work with that particular health plan. The Acme employee who inadvertently receives the information deletes the e-mail.
- A HIPAA-covered entity or business associate discloses health information to a person who is not authorized to receive it but has a good-faith belief that the receiving party will not be able to retain the information. For example, a doctor sends an e-mail containing health information to the wrong e-mail address, but the e-mail bounces back because the address is invalid.
Even if an incident does not fall within these exceptions, an entity may still be exempt from complying with HIPAA's breach notification rule following a breach of unsecured health information if it can demonstrate a low probability that the health information has been compromised.
What Should You Do to Comply?
If you are a HIPAA-covered entity, you have suffered a breach and the breach involves unsecured health information, you must comply with HIPAA's breach notification rule.
In the event of a breach, the rule requires a HIPAA-covered entity to:
- Send a notice to each individual whose unsecured health information has been, or is reasonably believed to have been, disclosed as a result of the breach "without unreasonable delay" and in no case later than 60 days after the breach is discovered. (Substitute notice is permitted under certain circumstances.)
- Send the notice by first-class mail, unless the individual to be notified has agreed to e-mail notification.
- Include within the notice: a brief description of the breach; a description of the types of information involved in the breach; the steps individuals should take to protect themselves from harm; a brief description of the steps the entity is taking to investigate the breach, mitigate the harm and prevent further breaches; and a toll-free telephone number, e-mail address, website or postal address for individuals to use to contact your company to ask questions.
- No later than 60 days after the end of the calendar year in which the breach was discovered, notify HHS by submitting a breach report on its website.
If the breach involves more than 500 residents of a state or jurisdiction, the entity must:
- Without unreasonable delay, and in no case later than 60 days after the breach is discovered, notify "prominent media outlets."
- No later than 60 calendar days from the discovery of the breach, notify HHS by submitting a breach report on its website.
What About Entities that Are Not Covered by HIPAA?
Entities not covered by HIPAA that suffer a breach must ensure they comply with state breach notification laws and the Federal Trade Commission's (FTC's) health breach notification rule.
Employers that are not covered by HIPAA are not immune from reporting obligations in the event of a security incident involving health information. Numerous states require notification within a specified time frame if residents' medical or health information is compromised as a result of a security incident. These states include Arkansas, California, Florida, Illinois, Missouri, Montana, North Dakota, Oregon, Rhode Island, Texas and Virginia.
Entities that maintain "personal health records" are required to comply with the FTC's health breach notification rule in the event of a breach. The prevalence of wearable technology has led more entities to retain "personal health records," defined as electronic records of health information that reasonably identify an individual, can be drawn from multiple sources, and are managed, shared and controlled by the individual or primarily for the individual.
For example, a website that enables users to input information about their weight, blood pressure and other general health information might be considered a personal health record. The FTC's rule closely tracks the requirements of HIPAA's data breach notification rule. HIPAA-covered entities and business associates are exempt from compliance with the FTC's rule.
Takeaways for Employers
Employers might want to consider taking the following steps:
- Establish a security incident response team that is trained on how to comply with HIPAA's breach notification rule and develop an incident response plan.
- Review—and if necessary, enhance—administrative, physical and technical safeguards for health information to both reduce the risk of a security breach and ensure compliance with HIPAA.
- Develop templates for notice letters.
- Conduct simulations to test the effectiveness of the incident response plan.
Kwabena A. Appenteng is an attorney with Littler in Chicago.