It may have just become harder for organizations that fall under the umbrella of the Health Insurance Portability and Accountability Act (HIPAA) to de-identify their MRI data.

A report from the Mayo Clinic published earlier this month in the New England Journal of Medicine found that commercially available facial recognition technology could be used to successfully match a patient's de-identified MRI scans to their photo.

While HIPAA laws were constructed to adapt to the kinds of shifting circumstances typified by technology, healthcare organizations could find themselves struggling to provide a level of anonymity that is becoming more and more elusive.

MRI images, for example, posed a challenge even before the introduction of facial recognition tech.

"Here's the thing: I don't know that you can de-identify an MRI photo," said Rolf Lowe, an attorney at Wachler & Associates.

According to the Mayo Clinic report, metadata such as a subject's name, identification number and the date of the scan itself are typically scrubbed from an MRI image before sharing to ensure compliance with HIPAA.
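The kind of scrubbing the report describes can be sketched in a few lines of Python. This is a hypothetical illustration only; the tag names and the `scrub_metadata` helper are assumptions for the sake of the example, not a real DICOM library's API.

```python
# Hypothetical illustration of the metadata scrubbing described above.
# Tag names are assumptions, not a real medical-imaging API.

# Tags that directly identify the subject and are typically removed.
IDENTIFYING_TAGS = {"PatientName", "PatientID", "StudyDate"}

def scrub_metadata(metadata: dict) -> dict:
    """Return a copy of a scan's metadata with identifying tags removed."""
    return {tag: value for tag, value in metadata.items()
            if tag not in IDENTIFYING_TAGS}

scan_metadata = {
    "PatientName": "DOE^JANE",
    "PatientID": "12345",
    "StudyDate": "20191001",
    "Modality": "MR",          # kept: medically relevant, not identifying
    "SliceThickness": "1.0",   # kept
}

cleaned = scrub_metadata(scan_metadata)
```

Note that this touches only the header: as the Mayo Clinic study demonstrates, the image data itself, including a reconstructable facial surface, is left untouched by this kind of scrubbing.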

However, Lowe indicated that HIPAA would also dictate the removal of photos and comparable images, which could include those generated by an MRI. There is software on the market that can remove or blur facial features in medical images, but the Mayo Clinic report alleges it's not in wide use, since doing so may obscure medically relevant details while still ultimately failing to prevent re-identification.

This makes de-identification efforts something of a challenge, and HIPAA doesn't afford a whole lot of wiggle room.

"The only way you could really get around it is if you kind of got consent from the individual," Lowe said.

Introducing facial recognition into the equation adds to those difficulties and to an already laborious de-identification process. HIPAA identifies two approved methods for de-identifying data. The first involves engaging a statistical expert to ensure that the information targeted for release couldn't be used in conjunction with a publicly available database, or otherwise, to identify someone.

Meanwhile, the second, the "safe harbor" method, requires that identifiers like names, email addresses, phone numbers and birthdates be removed, with one important catch.

Iliana Peters, a shareholder with Polsinelli, indicated that the entity performing the de-identification also has to ensure it has no foundation to believe the information is otherwise re-identifiable.
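A minimal sketch of what "safe harbor"-style redaction over free text might look like is below. It is an assumption-laden illustration: the patterns cover only a few of the identifiers HIPAA lists, and, per the catch Peters describes, a real de-identifier would also need no actual knowledge that the remaining data could re-identify someone.

```python
import re

# Hypothetical sketch of "safe harbor"-style text redaction.
# Covers only a handful of HIPAA's listed identifiers.
PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),      # email addresses
    re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),  # phone numbers
    re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),    # dates such as birthdates
]

def redact(text: str) -> str:
    """Replace each matched identifier with a placeholder."""
    for pattern in PATTERNS:
        text = pattern.sub("[REDACTED]", text)
    return text

note = "Contact jane.doe@example.com or 555-123-4567; DOB 01/02/1980."
cleaned_note = redact(note)
```

Even a complete version of such a filter would only satisfy the list-of-identifiers half of the safe harbor test; the study's finding is precisely that data can remain re-identifiable after the listed identifiers are gone.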

It's a broad framework, one that facial recognition technology already seems to fit neatly inside. And that may be the point.

"I think it's important to understand that the [HIPAA] rules themselves are written in a way that's meant to adapt," Peters said.

She pointed out how difficult it would be to adapt or change laws every time a new piece of technology arrived on the scene. Whether HIPAA or other privacy laws will eventually change to accommodate the challenges that technology poses to those under their dominion remains to be seen.

Lowe floated the possibility that the U.S. Department of Health and Human Services could eventually weigh in with an opinion.

"I would [expect to] see some guidance coming through," Lowe said.