It's not unusual for platforms to allow targeted ads—companies want to reach interested consumers, and users may prefer ads that are relevant to them.

But a suit brought against Facebook Inc. by the U.S. Department of Housing and Urban Development on March 28 shows that platforms allowing targeted ads may face increased legal risk if in-house counsel aren't monitoring advertiser posts or implementing guidelines, lawyers said.

HUD alleged Facebook's advertising platform was “encouraging, enabling, and causing housing discrimination” by allowing posters to exclude protected categories from seeing housing ads. On March 19, Facebook chief operating officer Sheryl Sandberg said the platform would no longer allow housing, employment or credit ads to target by age, gender or ZIP code.

“There's no law out there that says you can't target your ads to a particular demographic for candy bars. But there are certain categories, such as housing, employment, where there are restrictions and those restrictions can be violated by the platforms that allow for these advertisements … if it appears that they're complicit,” said Nilesh Patel, an advertising and intellectual property attorney at Frost Brown Todd and former senior in-house counsel at Sprint Corp.

Keri Bruce, a partner in Reed Smith's entertainment and media industry group, said it sometimes makes sense to target ads to a specific gender or other category if a product or service is intended specifically for that group. She said in-house counsel should know what is being advertised on their platform and watch for housing, employment or credit-related targeted posts.

Aaron Wiener, an advertising attorney at Gordon Law Group in Chicago, said that if platforms allow advertisements in these categories, in-house counsel should ensure advertisers can't target or exclude protected groups, such as women and minorities.

“If you are advertising a certain category of advertisement then you perhaps reduce the type of targeting criteria you make available for those ads,” Bruce said.

She and Patel noted that platforms that don't limit targeting options may be relying on advertisers not to violate discrimination laws. They said platforms should implement and publicly post guidelines about what targeted advertising is allowed.

Even with these guidelines, Patel said, platforms that don't review targeted advertisements before they are posted risk anti-discrimination violations.

“You're trusting that the advertiser is being accurate and truthful in the categories that they're selecting,” Patel said. “In an online world, if you don't have review processes in place you can put those restrictions in, but who is ultimately minding the shop?”
