A proposed rule change by the U.S. Department of Housing and Urban Development might make algorithmic decision-making more prominent, despite bias concerns.

Earlier this month, HUD closed the comment period on its proposal to amend the agency's interpretation of the Fair Housing Act's disparate impact standard. The agency wrote that the changes were intended to better reflect the U.S. Supreme Court's 2015 ruling in Texas Department of Housing and Community Affairs v. Inclusive Communities Project.

"[This] would really make it much more difficult to bring a housing discrimination case," said Linda Morris, an attorney with the ACLU Women's Rights Project. (The American Civil Liberties Union also submitted comments opposing the proposed rule change.)

Among other changes, the housing agency added three defenses to shield an algorithm used for a practice or policy from claims of discrimination. Such algorithms can be used to determine credit scores, for example, or to create targeted housing advertisements.

Under the proposal, defendants could show they relied on a "statistically sound" algorithm, that a third party created or maintains the algorithm, or that the algorithm's inputs are not substitutes for a protected characteristic. Such defenses "provide[] a huge loophole" to defendants, Morris said. She noted that credit agencies, insurers, housing companies, advertisers and other institutions use algorithms that can significantly affect someone's ability to buy or rent a home.

She added that while debate continues over the objectivity of algorithm-driven decisions, sometimes "there is no discrimination intended but oftentimes the inputs in the algorithm are biased."

Some think the rule change could encourage more entities to use algorithms, a practice HUD itself described as "complicated, yet increasingly" common.

Chris Kieser, an attorney for Pacific Legal Foundation, said, "I think this will allow housing authorities to use more algorithms without having to scrutinize the data for racial impact, which is the overall goal of the regulation." 

Kieser noted that one of the Trump administration's priorities is to limit "race-based decision-making. Broad disparate impact liability encourages race-based decisions and potentially quotas." Kieser added that PLF views racial balancing as forbidden by the Constitution.

Other groups, however, expressed deep concern over the rule change. "I would say this highly problematic reading of how algorithms are used or can be used will travel across housing into employment as well," said Bertram Lee, policy counsel at Public Knowledge, which submitted a comment against the proposed changes.

While Kieser said he hopes other agencies are looking at HUD's proposed changes, opponents see a concerted effort to dismantle disparate impact and civil rights across federal agencies.

"Our concern certainly is this is part of a broader pushback of civil rights," said Bruce Mirken, media relations director of The Greenlining Institute, which also submitted a comment against HUD's proposed changes.

Mirken cited a January article in The Washington Post reporting on an internal U.S. Department of Justice memo that directed senior civil rights officials to review how disparate impact regulations could be changed or removed, and what the effects would be.

"This may be the first concrete manifestation of that and a sign of what may come," Mirken said.