The Department of Housing and Urban Development alerted Twitter Inc. and Alphabet Inc.’s Google last year that it was scrutinizing their practices for possible housing discrimination, a sign that more technology companies could be ensnared in a government probe of their lucrative demographic ad targeting tools, according to three people with direct knowledge of the agency’s actions.
HUD charged Facebook Inc. with housing discrimination Thursday, alleging that the social networking giant’s targeted advertising platform violates the Fair Housing Act by “encouraging, enabling, and causing” unlawful discrimination through restrictions on who can view housing ads.
“They want to make sure that other companies aren’t getting away with something that one company is investigated for,” said a person with direct knowledge of HUD’s outreach to other tech companies, who was not authorized to discuss the communications.
One of the people with direct knowledge of the agency’s actions said the reviews are ongoing. Investigations usually take months if not longer. The Facebook probe began in late 2016.
A Twitter spokesman said company policies prohibit targeted advertising when it comes to racial or ethnic origin, religion, negative financial condition and commission of a crime. He declined to comment further on HUD’s interest in the company.
Google did not immediately respond to requests for comment.
Thursday’s charges caught Facebook off guard, coming a week after the social media giant agreed in a sweeping settlement with civil rights groups to overhaul its microtargeting ad system for job, housing and loan advertisements after discrimination complaints.
“We’re surprised by HUD’s decision, as we’ve been working with them to address their concerns and have taken significant steps to prevent ads discrimination,” Facebook spokesman Joe Osborne said.
He said a breakdown occurred when the government asked for total and unfettered access to the company’s user base, a request Facebook denied because it would have set a dangerous precedent.
“While we were eager to find a solution, HUD insisted on access to sensitive information — like user data — without adequate safeguards,” Osborne said. “We’re disappointed by today’s developments, but we’ll continue working with civil rights experts on these issues.”
HUD Secretary Ben Carson accused Facebook last August of enabling housing discrimination by allowing advertisers to exclude people based on race, gender, ZIP Code or religion. The move followed a nearly two-year preliminary investigation initiated during the Obama administration.
“Facebook is discriminating against people based upon who they are and where they live,” Carson said in a statement Thursday. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face.”
HUD officials said the agency seeks to “address unresolved fair housing issues regarding Facebook’s advertising practices and to obtain appropriate relief for the harm Facebook caused and continues to cause.”
If a U.S. administrative law judge finds after a hearing that discrimination has occurred, the judge may award damages for harm caused by the discrimination, impose fines, or order injunctive relief, according to HUD officials. If the matter is decided in federal court, the judge may also award punitive damages.
The housing agency claims that Facebook mines users’ extensive personal data and uses characteristics protected by law — race, color, national origin, religion, familial status, sex and disability — to determine who can view housing ads, regardless of whether an advertiser wants to reach a broad or narrow audience.
HUD alleges that Facebook’s targeted advertising platform enabled advertisers to exclude people classified as parents, those who were not born in the United States, non-Christians, or people interested in accessibility, Hispanic culture, or a wide variety of other interests that closely align with groups protected under the Fair Housing Act.
HUD also accuses Facebook of enabling advertisers to exclude people based on where they live by drawing a red line around their neighborhoods on a map, evoking the memory of decades-old practices in which minority neighborhoods were marked “hazardous” in red ink on maps drawn by the federal Home Owners’ Loan Corp.
According to the government’s charges, Facebook combines data it collects about user attributes and behavior with information it obtains about user behavior on other websites and in the non-digital world. The agency alleges that Facebook then uses machine learning and other prediction techniques to classify people to project their likely response to an ad.
The practice may re-create groupings that are protected under the law, HUD says, and therefore have the same effects as the intentional discrimination of decades past.
“By grouping users who ‘like’ similar pages (unrelated to housing) and presuming a shared interest or disinterest in housing-related advertisements, [Facebook’s] mechanisms function just like an advertiser who intentionally targets or excludes users based on their protected class,” the complaint said.
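The proxy effect the complaint describes can be illustrated with a toy sketch. Everything below is fabricated for illustration — the users, interests, and demographic labels are hypothetical, and this is not Facebook’s actual system — but it shows how excluding an audience by a seemingly neutral interest can exclude a protected group the advertiser never named:

```python
# Toy illustration of proxy discrimination via interest-based ad targeting.
# All users, "likes", and demographic labels are fabricated for the example.

# Each hypothetical user: interests they "liked" and a protected attribute
# (here, national origin) that the advertiser never sees directly.
users = [
    {"id": 1, "likes": {"telenovelas", "soccer"},  "origin": "A"},
    {"id": 2, "likes": {"telenovelas", "cooking"}, "origin": "A"},
    {"id": 3, "likes": {"golf", "sailing"},        "origin": "B"},
    {"id": 4, "likes": {"golf", "cooking"},        "origin": "B"},
]

def audience_excluding(users, excluded_interest):
    """Build an ad audience by dropping anyone who liked the given
    interest -- the kind of exclusion the complaint describes."""
    return [u for u in users if excluded_interest not in u["likes"]]

# Excluding an interest that happens to correlate with origin "A"
# removes every origin-"A" user, even though "origin" was never targeted.
audience = audience_excluding(users, "telenovelas")
print(sorted(u["id"] for u in audience))   # [3, 4]
print({u["origin"] for u in audience})     # {'B'}
```

The point of the sketch is that no protected attribute appears in the targeting rule; the correlation between the interest and the group does the discriminatory work, which is what HUD argues makes the mechanism “function just like” intentional exclusion.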
The HUD case probably won’t result in significant financial harm for Facebook, but it illustrates how authorities have stepped up their oversight of how the tech giants operate, said Jeff Chester, executive director of the Center for Digital Democracy, a nonprofit consumer privacy advocacy group in Washington.
The Cambridge Analytica scandal — in which the political consultancy was able to improperly access data on 87 million Facebook users without those users’ permission — “was a huge wake-up call” for regulators, he said. “Cambridge Analytica wasn’t a freak accident. It was emblematic of how the industry operates all over the world every day. These companies have had unprecedented access to our lives and until recently no one was doing anything about” privacy violations and discrimination.
Chester also said Facebook would keep working to address HUD’s complaints because the Federal Trade Commission reportedly is weighing fines against Facebook for allegedly violating a 2011 consent decree aimed at protecting consumers’ privacy, and “Facebook is maneuvering to get the best possible deal with the FTC.”
Facebook, in its settlement with fair housing and other civil rights groups last week, said it would withhold a wide array of detailed demographic information — including gender, age and ZIP Codes, which are often used as indicators of race — from advertisers when they market housing, credit and job opportunities. The company plans to create a separate ad portal by the end of the year to limit how much these advertisers can microtarget their audience.
Facebook is also building a tool for users to search and view all housing ads across the country, regardless of whether they received the ads in their individual news feeds.
But HUD officials say the settlement does not go far enough in remedying housing discrimination.
The private settlement resolving legal challenges by the National Fair Housing Alliance, the American Civil Liberties Union, the Communications Workers of America and others focused on making it harder for advertisers to discriminate. HUD officials say much more happens inside Facebook’s machine learning algorithm, influencing both what advertisers learn about people and what people see.
“Unresolved fair housing issues remain with Facebook’s advertising platform,” said HUD spokesman Raffi Williams. “Until HUD can verify that Facebook’s practices are in full compliance with the law, we will continue to use all resources at our disposal to protect Americans from the harmful effects of discrimination.”
Carson, in a departure from his immediate predecessors, has only once used his authority as HUD secretary to scrutinize widespread housing discrimination. After initially suspending a preliminary investigation into Facebook that began under the Obama administration in late 2016, Carson moved ahead under public pressure and filed his sole secretary-initiated complaint against the platform last year.
The lawsuits over Facebook’s ad practices followed a 2016 ProPublica investigation that found that the company enabled advertisers to exclude African Americans, Latinos and Asian Americans.
Jan writes for the Washington Post. Times staff writer James F. Peltz contributed to this report.