The leading social network said housing, employment or credit ads will no longer be allowed to target by age, gender or zip code — a practice critics argued had led to discrimination.
The changes came as part of a settlement with the American Civil Liberties Union, the National Fair Housing Alliance, the Communication Workers of America and others.
“Today’s changes mark an important step in our broader effort to prevent discrimination and promote fairness and inclusion on Facebook,” chief operating officer Sheryl Sandberg said in a statement announcing the changes.
“But our work is far from over. We’re committed to doing more, and we look forward to engaging in serious consultation and work with key civil rights groups, experts and policymakers to help us find the right path forward.”
The ACLU called the agreement a “historic settlement” that will result in major changes to Facebook’s advertising platform.
Under the settlement, Facebook will take proactive steps to prevent advertisers from discriminating when sending job, housing or credit ads to users of Facebook, Instagram and Messenger.
“Advertisers will no longer be able to exclude users from learning about opportunities for housing, employment or credit based on gender, age or other protected characteristics,” ACLU attorneys Galen Sherwin and Esha Bhandari said in a blog post.
“Ad-targeting platforms can be used to exclude users on the basis of race, gender or age, as well as interests or groups that can serve as proxies for those categories (think ‘soccer moms’ or ‘Kwanzaa celebrators’).”
The ACLU said it began exerting pressure on Facebook several years ago to stop its use of an “ethnic affinity” category, which labeled users as Asian American, Hispanic or African American based on what they liked on Facebook.
The organization said Facebook took some steps to eliminate discriminatory targeting but did not always follow through.
For housing, employment and credit ads, Facebook will create a separate portal with a much more limited set of targeting options, excluding users’ age, gender, race and other protected characteristics.
Facebook will also implement a system of automated and human review to ensure compliance and to study the potential for unintended biases in algorithmic modeling.