We live in an increasingly divisive world. You can’t turn on the television, open a newspaper or listen to the radio without hearing one group of people yelling at or complaining about another.
The Internet, particularly most social media channels, can be a combination of all of the above, bringing news, entertainment, culture and politics into a mixture that can be as vitriolic as it is exclusionary. Facebook has taken its share of criticism for it, and developers there are working hard to repair their reputation and solve the problem.
This month, they announced additional steps to ensure that Facebook advertising is safe and civil. Most recently, they put the target on targeting itself—by limiting the types of potential customers businesses can exclude from boosted posts and other paid advertisements.
“A business’s success depends on finding the right customers. Targeted advertising helps millions of businesses grow each year by connecting them with customers,” the social media giant said in a statement. “It also helps people see more relevant ads and connect with businesses in more meaningful ways.
“Even though targeting is an important tool to reach people, we have heard concerns about potential abuse, particularly about the feature that lets advertisers exclude people from their ads,” it continued.
The concern, according to Facebook, is that businesses could use the exclusion feature to limit ad views to people based on their color, race, cultural background, sexual orientation and more.
Facebook consulted with privacy, data ethics and civil rights experts, as well as charitable and advocacy organizations, to determine how to tackle the potential problem. It has hired additional people to review ads and is developing machine learning systems to filter them as well. It is also providing additional education to advertisers, so they understand their obligations under the law.
“Our goal is to catch ads that violate our policies before they run on Facebook,” the statement said. “We’re also doing more to help advertisers understand their obligations under our policies, including the policies that prohibit use of our tools to wrongfully discriminate.”
Like the newsfeed algorithm and the nature of Facebook itself, the process will continue to evolve.
“Our review is continuous; the process will be ongoing and we’ll continue soliciting feedback,” Facebook said. “We take our responsibility to keep advertising safe and civil seriously, and we will keep exploring more ways to make targeting work for people and businesses.”
What do you think? Is this enough? Let us know in the comments!