Ms. Sandberg said the company “never intended or anticipated this functionality being used this way — and that is on us.”
Facebook has grown into one of the world’s most valuable companies by offering advertisers the ability to quickly and easily target its users based on a vast array of information, from the type of home they live in to their favorite television shows. But the company is facing a new wave of scrutiny over how those tools can be misused, particularly after it disclosed this month that fake accounts based in Russia had purchased more than $100,000 worth of ads on divisive issues in the run-up to the presidential election.
The site has also been criticized for not anticipating that its technology could be put to nefarious use.
“The appearance of these offensive terms was embarrassing for Facebook and reflects the tendency of Silicon Valley companies to overly trust algorithms and automated systems to manage advertising,” said Ari Paparo, chief executive of Beeswax, an advertising technology start-up in New York. “The media business is all about people and influence, so there’s a necessary role for human moderation and control.”
This is not the first time that Facebook has faced issues stemming from a lack of human oversight. Earlier this year, after a series of violent acts appeared on Facebook Live broadcasts, the company said it would add 3,000 people to the 4,500-member team of employees that reviews and removes content that violates its community guidelines.
But this was the first time that Ms. Sandberg, who is responsible for Facebook’s entire advertising organization, had directly addressed the company’s high-profile ad issues in public. Ms. Sandberg, a veteran of the digital advertising industry, rose to prominence in Silicon Valley by building Google’s sales organization in the search giant’s early days. She joined Facebook in 2008 and was asked to do the same for the social network.
Facebook has faced thorny questions about race and its ad-targeting tools before. Last fall, ProPublica reported that advertisers could use those tools to exclude certain races — or what the social network called “ethnic affinities” — from housing and employment ads, a potential violation of the Fair Housing Act of 1968 and the Civil Rights Act of 1964. Facebook, which assigns the updated term “multicultural affinity” to certain users based on their interests and activities on the site, no longer allows that…