In 2019, Julie Sweet, the newly appointed CEO of the global consulting firm Accenture, held a meeting with senior executives. She had a question: Should Accenture give up some of the work it did for a leading client, Facebook?
Tensions had grown over the years around a particular task Accenture performed for the social network. In eight-hour shifts, thousands of its full-time employees and contractors sorted through Facebook’s most harmful posts, including images, videos and messages about suicides, beheadings and sexual acts, and tried to prevent them from spreading online.
Some of those Accenture workers, who examined hundreds of Facebook posts in a shift, said they had begun experiencing depression, anxiety and paranoia. In the United States, one worker joined a class-action lawsuit to protest the working conditions. News coverage linked Accenture to the grim work. So Sweet had ordered a review to discuss the growing ethical, legal and reputational risks.
At the meeting at Accenture’s Washington office, Sweet and Ellyn Shook, the head of human resources, voiced concern about the psychological toll of the work for Facebook and the damage to the firm’s reputation, attendees said. Some executives who oversaw the Facebook account argued that the problems were manageable. They said the social network was too lucrative a client to lose.
The meeting ended without a decision.
Facebook and Accenture have rarely talked about their arrangement, or even acknowledged that they work with each other. But their secretive relationship lies at the heart of an effort by the world’s largest social media company to distance itself from the most toxic part of its business.
Facebook has been under scrutiny for years over the violent and hateful content circulating on its site. CEO Mark Zuckerberg has repeatedly promised to clean up the platform. He has promoted the use of artificial intelligence to weed out toxic posts and touted efforts to hire thousands of workers to remove the messages that the AI misses.
Behind the scenes, Facebook has quietly paid others to take on most of that responsibility. Since 2012, the company has hired at least 10 consulting and staffing firms around the world to sift through its posts, along with a wider network of subcontractors, according to interviews and public records.
No company has been more important to this effort than Accenture. The Fortune 500 company, known for providing high-end technology, accounting and consulting services to multinational corporations and governments, has become Facebook’s largest partner in content moderation, according to a review by The New York Times.
Accenture has taken on the work, and given it a veneer of respectability, because Facebook has signed contracts with it for content moderation and other services worth at least $500 million (€422 million) a year, according to The Times’ review. Accenture employs more than a third of the 15,000 people Facebook says it has hired to moderate its posts. And while the deals provide only a small fraction of Accenture’s annual revenue, they give the firm an important lifeline into Silicon Valley. Within Accenture, Facebook is known as a “diamond client”.
Their previously unreported contracts have redefined the traditional boundaries of an outsourcing relationship. Accenture has absorbed the worst facets of moderating content and made Facebook’s content problems its own. As a cost of doing business, it has dealt with workers’ mental health issues arising from reviewing the posts. It has grappled with labor activism as those workers pushed for higher wages and benefits. And it has quietly borne public scrutiny when they have spoken out against the work.
Fifteen current and former employees said these issues were compounded by Facebook’s demanding hiring and performance targets, and by so many changes to its content policies that Accenture struggled to keep up. And when moderators brought work-related legal action, Accenture stayed silent as Facebook argued that it was not liable because the workers belonged to Accenture and other firms.
“You can’t have Facebook as we know it today without Accenture,” said Cori Crider, a co-founder of Foxglove, a law firm that represents content moderators. Facilitators like Accenture, she said, have for handsome fees allowed Facebook to keep the core human problem of its business at arm’s length.
The New York Times interviewed more than 40 current and former Accenture and Facebook employees, lawyers and others about the relationship, which also includes accounting and advertising work. Most spoke anonymously because of confidentiality agreements and fear of retaliation. The Times also reviewed Facebook and Accenture documents, legal records and regulatory filings.
Facebook and Accenture declined to make executives available for comment. A Facebook spokesperson, Drew Pusateri, said the company recognised “that things can be tough, so we’re working closely with our partners to continually evaluate how we can best support these teams.”
Accenture spokeswoman Stacey Jones said the work is a public service “necessary to protect our society by keeping the internet safe.”
Neither company mentioned the other by name.
Much of Facebook’s work with Accenture traces back to a problem with nudity.
In 2007, millions of users joined the social network every month, and many of them posted nude photos. A deal Facebook reached that year with New York attorney general Andrew Cuomo required the company to remove pornographic posts flagged by users within 24 hours.
Facebook employees overseeing the content were soon overwhelmed by the volume of work, team members said. Sheryl Sandberg, the company’s chief operating officer, and other executives pushed the team to find automated solutions for scanning the content, three of those team members said.
Facebook also began exploring outsourcing, they said. Outsourcing was cheaper than hiring people directly and offered tax and regulatory advantages, as well as the flexibility to grow or shrink quickly in regions where the company lacked offices or language expertise. Sandberg helped champion the outsourcing idea, they said, and middle managers worked out the details.
In 2010, Accenture signed an accounting contract with Facebook. By 2012, that had expanded to include a deal to moderate content, particularly outside the United States.
That year, Facebook sent employees to Manila, Philippines, and Warsaw, Poland, to train Accenture workers to sort through posts, said two former Facebook employees who took part in the trips. The Accenture workers were taught to use a Facebook software system and the platform’s guidelines to decide whether to leave content up, take it down or escalate it for further review.
What started as a few dozen Accenture moderators quickly grew.
By 2015, Accenture’s office in the San Francisco Bay Area had set up a team, code-named Honey Badger, just for Facebook’s needs, former employees said. Accenture went from providing about 300 workers in 2015 to nearly 3,000 in 2016. The workers are a mix of full-time employees and contractors, depending on the location and the role.
Facebook has also spread its content work among other firms, such as Cognizant and TaskUs. Facebook accounts for a third of TaskUs’ business, or $150 million a year, according to legal filings.
The work was demanding. While more than 90 percent of the objectionable material that comes across Facebook and Instagram is removed by AI, the outsourced workers must decide what to do with the posts the AI doesn’t catch.
They receive a performance score based on how accurately they review posts against Facebook’s policies. Accenture employees said they could be fired if they got more than 5 percent of their decisions wrong.
Employees at Accenture began to question the effects of viewing so many hateful posts.
In Dublin, an Accenture moderator who sifted through Facebook content left a suicide note on her desk in 2018, said a mental health counselor who was involved in the episode. The worker was found safe.
In Austin, moderator Joshua Sklar, who resigned in April, said he had reviewed 500 to 700 posts a shift, including images of dead bodies after car crashes and videos of animals being tortured.
If workers went around Accenture’s chain of command and communicated directly with Facebook about content issues, he said, they risked being reprimanded. That made Facebook slower to learn about and respond to problems, he added.
Facebook said anyone filtering content could raise concerns.
Spencer Darr, another former moderator in Austin, said at a legal hearing in June that the job required him to make unimaginable decisions, such as whether to delete a video of a dog being skinned alive or simply mark it as disturbing. “The job of content moderators is impossible,” he said.
In 2018, Accenture introduced WeCare, a set of policies that counselors said limited their ability to treat workers. The counselors’ titles were changed to “wellness coaches,” and they were instructed not to make psychological assessments or diagnoses but to provide “short-term support,” such as suggesting a walk or listening to calming music. According to a 2018 Accenture guidebook, the goal was to teach moderators “how to respond to difficult situations and content.”
Accenture’s Jones said the company was “committed to helping our employees who do this important work succeed both professionally and personally.” Workers can see outside psychologists as well, she said.
By 2019, scrutiny of the industry was growing. That year, Cognizant said it would exit content moderation after The Verge, a tech news site, described the low wages and mental health effects on workers at an Arizona office. Cognizant said the decision would cost it at least $240 million in revenue and lead to 6,000 job cuts.
Accenture’s own chief executives have also debated doing the work for Facebook.
In 2017, Accenture’s chief executive at the time, Pierre Nanterme, questioned the ethics of the business and whether it fit the firm’s strategy of providing services with high long-term profit margins and technical expertise, three executives involved in the discussions said.
No action was taken. Nanterme died of cancer in January 2019.
Five months later, Sweet, a longtime Accenture attorney and executive, was appointed CEO. She soon ordered a review of the moderation work, three former colleagues said.
Last year, an Austin worker was one of two Accenture moderators who joined a class-action lawsuit brought against Facebook by US moderators. Facebook argued that it was not responsible for the workers’ employment because they belonged to firms like Accenture, according to court records. After the judge in the case ruled against Facebook, the company reached a $52 million settlement with the workers in May 2020.
Former executives said Sweet’s deliberations over the Facebook contracts spanned several meetings. She eventually made a few changes.
In December 2019, Accenture created a two-page legal disclosure to inform moderators about the risks of the job. The document said the work had “the potential to negatively impact your emotional or mental health.”
Last October, Accenture went even further. For the first time, it listed content moderation as a risk factor in its annual report, saying the work could leave the firm vulnerable to media scrutiny and legal trouble. Accenture also began restricting what new moderation work it would accept, two people with knowledge of the policy change said. Any new contract required the approval of senior management.
But Sweet also left some things untouched, they said.
Among them: the contracts with Facebook. In the end, the people said, the client was too valuable to walk away from. – This article originally appeared in The New York Times