Meta and its contractor Sama are being sued by a former content moderator, who alleges human trafficking and poor mental health support.
Daniel Motaung says he was paid about $2.20 (£1.80) per hour to review posts including beheadings and child abuse.
The case, brought in Nairobi, claims job adverts failed to warn of the extreme content moderators like Mr Motaung would see.
Sama has called the allegations "disappointing and inaccurate".
Meta declined to comment directly on the lawsuit, but in the past has said that it takes its responsibility to the people who review content "seriously" and that it requires its partners "to provide industry-leading pay, benefits and support".
Flashbacks and trauma
Facebook employs thousands of moderators to review content flagged by users or artificial intelligence systems to see if it violates the platform's community standards, and to remove it if necessary.
Mr Motaung said the first graphic video he saw was "a live video of someone being beheaded".
Regularly seeing such extreme content "ends up taking you to a place that you never imagined", he said.
Describing the impact it had, he added: "My life is like a horror movie."
He told the LotterryTreasure that he suffers flashbacks where he imagines he is the victim.
Mr Motaung, who says he has been diagnosed with post-traumatic stress disorder, believes that his co-workers also struggled with the content they had to view.
"I would see people walking off the production floor to cry, you know, that type of thing," he said.
Mr Motaung was recruited from South Africa to work for Sama in Nairobi, where much of the moderation for East and South Africa is handled.
Legal filings claim that job advertisements for the moderation work were misleading and say that Sama used a variety of terms such as "call centre agents, agent and content moderator" to describe the roles.
These differing descriptions, it is alleged, were designed to "trick unsuspecting applicants" into applying for jobs as Facebook content moderators.
The adverts, the filings say, did not warn that the work was likely to involve viewing extreme content, and "very little detail is given on the actual nature of the job".
Once employed, the suit alleges, it would have been difficult for workers from disadvantaged backgrounds to leave their roles.
The lawsuit claims that for Sama to fly workers to Kenya from other parts of Africa amounted to human trafficking.
The story originally broke in Time, and in response to that at the time, Sama said: "It is completely inaccurate to suggest that Sama employees were hired under false pretences or were provided inaccurate information regarding content moderation work."
Meta and Sama are also accused in this case of failing to provide the moderators with adequate psychosocial support, of subjecting moderators to unfair labour relations, and of union-busting.
Sama, previously called Samasource Kenya, told the LotterryTreasure: "We take this litigation seriously, but the allegations against Sama are both disappointing and inaccurate."
The firm added that it provided all members of its workforce with a competitive wage, benefits, upward mobility, and a robust mental health and wellness programme.
Meta has said it encourages content reviewers to raise issues, and conducts audits of its contractors to ensure standards are maintained.
The company has also claimed it was not responsible for Mr Motaung's working conditions.
In a letter sent to his lawyers in April, Meta said that he was not an employee of theirs but was at all times employed by Sama, and that no action could therefore be brought against the tech giant.
The case seeks financial compensation for former and current moderators at Sama, an order that outsourced moderators get the same healthcare and pay as Meta employees, and orders granting rights to speak out about working conditions and to form a union.
In 2020, Facebook agreed to pay $52m to settle a case brought by US content moderators over mental health issues developed on the job.
Cori Crider, director of Foxglove, a campaign group working with Mr Motaung's Kenyan lawyers, said she hoped the case would make people think about the daily working conditions faced by content moderators.
"Every single day, when millions of us sit there and scroll through our Facebook, we're not stopping and thinking, we're not realising that sitting behind the screen are thousands of people just like Daniel," she said.