
Insights from Research

Sara Maric reports on the job of content moderators

Content moderators watch, label, and remove user-generated and AI-generated content shared on social media platforms such as Facebook, Instagram, and TikTok, and they moderate content for AI companies such as OpenAI. Contrary to common belief, Artificial Intelligence (AI) is unable to detect and remove the majority of this type of content.

Humans need to screen videos and images depicting acts of torture, abuse, and sexual violence. At the same time, content moderators themselves are meticulously recorded, monitored, and controlled, all with the help of technology. They are required to review up to 1,000 postings per day, making a decision within seconds before immediately moving on to the next. As one content moderator put it:

“We had to work 150 tickets per hour. You know, it’s crazy. If you don’t do this, you will be warned by your team leader.”

Content moderators work under precarious conditions and lack proper professional training for handling potentially traumatizing content. They have little to no support or access to mental health experts, and they earn little more than the minimum wage. One content moderator said:

“So this actually was the most horrible part of it because it’s a job that is really affecting your well-being, affecting your mental health. You know, to the extent that even your physical health is also affected. But again, you cannot talk about it at all.”

Content moderation is typically not handled directly by platform organizations but is outsourced to specialized companies, a practice known as business process outsourcing (BPO). These BPO firms are primarily located in countries of the Global South but also operate in Europe to screen content in various European languages. Although content moderation is a low-wage sector, the prospect of an office job and a stable income makes it an attractive alternative, especially for workers in countries where average wages are very low and unemployment rates are high. In Germany, content moderators earn slightly above the minimum wage; however, the high cost of living makes content moderation a financially precarious occupation in Western Europe. Furthermore, migrant content moderators in Europe (but also those migrating to countries such as Kenya) often face a double financial burden: they must cover their own living expenses while also providing financial support for their families in their home countries. Their salaries are barely sufficient to make ends meet.

The majority of content moderators are deliberately recruited from low-wage countries or from regions affected by war and unrest, from which they are trying to escape difficult political and economic situations. Their migrant background makes them vulnerable, leaving them highly dependent on stable employment to support their families in their home countries and to maintain their residency permits. As one moderator explained:

“When we talk of moderators here in Kenya, we were coming from different countries, some from South Africa, others from Nigeria, from Ethiopia, from Uganda, Rwanda, Burundi, Kenya, you know, and all these other countries in Africa. (...) What they are doing is just to bring a bunch of young people, they expose them to this very toxic content. Then they’re finished with them. They know these people are no longer even able to take care of their lives as a result of this work. Then they dispose of them, and then they bring another bunch of people.”

The companies take advantage of the workforce’s young age and inexperience. Unaware of the job’s true nature, these workers are often misled into believing they are accepting a "translation job". Consequently, they enter a job that severely impacts their mental and physical health, as they are subjected to continuous re-traumatization by having to watch disturbing content daily. A content moderator from Kenya reported:

“They are choosing students all the time. You have some visa problems, you have some money problems and you need that job. They are using you.”

In addition to reviewing user-generated content, content moderators are also responsible for training AI models, such as those used in autonomous driving or in large language models like ChatGPT. By outsourcing content moderation, platform organizations and generative AI companies hide the workers and their working conditions from the public, thereby avoiding any association with the workers’ exploitation. These workers put their physical and mental health on the line to keep social media platforms and AI models safe. This raises an important question: What can we do to improve these workers’ working conditions?

 

About this research

Sara Maric (JKU) is currently conducting a study of content moderators. She has interviewed 50 content moderators from Kenya, Germany, and Morocco. Content moderators are difficult to reach, since the large platform organizations and their contractors want to ensure that this part of their work does not come under public scrutiny.

 

  • Additional Research: Laura Thäter on her research about improving platform workers' working conditions.

    Article by Laura Thäter
  • Extended Artistic Statement: Lennart Grau and Carla Streckwall on their artistic work.

    Artistic Statement
  • Additional Information Material: Academic articles, documentation, etc.

    Info Material