Case Study: Analyzing TikTok’s Ethical Failures with the Six Stages Framework

1.0 Introduction: A Decision Beyond the Headlines

TikTok’s decision to cut over 400 content moderation jobs is more than a simple business restructuring; it is a complex ethical event that reveals deep-seated priorities within the organization. When viewed through a standard business lens, the move can be explained away with the logic of efficiency and automation. However, this surface-level explanation often obscures the human cost. As one analysis notes:

“Systems of harm often wear the mask of logic.”

This case study will use the Six Stages Framework (SSF) as a “corrective lens” to move beyond the headlines and business rationale. By applying this framework, we can peel back the mask of logic to gain a deeper, more structured understanding of the systemic issues and profound ethical implications of TikTok’s decision. To begin, we must first understand the analytical tool we will be using.

2.0 The Analytical Tool: What is the Six Stages Framework (SSF) Lens?

The Six Stages Framework (SSF) Lens is a metaphorical tool designed to bring complex issues of equity, bias, and inclusion into sharp focus. Its core purpose is to move a person’s perspective “from distortion to clarity,” helping them to see systemic patterns of exclusion instead of isolated, random events.

The power of the SSF Lens can be understood through the analogy of upgrading from ordinary eyeglasses to advanced smart glasses.

  • Ordinary Glasses (Basic Correction): This represents correcting a distorted view to achieve basic clarity. Just as prescription glasses make a blurry world sharp, this initial step helps an observer see inequity clearly where it was previously blurry or even invisible.
  • SSF Smart Glasses (Augmented Insight): This represents an active tool that doesn’t just correct vision but augments it. It overlays structural clarity onto a situation; it interprets, flags, and questions what it sees, revealing patterns of exclusion and helping the observer understand the underlying systems at play.

In short, while ordinary glasses correct your sight, the SSF Lens corrects your insight. With this tool defined, we can now apply it to the TikTok case to analyze the decision-making process and its consequences.

3.0 Applying the Framework: A Step-by-Step Analysis of TikTok’s Decision

Using the SSF Lens allows for a structured ethical analysis of TikTok’s actions, moving beyond the company’s public statements to reveal the underlying principles—or lack thereof—guiding its operations.

3.1 Identifying the Ethical Stage: A System of Resistance

The first step in the analysis is to place the organization on the framework’s continuum. Based on the evidence, the SSF analysis places TikTok at Stage –2: Resistance, at risk of sliding toward Stage –3: Denial.

An organization operating at Stage –2 exhibits several key characteristics:

  • Appearance vs. Reality: The organization may have inclusive policies or values statements on the surface, but its decisions are fundamentally driven by cost-cutting, risk avoidance, or convenience.
  • Missing Empathy: Empathy is absent from the decision-making process. The human impact of a decision is secondary to its operational or financial benefits.
  • Safety as a Strategy: User safety is treated as a strategy, not a core principle. The organization resists embedding safety structurally when it conflicts with profit or efficiency, which means safety is ultimately seen as optional.

3.2 Uncovering the Blind Spots: The Three Active “Caves”

Within the SSF, “Caves” are the ethical blind spots or self-serving rationales that allow inequity to persist. They are the hidden logics that justify harmful decisions. These “Caves” are the mechanisms that keep an organization locked in a stage of Resistance. In the case of TikTok’s moderation cuts, three caves were particularly active.

  1. The Cave of Efficiency: This cave prioritizes speed, output, and cost over care and nuance. It is active in TikTok’s decision because the shift toward automation and outsourcing is framed as a logical step to cut costs and streamline operations. This rationale completely overrides the ethical necessity for human judgment and contextual understanding, which are slower and more expensive but critical for effective moderation.
  2. The Cave of Technology over Humanity: This cave is defined by the belief that technology, such as AI, can fully replace nuanced, trauma-aware content moderation. It is relevant here because TikTok is placing its trust in automated systems to perform a role that is deeply human. AI systems cannot understand the cultural, emotional, or relational weight of trauma, abuse, or racism online. This blind spot ignores the critical ethical and emotional dimensions of online harm in favor of a technological solution.
  3. The Cave of Silence: This cave is activated when the voices of key stakeholders are excluded from the decision-making process. It is evident here because the groups most affected were silenced. These include the content moderators whose labor shielded users from harm and the vulnerable users the platform claims to protect, such as young people, marginalised voices, and trauma survivors. Critically, this means that the community most at risk is not leading the safety discussion.

Identifying these caves reveals the flawed reasoning behind the decision. The framework then prompts us to ask deeper questions to define the human cost of these blind spots.

4.0 The Critical Questions: Defining the Human Cost

The SSF Lens shifts the focus from what happened to why, who, and how by prompting critical “vision questions.” These questions are designed to expose the ethical stakes of a decision and reveal who truly bears the risk.

For the TikTok case, the key questions are:

  • Who benefits from this decision, and who is at risk?
  • Whose labor is seen as disposable?
  • Whose safety is being deprioritized or seen as optional?
  • What assumptions are being made about the ‘neutrality’ of technology?

The purpose of these questions is to reveal the uneven distribution of risk and reward. They show that the benefits (cost savings, efficiency) flow to the company, while the risks (exposure to harmful content, loss of safety) are pushed onto its most vulnerable users and its devalued workforce. Answering them pierces the “mask of logic” the company wears, because doing so stops silence, speed, and surface-level solutions from blurring our moral vision.

Furthermore, the framework doesn’t just diagnose the problem; it illuminates a better path forward by contrasting a Stage –2 response with what an ethically robust system would demand. An organization operating from a higher stage (Stage +3 or +4) would treat user safety as non-negotiable. It would invest in transparent, trauma-informed moderation strategies that blend AI with human-led contextual review. It would establish independent ethical review panels that place diverse lived experience at the center, and it would ensure that moderator wellbeing is protected and respected. This contrast highlights the true ethical cost of TikTok’s decision.

5.0 Conclusion: Seeing the Pattern, Not Just the Event

The Six Stages Framework Lens reveals that TikTok’s decision to cut its content moderation team was not merely a logical business move. It was a clear and demonstrable prioritization of profit and efficiency over human safety, care, and equity.

By identifying TikTok’s operational stage as Resistance (–2), the framework showed a system where empathy is missing and safety is treated as optional. By uncovering the active Caves of Efficiency, Technology over Humanity, and Silence, the analysis exposed the self-serving rationales and ethical blind spots that made such a harmful decision possible. The SSF transforms individual moments of exclusion into a discernible pattern of systemic issues. It leaves us with a final, powerful question to consider.

“If TikTok’s leaders were wearing their SSF lenses… what might they see now?”

Clip: https://youtube.com/clip/UgkxfoEoyOPDgpecvcdBX3pV2tDj65A6SST5?si=TIY7vbAI00r8bci4

LinkedIn: https://www.linkedin.com/pulse/case-study-analyzing-tiktoks-ethical-failures-six-stages-m-gadzah-tqide

Spotify: https://open.spotify.com/episode/4LYn6mollc0Fgt8ooasXiY?si=QFruen3-QMmxMUDgZLka4Q
