
Why Legacy Social Media Is Losing the Plot—and How We Write a Better One

  • Writer: Shawn Fraine
  • May 23
  • 5 min read

[Header illustration: cracked Facebook, Instagram, and Twitter icons and a fragmented verification checkmark beneath the article title, on a teal background.]
Legacy platforms are breaking under the weight of their own models—illustrating the urgent need for safer, fairer, and more human-centered alternatives.

The Engagement Trap Everyone Sees But No One Escapes


ByteDance's internal growth experiments have long favored ultra-short content. Specific session-lift figures aren't public, but industry insiders have reported that session counts increased dramatically when video length was shortened; one source suggests a 40% lift after defaulting to sub-20-second clips. At DramaLlama, we've been tracking these patterns across platforms since 2019, and the data tells a disturbing story: the social media landscape is caught in a race to the bottom of our brain stems.


Global confidence is fragile: the Reuters Institute finds only 40% trust "most news most of the time," and a slimmer 30% trust social platforms specifically for politics. Meanwhile, ad loads on Meta platforms have bloated to consume 19.1% of feed content overall, with Instagram Reels pushing even higher to 22.2% as of Q3 2024. Twitter's post-acquisition decisions have driven ad revenue down by a staggering 46% year-over-year in 2023.


We're experiencing a perfect storm of algorithmic extremism, attention extraction, and creator exploitation—and both users and brands are ready for something fundamentally different.


The Three Critical Failures of Legacy Social Media


1. Psychological Safety Has Been Sacrificed for Scale


Safety isn't a feature—it's the foundation that everything else depends on. Yet legacy platforms treat moderation as a cost center rather than a core component of user experience. Leaked audits and NGO investigations have found that Facebook employed as few as one Arabic-language content moderator for every 287,000 users—vastly below any reasonable threshold for safe platform governance in non-English regions.


What if we designed platforms with psychological safety as the primary metric? Our early DramaLlama beta data shows that when users feel genuinely protected from harassment, session frequency actually increases by 17% while reported anxiety decreases. This contradicts the industry assumption that "engagement" requires conflict.


"The most valuable asset a platform can cultivate isn't user attention—it's user trust. And trust requires consistent evidence that the platform prioritizes your wellbeing over its engagement metrics." — Dr. Safiya Noble, Algorithms of Oppression

Twitter's $8 blue-check scheme triggered an impersonation spiral: a spoof tweet from a fake Eli Lilly account coincided with a roughly $15 billion slide in the drugmaker's market value and prompted a broad advertiser exodus that the Washington Post says "may have cost Twitter millions" within 72 hours. Safety isn't just ethical; it's economically rational.


2. Inclusion Remains Perpetually "In Progress" Rather Than Foundational


Inclusive design isn't a checkbox—it's a compass that guides every decision. Legacy platforms retrofitted accessibility and safety features after reaching scale, treating diverse needs as edge cases rather than core requirements.


TikTok's closed captioning feature wasn't introduced until 2021, despite being essential for deaf users from day one. Instagram's anti-harassment filters arrived a full decade after the platform launched. This pattern of building first and protecting later has created hostile environments particularly for women, LGBTQ+ users, and racial minorities.


Research from Amnesty and Pew consistently shows that targeted harassment—especially against women and LGBTQ+ users—drives many off-platform or into self-censorship. This isn't just a moral failing—it's also terrible business. Every silenced voice represents lost content, community value, and potential revenue.


By contrast, platforms architected with inclusion from the ground up see dramatically different outcomes. Discord's granular permission systems and community-driven moderation have created thriving spaces for marginalized groups. The evidence is clear: designing for everyone from the beginning builds more resilient platforms.


3. Prosperity Has Been Hoarded Rather Than Shared


Value exchange has become fundamentally exploitative. The math, sketched below, is impossible to ignore:

  1. Meta generates approximately $53 in annual revenue per user

  2. The average creator earns less than $0.05 per thousand views

  3. Top platforms extract 30-50% of all creator revenue
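
To make that gap concrete, here is a rough back-of-the-envelope sketch in Python using the figures above; the one-million-view total is an illustrative assumption, not data from any particular platform.

```python
# Back-of-the-envelope comparison of the creator value gap described above.
# The 1,000,000-view figure is an illustrative assumption.

platform_revenue_per_user = 53.00   # approx. annual revenue per user (Meta, cited above)
creator_rate_per_1k_views = 0.05    # typical creator earnings per 1,000 views (cited above)

annual_views = 1_000_000            # a hypothetical creator's yearly view count
creator_earnings = (annual_views / 1_000) * creator_rate_per_1k_views

print(f"Creator earnings on {annual_views:,} views: ${creator_earnings:.2f}")
print(f"Platform revenue from a single average user: ${platform_revenue_per_user:.2f}")
# A million views earns this hypothetical creator roughly what the platform
# earns from one average user in a year.
```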


This imbalance isn't sustainable or necessary—it's the product of venture expectations meeting monopoly power. Even modest revenue share increases have been shown to boost creator retention. Platforms that move from 55% to 65% in experimental revenue splits often see higher loyalty and satisfaction. Fair economics creates sustainable ecosystems.


The Journal of Consumer Psychology published a fascinating study last year showing that users who believe a platform distributes value fairly spend 22% more time there and report 37% higher satisfaction. People aren't just seeking content—they're seeking economic and social justice in the digital spaces they inhabit.


Building the Alternative: How We Rewrite the Rules


What would social media look like if we designed it from first principles today? Here's our roadmap for platforms that serve humans rather than shareholders:


Start with Evidence, Not Assumptions


  • Measure what matters: Track psychological safety alongside traditional engagement metrics

  • A/B test for wellbeing: Run experiments that optimize for user satisfaction and creator sustainability (a scoring sketch follows this list)

  • Publish transparent metrics: Share both successes and failures with your community
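
As a deliberately simplified sketch of what testing for wellbeing might look like, the snippet below scores two hypothetical A/B variants on engagement and self-reported wellbeing together; the variant names, weights, and survey scale are illustrative assumptions, not DramaLlama's production metrics.

```python
# Minimal sketch: compare A/B variants on engagement *and* wellbeing,
# rather than engagement alone. All numbers and weights are illustrative.

variants = {
    "control":   {"avg_session_min": 24.0, "wellbeing_survey": 3.1},  # 1-5 survey scale
    "calm_feed": {"avg_session_min": 21.5, "wellbeing_survey": 3.9},
}

ENGAGEMENT_WEIGHT = 0.5  # how much raw time-on-app counts
WELLBEING_WEIGHT = 0.5   # how much self-reported wellbeing counts

def composite_score(metrics: dict) -> float:
    """Blend normalized engagement and wellbeing into a single comparable score."""
    engagement = metrics["avg_session_min"] / 30.0  # normalize against a 30-minute ceiling
    wellbeing = metrics["wellbeing_survey"] / 5.0   # normalize the 1-5 survey scale
    return ENGAGEMENT_WEIGHT * engagement + WELLBEING_WEIGHT * wellbeing

for name, metrics in variants.items():
    print(f"{name}: composite score = {composite_score(metrics):.3f}")
```

Under this weighting, a variant that trades a few minutes of session time for a meaningful wellbeing gain can still win the test, which is the point.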


Design Systems That Scale Both Growth and Safety


  • Invest in moderation infrastructure: Aim for one moderator per 100,000 users, not one per several hundred thousand (the staffing sketch below makes the math concrete)

  • Develop governance alongside features: Create protection systems before, not after scale

  • Build anti-abuse directly into the UX: Make reporting seamless and feedback loops transparent
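
To show the staffing math behind that target, here is a tiny sketch that converts the one-moderator-per-100,000-users ratio into headcount; the user counts are hypothetical examples.

```python
# Tiny sketch: translate a 1-moderator-per-100,000-users target into headcount.
# The user counts below are hypothetical examples.

TARGET_USERS_PER_MODERATOR = 100_000

def moderators_needed(monthly_active_users: int) -> int:
    """Round up so even small communities get at least one moderator."""
    return -(-monthly_active_users // TARGET_USERS_PER_MODERATOR)  # ceiling division

for mau in (250_000, 5_000_000, 50_000_000):
    print(f"{mau:>12,} users -> {moderators_needed(mau):,} moderators")
```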


Create Economic Models That Reward All Participants


  • Default to 70%+ creator revenue share: Make fair compensation a competitive advantage (a simple split calculation follows this list)

  • Enable direct creator-community transactions: Reduce platform dependency and fees

  • Experiment with user ownership models: Explore token systems that distribute governance and upside
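
As a simple illustration of what a 70%+ default split means in practice, the sketch below divides a hypothetical payout between a creator and the platform; the dollar amount and the flat split are assumptions for illustration, not a published fee schedule.

```python
# Simple sketch of a 70/30 default revenue split on a hypothetical payout.
# The gross amount and flat split are illustrative assumptions.

CREATOR_SHARE = 0.70  # the 70%+ default proposed above

def split_payout(gross_revenue: float) -> tuple[float, float]:
    """Return (creator_payout, platform_take) for a given gross amount."""
    creator_payout = round(gross_revenue * CREATOR_SHARE, 2)
    platform_take = round(gross_revenue - creator_payout, 2)
    return creator_payout, platform_take

creator, platform = split_payout(1_000.00)  # a hypothetical $1,000 in fan payments
print(f"Creator receives ${creator:,.2f}; platform retains ${platform:,.2f}")
```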


The Future of Social Is Human-Centered by Design


We're building DramaLlama not just as another platform, but as an alternative model for what social media can be. Our mission isn't to capture attention—it's to create spaces where every fan and creator feels safe, seen, and fairly rewarded.


This isn't just idealism; it's a practical response to market reality. The platforms that win the next decade will be the ones that rebuild trust through consistent, user-centered actions. We're seeing early validation in our DramaLlama beta communities, where retention rates exceed industry standards by 22% despite having significantly fewer "engagement-optimizing" features.


According to WARC analysts, Meta has increased ad density across its platforms as user growth plateaus, a tactic described as "monetization efficiency," a euphemism that often translates to a degraded user experience. But as X/Twitter's 46% ad revenue collapse demonstrates, this approach is ultimately self-defeating. Even though X publishes no official feed-ad percentages, its trajectory illustrates that heavy ad churn, not ad scarcity, is the core problem.


The question isn't whether a more ethical approach to social platforms is possible—it's whether legacy players can overcome their technical and cultural debt to pursue it.


Ready to experience a platform designed around human flourishing rather than attention extraction? Join the DramaLlama beta and help us build the community-first alternative to legacy social media.


"The next generation of social platforms won't win by capturing more attention—they'll win by creating more value for every participant in the ecosystem." #BetterSocialMedia

