Algorithmic Accountability

This document outlines the Users Union's position on social media algorithms — specifically, any platform that uses a personalized feed to automatically surface content: videos, images, articles, text.

A search engine that ranks results by recency or popularity doesn't qualify. What we're concerned with is the personalized feed: content selected for you specifically, by a system optimizing for your continued attention. That distinction matters, because personalization is where the complexity, and the harm, lives.

We want to do something about that.



Doomscrolling

Doomscrolling isn't a willpower problem. It's an engineering problem. Platforms have spent billions of dollars and decades of research making sure you don't stop. The algorithm is not neutral — it is optimized, at enormous scale, specifically to capture and hold your attention. You were never meant to win that fight.

To understand why, it helps to know what you're actually up against. Social media feeds are not passive delivery systems. They are the first information systems in history to combine personal targeting and real-time behavioral feedback at billion-user scale. And they operate with zero accountability to the people they're built around.

• Personal targeting. Unlike television, radio, or print, which broadcast the same content to everyone, these systems build a unique profile for each user and serve content tailored specifically to keep you watching. Not the average person. You.

• Real-time behavioral feedback. Every pause, replay, skip, and scroll is recorded and fed back into the system instantly. The algorithm learns what holds your attention as it holds it, updating its model of you in real time, at no additional cost.

• Billion-user scale. Previous persuasion systems — advertising, propaganda, even early internet ads — had real costs per person reached, which limited their reach and their refinement. These systems scale to billions of users essentially for free, which means they can be tested, tuned, and optimized to a degree no prior technology has approached.
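The three properties above can be sketched as a toy program. This is our illustration, not any platform's actual code; the class, topic names, and update rule are all assumptions chosen for clarity.

```python
class ToyFeed:
    """Toy personalized feed: one profile per user, updated on every interaction."""

    def __init__(self, topics):
        # Personal targeting: a per-user weight for each topic.
        self.weights = {t: 1.0 for t in topics}

    def rank(self, candidates):
        # Serve whatever the current profile predicts will hold attention longest.
        return sorted(candidates, key=lambda t: self.weights[t], reverse=True)

    def observe(self, topic, dwell_seconds):
        # Real-time feedback: every pause or skip updates the model immediately.
        # (0.1 is an arbitrary illustrative learning rate.)
        self.weights[topic] += 0.1 * dwell_seconds


feed = ToyFeed(["cooking", "politics", "sports"])
feed.observe("politics", 30)  # user lingers on one political post
feed.observe("cooking", 2)    # skims a recipe
print(feed.rank(["cooking", "politics", "sports"])[0])  # politics now ranks first
```

A few seconds of dwell time is enough to reorder the whole feed, and because the loop is pure software, running it for the billionth user costs no more than running it for the first.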

We are only beginning to understand what it means to have grown up with this, or to have spent a decade inside it. The research is young, the regulation is nearly nonexistent, and the platforms themselves have little incentive to find out. What we do know is that these systems were not designed with your wellbeing in mind, and that nobody is currently required to account for the difference.



Political Radicalization

Recommendation algorithms are designed to keep you watching by serving content similar to what you just engaged with. In practice, this creates a rabbit hole effect: each video, post, or article leads to something slightly more intense than the last. Researchers have documented this pattern consistently across YouTube, Facebook, and TikTok — users who engage with mainstream political content are systematically nudged toward more extreme versions of it, not through any deliberate editorial choice, but because more extreme content tends to generate more engagement.

The reason is simple. The emotions that drive the most engagement, like outrage, fear, and tribal identity, are also the emotions that fuel political extremism. The algorithm has no interest in politics. It has an interest in your attention. Extremist content, on all sides of the political spectrum, is exceptionally good at producing those feelings, so the system rewards it with reach. The amplification isn't ideological. It's mechanical.

The result is a radicalization pipeline that requires no conspiracy to operate. A user doesn't choose to become more extreme; they follow the recommendations. Communities don't choose to become echo chambers; the algorithm learns what the group engages with and narrows the feed accordingly. What begins as curiosity ends, for some, in worldviews they would not have arrived at on their own. The platform bears no responsibility for the journey. That is the problem.
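The rabbit-hole dynamic can be made concrete with a toy model. Assume content has an intensity score in [0, 1], the recommender serves items similar to the last one engaged with, and, per the argument above, more intense content wins the engagement comparison. Every number here is an illustrative assumption, not a measured platform parameter.

```python
def engagement(intensity):
    # Assumed model: engagement rises with intensity (the claim that
    # outrage and fear generate more engagement than calm content).
    return intensity


def recommend_next(last):
    # Serve "similar" content: one candidate just below and one just above
    # the last item, then pick whichever the engagement model predicts wins.
    candidates = [max(0.0, last - 0.05), min(1.0, last + 0.05)]
    return max(candidates, key=engagement)


intensity = 0.2             # user starts on mainstream content
for _ in range(10):         # ten recommendations later...
    intensity = recommend_next(intensity)
print(round(intensity, 2))  # 0.7: notably more intense, with no editorial choice anywhere
```

No line of this code mentions politics or ideology. The drift toward the extreme falls out of two ingredients alone: similarity-based recommendation and engagement-ranked selection.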


What is Algorithmic Accountability?

Algorithmic accountability means that the systems shaping what billions of people see, think, and feel must answer for what they do. Not to shareholders. Not to advertisers. To users.

The first step is simple: a reset button. Everyone has felt their feed turn on them — too dark, too political, too strange — and had no way out. The ability to wipe your algorithmic history and start fresh is the most basic form of user control imaginable. It costs platforms nothing. They just don't offer it, because a fresh algorithm is a less potent one. That has to change.
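Technically, a reset button is about as simple as a feature gets: discard the learned per-user state and return to the fresh, uniform profile. A minimal sketch, with hypothetical names of our own choosing:

```python
class UserProfile:
    """Toy per-user recommender state, to show what 'reset' amounts to."""

    def __init__(self, topics):
        self.topics = list(topics)
        self.weights = {t: 1.0 for t in self.topics}  # fresh, uniform profile

    def observe(self, topic, dwell_seconds):
        self.weights[topic] += 0.1 * dwell_seconds    # accumulated history

    def reset(self):
        # The entire feature: throw away the learned state and start over.
        self.weights = {t: 1.0 for t in self.topics}


p = UserProfile(["news", "hobbies"])
p.observe("news", 120)  # months of history, condensed
p.reset()
print(p.weights)        # back to the uniform starting profile
```

One method, a handful of lines. The cost of offering it is negligible; the only thing it destroys is the model's accumulated hold on the user.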

None of this happens on its own. Tech companies have no incentive to change a model that is enormously profitable. Regulators move slowly and are routinely outpaced by the technology they're trying to govern. Individual users have no leverage.

That's what the Users Union is for. We are a collective voice for the people these systems are built around — and currently built against. The reset button is where we start. My feed, my rules is where we're going.