In the first six months of 2025, YouTube has allegedly terminated 5,003,437 channels. That breaks down to 27,643 per day, 1,151 per hour, and 19 per minute. At the current pace, more than 10,000,000 channels will be terminated by the end of the year. By the time you finish reading this, another 100 channels will likely have been terminated.
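These rates follow from simple division over the roughly 181 days in the first half of 2025. A minimal sketch of that arithmetic, assuming the 5,003,437 headline figure and truncating fractional results the way the figures above do:

```python
# Back-of-the-envelope check of the termination rates cited above.
# Assumes the alleged 5,003,437 total and a 181-day first half of 2025.
total_terminations = 5_003_437
days_in_half_year = 181  # January 1 through June 30, 2025

per_day = total_terminations // days_in_half_year  # 27,643 channels per day
per_hour = per_day // 24                           # 1,151 channels per hour
per_minute = per_hour // 60                        # 19 channels per minute
full_year_projection = total_terminations * 2      # 10,006,874 at this pace

print(per_day, per_hour, per_minute, full_year_projection)
```

At 19 terminations per minute, a five-minute read corresponds to roughly 100 more channels removed, which is where the closing figure in the paragraph above comes from.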
While the volume of removals is troubling, what is far more disturbing is how they are carried out. 81.8 percent of these terminations fall under the supposed category of “spam, deceptive practices, and scams.” That category is extremely vague, and it is enforced by YouTube’s AI-driven automated systems. Appeals are sent to the same automated, non-human reviewer that issued the original termination, and many are rejected within minutes. No human interaction. No explanation, and no understanding of context from a faceless, nameless AI bot.
YouTube’s automated process from start to finish:
- AI flags the creator.
- AI terminates the channel.
- AI reviews the appeal.
- AI upholds its own decision.
Creators are denied any human review, communication, or explanation. Zero human evaluation of intent, history, or nuance. Automated moderation has become the Substitute for Human Intelligence (SHI) in the space. It has turned what should be a content review process into a mechanical cycle that leaves creators disconnected from the platform they built their lives around. That disconnect is one of the most significant sources of frustration and resentment in the creator community. When an individual’s livelihood can be wiped away by an automated scan, the relationship between platform and user is broken.
Creators build lives around their channels. They rely on them for income, business partnerships, and communication with their audiences, and those channels represent years of hard work. When an AI system deletes all of that in seconds without human intervention, the fallout is dramatic. The impact is felt across channels large and small. Small and mid-sized channels are removed at the same rate as large ones, but with far fewer resources to pursue legal action or to protect themselves from what appear to be the whims of an automated system.
Creators who do decide to fight a termination face a process that provides little clarity. Even those who have prevailed in court have seen their channels remain terminated, because there is no clear process to compel YouTube’s automated ecosystem to acknowledge those rulings.
The result is a system where creators assume all the risk and the platform takes no responsibility.
An AI system can end a creator’s career with no human review process available at any stage.
This problem raises a larger question for the entire creator economy: how much of the enforcement of a platform’s Terms of Service should be delegated to an AI system when those non-human decisions have the power to determine whether an individual can continue to support themselves?
Parler and PlayTV were built to address these problems. Both platforms center moderation decisions on human oversight, and the rules on both are simple, focused on legality and clarity in communication. Parler’s model gives creators ownership of their data, control of their audiences, and stable monetization with 80/20 splits (creators keep 80 percent) across tipping, subscriptions, and paid content. With Cartix, creators also have access to 90/10 merchandise splits (creators keep 90 percent) and a unified creator ecosystem that is not beholden to external companies for processing or moderation decisions.
This stands in contrast to how larger platforms are built, and it is rooted in the belief that the relationship between creators and the platforms they build on must be human and accountable if it is going to be sustainable.
Stories continue to emerge from creators caught in YouTube’s recent wave of automated terminations. Years of work vanish in seconds, and creators are left without communication options or clear pathways to resolution.
The growing frustration with automated systems within the creator community points to a broader existential problem for the entire creator economy. A platform cannot remain healthy for long when the people who create its value feel that their channels can be taken without explanation or human review.
Parler will continue to shine a light on these issues in the coming weeks. The future of the creator economy will be determined by systems that understand the value and vulnerability of the individual humans who make it work.
For more information about Parler Technologies, Inc., visit
