TikTok just got a new U.S. power structure. Now, some users say the algorithm got a new set of silent rules.
Within days of a group of mainly American investors, approved by President Donald Trump, taking control of TikTok’s U.S. operations, politically engaged creators began posting the same complaint in different ways: their videos were not just underperforming; they were disappearing from followers’ feeds, search results, and watch histories.
The stakes are not subtle. TikTok is where campaigns recruit, protest movements organize, and news travels faster than corrections. So when users say content is being throttled right after a change in ownership, the question becomes less about glitches and more about leverage. Who gets to decide what is visible?
When a Post Exists, but Nobody Can See It
PBS NewsHour reported that some TikTok users say the platform is censoring and limiting content, including posts and messages about Jeffrey Epstein and the shooting deaths of U.S. citizens by federal agents in Minneapolis. In one example, a creator said his U.S.-based followers could not access a post about the shooting death of Renee Good.
The user described a strange split-screen reality in which the video appears normally on his own page but, to people trying to view it, looks like it never existed. According to the PBS segment, he said:
“So, this is how it should appear. The latest video under my three pinned videos is the ICE shooting analysis. But this is what my page looks like, according to people who message me. And even weirder, this is how it appears in their watch history, a blank square.”
That is not a typical complaint about a bad day in the algorithm. It is an allegation that visibility itself is being selectively denied, a modern form of soft deletion that is hard to prove and easy to dismiss.
A Senator Says TikTok Put His ICE Post in Time-Out
The controversy is not limited to creators chasing views. PBS reported that California State Senator Scott Wiener said TikTok would not let him share a post about ICE for several hours.
His message was tied to a legislative push, not a meme. In the segment, Wiener said:
“I am advancing a bill now to say that, in California, it’s not going to be just local and state law enforcement who can be sued if they violate your rights, but federal agents can as well.”
If an elected official claims a platform temporarily blocked his post about federal agents, that puts TikTok in a familiar crossfire. Is it enforcing rules on sensitive topics, struggling with systems changes, or responding to political pressure? The platform’s problem is that the timing invites suspicion.
Newsom Steps In, and TikTok Points to a Power Outage
California Governor Gavin Newsom announced a new state investigation to determine whether TikTok is violating state law by censoring content critical of President Trump, according to PBS NewsHour.
TikTok, for its part, pointed away from politics and toward infrastructure. PBS reported that the company issued a statement saying it had suffered a cascading systems failure that caused multiple platform bugs after a power outage at one of its U.S. data centers.
Two things can be true at once. Large platforms do suffer outages and bugs that produce weird, uneven symptoms. But a systems failure is also a convenient umbrella explanation when the visible damage is politically concentrated and happening during a politically charged transition.
TikTok’s credibility problem here is not that outages are impossible; it’s that the platform is already seen as an opaque gatekeeper. When something goes wrong, outside observers cannot audit the difference between a moderation choice, a tuning tweak, and a genuine technical collapse.
The Real Story Is Control, Not Code
PBS NewsHour framed the moment with a blunt reminder about who holds the keys. Tech journalist Jacob Ward, host of The Rip Current, told PBS:
“Well, I mean, I think, on the one hand, it’s important to just remember, in the context of American public discourse, right, that the way we communicate with one another is controlled by a handful of private companies.”
That is the pressure point. TikTok is not a public square with a neutral referee. It is a privately owned, constantly tuned distribution machine. The public only sees the outputs: views, reach, removals, and dead air.
Now add the new ingredient: a Trump-approved change to who controls TikTok’s U.S. operations. Even if the day-to-day moderation policy did not change at all, users who already distrust institutions are primed to connect any visibility problem to a political motive.
And when the content at issue includes ICE shootings, federal agents, and Epstein, the suspicions multiply. These are not topics that typically get brands excited. They are also the kind of topics that can trigger safety systems, misinformation enforcement, graphic-content filters, or harassment policies.
In other words, the platform has plenty of non-political reasons to reduce distribution. But it also has every reason to be careful about appearing political, especially after a significant shift in ownership and control.
What Would Censorship Look Like in 2026?
Traditional censorship is overt. A government bans a pamphlet. Police seize printing presses. A judge orders a publication stopped.
Platform-era censorship, if it is happening, can look more like this:
- A post uploads successfully but does not appear in followers’ feeds.
- A video appears on a creator’s page but does not populate in watch history for viewers.
- Sharing tools temporarily fail on specific topics, accounts, or keywords.
- Search and recommendation systems silently exclude specific terms.
Those symptoms are consistent with several explanations. A bug. A backend migration problem. A moderation flag. A sudden shift in automated ranking. An enforcement wave. Or a combination, which is what TikTok suggested with its cascading systems failure explanation.
The problem is that the user experience is indistinguishable across those possibilities. The platform sees the levers. Users see the blackout.
What Happens Next: Receipts, Rules, or More Fog
Newsom’s investigation raises the pressure. If a state probe seeks internal explanations for why certain posts were blocked, delayed, or ghosted, TikTok may face a choice between transparency and corporate secrecy.
Meanwhile, creators will keep doing what creators do. They will test workarounds, swap keywords, post screen recordings, and compare notes. That kind of crowdsourced troubleshooting can produce evidence, but it can also produce mirages. Platforms are noisy systems, and humans are pattern-finders.
The next meaningful datapoint will be whether TikTok can provide precise, verifiable details about what failed, when it failed, which functions were affected, and how distribution changed as a result. General statements about outages calm investors, but they do not convince users who feel singled out.
For now, TikTok is caught in a trap of its own design. The app runs on invisible choices. When the choices start to look political, every glitch becomes a motive hunt.