Unfortunately, YouTube lost many users after it was slowed down. Some (27%) use VPNs to continue accessing the platform, but others forgo VPNs and switch to similar Russian platforms such as VK Video or RuTube. The Russian government has actively
nurtured these domestic alternatives by investing heavily in their infrastructure, improving algorithms and enticing popular creators with monetization incentives to migrate from YouTube.
Since March 2022, in response to Russia’s “fake news” law, TikTok has
blocked uploads, live streams and international content, leaving Russian users with an isolated feed consisting almost entirely of older, Russian-made posts.
Laws affecting social media

According to Russian law, a social media platform is defined as a website or application with more than 500,000 daily users. Such platforms are subject to certain
restrictions, including a ban on publishing extremist materials, inciting terrorism, and distributing pornography and drug-related information, as well as on disseminating materials from “undesirable” and “extremist” organizations. Any platform hosting such materials must promptly remove them. Failure to do so can result in fines or in the infringing platform being blocked.
But that’s not all. Under the so-called
Yarovaya law, Russian social networks are required to store user information, including the full content of calls and messages for six months and metadata for one year, and to provide it to the security services upon request, because the networks are considered “information dissemination organizers” and are listed in the corresponding register (Reyestr ORI). Platforms on this list that refuse to comply may face restrictions or even be fully blocked.
Personal data laws in Russia further increase pressure on these platforms. All personal data of Russian citizens must be stored locally, and compliance is enforced through Roskomnadzor audits and technical checks. Noncompliant entities may be severely fined, throttled or blocked.
Those that are unwilling to cooperate end up blocked and unavailable to Russian users without a VPN. Unsurprisingly, most foreign social networks are blocked in Russia. However, even for those that try to cooperate, there is no guarantee things will work out.
For instance, in early 2022, YouTube removed over 9,000 channels and 70,000 videos related to the war in Ukraine as part of its compliance with Russia’s enforcement efforts. Before being banned itself, Facebook reported that it had
restricted access to 1,723 items in Russia from January to June 2022 due to Roskomnadzor requests (content deemed harmful to minors, extremist, etc.).
The authorities do hold users liable too, meaning anyone using Russian platforms should exercise particular caution in their online speech. Users face criminal and administrative liability for posts that “discredit the army,” for spreading false information about the army (e.g., content about the war in Ukraine that does not align with official viewpoints), for disseminating material from “undesirable” or “extremist” organizations, and for posting “extremist” or other illicit content. For instance, as of last year, 273 criminal cases over “fakes” about the army had been initiated.
This year, the authorities went further by forbidding the purchase of Instagram verification badges, which they deem to be “financing an extremist organization,” and users doing so can
face up to eight years in prison. Russia has also outlawed advertising on banned social platforms. Starting in September, individuals or companies can be fined for running ads on Facebook, Instagram and other banned platforms, even retroactively.
There was a significant change this summer. Before then, end users were not held liable for consuming content; they were accountable only for posting or disseminating it. However, on July 22, the Duma
passed legislation introducing administrative fines, ranging from RUB 3,000 to RUB 5,000, for individuals who deliberately search for or access content deemed extremist (as defined in Russia’s federal extremist register), including via VPNs.
Experts view this as a dangerous trend and the beginning of broader control over individual online behavior. As mentioned, the new law takes effect this month.
Government surveillance and AI monitoring

The Russian government widely uses technology, including AI tools, to track users’ activities on social networks. In late 2022, Roskomnadzor launched an AI-powered system called Oculus. By February 2023, Oculus was operational and could automatically
review over 200,000 images and videos per day for prohibited content. It recognizes banned symbols, images and text, flagging content related to extremism, unsanctioned protests, fake news about the war, drug use, suicide and what Russian authorities deem to be “LGBT propaganda.”
Another Kremlin AI tool is Vepr, an analytics system designed to identify “points of information tension” online. Vepr
scans social media posts for “fake news” about officials and the army, negative opinions of the government, manipulation of public opinion, societal polarization and the “discrediting” of traditional values. It can even predict how such content might spread. Recently, government officials
announced that Vepr can track down threatening content within minutes.
In addition to new AI projects, Russia has strengthened its existing surveillance infrastructure. SORM (System for Operative Investigative Activities), which requires telecom and internet providers to install equipment that gives the security services direct access to user communications, has been
upgraded. It employs deep packet inspection (DPI) to monitor content across networks. The security services do not even need to present search warrants to providers; under SORM, they have direct network access.
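To give a sense of what DPI-style matching involves at its simplest (SORM’s actual implementation is not public, so this is only a rough sketch), the snippet below checks the Host header of a plain-HTTP request against a hypothetical watchlist. The domains and function names are invented for illustration; real systems run on dedicated hardware at line rate and also parse TLS metadata such as the SNI field.

```python
# Rough illustration of DPI-style payload matching, not SORM itself.
# The watchlist below is hypothetical.
WATCHLIST = {"watched-example.org", "another-watched.net"}

def extract_http_host(payload: bytes) -> str | None:
    """Pull the Host header out of a raw plain-HTTP request, if present."""
    text = payload.decode("latin-1")  # latin-1 maps every byte, so decoding never fails
    for line in text.split("\r\n"):
        if line.lower().startswith("host:"):
            return line.split(":", 1)[1].strip().lower()
    return None

def flag_packet(payload: bytes) -> bool:
    """Return True if the payload references a watched host."""
    host = extract_http_host(payload)
    return host is not None and host in WATCHLIST

# A captured request destined for a watched domain would be flagged.
sample = b"GET /news HTTP/1.1\r\nHost: watched-example.org\r\n\r\n"
print(flag_packet(sample))  # True
```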
Additionally, as previously mentioned, the Yarovaya law
requires internet companies to store all user communications, including texts, calls, images and metadata, for at least six months and to hand them over to the authorities upon request.
Furthermore, under the 2019 “Sovereign Runet” law, DPI filters (TSPU devices)
were installed at internet service providers (ISPs), allowing Roskomnadzor to centrally block or throttle online content and even isolate Runet traffic. By early 2023, most major ISPs had reportedly deployed the required filtering, effectively enabling the government to shut down platforms. By late 2024, the state was reportedly
using this toolkit to slow down YouTube nationwide.
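Throttling of this kind can be thought of as deliberate rate limiting applied to traffic that the DPI equipment has classified as belonging to a targeted service. The token-bucket sketch below is only a generic illustration of that idea, not the actual TSPU mechanism, and the byte-rate figures are invented.

```python
import time

class TokenBucket:
    """Simple token-bucket rate limiter: allows at most `rate` bytes/sec
    with bursts up to `capacity` bytes. Illustrative only."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate            # refill speed, bytes per second
        self.capacity = capacity    # maximum burst size, bytes
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, nbytes: int) -> bool:
        now = time.monotonic()
        # Refill tokens for the time elapsed since the last check.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False  # over budget: delay or drop the packet

# Throttle flows classified as "video CDN" to ~128 kB/s (hypothetical figure).
video_limiter = TokenBucket(rate=128_000, capacity=256_000)
print(video_limiter.allow(100_000))  # True: within the allowed burst
print(video_limiter.allow(300_000))  # False: would exceed the budget
```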
Moderation on the platforms

By 2022, the most popular social media platform, VKontakte (VK), had already introduced automatic content filtering according to Roskomnadzor’s blacklist, thereby
blocking all prohibited content. Posts containing keywords related to banned topics, such as criticism of the war or support for gay rights, are quickly deleted by automated filters or fast-acting moderators. Overall, VK cooperates extensively with the authorities. The company has confirmed that it
shares user data, including private chats, with law enforcement upon request.
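As an illustration of how keyword-based auto-moderation of this sort can work in principle, the short sketch below routes posts containing blacklisted terms to deletion. The term list, labels and function names are hypothetical; VK’s actual pipeline is not public and certainly involves far more than exact word matching.

```python
import re

# Hypothetical blacklist of normalized keywords; real lists are maintained
# against Roskomnadzor's registry and are far larger.
BANNED_TERMS = {"banned_term_1", "banned_term_2"}

def normalize(text: str) -> list[str]:
    """Lowercase the post and split it into word tokens."""
    return re.findall(r"\w+", text.lower())

def moderate(post_text: str) -> str:
    """Return 'delete' if a post contains a blacklisted term, otherwise
    'publish'. Real pipelines add manual review, fuzzy matching and
    image/OCR checks on top of this."""
    tokens = set(normalize(post_text))
    return "delete" if tokens & BANNED_TERMS else "publish"

print(moderate("An ordinary post about the weather"))    # publish
print(moderate("A post containing banned_term_1 here"))  # delete
```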
Telegram, the second most popular communication service in Russia, claims to protect user privacy from the Russian state. In 2018, it refused to hand over encryption keys to security services, leading to a two-year ban that the government failed to enforce effectively and later
lifted. By the time the war in Ukraine began, Telegram was fully accessible. The company
claims that it has never shared any user data with the Russian authorities. However, in 2025, journalistic investigations revealed that Telegram’s network in Russia may have been
compromised by entities tied to the security services.
Telegram itself has resisted pressure to moderate content. Unlike VK, it has generally not removed opposition channels; in fact, many “undesirable” or banned organizations still run Telegram channels that reach Russian audiences.
TikTok has responded to Russia’s “fake news” crackdown by isolating its Russian users, cutting them off from foreign content while still allowing state-aligned, pro-government videos to slip through gaps in its policy enforcement.
As of 2025, Russia has achieved near-total control of its social media landscape through surveillance tools, AI-driven censorship, strict enforcement of laws, and platform-level moderation. The trend is one of steadily increasing government control over platforms, content and individual online behavior.