A Crash Course in Russian Information Warfare

Michael Clark

June 11, 2024

In the latest installment in a series of virtual wine tasting events, ExtraHop hosted Dr. Bilyana Lilly, CISSP, for a talk on Russian information warfare and the impacts U.S. companies can expect. Dr. Lilly is the Chair of the Resilience Track for the Warsaw Security Forum, author of Russian Information Warfare, and a firsthand witness to Russian information warfare tactics in Bulgaria during her home country’s transition to democracy.

What Is Russian Information Warfare?

According to Dr. Lilly, the Russian government defines information warfare as “a confrontation between two or more states in the information space.” The information space is what we refer to as cyberspace in the West.

Information warfare is part and parcel of Russia's cyber warfare doctrine and has several objectives. The most obvious is to inflict damage on information systems, processes, and resources. The other goals are more insidious: the Russian government aims to undermine the political, economic, and social systems of its adversaries, conduct widespread psychological manipulation of target populations to destabilize the states and societies they live in, and coerce opposing states into making decisions that benefit Russia.

Describing Russian Cyber Tactics with the CHAOS Model

The Russian state achieves its goals through both cyber operations and strategic messaging. Dr. Lilly created the CHAOS model to describe the government’s multi-pronged approach.

  • C - Cyber: Refers to the cyber operations we think of immediately, like distributed denial of service (DDoS) attacks, website defacement, and other disruptive cyber campaigns conducted by the Russian Federal Security Service (FSB), Foreign Intelligence Service (SVR), military intelligence directorate (GRU), hacktivists, and more. The Russian government appears to have a tacit agreement that cybercriminals can act without fear of prosecution so long as they target enemies of the Russian state.
  • H - Hype: Refers to the volume of Kremlin-controlled media coverage directed at the target country. This includes state-run media outlets, like RT (formerly Russia Today) and Sputnik, as well as so-called troll farms: groups of paid bloggers, not necessarily located in Russia, who spread information that advances Russia's goals. The specific topics of the fake news stories Russia disseminates matter less than their effect, which is to undermine people's confidence in their government and its decision making. Media coverage plays a crucial role in Russian disinformation campaigns.
  • AOS - Associated Operations: Include other actions that help Russia achieve its goals. For example, Russia is known to sponsor protests on both sides of the political spectrum that can inflame public sentiment, create divisiveness, and turn notable public events to Russia's benefit via strategic messaging (including disinformation campaigns). Russia has also been tied to coup and assassination attempts, explosions, and more.

The Russian state uses all of these tactics during “peacetime.” During times of declared war, Dr. Lilly says, Russia uses these tactics to weaken its targets before conducting kinetic operations.

For example, prior to its invasion of Ukraine, Russia targeted Ukrainian communication and command and control structures with defacement and DDoS attacks. During the invasion, Russia launched cyberattacks against internet service providers. Russia also leveraged an extensive disinformation campaign, which included deepfakes of Ukrainian President Volodymyr Zelenskyy, to intimidate and confuse the Ukrainian populace.

Information Warfare Against the U.S., Past and Present

Dr. Lilly says Russian actors are known to reuse successful tactics, like those they used during the 2016 presidential elections. Since we’re in another election year, it stands to reason that Russian cyber activities will be similar.

Dr. Lilly disagrees with the sentiment common in 2016 that the U.S. election interference campaign at the time was unprecedented—Russia has used the same playbook in campaigns across Eastern Europe. The first stage in this sort of information warfare is old-fashioned spying. Russia needed to understand the intricacies of U.S. elections, like how the electoral college works and which states are “swing states,” where targeted pressure can have a bigger impact.

Once Russian spies had gathered enough intelligence, other Russian actors began exploiting social media to spread strategic messaging designed to divide and polarize U.S. voters. Troll farms pushed polarized rhetoric on a variety of topics, like gun control, abortion, and police violence. These trolls posed as Americans and used dis- and misinformation to spread extremist views on both sides of the political spectrum. The goal was to divide the U.S. and make consensus on any of these issues impossible, rather than achieve a particular policy outcome.

This tactic helped clarify Russia’s ultimate goal: not for a certain president to win, but to make democracy impossible in the U.S. The ideal outcome for Russia is a homegrown coup, or at the very least, the erosion of American democracy from within.

These social media campaigns coincided with cyber actions and media coverage, the most notable example of which was the hack of the Democratic National Committee (DNC). Notably, the hackers didn't immediately release the emails and other information they stole. Instead, they timed the release to coincide with a media campaign during the Democratic National Convention, which led to the resignation of Debbie Wasserman Schultz, then-chair of the DNC, and arguably created chaos for the Democratic Party. Twelve GRU officers were indicted in 2018 for their infiltration of DNC computer networks and theft of email content.

Russia also sponsored protests during and after the 2016 election. In fact, Dr. Lilly says, protests and inflammatory news coverage actually increased after the election in order to take advantage of angry voters.

The good news? We know their playbook already, says Dr. Lilly. And, as she discussed in a talk at RSA Conference 2024, the ongoing war in Ukraine makes interfering in U.S. politics more difficult. For one, Russia has experienced significant brain drain and lost access to necessary software and hardware since the war began. For another, the people and resources that remain are focused on winning the war.

However, those constraints are balanced by two factors. First, Russia has a deep desire to retaliate against the U.S. for its support of Ukraine, Dr. Lilly says. Second, artificial intelligence and machine learning have made deepfakes an extremely cost-effective tactic. A deepfake robocall impersonating Joe Biden during the 2024 New Hampshire primary, for instance, reportedly cost only $500 to create. What it comes down to, according to Dr. Lilly, is Russian threat actors finding the time and the kompromat that Americans will believe.

Mitigating the Flow of Disinformation

To combat the impact of disinformation and misinformation campaigns, we should all think critically about the media we consume and share, even when it seems to confirm our preexisting beliefs. The Cybersecurity and Infrastructure Security Agency (CISA) shares the following tips when viewing content in any form:

  • Recognize the risk: Foreign actors frequently build their audiences by sharing entertaining, non-controversial content. Eventually, they begin to mix in disinformation and steer followers to more extreme positions. The same actor will use this tactic in multiple groups to pit them against each other.
  • Question the source: Foreign influence content is designed to look like real news. Before accepting what you read as truth, check who produced the content and question their intent. Verify that the author is real and qualified to cover the topic. Was the content published recently? Outdated information may be misleading. Be wary of clickbait headlines designed to make you feel strong emotions and make sure the content of the article backs up the headline’s claims. Evaluate the author’s argument: is the content made up of facts or opinions? Does the author support their argument with evidence and cite their sources?
  • Investigate the issue: Being well informed means getting information from a variety of places. Before sharing controversial or emotionally charged content, search for other reliable sources that confirm the information. If the content isn’t from a credible source or you can’t find a second reliable source, don’t share it.
  • Think before you link: Disinformation is designed to make you feel a certain way—shocked, angry, or smug. In the moment, it may feel righteous or necessary to share the content, but before you do, take a moment to let your emotions cool. Ask yourself why you’re sharing something. Sometimes taking no action is the best way to improve a discussion and thwart disinformation.
  • Talk to your circle: CISA says it’s probably not worth engaging with every piece of disinformation you see online, but offers tips for speaking with friends and family. Make sure you have the latest evidence prepared before having a conversation about disinformation. Ask yourself if weighing in on a particular post will help the conversation or cause more conflict. If you decide to respond, do so privately. Discussions can become dramatic with an audience, while a private conversation could be more constructive. Repeating a false claim only amplifies it, even when debunking it, so focus on the facts. Think “the sky is blue,” rather than “the sky is not green.” Try to make the person you’re speaking with feel heard and understood, and they’ll be more likely to listen.

How Network Visibility Can Help

RevealX detects over 100 techniques across 12 of the 14 tactics outlined in the MITRE ATT&CK framework, 78 of which are known to be used by threat actors linked to the SVR, FSB, and GRU, including APT28, APT29, Turla, Sandworm Team, and Gamaredon Group.

In previous campaigns, Russia-linked threat actors have gained initial access by brute forcing (T1110) service accounts or targeting valid accounts (T1078) belonging to users who no longer work at a victim organization. Russia-linked threat actors also use residential proxies (T1090) and protocol tunneling (T1572), and they communicate over encrypted channels (T1573), application layer protocols (T1071), non-application layer protocols (T1095), or non-standard ports (T1571) to hide their traffic or make it appear less suspicious to security teams. RevealX detects all of these techniques, and more.
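To make the network-visibility angle concrete, here is a minimal, hypothetical sketch of two heuristics of the kind such detections build on: counting failed logins per source and account to surface brute forcing (T1110), and checking whether an inferred application protocol is running on an unexpected port (T1571). The record shape, thresholds, function names, and port table are illustrative assumptions, not RevealX's actual data model or detection logic.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class AuthEvent:
    """Hypothetical authentication record derived from observed network traffic."""
    src_ip: str
    account: str
    success: bool
    timestamp: float  # seconds since epoch

# Tunable assumptions for illustration, not vendor thresholds.
WINDOW_SECONDS = 300
FAILURE_THRESHOLD = 20


def detect_brute_force(events: list[AuthEvent]) -> set[tuple[str, str]]:
    """Flag (src_ip, account) pairs with FAILURE_THRESHOLD or more failed
    logins inside any WINDOW_SECONDS span (cf. brute force, T1110)."""
    failures: dict[tuple[str, str], list[float]] = defaultdict(list)
    for e in events:
        if not e.success:
            failures[(e.src_ip, e.account)].append(e.timestamp)

    flagged = set()
    for key, times in failures.items():
        times.sort()
        lo = 0
        for hi in range(len(times)):
            # Slide the left edge until the window spans at most WINDOW_SECONDS.
            while times[hi] - times[lo] > WINDOW_SECONDS:
                lo += 1
            if hi - lo + 1 >= FAILURE_THRESHOLD:
                flagged.add(key)
                break
    return flagged


# Illustrative protocol-to-port expectations (cf. non-standard port, T1571).
EXPECTED_PORTS = {"HTTP": {80, 8080}, "TLS": {443}, "DNS": {53}, "SSH": {22}}


def is_nonstandard_port(protocol: str, dst_port: int) -> bool:
    """True when the inferred application protocol runs on an unexpected port."""
    expected = EXPECTED_PORTS.get(protocol)
    return expected is not None and dst_port not in expected


if __name__ == "__main__":
    events = [AuthEvent("10.0.0.5", "svc-backup", False, t) for t in range(25)]
    print(detect_brute_force(events))        # {('10.0.0.5', 'svc-backup')}
    print(is_nonstandard_port("SSH", 4444))  # True
```

In practice, an NDR product derives these signals from full protocol parsing and learned behavioral baselines rather than fixed thresholds, but the inputs, such as who authenticated, from where, and over what protocol and port, are the same kind of network-derived evidence.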

In the current geopolitical climate, cybersecurity is national security. ExtraHop is here to help.

Missed the webinar? Watch a full recording here.

How is your organization preparing for increasing Russian cyber activity? Join the conversation on the ExtraHop Customer Community.
