How to Stop Losing the Information Wars

On Oct. 14, Facebook and Twitter made the decision to remove a dubious New York Post story from their platforms, provoking heated debate in the internet's various echo chambers. The article in question purportedly revealed influence peddling by Democratic presidential nominee Joe Biden's son Hunter Biden, and the social media giants suspected that the uncorroborated claims were based on hacked or fabricated correspondence. Weeks before the U.S. presidential election, Silicon Valley's swift and decisive action against disinformation stood in stark contrast to its handling of hacked emails from Hillary Clinton's presidential campaign four years earlier.

A week earlier, on Oct. 7, the U.S. Justice Department announced that it had seized nearly 100 websites linked to Iran's Islamic Revolutionary Guard Corps (IRGC). These sites had been engaged in a global disinformation campaign, targeting audiences from the United States to Southeast Asia with pro-Iranian propaganda. But it wasn't just the government countering adversaries online: One day later, Facebook and Twitter reported that they had taken down more than a dozen disinformation networks used by political and state-backed groups in Iran, Russia, Cuba, Saudi Arabia, and Thailand.

In the grand scheme of things, the events of Oct. 7 and 14 were hardly noteworthy. In recent years, private and public actors alike have had to ramp up their efforts against botnets, troll farms, and artificial intelligence systems that seek to manipulate the online information environment and advance certain strategic objectives. These actors came under unprecedented scrutiny in the aftermath of the 2016 U.S. presidential election.

But while cyberspace may be a new front in the fight against disinformation, disinformation itself, as well as the societal discord it can sow, has been a national security concern for decades; the Cold War was largely waged by propagating competing versions of the truth. And just as the threat of fake news is nothing new, neither is the way policymakers deal with it, or try to.

Therein lies the real problem. In countering disinformation emanating from the Kremlin, the Chinese Communist Party (CCP), and the IRGC, among others, the United States continues to rely on the same dated playbook that led to success against Soviet propaganda operations, known as active measures, in the 1980s. But this anti-disinformation strategy, like much else developed in that decade, has been rendered largely obsolete by an evolving media landscape and emerging technology.

Now, if the United States is going to have any hope of getting back on its front foot, and putting a stop to adversaries' attempts to sow confusion and cynicism domestically, it will have to seriously reconceive its old playbook. But that can't be done without Big Tech companies, which are the linchpin in the fight against disinformation.

Granted, some reconciliation between the state and its citizens is needed to mend the fraught ties of the post-Snowden era. In 2013, the whistleblower Edward Snowden leaked documents exposing widespread cooperation between U.S. technology companies and the National Security Agency, triggering a backlash from technology companies and the public, who lamented the lack of personal privacy protections on the internet.

Since then, the chasm between Silicon Valley and the U.S. national security community has only widened, but there are signs that the tide may be turning: Companies like Facebook, Twitter, and Google are increasingly working with U.S. defense agencies to educate future software engineers, cybersecurity experts, and scientists. Eventually, once public-private trust is restored, the U.S. government and Silicon Valley can forge a united front to take on fake news effectively.

Disinformation crept onto the national security radar just as Ronald Reagan assumed the presidency in early 1981. After the CIA was publicly disgraced during the Church Committee hearings, which exposed the agency's controversial (and in some cases illegal) intelligence gathering and covert action against foreign leaders and U.S. citizens alike, Reagan recruited William Casey to revamp the agency. On moving into his seventh-floor office at Langley, Casey, a known hawk, was dismayed to learn that the CIA was collecting almost no information on Soviet active measures, and doing even less to counter them.

Casey reorganized key offices within the CIA's Directorate of Intelligence to focus on better understanding Soviet active measures and instructed the Directorate of Operations to ramp up its collection of classified intelligence on Soviet propaganda. By mid-1981, the scale of the Soviets' efforts had become clear. In an August 1981 speech on Soviet disinformation campaigns against NATO, Reagan revealed that the Soviet Union had spent around $100 million to sow confusion in Western Europe after NATO developed the neutron warhead in 1979.

Of Moscow's latest efforts, Reagan said he didn't know "how much they're spending now, but they're starting the same kind of propaganda drive," which included funding front groups, manipulating media, engaging in forgery, and buying agents of influence. In 1983, for example, Patriot, a pro-Soviet Indian newspaper, published a story claiming that the U.S. military had created HIV and released it as a biological weapon. Over the next four years, the story was republished dozens of times and rebroadcast in more than 80 countries and 30 languages.

By 1982, the CIA estimated that Moscow was spending $3 billion to $4 billion annually on global propaganda efforts. The Soviet Politburo and Secretariat of the Communist Party, which directed the active measures, made no major distinction between covert action and diplomacy; to the Kremlin, disinformation was a tool to advance the strategic goals of the Soviet Union in its competition with the West.

With the nation fixated on Soviet propaganda, senior leaders from across the Reagan administration came together to form what came to be called the Active Measures Working Group. Led by the State Department, and including representatives from the CIA, FBI, Defense Intelligence Agency, and the Defense and Justice departments, the national security bureaucracy quickly went on the offensive. Through the end of the Cold War, the group was effective not only in raising global awareness of Soviet propaganda efforts but also in undermining their efficacy. In fact, U.S. anti-disinformation campaigns were so successful that Soviet leader Mikhail Gorbachev in 1987 instructed the KGB to scale back its propaganda operations.

Clearly, those days are long gone. In stark contrast to the triumphs of the 1980s, the United States since the turn of the century has largely failed to counter disinformation campaigns by geostrategic competitors like Russia, China, and Iran.

The opening salvo of a new, digitized phase of state-level competition for influence came in 2014, when Russia seized Crimea from Ukraine. As he moved troops to the strategic Black Sea outpost, Russian President Vladimir Putin publicly claimed that the forces occupying Crimea could not possibly be Russian special forces, lying outright to the global community. In the years since, the Kremlin's disinformation campaigns have increased in volume, velocity, and variety. Today, state-level actors including Russia, China, Cuba, Saudi Arabia, and North Korea employ armies of trolls and bots to flood the internet with false, misleading, or conspiratorial content designed to undermine Western democracy.

If Washington is still fighting the same enemy, then what went wrong?

The United States' counter-disinformation playbook has been predicated on two unspoken assumptions, neither of which holds today: first, that shining a light on lies and disinformation through official government communications is an effective tactic; and second, that Washington can keep up with the speed and scale of disinformation campaigns. In fact, debunking efforts by government officials do little to discredit propaganda, and the volume of threats vastly exceeds the U.S. government's ability to identify and counter them. Both assumptions take U.S. credibility and technological prowess for granted, and neither is a given.

Broadly speaking, three factors have changed the disinformation game since the 1980s and rendered obsolete the assumptions that formed the bedrock of the United States' campaign against Soviet active measures. First, the global media environment has become far more complex. Whereas in the 1980s most citizens got their news from a handful of print and broadcast outlets, today world events are covered instantaneously by a tapestry of sources, including social media, cable news, and traditional news channels and publications.

Second, U.S. adversaries have relied on bots to amplify fringe content and employed trolls to generate fake content in pursuit of their strategic objectives. Finally, rising political polarization has accelerated consumers' drive toward partisan echo chambers while deepening their suspicion of government leaders and expert voices. Against such a backdrop, the Active Measures Working Group, a relic of simpler times, can no longer succeed.

Indeed, in the early days of the coronavirus pandemic, U.S. efforts to stem Chinese disinformation about COVID-19 backfired; Beijing's disinformation campaigns accelerated between March and May. By June, Twitter reported that it had removed 23,750 accounts created by the Chinese government to criticize protests in Hong Kong and to extol the CCP's response to COVID-19.

To complicate matters further, the one anti-disinformation campaign in which the United States has succeeded in recent years is hardly a generalizable case. The U.S.-led Operation Gallant Phoenix, aimed at the Islamic State, steadily eroded the group's legitimacy by undermining its propaganda machine. From a multinational headquarters in Jordan, the coalition flooded the internet with anti-Islamic State content and hobbled the group's ability to broadcast its message globally.

But a campaign against the Islamic State is far from a viable blueprint for countering Russian, Chinese, and Iranian disinformation. The international community, private sector tech firms included, shares a broad consensus that the Islamic State must be defeated. No such political harmony exists, for example, on how, or even whether, to forcefully counter Chinese-led disinformation efforts related to COVID-19.

It's clear that the United States is losing the information wars, in part because of a lack of innovation among key stakeholders in the executive branch. But not all is lost. The next administration can make the United States a viable competitor in the global information wars by developing a comprehensive counter-disinformation strategy built on three pillars.

Before any decisive counter-disinformation strategy can be formulated, key constituencies will need to reach some consensus about data ethics. A commission staffed by leaders from the executive branch and media organizations must first draft a set of first principles for how data should be treated in an open and fair society; philosophical rifts like the one between Twitter CEO Jack Dorsey and Facebook CEO Mark Zuckerberg over the role of speech need to be overcome. Any effective campaign in pursuit of the truth requires guiding principles that inform which types of speech should be permitted in digital town squares and when speech should be fact-checked or, in extreme cases, removed entirely.

Once first principles are established, the White House can erect a policy framework to guide defensive actions and appropriate resourcing for countering foreign disinformation campaigns. In the spirit of the Active Measures Working Group, an effective counter-disinformation strategy will require a whole-of-government approach, likely anchored by the State Department and supported by the Pentagon, the intelligence community, and other key stakeholders.

Finally, though the U.S. government can and should do much more to counter disinformation campaigns, it should be clear-eyed about the fact that its ability to shape the information environment has eroded since the 1980s. A comprehensive counter-disinformation strategy should recognize the limits of government action given the speed and scale with which information moves across social media today.

Thus, it's important to nest government-led counter-disinformation activities within a broader set of actions driven by the private sector. Playing the role of coordinator, the United States should encourage the creation of a fact-checking clearinghouse among social media platforms to rapidly counter suspected disinformation. Indeed, Facebook and Twitter have already begun adding fact-check labels to potentially false or misleading posts, to the ire of Donald Trump. This effort should be encouraged and expanded to operate at the speed and scale with which content is generated and disseminated across social media.

The government could also use innovative investment pathways such as the Defense Innovation Unit or the Joint Artificial Intelligence Center to incubate new AI technologies that media platforms could use to spot deepfakes, which can be used to create fake videos, fabricated images, and synthetic text. Deepfakes are rapidly becoming an inexpensive, fast, and effective means by which actors can wage irregular warfare against their adversaries.

Regardless of the precise form it takes, the future incarnation of the Active Measures Working Group should recruit Silicon Valley leaders not only to help co-lead the initiative but also to staff other key posts across the executive branch. In the end, the pathway to U.S. preeminence requires mobilizing the country's unique assets: its ability to innovate, to marshal resources at scale, and to come together in times of distress, as it did after 9/11. Only a response marked by bipartisanship within government, as well as strong partnerships with actors outside of it, can give the United States the reality check it desperately needs.
