Richard Stengel was a firsthand witness to the disinformation campaign launched by Russia that accompanied its annexation of Crimea from Ukraine in 2014.
Stengel, former editor of Time magazine and undersecretary of state for public diplomacy and public affairs during the Obama administration, says the methods Russia used then were mirrored two years later in the 2016 U.S. presidential election. In the latter case, he says, the U.S. was caught flat-footed. Will the country be able to combat the expected onslaught in 2020?
The potential answer to that question is in Stengel’s new book “Information Wars: How We Lost the Global Battle Against Disinformation and What We Can Do About It.”
Stengel joins “Chicago Tonight” in conversation.
Below, the introduction to “Information Wars.”
The first thing you notice when you walk into the White House Situation Room is how cramped and stuffy it is. There’s so little space that if people are already sitting at the table, you have to slowly snake your way in between them like you’re taking a seat in the middle of a row in a crowded movie theater. Excuse me . . . Pardon me . . . Sorry. And try not to bump the National Security Advisor. For some reason, the air-conditioning doesn’t work all that well, so it can get pretty fragrant. And unless you’re the President of the United States, every guy keeps his suit jacket on and his tie tightened.
It was early in 2014, and it was my first time in the room with President Obama. I was the new Under Secretary of State for Public Diplomacy. He was in shirtsleeves and came in without greeting anyone—focused, intense, all business. I had known President Obama when I was a journalist and had that chummy, jokey rapport with him that journalists and politicians cultivate. But this was a side of him that I had never seen before.
The meeting was about the role of international broadcasting, which was part of my brief at the State Department. International broadcasting meant the legacy organizations that were better known during the Cold War: Voice of America, Radio Free Europe, Radio Liberty. You may not pay attention to them anymore, but they still have a $750 million budget—a nontrivial number even to the federal government. Ben Rhodes, the President’s deputy national security advisor, sketched out the topic and then called on me. I started to lay out all the traditional stuff that these entities were doing, and I could see the President was impatient. “I caught the pass, Rick,” he said without a smile. Hmm. In a nanosecond, I pulled back to 30,000 feet and said, well, the real problem was that we were in the middle of a global information war that was going on every minute of the day all around the world and we were losing it.
Then, a different response from the head of the table. “Okay,” the President said, “what do we do about it?”
That is the question. There is indeed an information war going on all around the world and it’s taking place at the speed of light. Governments and non-state actors and individuals are creating and spreading narratives that have nothing to do with reality. Those false and misleading narratives undermine democracy and the ability of free people to make intelligent choices. The audience is anyone with access to a computer or a smartphone—about four billion people. The players in this conflict are assisted by the big social media platforms, which benefit just as much from the sharing of content that is false as content that is true. Popularity is the measure they care about, not accuracy or truthfulness. Studies show that a majority of Americans can recall seeing at least one false story leading up to the 2016 election. This rise in disinformation—often accompanied in authoritarian states by crackdowns on free speech—is a threat to democracy at home and abroad. More than any other system, democracies depend on the free flow of information and open debate. That’s how we make our choices. As Thomas Jefferson said, information is the foundation of democracy. He meant factual information.
Disinformation is as old as humanity. When the serpent told Eve that nothing would happen if she ate the apple, that was disinformation. But today, spreading lies has never been easier. On social media, there are no barriers to entry and there are no gatekeepers. There is no fact-checking, no editors, no publishers; you are your own publisher. Anyone can sign up for Facebook or Twitter and create any number of personas, which is what troll armies do. These trolls use the same behavioral and information tools supplied by Facebook and Google and Twitter to put poison on those platforms and reach a targeted, receptive audience. And it’s just as easy to share something false as something that’s factual.
One reason for the rise in global disinformation is that waging an information war is a lot cheaper than buying tanks and Tridents, and the return on investment is higher. Today, the selfie is mightier than the sword. It is asymmetric warfare requiring only computers and smartphones and an army of trolls and bots. You don’t even have to win; you succeed if you simply muddy the waters. It’s far easier to create confusion than clarity. There is no information dominance in an information war. There is no unipolar information superpower. These days, offensive technologies are cheaper and more effective than defensive ones. Information war works for small powers against large ones, and large powers against small ones; it works for states and for non-state actors—it’s the great leveler. Not everyone can afford an F-35, but anyone can launch a tweet.
Why does disinformation work? Well, disinformation almost always hits its target because the target—you, me, everyone—rises up to meet it. We ask for it. Social scientists call this confirmation bias. We seek out information that confirms our beliefs. Disinformation sticks because it fits into our mental map of how the world works. The internet is the greatest delivery system for confirmation bias in history.
The analytical and behavioral tools of the web are built to give us information we agree with. If Google and Facebook see that you like the Golden State Warriors, they will give you more Steph Curry. If you buy an antiwrinkle face cream, they will give you a lot more information about moisturizers. If you like Rachel Maddow or Tucker Carlson, the algorithm will give you content that reflects your political persuasion. What it won’t do is give you content that questions your beliefs.
So, what do we do about it?
First, let’s face it, democracies are not very good at combating disinformation. I found this out firsthand at the State Department, where the only public-facing entities in government that countered ISIS messaging and Russian disinformation reported to me. While autocracies demand a single point of view, democracies thrive on the marketplace of ideas. We like to argue. We like a diversity of opinion. We’re open to different convictions and theories, and that includes bad and false ones. In fact, we protect them. Justice Oliver Wendell Holmes famously argued that the First Amendment protects “the thought that we hate.” And frankly, that’s a handicap when it comes to responding to disinformation. It’s just not in our DNA as Americans to censor what we disagree with. “The spirit of liberty,” said Learned Hand, “is the spirit which is not too sure that it is right.”
Disinformation is especially hard for us to fight because our adversaries use our strengths—our openness, our free press, our commitment to free speech—against us. Our foes use free media just like political candidates do. They understand that our press’s reflex toward balance and “fairness” allows them to get their own destructive ideas into our information ecosystem. Vladimir Putin knows that if he says the sun revolves around the earth, CNN will report his claim and find an expert who will disagree with it—and maybe one who supports it just to round out the panel. This quest for balance is a journalistic trap that Putin and ISIS and the disinformationists exploit. In a fundamental way, they win when an accepted fact is thrown open for debate. Treating both sides of an argument as equal when one side is demonstrably false is not fair or balanced—it’s just wrong. As I used to tell the foreign service officers who were working to counter disinformation, “There aren’t two sides to a lie.”
What is perhaps most disturbing is that disinformation erodes our trust in public discourse and the democratic process. Whether it’s Mr. Putin or ISIS or China or Donald Trump, they want you to question not only the information that you are getting but also the means through which you get it. They love the stories in Western media about information overload and how social media is poisoning the minds of young people. Why? Because they see us questioning the reliability of the information we get, and that undermines democracy. They want people to see empirical facts as an elitist conspiracy. Social media was a godsend to their disinformation efforts. On Facebook and Twitter and Instagram, information is delivered to you by third parties—friends, family, celebrities—and those companies don’t make any guarantee about the veracity of what you’re getting. They can’t; it’s their economic model. And your friends are not exactly the best judge of what’s fact and what’s not. Under the law, these companies are not considered publishers, so they are not responsible for the truth or falsity of the content they are delivering to you. That is a mistake. They are the biggest publishers in history.
Not that long ago, the internet and social media were seen as democratizing and emancipating. The idea was that universal access to information would undermine authoritarian leaders and states. In many cases, it does. But autocrats and authoritarian governments have adapted. They have gone from fearing the flow of information to exploiting it. They understand that the same tools that spread democracy can engineer its undoing. Autocrats can spread disinformation and curtail the flow of accurate information at the same time. That’s a dangerous combination for the future of democracy.
This challenge is different from those we’ve faced before. It is not a conventional military threat to our survival as a nation, but it is an unconventional threat to our system of beliefs and how we define ourselves. How do we fight back without changing who we are?
As you will see, I don’t believe government is the answer. In a democracy, government is singularly bad at combating disinformation. That’s in part because most of those we are trying to persuade already distrust it. But it’s also not good at creating content that people care about. That’s not really government’s job. Early on at the State Department, I said to an old media friend, “People just don’t like government content.” He laughed and said, “No, people just don’t like bad content.”
This is not a policy book, though there is policy in it. It’s not a traditional memoir, though the book is in the first person. It’s not journalism, though I’ve tried to use all the skills I learned over a career as a journalist. Is it history? Well, it’s somewhere between the whirlwind of current reporting and what we once called history. But with today’s accelerated news cycle, where memoirs come out a few months after the actions they describe, it’s more like history as the Greeks saw it, a narrative about the recent past that provides perspective on the present. It’s the story of the rise of a global information war that is a threat to democracy and to America—a story that I tell through my own eyes and experiences at the State Department.
I spent a little under three years at State during President Obama’s second term, from early 2014 to the end of 2016. I came to it after seven years as the editor of Time and a lifetime as a journalist. As head of Time, I used to say my job was to explain America to the world, and the world to America. That’s not a bad definition of my job at State. I brought other experience with me as well. I spent three years working with Nelson Mandela on his autobiography. I was the head of the National Constitution Center in Philadelphia. The official description of my job at the State Department was to support U.S. foreign policy goals by informing and influencing international audiences. Some people called it being “propagandist in chief,” but I liked to say that I was the chief marketing officer of brand America.
The story is not a view from the top. Despite that opening anecdote, I was not in the Oval Office conferring with President Obama on key decisions. But it’s not a view from the bottom either; I was the number-five-ranked person at the State Department. In the grand scheme of things, the Under Secretary for Public Diplomacy isn’t a big deal, but the job is not a bad vantage point from which to tell this particular story. No, I couldn’t see everything that the President or the Secretary of State saw. But in government, it’s harder to see below you than above you. While I missed a lot of what those below me saw, I saw a lot of what those above me missed.
There’s a lot in the book about how government and the State Department work. I found government too big, too slow, too bureaucratic. It constantly gets in its own way. And sometimes that’s not a bad thing. Like, now. I used to joke with my conservative friends that they should be in favor of big government because big government gets nothing done. But at the same time, I came to realize that the only people who could really fix government are those who understand it best. The dream of an outsider coming in to reform government is just that—a dream. This also bears repeating: I found that the overwhelming number of people in government are there for the right reasons—to try to make things better. To work for the American people. To protect and defend the Constitution. They are true public servants. Even when I grew frustrated, I never doubted that.
The rap on me in government was that I saw every problem as a communications problem. I wouldn’t say this was quite true, but I saw that communication was a critical part of every problem. And that not thinking about and planning for how to communicate something generally made the problem worse. And you know who else saw it that way? ISIS and Vladimir Putin and Donald Trump. For all three of them, communications—what we in government called messaging—was not a tactic but a core strategy. They all understood that the media cycle moves a lot faster than the policy cycle, and policy would forever play catch-up. They knew that it was almost always better to be first and false than second and true. One problem with the U.S. government is that we didn’t really get that; we saw messaging as an afterthought.
Even though my position had enormous range—covering educational and cultural exchanges as well as public affairs—I ended up focusing on two things: countering ISIS’s messaging and countering Russian disinformation. Before I went into government, smart people told me to find a few things to concentrate on and not to worry about the rest. As it turned out, I felt like these two issues found me. History happened, I jumped in, and I worked on them to the exclusion of almost everything else. Both involved a global trend: the weaponization of information and grievance. ISIS perfected a form of information warfare that weaponized the grievances of millions of Sunni Muslims who felt spurned by the West and by their own leaders. Russia spent decades developing its own system of information warfare, which helped Putin weaponize the grievances of Russians who felt a sense of loss at the fall of the Soviet Union. In fact, our word “disinformation” is taken from the Russian dezinformatsiya, which was reportedly coined by Stalin. Both ISIS and Russia saw and depicted America as a place riven by hypocrisy, racism, and prejudice, and the primary source of global injustice. This book’s narrative is chronological, and the story rotates back and forth between Russia and ISIS, a structure that reflects the reality of my job. I tell the story in real time with the knowledge I had at the time.
And then, two-thirds of the way through my time fighting these battles, Donald Trump entered the American presidential race, and it felt like everything suddenly connected. The information battles we were fighting far away had come home. Trump employed the same techniques of disinformation as the Russians and much the same scare tactics as ISIS. Russian propagandists had been calling Western media “fake news” long before Donald Trump. The Russian disinformation techniques we saw around the annexation of Crimea and the invasion of Ukraine were transposed to the American election space. Only this time, they were done in English—pretty poor English mostly—not Russian. For ISIS, Trump’s candidacy confirmed all that they had been saying about the Islamophobia of the United States and the West. Trump’s “Muslim ban” was propaganda gold for ISIS. All three of them—ISIS, Putin, and Trump—weaponized the grievances of people who felt left out by modernity and globalization. In fact, they used the same playbook: ISIS sought to Make Islam Great Again; Putin yearned to Make Russia Great Again; and we know about Mr. Trump. The weaponization of grievance is the unified field theory behind the rise of nationalism and right-wing strongmen.
I found that there was a malign chain of cause and effect among the three. In fighting Assad and seizing territory in Syria, ISIS helped create an exodus of Syrian refugees, millions of whom made their way to Europe. Putin’s indiscriminate bombing in Syria accelerated that mass relocation. Then Russia, through disinformation, helped weaponize the idea of immigration by stoking fears of refugees and terrorism. And along came Donald Trump, who made the fear of immigration a central part of his campaign.
I see that very clearly now, but did I see it then? Not really. Did anyone in the U.S. government see it? I’m not sure. If people did see it, they didn’t talk about it, and not much was done about it. I’m not sure how much we could have done anyway.
Every scene in the book is designed to show how both Russia and ISIS weaponized information and grievance; how Russian disinformation entered the American election; how Donald Trump weaponized grievance and used many of the same techniques and strategies as Russia and ISIS did; how government isn’t much good at responding to a threat like this. In many ways, the fight against ISIS’s messaging looks like a success story. We actually did a fair amount, and ISIS went from seeming omnipresent on social media to being confined to the dark web. But the truth is, I don’t know that what we did made any difference. Crushing ISIS militarily had a heck of a bigger effect than dueling with tweets. As I used to tell my military colleagues, losing a city to ISIS sends a terrible message, but taking a city is the best message of all. Ultimately, it’s not a military fight; it’s a battle of ideas between Islamic extremists and the much larger audience of mainstream Muslims. ISIS was always more of an idea than a state, and that idea is far from dead.
The fight against Russian disinformation was murkier. It was difficult to get started, didn’t gain much traction, and then mostly faded away. Combating Russian disinformation was harder than countering ISIS in part because everyone agreed that ISIS was an irredeemable enemy, while lots of people at State and the White House were ambivalent about hitting back at Russia. Some of that hesitance came from people who didn’t think it was the government’s job to counter any kind of disinformation, which is a fair point. Some of it came from people who thought that countering Russia’s message only made things worse. And some came from people who felt that it was more effective to treat Russia as a fellow superpower (even though it was not) than as a fading regional player.
But the scale of Russian disinformation was beyond what we were capable of responding to. The Russians had the big battalions; we had a reluctant, ragtag guerrilla force. They also had the element of surprise. A few old Cold Warriors might have seen it coming, but mostly we did not. It hadn’t been all that long since the 2012 election, when people had mocked Mitt Romney for saying that a revanchist Russia was our number one geopolitical foe. Frankly, it’s not that they were so sophisticated, it’s that we were so credulous. The Global Engagement Center, created during my final year and designed to be a centralized hub for countering all kinds of disinformation, is potentially a powerful weapon in this fight.
Finally, when it came to countering Donald Trump’s disinformation, we were pretty much paralyzed. No one wanted to do that. Let me correct that: plenty of people wanted to do it, but almost no one thought it was practical or right or legal to do so. Moreover, everyone at the White House and at the State Department thought, Well, Hillary is going to win, and the White House really didn’t want it to look like we were putting our finger on the scale. After all, the Russians and Trump were preparing to question the integrity of the election when Trump lost. No one wanted to give them any evidence they could use to say the election was rigged, which is precisely what they would have done.
For the first six weeks after Donald Trump entered the race in June 2015, Russia did almost nothing to support him. The Russians seemed as bewildered as the rest of us at what he was doing. They were always and resolutely anti-Hillary, but it took them a while to become pro-Trump. They were reading the polls too. When they did come around to supporting him, it was pretty clear they didn’t think he would win. What they wanted was a loss close enough that they could question the legitimacy of Mrs. Clinton’s victory. They were as surprised by Trump’s victory as, well, Trump was.
I saw Russian disinformation enter the American presidential campaign and was alarmed by it, but to this day, I’m not sure what impact it had. Russian messaging had a lot of reach but hardly any depth. Sure, Russian ads and stories on Facebook reached 126 million people, but those 126 million people saw exponentially more content than a few Russian ads. Moreover, as data today suggests, the ads themselves were not very successful. People didn’t recall them or act on them. What had a more significant effect was the false and deceptive content that the Russians seeded onto all platforms, not just the buying of ads on Facebook. But in the end, disinformation tends to confirm already held beliefs; it’s not really meant to change people’s minds. Disinformation doesn’t create divisions; it amplifies them.
So, did Russian disinformation tip the election to Donald Trump? I don’t know. By televising hundreds of hours of Trump’s campaign speeches, CNN did a whole lot more to elect him than Russia Today did. Televising his rallies sent a message to voters: this is important, pay attention—after all, we are. And millions of voters’ deeply held antipathy to Hillary Clinton did a lot more to defeat her than a few hundred Russian trolls in St. Petersburg. The Russians sought to sow doubt about the election, hurt Hillary, and help Trump, without any expectation that it would tip the balance.
My experience in government changed my view of the information and media industry in a fundamental way. As a journalist, I had always seen information as the lifeblood of democracy. That’s how the Framers saw it too. Like so many, I saw the rise of the internet as a fantastic boon to global freedom and democracy—the more knowledge people had, the better able they would be to choose how to govern themselves and live their own lives. I still do. But these new tools and platforms are neutral. As Aristotle said of rhetoric, it can be used for good or ill. I came to see that dictators and autocrats and con men quickly figured out how to use these new tools to fool and intimidate people. They used the tools of democracy and freedom to repress democracy and freedom. We need to use those same tools to protect those values.
I had always believed in the notion that the best ideas triumph in what Justice William O. Douglas called “the market place of ideas.” This notion is found in John Milton and John Stuart Mill and is a bedrock principle in our democracy. But everyone presumed that the marketplace would be a level playing field. That a rational audience would ultimately see the truth. I think we all now know that this is a pipe dream. Unfortunately, facts don’t come highlighted in yellow. A false sentence reads the same as a true one. It’s not enough to battle falsehood with truth; the truth does not always win.
In foreign policy, there’s the classic divide between realism and idealism. When it came to information, I’d always been an idealist. I believed that sunlight was the best disinfectant. I left office as an information realist. Disinformation, as I said earlier, isn’t a new problem, but the ease with which it can be spread on social media is. Today we are all actors in a global information war that is ubiquitous, difficult to comprehend, and unfair. It is a war without end, a war without limits or boundaries. A war that we still don’t quite know how to fight.
“The truth is under attack” is a beautiful phrase. But the problem is that people have their own truths, and these truths are often at war with one another. We no longer seem able to agree on what is a fact or how to determine one. The truth is, it’s impossible to stop people from creating falsehoods and other people from believing them.
So, looking back, there was a lot that we saw that we did something about. There was a lot that we saw that we didn’t or couldn’t do anything about. And there was a lot that we just didn’t see. I saw part of the picture but not all of it. I wish I had been able to connect the dots faster. I wish I had been able to do more. And there was always the sense that it couldn’t happen here.