This article appears in the Fall 2014 issue of The American Prospect magazine.
Though few people realize it, it was Bill Clinton who first uttered the phrase on which U.S. foreign policy has remained so hopelessly fixated ever since. During his 1996 re-election campaign, striving to describe America’s role in the world, Clinton began to speak of this country as the “indispensable nation.” In his acceptance speech at the Democratic National Convention that year, Clinton proclaimed, “We remain the world’s indispensable nation to advance prosperity, peace, and freedom and to keep our own children safe from the dangers of terror and weapons of mass destruction.” (Clinton himself apparently got this phrase from historian James Chace and adviser Sid Blumenthal.) The words stuck and became a regular catchphrase for his second-term secretary of state, Madeleine Albright—so much so that many people mistakenly attribute the words to Albright.
“Indispensable nation” was the formulation Clinton put forth to the public. In private, however, he came to hold a different, more modest view of America’s role. Strobe Talbott, his close friend and, eventually, deputy secretary of state, recounted in a memoir how, after leaving the White House, Clinton told Talbott his goal as president had been to prepare America to live in a multipolar world. “I wanted to build a world for our grandchildren to live in where America was no longer the sole superpower, for a time when we would have to share the stage,” said the former president. Talbott asked Clinton why he had never said this while he was president. In an interview for my book The Obamians, Talbott recalled Clinton’s response: “That’s why you’re a wonk and I was president of the United States. Because it’s political suicide to say, ‘Here is my vision—my vision is that we have to prepare for our children’s or grandchildren’s era when America’s not going to be the top dog.’ I’d have been ridden out of town on a rail!”
It is this duality—this conflict between what American leaders privately recognize and what they feel they have to say—that hobbles American perceptions of the world today. In public, politicians of both parties feel compelled to return again and again to the touchstone of the United States as being “indispensable,” continuing to speak as if this were 1945, after the end of World War II, or 1989, at the end of the Cold War. On November 9, we will observe the 25th anniversary of the fall of the Berlin Wall, and there have been all sorts of changes in the world since then. Yet we remain stuck in the formulations and self-conceptions of these earlier periods. We seem unable to acknowledge to ourselves that other nations of the world do not always need us as a leader in exactly the same way they did in 1945 or 1989. Moreover, in resolving international crises, other nations have become indispensable to the United States, too—far more than they were in the recent past. Indeed, in some crises, other nations are even more indispensable than is the United States.
In short, America’s view of its role in the world has become downright, well, unrealistic. As Bill Clinton admitted, but only to his friends, we need a more realistic sense of the world as it works today. Call it a different form of realism.
In talking about foreign policy, the word realism comes up over and over again. In the loosest street definition of the word, realism boils down to whatever the speaker believes—those who disagree are then labeled “naïve” or “ideological.” Alternatively, the word is mistaken to mean “tough-minded.” For example, in popular terminology, some conservatives called Ronald Reagan a “realist” in dealing with the Soviet Union.
But this usage was doubly inaccurate. It’s not that Reagan was wrong about the nature of Soviet power; it’s that he wasn’t a “realist.” First, his own views and policies changed in the White House, from his early, arms-buildup years to the period when he began to do business with Mikhail Gorbachev. Few on the right remember it now, but from 1986 on, conservatives railed at Reagan’s policies. (Howard Phillips of the Conservative Caucus once branded Reagan “a useful idiot for Soviet propaganda.”) During the 1988 campaign, George H.W. Bush, then Reagan’s vice president, took a position to the right of Reagan on Soviet policy, arguing that Reagan was too willing to believe Gorbachev represented a fundamental change in the Soviet leadership.
Bush’s opposition underscores the second, more important reason that Reagan wasn’t a “realist.” Traditionally in foreign policy, “realism” has had a far more specific meaning. It refers to the school of thought that argues American foreign policy should be devoted to questions of national interest, geopolitics, and balance-of-power concerns, not to questions of values such as political freedom or democracy. The essence of this approach was expressed, in its simplest terms, by the man who came to embody traditional “realism” more than anyone else in American foreign policy: Brent Scowcroft, the national security advisor first to Gerald Ford and then to the senior Bush. What happens inside another country’s borders is none of our business, Scowcroft argued over and over again; what we care about is a nation’s behavior in dealing with the rest of the world.
Scowcroft was merely the most prominent of the many realists who became the dominant force in American foreign policy. He was himself a disciple of Henry Kissinger (who was originally America’s leading realist, but whose thoughts and style are so voluminous, changing, and intensely personalized that by now they defy categorization). And Scowcroft in turn served as friend and mentor to a generation of other senior American officials serving both Republican and Democratic presidents, including Robert Gates and Colin Powell.
During the Cold War, American liberals rarely if ever identified themselves as realists. In those days, realism was used to justify American support for dictators around the globe, such as the Shah of Iran or Ferdinand Marcos of the Philippines, and for coups against democratically elected leaders like Salvador Allende in Chile.
In 1981, when he was vice president, George H.W. Bush, who eventually became another icon of realism, visited Manila and told Marcos, “We love your adherence to democratic principles and to democratic processes.” That remark pointed to another problem with Cold War realism: Ironically, it sometimes caused American leaders to stretch or deny reality. I was personally turned off by the realists in Washington when I served as a correspondent in Beijing. America’s support for China during the last two decades of the Cold War, based on principles of realism and the geopolitical struggle against the Soviet Union, caused American leaders to deny or explain away Beijing’s continuing political repression (and, later, the massive role of the state in its economy). From the 1970s through the 1990s, Democratic candidates, from Jimmy Carter to Clinton, made it a staple of their campaigns to criticize Kissinger, Scowcroft, and their realist principles. It is a sign of how much things have changed, then, that Hillary Clinton recently praised Henry Kissinger’s new book and sought to advertise her close association with him.
After the end of the Cold War, the struggles between liberals and realists endured through the 1990s. Realists like Scowcroft strongly opposed the idea of military action in the Balkans, while liberals favored the use of force for humanitarian purposes. In one famous confrontation, Albright asked Powell, then chairman of the Joint Chiefs of Staff, “What’s the point of having this superb military that you’re always talking about if we can’t use it?” Liberal leaders such as Bill Clinton and Albright pressed forward with the expansion of NATO, thus extending military protection to new democratic governments in central Europe, over the opposition of many realists, including a few Democrats.
It was not until a decade ago, at the time of George W. Bush’s invasion of Iraq, that liberals began to identify themselves closely with the realists. Scowcroft and other realists had been the earliest and strongest opponents of that war and, in the process, Scowcroft won widespread praise from Democrats who cast aside their decades of objections to his views. (Amused, he told me during a 2002 interview for my book on Republican foreign policy, Rise of the Vulcans, “I think my views over the last 25 years have changed very little. Twenty-five years ago, I was a leading hawk. I feel the same way about things, and now I’m a leading dove, apparently.”) George W. Bush seemed to stigmatize the very notion of idealism, defined as a belief in democratic principles and political freedom, with his repeated assertions—mostly after the invasion, when no weapons of mass destruction were found—that the war was being fought on behalf of such ideals.
After the Iraq War, liberals construed realism to mean simply anti-interventionism, even though historically it had meant both much more than that and, occasionally, much less than that. (Remember, Scowcroft and his boss, George H.W. Bush, were willing to mount a military intervention in Panama, where, a year earlier, Reagan had not.) By the time of the 2008 campaign, Barack Obama proclaimed himself to be an admirer of “foreign-policy realism,” citing Scowcroft and Bush 41 as models for the way he would deal with the world.
Obama indeed started out his presidency as a classic realist, in the old-fashioned sense of the word. In 2009, when the Green Movement brought millions of people into the streets of Tehran, he kept a cool distance, trying to avoid doing or saying anything that might get in the way of his negotiating a nuclear agreement with the Iranian regime. He cut back on U.S. funding of programs for democracy and civil society in Hosni Mubarak’s Egypt. En route to her first visit to Beijing, Hillary Clinton told reporters she wouldn’t waste much time trying to talk to Chinese leaders about human rights.
This first, realist phase of the Obama administration lasted for roughly two years. By early 2011, Obama began to branch out in new directions, contrary to the advice of the old realists. Above all, Obama—like countless other people—became swept up in the possibilities of the Arab Spring; he concluded that a new wave of democratic change was sweeping through the Middle East, and that it made no sense to link American policy to the region’s monarchs and dictators. He abandoned Mubarak, in the process infuriating the Saudis (and causing a wave of anxiety in Israel, too). Over the objections of Secretary of Defense Gates, he approved the use of force against Moammar Gadhafi in Libya, eventually driving him from power.
At the apex of Obama’s idealist phase, in a speech about the Middle East in May 2011, he announced that American policy would no longer seek to preserve the existing order of authoritarian regimes. It would support reform, he said, adding, “The United States opposes the use of violence and repression against the people of the region. The United States supports a set of universal rights. And these rights include free speech, the freedom of peaceful assembly, the freedom of religion, equality for men and women under the rule of law, and the right to choose your own leaders, whether you live in Baghdad or Damascus, Sanaa or Tehran.”
But then the Arab Spring came crashing down, hard. Libya deteriorated into chaos and, worse, civil war broke out in Syria. In Egypt, the generals gave way to a democratically elected government headed by Mohamed Morsi and the Muslim Brotherhood, which proved unable to govern and was replaced, in a coup, by the military. In the end, Obama found it easier to do business with the old Egyptian regime in a new incarnation. In a ludicrous return to the old style of realism, Secretary of State John Kerry visited Egypt last June and proclaimed that General Abdel Fatah al-Sissi, its leader, “gave me a very strong sense of his commitment” to human rights—the very day before the regime sentenced three reporters from Al Jazeera to jail, amid its continuing crackdown on dissent. Back to the future: This was an almost risible evocation of George H.W. Bush’s paean to Marcos’s love of democracy.
Obama’s struggles raise the obvious question: What does realism mean these days? By now, virtually all politicians call themselves realists—from Hillary Clinton, with her muscular view of America’s role in the world, to Rand Paul, whose aides tell the Wall Street Journal he’s a realist, not an isolationist. Harvard University’s Stephen Walt, one of the leading academic proponents of realism, advocates a virtual American withdrawal from the Middle East (“It’s time to walk away and not look back,” he wrote last summer)—although the leading real-world practitioners of realism, like Scowcroft and Gates, favor no such thing.
Traditionally, old-line realists called for a foreign policy based on national interest, yet there are more disagreements than ever about what those words mean. Is it in America’s national interest to eradicate the new, well-funded, and murderous Islamic State of Iraq and the Levant (ISIL) in the heart of the Middle East, or, by contrast, to avoid another American war that might alienate people throughout the region? Is it in America’s interest to work out an accommodation with a rising China, or to help other Asian countries avoid being dominated by it?
Moreover, in the old days, realists maintained that American foreign policy should ignore considerations like democracy and political freedom. Yet in speech after speech and article after article over the past two decades, practitioners of post–Cold War American foreign policy, such as Secretaries of State Albright, Condoleezza Rice, and Hillary Clinton, argued that the distinction between national interest and values, between realism and idealism, is a false one. In many of the problems confronting the United States, geopolitical considerations and values go hand in hand—in part because most of our allies, like Japan, South Korea, and the nations of NATO, are also democracies. Even Kissinger, in interviews for his recent book, has taken to arguing that the distinction between realism and values is artificial, and that American policy is based on both.
Let us come back now to that Bill Clinton notion of America as an “indispensable nation,” forged amid the 1990s triumphalism of the end of the Cold War, and to his own private admission that it would be political suicide to acknowledge America was heading into a multipolar world.
I think a start to a new form of realism lies in the simple awareness that the notion of America as the “indispensable nation”—and all the concomitant formulations about American “leadership”—cloud our ability to understand and deal with a changing world.
This is not to say America should retreat from the world, or should refrain from using its power to seek to influence the course of events around the world. What I mean is that the incessant incantation of American “leadership” and our self-description as “indispensable nation” are, sometimes, less than the full truth—and, more importantly, grate against other nations whose help we need and, in the process, actually undermine the leadership and influence we seek to wield. Yes, other nations sometimes need our help, and deserve it. But the fundamental reality we fail to recognize is that they do not need our “leadership” and will not follow it in the same way they did during the Cold War or the decade after it.
This was one of the primary mistakes of the Iraq War, which was, among many other things, a stunning diplomatic failure. Proponents of that war sincerely believed in 2002 that whatever objections the nations of Europe and the Middle East might harbor at the outset, their leaders would in the end fall in line behind the United States, as they had during and immediately after the Cold War. But they were wrong again and again. Turkey would not support an American invasion from its territory. More importantly, many of our old allies, friends, and partners—from France and Germany, to Mexico and Pakistan—refused to support the resolution authorizing force against Iraq at the U.N. Security Council.
What was the intellectual mistake of those who believed in 2002 and 2003 that once America led, others would follow? They failed to recognize that nations that had sided with American leadership in the Cold War would feel free to operate far more independently, once freed from the fundamental need for American military protection. Yes, these countries might need us, sometimes and in certain ways, but not as much as they used to.
Let’s take one real-world example where the mumbo jumbo about America as the “indispensable nation” gets in the way of our comprehension of the world. Consider the past year of crisis in Ukraine.
Far on the outskirts of Berlin, in a place called Karlshorst, is a museum that attracts few American tourists. It is called the Deutsch-Russisches Museum (the German-Russian Museum), left behind after Soviet troops departed from East Berlin two decades ago. Go into it, and you are swept up into one of the most tangled and bloody histories of any two major powers in the world.
Looked at from the United States, the recent upheavals in Ukraine present above all a new confrontation between Washington and Moscow, with (as news stories and columns regularly remind us) overtones of the Cold War. But from Europe, the recent crisis has also meant, among other things, a new chapter in the long-running epic of Germany and Russia. Ukraine, once the locus of bloody warfare between Nazi Germany and the Soviet Union, is now the focus of a struggle for influence, trade, and markets in eastern Europe between a rising Germany and a declining Russia.
To be sure, this is not to explain away Vladimir Putin’s behavior. Rather, it is to point out that in determining the course of events in Ukraine, Germany is as “indispensable” as the United States, if not more so. It was the (welcome) decision of a Germany-led European Union to draw Ukraine closer through an association agreement that touched off the crisis in the first place. And, because Germany has vastly more trade with Russia than the United States does, it is Germany that largely determines the timing, extent, and scope of economic sanctions against Russia. It is German (and EU) sanctions that Putin worries most about; the United States, with smaller trade, is secondary.
Consequently, the Western official who leads the way in dealing with Putin is not so much Obama as it is German Chancellor Angela Merkel. She uses the sanctions as a pressure valve, deciding when to ramp them up a bit more and when to hold back. Germany has its own interests. They are commonly taken to be commercial, and they are, but only partly. Germany has its geopolitical interests, too: It doesn’t want Russia to dominate eastern Europe, yet Germany doesn’t want to do anything that would create long-term instability in the region, either. Germany, at various times, has given crucial support to or has frustrated American policy.
Back in the United States, politicians and commentators drone on and on about America’s “leadership” role in dealing with Ukraine. The problem is the impact that the “indispensable nation” trope has on American policy. Congress becomes swept up in whether the United States should pass tougher sanctions on Russia than the Europeans—never recognizing that the U.S. sanctions matter less than the European ones do. In fact, unilateral American sanctions could conceivably undermine the cause of tougher action against Putin, if, in passing them, the United States alienates Europe and deters it from taking stronger action on the European sanctions that count.
This same analysis applies elsewhere beyond Ukraine. Under Obama, the United States has had remarkable success in winning surprisingly strong international backing for sanctions against Iran, seeking to induce it to stop its nuclear weapons program. Yet American critics, and many in Congress, thunder about passing stronger, unilateral American sanctions—failing to understand that Iran is affected more by the international sanctions than by the American ones, and that any action that weakens the international coalition also undermines the strength of the American position.
Barack Obama’s greatest strength has been that, in the face of relentless attacks from the political right and from the foreign-policy commentariat, he has, from time to time, been willing to admit to the changes in America’s role in the world, which Bill Clinton acknowledged only in private and only after he left the White House.
Obama has managed, with difficulty, to put together and to work through coalitions of nations, where George W. Bush, in the early years of his presidency, was unable to do so. Of course, sometimes America’s friends and partners don’t do exactly what we want. Whenever this happens, Obama is accused of failing, or of being weak, when in fact the realities are that the United States can’t succeed unilaterally, and that American presidents can’t compel compliance the way they did a half-century ago.
In one of his recent columns, Walt asked, “Is Barack Obama More of a Realist Than I Am?” The answer to that question is no, not in the way Walt defines his own realism. Obama obviously does not favor American withdrawal from the Middle East, for starters. But yes, in the sense that Obama has been, from the start of his presidency, a realist about America’s role in the world—a role that is still powerful, but also less dominant than it once was.
To be sure, Obama frequently resorts to the old standbys. In his recent speech announcing military action against ISIL, he declared, “American leadership is the one constant in an uncertain world.” But the recognition that America’s post–World War II role may not last forever also shows up in his ideas and speeches. In his Nobel Prize speech in 2009, he said, “The United States has helped underwrite global security for more than six decades with the blood of our citizens and the strength of our arms.” The verb tense of that sentence caused some jitters in Europe; it seemed to raise the question “What about now or in the future?” In another first-year speech, he famously declared, “The nation that I’m most interested in building is our own.”
One of the recurrent themes of Obama’s presidency is that every time he gives voice to this new sort of realism about America’s role in the world, there are howls of outrage. Take the great 2009 brouhaha over “Greek exceptionalism.”
The phrase “American exceptionalism” did not start with Ronald Reagan. In fact, ironically, the first regular use of the phrase “American exceptionalism” seems to have been in the 1920s, by communists who were concerned that America seemed immune to socialism. Since then, however, people all over the political spectrum have used the phrase to refer to the belief that the United States is somehow different or special. Reagan himself didn’t throw around the phrase “American exceptionalism” (he was shrewd enough to recognize that a word ending in “-ism” sounded too academic for his audiences)—but others used the words to characterize Reagan’s views. Since that time, politicians on the right have gradually turned the phrase into a slogan and litmus test, proclaiming that they believe in American exceptionalism and asking if their opponents do.
On Obama’s first trip to Europe, he was asked if he believed in American exceptionalism. He answered, “I believe in American exceptionalism, just as I suspect that the Brits believe in British exceptionalism and the Greeks believe in Greek exceptionalism.”
That statement seems obvious (and, well, unexceptional), and I have yet to understand what was wrong with saying so. But his line about Brits and Greeks believing in exceptionalism, too, became a staple of years of anti-Obama attacks. And such attacks have not been confined to the political right. The notion that, in launching military action against Libya, the United States should let French and British warplanes take a large part in the operation made plenty of sense, and was in the interest of saving American money and preserving its long-term power. Yet the idea of “leading from behind,” the phrase chosen by an Obama administration official to describe this approach, once again caused a furor: It brought up the touchy question of America’s post–World War II role.
Sometimes, it seems as though Obama is speaking in a different language, an idiom that is different from what Americans are used to. It is the language that politicians like Bill Clinton learned to avoid using. And many American reporters and commentators, with the right-wing criticisms of Obama in the backs of their minds, look for new controversies and don’t even try to explain what his words mean. Take the uproar over Obama’s words in late August as he was preparing to take action against ISIL: “We don’t have a strategy yet.” That sentence was seized upon again and again as a sign that Obama was clueless, or that his foreign policy was lacking.
It took the columnist Walter Pincus, writing in the Washington Post, to explain that at the moment Obama was speaking, his administration was in the process of working with other governments in Europe and the Middle East to find a common approach for dealing with ISIL. And so the “we” to which Obama referred meant “the United States and the countries we are talking with.” That made sense; even if the Obama administration knew what it wanted, it hadn’t arrived at a joint strategy with its partners. But in the idiom to which Americans are accustomed, “we” is assumed to mean simply the president and his team, or the administration, or the White House, or perhaps the royal “we”—something that is entirely American, rather than multilateral in nature. This sort of misunderstanding, of course, is not just an illustration of the superficial news coverage of Obama, but also of the public’s resistance to a more circumspect role for the United States, as well as Obama’s failure to communicate his views clearly.
In this new sense of American realism, Obama is ahead of his time, and he is paying a price for it. There will almost certainly be a counter-reaction. Most of the contenders getting ready to run for president in 2016 are already competing with one another to see who can proclaim the greatest devotion to U.S. “leadership” (and to mention as little as possible America’s need for support and help from other countries). In one recent column, Senator Marco Rubio decried the idea that the United States should “walk away from its traditional role as the guarantor of global security,” leaving aside questions of how long the United States will play that role, whether the American people will support it forever, or whether other countries are as willing to go along with it as they were in the past.
Nor are such sentiments confined to Republicans. Hillary Clinton resorts to the “American leadership” and “indispensable nation” phrases as much as any politician in this country. This is not, for Hillary Clinton, merely a matter of slogans; she clearly and deeply believes that America can and should play the same role in the world as it did in 1945 or 1989. When Obama announced the end of American combat operations in Iraq in the summer of 2010, Clinton thought that his speech did not sufficiently address the question of America’s continuing role in the world. So she quickly set out to give a speech of her own, as a corrective to what she saw as Obama’s omission. It became one of the principal statements of her views as secretary of state. “Let me say it clearly: The United States can, must, and will lead in this new century,” she asserted. This was, she said, a “New American Moment.” The speech was reminiscent of Henry Luce’s declaration in 1941 that this was “the American century.”
In her travels around the world from 2009 to 2013, Hillary Clinton must have recognized how much the world has changed, even since the 1990s. What Bill Clinton privately acknowledged after leaving office, that the United States is heading for a time when we will have to share the stage, an era when “America’s not going to be the top dog,” is even more evident today than it was two decades ago.
But Bill Clinton also went on to say that anyone broaching that thought might be committing political suicide and be “ridden out of town on a rail.” In her own campaign, and even in the White House if she gets there, Hillary Clinton will presumably recognize in private the ever-greater need to obtain support from other nations, while always speaking in public of America as playing the same hegemonic role it did in the last half of the 20th century.
Obama has been different, at least at times. With ISIL as the latest example, Obama has resorted to the use of force overseas so often and in so many places, no intelligent person could possibly call him a pacifist or an isolationist. Yet it has been Obama’s role to be more realistic, to broach in public, however gingerly, the idea that America’s role in the world is changing. That has, in fact, been a hallmark of his foreign policy.
He hasn’t been completely ridden out of town on a rail. But Obama’s approach to the world does reflect a new form of realism, and in a country accustomed to being told its post–World War II leadership role is an enduring one, his realism has not gone down smoothly.