Do Less Harm: The Lesser Evil of Non-Intervention

[Photo: Mohammad Mosaddeq]

History does not repeat itself, either as tragedy or farce or anything else. Saddam Hussein was neither Slobodan Milosevic nor Ho Chi Minh, neither of whom in turn was Stalin or Hitler, any more than Yalta was Munich, Kyoto was Yalta, or September 11, 2001 was December 7, 1941. Nor should we regard al-Qaeda as the equivalent of the Nazis, the Communists, the Baathists, or of anyone but themselves. All challenges qualify as unique challenges, as do all enemies and the dangers they present. Wars will always be atrocious, but sometimes the absence of war will be atrocious, too. Circumstances count. Ends count, means count, and the relations between them count. Attaching a Roman numeral to the prospect of war does not make it either just or smart. Metaphorical overstretch bids to be the thought disorder of our time.

If all these seem to be primitive clichés, such clichés have their moments, especially when overlooked at immense cost. Imprisoned within the prism of Munich, Lyndon Johnson, Dean Rusk, Robert McNamara, McGeorge Bundy, and Maxwell Taylor peered at Vietnam in 1964 and saw Czechoslovakia in 1938. The problem with historical lessons being that they tend to be overlearned (which means not learned at all), the Bush team glanced backward and concluded that America was at risk from certain small- or medium-size powers that were, or were supposed to be, equipped with weapons of mass destruction and that, consequently, preventive war was sound policy.

It seems clear that preventive war in Iraq failed not because a wise policy was poorly executed but because the whole project was doomed to begin with. Even if the resort to war had been wise in the first place, it was unwise in the second, third, and fourth places, if for no other reason than that George W. Bush’s government possessed an exalted idea of its own power, an overblown sense of its own righteousness, no capacity to appreciate the limits of its knowledge and its power, and an impoverished idea of the complexity of the world around it.

Beyond the immediate panic that made the war politically feasible, however, there was a methodical lineage: the tradition that the U.S. may, without condition and regardless of consequence, do as it pleases abroad. And that, when it does so, it not only deserves to triumph but succeeds. True, Manifest Destiny in a triumphalist spirit is not our only tradition. But in this sixth year of the Iraq War, the triumphalist strain in American foreign policy remains a clear and present danger to American national security, however diminished it may be by the grim experience of the Bush Doctrine as enshrined in the 2002 National Security Strategy and as wheeled out in Iraq a year later.

By now, it no longer comes as news that the official rationale for war was revealed to be a compound of deceits. But at a time when scarcely anyone in public life has a good word to say about the war that actually took place, we would do well to recall that the Iraq adventure was not only falsely advertised but grounded in an enduring fantasy—that the United States, with a “coalition of the willing” in tow, could refashion the world at will. That fantasy survives, though tempered, in John McCain’s foreign policy and among some Democratic advisors. Bush articulated its logic in his second inaugural:

. . . as long as whole regions of the world simmer in resentment and tyranny, violence will gather, and multiply in destructive power, and cross the most defended borders, and raise a mortal threat. There is only one force of history that can break the reign of hatred and resentment, and expose the pretensions of tyrants, and reward the hopes of the decent and tolerant, and that is the force of human freedom. We are led, by events and common sense, to one conclusion: the survival of liberty in our land increasingly depends on the success of liberty in other lands.

If liberal hawks doubted it in 2002, it ought to be clear by now that democratic governments do not spring full-blown from occupying armies, whether they be strictly American, chiefly American, or cobbled together by a “League of Democracies” under American supervision. In Iraq, this was predictable and, indeed, predicted.

A quick survey of the historical ledger reveals that U.S. foreign policy since World War II has done far more damage through an excess of interventionary zeal than through a lack of it. This is not to say that all conceivable interventions will be mistaken. It is not to foreclose all multilateral interventions for humane purposes (in particular, to prevent or halt genocide), even where no core, direct, and immediate national interest presents itself. It is not to say that all American military outposts overseas ought to be dismantled. Even an empire guilty of imperial overstretch has real enemies.

But it is to say that a presumption against going to war, even for humanitarian purposes, continues to be the correct default position. The strategic and moral reasons line up and point in the same direction. Strategically, one ill-advised military intervention erodes more legitimacy than a dozen decisions not to intervene, and the legitimacy forfeited by a preventive war will likely outweigh any military advantages gained from it. This matters because what Suzanne Nossel has called “a foreign policy with legitimacy at its core” matters profoundly. In a world where the currents of ideology propel America’s enemies, the ability to defeat them, to drain their swamp, depends on the plausible appearance of being a power that acts on behalf of collective goods and not merely narrow state interests. The moral imperative matters deeply—and not simply as an appendage, for even strategy finally achieves justification insofar as practical success can be considered a moral good.

I am not a pacifist. I supported the Bosnia war—indeed, I wrote in favor of it for years before American, French, British, and other leaders finally aligned in the face of unending slaughter. I supported the Kosovo war as well, albeit with anguished acknowledgment that air bombardment can be a terrible instrument even when we think the ends to be just. And I supported the Afghanistan invasion, though with the same proviso about war from the air and no illusions about the sublimity of the warlords of the Northern Alliance. In 1999, Mother Jones published my article “The End of the Absolute No,” arguing against the anti-war movement’s automatic rejection of force.

But one need not be a pacifist or a sentimentalist to stare straight at the sheer horror of war. If the consequences of averting war may also be terrible (as in Saddam’s Iraq, which likely would have devolved into Qusay and Uday’s Iraq), the iron logic of warfare guarantees, with a degree of certainty rarely found in human affairs, that war itself will also be a catastrophe. Those who speak blithely about “regime change,” about “taking out” tyrants and “nuking” defiant nations, overlook the historical truth that, before a war begins, those who start it routinely overestimate its ease and underestimate its cost in treasure and gore. This cognitive distortion serves its purpose well, enabling what is, after all, a horrific choice. The toll in deaths and maimings; the damage to persons, property, and infrastructure; the mounting costs—these bland phrases stand for immensities. There exists ample reason for the cliché—much honored in the breach—to the effect that war must be a last resort. The moral threshold for going to war must be very high.

Yet American policymakers reach it regularly. When the same error recurs so often, it is wise to understand why, and to break the habit. A bright thread of uniform error runs through the American foreign policy of the last sixty years. Nowadays, it goes by the unpretty name regime change. Sometimes it has been motivated by Cold War realism; sometimes by the crudest of economic imperialisms; sometimes by messianic nationalism; sometimes by mixtures of two or three of these. The intrusion of state violence has always been justified in the language of liberty and democracy against tyranny. Sometimes the enemies have been tyrants; sometimes, in fact, democrats. More often than not, the intervention backfires.

Regime change from without—overthrow, to use a less euphemistic term—confuses power and violence. This is the confusion against which Hannah Arendt warned when she pointed out that violence represents not the fruition of power but its defeat. Power, the power to move the world in a given direction, must ultimately be persuasive, not coercive. Coercion may be necessary to destroy an immediate threat that cannot be persuaded from its aggressive course, but the test of the virtue of that coercion must meet just-war criteria of the sort that Michael Walzer has set out. Coercion, in other words, ought to be a communicative act—it must pay “decent respect to the opinions of mankind.” Persuasiveness requires, at a minimum, the ability to persuade allies that the projection of force can be justified. Collective security, then, amounts not to a pious wish but to an operational requirement. The ability to assemble a plausible multinational force, including neighbors directly concerned, epitomizes the “global test” that John Kerry was mocked for invoking in 2004. A “coalition of the willing” that omits obvious partners will be a coalition with irreparable defects.

For the most part, when the United States has set out, on its own and absent direct provocation, to overthrow a government, thinking that, having installed a new one, it could tinker with the effects and bring about a happy outcome, disaster has been the result. To be sure, the frequently cited counterexamples of Grenada and Panama may, to varying degrees, be conceded. But, again, unilateral American intervention has done considerably more harm than good over the past decades. It is worth revisiting this sorry lineage for a moment not because it tells the whole story of American foreign policy—it does not—but because it underscores some of the profound risks of reckless intervention.

While some elements of the Iraq debacle may be unique, others were prefigured. From the vantage of the liberal critique, three grim tales from half a century ago and the periphery of the Cold War merit a second look: Iran (1953), Guatemala (1954), and Vietnam (1961). The first two were coups and the third, eventually, a full-blown war. The first two were driven by direct economic interests and the third was not. The first two were largely underhanded, though they left fingerprints in broad daylight for any observer inclined to detect them; the third required a national mobilization, culminated in defeat, and was so traumatic as to leave behind a “syndrome.” All three had in common a drastic misunderstanding of facts on the ground as well as a blitheness about the destructive consequences to come.

The decisions cut across party lines. The first two were launched by the government of Dwight Eisenhower, a moderate man who resisted the lure of rollback, yet succumbed to the belief that Cold War exigencies justified strenuous intervention. Eisenhower dipped his finger into Vietnam, too, in what was in retrospect no more than a prologue to Kennedy’s plunge, which, in turn, was no more than a prologue to Johnson’s immersion. None of these were rogue operations; they were based on a consensual, and deeply flawed, assessment of the nature of politics in poor countries. They were, in brief, the reflexes of ideology.

In Iran, American-British intervention deposed the democratically elected Mohammad Mosaddeq and installed the Shah on the throne, where he remained for twenty-six years. The consequences were many and destructive—first, the discrediting of Middle Eastern social democracy, then the Shah’s own dictatorship, then (the fuse stretching for more than a quarter-century) the Khomeini theocracy, and not least, the rise of pan-Islamism.

In Guatemala, the American-sponsored coup overthrew the elected social-democratic government of Jacobo Arbenz Guzmán, the beneficiary of the first peaceful transition in the nation’s history. With this regime change, Guatemalans were subjected to four decades of military autocracy and an oppression generally credited, if that can be the right word, with some two hundred thousand deaths. The United States tied itself to one of the most brutal military regimes on the continent. The young Che Guevara, a witness to Arbenz’s government and its overthrow, was but one of the Latin American revolutionaries who were spurred by the overthrow of the reformer to opt for the path of guerrilla war and Communist dictatorship.

In Vietnam, the war left some two to three million Vietnamese and fifty-eight thousand Americans dead, along with many millions wounded and much of the land poisoned—whereupon the political outcome in 1975 was almost exactly what it would have been had the Viet Minh taken power as envisioned by the Geneva Accords of 1954. And worse: the Vietnam War produced a sideshow, the destruction of the non-Communist Cambodian monarchy and, partly as a result, the ascension to power of the genocidal Khmer Rouge. When one considers the argument for legitimacy through collective security, it is germane that the ally who knew Vietnam best, Charles de Gaulle’s France, advised vociferously against the American expedition.

Guatemala, Iran, Vietnam—disagreeable as one may find the term “blowback,” the sheer volume of bad consequences that followed from these interventions ought to give pause to anyone who speaks casually of regime change. Millions of deaths are attributable, as a plain matter of cause and effect, to these three interventions. To put the matter this way is not to detract in the slightest from the criminality of those who commit war crimes on any side, nor is it to whitewash Communist regimes anywhere. But it remains incontrovertible that unilateral regime change by the United States is a proven evil. The burden of proof—a very heavy burden—ought to be on those who believe it has been a lesser one.

The avoidance of war, by contrast, counts as a definite good. It would be simpler, of course, if that were the only good, but even if it does not qualify as an absolute value, it is a very great one. The Vietnam War was sufficiently calamitous to drive this lesson home even to conservatives. It was in the spirit of hard-earned realism that, for all the bellicosity of the Reagan administration’s rhetoric and its support of proxy forces in Central America, when Secretary of Defense Caspar Weinberger offered tests for the deployment of American troops in combat, he included this stricture: “The commitment of U.S. forces to combat should be a last resort.” The mere stipulation was a measure of the force and duration of the Vietnam lesson.

The next president will confront the world that George W. Bush made. The impulse to preemptive war generated a “coalition of the willing” that was thin to start with, dwindled over time, and has left the United States virtually alone in the current sand trap. This unpromising setting, alas, is the only one that exists and the one that the next president inherits. There will be transnational terrorists to fight, as well as nuclear proliferation, disease, climate change, poverty, and genocide to contend with. However vexed these matters, however situation-specific the imperatives, one general principle should apply: collective action remains superior to unilateral action. To say that the United States ought to lead—to take the initiative in assembling coalitions for action—is fine; to say that the United States is, pace Madeleine Albright, the indispensable nation is excessive. (Close observers of French politics argue that Bill Clinton did not endorse intervention in Bosnia until 1995, when Jacques Chirac assumed the presidency and overcame François Mitterrand’s time-honored tilt toward Milosevic.) In order to ground collective action, however, the first order of business for the next American president must be to regain legitimacy.

Legitimacy should not be seen as an optional feel-good fillip. The American military and the Bush administration discovered late that counterinsurgency and so-called nation-building do not work if approached chiefly as afterthoughts; to have any significant chance of success, they must be built into policy from the start. The same applies to legitimacy in a struggle against a jihadist movement that recruits with, and thrives on, ideas. Since the chief danger to American security in the world today, and for the foreseeable future, derives from the hypothetical combination of Islamist terrorism and nuclear weapons, and since Islamist terrorism presents itself as an ideological crusade, it is more than a platitude that ideological crusades must be countered ideologically—and intelligently. It is a necessity.

But ideology is not everything. Whether we look at the looming dangers of nuclear proliferation, climate change, or genocide, there is no substantial international goal whose attainment does not require, as an absolute prerequisite, a high degree of legitimacy. The loss of legitimacy the United States has incurred in the course of the Iraq War and the accompanying “war on terror” spills over into many other areas. Toward the end of gaining or regaining legitimacy, certain short-term measures seem obvious: shuttering the Guantánamo camp and other “black” prisons, and inaugurating legal protections and civilian trials for all prisoners; recommitting the United States to the Geneva Conventions; declaring that torture will not be the policy of any American agency, and that anyone who tortures under color of American policy will be punished accordingly; and withdrawing American combat troops from Iraq, renouncing permanent bases there, and committing funds to benefit and resettle refugees.

But short-term legitimacy grants just that and no more. A world of incessant injuries to human survival and dignity is a world in which legitimacy must be acquired and sustained. Legitimacy is a project for the long-term, a project that requires constant tending and renewal. This, more than any operational imperative, explains why military actions, including responses to aggression, should be multilateral. If standing and competent multinational forces present themselves, all the better. But in their absence, it remains possible to assemble improvisational coalitions, as in Bosnia and Kosovo. When Colin Powell in 1990 revised the Weinberger Doctrine, he proposed to require that any American resort to force receive “genuine broad international support.” This was Powell’s way of acknowledging the need for a “decent respect for the opinions of mankind” (though this was, to be sure, a principle he abandoned in 2003). It may not be possible to derive criteria for intervention—for humanitarian purposes, or against aggression—that can be specified with sufficient breadth to constitute uniform and binding ethical injunctions, any more than domestic laws can be applied unambiguously and uniformly. It can, however, be said that a “responsibility to protect” people from genocide or mass murder within sovereign states qualifies as a moral imperative, though equally that a test of this responsibility ought to be that its exercise proves capable of mobilizing a substantial coalition of national forces.

If unilateral force should be the recourse of last resort, then what should be the prior resorts? For all its shortcomings, the United Nations Security Council remains the most legitimate font of authority. It sufficed to counter North Korean aggression in 1950. It sufficed to justify the war to expel Saddam Hussein from Kuwait. It sufficed to bring weapons inspectors back to Iraq. Nonetheless, the Security Council often has been stymied and UN forces, as in Somalia, Bosnia, and Rwanda, have not always performed much better. What then? Regional groupings of nations have a presumptive value entitled to recognition by the United Nations. NATO in the Balkans has a legitimacy that the United States, acting alone or with cobbled-together surrogates, does not. A proposed “Concert” or “League of Democracies,” on the other hand, would be more noteworthy for its potential to reawaken a martial Cold War spirit, undermine the United Nations, and make the world safe, in the aftermath of the Iraq debacle, for the next phase of neoconservatism. It may also be short-sighted in the practical sense, for in pursuit of collective security, or even the “responsibility to protect,” it may still be necessary to ally with flagrantly undemocratic regimes for limited purposes.

American foreign policy carries with it an obligation to support democracy, but the most useful way to do so is to support the institutions of civil society that remain its necessary preconditions—to use diplomacy and funds, discreetly, with an eye to effects, not bombast, to help implement the rule of law, freedom of speech and association, and other human rights. Direct U.S. aid to civil society, human rights organizations, and so on often tends to boomerang against these groups. Reckless threats of military action boomerang even more savagely—a point frequently made by human rights groups and by such courageous defenders of liberty as the Iranian Nobelist Shirin Ebadi. “The world must be made safe for democracy,” but, critically, in the passive voice that was actually Woodrow Wilson’s in 1917. Wilson’s crusading idealism can be justly criticized both for its overreach and its lapses, but the crusading boom that George W. Bush has superimposed upon it makes Wilson look modest.

It is true that, for the United States, criteria for intervention cannot be anything so simple as, When enemies make loud rumbling noises and launch first strikes, and only then, go to war with them. The interventions in Bosnia, Kosovo, and Afghanistan were on balance just and defensible, although not all their elements and not all for the same reasons—Bosnia because all lesser means to protect the Bosnians failed, and because there was both UN Security Council and genuine regional support; Kosovo because the first and third conditions applied; Afghanistan because the Taliban had, by sheltering al-Qaeda and extending it free rein, in effect declared war against the United States. But if the world is haunted by a wrong-headed intervention in Iraq, it is equally and justifiably haunted by the failure to intervene in Rwanda, and now in Darfur.

What American foreign policy needs, then, is not so much a formula as a sensibility. Containment is better than attack. War is a lesser evil at best. Legitimacy matters. Collective security is the best kind. State sovereignty is not the highest of all principles—neither the sovereignty of a nation that harbors mass murderers nor the sovereign right of the United States to wage war with ease.

Todd Gitlin is professor of journalism and sociology at Columbia University. He is the author of twelve books, including, most recently, The Bulldozer and the Big Tent: Blind Republicans, Lame Democrats, and the Recovery of American Ideals.