History Resumes: Sectarianism’s Unlearned Lessons

The influence of sectarianism in politics is about as welcome a topic among policymakers as the drunken uncle or the drug-addicted son is at the family dinner table. Indeed, a strong case can be made that policymakers in powerful countries, above all in the United States and Western Europe, within the UN system, especially in the departments of political affairs and peacekeeping, and at the World Bank and the IMF, tend to craft their strategies and make their decisions as if sectarianism were a minor concern rather than the central one it has always been in most parts of the world. It is largely for this reason that, like a sort of Philosopher’s Stone in reverse, sectarianism has turned so many supposed geostrategic sure things into either disappointments or outright failures.

In Afghanistan the Soviets thought Pashtun tribal loyalties would be no match for the kind of modernization they had imposed on the Central Asian republics to Afghanistan’s north, although anyone who has spent any time in post-1989 Uzbekistan, Tajikistan, or Kyrgyzstan will know that the sectarianism and tribalism that were quiescent during the Soviet era are back with a vengeance. As for NATO, the alliance went into Kosovo imagining the province to have the potential for the kind of ethnic comity that truly did exist in Bosnian cities like Sarajevo and Tuzla before 1992, when, in fact, Kosovo proved to be a zero-sum game. When the province was ruled from Belgrade, the Serbian minority held sway. When NATO arrived, it was the Albanians’ turn, and while on utilitarian grounds the oppression of a minority is perhaps to be preferred to the oppression of a majority, that was scarcely what NATO intended, or predicted would happen once Yugoslav regulars and Serb militias had been sent packing.

But the textbook example of this amnesia about the importance of sectarianism has been the American involvement in Iraq. The great physicist Max Planck once criticized his colleague James Jeans for refusing to relinquish his theory even in the face of facts that should have caused him to do so. Jeans, Planck wrote to a mutual colleague, “is the very model of a theorist as he should not be, just as Hegel was in philosophy: so much the worse for the facts if they don’t fit.” By analogy, one can say that the people who called for an invasion of Iraq in 2002 and early 2003—many of whom, lest it be forgotten, were liberal democrats (beyond the usual suspects in Congress, these included the current editor of the New Yorker magazine and the then executive editor of the New York Times, institutions not exactly known for their support of the Bush-Cheney administration)—were the very model of interventionists as they should not be. And, again echoing Planck, so much the worse for the facts if they do not fit the current Hegelian consensus: the view that history is an evolutionary progress in a positive direction toward an ideal end state in which some form of liberal, law-based, rights-observing capitalism is, as Francis Fukuyama has put it, history’s culmination, “the only viable alternative for technologically advanced societies.” That conclusion would almost certainly bring a smile to the lips of the members of the Politburo Standing Committee of the Communist Party of China. But unable to free themselves from the bear trap of one version or another of contemporary Western progress narratives, Fukuyama, his erstwhile neoconservative comrades, and many prominent activists within the largely left-leaning human rights movement either remain entirely blind to the perdurability of sectarianism, or else imagine that—rather as Marx thought that once communism had been achieved the state would wither away—once prosperity has been achieved sectarianism will also disappear.

This consensus has been building for a long time. At least from the second half of the nineteenth century onward (and possibly quite a bit earlier), there has been a growing consensus among philosophers, politicians, social critics, and historians alike that to understand what is going on in the world it is necessary to think in larger and larger units of time, population, economy, and social structure. From the era in which the French historian Ernest Renan extolled the nation-state as an exercise in “large-scale solidarity,” in his 1882 essay “What Is a Nation?,” and Marx insisted that the forces of capitalist production inevitably destroyed every autonomous countervailing economic structure or stubborn social form that might impede them, through twentieth-century Bolshevism and fascism, to our own time in the context of various forms of an essentially borderless capitalism that, while wounded by the recent economic convulsions in the developed world, still predominate, sectarianism has been consistently relegated to the status of atavism. Yes, it is still powerful in disfavored or crisis-wracked parts of the world like the Islamic Middle East or sub-Saharan Africa, and can even crop up in the rich world in times of crisis, as has been demonstrated by the recent rise of xenophobic parties in Western Europe. But over the long term, the tide of history is carrying all of global society in a very different direction.


Why so many people are so convinced of this, and not only in the “first world,” is not entirely clear. Obviously, part of the explanation is the penumbral hold that the Christian progress narrative still maintains over our thinking. On this account, for all the bumps and glitches that humanity is bound to face along the way, history, too, is a progress toward a global society. One does not have to be a person of faith to adhere to this view. To the contrary, it was that stern non-believer Raymond Aron who proposed that what made modern times unique was that they were the era of universal history. And, indeed, whether such an analysis is derived from Aron’s analytic template, or Fukuyama’s unwise effort to update it with his argument that history is bound to “end” in some form of liberal capitalist society, or, instead, from the still influential modernization theorists of the 1950s like the Rostow brothers in the United States, or else, more recently, either from the left mysticism of Michael Hardt and Toni Negri, the so-called biopolitics of the followers of Michel Foucault, or, on the other end of the ideological spectrum, from the free-market ideology that began with Ronald Reagan and Margaret Thatcher, what unites these otherwise often largely incompatible analyses is the unwavering conviction that premodern allegiances of the type that sectarianism exemplifies might flare up and cause a great deal of trouble now and then but have nonetheless already had their ticket to the dustbin of history validated. In other words, almost anywhere on the squishy bog that is the contemporary intellectual landscape—right or left, technocratic or legalistic, unilateralist or post-national—we will most likely find ourselves sinking into the muck of one modern iteration or another of Hegel’s ideas about universal history and the theodicy that accompanied it.

That the Greeks, or, indeed, great Renaissance political thinkers like Machiavelli and Guicciardini, who essentially believed that history was a series of cycles, not a progress, would have laughed at such a crude account is worth pointing out, if only as an at least partial vindication of the intuition of educated pagans, like Celsus in the Roman empire, that Christianity was not going to be an intellectual improvement over Greek philosophy. But as a faith, Christianity can hold that we are progressing toward a day of final judgment and the end of history without having to provide empirical grounds for its claims. The adherents of secular progress narratives, however, can plead no such justification. To the contrary, a Fukuyama, or the contemporary ideologues of the global human rights movement—in many ways, the most powerful and influential of all contemporary utopian progress narratives—insist that however much their views have a component of hope, their confidence in the rightness of these views is based on the empirical evidence.

Nothing could be further from the truth. For if empirical criteria and historical evidence are the grounds on which intelligent people base their understanding of which direction the world is going in, it is surely just as plausible to claim that sectarianism has been as defining of modern history as universalization has been, and even that, at least in some parts of the world, sectarianism, in a broad sense that would include tribal, ethnic, racial, religious, and, within particular religions, confessional identities and loyalties, has played as central a role as globalization. The fact that this does not seem obvious—or, more precisely, that, at least in the West, the political and intellectual ruling classes seem to have to “rediscover” sectarianism every time it crops up (which, of course, is often) only to forget about it again until the next crisis imposes yet another rude awakening—is a recurring mystery of modern world affairs.

The US involvement in Iraq has been a textbook illustration of this syndrome. Most people who have looked at the question would agree that if there is a single criticism of the US invasion of Iraq in 2003 that even remaining defenders of the decision to overthrow Saddam Hussein would accept, it is that those who planned and carried out the operation underestimated the power and tenacity of sectarianism. This is scarcely what they intended, but then, by now, there is a long history of “cosmopolitan,” universally minded great powers blundering into conflicts in which none of the belligerents feel that there is anything historically doomed or atavistic about fighting in the name of race, ethnicity, religion, confession, or tribe. Some defenders of these interventions hew to the idea that the problem is not the universalizing assumption, but rather the lack of follow-through, which usually turns out to mean the lack of imperial determination to stamp out the atavism in question. The example most frequently cited is the largely successful abolition of suttee under the British Raj in India, a reform that made Marx support Britain’s colonial project in the subcontinent, and which finds its echoes in the “human rightsist” justifications for continued Western military and NGO presence in Afghanistan that are so heavily based on the occupiers’ role in the emancipation of women there.

This starry-eyed account of the imperial “civilizing mission,” which in the American Jacksonian and neoconservative rendering of the contemporary policy debate is now supposedly being undermined by Western liberal guilt and softness, compounded by tiers-mondisant indulgence toward, if not complicity with, jihadism, is largely bunk. Ussama Makdisi, the historian who once wrote that in an important sense the history of sectarianism was in fact the history of the modern world, was surely far closer to the mark. For on the ground, whatever some idealistic imperialists (and it would be to turn history into a morality play to pretend they did not exist) and missionaries may have hoped for, the British and French colonial empires may have justified their conquests on the basis of their civilizing mission, but they governed on the basis of “divide and conquer,” which is to say, on the accentuation of sectarian and ethnic divisions where they existed at all, and their outright invention where they did not.

By playing one group off against another, the British in West Africa, the Belgians in Rwanda and Burundi, and the French in some of their Maghrebi and Sahelian colonies picked favorites among the peoples over whom they ruled. The Belgians’ racist fantasies about the superiority of the supposedly “Hamitic” (read “less black”) Tutsi over the Bantu Hutu are the most extreme example of this. (Ironically, when the tribal balance in Belgium shifted from Walloon to Fleming in the 1950s, the newly dominant Flemings switched Brussels’ support toward the Hutu, with whom Flemish missionaries identified as being, like themselves, members of an oppressed majority.) In Algeria, the French singled out the Kabyles as being more European and thus more amenable to being civilized, and accentuated by their policies the divisions between Arabic- and Berber-speaking peoples. And notably in South Africa and Nigeria, British imperial rule fomented tribal and ethnic divisions. If the modern world looks the way it does, which is to say riven by sectarian conflicts and allegiances, and if the neo-Hegelian fantasies, best known through Fukuyama’s work, about an inevitable shift of global society from the local to the universal seem so threadbare, then colonialism’s role in shaping modern sectarianism is a major part of the reason we find ourselves in the world we do.


Donald Rumsfeld famously said, “You go to war with the army you have—not the army you want or might wish to have at a later time.” It was a clever remark, even if it was, as so often with Rumsfeld, rather too self-exculpating, since the most salient question, had the Office of the Secretary of Defense not been carried away by the fantasy of cheap and easy victory in Iraq, should have been how you use the army you have. Had the senior civilian leaders at the Pentagon thought that through, rather than planning for a drawdown of US forces in Iraq before they had even reached Baghdad, the war might well have gone a great deal better for the United States. But that would have required Rumsfeld, and his colleagues in the senior echelons of the Bush administration, to recognize that you go to war in the world you live in, not the one you might wish you lived in, and much of that world, including, of course, the Islamic Middle East, was ineradicably sectarian. To declare an interest: my own view is that even had the war in Iraq, and, above all, the postwar period, been prosecuted with more foresight, competence, and realism, it would still not have been worth fighting, on either strategic or moral grounds. But to be fair, the outcome was obviously not the one Washington had sought. Whatever else can be said about the Bush administration’s intentions, the United States did not overthrow Saddam Hussein in order to install the sectarian Shia government that now controls all of Iraq outside of Kurdistan, a region whose de facto independence is itself testimony to the power of sectarianism, though in this case of a sectarianism more to Washington’s liking.

To the contrary, the idea within the Bush White House, Vice President Cheney’s office, and at least some elements of Donald Rumsfeld’s Pentagon and Condoleezza Rice’s National Security Council was that, with Saddam Hussein gone, a non-sectarian democracy could be brought into being—proof, as if proof were needed, that wishful thinking has always been America’s abiding sin. In fairness, Iraq was special in many ways, and the US decision to go to war there a perfect storm. There were those within the administration, notably Paul Wolfowitz (in my view, by far the most honorably idealistic prowar voice within the senior leadership), who were genuinely persuaded that democracy would follow the dictator’s fall. There were others who allowed themselves to be persuaded of this by both Wolfowitz and those aligned with him, or by Iraqi exiles like Ahmed Chalabi and Kanan Makiya (this included not just supporters of the administration but influential liberals, like Michael Ignatieff, with their vision of the US as the global enforcer of human rights norms). Some believed in the existence of weapons of mass destruction, or, at least, that what they viewed as the inevitable collapse of the sanctions regime would eventually embolden Hussein to restart his weapons programs. Still others, most prominently Vice President Cheney, were thought to have felt that the invasion of Afghanistan had not fully restored the global sense that America could not be defied with impunity, while still others, though they were never as central to the decision making as many of the war’s hard-left opponents imagined, indeed thought the US had a geostrategic interest in having privileged access to Iraqi oil. And at least some believed that the overthrow of Hussein would tilt the scales toward America’s Israeli ally, by eliminating one of the Jewish state’s most tenacious and, more importantly, richest enemies, or at least unblock the Middle East logjam in a way that would be to Israel’s advantage.

Moreover, the proximate justification for the war was not democracy but rather the existence of weapons of mass destruction, though later Paul Wolfowitz would concede that the administration had settled on WMD as the “core” reason to go to war because it had been “the one issue everyone could agree on.” Even the Bush administration did not imagine it could undertake a preemptive war in the name of democracy. But when no weapons of mass destruction were in fact found, the Bush administration fell back on the “democracy at the point of a gun” argument. Given the early mistakes made by the American interim administration led by L. Paul Bremer—an official with no serious experience of or expertise in Islam or the Middle East—which involved demobilizing the Iraqi army without making any provision for the cashiered soldiers’ economic survival, and the discovery that the supposedly non-sectarian Iraqi exile leaders—notably Ahmed Chalabi, who had so beguiled official Washington in the run-up to the invasion—had no domestic constituencies to speak of, even a more historically and politically literate overseer might not have rescued the American project in post-Hussein Iraq. But the failure to understand fundamental Iraqi political realities almost certainly doomed the democracy project, at least insofar as democracy was understood in Washington as being synonymous with American-style liberal democracy. For as Vali Nasr observed in an influential postmortem essay he wrote for Foreign Affairs in 2006, “The Bush administration thought of politics as the relationship between individuals and the state, and so it failed to recognize that people in the Middle East see politics also as the balance of power among communities.”


What Nasr said about Iraq is, in reality, overwhelmingly the global rule, not some atavistic Middle Eastern exception. This should have been clear long before the US occupied Iraq in 2003; that it is still not clear even after Iraq is inexcusable. Yet—as shown by the NATO intervention in Libya that overthrew Muammar el-Qaddafi and now seems to have opened the floodgates to a tribal struggle for power that may very well have already caused the partition of Mali along ethnic lines (the Bambaras in the south against the Tuaregs in the north) and may yet lead to a de facto partition of Libya itself on a similar basis; the irresponsible talk about intervening in Darfur, as if the sectarian dimension to that conflict were not central to it; and, above all, the reality that the open wound that is Afghanistan is first and foremost a Pashtun insurgency, as much as or more than it is Taliban-style jihadism—we continue to believe (and the credo is wholly bipartisan) in the viability, and, worse, the morality, of military interventions whose aftereffects we are simply not willing to think about too deeply. The fact that terrible crimes continue in one given country after another (as indeed they do; nothing I am writing here should be interpreted as denying that reality), in short, that “something must be done,” continues to blind us to the question the Bush administration should have asked itself about Iraq but chose to finesse: “We intervene, but then what happens?” There is a French expression, “On s’engage et puis on voit” (“One commits oneself and then one sees what happens”), that was said to be Napoleon’s favorite. In the end, it landed him on the Berezina.
It would be better, for the West and for the world as a whole, if the United States did not follow in the emperor’s footsteps; if it did not imagine, as neoconservatives so often do, that national will and a sense of purpose are sufficient, or, as the human rightsists both outside and within the Obama administration do, that the global rule of law is an inevitability and that the salient question now is not whether but when powerful states will live up to their responsibilities so that it happens sooner rather than later.

But the calls for an intervention in Syria, where even supporters of toppling the Assad regime cannot credibly deny the sectarian reality of politics there, suggest that we have not only learned nothing from the interventions of the past twenty years, but are in fact incapable of learning anything, and, instead, are condemned to repeat the same mistakes over and over again, rationalizing our failures by extolling our good intentions and brandishing our consciences like cudgels. But perhaps this should not surprise us. After all, we live in a time in which the United States, the largest debtor nation in the history of the world, crafts its foreign policy as if it were still the great creditor nation it was a generation ago, and bases much of its foreign policy on ambitious programs of human rights campaigning and democracy “promotion” that might have made sense at the zenith of the Enlightenment but which have little chance of working in the Enlightenment’s aftermath. But while all this reassuring and self-flattering dreaming goes on, a rising tide of authoritarian illiberal capitalism from northeast Asia, and of sectarianism in what for lack of a better word we rather over-optimistically call the developing world, and—who knows?—perhaps in Western Europe too, if things continue on their present course, intrudes like a civilizational memento mori.

David Rieff is a journalist and author. He is finishing a book on the global food crisis.