The Faith that Supports U.S. Violence


In the second year of the U.S. occupation of Iraq, many people in the U.S. still tend to think of the United States not as the empire that it is, but as the Promised Land, the embodiment of Western virtue, the incarnation of “freedom and democracy.” Therefore, in pursuit of a presidential “war on terrorism,” everything is permitted. From starting a war to setting aside the rules of warfare, the U.S. is entitled to do what it wants when it wants, provided the action can be justified in terms of saving U.S. lives. 

Whether we call this outlook superpower or chosenness syndrome, national essentialism or millennialism, at its root lies “the belief that [U.S.] history, under divine guidance, will bring about the triumph of Christian principles” and eventually the emergence of “a holy utopia.” Such belief in the unique moral destiny of the U.S. may be held independent of Christian principles. Its historical origins, however, trace back to colonial New England, and even further to the Bible. It is omnipresent in every part of the country, though its strongest regional base lies in the South and West. 

Since long before the birth of the Republic, ideas of “chosenness” have been at the heart of a complex ideology of rule that has resonated powerfully in U.S. society. Both Massachusetts Bay Colony Puritans and early 19th century Protestant millenarians conceived of the United States as an exceptional nation, chosen by God to be the acme of freedom and to redeem humankind. As historian Ernest Tuveson observed during the Vietnam War era, the idea of the “redeemer nation” through which God operates is also the foundation of the notion of continuous warfare between “good” and “evil” people. Virtually every politician who exploits the religious emotions of people in the U.S. for the purpose of waging war consistently draws on these ideas and images, embodied in religious and secular texts. 

Today no single millenarian ideology exists but rather a complex religious spectrum in which ideas of chosen people and visions of the United States as God’s model of the world’s future figure prominently. But just as in the past, these ideas link directly to the apocalyptic “defining moment,” in which the leaders at the top of society summon the people to fulfill some sacred mission of redemption, or to play some new role of universal significance for the sake of humanity. Usually, the key moments occur when the president, for his own political purposes, declares them. In periods of great economic or political change or national crises, when the community is beset with fear, presidents have been instrumental in situating the crisis, establishing its conditions, and pointing toward solutions. At such times, too, millenarianism can generate support for policies of imperialism and war or for advancing democratic ideals in the process of overcoming enemies. 

In the 18th and 19th centuries, politicians repeatedly used different forms of this messianic national faith to justify killing Native Americans, conquering Mexicans, and taking over the continent. In the 20th century they used it to overcome “isolationism” and construct a global empire. Economic greed, the blind ambition of leaders, racial superiority, and the desire to establish relations of domination remained the real justifications for killing, but invariably the civil religion concealed these more important, baser motives. Through over 200 years of expansion, belief in “Americans” as the Chosen People has reigned, enabling U.S. political leaders to wage war more or less at will, and to feel intensely righteous in threatening others, yet never allowing others to call them to account.

For the past four years President George W. Bush, his top foreign policy advisers and their aides, have carried religious Manicheanism to new levels. They have trampled on the U.S. constitution, violated international law, and turned nationalism in a more authoritarian, “potentially fascistic” direction. Religious conviction did not drive them to do any of these things. But for reasons of domestic politics and their (Congressionally unsupervised) control of huge military forces, Bush chose to do them while posturing about God. 

In January 2001, in his first inaugural address, Bush suggested that God operated through the people of the U.S. to achieve His purpose. Although Bush was using his speech writer’s expressions, they overlapped with his own sense of the world divided between warring powers of good and evil. Two years later, March 19-20, 2003, Bush started his second colonial war (Afghanistan being the first). He attacked without provocation the sovereign state of Iraq, which had already been crucially weakened through a decade of UN economic sanctions and posed no imminent threat to the U.S. or any other state. The war, launched on grounds of fear, was in clear violation of the UN and Nuremberg Charters and the U.S. Constitution, which gives no president the power to wage “anticipatory” or “preventive” war absent real, imminent threat. 

U.S. leaders created an Iraq “threat,” then wrapped themselves in the mantle of their historic millenarian creed in order to justify waging an immoral, illegal war to eliminate it. Many citizens, conditioned to imagine themselves part of the “redeemer nation,” and duped by their government and their televisions, supported “Operation Iraqi Freedom” and are now experiencing the collapse of their idealized version of what the U.S. means to the world. One year after Bush’s victory declaration, the Iraqi nationalist resistance has stretched the U.S. military to its limit and frustrated U.S. millennialist expectations. Rather than putting an end to terrorism, Bush and Tony Blair have spread the danger and made their citizens objects of hatred, revulsion, and reprisal throughout the Middle East. 

More importantly, in the course of waging Bush’s “war against terrorism,” U.S. military forces have committed systemic human rights abuses that qualify, under Nuremberg principles and later international treaties, as “crimes against humanity.” From Afghanistan to Iraq, they have directly attacked and brutalized civilian populations, and imposed upon them collective punishments. From the prison cages of Guantanamo to Bagram air base in Afghanistan, from Abu Ghraib prison and more than a dozen other Iraqi detention facilities (Al Qaim, Asad, Mosul, etc.) and beyond in the Pentagon’s labyrinthine gulag of prisons, the charges steadily mount: murder and rape; widespread, officially condoned infliction of torture; degrading treatment of prisoners of war and “security detainees” of all ages. Up and down the military chain of command the evidence accumulates of cover-ups, acts of omission, professional negligence, and dereliction of duty. 

In this situation, the U.S. state is liable for violations of the 1949 Geneva Conventions, the 1984 UN Convention Against Torture, and the charters of the ad hoc UN criminal tribunals for the former Yugoslavia and Rwanda. Individual liability accrues especially to Bush and Rumsfeld, who approved the criminal policies establishing the U.S. global torture system. Bush, whose “razor-sharp distinction of the ‘good guys’ and the ‘bad guys’…filtered down the ranks,” bears primary responsibility for creating the climate which condoned his subordinates’ torture of detainees. This is why he, Rumsfeld, Undersecretary of Defense for Intelligence Stephen Cambone, Iraq ground force commander Lt. Gen. Ricardo Sanchez, and others in the military and civilian chains of command can only go on lying, covering up, stonewalling—and hope that enough of the U.S. public is sufficiently millenarian and misinformed on the relevant facts not to care about their policies, or whether they go unpunished for them. 

What needs to be explained here is how the dangerous myth of the United States as a special nation, the sole superpower destined to reign militarily supreme over all others, was rejuvenated and influenced this outcome. 

 

The “American” Century 

When the defeat of Germany and Japan left the United States with a nearly global empire, many in the U.S. saw further confirmation of their own uniqueness: victory had proven them morally superior not just to their defeated enemies but to all nations. The triumphant U.S. had destroyed the German and Japanese empires, and through occupation reforms was in the process of lancing the poison of Nazism in Germany and the emperor ideology and militarism of Japan’s political culture. Unsurprisingly, the victor retained intact its own virulent civic religion. 

Victory in World War II and the early emergence of an uncooperative Soviet Union compelled U.S. leaders to reset their national goals. The process of defining both a new enemy and another altruistic mission to democratize the world afforded rich opportunities for religious nationalism, involving citizens in a new epic struggle between good and evil.

Two early post-World War II events reinforced popular belief in the U.S. as Captain America, the peerless nation with a moral imperative to set the world aright. One was the government’s exclusive possession of a stockpile of doomsday weapons and the bombers to deliver them; the other, the first international war crimes trials. To most in the U.S., the exclusive possession of nuclear weapons was a fitting capstone to the great victory over the Axis. People in other countries saw the nuclear destruction of two Japanese cities as indicative of a military power so vast as to be able to hold humankind itself captive to whatever geopolitical design for the planet that Washington policy-makers sought to advance. 

Helping to offset such fears, at least in the eyes of most in the U.S., were the judicial actions taken against German and Japanese war leaders between 1946 and 1949. Both events may be assessed in the same context of messianic yearning for a better world that came with the ending of the war. The ad hoc trials, for example, were generally seen as demonstrating that Washington was using its new God-like power to enforce the rule of law. At Nuremberg, the proceedings went forward on the basis of an international agreement, signed August 8, 1945, two days after the U.S. dropped the first atomic bomb on Hiroshima (a largely non-military target) virtually obliterating the entire city. Chief prosecutor Robert H. Jackson managed to have the Charter of the International Military Tribunal reflect his broad vision of aggression against other states as the “supreme crime” in international law. The ensuing courtroom drama, however, focused increasingly on the German genocide of European Jewry, inadvertently shifting attention from “aggression” to “crimes against humanity” and conventional war crimes.  

Thanks partly to the way that Jackson framed the Charter for the European war, the legal categories of “crimes against humanity” and “war crimes” posed little threat to the U.S. and its “Nuremberg principles,” allowing the U.S. and its allies to achieve impunity for the urban bombings that culminated in the nuclear annihilation of Hiroshima and Nagasaki. On some charges, Jackson’s definitions even helped acquit many Nazis, including the admiral who had led Germany’s unrestricted submarine warfare in the Atlantic. Nevertheless, the German trials were the product of a multilateral negotiating process among the four Allied governments occupying Germany. The same could not be said of the Tokyo trial, held against the backdrop of U.S. Navy control of the entire Pacific and near total U.S. domination of Japan. 

Gen. Douglas MacArthur, the tribunal’s sole convening authority, was intent on preventing war responsibility from being attributed to the emperor and avoiding any international adjudication of Japan’s chemical and biological warfare in China. But he was also concerned that the United States bear no legal or moral liability for its deliberate mass extermination of civilians by conventional and atomic bombing. When a lawyer for the defense at the Tokyo trial attempted to do that by “setting Japanese civilian losses at Hiroshima and Nagasaki against Chinese civilian losses,” the judges overruled him. Nonetheless, both the unaddressed question of U.S. war crimes and Emperor Hirohito’s absence from the trial hung over the proceedings. 

The Allied powers had bombed methodically, as had Germany and Japan at a much earlier date. But the U.S. had gone a step further, probably killing more Japanese civilians in five months of terror bombing than the estimated deaths from five years’ bombing of Germany. When a lawyer for one of the Japanese defendants tried to introduce Secretary of War Henry Stimson’s apologia for President Harry S. Truman’s decision to drop atomic bombs on Hiroshima and Nagasaki, the bench rejected the evidence. 

In the end, U.S. leaders avoided a real public debate on the policy of developing and using nuclear weapons. The world’s only nuclear power at the time had set a new standard of killing the innocent, but its leaders had more pressing concerns than addressing the new means of warfare. They pretended that they did not see the irony of the Allied side resorting to “forms of total war far in excess of what the aggressor Axis powers might have imagined.”

President Harry Truman saw himself as a crusader for freedom and against evil, which in practice meant against any nation that refused to exist in a subordinate relationship to the U.S. As he embarked on fighting the Cold War and launching the arms race, Truman did more than create a “mission.” Recognizing that he needed the sanction of religion to generate popular support for a policy of global anti-communist military intervention, Truman tried to enlist Christian churches in a crusade against godless and atheistic Marxism. 

First to be enlisted was the anti-Semitic Pius XII. Truman next tried, unsuccessfully, to mobilize the Protestant leaders of the World Council of Churches in a “religious anti-communist front” against the Soviet Union. He failed because he completely ignored the historic schism in Christianity between the Catholic Church and Protestant denominations. Nevertheless, thereafter for over 40 years, politicians and foreign policy elites managed to sell the Cold War to the U.S. public as one of history’s greatest religious crusades—“a global conflict between the god-fearing [Christians] and the godless [Communists].” 

In the wake of the Nixon and Ford presidencies, and the failure to indict for war crimes any senior leader of the Vietnam War, Jimmy Carter came into office aspiring to be a peace president. A born-again evangelical Christian, he talked in vague terms about the moral failures of his predecessors and his own human rights agenda for U.S. foreign policy. But, after failing to rein in the Pentagon, the State Department, or the CIA, his Administration quickly resumed business as usual with murderous regimes in Latin America, Iran, and Afghanistan. Carter even ended up extending the Monroe Doctrine to the oil-rich states of the Persian Gulf. After he set the example of connecting with voters by constantly evoking God, the Bible, and his own religious beliefs, the game of national politics degenerated further. 

When Ronald Reagan became president in 1981, the millenarian tendency in its toxic expression reemerged. During the 1980 election campaign, Reagan, and various conservatives in Congress, looked for a political pitch that would appeal to the least sophisticated voters. They decided to moralize the election and in so doing gave the Christian right a respectability, visibility, and political clout that it had not previously enjoyed. In short, a coalition of rightist secularists successfully organized right-wing Protestants, Catholics, and Jews. 

Once in office, Reagan invoked for the U.S. public John Winthrop’s vision of the new Chosen people covenanted with God. Winthrop had sanctioned the Puritans to live in the “shining city on a hill” and be a model for humanity to imitate. Reagan set about organizing “traditional Protestant interests in fundamentalist religion, censorship, and stricter divorce and anti-abortion laws”—all in contrast to the politics of the 1960s and 1970s, which had “denounced the small-town mentality of Puritanism.” 

During the early 1980s, Cold War hatreds and the bipolar vision of the world which sustained them were waning. Reagan tried to reverse these trends by calling the nation back to the 1950s and early 1960s. He re-labeled the Soviet Union “the evil empire” and inaugurated a new round of investment in nuclear weapons and missiles. John F. Kennedy, who had started the Vietnam War and accelerated the nuclear arms race, offered Reagan a model. Reagan’s massive peacetime military build-up and his treatment of third-world nations followed Kennedy precedents. 

Reagan declared a “war on terrorism,” aimed mainly at suppressing the rise of independent nationalism in tiny countries like Nicaragua. His first-term offensive against the Soviet Union, which unfolded in the form of a nuclear arms and missile race, had the unintended effect of exacerbating tension between Soviet reactionaries and reformers. Militant Republican Protestantism easily rationalized both endeavors.

The ending of the ideological Cold War and the unexpected collapse of the Soviet Union in 1991, largely for internal reasons, reinforced the inevitable victory claims of those who saw themselves as the chosen people. Various advocacy groups of the neo-conservative movement put pressure on the White House to take a more confrontational approach to regimes like Saddam Hussein’s and not to worry about allies. They sought a bigger role for the military and the development of newer, more dangerous weaponry that no other nation could ever hope to compete against. Redemption rhetoric grew stronger while senior officials of the Pentagon went on increasing their influence in national policy decision-making. 

After U.S. leaders were free to operate in a world without military rivals, the most hawkish of the geopolitical planners—the secularists who proudly called themselves neo-conservatives—pondered how to construct an international order that would insure permanent U.S. domination of the planet. They focused on overthrowing foreign governments that still defied U.S. power, especially in regions of great strategic value like the Persian Gulf and Iraq. By the time of the Kosovo War in 1999, the radical neo-cons had targeted both the UN Security Council, which had refused to explicitly authorize the war, and the principles of (external) sovereignty and equality among states, on which rested the entire international legal order. 

The messianic spirit in U.S. foreign policy gained strength from the short wars that Presidents Bush and Clinton waged during the 1990s against weak, impoverished states. Confronting the growing economic importance of Western Europe and Japan, Clinton understood that the one dimension in which the U.S. reigned supreme was military power. So he “turned to the military to help manage world affairs” and afterwards never failed to support the Pentagon’s “insatiable demands” for resources. By the time Clinton left office, all expectations of a post-Cold War “peace dividend” had been dashed, thanks to the financial burdens that unacknowledged empire continued to necessitate. 

John R. Bolton of the right-wing American Enterprise Institute, soon to be appointed by presidential candidate George W. Bush as an undersecretary of state, and Condoleezza Rice, a born-again Christian whom Bush, two years earlier, had tapped to be his foreign policy advisor, hinted in 2000 at what lay ahead: the advent of unabashed unilateralism supported by a resurgent moralism. 

Bolton, writing in an academic law journal, targeted the very idea of a system of international law based on legitimate sources of authority. Law, he argued, had only rhetorical and political value. Because of its unique status, the United States could not be “legally bound” or constrained in any way by its international treaty obligations. The U.S. needed to “be unashamed, unapologetic, uncompromising American constitutional hegemonists,” so that their “senior decision makers” could be free to use force unilaterally.  

Rice, appearing in the January/February 2000 issue of Foreign Affairs, was equally contemptuous of international law. She claimed that in the pursuit of its national security the United States no longer needed to be guided by “notions of international law and norms” or “institutions like the United Nations” because it was “on the right side of history.” The “right side,” of course, meant “God’s side” and was a clear intimation that should they capture the White House, key elements of the Republican Party might try to discard the Westphalian concept of formal equality of states in order to insure that the U.S. remained globally dominant forever, no matter how much the world changed. 

By the time the Republican Party took over the presidency and both houses of Congress in 2000, the conditions were in place. Right-wing extremists set about implementing a version of international order based on their ignorance of history, their scorn for the ideas of others, and their unquestioned belief in U.S. destiny to rule the world. In this sense, Bush’s elevation to the presidency was an event of pivotal importance, for it inaugurated an era in which the U.S. would withdraw from human rights treaties and norms and conduct its foreign policy as an unapologetic hegemon, disdainful of international law, the interests of other states, and the concerns of other peoples.

The power of U.S. corporations, casting a long shadow over state and federal government, indirectly prepared the ground, while long-term demographic trends in the U.S. South and West also contributed to the restructuring of power at the congressional level. From 1964 onward, precisely because President Johnson signed the Civil Rights Act, the South became potentially the largest political region for the Republicans. Over the next three decades, white (and, to a lesser extent, black) migration from the north, combined with a realignment of political support, transformed the 11 states of the old Confederacy into the largest region in the nation, and an integral part of the Republican electoral coalition. 

Interestingly, these were also the Bible belt states marked by distinctive, if diffuse, religious cultures. Here flourished the worldview of tens of millions of Bible “literalists,” “conservative fundamentalists,” and “millenarians.” The literalists are people who believe in a literal interpretation of scripture. The conservative fundamentalists, at odds with certain principles of modern science, are focused on a catastrophic end-time struggle between the forces of good and evil; their mind-set is particularly supportive of moral crusades, patriarchalism, and militarism. The “millenarians” run the gamut—they are both blacks and whites, this-worldly and other-worldly, liberals opposed to Bush as well as conservatives who support him with religious fervor. 

Concurrent to the rise of the South, Western states also experienced a significant population increase, at the expense of California. Many of them transformed into strongholds of right-wing conservatism and anti-minority prejudice. Conservative westerners chafed at federal government regulation and were deeply resentful of “presumed domination by the East.” In this region the exaltation of “America” and support for fundamentalism, militarism, and conservative causes were particularly strong, which is one reason why the West became a major source of money for the Republican Party. 

By 1994, when this process of demographic change was reaching full force, “aggressively conservative” white Southerners controlled virtually all the key leadership positions in the House of Representatives. The ideas of conservative whites—core supporters of the Republican Party, which westerners like Barry Goldwater and Ronald Reagan reinvented by exploiting racism and bigotry—dominated the nation’s political discourse. Republicans and Democrats alike embraced the notion that the U.S. had a God-given mission to spread its values, promote its corporate interests, and establish its military presence everywhere. 

Over the next six years, the rightward shift of U.S. political institutions, which began with Reagan, peaked. Influenced by the nation’s new status as the world’s sole superpower, rabid interventionism became publicly acceptable. Nevertheless, the public still had not come around to favoring the unilateralist, confrontational approach to world problems that neo-conservative ideologues and militarists had been developing since the Cold War ended.

In December 2000, the conservative Supreme Court selected as president the candidate with the hidden past and right-wing temperament who had failed to win a majority of the popular vote. The born-again George W. Bush, like the Reaganites he gathered around him—Rumsfeld, Cheney, Powell, and their minions—imagined he could transform the international order without having to forge common understandings with allies. A few days after the inauguration, at his first National Security Council meeting, he made regime change in Iraq his top, secret priority. For the next eight months, Bush and his “team” floundered around, beset with thoughts of Iraq and indifferent to al-Qaeda, the most serious security problem confronting the United States. Meanwhile their unilateralist rhetoric and repeated treaty pullouts were arousing the world’s concern. 

Then on September 11, 2001, a small group of mainly Saudi terrorists destroyed the World Trade Center, symbol of U.S. financial power, and damaged the Pentagon, icon of U.S. militarism. Over 2,800 were killed. Disbelief at the attack, followed by feelings of vulnerability and anger, spread widely. The next day, after Bush had told his advisers gathered in the White House’s emergency operations center that “nothing else” but war matters, “Rumsfeld noted that international law allowed the use of force only to prevent future attacks and not for retribution.” Whereupon Bush yelled, “No. I don’t care what the international lawyers say, we are going to kick some ass.” 

Bush laid out his Administration’s response when he vowed before Congress (September 20, 2001) to destroy “every terrorist group of global reach.” Depicting the problem in theological terms as a fight between good and evil, he declared, “Every nation in every region now has a decision to make: either you are with us or you are with the terrorists.” That evening, at a private White House dinner, Bush asked British Prime Minister Tony Blair to support his removal of Iraq’s Saddam Hussein from power. Britain, the other nation with a “Chosen People paradigm,” became Bush’s first ally for war on Iraq. 

During the two years before the Administration launched its unprovoked attack on Iraq, few in the U.S. questioned their government’s conduct of the first Bush war in Afghanistan. Denied the facts needed to assess the situation and constantly misinformed by their government and the corporate mass media, they easily swallowed the Big Lie that was the Administration’s main justification for its policy of regime change in Iraq. When Hussein’s alleged stockpiles of weapons of mass destruction proved illusory, the neo-cons shifted to arguing that (in George Bush’s words) the U.S. had an historic duty to use their “wonder-working power” to spread “democracy” and “freedom” to countries whose peoples were deemed incapable of deciding for themselves their best political arrangements. This was the same post-Cold War, Wilsonian argument that the elder Bush had deployed in invading Panama (December 1989) immediately after the fall of the Berlin Wall. 

In today’s quasi-wartime environment, Bush continues to pursue a fraudulent “crusade,” “calling” to “rid the world of evil.” He lards his speeches with religious rhetoric and aggressively woos religious groups, a key part of his electoral base. He and his circle spread messianic nationalist myths and lie with astonishing persistence and clarity. 

Until recently, roughly half the nation, including many who are among the least informed or the most misinformed about social and political realities, still gave tacit support to his domestic and foreign policy agendas. Popularity polls indicate that this may be changing quickly as more of the public perceive the consequences of his failed policies in Iraq and Central Asia. Nevertheless, Bush still retains the loyalty of his natural constituents and even benefits from the fake “alliance” of convenience between religious fringe groups: specifically, conservative evangelical Christians who see in the birth and growth of Israel the fulfillment of biblical prophecy leading to the second coming of Christ and Jewish fundamentalists who, equally lunatic, also anticipate the arrival of the “messiah” once all of biblical Israel has been “reclaimed” from the Palestinians. 

Millenarianism in the G.W. Bush era is the historic expression of a resurgent imperialism asserting its Puritan and evangelical Christian roots while lashing about in its political weakness. The Bushite practice of millenarian politics is nothing but posturing about God, religion, the patriarchal family, and “American values” by calculating “realists,” militarists, and Christian rightists who care mainly about spreading U.S. dominion. The neo-conservative quest for global domination through preventive war and other bold assertions of U.S. power found its stride during the second Bush presidency, and in less than three years managed to spread chaos throughout the world. With another national election in the offing, Bush can neither conceal the abject failure of his Administration’s policies nor prevent the U.S. people from increasingly perceiving him as a fundamentally untrustworthy, incompetent leader.


Herbert Bix teaches at Binghamton University, New York, and is the author of Hirohito and the Making of Modern Japan, which won the Pulitzer Prize.