Coming to Terms With the American Empire

“Coming to Terms With the American Empire” is republished with permission of Stratfor.

By George Friedman

“Empire” is a dirty word. Considering the behavior of many empires, that is not unreasonable. But empire is also simply a description of a condition, one that is often unplanned and rarely intended. It is a condition that arises from a massive imbalance of power. Indeed, the empires created on purpose, such as Napoleonic France and Nazi Germany, have rarely lasted. Most empires do not plan to become one. They become one and then realize what they are. Sometimes they do not realize what they are for a long time, and that failure to see reality can have massive consequences.

World War II and the Birth of an Empire

The United States became an empire in 1945. It is true that in the Spanish-American War, the United States intentionally took control of the Philippines and Cuba. It is also true that it began thinking of itself as an empire, but it really was not. Cuba and the Philippines were the fantasy of empire, and this illusion dissolved during World War I, the subsequent period of isolationism and the Great Depression.

The genuine American empire that emerged thereafter was a byproduct of other events. There was no great conspiracy. In some ways, the circumstances of its creation made it more powerful. The dynamic of World War II led to the collapse of the European Peninsula and its occupation by the Soviets and the Americans. The same dynamic led to the occupation of Japan and its direct governance by the United States as a de facto colony, with Gen. Douglas MacArthur as viceroy.

The United States found itself with an extraordinary empire, one it fully intended to abandon. This was a genuine wish and not mere propaganda. First, the United States was the first anti-imperial project in modernity. It opposed empire in principle. More important, this empire was a drain on American resources and not a source of wealth. World War II had shattered both Japan and Western Europe. The United States gained little or no economic advantage in holding on to these countries. Finally, the United States ended World War II largely untouched by war, as perhaps one of the few countries that profited from it. The money was to be made in the United States, not in the empire. The troops and the generals wanted to go home.