In political science studies, one does not often read of American Imperialism. The term imperialism is almost exclusively associated with the colonial exploits of major powers such as Britain, France, and Germany in the West, and China and Japan in the East. Although a late joiner of the imperial club, the United States…