American imperialism

American imperialism is the expansion of American political, economic, cultural, media, and military influence beyond the boundaries of the United States of America. (Wikipedia)