
    American Foreign Policy and Isolationism in the 30s and 40s


    What was the American foreign policy in the 1930s, and what were the arguments in favour of isolationism? How and why did America's isolationist stance erode entering into the 1940s?
    How did American foreign policy goals shape the American approach to the war?

    Background for this discussion, including some resources, is provided below.

    © BrainMass Inc. brainmass.com December 24, 2021, 11:16 pm


    SOLUTION

    The Second World War represents a turning point in American foreign affairs, and it is perhaps hard for us to understand why the US took so long to take effective action against the Axis Powers.

    c. Lindbergh, C. (1941, Sept. 11). Des Moines speech. Retrieved from http://www.pbs.org/wgbh/amex/lindbergh/filmmore/reference/primary/desmoinesspeech.html
    d. United States Congress. (1936, Feb. 24). The Nye report. Retrieved from https://www.mtholyoke.edu/acad/intrel/nye.htm

    America did not engage heavily in foreign affairs prior to WW2 because it had not yet emerged as a world superpower. At the beginning of the 20th century, the traditional empires still held sway over much of the globe, with Great Britain, France, and other European imperial powers exercising enormous influence over their colonies. America was content to rise through slow progression and to amass its wealth, but it was equally content to remain separate from the colonial and imperial wars that had plagued Europe for centuries. Separated by two oceans, America wanted to stay out of these wars and did not advocate engaging in the same type of empire-building imperialism that the European powers pursued. This isn't to say that America didn't engage in imperialism or "manifest destiny," but America did refrain from the posturing and warring that was evident among the European empires.

    The First World War was a defining moment in breaking the former empires' stranglehold on power, as several empires were dissolved as a result of the war, including the Ottoman Empire and Austria-Hungary, among others. The war also weakened the remaining empires, setting the stage for their near-complete destruction during the Second World War, which is largely what occurred. Countries such as France and Great Britain emerged from that conflict no longer able to sustain their empires. America emerged as the economic superpower, which in turn produced military superiority, and this is why America began to take a more active role on the international stage. Had America possessed earlier the economic and military might it held after WW2, it might have been inclined to take a more active international role and to enter the war much sooner. Because it was attempting to remain isolated from the "European troubles," America refrained from such engagement for much of its existence, until its power and influence forced it onto the international stage. Now America is the "empire" at the forefront of foreign affairs across the globe, much like the European empires before it.
