Why Did the United States Enter World War I?
World War I was a global conflict that began with a dispute between Austria-Hungary and Serbia and expanded to draw in France, Germany, Great Britain, Italy and the United States, among many other nations. The entry of the United States into the war was a deciding factor in the Allied victory.
1 Why Did We Have World War I?
World War I was a conflict that had been brewing for some time, but the immediate trigger for the declarations of war was the assassination of Austro-Hungarian Archduke Franz Ferdinand by Gavrilo Princip, a Bosnian Serb student and member of the Young Bosnia movement. Bosnia and Herzegovina had been annexed by Austria-Hungary, and many of its Serb inhabitants wanted to join Serbia.
The assassination and Austria-Hungary's subsequent declaration of war on Serbia led to a wider conflict because of Russia's mutual defense alliance with Serbia: the two countries had agreed to defend one another if either was attacked, so the Austro-Hungarian attack on Serbia prompted Russian retaliation. Germany and Austria-Hungary had a similar arrangement, and when Russia moved against Austria-Hungary, Germany was pulled into the conflict.
France was then drawn in through its alliance with Russia, and Germany's attack on France through neutral Belgium brought Great Britain, and eventually Japan, into the war. Deeper causes had also fomented discontent among these powers, including imperialism and contested control of territory and resources in Africa and the Far East.
2 Why Did the United States Decide to Enter World War I?
After the outbreak of war in Europe in 1914, the United States, which had no mutual defense alliance with any of the belligerents, declared its neutrality. Neutrality was a popular position in a nation that had no appetite for a global conflict.
Great Britain, however, was a close trading partner of the United States, and when German submarines began to attack nonmilitary ships around the British Isles, the United States warned Germany that attacks on nonmilitary vessels in the region could bring about American retaliation. Although Germany pledged to ensure the safety of passengers before launching any attacks, it struck a number of ocean liners, including the Lusitania in 1915, and sank several U.S.-owned merchant ships. After Germany resumed unrestricted submarine warfare in early 1917, the United States entered the conflict, and American troops, money and industrial capacity proved a decisive factor in the Allied victory.
3 What Happened After World War I Ended?
In the aftermath of Germany's surrender in 1918, there were numerous political and social changes. The Treaty of Versailles, signed in 1919, required Germany to pay reparations to the Allied nations and made provisions for the governance and economic future of contested parts of Europe.
Among the most critical were the circumstances in Germany, which had been devastated by the conflict and which eventually led to the rise of Hitler and the Second World War. An unstable young democracy, a tremendous loss of population and a chaotic political landscape left a vacuum in leadership, allowing the rise to power of the far-right National Socialist German Workers' Party, known as the Nazis. The Nazis used what many Germans perceived as the humiliating terms of the Treaty of Versailles to convince the nation that it was being abused and kept from power that was rightfully its own.
Great Britain had been weakened economically, and although reparations helped to offset some of the damage to the nation's wealth, the war had been socially devastating to its citizens. Austria-Hungary dissolved into the separate nations of Austria and Hungary and exiled the Habsburg family. Serbia, Bosnia and Herzegovina, and other South Slav lands joined to form what would become Yugoslavia.