US History Since 1876
posted by Jason.
Did the United States win the war for the Allied Powers? Discuss the American contribution to the war effort and evaluate its significance.
What do you think? Could the European nations have won the war without the U.S.?
WWI ended with the Americans having seen relatively little action.
It's clear that the brunt of the fighting in the West was borne by the British Empire/Commonwealth and the French. The Bolshevik Revolution in Russia gave Germany the opportunity to wage the war on a single front. With the entry of the U.S. into the war, the Germans felt they needed to make a serious push to end the war in 1918. German strategic thinking was that the arrival of fresh U.S. soldiers would tilt the balance of the war in the Allies' favor, so they wanted to strike before the full weight of American manpower could be felt. Looking at the results of the German offensive of 1918, it is again clear that the brunt of the attack was absorbed by Anglo-French soldiers.