World War I brought the U.S. together and tore it apart, and it set the stage for a redefinition of American citizenship and of the U.S. role in global politics. To understand the U.S. today, one must understand “How WWI Changed America.”
When World War I broke out in Europe in 1914, President Woodrow Wilson declared the U.S. neutral. By 1917, however, Wilson was declaring that “the world must be made safe for democracy,” and he brought the nation into the war.
To influence public opinion in favor of the war, the U.S. produced films, commissioned colorful posters, published propaganda pamphlets and recruited everyday Americans to “sell the war.”
Advocates of peace argued for continued American neutrality, but opposition to the war came to be branded as a danger to the nation. That political fear, and the controversy surrounding dissent, fed the first Red Scare in the U.S.
African Americans made substantial contributions in WWI. By 1920, nearly one million African Americans had left the rural South in a movement called “The Great Migration,” which would transform the U.S.
In World War I, one out of every five soldiers in the U.S. Armed Forces was an immigrant. For some it was a path to citizenship. For the nation it proved pivotal to building a more inclusive definition of “American.”
American Indian contributions to the war effort helped win the war and, in 1924, helped win citizenship for all Native Americans in the U.S.
World War I marked the first time American women formally served in the armed forces. That service and sacrifice helped win the war and strengthened the U.S. suffrage movement, which secured the federal right to vote with the 19th Amendment.
Occurring against the backdrop of World War I, the 1918 influenza pandemic, one of the deadliest in history, added to the horrors and devastation brought on by the conflict.
The “home” that soldiers returned to was quite different from the one they had left in 1917-1918. It set the stage for the arts movements of the Roaring Twenties and for better veterans’ services by the time of World War II.