Do you think that the war had a positive or negative effect on American society? Think about:
How propaganda campaigns influenced people’s behavior.
The new job opportunities for African Americans and women.
How the government controlled industry.
While the First World War was raging in Europe, the United States felt the effects of the war at home. These effects were largely positive for American society, even though there were outbreaks of violence. Before the war began, women and African Americans had few job opportunities. When men were drafted into the war, women took over the jobs the men had held, and they were paid for that work. Likewise, African Americans gained access to the jobs that men had left behind when they went off to fight.
One negative effect was the influence of American propaganda. During the war, attacks against German Americans escalated. Citizens of the United States stopped buying German products. One such product was alcohol: German breweries were attacked, and the brewing business almost collapsed. In other discriminatory crimes, German Americans were assaulted and even killed. German names became objects of hatred, and some towns even changed their names if they sounded German.
During World War I, Congress gave broad economic and industrial power to President Woodrow Wilson. Under his orders, the government took control of small businesses, which were converted to building weapons for the war effort. With the increase in sales, the average wage rose by 20%, and national income in the United States grew rapidly nearly every year that Europe was at war.