How did World War I impact women in the United States?
a. Women received equal pay for equal work.
b. Women were prohibited from working as Red Cross volunteers.
c. Women worked jobs that had been held almost exclusively by men.
d. Women no longer held traditional jobs such as nursing or teaching.

Answer:
The correct answer is C: Women worked jobs that had been held almost exclusively by men. This was probably the most significant change for women at the time.

Because a large portion of American men were fighting on the battlefields of Europe, women stepped into roles in factories, offices, and other key areas of the economy that were essential to the U.S. war effort.
