History Questions

How did World War I change women’s roles in the United States? Did women receive greater educational opportunities, fight alongside men in the military, replace men in the workforce, or earn more money than men?

Answer

Of the options listed, the correct answer is that women replaced men in the workforce. With large numbers of American men serving in the military during World War I, women took on factory, office, and farm jobs that had previously been held by men.