During World War I, women took on jobs in fields such as nursing, teaching, and social work while men were fighting overseas. This work helped expand perceptions of women's capabilities and convinced many that women should have the right to vote.