World War II changed women's roles in society and gained them new rights. It became a major turning point in history, transforming women's lives and lifestyles all over America. When men were sent off to fight in the war, women were called on to fill their jobs. World War II not only had a significant impact on women but also showed that women were capable of doing the work men had done and were their equals. Women played an important part in the U.S. success in World War II.