Gender Roles In American Society

Until the past few decades, women and men were viewed very differently throughout the course of American history. For most of our past, women were seen as housewives. They were the ones expected to keep up and maintain the home, which included cooking, cleaning, taking care of the children, and various other tasks. Men had very different roles altogether. They were expected to provide for the family; they held the jobs and handled the financial leadership of their homes. In truth, this “defined” standard lasted in America until 1941, the year the Japanese attacked Pearl Harbor, a US naval base on the coast of Hawaii. The attack forced the United States into the war in both the European and Pacific theaters. The men went off to serve in the armed forces, leaving millions of women at home. What exactly were they to do? They adapted. As the men fought abroad, women flocked to factories and occupations everywhere, producing the goods that fueled the American war machine. This effectively brought women into the workforce, changing American society forever.
From this point on, things were different. Once the war was over, the men returned to their former roles, but the change had already taken hold.
Many of the stigmas and prejudices held against women, such as “You belong in the kitchen!” or “Shut up, you can’t talk!”, have been abolished. Now women have the opportunity to become whatever they want to be. As a young woman, this is something I am extremely proud of; however, it is not perfect. As I have mentioned before, there is much work to be done to ensure the future of gender equality, but I know that proud and strong women all around this nation will stand and lead the charge into a better future, as we have time and time again. Thank you.