France has a stereotype of being a liberal country with liberal women: topless beaches, women who refuse to shave their armpits, and women with the freedom to use experimental birth control however they choose. What was the journey France took to gain this stereotype, and is it true?
World War I not only introduced women in America to different roles in the home and the workplace, but also introduced new roles to women in France. Looking at the French poster from WWI above, it's interesting to think about how women in France handled WWI and its aftermath differently than we did here in America. Women in both countries were worried about the same things: providing for their families and simply trying to survive the war and come out of it as normal as they were before it started. However, we all know that sense of normalcy was never regained by either group of women. How did the women feel before the war?
In 1848, the women's rights movement in France called for universal suffrage, education, and employment.
In the 1970s, the feminist movement gained steam in France over the issues of reproduction and abortion, much as it did in America.
Now, let me paint a picture of France today:
Although France is stereotyped as a free place for women, and a place where equality may have been reached, French women experience the same problems women around the world face regarding equality and the wage gap.
When I was researching this country, it was intriguing to find the passionate language French women and men used when writing about the women's rights movement. In researching America, I found almost the same facts laid out, but without the same level of passion as the French. I thought this was a very interesting find.
Have a country you'd like to know more about? Comment and let me know what you want to learn!