Lena Dunham, star and producer of the HBO show Girls, has consistently been outspoken about her views, speaking a great deal about body image as well as feminism. One of my favorite quotes of hers concerns people who reject the word 'feminism.' She said, "I think women who reject the term don't know what it means. It's not a concept you reject. If you're a feminist, you believe in equal opportunities."
Before we can get men on board, we need all women to be on the same page. How do you think we can rid the "f-word" of its stigma?