November 24, 2020

Why feminism has to stop being such a dirty word

Many times I’ve been called a feminist for speaking out about women’s issues such as abuse, equal pay and reproductive rights—yet rarely is that meant as a compliment. By definition, feminism is rooted in the belief that women should have the same rights and access to opportunities as men. Within that lies a range of beliefs and efforts regarding issues hindering gender equality—particularly among women of color—but at its core, feminism is grounded in advocating for a woman’s right to choose her path and have equal opportunities to thrive personally and professionally.

Feminism is often stereotyped as a movement based on taking opportunities away from men—socially, economically and politically—a stereotype that casts feminists as anti-men rather than pro-women. So many times I’ve heard women say, “I believe in women’s rights, but I’m not a feminist.” What they mean is, “I believe in women’s rights, but I don’t want to be associated with people who call themselves feminists because then I will be judged unfairly.”

If you had asked me about my views on feminism when I was a teenager, I probably would have said women are already treated equally. This is largely because a) I was naive and b) I was raised in a feminist household and never realized it until I grew up. For most of my childhood, my parents ran a company and a family, all the while raising my sister and me to believe our opportunities were limitless, no matter what we chose to do (key word: chose). At no point did they call themselves feminists or indicate their attitude toward a woman’s right to choose the life she wants while being on equal footing as men was in any way unique. It just seemed, well, fair.

It wasn’t until I was out of college and in the workplace that I became aware of gender inequality, because just like any type of discrimination, it’s easy to think it doesn’t exist or isn’t a significant problem if it’s never happened to you. More than once, I’ve watched men who lacked experience, work ethic and measurable results get promoted over more competent female colleagues and steadily earn raises along the way, while women in the same positions (often with more experience and stronger performance) watched their salaries remain stagnant. More than once, I’ve watched women change careers or leave their fields entirely after having children because they felt like they’d never catch up to their peers without being able to work nonstop on nights and weekends just to stay competitive.

When it comes to the gender pay gap—found to exist in most industries, regardless of education level—the statistics are even bleaker. We regularly hear that women earn 80 percent of what men make, which is true, but when broken down by race, women of color make significantly less than their white counterparts, with black women earning 63 percent of what men make and Hispanic and Latina women earning 54 percent. There is no single cause for this, which is why there is no simple answer. Education, occupation, motherhood, time spent away from the workforce and other measurable factors all play a role. But it’s the unmeasurable factors—gender bias and discrimination among them—that come into play when evaluating gaps in the data that can’t be explained statistically.

Women who pursue leadership roles are often hindered by gender bias, whether realized or not. A 2015 study found that when women are judged as being “forceful” or “assertive” in the workplace, their perceived competency drops 35 percent and their perceived worth decreases by about $15,000 (comparatively, men were found to experience a 22 percent reduction in perceived skills and a $6,500 drop in worth). Some studies have shown employers aren’t as likely to hire mothers (even if they never left the workforce after having children) as they are women without children, and when they do, mothers often earn less than their child-free counterparts. It doesn’t help that Mississippi and Alabama are the only states in the country with no regulations to combat sex-based discrimination at work.

Women’s rights extend far beyond the workplace, particularly concerning social and political equality. Issues of harassment, abuse and stalking are more likely to affect women, who statistically are more likely than men to be physically assaulted, stalked or raped in their lifetime. Women hold about 20 percent of seats in U.S. Congress (women of color hold 7.1 percent) and 24 percent of statewide elective offices across the country. When it comes to state legislatures, Mississippi ranks near the bottom of the list with women making up only 13.8 percent of leadership.

There isn’t enough space in this entire newspaper, much less this column, to dive into every issue American women face on a daily basis. But as long as feminism is burdened by the stereotype of women wanting to surpass or otherwise weaken men, solutions to those issues will be that much more complicated.

If you believe women and men should have equal rights and opportunities, you are a feminist. If you believe women deserve the right to choose between work and family or a balance of both, you are a feminist.

It’s time to stop treating advocating for women’s rights like a radical movement reduced to a Facebook meme and start treating it like what it is.

Long overdue.

(Originally published by The Oxford Eagle)