Recently The Atlantic published a controversial article, “The Confidence Gap,” by Katty Kay and Claire Shipman. The article suggests that women’s lack of confidence holds them back in the workplace, while essentially dismissing the fundamental lack of equality for women in society and policy. The undoubtedly well-meaning piece stirred up frustration among both women and men nationwide. After reading too many feminist theories that seem to demean women instead of supporting them, I was among the frustrated.
As a young woman, I cannot decide whether I feel empowered or like a victim. I recently graduated from a women’s college, follow feminism in the media, have a vagina, and of course read “Lean In”; you could say I am an expert on the modern-day female. I know so many smart and successful women. Nevertheless, after surrounding myself with things that should make me feel inspired, I somehow feel defeated. Where are these elusive female travesties I so often hear about in the media? The “confidence gap” is nothing more than a reflection of a media-driven culture that gives women like me no reason to feel self-assured.
Perhaps a woman’s tendency to be more risk-averse in her decision making is mistaken for a lack of confidence? But if I have learned anything from dating, having male friends, and generally existing in society, it’s that confidence is not a female issue. It is a human issue. Everyone suffers from a lack of confidence from time to time. I certainly cannot deny that a lack of confidence in certain areas has held me back in the past; however, I never considered that it was because I was doomed by my X chromosomes.
For the years leading up to this pivotal postgrad point in my life, the same message has been beaten into me like a dead horse: that as a woman in the workplace I will be less respected than men, that I will get paid less, that if I am assertive I will be perceived as a ‘bitch,’ and now that I am less confident than my male counterparts. I feel I have not even been given a fighting chance. Every time I receive criticism in the workplace, will I think to myself, “It’s because I am a woman”? I don’t want to be that girl. I don’t want to be a victim.
I’m tired of older, superior, and supposedly wiser women telling me who I am because I was born without a dick. Feminism seems to be on a constant campaign of obsessive gender reflection. Do women have any problems that aren’t caused by men? In our pursuit of equality, we push further from what we are trying to achieve. We compare and consequently separate ourselves from men. We create clubs like “Women in Tech” and “Women in Media” and hammer figurative signs to the door: “No Boys Allowed.” These women-only societies are arguably beneficial, but I wonder what we could do if we decided to work together.

Women still have a long road ahead in terms of public policy. Within the past year, the Iowa Supreme Court ruled that a woman can be fired if her boss finds her attractive, a New York court ruled that unpaid interns cannot sue for sexual harassment, and the Paycheck Fairness Act was defeated by Republicans who claim women actually prefer lower-paying jobs. Even so, I’m still not convinced that we’re leveling the playing field by creating “women only” societies, constantly comparing ourselves to the opposite gender, and blaming a lack of confidence for shortcomings in the workplace. We need to ask ourselves whether we even want to be playing on the same field.