Last month on a flight to the West Coast I sat next to a young woman who’d just gotten her Ph.D. in sociology. She was also a Christian, and like me, she’d spent serious time overseas. But unlike me, she was appalled by how American women are “marginalized” and “oppressed” when they should be treated like men virtually across the board. She criticized traditional marriage roles, and it puzzled her how, in a progressive, industrialized country like ours, women could still be expected to shoulder most housework and childcare duties even when they work outside the home. In her thinking, the best approach was to make the genders more equal and more alike in their characteristics.
I listened. I nodded in agreement occasionally because she had some good points. But when she finished, I said, “You seem really passionate. But what exactly are you fighting against? God made men and women equal but different, and that’s His perfect will. Why would we want to cancel that with an androgynous culture?”
For a second, she looked like I’d slapped her. Luckily the plane was landing, so our conversation lasted only 30 more seconds before seatbelts flew off. And she never really answered my questions.
Don’t get me wrong. In our broken world, there are gender-related injustices. And I’m no shrinking violet, having been in the workforce all of my adult life, often in patriarchal foreign countries, and I’ve enjoyed being an adventurous teacher, writer and world traveler. But if given the chance and a family to come home to, I’d actually love to do their laundry and make dinner for them. And I have no problem with women happily fulfilling these roles full-time and looking at it as a ministry.
The fact that this sociologist so adamantly fought such roles makes me wonder what’s prompting the indignation.
Maybe television is part of the mix. According to a study on the status of women by Maria Shriver and the Center for American Progress, “Women’s professional success and financial status are significantly overrepresented in the mainstream media, suggesting that women indeed ‘have it all.’”
It’s true. According to The Nielsen Company, the top five jobs women hold on TV are: surgeon, lawyer, police lieutenant, district attorney and cable news pundit. In reality, the Department of Labor’s 2008 statistics show the most common jobs for women are (in order): secretaries and administrative assistants, registered nurses, elementary and middle school teachers, cashiers and retail salespersons.
I am not saying women shouldn’t be cashiers or strive to be doctors (so please don’t send me hate mail). My cardiac physician friend is pleased as punch with her decision to go through med school, and she’s helping patients left and right. But what I do wonder is whether television subtly pressures women to be “more” than a mom. Do “more.” Achieve “more” outside the home, because doctors’ and attorneys’ lives seem to be what’s best and most exciting — on television, at least.
So what do you think? Do shows such as Grey’s Anatomy, Private Practice, House, and CSI subtly suggest it’s not enough for a woman to love her husband, raise quality human beings full-time and make killer Italian dinners when her family comes home at night? Do they inspire women to, in fact, do more and try to “have it all”?