How I Feel When Men Tell Me to Smile

It may sound extreme, but it reminds me of a dog owner telling his pet to perform a trick.

"Smile -- it can't be that bad." "I think you dropped your smile somewhere back there!" "You're not smiling; how can I change that?"

Sound familiar? If you're a woman, you've probably heard men make comments like these. And if you're a man, you might have said them. I've personally heard all of the above. But the most frightening time a man ever commented on my smile was at a college party. He walked across the room and yelled in my face so aggressively for not smiling that I feared he'd hit me. He told me to leave if I didn't want to have fun.

A few months ago, another woman made the news for not smiling enough. A male reporter asked Serena Williams why she wasn't smiling, given that she'd won that day. Williams replied that she was tired and didn't find the press conference "super enjoyable." But even if she were enjoying herself, should she be expected to physically show it? She wasn't hosting an event. She was fulfilling a requirement, same as male tennis players like Roger Federer. It probably never crossed that reporter's mind to ask a male athlete why he wasn't smiling, because men are not expected to smile the way women are -- and that is the kind of thinking that needs to change.

Not all men tell women to smile or think it's appropriate. I have male friends who never engage in this behavior, because they know it's wrong to tell anybody how they should look. As my friend Justin put it, he's "never prompted a member of any sex to elevate their mood for [his] approval." But there have been enough men who've commented on my smile to make this worth addressing.

The most popular excuse I hear for why men tell women to smile is that they're flirting. This reasoning makes a disturbing kind of sense to me. When men come of age in a society that says women are prettier when they smile, some men will expect to see those smiles, or feel entitled to summon them when they're not readily available. This explains the behavior, but it doesn't excuse it, which is why I want men to understand how I feel as a woman when they tell me to smile.

When a man tells me to smile, I feel like I've been contracted to provide aesthetic entertainment for him -- a contract I never signed. I've become more like an object in his eyes, which makes it easier for him to mold me into what he wants to see. It may sound extreme, but it reminds me of a dog owner telling his pet to perform a trick. Basically, I feel objectified. After a man tells me to smile, I always tell my girlfriends about it, and we always shake our heads over the fact that it happened yet again. And when I hear a man using the smile gimmick to flirt, a definite miscommunication results. While he hears himself expressing romantic interest, I hear him trying to exert control over me and my actions. The man who tells a woman to smile should understand that he has no more control over her smile than he does over any other part of her body.

I hear the argument, "If you smiled more, men wouldn't feel the need to tell you to." But why is there a "need" at all? Is the image of women not smiling so intolerable that some men feel the need to coax it out of us? And that argument places blame on women, which sounds like telling a woman that if she'd worn the right clothes she wouldn't have been assaulted. Another argument is that smiling can improve people's moods. So maybe men just want to cheer me up and make me stop thinking about myself so much. But why is it only men saying it to me? Women never tell me to smile. And nobody tells men to smile -- especially not other men.

I'm not saying smiles are off-limits when it comes to flirting. Telling me I have a nice smile is different from saying I'd look prettier if I smiled, or that I should smile. The former is a compliment, but the latter two imply smiling is the norm for women and that I'm doing something wrong if I don't.

There are times when smiling is important. Job interviews tend to go better if one looks enthusiastic about the position. If you like the person you're on a date with, a smile helps convey your interest. But some people are extremely introverted or anxious; even when they're having a great time, it's difficult for them to show it in this particular way.

In one of my favorite John Candy films, "Only the Lonely," Candy's protagonist is on a first date with an extremely shy Ally Sheedy. He tries to engage her in conversation all night, but he never comments on her missing smile. He just tries to get to know her, and it's for that reason she says she had a wonderful time and would like to go out again. It was major social anxiety and shyness that kept me from smiling at that college party, which that angry guy might have learned if he'd treated me like a person and asked how I was.

There are plenty of things I need to learn to lighten up about, but I don't think the problem of men telling women to smile is one of them. I'm calling on men who tell women to smile to ask themselves why they're doing it, and to not think they're entitled to do it, same as they're not entitled to catcall or force themselves on women. Telling or even asking women to smile is not sexy, cute, witty, charming or any of the things some men think. It's sexist, it's harassment, and it doesn't lead to success with women. Because women don't like being told what to do any more than men do.
