Men: Why You Have To Stop Telling Women To “Smile More”
Cheer your own damn self up
Another reason men tell us to smile more is that they believe women exist to comfort, uplift, and nurture men. If we are in the presence of men, we should be doing things that might soothe their bruised egos and delicate sensibilities, like smiling. I mean, for all we know, the men we pass by may be having a hard day, so it's our job as women to smile and cheer them up, right? WRONG! Men: cheer your own damn selves up.