Men: Why You Have To Stop Telling Women To “Smile More”
We don’t exist as something for you to look at
Most men who tell women to smile seem to believe that women are objects put on this earth for men's visual enjoyment. And since we're more visually pleasing when we smile, these men think we aren't doing our jobs when we don't. Well, guess what: we aren't here for you, dudes.