I could go on for ages about this topic and never get a majority of people to see eye to eye with me. Especially men.
I don't agree with anyone being objectified, but why does it happen to women so much?
Playboy mags, bikini baristas, strip clubs, and breasts in damn near every movie we see.
I want to blame it on men being pigs, but it's society as a whole.
Not only is it disrespectful, but what kind of example are we setting? Women nowadays think we need to have giant breasts, be a size 0, and look basically flawless to be desirable. It's pitiful.
I'm not a feminist by any means, I'd just like to see women get an ounce of respect and stop being objectified so much. We shouldn't feel so pressured to look a certain way.
And why doesn't it go both ways? I'm not saying I'd agree with it regardless of who's being objectified, but does this garbage happen to men? Are we seeing dicks all over our TV programs and ripped, shirtless men serving our lattes? No.
I want to know other people's take on this. I know it can be a sensitive subject, but I want to know what you've all got to say about it.