Do women naturally want to be pretty?
Is this true, or just what we are told to believe?
What is beauty? The media and the people around us put constant pressure on women to be beautiful: magazines teach girls what looks nice and which make-up to wear, while men's magazines teach boys to objectify women. Some people take this ideology to extremes, refusing to leave the house without make-up just to pick up milk from the shop, or staying married for over a decade without the husband ever seeing the wife without make-up, or turning to plastic surgery. All for what?
It's about the sense of 'fitting in', of being comfortable in your own skin. But with "slap" on, is it really your own skin? You could say you're enhancing your features, but others see it as a mask. Spots go away and scars fade; the other parts of your face are what make you who you are.
This debate could go on forever. I'm not against make-up and beauty, nor am I in favour of surgically 'correcting' yourself; this topic came up at uni and was interesting enough for me to share with you. There's a fine line in how much a person should do. To me, image is only skin deep; true beauty is within.
They say guys prefer natural faces.