“It is great that women’s feet are now seen as something aesthetic, and that there is a new awareness that they need not serve merely as a sex symbol, but can instead be seen as part of a woman’s beauty — something that does not have to be hidden.”
I do not think I could say it better myself. But my question is whether women really have developed a new awareness of their feet. Since I do not have any female readers, I can only tell you about my personal observations...
When you look into women’s forums on the internet, you can still read sentences like “I do not like my feet”, “feet are nasty” or “I hate it when someone touches my feet”. But if you look a bit further, you can also find neutral and even positive statements like “well, my feet belong to my body” or “I like my feet and I like to show them”. And of course there is no women’s or lifestyle magazine or website without advice on beautiful feet in its beauty section.
Well, I do not think all of this is a big coincidence. Of course, no woman walking down the street in, say, flip-flops thinks: “There are so many men out there who love feet”. But I do think more and more women are becoming aware that feet, too, are beautiful body parts and can be extremely sexy. This is my personal impression.
Oh, and I almost forgot about all those celebrities who “spontaneously” began to show their feet on their Twitter pages, or the female singers who perform barefoot in their music videos.
(Twitter pics by: Kristen Bell, Demi Moore, Kylie Minogue, Katy Perry, Taylor Swift)
Overall, I would say there is a new feeling about feet — or at least that more and more women are developing a more positive attitude towards their own feet and towards feet in general. What do you think?
PS: Sorry for my English. I did not have much time to translate my German article properly, but I hope you understood what I was trying to explain.
PPS: All pictures are for illustration only and were not taken by me.
Have a great weekend, and do not forget to tell me your thoughts.