Hello Community,

Today I decided to raise a topic that is very relevant and widely discussed nowadays:
Equality Between Men and Women...

What do you think: is it really healthy and positive in every way?

For a long time, women fought for equality with men, and in most cases they achieved it, especially in Europe and the USA. On the one hand, it is great: equal career opportunities, equal rights, no limitations, etc.

But on the other hand, many women have started to lose their femininity: wearing trousers more and more often, competing for better and more demanding jobs, even refusing to have children because of their careers or other factors (you can observe this especially in Germany)!

So, is this phenomenon always healthy?

Or... would it perhaps be better to live in a "man-made world"?!


As for me, I believe a woman should learn to enjoy being a woman, and sometimes yield in personal and business relationships. We have somewhat different roles in life - that is nature, and you cannot fight it.

And what do you think?

All the best to you,
Lera.