Have any of you seen "Rise of the Planet of the Apes"? I was wondering,
how well would a mainstream movie do that depicts a not-too-distant
future in which women, fed up with patriarchy, male domination, and
oppression, become determined to take control of society, and decide
to do so by any means available, including force?
Force would not be my preferred method, though, as that sounds too much like what we have now with men, and that disgusts me.
(12/30/12)
The way of women is through nurturing and enabling people to accomplish what they are suited for. Women have a unique ability to lead and, through intuition, to know what is right. I mean, don't you agree? Didn't everyone come from a woman, so what better authority figure is there than a woman? I think it would be natural for everyone to work for women, be nurtured by women, and find the world a much more agreeable place.