Thinking about it, it seems that men have always had rights. Might makes right, right? Ages ago some men took power and, having power, gave themselves rights. From the beginning of recorded history, rights were bestowed by powerful men upon other men, usually those who held some power themselves. With the rise of governments, men without obvious power were granted more and more rights. When the United States was established in the 18th century, it was declared (if not believed) that all men were created equal and had "inalienable" rights. What of the rights of women? They were not recognized in Western society until the early twentieth century. In the United States, women did not even obtain the right to vote until 1920.