
Is God a Feminist?

Feminism is “the theory of the political, economic, and social equality of the sexes.”[1] Whatever your precise definition, I think we can all agree that the bottom line of feminism in our culture is equality for women. This isn’t left-wing liberal propaganda (or right-wing propaganda, for that matter), but a look at…