The dictionary definition of feminist is "wanting social, political, legal, and economic rights for women equal to those of men." If someone just wants women and men to be equal, they're technically a feminist. Some feminists are toxic and think women are better than men. They're not real feminists, since they go against the definition, but that's what a lot of people assume all feminists are.