alright.
so people know that feminists are wrong, right?
eh, they're not completely wrong. I just finished a book by Jules Verne, and the men in it treated women very nicely and with grace. Feminists want to bring that back. I mean, they are wrong about some things, like taking everything as offensive and thinking they can do anything. But generations change: back then, treating women like that was normal, and now no one (or at least most people) doesn't want that world back. People even say it's hilarious and funny to treat women like that. Some men treat women worse than before, and feminists want the old respect back. So they aren't completely wrong.
also, this is just my opinion.