I don't know a whole lot about this issue or the full story, but I did a quick Google search, and from what I gathered, I think both sides are wrong in some ways.
Yes, women have been put down and discriminated against for years in American culture, and that needs to change, but we can't ignore male toxicity against other males. Both sides have to stop working against each other and start working together to change the cultural norm.
That way we can live in a world where no one can get away with rape or molesting children, or get a better job, based on their gender.
Feminism now seems like a couple of predictable, tropish plots from children's shows: boys being forced to let girls into their clubs, while girls are allowed to have their own clubs with no boys allowed.
No, I'm talking about the "YOU'RE R*PING ME" shouted whenever they feel uncomfortable. Basically, whenever there is a public meetup for men's issues, they're going to be there screaming things like "men are pigs" or "teach men not to r*pe."
But the thing about the phrase "teach men not to rape" is that it's like telling a robber that stealing is against the law; what the phrase implies is that men don't know rape is wrong.