Feminism is pretty simple - a demand that women have the same rights and treatment as men. Where it becomes problematic is when men act like jerks and feminists use that to justify acting the same way instead of confronting the jerk behavior, which makes the world more insufferable. There's nothing wrong with the idea of "feminism" per se (other than some of its founders hating religion), but the way it can be interpreted (even by the people who founded it) can be extremely unhelpful to society. Still, the goal of equal rights for both genders is not a bad idea.
2020s feminism involves all this transgender stuff. This seems really cheap nowadays because we live in a victimhood-first culture, where victims are prioritized, so everyone wants to be a victim. This is why we have trans-autistic people and many people claiming to have disorders like DID. Women are higher on the victim pole than men, who are at the bottom - especially white men - but men can hack the victimhood system simply by identifying as women, and then they're prioritized over other women. And because men are much stronger than women, they take over women's sports, taking away women's ability to compete the way men do. The point of women's sports was to give women opportunities like men have, and the trans agenda is enabling selfish men to destroy that. And males who transition to female should NOT be in female bathrooms or dressing rooms - especially if they're known rapists (and this kind of thing happens). Once upon a time, feminists would have protested loudly against things like this, but nowadays they're too afraid to say anything about it, for fear of being Twittered by the woke mob. Somehow we went from oppressing women - to caring for women - to oppressing them again - only now they can't say anything about it or they're labeled bigots.
And what's left of actual feminism is pure nastiness. This is why Hollywood's strong female characters are all just jerks now.