Hollywood's biggest flops of late have been reboots: unpleasant, uninspired, and ideologically driven narrative arcs about feminism and diversity, built on the typical tropes of patriarchy and racism. There is no originality, no likeable characters, no story, and no tickets being sold.
There is nothing remotely good emerging from Hollywood. When directors change the lead character to a woman or a person of color to pander to a specific audience, they are saying that their ideology matters more than making the good movie they were hired to make.
There have been plenty of female heroes, but certain people ignore them to push their dishonest narrative.
No one is looking for subversive movies. People want an entertaining movie, free from politics, agendas, and ideologies. They want a good story with likeable characters, portrayed by actors and actresses who keep their mouths shut and do the job they were hired to do.
Hollywood is not the only industry harboring predators who have gotten away with it for a long time, but it is the only one run by hypocritical elites who lecture everyone else on morality.
You're talking about Weinstein as if he were playing a villain in a role. He was a predator who used his position of power to prey on women, while Hollywood elites knew what was happening and stayed quiet to further their careers.