Unfortunately nothing is going to change, because society doesn't actually care. The one I've seen a lot is depression. Men are told to "be a man, don't be a f**king pussy" or "man up" when they actually need help, but when a woman so much as says she thinks she has depression, society rushes to help her. Seriously, wtf.