You don't even know what woke means.
You think America was better when women weren't allowed to work outside the home? When they weren't allowed to make their own choices, own property, or have their own credit cards? You think America was better when a man could rape his wife and it was completely legal? When Black people couldn't live in certain towns without being attacked or murdered?