recently I remembered that in elementary school I was taught that the US made Hawaiʻi part of the US to "protect it from being attacked by other countries," and knowing what I know now about the state's history and how that's just not true, I'm not sure how to feel about it lol