So I guess it's only right for indigenous people to take land from other indigenous people; it's only bad when white and black people do it. And why single out the United States? From Argentina to Canada, white settlers took land from indigenous peoples, but somehow it was only bad when the United States did it?