If I recall correctly, the first Europeans to set foot on the mainland of the Americas were the conquistadors, who had little regard for the Indians. Yes, after Europe gained a firm foothold in America, many Indians were brought to Christ, but as the American nation expanded further and further west, the Indians felt threatened by the removal and desecration of their land, and they retaliated.
I make no comment about whether or not America is a promised land.