How did immigration change America?

Before large-scale immigration began, Native American peoples inhabited the land across the continent. European settlers and later waves of immigrants took Native lands through treaties, force, and displacement, introduced diseases that devastated Native populations, and waged wars against Native American nations.