social studies

How did immigration change America?

  • social studies -

    Before immigrants came, the Native Americans had the country to themselves. Immigrants took their land, spread diseases, and declared war on the Native Americans.
