
Many historians argue that the war in Vietnam was a profoundly disillusioning experience for Americans. Would you agree? Did the Vietnam experience signal a fundamental shift in American foreign policy? What have been the consequences for American foreign policy in the post-Vietnam era? Have they changed in the post-9/11 era?
