posted by Anonymous.
What were the myths of the Wild West?
Is this an adequate answer?
To Americans, the West had become a larger-than-life place where brave men and women tested themselves against hazards of all kinds and won.
Or should I include plays or works of fiction?
I'm not sure that counts as a myth. And of course, they didn't always win.
I'm sure you can find more myths in the Google sites above.
Did you read this site?