Can anyone explain how expatriate Americans and native Europeans viewed America after WWI?

Expatriate Americans viewed the U.S. as disparaging the talents and accomplishments of women and Black Americans.