Who declared the culture war?

The term "culture war" was popularized in the United States in the 1990s, particularly by cultural conservatives who felt that traditional values were being challenged and undermined by liberal social and cultural changes. However, it is difficult to pinpoint one specific individual or group that declared the "culture war" as it is more of a metaphorical concept that emerged from societal shifts and clashes of values.