Who dominated the South's politics?

Do you mean the Southern U.S.?

When?

Yes, the Southern U.S. in 1860.

The Southern Democrats dominated that section's politics.

http://www.tulane.edu/~latner/Background/BackgroundElection.html

To understand who dominated the South's politics, it helps to start with the historical context of the American South. During the 19th century, and in 1860 in particular, the region's politics were dominated by the Democratic Party.

The Democratic Party held considerable influence over the South's political landscape for several reasons. First and foremost, the South was heavily agrarian, depending on agricultural production of cash crops such as cotton. As a result, the region had a large class of wealthy plantation owners who sought to protect their economic interests, including the institution of slavery.

The Democratic Party, especially before the American Civil War, gained considerable support in the South by aligning itself with the interests of these wealthy planters. The party defended the institution of slavery, which was deeply entrenched in the Southern economy and society. This support for slavery and the preservation of Southern traditions solidified the party's influence in the region.

Furthermore, the Democratic Party found support by appealing to white voters' fears and prejudices. They often used racialized rhetoric, promoting white supremacy and perpetuating notions of black inferiority. This strategy reinforced the party's political power base by exploiting racial divisions and preserving the social hierarchy.

It is important to note that the Democratic Party's dominance of Southern politics began to erode during the mid-20th century. As the Civil Rights Movement gained momentum and the Democratic Party gradually embraced civil rights reforms, many white Southern voters shifted their allegiance to the Republican Party, especially after the passage of the Civil Rights Act of 1964.

In summary, the Democratic Party historically dominated the South's politics because of its alignment with the interests of wealthy plantation owners, its support for the institution of slavery, and its racialized political strategies. That said, the political landscape has evolved over time, and the South's political dynamics have undergone significant changes in recent decades.