Where did postwar politics take place?

Postwar politics unfolded in countries around the world, most notably in Europe, Asia, and the United States. Key events and developments included the establishment of the United Nations, the Cold War, decolonization movements across Africa and Asia, the creation of new political systems in former Axis powers such as Germany and Japan, and the civil rights movement in the United States.