When did postwar politics take place?

Postwar politics generally took place in the years following the end of World War II in 1945. This period, often called the post-World War II era, saw significant political developments, large-scale reconstruction efforts, and shifts in global power dynamics as countries grappled with the aftermath of the war and sought to establish a new world order.