Through most of American history until the late 1930s, foreign policy was largely tangential to American domestic politics and American culture. Separated from most of the world by wide oceans, Americans generally regarded world affairs as a remote concern, having little political or social impact. Since the outbreak of World War II in Europe in 1939, however, foreign policy and foreign affairs have often been at the top of the political agenda, and have had a deep impact not only on politics but on American culture.
Before World War II, the general thrust of American policy and thought was isolationist. The period of imperialism around the turn of the 20th century, and even the relatively brief American involvement in World War I, did little to change this. In 1939, however, a great-power war again broke out in Europe, and at once the possibility that the US would become involved became a major public concern. Conservatives tended to be isolationists, regarding the war as none of America's concern, while some accused British and Jewish interests of seeking to draw the US into it.
Pearl Harbor ended the debate, and the US totally mobilized for war, with profound effects on American culture. War production ended the lingering Depression, while movement of people as soldiers and war workers eroded local and regional social isolation. Large numbers of African-Americans, for example, left the rural South and were exposed to a life free of legal segregation. The war also left the US as a global superpower.
Any "return to normalcy" and isolationism was prevented by the swift emergence of the Cold War after World War II ended. American military forces remained stationed around the world, and the UN and NATO, along with other international organizations, were formed to advance American global interests. Conservative Republicans did not revert to isolationism, since they viewed Communism as a fundamental threat to their own values. F...