Japan


During World War I, Japan, allied with the United Kingdom through the Anglo-Japanese Alliance, seized the opportunity to expand its sphere of influence in East Asia. It declared war on Germany in 1914 and captured German-held territories in China and the Pacific. Japan emerged from the war more assertive and confident, its territorial gains and enhanced international standing confirmed at the Treaty of Versailles, though its wider ambitions were checked by the Western powers.

In World War II, Japan's militaristic expansionism reached its zenith. After years of escalating aggression in Asia, Japan attacked Pearl Harbor in 1941, drawing the United States into the war and opening a series of sweeping military campaigns across Southeast Asia and the Pacific. Japan's early successes gave way to devastating losses as the war progressed, culminating in the atomic bombings of Hiroshima and Nagasaki in 1945. The surrender that followed brought a profound transformation in Japan's national identity and its place in the world order.