Charles De Wolf at Commonweal:
Still, national images are always subject to fluctuation, and at least in the West, historical shifts in the perception of Japan have been particularly dramatic. A much darker view of the country—as a land of ferocious militarists caught up in a death cult—was already fading when I was a boy in the early postwar years. U.S. soldiers returning from Japan showed color slides of Kyoto temples and stately young women in kimonos. Soon Japan was being described as the proverbial phoenix rising from the ashes, now firmly committed to democracy and staunchly allied with its former enemy, the United States. Growing interest in Japan and Japanese culture, particularly in the late 1970s and early 1980s, led to an exaggerated picture of the nation’s strength. Admiration was mixed with misplaced envy, as “Japan Incorporated” came to be perceived as a new kind of Imperial Japan, black-suited businessmen replacing sword-wielding warriors. Teaching Japanese and linguistics at a liberal-arts college in upstate New York for two years in the late 1980s, I met students eager to live in Japan long enough to master the language, obtain MBAs from prestigious American institutions, and thereby make their fortunes. Then in the early 1990s, the economic bubble burst. The rising superpower was suddenly being described with another cliché—the land of the setting sun.