Information Technology, Language, and the Nation State
February 24, 1999 / 6.30 P.M.
Jonathan Lewis, Institute of Social Science, University of Tokyo
Computerization has many aspects, but fundamentally it entails a massive increase in the amount of information being exchanged and stored by human society. The previous jump in the quantity of information, brought about by the invention of the printing press, had far-reaching and well-documented effects on both languages and politics. Benedict Anderson, for example, identifies it as an important source of the nation-state, due to its ruthless selection of a limited number of “print languages” in which information could be economically distributed across regions. The question of how the current jump is affecting, and being affected by, natural languages is an intriguing one, not only for socio-linguists but also for sociologists of technology and students of political economy.
The question is being posed around the world, but with particular urgency in Japan and other advanced East Asian economies, where competitiveness in manufacturing has not been matched by the software, finance, and other information-based sectors. Competitiveness dictates not only greater English proficiency but also the adoption of international standards for the display of Chinese and other characters, in order to facilitate information exchange within East Asia. Yet measures to regionalize or even globalize language policies present many problems for East Asian states, where “national languages”, hurriedly standardized in the 19th century, have been pillars of state legitimacy to an extent matched in few other countries. Furthermore, regional conflicts have made even slight shifts in language policy politically sensitive: witness, for example, the controversy over the Korean Ministry of Culture and Tourism’s recent announcement that it would start to use some Chinese characters in government documents and street signs.
The world’s computer companies have jumped into this difficult policy area. Many software developers are making their products compatible with the Unicode standard, essentially a large matrix on which tens of thousands of characters from the world’s major languages have been plotted. Unicode allows Microsoft, Apple and other companies to produce localized (Japanese, German, etc.) versions of their software more quickly and cheaply. The first version of Unicode was released without extensive consultation with East Asian governments and companies, and was roundly criticised for omitting characters. The second version was released after much more far-reaching consultation and regional efforts to identify overlapping kanji, and as a result MITI is backing the Unicode standard. Nevertheless, Ministry of Education-backed researchers, along with some Japanese computer companies and conservative groups of historians and writers, are holding out for independent Japanese encoding methods. In Korea the controversy has focused on Unicode’s effective limit on the number of possible combinations of elements in hangul characters, which were unlimited under the domestic company Hancom’s system.
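To make the matrix metaphor concrete, the minimal sketch below (in Python, using present-day Unicode ranges that postdate the versions under debate here) illustrates two of the design decisions at issue: a single code point serving Japanese, Chinese and Korean text once overlapping characters have been identified, and the fixed arithmetic that bounds the repertoire of precomposed hangul syllables at 19 × 21 × 28 = 11,172 combinations.

```python
# A minimal sketch using present-day Unicode ranges, which postdate the
# versions under debate in 1999.

# Overlapping characters are unified: one code point serves Japanese,
# Chinese and Korean text alike.
kan = "\u6F22"        # 漢 (the "kan" of kanji, the "han" of hanzi/hanja)
print(hex(ord(kan)))  # 0x6f22: a single cell in the Unicode "matrix"

# Precomposed hangul syllables occupy a fixed block whose size is the
# product of the three syllable parts: 19 leads x 21 vowels x 28 tails.
LEADS, VOWELS, TAILS = 19, 21, 28
S_BASE = 0xAC00       # start of the Hangul Syllables block

def hangul_syllable(lead: int, vowel: int, tail: int = 0) -> str:
    """Compose one syllable from part indices via the standard's formula."""
    return chr(S_BASE + (lead * VOWELS + vowel) * TAILS + tail)

print(hangul_syllable(18, 0, 4))  # 한: lead ㅎ(18), vowel ㅏ(0), tail ㄴ(4)
print(LEADS * VOWELS * TAILS)     # 11172: the bound on combinations
```

The point of the sketch is only that the bound is built into the encoding’s arithmetic, which is precisely the constraint that systems allowing free combination of hangul elements, such as Hancom’s, did not impose.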
The ongoing debate about character encoding standards is the area of language policy in which we can see perhaps most clearly an awkwardly emerging accommodation between nation-states and global or regional economic interests. Just as the contents of the Unicode matrix are decided by negotiations between economic and political interests, so the direction taken by the so-called information revolution is not dictated by technologies but framed by them. This alone should make us sceptical of claims that some technological fix, such as machine translation, will solve the problems of international communication.
Jonathan LEWIS is Associate Professor at the Institute of Social Science, University of Tokyo.