Re: [ga] IDNs & the GNSO New TLD PDP
Danny, the situation is more complex than that. I will take some time to review it for you, Vint and the others, because it is part of the whole issue. You cannot understand ICANN, ALAC, the GAC, etc. and the real needs of the constituencies if you do not understand this. Nor can you understand why the world's civil society is disappointed with the US user community, ALAC, etc., which has not understood, has not been interested in, and has not supported its action. Look at what you discuss and consider what is at stake. We are not interested in what interests you - and we do not know how to call on you, nor whether it is important to get you involved now...

The IETF - under the impulsion of Harald Alvestrand - has adopted a network-centric (read: US-centric) architecture named "globalization" (RFC 3066 and RFC 3935). One can dispute whether it is the best choice now for US interests/security, but it fits well with the other US-centric propositions (ICANN, IETF, IANA, ISOC). The target is to "influence" the way others "design, use and manage" the Internet (RFC 3935), in defence of core values which do not include multiculturalisation. They also use a new definition of the Internet which is very close to the USC definition (a control on people's machines, rather than users' adherence to protocols).

Globalization is simple to understand. Mark Davis (the President of Unicode, currently the main cartel of US stakeholders) defined it very simply: the internationalization of the ASCII Internet plus the localization of the users' computers. Internationalization extends the network's support of ASCII towards UTF-8 (standardised by ISO together with Unicode, Unicode extending ISO with proprietary features); localization is the local tuning of computers with local parameters found in "locale" files. Unicode intends to be the de facto exclusive publisher of locale files, so as to unilaterally extend their content to support proprietary services: Windows taking over Linux's language-oriented core. This is an American approach.
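To make the "localization" half concrete, here is a minimal sketch (mine, not from the mail) of how a POSIX-style locale identifier such as "en_US.UTF-8" decomposes into the "local parameters" a locale file keys on: language, territory, and character encoding. The helper name and the dictionary keys are illustrative only.

```python
# Illustrative sketch: split a POSIX-style locale identifier into the
# "local parameters" the mail describes (language, territory, encoding).
# Naming is mine; this is not any particular system's implementation.

def parse_locale(identifier):
    """Decompose e.g. 'en_US.UTF-8' into language/territory/encoding."""
    encoding = None
    if "." in identifier:
        identifier, encoding = identifier.split(".", 1)
    language, _, territory = identifier.partition("_")
    return {"language": language,
            "territory": territory or None,
            "encoding": encoding}

print(parse_locale("en_US.UTF-8"))
# {'language': 'en', 'territory': 'US', 'encoding': 'UTF-8'}
print(parse_locale("fr"))
# {'language': 'fr', 'territory': None, 'encoding': None}
```

The point of the sketch is that whoever publishes the files these identifiers resolve to controls what "local tuning" means, which is the mail's concern about Unicode.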
It is acceptable to support a world where the default is English, the technology is discussed in American (RFC 3935), e-commerce is US-led, the search engines work in American, the suppliers are American, etc. The strategy is "shaping the world" with English as the default language and the other languages as local versions of the English core. This seems realistic: the USA is not technologically better, but it has a better capacity to defend its interests - or at least an often short-term vision of them. The USG has made clear that US security and the American way of life are not negotiable (true for the Internet, ecology, power, water, etc.). However, this cannot be architecturally workable in a global distributed system when other countries (the EU, China, etc., with better technological capacity) identify their own interests as more important than US compatibility. I say "with better technological capacity": this has been identified by the IAB in RFC 3869. Internet R&D is currently mainly supported by commercial interests, with short-term biased goals. The public/popular funding is in the EU, in China and in the Open Source community: the IETF/IAB call for it. If funding is to come from the USG, it will probably be channeled through the NSF and DARPA, not to the IETF/IAB. So, today, balkanization is US-originated, as a way to soften the immediate impact of phasing out the "legacy Internet".

The way globalization can be enforced and controlled is through the control of the IANA language registry. This new registry is the most important one: there is the root for the DNS, there are the IP blocks for addressing, and there is the Langtags Registry for globalization. The langtags are as simple as the DNS root or the IP blocks. The role of the langtags is to build globalization by consistently identifying the languages on the network (internationalization) and the locale files (localization).
They are made of three subtags: the ISO two- or three-character language ID, the ISO four-character script ID, and the country code (as in the ccTLDs, but carefully separated from them so as not to give countries authority over them), plus a few UN regional codes. American is "en-latn-us", meaning English, written in Latin characters, as used in the USA; "en_latn_us" is the corresponding locale file. The control of the "langroot" and the limitation of the number of langtags are important to control the stability of the whole system, hence the stability of the globalized/internationalized/localized Internet.

No need to say that this proposition is Unicode-oriented. The IANA registry is managed by the ietf-languages@xxxxxxxxxxxxx list (Harald Alvestrand is a member of the Unicode BoD). RFC 3066 bis, which finalizes the project, is proposed/supported by a strong Unicode affinity group. For information: 80% of the Unicode officers are from IBM, M$, Apple, Cisco. Unicode is to the "langroot" what Verisign is to the "name root". It is interesting to see that, aside from these solutions providers, there is an increasing participation of the services providers (Yahoo!, Google, etc., with Verisign, who recently joined). Once they get a grip on the IANA, the fight to control the Legacy will be important.

This "globalization" proposition is consistent, well established, well documented, and based on long thinking and effort. The natural result should be that the IETF enters into an MoU with Unicode over language-related issues, as it did with ICANN over Names and Numbers. This is even a necessity, because through RFC 3066 bis the IETF claims a technical leadership, and the resulting responsibilities, in the area of language identification and support which it obviously does not have. Unicode can claim this expertise and involvement, in a way comparable to ICANN with ALAC/GNSO for the namespace (list unicode@xxxxxxxxxxx), or ISOC with the individual users, etc.
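The three-subtag structure described above can be sketched in a few lines. This is an illustration of the shape the mail describes (language, script, region, and the langtag-to-locale-file correspondence "en-latn-us" / "en_latn_us"), not a full implementation of the registry's grammar; the function names are mine.

```python
# Sketch of the three-subtag langtag shape described in the mail:
# ISO language code, four-letter script code, two-letter country code,
# e.g. "en-Latn-US". Helper names are illustrative.

def parse_langtag(tag):
    """Split a langtag into language / script / region subtags."""
    subtags = tag.split("-")
    result = {"language": subtags[0].lower(), "script": None, "region": None}
    for sub in subtags[1:]:
        if len(sub) == 4 and sub.isalpha():
            result["script"] = sub.title()    # four letters -> script, e.g. "Latn"
        elif len(sub) == 2 and sub.isalpha():
            result["region"] = sub.upper()    # two letters -> country, e.g. "US"
    return result

def langtag_to_locale(tag):
    """Mirror the mail's 'en-latn-us' -> 'en_latn_us' correspondence."""
    return tag.replace("-", "_").lower()

print(parse_langtag("en-Latn-US"))
# {'language': 'en', 'script': 'Latn', 'region': 'US'}
print(langtag_to_locale("en-Latn-US"))
# en_latn_us
```

The simplicity is the point: a flat, centrally registered namespace, which is why the mail compares its control value to that of the DNS root.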
The problem is that this architecture cannot scale, which is the primary technical requirement for Internet architectural elements. "Globalization = internationalization + localization" is why IDNA did not work and IDNs cannot deploy. So the USA must either stay with it, pushing it to the maximum, or change it and impose the new one, if it wants to keep its leadership. Actually there is no problem in fixing the issue (cf. infra): but Unicode - hence the stakeholders - would lose their control.

The Tunis deal is about that. Rather than moving together, which is no longer necessary due to the emergence of new digital countries, the world gave the USA five (probably ten) years to get in phase with what the world wants to be by then. The negotiation was about the way we will live together and compete. - The USA keeps control of the legacy, continues to propose its digital umbrella to some countries (ccNSO), and tries to extend the options (not new features) of the old Internet (the ICANN effort in IDNs). - The IGF is free to develop and has to organise itself. The important thing is that they do not strangle innovation and dispute. The NGN can be a good base (in which the IETF seems interested to cooperate) that grassroots efforts can also rely upon. Anyway, most of the Internet deployment is ITU-based today.

The "US team" is now in competition with the IGF (ITU, users, countries, open source, other US projects): they should now bluff/disinform the competition so it does not develop too fast, giving the future NSF/DARPA projects a chance to transition. Some Americans think that the http://export.gov/advocacy/ people are not that good. We know by experience that this is not true :-)

I opposed RFC 3066 bis at the IETF (a similar atmosphere to the icannatlarge.org election). My position was to clean up the project to prevent... the commercial confusion creep, to make sure it would not be accepted before Tunis and that it would never fly worldwide. Very similar to the other ICANN/ISOC, etc. issues.
The details of the resulting saga (it delayed the project for one year but made it better) are of no interest. The result is more interesting: the IESG approved RFC 3066 bis at the _same_very_moment_ the USG approved the Tunis deal :-) on Nov. 14th. Now, if you want to understand about reality: http://www.theregister.co.uk/2005/12/05/minc_icann_letter/

The architectural solution is simple. Instead of considering unilateralism as the single layer, one has only to understand that every country, every corporation, every community and every user is unilateral, and that the layer above is their concerted multilateral intergovernance. Then it works, and it is much more innovation-prone; it is more secure and reliable (it is a real distributed network architecture). Instead of imposing a "globalization", you take care of "harmonisation". Instead of "internationalizing the American system", you universalise the common world digital ecosystem. Instead of localizing the American computers, you personalise their pervasive continuity (convergence). Instead of adding languages as options on top of the end-to-end architecture, you develop a multilingual brain-to-brain inter-application layer where languages are the upper-layer protocols, serving the mind-to-mind vernacular relations between people within their societies, and supporting a true multiculturalisation, which is most probably the biggest market to come.

You can use the US "globalization" as the default of the US unilateral layer: this works well (you support the tags people define themselves - you can use RFC 4151 URI-tags for that). But... there is no more commercial exclusivity. The architectural breakthrough is huge. However, the USA does not today have the researchers and the designers for this (they are more interested in the stabilization of their past deployment, marked by their technical cultures and motivated by their national security).
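For the self-defined tags mentioned above, RFC 4151 "tag:" URIs have a simple shape: an authority (a domain or e-mail address its minter controlled on a given date), a date, and a free specific part. The sketch below is mine; the authority and path are invented for illustration, but the URI shape follows RFC 4151.

```python
# Sketch of an RFC 4151 "tag:" URI of the kind the mail suggests for
# tags people define themselves. Authority and specific part below are
# invented examples, not real registrations.

def make_tag_uri(authority, date, specific):
    """Build a tag URI: tag:<authority>,<date>:<specific> (RFC 4151 shape)."""
    return f"tag:{authority},{date}:{specific}"

uri = make_tag_uri("example.org", "2005-11", "lang/fr-latn-fr")
print(uri)
# tag:example.org,2005-11:lang/fr-latn-fr
```

Because anyone who controls a domain name on some date can mint such tags, no central registry (and no registry operator) is needed, which is exactly the contrast with the IANA langtag registry that the mail is drawing.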
Above all, they have no financial incentive in a market context rather than a cultural-empowerment context (why work to lose market share?). The rest of the world is messier, but the conceptual competition over - after all - simple systems will spur R&D and innovation. As a Frenchman, I am interested in gaining shares of the French cultural market, whereas an American will see it as losing the international market share corresponding to French users. This is where non-US and culturally smart US developers have a leading role to play.

The competition will be on the brain2brain, mind2mind extended-services brainware layers. End2end bandwidth-oriented basic/value-added services are to lose interest and be merged into the complexity and versatility of their management. IDNs are of low interest on a global basis, as the sales of Chinese names show. Much more sophisticated user behaviours are involved than the dumb ".com" registration sales through absurdly competing registrars. This may be an opportunity for registrars: to sell more intelligent services. But they will have to understand and adapt to the expectations of the users, the first one being a no-spam network. The more one vernacularises, the less one can offer a low-grade service (users of a new technology can accept sunrise inconveniences, not those of an established solution).

jfc

At 18:54 05/12/2005, Danny Younger wrote:
Vint,