From: Gordon Henriksen <gordonhenriksen@m...>
Subject: Re: [Caml-list] Re: Why OCaml sucks [unicode]
On May 14, 2008, at 01:04, Mike Lin wrote:

> Re: unicode support, anyone else who was around programming Win32  
> 5-10 years ago (before .NET) might join me in offering a wary word  
> of caution: the otherwise-disarming "let's just support both in one  
> runtime" philosophy can be taken too far. In the nightmare scenario  
> (which was reality until Windows 9x finally died an overdue death)  
> you end up with two versions of every library function that takes a  
> string. (This when we now have like 3 contenders for a standard  
> library and n GUI toolkit bindings!)

FWIW, this continues to be the reality. Win32 entry points that handle
strings are declared in two variants, like so:

BOOL DoFooW(...); // UTF-16 variant takes wchar_t. Always UTF-16.
BOOL DoFooA(...); // “ANSI” character set variant takes char. Could be UTF-8.

#ifdef UNICODE    /* the Win32 headers key off UNICODE, not _UNICODE */
#  define DoFoo DoFooW
#else
#  define DoFoo DoFooA
#endif

Doubtless, in most cases the implementation of one variant simply
transcodes the string and calls the other.
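For what it's worth, a minimal sketch of what that forwarding might look
like, sticking with the hypothetical DoFoo names above (the real Win32
implementations aren't public, so this is purely illustrative; only
MultiByteToWideChar is an actual API):

#include <windows.h>

BOOL DoFooW(LPCWSTR name);   /* assumed wide-character implementation */

BOOL DoFooA(LPCSTR name)     /* "ANSI" wrapper */
{
    WCHAR wide[MAX_PATH];
    /* CP_ACP = the process's current "ANSI" code page */
    if (!MultiByteToWideChar(CP_ACP, 0, name, -1, wide, MAX_PATH))
        return FALSE;        /* conversion failed or buffer too small */
    return DoFooW(wide);     /* forward to the UTF-16 variant */
}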

> I'll take a couple days to try to reason out more specifically about  
> how that got so out of control, but I just want to note that there  
> is a precedent for trying to have a sympatric speciation between  
> ASCII and unicode software and, while it was necessary for  
> historical reasons, it was pretty godawful to deal with!

— Gordon