From: Thomas Fischbacher <tf@f...>
Subject: Re: [Caml-list] If OCaml were a car

John Skaller wrote:

>>This might come back and bite you in a couple of years when you've got
>>16- and 32-core processors and you find your parallel GC / operating
>>system really don't scale.
> 
> Wrong approach IMHO. There are physical limits on localisation:
> the speed of light and Heisenberg's constant. We know from
> simple observation of organic systems that systems clump
> at various levels.

If one looks, e.g., at the distribution function of something like
the size of Debian packages, one finds that it follows a power
law (dN/d(size) ~ a*size^b) remarkably well over multiple orders of
magnitude. Similar distributions can be observed for many systems.
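(As an aside, here is a minimal OCaml sketch of how one might estimate
the exponent of such a power-law tail from a list of sizes, using the
standard maximum-likelihood estimator alpha = 1 + n / sum_i ln(x_i/x_min);
this is only an illustration, and the sample sizes and the x_min cutoff
below are made up, not actual package data.)

(* Minimal sketch: estimate the exponent of a power-law tail
   dN/d(size) ~ size^(-alpha) for size >= x_min, via the
   maximum-likelihood estimator alpha = 1 + n / sum_i ln(x_i / x_min). *)
let power_law_exponent ~x_min sizes =
  let tail = List.filter (fun x -> x >= x_min) sizes in
  let n = float_of_int (List.length tail) in
  let log_sum =
    List.fold_left (fun acc x -> acc +. log (x /. x_min)) 0.0 tail
  in
  if log_sum > 0.0 then Some (1.0 +. n /. log_sum) else None

let () =
  (* Hypothetical sizes in bytes, only to make the sketch runnable. *)
  let sizes = [ 12e3; 45e3; 3e3; 800e3; 150e3; 9e3; 2400e3; 60e3 ] in
  match power_law_exponent ~x_min:1e3 sizes with
  | Some alpha -> Printf.printf "estimated exponent: %.2f\n" alpha
  | None -> print_endline "not enough data above x_min"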

There are deep, fundamental reasons to expect something like this on
very general grounds whenever three things come together:

(1) We have "frustration" in the system in the sense that there are
multiple stable configurations, and lots of them.

(2) The system's constituents are linked up in such a way that reducing
the "internal stress" in one place increases the internal stress in its
neighbourhood, and

(3) The individual components are networked in an effectively
more-than-one-dimensional way. (So the number of nodes reachable
within a given number of hops grows faster than linearly with that
number.)

These are very general properties that come together in many different
systems. Now, in 1987 a bunch of theoretical physicists (Bak, Tang, and
Wiesenfeld) had the deep insight that whenever this happens, all the
interesting distributions one can derive from such a system should
turn out to follow power-law behaviour. (The key idea is to look at the
weakest condition that gives stability, which will determine the
system's structure, as every stronger condition would constrain
configuration space too much.)
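(To make this concrete, below is a toy OCaml version of the
Bak-Tang-Wiesenfeld sandpile from that 1987 work, not something from
the original discussion: grains are dropped on a small grid, any site
holding four or more grains topples one grain onto each neighbour, and
the number of topplings triggered by each grain is recorded as an
avalanche size. Grid size, number of grains, and the order-of-magnitude
bucketing are arbitrary choices; once the pile has settled into its
critical state, the avalanche sizes should spread over many orders of
magnitude in roughly power-law fashion.)

let size = 32
let grid = Array.make_matrix size size 0

(* Topple until every site holds fewer than 4 grains; return how many
   topplings the last dropped grain triggered. *)
let relax () =
  let topplings = ref 0 in
  let unstable = ref true in
  while !unstable do
    unstable := false;
    for i = 0 to size - 1 do
      for j = 0 to size - 1 do
        if grid.(i).(j) >= 4 then begin
          unstable := true;
          incr topplings;
          grid.(i).(j) <- grid.(i).(j) - 4;
          let bump i j =
            (* Grains pushed off the edge are simply lost. *)
            if i >= 0 && i < size && j >= 0 && j < size then
              grid.(i).(j) <- grid.(i).(j) + 1
          in
          bump (i - 1) j; bump (i + 1) j; bump i (j - 1); bump i (j + 1)
        end
      done
    done
  done;
  !topplings

let () =
  Random.self_init ();
  let buckets = Hashtbl.create 16 in
  for _grain = 1 to 100_000 do
    let i = Random.int size and j = Random.int size in
    grid.(i).(j) <- grid.(i).(j) + 1;
    let s = relax () in
    if s > 0 then begin
      (* Bucket avalanche sizes by order of magnitude. *)
      let b = int_of_float (log10 (float_of_int s)) in
      let prev = try Hashtbl.find buckets b with Not_found -> 0 in
      Hashtbl.replace buckets b (prev + 1)
    end
  done;
  Hashtbl.iter
    (fun b n -> Printf.printf "avalanches of size ~10^%d: %d\n" b n)
    buckets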

In essence, Perl did two things right: (1) It properly acknowledged that
most problems are small problems that should be resolvable as a one-line
job (unlike C++, which believes all problems to be huge tasks - if they
are not, they first have to be turned into one so that one can then use
C++ to solve them), and (2) it also acknowledged that human language is
irregular in some funny but far from chaotic way, and tried to mimic
some of that irregularity in order to make programming more natural and
hence less tiring.

Quite generally, what deeply worries me (in programming as well as in
a much broader context concerning how our present-day society works)
is that we do not pay proper respect to these fundamental mechanisms
that always cause activity on a variety of different scales. When
making decisions, our reasoning all too often starts out from
false assumptions about the scale distribution of system sizes. In
particular, we often design our systems based on assumptions of
extreme scales (say, extreme centralization or extreme
decentralization) rather than on "reasonable assumptions about
natural distributions". That is fundamentally wrong, and
thermodynamics/statistical mechanics can even prove this. :-)

-- 
best regards,
Thomas Fischbacher
tf@functionality.de