From: Quôc Peyrot <chojin@l...>
Subject: Re: [Caml-list] The Implicit Accumulator: a design pattern using optional arguments

On Jun 27, 2007, at 3:55 PM, Thomas Fischbacher wrote:

> Quôc Peyrot wrote:

[...]

> > let encrypt ?encrypted str =
> >   let len = String.length str in
> >   let result = match encrypted with
> >     | None -> String.create len
> >     | Some s -> s
> >   in
> >   (* ... *)
> >   result
>
> > Which gave me more than a 2x speedup while still being able to
> > call a simple:
> > let encrypted = encrypt str
> > during normal usage
>
> I use this strategy a lot and have found that it eventually pays to use
> uniform conventions for it: all my functions that can benefit from
> having space pre-allocated to write a result into take ?target
> as their very first named optional argument (and also return that
> target buffer afterwards).

Ah, thanks, I was actually trying to find a common name too, but
didn't really like "result". "target" is nice :p
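
If I understand the convention correctly, it would look something like
this (a minimal made-up sketch with a toy copy_string function, not
code from the thread):

let copy_string ?target src =
  let len = String.length src in
  (* reuse the caller's buffer if one was supplied, otherwise allocate *)
  let buf = match target with
    | None -> String.create len
    | Some s -> s
  in
  String.blit src 0 buf 0 len;
  buf

so that "let s = copy_string str" allocates a fresh buffer, while
"let _ = copy_string ~target:buf str" reuses buf and returns it.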

> However, unless I am mistaken, I fear that this also does introduce
> some intrinsic/unavoidable inefficiency, as providing a ?target
> argument will (presumably?) require dynamic consing of a <thingy>
> option cell - so not a good idea for a very small function that is
> called very very often.
>
> There are many many way more advanced tricks one would want to play
> with the idea of "allocating buffers at the appropriate time". For
> example, if this were LISP, one could often use dynamically scoped (in
> the sense of (declare (dynamic-extent buffer-stack))) contextual
> variables for great benefit (and these gory details often can also be
> hidden quite conveniently under a few (maybe even in-place macrolet)
> macros...), but unfortunately, OCaml does not support anything like
> that. Of course, re-entrantness of your code may always become an
> issue if you move buffers to higher scopes.

I didn't get that part, but I'm not familiar with Lisp.
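
The last sentence, about moving buffers to higher scopes, I think I do
follow: something like the sketch below (my own made-up example), where
a single shared scratch buffer avoids per-call allocation but makes the
function non-re-entrant.

let scratch = String.create 256      (* one buffer shared by all calls *)

let encrypt_shared str =
  let len = String.length str in
  assert (len <= String.length scratch);
  (* writes into scratch instead of allocating, but a nested call or
     another thread would overwrite scratch while we still need it *)
  String.blit str 0 scratch 0 len;
  (* ... encrypt scratch in place ... *)
  String.sub scratch 0 len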

> One thing OCaml can do better than, say, CMU CL, is to define globally
> visible functions that depend on some otherwise inaccessible context,
> as in the following example:
>
> let float_factorial =
>   let _known_factorials = ref [|1.0;1.0;2.0;6.0;24.0;120.0;720.0|] in
>   (fun n ->

I encountered this pattern today while reading extlib's OptParse.Opt  
code:

let value_option metavar default coerce errfmt =
   let data = ref default in
   {
     option_metavars = [metavar];
     option_defhelp = None;
     option_get = (fun _ -> !data);
     option_set_value = (fun x -> data := Some x);
   (*...*)

I was a little bit surprised at first that we could do that  
(let ...ref...  in) but it's really nice.
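
Just to check that I got the pattern right, here is my own attempt at a
complete version of such a memoizing factorial (a guess, not Thomas's
actual code):

let float_factorial =
  (* hidden memo table; nothing outside this definition can touch it *)
  let known = ref [| 1.0; 1.0; 2.0; 6.0; 24.0; 120.0; 720.0 |] in
  fun n ->
    let tbl = !known in
    let len = Array.length tbl in
    if n < len then tbl.(n)
    else begin
      (* grow the table up to n, reusing what was already computed *)
      let bigger = Array.make (n + 1) 0.0 in
      Array.blit tbl 0 bigger 0 len;
      for i = len to n do
        bigger.(i) <- bigger.(i - 1) *. float_of_int i
      done;
      known := bigger;
      bigger.(n)
    end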
To me it seems that the common feature which enables us to do all  
these tricks is the fact that we have a garbage collector (correct me  
if I am wrong). It's really powerful, and I find it fascinating.

I mean, for someone like me, with quite some experience in the asm/C/C++
world (i.e. a garbage-collector-less world) but not much in other
languages, it's easy to naively think of a garbage collector as a fancy
feature that saves you from having to call "free/delete". But I'm
starting to realize there is a whole new set of powerful design patterns
that comes with it. It has been said multiple times on this mailing
list, but I think we really lack a book about these design patterns and
optimization tricks, which are often specific to a given feature (or set
of features): functional style, lazy computation, garbage collection...

I find it ironic that high-level languages (such as OCaml) are intended
(that's my interpretation of it, of course) to hide low-level details
and give you more expressiveness in your code, which should naively make
you more productive and make it easier to program something. Yet they
therefore require a ton of new knowledge and a deep understanding of
advanced concepts before you can actually write efficient (runtime- and
memory-wise) code.

I mean, in asm/C/C++ there aren't many features to learn; you pretty
much do everything yourself. It's therefore quite easy (compared to
OCaml) to actually see what is efficient and what is not. OCaml is so
high-level, and does so much for you, that you really need to learn a
lot more about compilation theory to feel at ease when you are looking
for efficiency without giving up too much code elegance. But don't get
me wrong: I love it, it's fascinating, but still ironic from my point
of view.

> Other advanced optimization techniques that can be used for benefit
> in very specialized situations include (explicit) continuation coding:
> rather than returning a value (e.g. a tuple), you take as an argument
> a function to which you then pass on your return value(s). This is  
> quite
> useful whenever "execution flow branches out into multiple paths that
> have to be taken", and may sometimes (though rarely) also be used for
> good as a poor man's VALUES/MULTIPLE-VALUE-BIND substitute.

I didn't get that part at all. I think I would need an example to  
understand
why it is interesting to pass the function instead of just returning  
the tuple
and processing it.
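
Here is my guess at the mechanics, on a toy example (made up by me),
though I still don't see when it wins over just returning the tuple,
so please correct me if it misses the point:

(* direct style: build and return a tuple *)
let div_mod a b = (a / b, a mod b)

(* continuation style: take the "rest of the computation" as an
   argument k and pass both results straight to it, so no tuple
   needs to be built at all *)
let div_mod_k a b k = k (a / b) (a mod b)

let () =
  div_mod_k 17 5 (fun q r ->
    Printf.printf "q = %d, r = %d\n" q r)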

-- 
Best Regards,
Quôc