Re: Map is not tail recursive

From: Xavier Leroy (xleroy@pauillac.inria.fr)
Date: Mon Jan 11 1999 - 12:03:58 MET


Date: Mon, 11 Jan 1999 12:03:58 +0100
From: Xavier Leroy <xleroy@pauillac.inria.fr>
To: Juan Jose Garcia Ripoll <jjgarcia@ind-cr.uclm.es>,
Subject: Re: Map is not tail recursive
In-Reply-To: <369885D1.1C8A1269@ind-cr.uclm.es>; from Juan Jose Garcia Ripoll on Sun, Jan 10, 1999 at 11:49:53AM +0100

> I've had a look at the List package and it seems that it is not properly
> tail recursive.

No, it is not. Like many other functions on lists.

> Even more, for medium to large lists it exhausts the
> stack. I would suggest either recoding it as
> [with List.rev]
> Another possibility would be to introduce destructive operations such as
> Scheme's setcdr! and setcar!. This would eliminate the need to use
> List.rev, at the cost of introducing some imperative style.

The solution using List.rev has an unacceptable run-time penalty. In
particular, it heap-allocates twice as much as the natural
implementation.
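
For concreteness, the suggested rewrite amounts to something like the
following sketch (the name is mine, not part of the standard library):

    (* Tail-recursive map: accumulate the results in reverse order,
       then reverse the accumulator.  The accumulator is a full
       intermediate list that List.rev immediately turns into garbage,
       hence the doubled allocation. *)
    let map_tailrec f l =
      let rec loop acc = function
        | [] -> List.rev acc
        | x :: rest -> loop (f x :: acc) rest
      in
      loop [] l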

setcdr! for lists is not available in the language, since lists are
immutable. The underlying implementation model supports setcdr!
(at some cost, though, because of the generational garbage
collection), so we could have a "magic" implementation of List.map
that uses setcdr!, but it's always preferable to remain within what
the language can express.
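
For illustration only, such a "magic" implementation could look roughly
like the sketch below. It steps outside the language through the Obj
module, relies on the concrete representation of lists, and is not
something I recommend:

    (* Unsafe sketch: allocate one cons cell per element and overwrite
       the tail of the previous cell in place, emulating setcdr! with
       Obj.set_field.  Each assignment goes through the write barrier,
       which is the generational-GC cost mentioned above. *)
    let map_destructive f = function
      | [] -> []
      | x :: rest ->
          let first = [f x] in
          let rec loop last = function
            | [] -> ()
            | x :: rest ->
                let cell = [f x] in
                Obj.set_field (Obj.repr last) 1 (Obj.repr cell);
                loop cell rest
          in
          loop first rest;
          first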

Let me put this another way. There are some library functions for
which no "one size fits all" implementation exists: the implementation
necessarily favors one style of use over others. For List.map or
list concatenation, for instance, one has to choose between running at
full speed on small to medium-sized lists, or being tail-recursive and
handling arbitrarily large lists at a significant cost in speed and
clarity of implementation. I chose the former approach. So, if you
really need to handle large lists, you may have to write your own
(tail-rec and slow) map function. It's not really code duplication,
in that your map function will have quite different running behavior
from the one in the standard library.
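
For comparison, the full-speed, non-tail-recursive style I chose looks
roughly like this (a sketch, not the library source verbatim):

    (* Non-tail-recursive map: one cons cell per element and no
       intermediate list, but one stack frame per element, hence the
       stack overflow on very long lists. *)
    let rec map_simple f = function
      | [] -> []
      | x :: rest -> let y = f x in y :: map_simple f rest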

I would also contend that if your program routinely manipulates
100000-element lists, then you're probably using the wrong data
structure anyway. But that's a different issue.

Regards,

- Xavier Leroy


