Date: 2004-08-13 (16:03)
From: David McClain <David.McClain@A...>
Subject: Re: [Caml-list] CFG's and OCaml
Uhuhh... Yes, I did that same "evil" thing as well, even before 
discovering all these reduce/reduce conflicts.

What I find is that these screwball little tricks might help, and they 
might not. YACC is just too darned sensitive to minor and non-obvious 
perturbations in the input grammar specification. That makes sense 
given its legacy: it does arise from the old IBM-360, or more properly 
PDP-10, days. That was the style of programming back then... I remember 
it well.

I see that the root of the problem really lies in the reduction from 
LR(1) to LALR(1), where driver table entries are shared as much as 
possible to diminish the size of the tables. However, when that merging 
occurs, we end up with mixes like the one between the distinct 
expression and pattern subtrees in the grammar.

Indeed these do appear syntactically similar, yet by way of 
specification, we are hoping to apply completely different semantic 
actions to these reductions. This is a dilemma.

I had thought the other night about creating a common subtree of 
semantic actions and then letting the higher levels unravel that 
subtree. That would probably work well here, but it is a lot more work.
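A minimal OCaml sketch of that idea, with hypothetical type and function names (nothing here is taken from the real parser.mly): the grammar would reduce INT/FLOAT into one shared constant subtree, and the expression and pattern levels would each apply their own conversion afterwards, so the parser never has to choose between two reductions on seeing an INT.

```ocaml
(* Shared subtree: both expressions and patterns reduce literals
   into this one type, avoiding the reduce/reduce choice. *)
type constant =
  | Const_int of int
  | Const_float of float

(* Distinct ASTs for the two sides of the grammar. *)
type expr = Exp_constant of constant
type pattern = Pat_constant of constant

(* The "unraveling" happens above the shared reduction: each
   higher-level rule converts the common subtree its own way. *)
let expr_of_constant c = Exp_constant c
let pattern_of_constant c = Pat_constant c

let () =
  let c = Const_int 42 in
  assert (expr_of_constant c = Exp_constant (Const_int 42));
  assert (pattern_of_constant c = Pat_constant (Const_int 42))
```

The cost David mentions is real: every place the two sublanguages diverge, the conversion functions must reject or translate nodes, instead of the grammar doing that for free.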

David McClain
Senior Corporate Scientist
Avisere, Inc.
+1.520.390.7738 (USA)

On Aug 13, 2004, at 08:49, Brian Hurt wrote:

> On Fri, 13 Aug 2004, David McClain wrote:
>> Okay... here's a case where when I do "exactly" what the gurus at 
>> Inria
>> do, I get a reduce/reduce conflict, and yet when I build OCaml it does
>> not report any such conflicts. [I say "exactly" because obviously I'm
>> not...]
>> simple_expr:
>> 	constant
>> 	...
>> simple_pattern:
>> 	signed_constant
>> 	...
>> constant:
>> 	INT
>> | 	FLOAT
>> signed_constant:
>> 	constant
>>   ;; /*  ---------------------------------------------------------- */
>> The reduce/reduce conflict comes on deciding whether to assign an INT
>> seen to signed_constant which will reduce to simple_pattern, or 
>> instead
>> to become constant which reduces to simple_expr. Both Inria and I do
>> completely different semantic reductions in these two cases, and so a
>> reduce/reduce conflict could be fatal here...
> When the compiler sees an int, which path should it take?
> My advice would be to remove the constant from signed_constant's 
> patterns.
>> So, as so often happens with the master's touch, everything works fine
>> for them, but it doesn't for me. Why should this be, in this example
>> case?
> They're doing something evil -- take a look at line 355 of
> parsing/parser.mly in 3.08.0.
> -- 
> "Usenet is like a herd of performing elephants with diarrhea -- 
> massive,
> difficult to redirect, awe-inspiring, entertaining, and a source of
> mind-boggling amounts of excrement when you least expect it."
>                                 - Gene Spafford
> Brian
