[antlr-interest] Yet another TokenStream ( for C++ Target )

A Z asicaddress at gmail.com
Sat Feb 18 11:26:25 PST 2012


Hi Gokulakannan,

  I played with this idea myself, since it is the most efficient way, but I
don't see how it can work for any LL(*) grammar without adding grammar
constraints. If k is set in the options then it is much easier, and in fact
someone did this a few years ago:
http://markmail.org/message/ej25vkco44ppkaxt
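
For the fixed-k case the whole idea fits in a small ring buffer. Here is a
minimal sketch of what I mean (my own names and a toy Token struct, not the
code from that thread and not the ANTLR C++ runtime API):

#include <cstddef>
#include <stdexcept>
#include <string>
#include <utility>
#include <vector>

struct Token {
    std::size_t index = 0;   // absolute token index from the lexer
    std::string text;
};

// Keeps only the last k tokens; anything older is overwritten, so memory
// stays constant no matter how long the input is.
class FixedWindowBuffer {
public:
    explicit FixedWindowBuffer(std::size_t k) : slots_(k) {}

    // Append the next token from the lexer; the slot k positions back is
    // simply reused.
    void add(Token t) {
        nextIndex_ = t.index + 1;
        slots_[t.index % slots_.size()] = std::move(t);
    }

    // Fetch the token with absolute index i; throws once it has been
    // overwritten, i.e. once it fell out of the k-sized window.
    const Token& get(std::size_t i) const {
        if (i >= nextIndex_ || i + slots_.size() < nextIndex_)
            throw std::out_of_range("token outside the k-sized window");
        return slots_[i % slots_.size()];
    }

private:
    std::vector<Token> slots_;
    std::size_t nextIndex_ = 0;  // absolute index of the next token to add
};

With LL(*) there is no such k to size the buffer with, which is exactly the
problem.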


> b) Every time a rule's action is executed in execution mode (when
> backtracking == 0), it will delete all the tokens except the first and
> last token.
>
I don't quite follow you here. Actions can appear anywhere in a rule, and
since you'll always be inside some rule, the only way to know which tokens
are safe to delete is to track every token used by rule actions and delete
them only after the action executes.
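
Roughly the kind of bookkeeping I have in mind (the names are made up,
nothing here is generated code): every token an action touches gets pinned,
and only the unpinned ones behind the rule get released once the action has
run.

#include <cstddef>
#include <set>

class TokenReleaseTracker {
public:
    // Would be called whenever an action reads a token.
    void pin(std::size_t tokenIndex) { pinned_.insert(tokenIndex); }

    // Called after the action body has executed; every token before
    // ruleStop that was never pinned can be handed back to the stream.
    template <typename FreeFn>
    void releaseUpTo(std::size_t ruleStop, FreeFn freeToken) {
        for (std::size_t i = lowWaterMark_; i < ruleStop; ++i) {
            if (pinned_.count(i) == 0)
                freeToken(i);
        }
        lowWaterMark_ = ruleStop;
    }

private:
    std::set<std::size_t> pinned_;
    std::size_t lowWaterMark_ = 0;  // everything below this is already freed
};

That only works if the generated code can see every token access, which is
the real sticking point.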
  The codegen probably knows about all references to $token_vars in actions,
but it likely doesn't know about calls to LT(n). I think this approach can
work, but it will probably require that user code never access the token
stream or any tokens directly, only through $ references.
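
For what it's worth, here is roughly how I picture your point (b) plus the
exception-on-access behaviour of point (c), again only a sketch with made-up
names rather than anything from the C++ target runtime:

#include <cstddef>
#include <map>
#include <stdexcept>
#include <string>
#include <utility>

struct Token {
    std::size_t index = 0;
    std::string text;
};

class RuleScopedTokenStream {
public:
    void add(Token t) { tokens_[t.index] = std::move(t); }

    // Any access to a token that its owning rule already discarded throws.
    const Token& get(std::size_t i) const {
        auto it = tokens_.find(i);
        if (it == tokens_.end())
            throw std::runtime_error("token was discarded by its owning rule");
        return it->second;
    }

    // Called when a rule completes with backtracking == 0: drop everything
    // strictly between start and stop, keeping the two endpoints.
    void ruleCompleted(std::size_t start, std::size_t stop) {
        auto it = tokens_.upper_bound(start);
        while (it != tokens_.end() && it->first < stop)
            it = tokens_.erase(it);
    }

private:
    std::map<std::size_t, Token> tokens_;
};

The get() that throws is exactly where an LT(n) call from user code would
blow up, which is why the $-reference restriction matters.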

On Sat, Feb 18, 2012 at 11:43 PM, Gokulakannan Somasundaram <
gokul007 at gmail.com> wrote:

> one small correction.
>
> This can be enabled by changing the trait TOKENS_ACCESSED_FROM_OWNING_RULE
> to true.
>
> Thanks,
> Gokul.
>
> On Sun, Feb 19, 2012 at 1:23 AM, Gokulakannan Somasundaram <
> gokul007 at gmail.com> wrote:
>
> > Hi,
> >    I am planning to write a TokenStream with the following
> > characteristics:
> > a) It will fetch k tokens every time it is called (I am setting a
> > default of 100).
> > b) Every time a rule's action is executed in execution mode (when
> > backtracking == 0), it will delete all the tokens except the first and
> > last token.
> > c) After the tokens are deleted, any attempt to access them will throw
> > an exception.
> >
> > Advantages:
> > Memory usage will be optimal. It will be dependent on
> > a) Amount of backtracking required
> > b) Maximum number of tokens covered in a single rule
> > c) Amount of look-ahead (this will never be a determining factor)
> >
> > Disadvantages:
> > a) As a rule will lose all its tokens except the start and stop tokens,
> > any attempt to refer to those tokens would result in an exception.
> >
> > But this can be overcome by storing the required data from those tokens
> > in variable storage during the execution of the rule.
> >
> > This can be enabled by changing the trait
> > TOKENS_ACCESSED_FROM_OWNING_RULE to false.
> >
> > Any Comments / Suggestions?
> >
> > Thanks,
> > Gokul.
> >
>

