Change lexer state according to parser state on the fly #60

Closed
brchiu opened this issue Sep 17, 2020 · 1 comment
Comments

brchiu commented Sep 17, 2020

If we want to change lexer states according to parser states, e.g. when certain token combination patterns are recognized, it does not seem possible, because the parse method of the Parser class consumes the tokens produced by the Lexer's tokenize method.
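
For context, here is a minimal sketch of the standard SLY pipeline being described; the token names and grammar are hypothetical, for illustration only:

from sly import Lexer, Parser

class CalcLexer(Lexer):
    tokens = { NAME, NUMBER }
    ignore = ' \t'
    NAME = r'[a-zA-Z_][a-zA-Z0-9_]*'
    NUMBER = r'\d+'

class CalcParser(Parser):
    tokens = CalcLexer.tokens

    @_('NAME NUMBER')
    def statement(self, p):
        return (p.NAME, p.NUMBER)

lexer = CalcLexer()
parser = CalcParser()
# The parser only receives the token stream; by default it holds no
# reference to the lexer, so there is no obvious hook for switching
# lexer states from inside a grammar action.
result = parser.parse(lexer.tokenize('version 42'))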

To be exact, I would like to implement behavior similar to asn2wrs.py, which changes the lexer state after WITH SYNTAX or in specific rules, e.g. p_ObjectSet, p_ParameterList, ...

Is there any alternative way to do it?

Thanks.

brchiu commented Sep 18, 2020

I record the initial lexer, an instance of Lexer1, inside the MyParser class, then switch it to Lexer2 in specific rules via empty marker productions to do the trick. Because tokenize is a generator, the state change takes effect for all tokens that have not yet been produced.

Thank you for this wonderful project.

from sly import Lexer, Parser

class Lexer1(Lexer):
    ...

class Lexer2(Lexer):
    ...

class MyParser(Parser):
    tokens = Lexer1.tokens

    def __init__(self, lexer):
        # Keep a reference to the running lexer so grammar actions
        # can switch its state on the fly.
        self.lexer = lexer

    @_("TOK_WITH TOK_SYNTAX lbraceignore SyntaxList rbraceignore")
    def WithSyntaxSpec(self, p):
        return p[3]

    @_("braceignorebegin TOK_LBRACE")
    def lbraceignore(self, p):
        return p[0]

    # Empty marker rule, reduced when TOK_LBRACE shows up as lookahead:
    # switches the lexer to Lexer2 for the tokens after the opening brace.
    @_("")
    def braceignorebegin(self, p):
        self.lexer.begin(Lexer2)

    @_("braceignoreend TOK_RBRACE")
    def rbraceignore(self, p):
        return p[0]

    # Empty marker rule: switches back to Lexer1 at the closing brace.
    @_("")
    def braceignoreend(self, p):
        self.lexer.begin(Lexer1)

if __name__ == "__main__":
    lexer1 = Lexer1()
    parser = MyParser(lexer1)
    parser.parse(lexer1.tokenize(data))  # data: the input text to parse
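
As a side note (not from the issue itself), the begin() call used above is SLY's standard mechanism for switching a lexer instance between Lexer classes. Here is a minimal, self-contained sketch of it with hypothetical token names, where the lexer switches its own state instead of being switched by the parser:

from sly import Lexer

class OuterLexer(Lexer):
    tokens = { WORD, LBRACE }
    ignore = ' \t'
    WORD = r'[a-zA-Z]+'

    @_(r'\{')
    def LBRACE(self, t):
        # Switch this lexer instance to the "inside braces" state.
        self.begin(InnerLexer)
        return t

class InnerLexer(Lexer):
    tokens = { IGNORED, RBRACE }
    ignore = ' \t'
    IGNORED = r'[^{}]+'    # swallow everything between the braces

    @_(r'\}')
    def RBRACE(self, t):
        # Switch back to the outer state.
        self.begin(OuterLexer)
        return t

lexer = OuterLexer()
for tok in lexer.tokenize('WITH { anything at all } SYNTAX'):
    print(tok.type, tok.value)

The trick in this issue simply moves the begin() calls out of lexer actions and into parser rule actions, via the stored self.lexer reference.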

brchiu closed this as completed Sep 18, 2020