Manual:Lexer:Limitations
Copied as-is:
>>> from lepl import *
>>> matchers = (Integer() | Literal('-'))[:] & Eos()
>>> list(matchers.match('1-2'))
[(['1', '-2'], ''), (['1', '-', '2'], '')]
>>> tokens = (Token(Integer()) | Token(r'\-'))[:] & Eos()
>>> list(tokens.match('1-2'))
[(['1', '-2'], Line(None)[0:])]
>>> matcher = Token(Any()) & Any()
>>> matcher.parse('')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/Python-3.1/lib/python3.1/site-packages/lepl/matchers.py", line 187, in parse
    return self.null_parser(config)(stream)
  File "/usr/local/Python-3.1/lib/python3.1/site-packages/lepl/matchers.py", line 151, in null_parser
    return make_parser(self, config.stream.null, config)
  File "/usr/local/Python-3.1/lib/python3.1/site-packages/lepl/parser.py", line 243, in make_parser
    matcher = make_matcher(matcher, stream, config)
  File "/usr/local/Python-3.1/lib/python3.1/site-packages/lepl/parser.py", line 227, in make_matcher
    matcher = rewriter(matcher)
  File "/usr/local/Python-3.1/lib/python3.1/site-packages/lepl/lexer/rewriters.py", line 109, in rewriter
    tokens = find_tokens(matcher)
  File "/usr/local/Python-3.1/lib/python3.1/site-packages/lepl/lexer/rewriters.py", line 62, in find_tokens
    for n in non_tokens)))
lepl.lexer.matchers.LexerError: The grammar contains a mix of Tokens and non-Token matchers at the top level. If Tokens are used then non-token matchers that consume input must only appear "inside" Tokens. The non-Token matchers include: Any.
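The first limitation shown above (the token version returns only one result where the plain matchers return two) can be illustrated without LEPL. This is a hedged sketch using only the standard `re` module, not LEPL's actual lexer: a greedy lexer commits to a single tokenization before parsing begins, so if the integer token may carry a leading sign, '1-2' is split one way only and the ('1', '-', '2') alternative is never offered to the parser.

```python
import re

def tokenize(text):
    """Greedily split text into signed integers and bare '-' signs.

    Mimics (loosely) how a token-based lexer consumes input: longest
    match wins at each position, with no backtracking over choices.
    """
    return re.findall(r'-?\d+|-', text)

print(tokenize('1-2'))  # → ['1', '-2']; the ['1', '-', '2'] split is lost
```

This is why the LEPL manual warns that tokens discard ambiguity: the lexer runs once, up front, and the parser only ever sees the one token stream it produced.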
(Note: the manual mistakenly wrote this as list(functions.match('1-2')), though.)