do_insertions bug with no tokens

Issue #180 resolved

This manifested itself as the html+php lexer being unable to recover from a missing `?>` at EOF, yielding no tokens at all.


    >>> from pygments.lexers import get_lexer_by_name
    >>> lex = get_lexer_by_name("html+php")
    >>> lex
    <pygments.lexers.HtmlPhpLexer>
    >>> print list(lex.get_tokens("<?php"))
    []

The bug appears to be in pygments.lexer.do_insertions: it does not check for leftover insertions after the token loop, so insertions scheduled past the end of the base token stream are silently dropped.
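A minimal, hypothetical sketch of the do_insertions pattern (simplified from, and not identical to, the actual Pygments implementation) illustrates the problem. The function name, tuple shapes, and token-type strings below are assumptions for illustration only. Without the final flush loop, an input like "<?php" with no closing "?>" produces an insertion but an empty base stream, and the insertion is lost.

```python
def do_insertions_sketch(insertions, tokens):
    """Merge insertion token lists into a base token stream.

    insertions: iterable of (index, [(pos, tokentype, value), ...]),
    where index is a character offset into the base stream.
    tokens: iterable of (pos, tokentype, value) from the base lexer.
    (Hypothetical simplified shapes, not the real Pygments API.)
    """
    insertions = iter(insertions)
    try:
        index, itokens = next(insertions)
    except StopIteration:
        # no insertions at all: pass the stream through unchanged
        yield from tokens
        return

    realpos = 0
    for pos, ttype, value in tokens:
        # emit any insertions scheduled at or before this offset
        while index is not None and index <= realpos:
            for ipos, itype, ivalue in itokens:
                yield realpos, itype, ivalue
            try:
                index, itokens = next(insertions)
            except StopIteration:
                index = None
        yield realpos, ttype, value
        realpos += len(value)

    # The reported bug: without this final loop, insertions left over
    # after the base token stream is exhausted (e.g. a PHP snippet with
    # no "?>", so the HTML lexer yields nothing) were silently dropped.
    while index is not None:
        for ipos, itype, ivalue in itokens:
            yield realpos, itype, ivalue
        try:
            index, itokens = next(insertions)
        except StopIteration:
            index = None
```

With an empty base stream, the leftover insertion still comes out: `list(do_insertions_sketch([(0, [(0, "Kw", "X")])], []))` yields `[(0, "Kw", "X")]` instead of `[]`.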


Reported by Tim Hatch
