This manifests as the html+php lexer being unable to recover from a missing `?>` at EOF.
{{{
>>> from pygments.lexers import get_lexer_by_name
>>> lex = get_lexer_by_name("html+php")
>>> lex
<pygments.lexers.HtmlPhpLexer>
>>> print list(lex.get_tokens("<?php"))
}}}
The bug appears to be in pygments.lexer.do_insertions: it does not check for leftover insertions after the token loop.
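An illustrative sketch of the bug class (this is not Pygments' actual `do_insertions` code, just a simplified merge generator showing the same pattern): when the main token stream ends while insertions are still queued — as happens here, since the PHP block never closes — those insertions must be flushed after the loop rather than silently dropped.

```python
def merge_with_insertions(insertions, tokens):
    """Splice insertion token lists into a main token stream.

    insertions: list of (index, token_list) pairs, sorted by index.
    tokens: list of (pos, token) pairs from the main stream.
    """
    insertions = list(insertions)
    for pos, tok in tokens:
        # Emit any insertions scheduled at or before this position.
        while insertions and insertions[0][0] <= pos:
            _, ins_tokens = insertions.pop(0)
            for it in ins_tokens:
                yield it
        yield tok
    # The fix: flush insertions left over after the token loop ends,
    # instead of losing them when the stream stops early.
    for _, ins_tokens in insertions:
        for it in ins_tokens:
            yield it
```

Without the final flush, an insertion scheduled past the end of the main stream (the missing `?>` case) would never be emitted.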
Reported by Tim Hatch firstname.lastname@example.org