do_insertions bug with no tokens

Issue #180 resolved

This manifested itself as the html+php lexer being unable to recover from a missing ?> at EOF.

{{{
>>> from pygments.lexers import get_lexer_by_name
>>> lex = get_lexer_by_name("html+php")
>>> lex
<pygments.lexers.HtmlPhpLexer>
>>> print list(lex.get_tokens("<?php"))
[]
}}}

The bug appears to be in pygments.lexer.do_insertions: it does not check for leftover insertions after the token loop, so if the token stream runs out before every insertion point is reached, the remaining insertions are silently dropped.

See http://trac.edgewall.org/ticket/4464#comment:8
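A minimal sketch of the kind of fix, assuming do_insertions's documented interface (insertions as (index, token-list) pairs spliced into a stream of (position, token, value) triples); variable names are illustrative rather than copied from pygments.lexer:

{{{
def do_insertions(insertions, tokens):
    """Splice the (index, itokens) pairs from `insertions` into the
    (pos, token, value) stream `tokens` at the given indices."""
    insertions = iter(insertions)
    try:
        index, itokens = next(insertions)
    except StopIteration:
        # nothing to insert, pass the stream through unchanged
        for item in tokens:
            yield item
        return

    realpos = None
    insleft = True
    for i, t, v in tokens:
        if realpos is None:
            realpos = i
        oldi = 0
        # emit any insertions whose index falls inside this value
        while insleft and i + len(v) >= index:
            tmpval = v[oldi:index - i]
            yield realpos, t, tmpval
            realpos += len(tmpval)
            for it_index, it_token, it_value in itokens:
                yield realpos, it_token, it_value
                realpos += len(it_value)
            oldi = index - i
            try:
                index, itokens = next(insertions)
            except StopIteration:
                insleft = False
                break
        yield realpos, t, v[oldi:]
        realpos += len(v) - oldi

    # The missing step: if `tokens` yielded nothing (as with "<?php"
    # and no closing ?>), or ran out before every insertion point was
    # reached, flush the leftover insertions instead of dropping them.
    while insleft:
        realpos = realpos or 0
        for it_index, it_token, it_value in itokens:
            yield realpos, it_token, it_value
            realpos += len(it_value)
        try:
            index, itokens = next(insertions)
        except StopIteration:
            insleft = False
}}}

With the trailing loop in place, the session above should yield the PHP tokens for "<?php" instead of an empty list.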

Reported by Tim Hatch trac@timhatch.com
