Commits

Benjamin Peterson  committed c13abed

tokenize is just broken on test_pep3131.py

  • Parent commits 3e9c882
  • Branches 3.2

Files changed (1)

File Lib/test/test_tokenize.py

     >>> tempdir = os.path.dirname(f) or os.curdir
     >>> testfiles = glob.glob(os.path.join(tempdir, "test*.py"))
 
+tokenize is broken on test_pep3131.py because regular expressions are broken on
+the obscure unicode identifiers in it. *sigh*
+    >>> testfiles.remove(os.path.join(tempdir, "test_pep3131.py"))
     >>> if not support.is_resource_enabled("cpu"):
     ...     testfiles = random.sample(testfiles, 10)
     ...
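The hunk above globs the test files, removes the one known to break tokenize, and then samples a subset when the `cpu` resource is disabled. That pattern can be sketched on its own as follows (the directory, sample size of 10, and the guard around `remove` are illustrative; `list.remove()` raises `ValueError` if the entry is absent, which the original doctest can ignore because the file is always present in `Lib/test/`):

```python
import glob
import os
import random

# Collect candidate test files from a directory (os.curdir is illustrative).
tempdir = os.curdir
testfiles = glob.glob(os.path.join(tempdir, "test*.py"))

# Exclude a file known to break the tool under test, mirroring the diff.
# Guard the remove() call since the file may not exist in this sketch.
bad = os.path.join(tempdir, "test_pep3131.py")
if bad in testfiles:
    testfiles.remove(bad)

# Sample a subset when running the full set is too expensive
# (the original doctest samples 10 files).
if len(testfiles) > 10:
    testfiles = random.sample(testfiles, 10)
```

Note that `random.sample` raises `ValueError` when the requested size exceeds the population, which is why the original guards the sampling behind a resource check and this sketch guards it behind a length check.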