
Issue #156 resolved

Test run failing in eval when tests print something to stdout.

René Dudfield
created an issue

From issue #155

BUGGY TEST RESULTS EVAL:
 {'test.image__save_gl_surface_test': {'errors': [('GL_ImageSave.test_image_save_works_with_opengl_surfaces',
                                                  'Traceback (most recent call last):\n  File "/build/buildd/pygame-1.9.2~pre~ubuntu+2250+2~raring1/test/image__save_gl_surface_test.py", line 29, in test_image_save_works_with_opengl_surfaces\n    screen = pygame.display.set_mode((640,480), OPENGL|DOUBLEBUF)\nerror: OpenGL not available\n',
                                                  '/build/buildd/pygame-1.9.2~pre~ubuntu+2250+2~raring1/test/image__save_gl_surface_test.py',
                                                  '29')],
                                      'failures': [],
                                      'num_tests': 1,
                                      'output': 'E\n======================================================================\nERROR: GL_ImageSave.test_image_save_works_with_opengl_surfaces\n----------------------------------------------------------------------\nTraceback (most recent call last):\n  File "/build/buildd/pygame-1.9.2~pre~ubuntu+2250+2~raring1/test/image__save_gl_surface_test.py", line 29, in test_image_save_works_with_opengl_surfaces\n    screen = pygame.display.set_mode((640,480), OPENGL|DOUBLEBUF)\nerror: OpenGL not available\n\n----------------------------------------------------------------------\nRan 1 test in 0.004s\n\nFAILED (errors=1)\n',
                                      'tests': {'GL_ImageSave.test_image_save_works_with_opengl_surfaces': {'error': ('GL_ImageSave.test_image_save_works_with_opengl_surfaces',
                                                                                                                      'Traceback (most recent call last):\n  File "/build/buildd/pygame-1.9.2~pre~ubuntu+2250+2~raring1/test/image__save_gl_surface_test.py", line 29, in test_image_save_works_with_opengl_surfaces\n    screen = pygame.display.set_mode((640,480), OPENGL|DOUBLEBUF)\nerror: OpenGL not available\n',
                                                                                                                      '/build/buildd/pygame-1.9.2~pre~ubuntu+2250+2~raring1/test/image__save_gl_surface_test.py',
                                                                                                                      '29'),
                                                                                                            'stderr': '',
                                                                                                            'stdout': '',
                                                                                                            'tags': ['slow',
                                                                                                                     'display'],
                                                                                                            'times': [0.0038089752197265625]}}}}
[?1000l[?25h[?0c

Traceback (most recent call last):
  File "run_tests.py", line 9, in <module>
    import test.__main__
  File "/build/buildd/pygame-1.9.2~pre~ubuntu+2250+2~raring1/test/__main__.py", line 131, in <module>
    run(*args, **kwds)
  File "/build/buildd/pygame-1.9.2~pre~ubuntu+2250+2~raring1/test/test_utils/run_tests.py", line 275, in run
    test_results = get_test_results(raw_return)
  File "/build/buildd/pygame-1.9.2~pre~ubuntu+2250+2~raring1/test/test_utils/test_runner.py", line 192, in get_test_results
    return eval(test_results.group(1))
  File "<string>", line 17
    [?1000l[?25h[?0c
    ^
SyntaxError: invalid syntax

The same failure happens on both i386 and x64 builds. The full log is available here.
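The failure mode can be reproduced in isolation: the test runner captures the printed results dict and passes it to eval(), so once stray terminal control output (the `[?1000l[?25h[?0c` above) lands in the captured text, eval() raises the same SyntaxError. A minimal sketch:

```python
# Minimal reproduction of the failure mode in this report: eval() of the
# captured results text works while it is a clean dict literal, but
# raises SyntaxError once stray terminal escape output is appended.
clean = "{'num_tests': 1, 'failures': []}"
polluted = clean + "\n\x1b[?1000l\x1b[?25h\x1b[?0c"

assert eval(clean) == {'num_tests': 1, 'failures': []}

try:
    eval(polluted)
except SyntaxError as exc:
    print("eval failed, as in the report:", exc)
```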

Comments (6)

  1. Thomas Kluyver

At present, the start of the eval-able output is marked with the string <--!! TEST RESULTS START HERE !!-->. A simple fix for this case would be to add an end marker as well, so the spurious output is cut off before calling eval().

    That could still be susceptible to extra output mixed in with the results - which is unlikely, but possible if threads are in use. A more robust solution could be to write the information to a temporary file.
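    The marker approach could look like the sketch below. Only the start-marker string is confirmed in this thread; the end-marker string and the exact body of get_test_results are illustrative, not necessarily what the eventual commit does.

    ```python
    import re

    # The start marker is quoted in the comment above; the end marker
    # string here is hypothetical.
    START = '<--!! TEST RESULTS START HERE !!-->'
    END = '<--!! TEST RESULTS END HERE !!-->'

    def get_test_results(raw_return):
        """Eval only the text between the two markers, so stray stdout
        before or after them cannot break eval()."""
        match = re.search(
            re.escape(START) + r'(.*?)' + re.escape(END),
            raw_return, re.DOTALL)
        if match is None:
            raise ValueError('test result markers not found in output')
        return eval(match.group(1))

    # Trailing terminal escape noise after the end marker is now ignored:
    raw = START + "{'num_tests': 1}" + END + '\n\x1b[?1000l\x1b[?25h'
    print(get_test_results(raw))  # {'num_tests': 1}
    ```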

  2. René Dudfield reporter

    ah, ok. Nice one.

    Well, I added the end marker in the commit bbb8804. Let's see if that fixes it on the launchpad builders.

    Using a temporary file seems like a bigger change, so I just did the smaller change.
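    For reference, the temporary-file route Thomas suggested (the "bigger change" not taken here) could look roughly like this sketch — the test subprocess writes its results to a file instead of stdout, so stray prints cannot corrupt them; json in place of eval is an assumption of mine, not something proposed in the thread:

    ```python
    import json
    import os
    import tempfile

    # Sketch of the temporary-file alternative: results go to a file,
    # not stdout, so any extra output from the tests is irrelevant.
    def write_results(results):
        """Called in the test subprocess; returns the results file path."""
        fd, path = tempfile.mkstemp(suffix='.json')
        with os.fdopen(fd, 'w') as f:
            json.dump(results, f)
        return path

    def read_results(path):
        """Called in the parent runner to load the results back."""
        with open(path) as f:
            return json.load(f)
    ```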
