Issue while combining coverage files: ValueError: Expecting object: line 1 column 73665 (char 73664)

Issue #559 closed
Anish
created an issue

Hi, I am using coverage-4.3.4 to generate coverage reports for my project.

1. The parallel flag is enabled.
2. I am also using the COVERAGE_FILE= variable to override the location of the coverage metadata files.
3. Subprocess instrumentation is enabled by calling coverage.process_startup() in sitecustomize.py.

After running my tests and trying to combine the reports, I get the error message below:

    Coverage.py warning: Couldn't read data from '/var/log/PyCov/PyCov.dhcp-8-95.13499.861792': ValueError: Expecting object: line 1 column 73665 (char 73664)
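The subprocess instrumentation described in step 3 can be sketched as follows. coverage.process_startup() is the documented coverage.py hook; the try/except guard and the `started` flag are my additions for illustration, not from the thread:

```python
# sitecustomize.py -- placed on sys.path so every Python process,
# including subprocesses, runs this at interpreter startup.
# process_startup() is a no-op unless the COVERAGE_PROCESS_START
# environment variable points at a coverage config file.
started = False
try:
    import coverage
    coverage.process_startup()
    started = True  # the hook was invoked (measurement itself may still be off)
except ImportError:
    # coverage is not installed in this interpreter: skip measurement
    pass
```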

Regards, Anish

Comments (8)

  1. Anish reporter

    Thanks for the reply Ned.

    Possibly I am stopping the services before they finish the write; I stop all services so that they dump their coverage metadata, but I don't see any active process holding open the COVERAGE_FILE location:

        lsof +D /var/log/PyCov
        COMMAND   PID USER  FD  TYPE DEVICE SIZE/OFF   NODE NAME
        bash     4968 root  cwd  DIR    8,3     4096 647517 /var/log/PyCov
        lsof    34388 root  cwd  DIR    8,3     4096 647517 /var/log/PyCov
        lsof    34389 root  cwd  DIR    8,3     4096 647517 /var/log/PyCov

    There is no process other than lsof and bash accessing this location, yet some files are still truncated.

    Is there a good way to make sure the files don't get truncated, or a configuration setting that ensures the write completes? The services are stopped gracefully, not killed.

    Thanks, Anish

  2. Ned Batchelder repo owner

    If you are allowing the process to end gracefully, then I don't see why the data wouldn't get written. If you want, try adding a flush. In coverage/data.py, at the end of CoverageData.write_fileobj, add "file_obj.flush()". See if that improves things.
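As a sketch, the suggested change would look roughly like this. The class below is a simplified stand-in for coverage 4.x's CoverageData, not the actual source; only the write path relevant to the suggestion is shown:

```python
import json


class CoverageData:
    """Simplified stand-in for coverage 4.x's CoverageData."""

    # Prefix written before the JSON payload (the real class writes a
    # similar "go away" marker so the file isn't parsed as plain JSON).
    _GO_AWAY = "!coverage.py: This is a private format, don't read it directly!"

    def __init__(self, file_data):
        self._file_data = file_data

    def write_fileobj(self, file_obj):
        # Write the data to the file.
        file_obj.write(self._GO_AWAY)
        json.dump(self._file_data, file_obj)
        file_obj.flush()  # the suggested addition: drain buffered bytes
```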

  3. Anish reporter

    Hi Ned, I have added the flush in data.py as below, but I am still hitting the issue:

        # Write the data to the file.
        file_obj.write(self._GO_AWAY)
        file_obj.flush()
        json.dump(file_data, file_obj)

    Error message:

        Coverage.py warning: Couldn't read data from '/var/log/PyCov/.coverage.sc-rdops-sys02-dhcp-53-178.eng.test.com.38111.291278': ValueError: Expecting object: line 1 column 4096 (char 4095)
        Coverage.py warning: Couldn't read data from '/var/log/PyCov/.coverage.sc-rdops-sys02-dhcp-53-178.eng.test.com.40784.110718': ValueError: Expecting object: line 1 column 94208 (char 94207)
        Coverage.py warning: Couldn't read data from '/var/log/PyCov/.coverage.sc-rdops-sys02-dhcp-53-178.eng.test.com.41928.763865': ValueError: Expecting object: line 1 column 176128 (char 176127)
        Coverage.py warning: Couldn't read data from '/var/log/PyCov/.coverage.sc-rdops-sys02-dhcp-53-178.eng.test.com.38116.455212': ValueError: No JSON object could be decoded
        Coverage.py warning: Couldn't read data from '/var/log/PyCov/.coverage.sc-rdops-sys02-dhcp-53-178.eng.test.com.40203.167990': ValueError: Expecting object: line 1 column 61440 (char 61439)
        Coverage.py warning: Couldn't read data from '/var/log/PyCov/.coverage.sc-rdops-sys02-dhcp-53-178.eng.test.com.38737.504823': ValueError: Expecting object: line 1 column 81920 (char 81919)
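One observation on the errors above: the failing column offsets (4096, 61440, 81920, 94208, 176128) are all exact multiples of 4096, the usual page/buffer size, which is consistent with buffered data never reaching the file before the process exits. flush() only drains Python's userspace buffer; forcing the kernel to commit the data to disk takes os.fsync(). A standalone sketch of the general technique (illustrative only, not coverage.py code):

```python
import json
import os


def write_json_durably(path, data):
    """Write JSON and force it to stable storage before returning."""
    with open(path, "w") as f:
        json.dump(data, f)
        f.flush()             # drain Python's userspace buffer
        os.fsync(f.fileno())  # ask the kernel to commit the page cache to disk
```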

  4. Anish reporter

    Hi Ned, just to give some context: we have about 10 Python services, plus a few health services running as subprocesses, all with different user privileges. We need to capture coverage for these services and for the subprocesses they spawn during their lifecycle.

    I made the changes below to enable coverage:

    1. Downloaded the coverage module locally and created a symbolic link to it under site-packages on sys.path.

    2. Created a folder /var/log/PyCov/ with open permissions and set data_file = /var/log/PyCov/.coverage in .coveragerc.

    3. Created a sitecustomize.py on sys.path that calls coverage.process_startup(), and exported COVERAGE_PROCESS_START pointing at the .coveragerc path.

    After doing this we restart all the services and carry out various operations on them.

    Each of these processes writes into the data_file path, since I am using the parallel flag. Before generating coverage I stop all the services so that they dump their metadata, and then run the combine command in the data_file directory.
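For reference, in parallel mode each process writes its own data file, named from the data_file base plus the machine name, the process id, and a random number, which matches the file names in the warnings above (e.g. PyCov.dhcp-8-95.13499.861792). A rough sketch of that naming scheme (my approximation for illustration, not coverage.py's actual code):

```python
import os
import random
import socket


def parallel_data_file(base):
    """Approximate the suffix coverage.py appends in parallel mode:
    machine name, process id, and a random number."""
    return "%s.%s.%d.%06d" % (
        base, socket.gethostname(), os.getpid(), random.randint(0, 999999)
    )
```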

  5. Ned Batchelder repo owner

    I don't see anything in your description that would point to a problem. I don't understand why you aren't installing coverage with "pip install coverage": why make a manual symlink to the downloaded package? Are you sure you have parallel=True in your .coveragerc? If not, then multiple processes could be trying to write the same file.
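A .coveragerc matching the setup described in this thread would look roughly like this (a sketch: the data_file path is taken from the thread, and parallel = True is what Ned is asking about):

```ini
[run]
data_file = /var/log/PyCov/.coverage
parallel = True
```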
