Destroy fails under Windows 7 - cannot access file...used by another process

Issue #197 open
RogerHaase created an issue

Upon occasion, when an attempt is made to destroy an item via the global index or global history page, Windows 7 reports that "The process cannot access the file because it is being used by another process."

See attached file destroy.txt.

Comments (11)

  1. Thomas Waldmann repo owner
    WindowsError: [Error 32] The process cannot access the file because it is being used by another process

    Can you find out what other process is using the file when this happens?

  2. RogerHaase reporter
    • changed status to open

    With a randomized name like that, I am sure there is no other process using the file. :-)

    The error 32... message is usually misleading; perhaps the file was left open?

    Today I uploaded, created, and destroyed 300 files and cannot reproduce the problem. I have not seen the problem since shortly after reporting it. A possibility is that the problem went away after deleting the wiki data and index directories and building new ones.

    Will close this issue if it cannot be reproduced in the next month or so.

  3. RogerHaase reporter
    • changed status to open

    Problem has returned. Tried several things: rebuilding the index, starting with an empty wiki, reloading a wiki from a backup, rebooting the PC. But creating a new item and then destroying it consistently fails. Delete works OK.

    Suspect this is a Windows-only problem caused by an open file, but... cannot explain why the problem could not be reproduced for months and why today the problem is consistent - 10 failures out of 10 tries across 2 repos.

  4. RogerHaase reporter

    Thanks for the pointer to Process Explorer. I downloaded it years ago and forgot I had it.

    Lots there that I do not understand, but it looks like storage/stores/ line 76 passes back an open file that does not get closed explicitly, thereby causing the failure on Windows. Process Explorer shows that multiple clicks on the "home" link will open the same wiki/content/data/content file 1, 2, 3, 4, 5, 6... times. Eventually, many instances of the open file will get closed.

    Browsing through several items will cause 10 or more data items to be opened at once, eventually some of these are somehow closed as more are opened.

    The apparent randomness of the problem comes from my recent use of the Destroy link on the itemview bar, vs. using the Global Index Destroy link earlier. Destroying via the itemview bar succeeds if the server is restarted and Destroy is the first click.
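    The leak pattern described above can be reproduced in a small sketch (get_data is a hypothetical stand-in for the storage code, not the actual MoinMoin function):

    ```python
    import os
    import tempfile

    # hypothetical stand-in for a store method that hands back an open
    # file object and relies on the caller to close it
    def get_data(path):
        return open(path, 'rb')

    fd, path = tempfile.mkstemp()
    os.close(fd)

    # repeated requests (like repeated clicks on "home") each open a
    # new handle on the same data file
    leaked = [get_data(path) for _ in range(5)]
    assert all(not f.closed for f in leaked)  # five handles open at once

    # CPython only closes them when the file objects are garbage
    # collected, which would explain why Process Explorer eventually
    # shows the handles disappearing; on Windows, any still-open handle
    # makes os.remove(path) fail with error 32
    for f in leaked:
        f.close()
    os.remove(path)
    ```
    
    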

  5. Thomas Waldmann repo owner

    well, the stores and backends return an open file for the data part of an item revision (the meta part is returned as a dict). this was a design decision to avoid loading big data into memory and to support streaming.

    a while ago, i refactored some parts of the code to use "with" and context managers (esp. in ...), to make sure files are closed no matter what. maybe we need some more of this stuff on some layer above storage.backend also.
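    A sketch of what such a layer could look like, assuming a backend whose retrieve() returns an open file object (DummyBackend, retrieve and destroy are illustrative names, not the real MoinMoin API):

    ```python
    import os

    class DummyBackend:
        """Toy backend that, like the real stores, returns an open file
        for the data part and leaves closing to the caller."""
        def __init__(self, path):
            self.path = path

        def retrieve(self, name):
            # caller is responsible for closing this handle
            return open(self.path, 'rb')

    def destroy(backend, name):
        # use the file's own context manager so the handle is closed
        # before removal; an open handle would make os.remove() fail on
        # Windows with "[Error 32] ... being used by another process"
        with backend.retrieve(name) as f:
            data = f.read()
        os.remove(backend.path)
        return data
    ```

    The point of the `with` block is that the handle is released even if reading raises, so the subsequent remove cannot hit a sharing violation from our own process.
    
    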
