Improve the time complexity of releasing a large number of R objects

Issue #215 new
Laurent Gautier created an issue

rpy2 is using the so-called R "precious list" to preserve objects from garbage collection on the R side.

While preserving an R object is done in constant time (it is appended to the "precious list"), releasing it is O(n) (n being the total number of R objects preserved), as the object must first be found in the list before it can be removed. This can become noticeable when a Python container holding a large number of R objects is freed (see #177).
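The asymmetry can be sketched with a minimal Python model of such a precious list. This is illustrative only, not rpy2's actual implementation (rpy2 delegates to R's C API, `R_PreserveObject`/`R_ReleaseObject`), but the cost structure is the same: preserving appends in O(1), releasing scans the list.

```python
class PreciousList:
    """Toy model of R's 'precious list' protection scheme."""

    def __init__(self):
        self._items = []

    def preserve(self, obj):
        # O(1): append the object to the end of the list.
        self._items.append(obj)

    def release(self, obj):
        # O(n): the list must be scanned to locate the entry
        # before it can be removed.
        self._items.remove(obj)


plist = PreciousList()
objs = [object() for _ in range(5)]
for o in objs:
    plist.preserve(o)
for o in objs:
    plist.release(o)
assert len(plist._items) == 0
```

Releasing all n objects this way costs O(n^2) in total, which is why freeing a large Python container of R objects is slow.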

Keeping the R objects to be protected in an R container with more efficient lookup would be one way to speed up the release.
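A sketch of what a more efficient container could look like, again as a Python model with illustrative names (not rpy2 code): keying the protected objects by identity with a protection count makes release O(1) on average, which is roughly what a hashed container is expected to buy.

```python
class PreciousDict:
    """Toy model of a keyed protection store with protection counts."""

    def __init__(self):
        # Maps id(obj) -> (obj, protection count). Keeping a reference
        # to obj itself prevents it from being garbage-collected.
        self._counts = {}

    def preserve(self, obj):
        # O(1) average: hash lookup plus an increment.
        key = id(obj)
        o, count = self._counts.get(key, (obj, 0))
        self._counts[key] = (o, count + 1)

    def release(self, obj):
        # O(1) average: hash lookup; the entry is dropped once the
        # protection count reaches zero.
        key = id(obj)
        o, count = self._counts[key]
        if count == 1:
            del self._counts[key]
        else:
            self._counts[key] = (o, count - 1)


pdict = PreciousDict()
x = object()
pdict.preserve(x)
pdict.preserve(x)   # protected twice
pdict.release(x)
assert id(x) in pdict._counts   # still protected once
pdict.release(x)
assert id(x) not in pdict._counts
```

The protection count also handles the case where the same R object is preserved several times, which a flat list represents as duplicate entries.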

Comments (2)

  1. Laurent Gautier reporter

    I have implemented preservation using an R environment (with hashing), and the outcome seems somewhat worse than with the "precious list".


    The environment containing all R objects to be preserved is created with hash=TRUE, but the time to delete entries appears to depend strongly on the number of entries in the environment. So much for the expected O(1)...
