Strange optimization behavior on node/factor removal in a tracking and filtering application

Issue #370 invalid
Emil Jose created an issue

To avoid the problem of an ever-growing factor graph, the oldest factors and nodes were removed from the graph and the remaining graph was passed to the optimizer. Optimization works fine and outputs a filtered result until the node whose ID is half the latest ID in the graph is removed. Once that node is removed, the covariance matrix of the latest node jumps to a large negative value (-10^12), every node added incrementally after that holds the same covariance value, and no optimization happens.

The structure of the graph is preserved throughout the process and was verified by printing it to a dot file. Removal starts with the oldest node (ID 0) and proceeds incrementally; this works as expected and was also verified via the dot file. Is there any workaround for this problem, or is there another standard procedure to address this issue?
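For reference, GTSAM's usual answer to the ever-growing-graph problem is marginalization rather than deletion: the fixed-lag smoothers in `gtsam_unstable` fold states older than a time window into a prior instead of discarding their information. Below is a minimal sketch using `BatchFixedLagSmoother`; the lag, keys, noise values, and odometry are illustrative assumptions, not taken from this issue.

```cpp
#include <gtsam/geometry/Pose2.h>
#include <gtsam/inference/Symbol.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/slam/BetweenFactor.h>
#include <gtsam/slam/PriorFactor.h>
#include <gtsam_unstable/nonlinear/BatchFixedLagSmoother.h>

using namespace gtsam;

int main() {
  // Keep roughly 2 seconds of states; anything older is marginalized out,
  // so its information is folded into a prior instead of being discarded.
  BatchFixedLagSmoother smoother(2.0);

  auto priorNoise = noiseModel::Diagonal::Sigmas(Vector3(0.1, 0.1, 0.05));
  auto odomNoise = noiseModel::Diagonal::Sigmas(Vector3(0.2, 0.2, 0.1));

  // Anchor the first pose with a prior so the system is fully constrained.
  NonlinearFactorGraph factors;
  Values values;
  FixedLagSmoother::KeyTimestampMap timestamps;
  factors.emplace_shared<PriorFactor<Pose2>>(Symbol('x', 0), Pose2(0, 0, 0),
                                             priorNoise);
  values.insert(Symbol('x', 0), Pose2(0, 0, 0));
  timestamps[Symbol('x', 0)] = 0.0;
  smoother.update(factors, values, timestamps);

  // One new pose per time step; the smoother handles removal internally.
  for (size_t i = 1; i <= 10; ++i) {
    NonlinearFactorGraph newFactors;
    Values newValues;
    FixedLagSmoother::KeyTimestampMap newTimestamps;
    newFactors.emplace_shared<BetweenFactor<Pose2>>(
        Symbol('x', i - 1), Symbol('x', i), Pose2(1, 0, 0), odomNoise);
    newValues.insert(Symbol('x', i), Pose2(double(i), 0, 0));
    newTimestamps[Symbol('x', i)] = double(i);
    smoother.update(newFactors, newValues, newTimestamps);

    Pose2 latest = smoother.calculateEstimate<Pose2>(Symbol('x', i));
    latest.print("x" + std::to_string(i) + ": ");
  }
  return 0;
}
```

The key difference from manually deleting factors is that `update()` marginalizes variables that fall outside the lag window, so the remaining graph keeps a well-conditioned prior instead of becoming underconstrained.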

Comments (2)

  1. Frank Dellaert

    You are describing a problem, but I think this needs a bit more investigation as to what the underlying cause is. The standard procedure would be to try and identify a bug/issue with the code and write a concise (failing) unit test that reproduces the problem, if possible.

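As a concrete illustration of the concise test Frank suggests above, a minimal sketch using GTSAM's CppUnitLite harness might look like the following. The graph, keys, and the re-anchoring prior are illustrative assumptions; in particular, the idea that deleting the oldest prior leaves the chain underconstrained is one plausible cause of the broken covariances, not something confirmed in this thread.

```cpp
#include <CppUnitLite/TestHarness.h>
#include <gtsam/geometry/Pose2.h>
#include <gtsam/nonlinear/LevenbergMarquardtOptimizer.h>
#include <gtsam/nonlinear/Marginals.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/slam/BetweenFactor.h>
#include <gtsam/slam/PriorFactor.h>

using namespace gtsam;

TEST(FactorRemoval, covarianceStaysPositiveDefinite) {
  auto noise = noiseModel::Diagonal::Sigmas(Vector3(0.1, 0.1, 0.05));

  // Chain of poses 0..4 linked by odometry, anchored by a prior on pose 0.
  NonlinearFactorGraph graph;
  Values initial;
  graph.emplace_shared<PriorFactor<Pose2>>(0, Pose2(0, 0, 0), noise);
  initial.insert(0, Pose2(0, 0, 0));
  for (size_t i = 1; i < 5; ++i) {
    graph.emplace_shared<BetweenFactor<Pose2>>(i - 1, i, Pose2(1, 0, 0),
                                               noise);
    initial.insert(i, Pose2(double(i), 0, 0));
  }

  // Remove the oldest node and its factors, as described in the issue.
  graph.remove(0);  // the prior on pose 0 (slot becomes nullptr)
  graph.remove(1);  // the odometry factor between poses 0 and 1
  initial.erase(0);

  // Without a replacement prior the remaining chain is unconstrained and
  // its covariances are meaningless; re-anchoring keeps the test passing.
  // Comment out the next line to turn this into a failing reproduction.
  graph.emplace_shared<PriorFactor<Pose2>>(1, Pose2(1, 0, 0), noise);

  Values result = LevenbergMarquardtOptimizer(graph, initial).optimize();
  Matrix cov = Marginals(graph, result).marginalCovariance(4);
  CHECK(cov(0, 0) > 0.0);  // variances of the latest node must stay positive
}

int main() {
  TestResult tr;
  return TestRegistry::runAllTests(tr);
}
```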