Add numerical test of cost function gradient

Issue #68 resolved
andrew_peterson repo owner created an issue

It would be fairly easy to have a mistake in the cost function gradient calculation, as it's rather complicated.

However, it would also be straightforward to have a function/method that numerically checks the cost function gradient. E.g., it only needs to perturb each parameter, one by one, and the resulting finite-difference estimate should closely approximate the true gradient.

I think we should be able to make a generic implementation of this so it can work on any model / descriptor combo.
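A minimal sketch of what such a generic checker might look like, assuming the loss can be evaluated as a plain function of a flat parameter vector (the function and argument names here are placeholders, not the actual API):

```python
import numpy as np

def numerical_gradient(lossfunction, parameters, d=1e-6):
    """Approximate the gradient of `lossfunction` at `parameters`
    by central differences, perturbing one parameter at a time."""
    parameters = np.asarray(parameters, dtype=float)
    gradient = np.empty_like(parameters)
    for i in range(parameters.size):
        shifted = parameters.copy()
        shifted[i] += d
        plus = lossfunction(shifted)   # loss at parameter + d
        shifted[i] -= 2. * d
        minus = lossfunction(shifted)  # loss at parameter - d
        gradient[i] = (plus - minus) / (2. * d)
    return gradient
```

Since it only calls the loss function itself, a checker like this is agnostic to which model/descriptor pair produced the loss; comparing its output against the analytic gradient (e.g., with np.allclose) would flag a bug.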

Comments (2)

  1. andrew_peterson reporter

    Actually, I think we are likely to find a bug when we do this. I noticed that the "test_gaussian_neural.py" script is not converging, even for a simple system and loose convergence parameters (like 0.02 for the energy RMSE). It appears that the scipy optimizer gives up. On a hunch, I tried not feeding fprime to the scipy optimizer, and it converged, albeit very slowly since it has to compute numerical derivatives at each step. This leads me to believe that it is getting a bad value of fprime.
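    One quick way to confirm that hypothesis would be scipy's built-in gradient checker, which compares an analytic gradient against a forward-difference approximation at a single point. A minimal sketch, with a toy quadratic standing in for the real loss/fprime pair:

    ```python
    import numpy as np
    from scipy.optimize import check_grad

    # Toy stand-ins for the real lossfunction/fprime pair fed to
    # the scipy optimizer; swap in the actual callables.
    def lossfunction(p):
        return np.sum(p ** 2)

    def lossprime(p):
        return 2. * p

    x0 = np.array([1.0, -2.0, 0.5])

    # check_grad returns the 2-norm of the difference between the
    # analytic gradient and a finite-difference approximation; a
    # large value here would confirm fprime is bad.
    print(check_grad(lossfunction, lossprime, x0))
    ```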

  2. Alireza Khorshidi

    Just added numerical loss-function gradients in commit 2bf2998, for now in pure Python mode only.

    On a first try, it seems that get_dEnergy_dParameters gives the same results as get_numerical_dEnergy_dParameters, but get_dForces_dParameters gives different results from get_numerical_dForces_dParameters! Will look more carefully tomorrow.
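    To help localize the disagreement, something like the helper below could report where the worst mismatch occurs (the function name and tolerances are just illustrative; it only assumes the two methods return same-shaped arrays):

    ```python
    import numpy as np

    def report_worst_mismatch(analytic, numerical, rtol=1e-5, atol=1e-8):
        """Compare, e.g., the outputs of get_dForces_dParameters and
        get_numerical_dForces_dParameters, and point to the largest
        absolute disagreement."""
        analytic = np.asarray(analytic, dtype=float)
        numerical = np.asarray(numerical, dtype=float)
        diff = np.abs(analytic - numerical)
        # Index of the worst-offending force component/parameter pair.
        index = np.unravel_index(np.argmax(diff), diff.shape)
        print('max |analytic - numerical| = %g at index %s'
              % (diff.max(), index))
        print('allclose:', np.allclose(analytic, numerical,
                                       rtol=rtol, atol=atol))
    ```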
