Initial weight for retraining?
What is the initial weight for retraining? Is it the weight stored in the .amp file?
Comments (7)
-
reporter: You mean that if I load an existing .amp file to retrain on a set of images, the starting point will be the weights recorded in that .amp file?
-
Yep. Look at this line: https://bitbucket.org/andrewpeterson/amp/src/3c78aff4591ba125e8da22e81919a783a6694068/amp/model/neuralnetwork.py#lines-205 When you instantiate the NeuralNetwork class from .load(), the weights are already present and they are not going to be reinitialized. -
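To illustrate why the loaded weights matter, here is a minimal sketch (not Amp's actual code, just plain gradient descent on a toy quadratic loss): "saving" the weight after a first run and "loading" it for retraining resumes the descent near the optimum instead of restarting from scratch.

```python
# Toy illustration of warm-start retraining (hypothetical, not Amp code).

def train(w, lr=0.1, steps=50):
    """Minimize loss(w) = (w - 3)**2 by gradient descent, starting from w."""
    for _ in range(steps):
        grad = 2.0 * (w - 3.0)   # d/dw of (w - 3)**2
        w -= lr * grad
    return w

w_initial = -10.0            # cold start (random initialization)
w_saved = train(w_initial)   # first training run; "save" the final weight
w_retrained = train(w_saved) # retraining resumes from the saved weight

# The retrained weight is at least as close to the optimum as the saved one.
assert abs(w_retrained - 3.0) <= abs(w_saved - 3.0)
```

This is the same idea as instantiating NeuralNetwork via .load(): the saved weights become the starting point, so the optimizer continues from where the previous training stopped.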
reporter: Thanks. Another question: will the fingerprints of the images be recalculated during retraining?
-
No, as long as you keep your .ampdb directories in the same location where you performed the previous training. In that case, Amp will try to read the neighbor lists, the fingerprints, and their derivatives from those .ampdb databases instead of computing them again. Just be careful not to mix those databases! -
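The compute-once idea behind those .ampdb directories can be sketched with a small on-disk cache; everything below (the fingerprint function, the key scheme, the store path) is hypothetical and only mimics the pattern, not Amp's actual database format.

```python
import hashlib
import os
import shelve
import tempfile

def fingerprint(image):
    """Stand-in for an expensive descriptor calculation (hypothetical)."""
    fingerprint.calls += 1
    return [round(x * 0.5, 6) for x in image]
fingerprint.calls = 0

def get_fingerprints(images, db_path):
    """Read each fingerprint from the on-disk store if present;
    otherwise compute it once and save it for later runs."""
    out = []
    with shelve.open(db_path) as db:
        for image in images:
            key = hashlib.sha1(repr(image).encode()).hexdigest()
            if key not in db:
                db[key] = fingerprint(image)  # computed only on first sight
            out.append(db[key])
    return out

images = [(1.0, 2.0), (3.0, 4.0)]
db = os.path.join(tempfile.mkdtemp(), "fingerprints-db")
first = get_fingerprints(images, db)   # computes and stores both fingerprints
second = get_fingerprints(images, db)  # reuses the store, no recomputation
assert first == second and fingerprint.calls == 2
```

Mixing databases from different trainings would be like pointing this cache at a store keyed by different images: stale entries would be silently reused, which is exactly the hazard the reply warns about.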
reporter: I see, thanks very much.
-
repo owner - changed status to resolved
Not a bug...
Depending on what you are referring to:
1) If you are using the retries keyword argument, then randomize() is called with weights=True and scalings=True, which makes your optimization restart at another point in the loss-function space.
2) If you would like to retrain, then there is this information on https://amp.readthedocs.io/en/latest/useamp.html?highlight=amp#re-training. There you open an .amp calculator file and use those weights as your starting point in the loss function to minimize it further.
Does that make sense?
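The re-training recipe from the linked docs boils down to loading the saved calculator and training again. A sketch of that pattern, wrapped in a function so nothing runs unless you call it (the file names are hypothetical, and the amp package must be installed):

```python
def retrain(amp_file="calc.amp", images="retrain.traj"):
    """Sketch of the re-training pattern from the Amp docs:
    load the saved calculator, whose stored weights become the
    starting point, then train it further on new images."""
    from amp import Amp              # imported lazily; amp may not be installed
    calc = Amp.load(amp_file)        # weights from the .amp file are preserved
    calc.train(images=images)        # optimization resumes from those weights
    return calc
```

This is option 2 above; option 1 (the retries keyword with randomize()) instead throws the current weights away and restarts from a fresh random point.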