SuperNN 1.0.0
SuperNN::ActFunc | Activation function dispatcher
SuperNN::Connection | Synaptic connection between two neurons
SuperNN::Elliot | Elliot sigmoid-like function
SuperNN::ElliotSymmetric | Elliot sigmoid-like function (symmetric)
std::exception | STL class
  SuperNN::Exception | Exception whose kind can be identified via the type() method
SuperNN::Gaussian | Gaussian function
SuperNN::GaussianSymmetric | Symmetric Gaussian function
SuperNN::NBN::Hist |
SuperNN::Linear | Linear function
SuperNN::Network | Artificial neural network structure supporting arbitrary feedforward topologies, such as multilayer perceptrons and fully connected cascade networks
SuperNN::Neuron | Neuron that can hold connections to neurons in subsequent layers
SuperNN::Runner | Auxiliary class that eases the use of an already trained neural network
SuperNN::Sigmoid | Sigmoid function (activation functions are not implemented in an OO way, for performance reasons)
SuperNN::SigmoidSymmetric | Symmetric sigmoid function
SuperNN::Sign | Sign function (net >= 0 ? 1 : -1)
SuperNN::SInfo | Minimum/maximum scaling information
SuperNN::TrainingAlgorithm | Abstract class that computes the error derivatives and accumulates the error, used by the derived backpropagation training algorithms
  SuperNN::ImplBackprop | Base class for the standard backpropagation algorithms
    SuperNN::Batch | Batch backpropagation
    SuperNN::Incremental | Incremental backpropagation
  SuperNN::IRprop | Improved resilient backpropagation algorithm
    SuperNN::IRpropL1 | Modified improved resilient backpropagation algorithm
  SuperNN::NBN | Neuron by Neuron algorithm
std::vector< T > | STL class
  SuperNN::Bounds | Data scaling information for all input and output neurons
  SuperNN::Data | Data used in training, validation, and testing
  SuperNN::Layer | Array of neurons