A DNN framework implemented with MAGMA {#mainpage}


The Docs for MagmaDNN can be found at https://magmadnn.bitbucket.io/docs/index.html.

How to Install

First, clone this repository with hg (Mercurial).


You will also need CMake >= 2.8.

How to Run


If you are on Bridges, please execute:

interact -gpu

to get on a GPU node, and then

bash ./scripts/setup.sh --bridges

This script installs the modified MAGMA into the folder 'third_party/magma'.

Alternatively, you can build it with CMake:

mkdir debug && cd debug
cmake -DCMAKE_BUILD_TYPE=Debug ..
make

Run Benchmark: MNIST

We have an implementation of fully-connected layers and tested it on the MNIST benchmark, a dataset of hand-written digits.

The following instructions show how to run it.

On Bridges:
bash ./scripts/run_MNIST_DNN.sh -s -b   

Please note that you should be on a GPU node to run the example.

Besides using the script to run MNIST, you can run the following commands in the ./debug folder if you compiled with CMake.

make MNIST_DNN   
./MNIST_DNN ../data/mnist/train-images-idx3-ubyte ../data/mnist/train-labels-idx1-ubyte CNN   

You may change CNN to MLP to select the other type of neural network.
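The train-images-idx3-ubyte and train-labels-idx1-ubyte files used above are in the standard IDX format: a big-endian 32-bit magic number followed by one big-endian 32-bit size per dimension. If you need to inspect the data files, a minimal Python sketch of a header parser (the helper name read_idx_header is ours, not part of MAGMA-DNN):

```python
import struct

def read_idx_header(data):
    """Parse the header of an IDX file (MNIST format).

    Layout: 4-byte big-endian magic number, then one 4-byte
    big-endian size per dimension. The low byte of the magic
    number encodes the number of dimensions.
    """
    magic = struct.unpack(">I", data[:4])[0]
    ndims = magic & 0xFF
    dims = struct.unpack(">" + "I" * ndims, data[4:4 + 4 * ndims])
    return magic, dims

# Synthetic header matching train-images-idx3-ubyte:
# magic 0x00000803 (unsigned byte, 3 dims), 60000 images of 28 x 28.
header = struct.pack(">IIII", 0x00000803, 60000, 28, 28)
magic, dims = read_idx_header(header)
print(magic, dims)  # 2051 (60000, 28, 28)
```

To read a real file, pass the first bytes of its contents to the helper; the pixel or label bytes follow immediately after the header.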

Expected Result

If it runs successfully, you will see some compilation output from CMake followed by information about the accuracy of the neural network. The last few lines should look like:

number of labels: 60000   
The type of the neural network is CNN   
% MAGMA 2.2.0 svn compiled for CUDA capability >= 6.0, 32-bit magma_int_t, 64-bit pointer.   
% CUDA runtime 8000, driver 9000. OpenMP threads 8. MKL 2017.0.3, MKL threads 4.   
% device 0: GeForce GTX 1050 Ti, 1468.0 MHz clock, 4035.7 MiB memory, capability 6.1   
% Thu Nov  2 20:23:44 2017   
Total number of layers: 9   
Turn:   99, Loss: 5.0805, Accuracy: 0.1300   
Turn:  199, Loss: 4.7879, Accuracy: 0.2600   
Turn:  299, Loss: 4.6547, Accuracy: 0.2200   
Turn:  399, Loss: 4.6466, Accuracy: 0.2400   
Turn:  499, Loss: 4.5296, Accuracy: 0.2700   
Turn:  599, Loss: 4.4023, Accuracy: 0.4500   
Turn:  699, Loss: 4.3853, Accuracy: 0.4100   
Turn:  799, Loss: 4.3336, Accuracy: 0.3800   
Turn:  899, Loss: 4.2905, Accuracy: 0.5100   
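If you want to track the loss and accuracy over training programmatically, the per-turn lines above are easy to parse. A small sketch (the regular expression simply mirrors the sample output format above):

```python
import re

# Two lines copied from the sample output above.
log = """Turn:   99, Loss: 5.0805, Accuracy: 0.1300
Turn:  899, Loss: 4.2905, Accuracy: 0.5100"""

# Capture turn number, loss, and accuracy from each line.
pattern = re.compile(r"Turn:\s*(\d+), Loss: ([\d.]+), Accuracy: ([\d.]+)")
records = [(int(t), float(l), float(a)) for t, l, a in pattern.findall(log)]
print(records[-1])  # (899, 4.2905, 0.51)
```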

Run Unit Tests

In MAGMA-DNN, we use Google Test to build unit tests. The test code is stored in src, following the pattern xxxxxxxx_unittest.cpp.

How to Run

First, build MAGMA-DNN with CMake, which will install Google Test from Git for you.
Assuming that you have built MAGMA-DNN in debug, run the following commands:

cd debug
make ProjUnitTest   



Generate Documentation

To generate the documentation, run:

doxygen Doxygen

The output files are placed in docs/doxygen.

If you are not familiar with how to open the generated Doxygen files, you can serve them with Python.

cd docs/doxygen/html   
python -m SimpleHTTPServer   

(With Python 3, use python3 -m http.server instead.)

By default, it will serve on http://localhost:8000, and you can open this URL in your browser to read the documentation.
If you are using a remote machine, replace localhost with your machine's IP address (you can check it with ifconfig).

Done and Todo

[x] Tensor with MAGMA

[x] Fully Connected Layers

[x] Convolution and Pooling Layers

[ ] RNN Layers (LSTM and GRU)

[ ] The activation of pooling layers.
(The activation-function parameter for pooling layers is currently a dummy: MAX pooling is always used regardless of the parameter.)

[ ] Replace the CNN forward and backward with DNNBlas operations

[ ] Replace the pooling forward and backward with DNNBlas operations

[ ] Add a solver to help build neural networks

[ ] Add adaptive learning methods for SGD, such as Adagrad

[ ] Improve the performance (it may never be finished)
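For the Adagrad item above: Adagrad adapts the learning rate per parameter by dividing each step by the square root of that parameter's accumulated squared gradients. A minimal plain-Python sketch of one update step (illustrative only, not MAGMA-DNN API):

```python
import math

def adagrad_step(params, grads, accum, lr=0.01, eps=1e-8):
    """One Adagrad update: accumulate squared gradients per parameter
    and scale the step by 1/sqrt(accumulator), so parameters with a
    history of large gradients get smaller effective learning rates."""
    for i, g in enumerate(grads):
        accum[i] += g * g
        params[i] -= lr * g / (math.sqrt(accum[i]) + eps)
    return params, accum

params = [1.0, -2.0]
accum = [0.0, 0.0]  # per-parameter sum of squared gradients
params, accum = adagrad_step(params, [0.5, -0.5], accum)
print(params)
```

On the first step the effective step is roughly lr * sign(g), since the accumulator equals the squared gradient; over time the accumulator only grows, which is why variants such as RMSprop and Adam use a decaying average instead.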