Proximal Deep Structured Models

This repository contains the code for training and testing proximal deep structured models.

The proximal deep structured model is a structured deep model with continuous-valued output. The algorithm is described in our NIPS 2016 paper. This tool performs inference over a family of CRF models as an RNN-structured feed-forward network, so learning can be conducted through back-propagation.
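To illustrate the idea (this is a minimal NumPy sketch, not the repo's exact architecture), the network unrolls a fixed number of proximal-gradient updates; each update becomes one layer. Here we use a simple L1-regularized denoising objective with a soft-threshold proximal operator; in the learned model the per-layer step sizes, weights, and proximal operators would be trainable parameters. All function names and constants below are our own assumptions.

```python
import numpy as np

def soft_threshold(x, lam):
    # Proximal operator of the L1 norm: prox_{lam*||.||_1}(x)
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def unrolled_proximal_inference(y, num_layers=5, step=0.5, lam=0.1):
    """Unrolled proximal-gradient 'network' for a toy denoising objective.

    Each layer performs one proximal-gradient update on
    0.5*||x - y||^2 + lam*||x||_1. In a learned model, `step`, `lam`,
    and the linear operators would be per-layer parameters trained
    with back-propagation.
    """
    x = np.zeros_like(y)
    for _ in range(num_layers):
        grad = x - y                              # gradient of the data term
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

With more layers the output approaches the fixed point `sign(y) * max(|y| - lam, 0)`; a finite unrolling gives an approximate, fast, feed-forward inference pass.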

License

The code is licensed under the GPL 3.0 license. MXNet is licensed under the Apache-2.0 license. If you find this code useful in your research, please consider citing:

@inproceedings{wang2016proximal,
  title={Proximal Deep Structured Models},
  author={Wang, Shenlong and Fidler, Sanja and Urtasun, Raquel},
  booktitle={Advances in Neural Information Processing Systems},
  pages={865--873},
  year={2016}
}

Code directory

Currently, the depth refinement code (Sec. 4.2 of the paper) is included.

  • demo_train.py Training code
  • demo_val.py Validation code
  • demo_1mp.py Timing benchmark
  • training_curves.py Training curve parser
  • pdnet.py Computation graph definition
  • solver.py Training solver
  • data_iter.py Data iterator

Dependencies

  • MXNet
  • CUDA
  • PIL, NumPy, SciPy, Matplotlib, etc. We recommend installing Anaconda.

Running

Command

  • Download the data from here: data
  • Uncompress the data, place the ./train folder under the root folder of the source code, and rename or soft-link it to data_denoise
  • Run demo_train.py for training and demo_val.py for testing. Note that you should modify the params path to point to your learned parameters.
  • The evaluation metrics are PSNR and RMSE.
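For reference, both metrics can be computed as below. This is a minimal NumPy sketch with our own function names; the repo's evaluation code may differ, e.g. in the assumed dynamic range.

```python
import numpy as np

def rmse(pred, target):
    # Root-mean-squared error between prediction and ground truth
    return float(np.sqrt(np.mean((pred - target) ** 2)))

def psnr(pred, target, max_val=255.0):
    # Peak signal-to-noise ratio in dB; max_val is the dynamic range
    # of the images (255 for 8-bit, 1.0 for normalized inputs)
    return float(20.0 * np.log10(max_val / rmse(pred, target)))
```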

TODOs

  • Add other experiments
  • Add a more user-friendly automatic computation graph generator
  • Add a TensorFlow-based implementation
  • Add an ADMM/half-quadratic splitting net