Proximal Deep Structured Models

This repository contains the code for training and testing proximal deep structured models.

The proximal deep structured model is a powerful structured deep model with continuous-valued output variables. The algorithm is described in our NIPS 2016 paper. This tool conducts inference over a family of CRF models as an RNN-structured feed-forward network, and learning is carried out through back-propagation.
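As a rough illustration of the idea above (a NumPy sketch, not the repo's MXNet implementation), continuous CRF inference can be unrolled as repeated proximal-gradient steps, each step acting as one layer of a feed-forward network; the function and parameter names below are hypothetical, and in the actual model the per-layer parameters would be learned by back-propagation:

```python
import numpy as np

def soft_threshold(v, lam):
    """Proximal operator of the l1 norm: prox_{lam*||.||_1}(v)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def unrolled_proximal_inference(y, num_steps=10, step_size=0.5, lam=0.1):
    """Unroll proximal gradient descent on 0.5*||x - y||^2 + lam*||x||_1.

    Each iteration is one 'layer' of the resulting feed-forward network;
    step_size and lam play the role of learnable per-layer parameters.
    """
    x = np.zeros_like(y)
    for _ in range(num_steps):
        grad = x - y                                          # gradient of the quadratic data term
        x = soft_threshold(x - step_size * grad, step_size * lam)  # proximal (shrinkage) layer
    return x

noisy = np.array([1.0, -0.05, 0.5, 0.02])
print(unrolled_proximal_inference(noisy, num_steps=50))
```

With enough unrolled steps this converges to the lasso solution `soft_threshold(y, lam)`, so the fixed point of the network matches the MAP estimate of the underlying energy.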


The code is released under the GPL 3.0 license; MXNET is licensed under the Apache-2.0 license. If you find it useful in your research, please consider citing:

@inproceedings{wang2016proximal,
  title={Proximal Deep Structured Models},
  author={Wang, Shenlong and Fidler, Sanja and Urtasun, Raquel},
  booktitle={Advances in Neural Information Processing Systems},
  year={2016}
}

Code directory

Currently, the depth refinement code (Sec. 4.2) is included.

  • Training code
  • Validation code
  • Timing benchmark
  • Training curve parser
  • Computation graph definition
  • Training solver
  • Data iterator


Dependencies

  • CUDA
  • PIL, NumPy, SciPy, Matplotlib, etc. We recommend installing Anaconda.
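For reference, the Python dependencies above can typically be installed as follows (package names are assumed, since the repo does not pin versions; CUDA and MXNet's GPU build must be set up separately):

```shell
# Assumed pip package names for the dependencies listed above; 'pillow' is
# the maintained fork of PIL. This installs the CPU build of MXNet only.
pip install numpy scipy pillow matplotlib
pip install mxnet
```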



Usage

  • Download the data from here: data
  • Uncompress the data, put the ./train folder under the root folder of the source code, and rename or soft-link it to data_denoise.
  • Run the provided scripts for training and testing; note that you should modify the parameter path to point to your learned parameters.
  • The evaluation metrics are PSNR and RMSE.
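For reference, the two evaluation metrics can be computed as in the following NumPy sketch (these helpers are illustrative and not part of this repo):

```python
import numpy as np

def rmse(pred, target):
    """Root-mean-square error between two arrays of the same shape."""
    return float(np.sqrt(np.mean((pred - target) ** 2)))

def psnr(pred, target, max_val=1.0):
    """Peak signal-to-noise ratio in dB.

    max_val is the peak of the signal range: 1.0 for arrays normalized
    to [0, 1], 255.0 for 8-bit images.
    """
    err = rmse(pred, target)
    if err == 0.0:
        return float("inf")  # identical arrays: PSNR is unbounded
    return float(20.0 * np.log10(max_val / err))

target = np.zeros((4, 4))
pred = np.full((4, 4), 0.1)
print(rmse(pred, target))
print(psnr(pred, target))
```

Lower RMSE and higher PSNR both indicate a better reconstruction; the two are directly related through `PSNR = 20 * log10(max_val / RMSE)`.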


TODO

  • Add other experiments
  • Add a more user-friendly automatic computation graph generator
  • Add a TensorFlow-based implementation
  • Add an ADMM/half-quadratic splitting net