# A Lagrangian Propagator for Artificial Neural Networks in Constraint Programming

*NOTE: the content of this repository sums to ~560MB, largely because the code is shipped in a Vagrant virtual machine.*

## Introduction

This repository contains resources for the paper "A Lagrangian Propagator for Artificial Neural Networks in Constraint Programming" [1], which presents a new propagator for global constraints that encode Artificial Neural Networks (ANNs) in Constraint Programming (CP).

The first method for encoding ANNs in CP was proposed in [2]. The method was later generalized into the Empirical Model Learning (EML) approach (see [3] and [4]), based on the key idea of embedding a Machine Learning model into a more complex and comprehensive combinatorial optimization model. This technique makes it possible to approximately represent the behavior of a system that is impervious to conventional modeling efforts.

The idea is related to the so-called surrogate models employed in Derivative Free (or black-box) Optimization. However, such approaches are typically designed to tackle problems without a complex combinatorial structure. Conversely, in EML the emphasis is on (i) integrating the Machine Learning model with a more complex, traditional optimization model, and (ii) exploiting the structure of the Machine Learning model as much as possible to boost the search process.

For example, the original approach from [2] proposed a simple algorithm to enforce Bound Consistency on global constraints that encode single artificial neurons. In [5] we strengthened this approach by targeting two-layer ANNs rather than single neurons, and by using a Lagrangian relaxation to compute bounds on the network *output*.
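To convey the bounding principle (on a generic toy problem, not the paper's actual propagator or ANN encoding): for a maximization problem, relaxing a complicating constraint with a nonnegative Lagrangian multiplier yields a decomposable problem whose value is a valid upper bound for *any* multiplier choice. The knapsack instance below is purely illustrative.

```python
# Illustrative sketch (NOT the paper's propagator): Lagrangian relaxation
# yields a valid upper bound on a maximization problem for ANY nonnegative
# multiplier -- the same principle used to bound the network output.
from itertools import product

# Tiny 0/1 knapsack: maximize sum(c*x) subject to sum(w*x) <= cap.
c = [10, 7, 4, 3]   # profits (made-up data)
w = [5, 4, 3, 2]    # weights
cap = 8

def lagrangian_bound(lam):
    """Relax the capacity constraint with multiplier lam >= 0.
    L(lam) = lam*cap + sum_i max(0, c_i - lam*w_i): the relaxed problem
    decomposes over the items, so the bound is cheap to compute."""
    return lam * cap + sum(max(0.0, ci - lam * wi) for ci, wi in zip(c, w))

def brute_force_opt():
    """Exact optimum by enumeration (feasible only for tiny instances)."""
    best = 0
    for x in product([0, 1], repeat=len(c)):
        if sum(wi * xi for wi, xi in zip(w, x)) <= cap:
            best = max(best, sum(ci * xi for ci, xi in zip(c, x)))
    return best

opt = brute_force_opt()
# Weak duality: every multiplier gives a valid bound; tuning lam
# (e.g. by subgradient steps) tightens it.
for lam in (0.0, 1.0, 2.0, 3.0):
    assert lagrangian_bound(lam) >= opt
```

In the paper the relaxed constraints are the neuron equations rather than a knapsack capacity, but the weak-duality argument that makes the bound valid is the same.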

In [1], i.e. the paper associated with this repository, we extended the approach from [5] by introducing a technique to perform reduced-cost based filtering on the network *input*, and by experimenting with different overhead reduction techniques. The paper also contains a larger experimental evaluation.
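The general idea behind reduced-cost based filtering can be sketched on the same kind of toy problem (again, this is a generic illustration, not the paper's algorithm): once a Lagrangian bound and an incumbent solution value are available, any variable assignment whose adjusted bound cannot beat the incumbent can be pruned from the domain.

```python
# Illustrative sketch (NOT the paper's algorithm): reduced-cost filtering.
# With a Lagrangian bound and an incumbent, assignments whose bound cannot
# improve on the incumbent are pruned from the variable domains.
c = [10, 7, 4, 3]   # profits (made-up data)
w = [5, 4, 3, 2]    # weights
cap = 8
incumbent = 14      # value of the best feasible solution found so far

lam = 2.0           # any multiplier >= 0 gives a valid bound
rc = [ci - lam * wi for ci, wi in zip(c, w)]   # per-item reduced costs
bound = lam * cap + sum(max(0.0, r) for r in rc)

def bound_if_fixed(i, v):
    """Lagrangian bound when variable x_i is forced to value v in {0, 1}.
    The relaxed problem is separable, so fixing one variable just swaps
    its free contribution max(0, rc[i]) for the fixed one rc[i]*v."""
    return bound - max(0.0, rc[i]) + rc[i] * v

# Prune value v from x_i's domain when no solution with x_i = v can be
# strictly better than the incumbent.
pruned = [(i, v) for i in range(len(c)) for v in (0, 1)
          if bound_if_fixed(i, v) <= incumbent]
```

The point of the technique is that, because the relaxation is separable, the per-assignment bounds come almost for free once the main bound is computed; the paper applies the same reasoning to prune values from the network input domains.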

## Content of the Repository

This repository contains:

- The code for the Lagrangian propagator, in the `code-vm` folder (see the related description)
- The Artificial Neural Networks considered in our experimentation, in the `anns` folder (see the related description)
- The instances considered in our experimentation, in the `instances` folder (see the related description)
- All our experimental results, in the `results` folder (see the related description)

## References

[1] Michele Lombardi, Stefano Gualandi: "A Lagrangian Propagator for Artificial Neural Networks in Constraint Programming". Constraints (2015): 1-28

[2] Andrea Bartolini, Michele Lombardi, Michela Milano, Luca Benini: "Neuron Constraints to Model Complex Real-World Problems". CP 2011: 115-129

[3] Andrea Bartolini, Michele Lombardi, Michela Milano, Luca Benini: "Optimization and Controlled Systems: A Case Study on Thermal Aware Workload Dispatching". AAAI 2012

[4] Alessio Bonfietti, Michele Lombardi, Michela Milano: "Embedding Decision Trees and Random Forests in Constraint Programming". CPAIOR 2015: 74-90

[5] Michele Lombardi, Stefano Gualandi: "A New Propagator for Two-Layer Neural Networks in Empirical Model Learning". CP 2013: 448-463

[6] Stefano Gualandi, Federico Malucelli: "Resource Constrained Shortest Paths with a Super Additive Objective Function". CP 2012: 299-315