Commits

Ruben Martinez-Cantin committed 6193494

Synchronizing

Files changed (10)

Compatibility.wiki

+== List of compatibility tests: ==
+
+|=API |= Linux/gcc |= Linux/clang |= MacOS/gcc |= Windows/MinGW |= Windows/VisualStudio |
+| C/C++ | {{check_green.png}}  | {{check_green.png}} | {{check_green.png}} | {{check_green.png}} | {{check_green.png}} |
+| Matlab | {{check_green.png}} | {{check_green.png}} | {{check_green.png}} | {{balloon_yellow.png}} | {{check_green.png}} |
+| Octave | {{check_green.png}} | {{check_green.png}} | {{check_green.png}} | {{balloon_yellow.png}} | {{balloon_yellow.png}} |
+| Python | {{check_green.png}} | {{check_green.png}} | {{check_green.png}} | {{balloon_yellow.png}} | {{balloon_yellow.png}}|
+
+{{check_green.png}} Working
+{{balloon_yellow.png}} Not tested
+{{cross_red.png}} Not working
+
+== Notes on tested systems: ==
+* Linux: Debian 6.0, Ubuntu 11.04, 11.10, 12.04
+* MacOS: 10.6 (Snow Leopard)
+* MS Windows: XP, 7
+
+Compilers:
+* gcc: different versions from 4.1 to 4.4.
+* clang: version 2.7
+* mingw: version 1.0
+* Visual Studio 2012, Visual Studio Express 2010
+
+Interfaces:
+* Matlab: 2010a, 2010b, 2011b.
+* Octave: version 3.2
+* Python: version 2.5, 2.6, 2.7
+
+
+\\
+\\
+\\
+Icons: [[http://www.doublejdesign.co.uk/|Double-J designs]]

Dependencies.wiki

+=== 2 - DEPENDENCIES: ===
+
+
+==== 2.1 - BOOST: ==== 
+
+This code uses the Boost libraries for matrix operations (uBlas) and random
+number generation. They ship with standard Linux distributions or can be
+downloaded from [[http://www.boost.org]]. Since they are pure template
+libraries, they do not require compilation; just make sure the headers are
+on the include path.
+
+The uBlas routines are not very efficient, so this dependency may change in future versions.
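+
+To check that the headers are visible to your compiler, a minimal uBlas program like the following should build without linking any compiled Boost library (a sketch for illustration only):
+{{{
+#!c++
+#include <iostream>
+#include <boost/numeric/ublas/vector.hpp>
+#include <boost/numeric/ublas/io.hpp>
+
+int main()
+{
+  // uBlas is header-only: this compiles with just the Boost headers on the include path.
+  boost::numeric::ublas::vector<double> v(3);
+  for (unsigned i = 0; i < v.size(); ++i)
+    v(i) = 0.5 * i;
+  std::cout << v << std::endl;  // prints [3](0,0.5,1)
+  return 0;
+}
+}}}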
+
+
+====  2.2 - Inner-loop Nonlinear Optimization library: ==== 
+
+This library requires a separate nonlinear optimization library for the inner loop
+(e.g., DIRECT).
+
+===== a) Using NLOPT (default): ===== 
+
+We recommend NLOPT for the inner-loop optimization. The latest
+version can be downloaded from
+
+[[http://ab-initio.mit.edu/wiki/index.php/NLopt]]
+
+NLOPT does not require external libraries and works on many operating systems.
+
+**Linux and Mac OS:**
+If you are going to use shared libraries (required for the Python interface), it is easier to compile NLOPT with the shared flag, which builds position-independent code (-fPIC):
+
+{{{
+> ./configure --enable-shared  
+> make
+> sudo make install
+}}}
+
+**Windows:**
+As compiling it in Windows is tricky, there are precompiled DLLs available for download.
+
+===== b) Using Fortran DIRECT: ===== 
+
+For completeness, this code includes a Fortran 77 implementation of the
+DIRECT-L algorithm by J. Gablonsky:
+
+J. M. Gablonsky and C. T. Kelley, "A locally-biased form of the DIRECT 
+algorithm," J. Global Optimization, vol. 21 (1), p. 27-37 (2001). 
+
+The original code can be downloaded from 
+[[http://www4.ncsu.edu/~ctk/SOFTWARE/DIRECTv204.tar.gz]]
+which includes some parallel processing functions that are not yet 
+supported in bayesian-optimization.
+
+I have only tested this code using gfortran on Windows, Mac OS X
+and Linux, but it should work with other Fortran compilers like f77 or
+f95.
+
+
+==== 2.3 - PYTHON: ==== 
+
+The library has been tested with Python 2.6 and 2.7. The interface also relies on NumPy arrays ([[http://new.scipy.org/download.html]]).
+
+Python development files such as Python.h are needed to compile the interface. If you want to modify the Python interface, you also need Cython:
+
+{{{
+$ cython --cplus bayesopt.pyx
+}}}

Home.wiki

+== BayesOpt ==
+
+This is an efficient C++ implementation of several Bayesian optimization
+algorithms. See the References page for some of the papers.
+
+It combines the use of a stochastic process as a surrogate function
+with an "active learning" criterion to find the optimum of an "arbitrary"
+function using very few iterations. It can also be used for sequential experimental
+design and stochastic bandits by selecting the appropriate criterion.
+
+Since it relies only on the old C++ standard (C++98), it can be built with many compilers on Windows,
+Linux and Mac OS. There are APIs for C/C++, Python and Matlab/Octave.
+
+**Important:** This code is free to use. However, if you are using, or plan to use, the library, //especially if it is for research or academic purposes//, please send me an [[mailto:rmcantin@unizar.es|email]] with your name, institution and a brief description of your interest in this code (one or two lines).
+
+== INDEX: ==
+
+#[[Install]]
+#[[Usage]]
+#[[Compatibility]]
+#[[References]]
+
+----
+If you use BayesOpt in work that leads to a publication, we would appreciate it if you kindly cite it in your manuscript, for example:
+
+* Ruben Martinez-Cantin, **BayesOpt: a toolbox for nonlinear-optimization, experimental design and stochastic bandits, http://bitbucket.org/rmcantin/bayesian-optimization**
+----
+   Copyright (C) 2011-2012 Ruben Martinez-Cantin <rmcantin@unizar.es>
+ 
+   BayesOpt is free software: you can redistribute it and/or modify it 
+   under the terms of the GNU General Public License as published by
+   the Free Software Foundation, either version 3 of the License, or
+   (at your option) any later version.
+
+   BayesOpt is distributed in the hope that it will be useful, but 
+   WITHOUT ANY WARRANTY; without even the implied warranty of
+   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+   GNU General Public License for more details.
+
+   You should have received a copy of the GNU General Public License
+   along with BayesOpt.  If not, see <[[http://www.gnu.org/licenses/|http://www.gnu.org/licenses/]]>.

Install.wiki

+= 1 - Install Library: =
+
+Bayesian-optimization uses standard C/C++ code and can be compiled on
+different platforms using CMake.
+
+**Linux or Mac OS:**
+In Ubuntu/Debian, you can get the dependencies by running:
+{{{
+>> sudo apt-get install libboost-dev python-dev python-numpy cmake g++ cython
+}}}
+In Mac OS you can install MacPorts and run:
+{{{
+>> sudo port install boost python27 py27-numpy gcc46 cmake py27-cython
+}}}
+
+To compile the source code:
+{{{
+>> cmake . 
+>> make
+>> sudo make install
+}}}
+If you use //ccmake// instead of //cmake// you will get a graphical interface to select features such as debug/release mode and whether to build shared libraries. Shared libraries are required to run the Python interface.
+
+If you have Doxygen installed on your computer, you can compile the documentation right after compiling the code by running:
+{{{
+>> make doc
+}}}
+The documentation will appear in the "doc" subdirectory.
+
+**Windows and other systems:**
+The easiest way to compile this library is with the cross-platform, cross-compiler tool [[http://www.cmake.org/|CMake]].
+
+This code uses the Boost libraries for matrix operations (uBlas) and random number generation. They ship with standard Linux distributions or can be downloaded from http://www.boost.org. Since they are pure template libraries, they do not require compilation; just make sure the headers are on the include path. You can also add an entry named BOOST_ROOT in CMake with the corresponding path.
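+
+For example, if Boost is unpacked in a custom location, the path can be passed on the command line (the path below is just a placeholder):
+{{{
+>> cmake -DBOOST_ROOT=/path/to/boost .
+}}}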
+
+
+== Install Python interface ==
+
+Both Python and NumPy are needed if you want the Python interface. The library has been tested with Python 2.6 and 2.7. The interface relies on NumPy arrays. Python development files such as Python.h are needed to compile the interface.
+
+In Ubuntu/Debian, you can get the dependencies by running:
+{{{
+>> sudo apt-get install python-dev python-numpy cython
+}}}
+In Mac OS you can install MacPorts and run:
+{{{
+>> sudo port install python27 py27-numpy py27-cython
+}}}
+
+If you want to **modify** the Python interface, you need to edit the pyx file and run the Cython compiler:
+{{{
+$ cython --cplus bayesopt.pyx
+}}}
+
+**Important**: Python requires bayesopt to be a **shared** library.
+
+In **Windows** you need to download and install [[http://www.python.org/|Python]], [[http://new.scipy.org/download.html|Numpy]] and [[http://cython.org/|Cython]]. Make sure that CMake finds all the dependencies.
+
+
+
+== Install MATLAB/Octave interface ==
+
+# Make sure the library is compiled with the MATLAB_COMPATIBLE option.
+# Compile the interface using the //compile_matlab.m// or //compile_octave.m// script, which can be found in the ///matlab/// directory.
+# If bayesopt or nlopt are compiled as **shared** libraries: 
+At run time, MATLAB also needs access to the libraries. For example, in Linux and Mac OS make sure the //exportlocalpaths.sh// script is executed before calling MATLAB, as in the example below.
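+
+For example, from a bash shell (assuming the script is in the current directory):
+{{{
+$ source exportlocalpaths.sh
+$ matlab
+}}}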

Known_Issues.wiki

+=== 3. KNOWN ISSUES  === 
+
+On some systems, the linker, Python or MATLAB are not able to find the shared libraries. You
+just need to point the LD_LIBRARY_PATH and PYTHONPATH variables to the corresponding folder (by default:
+/usr/local/lib in Linux and Mac OS).
+
+If you use bash, the //exportlocalpaths.sh// shell script sets both variables for you.
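+
+Alternatively, assuming the default install prefix, the variables can be set by hand:
+{{{
+$ export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH
+$ export PYTHONPATH=/usr/local/lib:$PYTHONPATH
+}}}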
+
+MATLAB for Linux ships with outdated gcc libraries (4.3 in R2011b). You might get errors like
+this one:
+{{{
+/usr/lib/x86_64-linux-gnu/gcc/x86_64-linux-gnu/4.5.2/cc1: /usr/local/MATLAB/R2010b/sys/os/glnxa64/libstdc++.so.6: version `GLIBCXX_3.4.14' not found (required by /usr/lib/libppl_c.so.2)
+}}}
+The solution is to change the symbolic links in //matlabroot///sys/os/glnx86 (or glnxa64 for 64-bit installations) for //libgcc_s.so.1//
+and //libstdc++.so.6// to point to the system libraries, which can typically be found in /lib or /usr/lib, as in the example below.
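+
+For example, for a 64-bit installation the links could be redirected like this (a sketch: the MATLAB version and the system library location are placeholders, check your own paths first):
+{{{
+$ cd /usr/local/MATLAB/R2011b/sys/os/glnxa64
+$ sudo mv libstdc++.so.6 libstdc++.so.6.orig
+$ sudo ln -s /usr/lib/x86_64-linux-gnu/libstdc++.so.6 .
+}}}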

References.wiki

+= 3. REFERENCES =
+
+# Eric Brochu, Vlad M. Cora, and Nando de Freitas. A tutorial on Bayesian optimization of expensive
+cost functions, with application to active user modeling and hierarchical reinforcement
+learning. eprint arXiv:1012.2599, arXiv.org, December 2010.
+# Mark S. Handcock and Michael L. Stein. A Bayesian analysis of kriging. Technometrics,
+35(4):403–410, 1993.
+# Matthew Hoffman, Eric Brochu, and Nando de Freitas. Portfolio allocation for Bayesian 
+optimization.
+In 27th Conference on Uncertainty in Artificial Intelligence (UAI2011), 2011.
+# D. Huang, T. T. Allen, W. I. Notz, and N. Zeng. Global optimization of stochastic black-box
+systems via sequential kriging meta-models. Journal of Global Optimization, 34(3):441– 466,
+2006.
+# Steven G. Johnson. The NLopt nonlinear-optimization package, http://ab-initio.mit.edu/nlopt.
+# D.R. Jones. A taxonomy of global optimization methods based on response surfaces. Journal
+of Global Optimization, 21:345–383, 2001.
+# D.R. Jones, M. Schonlau, and W.J. Welch. Efficient global optimization of expensive black-box
+functions. Journal of Global Optimization, 13(4):455–492, 1998.
+# R. Martinez-Cantin, N. de Freitas, E. Brochu, J.A. Castellanos, and A. Doucet. A Bayesian
+exploration-exploitation approach for optimal online sensing and planning with a visually
+guided mobile robot. Autonomous Robots - Special Issue on Robot Learning, Part B, 27(3):93–
+103, 2009.
+# R. Martinez-Cantin, N. de Freitas, J.A. Castellanos, and A. Doucet. Active policy learning
+for robot planning and exploration under uncertainty. In Proc. of Robotics: Science and
+Systems, 2007.
+# Jonas Mockus. Application of Bayesian approach to numerical methods of global and stochastic
+optimization. Journal of Global Optimization, 4:347–365, 1994.
+# J. Sacks, W.J. Welch, T.J. Mitchell, and H.P. Wynn. Design and analysis of computer experiments.
+Statistical Science, 4(4):409–423, 1989.
+# T.J. Santner, B. Williams, and W. Notz. The Design and Analysis of Computer Experiments.
+Springer-Verlag, 2003.
+# N. Srinivas, A. Krause, S. Kakade, and M. Seeger. Gaussian process optimization in the bandit
+setting: No regret and experimental design. In Proc. International Conference on Machine
+Learning (ICML), 2010.
+# B J Williams, T J Santner, and W I Notz. Sequential design of computer experiments to
+minimize integrated response functions. Statistica Sinica, 10(4):1133–1152, 2000.

Usage.wiki

+= 2 - USAGE: =
+
+After installing the library, several test programs are generated and can be found in the //bin// folder. For the external interfaces, there are test programs under the //python// and //matlab// folders, respectively.
+They provide examples of the different interfaces that Bayesian-optimization provides.
+
+First of all, make sure your system finds the correct libraries. In Windows, you can copy the DLLs to your working folder or include the lib folder in the path.
+
+In some *nix systems, including Ubuntu, Debian and Mac OS, the library is installed by default in ///usr/local/lib//. However, this folder is not included in the linker, Python or Matlab paths by default. The script //exportlocalpaths.sh// makes sure that the folder is included in all the necessary paths.
+
+After that, there are three steps to follow:
+# Define the function to optimize.
+# Set or modify the parameters of the optimization process. The available parameters and their default values can be found in parameters.h.
+# Run the optimizer.
+
+Here we show a brief summary of the different ways to use the library:
+
+== 2.1 - C/C++ callback usage ==
+
+This interface is the most standard approach, and it can also be used from other languages such as Fortran or Ada.
+
+The function to optimize must agree with the template provided in bayesoptwpr.h:
+
+{{{
+#!c
+double my_function (unsigned int n, const double *x, double *gradient, void *func_data);
+}}}
+
+Note that the gradient has been included for future compatibility, although it is not used in the current implementation. You can just ignore it or pass a NULL pointer.
+
+The parameters are defined in the bopt_params struct. The easiest way to set the parameters is to use
+
+{{{
+#!c
+bopt_params initialize_parameters_to_default(void);
+}}}
+
+and then modify the necessary fields.
+
+Once we have set the parameters and the function, we can call the optimizer:
+{{{
+#!c
+  int bayes_optimization(int nDim, /* number of dimensions */
+                         eval_func f, /* function to optimize */
+                         void* f_data, /* extra data that is transferred directly to f */
+                         const double *lb, const double *ub, /* bounds */
+                         double *x, /* out: minimizer */
+                         double *minf, /* out: minimum */
+                         bopt_params parameters);
+}}}
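+
+Putting the pieces together, a minimal program could look like the following sketch (the quadratic objective and the two-dimensional bounds are made up for illustration, and error checking is omitted):
+{{{
+#!c++
+#include <cstdio>
+#include "bayesoptwpr.h"
+
+/* Toy objective with minimum at (0.3, 0.7); the gradient is unused and may be NULL. */
+double my_function(unsigned int n, const double *x, double *gradient, void *func_data)
+{
+  double dx = x[0] - 0.3, dy = x[1] - 0.7;
+  return dx * dx + dy * dy;
+}
+
+int main()
+{
+  double lb[2] = {0.0, 0.0};   /* lower bounds */
+  double ub[2] = {1.0, 1.0};   /* upper bounds */
+  double x[2], minf;           /* outputs: minimizer and minimum value */
+
+  bopt_params params = initialize_parameters_to_default();
+  /* modify the necessary fields of params here */
+
+  bayes_optimization(2, my_function, NULL, lb, ub, x, &minf, params);
+  std::printf("minimum %g at (%g, %g)\n", minf, x[0], x[1]);
+  return 0;
+}
+}}}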
+
+== 2.2 - C++ inheritance usage ==
+
+This is the most straightforward and complete way to use the library. The class to be optimized must inherit from the BayesOptContinuous or BayesOptDiscrete classes, which are declared in bayesoptcont.hpp and bayesoptdisc.hpp respectively.
+
+Then, we just need to override one of the evaluateSample virtual functions, which can be called with C arrays or uBlas vectors. Since there are no pure virtual functions, you can just redefine your preferred interface.
+
+**Experimental**: You can also override the checkReachability function to include nonlinear restrictions.
+
+{{{
+#!c++
+class MyOptimization: public BayesOptContinuous
+{
+ public:
+  MyOptimization(bopt_params param):
+    BayesOptContinuous(param) {}
+
+  double evaluateSample( const boost::numeric::ublas::vector<double> &query )
+  {
+     // My function here
+  }
+
+  bool checkReachability( const boost::numeric::ublas::vector<double> &query )
+  {
+     // My restrictions here
+  }
+};
+}}}
+
+As with the callback interface, the parameters are defined in the bopt_params struct. The easiest way to set the parameters is to use
+
+{{{
+#!c
+bopt_params initialize_parameters_to_default(void);
+}}}
+
+and then modify the necessary fields.
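+
+Once the subclass is defined, driving the optimization could look like the following sketch (the //setBoundingBox// and //optimize// calls here are assumptions about the continuous interface, check bayesoptcont.hpp for the exact signatures):
+{{{
+#!c++
+bopt_params params = initialize_parameters_to_default();
+MyOptimization opt(params);
+
+// Assumed interface: set the box bounds, then run the optimization.
+boost::numeric::ublas::vector<double> lb(2), ub(2), result(2);
+lb(0) = lb(1) = 0.0;
+ub(0) = ub(1) = 1.0;
+opt.setBoundingBox(lb, ub);   // assumed name for the bounds setter
+opt.optimize(result);         // assumed entry point; fills 'result' with the minimizer
+}}}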
+
+== 2.3 - Python callback/inheritance usage ==
+
+The file python/test.py provides examples of the two Python interfaces. 
+
+**Parameters**: For both interfaces, the parameters are defined as a Python dictionary with the same structure as the bopt_params struct in the C/C++ interface. The enumerated values are replaced by strings without the prefix. For example, the C_EI criterion is replaced by the string "EI" and the M_ZERO mean function is replaced by the string "ZERO".
+
+The parameter dictionary can be initialized using 
+{{{
+#!python
+parameters = bayesopt.initialize_params()
+}}}
+however, this is not necessary in general. If a parameter is not included in the dictionary, the default value is used instead.
+
+**Callback**: The callback interface is just a wrapper around the C interface. In this case, the callback function should have the form
+{{{
+#!python
+def my_function (query):
+}}}
+where //query// is a numpy array and the function returns a double scalar.
+
+The optimization process can be called as
+{{{
+#!python
+y_out, x_out, error = bayesopt.optimize(my_function, n_dimensions, lower_bound, upper_bound, parameters)
+}}}
+where the result is a tuple with the value of the function at the minimum (y_out), the minimizer as a numpy array (x_out), and the error code.
+
+**Inheritance**:
+The inheritance usage is similar to the C++ interface.
+{{{
+#!python
+class MyModule(bayesoptmodule.BayesOptModule):
+    def evalfunc(self,query):
+        """ My function """
+}}}
+BayesOptModule includes attributes for the parameters (//params//), the number of dimensions (//n//) and the bounds (//lb// and //up//).
+
+The optimization process can be called as
+{{{
+#!python
+my_instance = MyModule()
+# set parameters, bounds and number of dimensions.
+y_out, x_out, error = my_instance.optimize()
+}}}
+where the result is a tuple with the value of the function at the minimum (y_out), the minimizer as a numpy array (x_out), and the error code.
+
+== 2.4 - Matlab/Octave callback usage ==
+
+The file matlab/runtest.m provides an example of the Matlab/Octave interface. 
+
+**Parameters**: The parameters are defined as a Matlab struct equivalent to the bopt_params struct in the C/C++ interface, except for the //theta// and //mu// arrays, which are replaced by Matlab vectors; thus, the number-of-elements fields (n_theta and n_mu) are not needed. The enumerated values are replaced by strings without the prefix. For example, the C_EI criterion is replaced by the string "EI" and the M_ZERO mean function is replaced by the string "ZERO".
+
+If a parameter is not included in the Matlab struct, the default value is automatically used instead.
+
+**Callback**: The callback interface is just a wrapper around the C interface. In this case, the callback function should have the form
+{{{
+#!matlab
+function y = my_function(query)
+}}}
+where //query// is a Matlab vector and the function returns a scalar.
+
+The optimization process can be called (both in Matlab and Octave) as
+{{{
+#!matlab
+[x_out, y_out] = bayesopt('my_function', n_dimensions, parameters, lower_bound, upper_bound)
+}}}
+where the result is the minimizer as a vector (x_out) and the value of the function at the minimum (y_out).
+
+In Matlab, but not in Octave, the optimization can also be called with function handles:
+{{{
+#!matlab
+[x_out, y_out] = bayesopt(@my_function, n_dimensions, parameters, lower_bound, upper_bound)
+}}}
+
+**Note on gcc versions:**
+Matlab for *nix systems may ship with outdated gcc libraries (e.g., 4.3 in R2011b). You might get errors like
+this one:
+{{{
+/usr/lib/x86_64-linux-gnu/gcc/x86_64-linux-gnu/4.5.2/cc1: /usr/local/MATLAB/R2010b/sys/os/glnxa64/libstdc++.so.6: version `GLIBCXX_3.4.14' not found (required by /usr/lib/libppl_c.so.2)
+}}}
+The solution is to change the symbolic links in //matlabroot///sys/os/glnx86 (or glnxa64 for 64-bit installations) for //libgcc_s.so.1//
+and //libstdc++.so.6// to point to the system libraries, which can typically be found in /lib or /usr/lib.

balloon_yellow.png

Added
New image

check_green.png

Added
New image

cross_red.png

Added
New image