+After installing the library, several test programs are also generated and can be found in the //bin// folder. For the external interfaces, there are test programs under the //python// and //matlab// folders, respectively.
+They provide examples of the different interfaces that the library provides.
+First of all, make sure your system finds the correct libraries. In Windows, you can copy the DLLs to your working folder or add the //lib// folder to the path.
+In some *nix systems, including Ubuntu, Debian and Mac OS, the library is installed by default in ///usr/local/lib//. However, this folder is not included in the linker, Python or Matlab paths by default. The script //exportlocalpaths.sh// makes sure that the folder is included in all the necessary paths.
+After that, there are three steps that should be followed:
+# Define the function to optimize.
+# Set or modify the parameters of the optimization process. The full set of parameters and their default values can be found in parameters.h.
+# Run the optimizer.
+Here we show a brief summary of the different ways to use the library:
+== 2.1 - C/C++ callback usage ==
+This interface is the most standard approach, and it can also be used from other languages such as Fortran, Ada, etc.
+The function to optimize must agree with the template provided in bayesoptwpr.h:
+double my_function (unsigned int n, const double *x, double *gradient, void *func_data);
+Note that the gradient has been included for future compatibility, although it is not used in the current implementation. You can simply ignore it or pass a NULL pointer.
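+As an illustration, a minimal callback matching this template could look as follows. The quadratic objective here is a made-up example, not part of the library:

```c
#include <stddef.h>

/* Example objective matching the callback template from bayesoptwpr.h:
   a quadratic centered at 0.5 in every dimension (minimum value 0). */
double my_function(unsigned int n, const double *x,
                   double *gradient, void *func_data)
{
    (void)gradient;   /* unused in the current implementation */
    (void)func_data;  /* no extra data needed for this example */
    double sum = 0.0;
    for (unsigned int i = 0; i < n; ++i)
        sum += (x[i] - 0.5) * (x[i] - 0.5);
    return sum;
}
```

+The //func_data// pointer is simply forwarded by the optimizer, so it can carry any fixed data your objective needs.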
+The parameters are defined in the bopt_params struct. The easiest way to set the parameters is to use
+bopt_params initialize_parameters_to_default(void);
+and then modify the necessary fields.
+Once we have set the parameters and the function, we can call the optimizer:
+ int bayes_optimization(int nDim, /* number of dimensions */
+ eval_func f, /* function to optimize */
+ void* f_data, /* extra data that is transfered directly to f */
+ const double *lb, const double *ub, /* bounds */
+ double *x, /* out: minimizer */
+ double *minf, /* out: minimum */
+ bopt_params parameters);
+== 2.2 - C++ inheritance usage ==
+This is the most straightforward and complete way to use the library. The object to be optimized must inherit from the BayesOptContinuous or BayesOptDiscrete classes, which can be found in bayesoptcont.hpp and bayesoptdisc.hpp respectively.
+Then, we just need to override one of the virtual functions, called evaluateSample, which can be called with C arrays and uBlas vectors. Since there are no pure virtual functions, you can just redefine your preferred interface.
+**Experimental**: You can also override the checkReachability function to include nonlinear restrictions.
+class MyOptimization: public BayesOptContinuous
+{
+ public:
+  MyOptimization(bopt_params param):
+    BayesOptContinuous(param) {}
+  double evaluateSample( const boost::numeric::ublas::vector<double> &query )
+  { return 0.0; /* My function here */ }
+  bool checkReachability( const boost::numeric::ublas::vector<double> &query )
+  { return true; /* My restrictions here */ }
+};
+As with the callback interface, the parameters are defined in the bopt_params struct. The easiest way to set the parameters is to use
+bopt_params initialize_parameters_to_default(void);
+and then modify the necessary fields.
+== 2.3 - Python callback/inheritance usage ==
+The file python/test.py provides examples of the two Python interfaces.
+**Parameters**: For both interfaces, the parameters are defined as a Python dictionary with the same structure as the bopt_params struct of the C/C++ interface. The enumerated values are replaced by strings without the prefix. For example, the C_EI criterion is replaced by the string "EI" and the M_ZERO mean function is replaced by the string "ZERO".
+The parameter dictionary can be initialized using
+parameters = bayesopt.initialize_params()
+however, this is not necessary in general. If any parameter is not included in the dictionary, the default value is used instead.
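+As an illustration, a parameter dictionary could look like the following sketch. The key names shown (//n_iterations//, //crit_name//, //mean_name//) are assumptions mirroring bopt_params fields, not a verified list; the authoritative reference is parameters.h:

```python
# Hypothetical parameter dictionary; the key names are assumptions
# for this sketch, mirroring fields of the bopt_params struct.
parameters = {
    "n_iterations": 100,   # budget of function evaluations
    "crit_name": "EI",     # C_EI criterion, prefix stripped
    "mean_name": "ZERO",   # M_ZERO mean function, prefix stripped
}
```

+Any field left out of the dictionary keeps its default value.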
+**Callback**: The callback interface is just a wrapper of the C interface. In this case, the callback function should have the form
+def my_function (query):
+where //query// is a numpy array and the function returns a double scalar.
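+For example, a valid callback could be the sphere function; this particular objective is just an illustration:

```python
import numpy as np

# Example callback: the sphere function, which takes a numpy
# array and returns a scalar (its minimum is 0 at the origin).
def my_function(query):
    return float(np.sum(query ** 2))
```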
+The optimization process can be called as
+y_out, x_out, error = bayesopt.optimize(my_function, n_dimensions, lower_bound, upper_bound, parameters)
+where the result is a tuple with the minimum as a numpy array (x_out), the value of the function at the minimum (y_out) and the error code.
+The inheritance usage is similar to the C++ interface.
+class MyModule(bayesoptmodule.BayesOptModule):
+    def evalfunc(self, query):
+        pass  # My function here
+The BayesOptModule class includes attributes for the parameters (//params//), the number of dimensions (//n//) and the bounds (//lb// and //up//).
+The optimization process can be called as
+my_instance = MyModule()
+# set parameters, bounds and number of dimensions.
+y_out, x_out, error = my_instance.optimize()
+where the result is a tuple with the minimum as a numpy array (x_out), the value of the function at the minimum (y_out) and the error code.
+== 2.4 - Matlab/Octave callback usage ==
+The file matlab/runtest.m provides an example of the Matlab/Octave interface.
+**Parameters**: The parameters are defined as a Matlab struct equivalent to the bopt_params struct of the C/C++ interface, except for the //theta// and //mu// arrays, which are replaced by Matlab vectors. Thus, the numbers of elements (n_theta and n_mu) are not needed. The enumerated values are replaced by strings without the prefix. For example, the C_EI criterion is replaced by the string "EI" and the M_ZERO mean function is replaced by the string "ZERO".
+If any parameter is not included in the Matlab struct, the default value is automatically used instead.
+**Callback**: The callback interface is just a wrapper of the C interface. In this case, the callback function should have the form
+function y = my_function (query)
+where //query// is a Matlab vector and the function returns a scalar.
+The optimization process can be called (both in Matlab and Octave) as
+[x_out, y_out] = bayesopt('my_function', n_dimensions, parameters, lower_bound, upper_bound)
+where the result is the minimum as a vector (x_out) and the value of the function at the minimum (y_out).
+In Matlab, but not in Octave, the optimization can also be called with function handles:
+[x_out, y_out] = bayesopt(@my_function, n_dimensions, parameters, lower_bound, upper_bound)
+**Note on gcc versions:**
+Matlab for *nix OS may ship outdated libraries for gcc (e.g., 4.3 in R2011b). You might get errors like:
+/usr/lib/x86_64-linux-gnu/gcc/x86_64-linux-gnu/4.5.2/cc1: /usr/local/MATLAB/R2010b/sys/os/glnxa64/libstdc++.so.6: version `GLIBCXX_3.4.14' not found (required by /usr/lib/libppl_c.so.2)
+The solution is to change the symbolic links in //matlabroot///sys/os/glnx86 for //libgcc_s.so.1//
+and //libstdc++.so.6// to point to the system libraries, which typically can be found in /lib or /usr/lib.