Ruben Martinez-Cantin committed 0a9a4d7 Merge

Merged rmcantin/bayesopt into default


Files changed (7)

 c20fdd2c37dfcb0c9f425f0e821a5b1d046098f9 v0.5
 71d10a5c8d7ed3d944f9ffbc1473ce1402e5bf0a v0.5.1
 2aa766866e57e023e6afc6636efd89af739b8fb1 v0.6
+2fd35e657c26cf0e09a632af6497c4a5a7c554f9 v0.7
 General Public License for more details.
 You should have received a copy of the GNU General Public License
-along with BayesOpt. If not, see <http://www.gnu.org/licenses/>.
+along with BayesOpt. If not, see <http://www.gnu.org/licenses/>.


 - \subpage demos
 - \subpage bopttheory
 - \subpage contriblib
+- \subpage relsoft


+/*!  \page  relsoft Related software
+\section relbopt Bayesian optimization with Gaussian process
+\li Spearmint (Python): A library based on \cite Snoek2012. It is
+oriented towards cluster computing and provides time-based stopping
+criteria. Implemented in Python, with an interface to Matlab as well.
+\li DiceOptim (R): It adds optimization capabilities to the
+DiceKriging toolbox. See \cite Roustant2012.
+\section relbopt2 Bayesian optimization with other models
+\li SMAC (Java): A library based on \cite HutHooLey11-smac. It uses
+random forests as the surrogate model and is intended for hyperparameter
+optimization. It has a Python interface through HPOlib.
+\li Hyperopt (Python): A library mainly based on \cite
+Bergstra2011. It uses trees of Parzen estimators as the surrogate
+model.
+\section relkrig Related surrogate modeling software
+\li Perk (Fortran+Shell): A library for surrogate modelling. It was
+mainly used to illustrate the results in its accompanying paper.
+\li SUMO (Matlab+Java): A library for surrogate modelling using
+Gaussian processes. Its main purpose is not optimization, but it
+also includes a demo of the Expected Improvement criterion.
+\li GPML (Matlab): The most popular library for Gaussian
+processes. BayesOpt uses some of its design choices.
+\section relnopt Related nonlinear optimization software
+\li NLOPT (C): One of the best libraries for general-purpose nonlinear
+optimization. It has interfaces for a large number of programming
+languages.
   timestamp = {2013.05.07}
+  author = {K. Eggensperger and M. Feurer and F. Hutter and J.
+	Bergstra and J. Snoek and H. Hoos and K. Leyton-Brown},
+  title = {Towards an Empirical Foundation for Assessing Bayesian Optimization
+	of Hyperparameters},
+  longbooktitle = {NIPS workshop on Bayesian Optimization in Theory and Practice},
+  booktitle = {BayesOpt workshop (NIPS)},
+  year = {2013},
+  owner = {rmcantin},
+  timestamp = {2014.05.19}
   author = {Christian Igel and Marc Toussaint},
   title = {A No-Free-Lunch theorem for non-uniform distributions of target functions},


File contents unchanged.


 params = {} #bayesopt.initialize_params()
 # We decided to change some of them
-params['n_init_samples'] = 150
+params['n_init_samples'] = 30
 params['n_iter_relearn'] = 20
 params['noise'] = 1e-10
 params['kernel_name'] = "kMaternISO5"
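The hunk above tunes the surrogate (Matern 5/2 kernel, near-zero noise) and the initial design; the acquisition criterion that then drives the sampling in Bayesian optimization is typically Expected Improvement, which the related-software page also mentions. Below is a minimal pure-Python sketch of its closed form for minimization; the function name and values are illustrative only, not part of the BayesOpt API:

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Expected Improvement (minimization): expected amount by which a
    candidate with GP posterior mean `mu` and std `sigma` improves on
    the best value observed so far, `f_best`."""
    if sigma <= 0.0:
        # No predictive uncertainty: improvement is deterministic.
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # N(0,1) density
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # N(0,1) CDF
    return (f_best - mu) * cdf + sigma * pdf

# A point predicted exactly at the incumbent value but with high
# uncertainty still has positive expected improvement:
print(expected_improvement(mu=1.0, sigma=0.5, f_best=1.0))  # ≈ 0.1995
```

This is why Bayesian optimization keeps exploring uncertain regions: the `sigma * pdf` term rewards uncertainty even when the predicted mean offers no improvement.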