Ruben Martinez-Cantin / BayesOpt

Commits

Ruben Martinez-Cantin committed ef0b6e0

Adding new changes to the docs.

  • Parent commits 0a9a4d7
  • Branches default
  • Tags v0.7.1


Files changed (4)

File doxygen/install.dox

 
 <li> In some *nix systems, including Ubuntu, Debian and Mac OS, the
 library is installed by default in /usr/local/lib/. However, this
-folder is not included by default in the linker, Python or Matlab
-paths by default. This is specially critical when building shared
-libraries (mandatory for Python usage). The script \em
-exportlocalpaths.sh makes sure that the folder with the libraries is
-included in all the necessary paths.
+folder is not included by default in the linker or Matlab
+paths. This is especially critical when building shared
+libraries. The script \em exportlocalpaths.sh makes sure that the
+folder with the libraries is included in all the necessary
+paths.
+
+<li> The Python package is built as a separate module and installed
+in the proper folder (site-packages) by default. Thus, the Python
+interface should work out of the box in Linux and Mac OS.
+
 </ul>
 
 <li> Python: the script \b demo_quad gives a quick overview of all the

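As a quick sanity check of the note above about the Python package landing in site-packages, something like the following should run after installation. This is only a sketch: it assumes the module name bayesopt and the optimize(f, n, lower, upper, params) entry point used by the demo_quad script mentioned above; the test function and parameter values are arbitrary.

    import numpy as np
    import bayesopt  # installed into site-packages by default

    def quadratic(x):
        # Arbitrary smooth test function; minimum at 0.3 in every dimension.
        return float(np.sum((np.asarray(x) - 0.3) ** 2))

    params = {'n_iterations': 50, 'crit_name': 'cEI'}

    n = 2                                   # number of input dimensions
    lb, ub = np.zeros((n,)), np.ones((n,))  # box bounds of the search space

    mvalue, x_out, error = bayesopt.optimize(quadratic, n, lb, ub, params)
    print(mvalue, x_out, error)
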
File doxygen/models.dox

 \li "cEI","cBEI","cEIa": The most extended and reliable algorithm is
 the Expected Improvement algorithm \cite Mockus78. In this case we
 provide the general version from \cite Schonlau98 which includes an
-exponent to trade off exploration and exploitation "cEI". Whe also includes
-a variation from \cite Mockus1989 which add a \a bias or \a threshold
-to the improvement "cBEI".
+exponent to trade off exploration and exploitation "cEI". For an
+annealed version of the exploration/exploitation trade-off, use
+"cEIa". We also include a variation from \cite Mockus1989 which adds
+a \a bias or \a threshold to the improvement "cBEI".
 \li "cLCB", "cLCBa": Another popular algorithm is the Lower Confidence
-Bound (LCB), or UCB in case of maximization. Introduced by 
-\cite cox1992statistical as Sequential Design for Optimization (SDO).
+Bound (LCB), or UCB in the case of maximization. Introduced by
+\cite cox1992statistical as Sequential Design for Optimization
+(SDO). Analogously, "cLCBa" represents an annealed version of the
+exploration/exploitation trade-off.
+\li "cMI": A generalized version of the LCB criterion which relies on
+the mutual information. See \cite Contal2014
 \li "cPOI": Probability of improvement, by \cite Kushner:1964
 \li "cExpReturn","cThompsonSampling","cOptimisticSampling": This
 criteria are related with the predicted return of the function. The
 value). The second one is based on the Thompson sampling (drawing a
 random sample from the predicted distribution). Finally, the
 optimistic sampling takes the minimum of the other two (mean vs random).
-
 \li "cAopt": This is based on the A-optimality criteria. It is the
 predicted variance at the query point. Thus, this criteria is intended
 for \b exploration of the input space, not for optimization.

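Selecting any of the criteria listed above is a matter of setting the criterion name in the parameter set (the crit_name option described in doxygen/using.dox below). A minimal sketch through the Python interface, under the same assumed bayesopt.optimize signature and dictionary keys as the previous snippet:

    import numpy as np
    import bayesopt

    def objective(x):
        # Placeholder objective; any scalar function of the inputs will do.
        x = np.asarray(x)
        return float((x[0] - 0.5) ** 2 + 2.0 * (x[1] - 0.25) ** 2)

    params = {'n_iterations': 100,
              'crit_name': 'cMI'}   # mutual-information criterion (Contal et al., 2014)
    # Any other name from the list above (e.g. 'cLCBa' or 'cBEI') works the same way.

    n = 2
    mvalue, x_out, error = bayesopt.optimize(objective, n,
                                             np.zeros((n,)), np.ones((n,)), params)
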
File doxygen/using.dox

   combination of them. It is used to select which points to evaluate
  for each iteration of the optimization process. It can be a
  combination of functions like
-  "cHedge(cEI,cLCB,cPOI,cThompsonSampling)". See section critmod for
+  "cHedge(cEI,cLCB,cPOI,cThompsonSampling)". See section \ref critmod for
   the different possibilities. [Default: "cEI"]
 
 - \b crit_params, \b n_crit_params: Array with the set of parameters

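The crit_name / crit_params options above also cover criteria portfolios such as GP-Hedge. A hedged sketch of what that might look like from Python, under the same assumptions as the earlier snippets; whether crit_params is passed as a plain list, and which individual criteria consume its entries in which order, are assumptions here and should be checked against the criteria documentation:

    import numpy as np
    import bayesopt

    params = {
        # Portfolio of criteria; GP-Hedge picks among them online.
        'crit_name': 'cHedge(cEI,cLCB,cPOI,cThompsonSampling)',
        # Assumed layout: the parameters of the individual criteria concatenated
        # in order (e.g. EI exponent, LCB beta); adjust to the criteria actually used.
        'crit_params': [1.0, 1.0],
        'n_iterations': 100,
    }

    def objective(x):
        return float(np.sum(np.asarray(x) ** 2))

    n = 2
    mvalue, x_out, error = bayesopt.optimize(objective, n,
                                             np.zeros((n,)), np.ones((n,)), params)
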
File optimization.bib

   timestamp = {2010.06.18}
 }
 
+@INPROCEEDINGS{Contal2014,
+  author = {Emile Contal and Vianney Perchet and Nicolas Vayatis},
+  title = {Gaussian Process Optimization with Mutual Information},
+  booktitle = {International Conference on Machine Learning},
+  year = {2014},
+  owner = {rmcantin},
+  timestamp = {2014.07.02}
+}
+
 @INCOLLECTION{cox1997sdo,
   author = {Cox, Dennis D and John, Susan},
   title = {SDO: A statistical method for global optimization},
   timestamp = {2013.03.15}
 }
 
+@INPROCEEDINGS{EggFeuBerSnoHooHutLey13,
+  author = {K. Eggensperger and M. Feurer and F. Hutter and J. Bergstra and J.
+	Snoek and H. Hoos and K. Leyton-Brown},
+  title = {Towards an Empirical Foundation for Assessing Bayesian Optimization
+	of Hyperparameters},
+  booktitle = {BayesOpt workshop (NIPS)},
+  year = {2013},
+  longbooktitle = {NIPS workshop on Bayesian Optimization in Theory and Practice},
+  owner = {rmcantin},
+  timestamp = {2014.05.19}
+}
+
 @INPROCEEDINGS{Garnett2010,
   author = {Roman Garnett and Michael A. Osborne and Stephen J. Roberts},
   title = {Bayesian optimization for sensor set selection},
   timestamp = {2013.05.07}
 }
 
-@INPROCEEDINGS{EggFeuBerSnoHooHutLey13,
-  author = {K. Eggensperger and M. Feurer and F. Hutter and J.
-	Bergstra and J. Snoek and H. Hoos and K. Leyton-Brown},
-  title = {Towards an Empirical Foundation for Assessing Bayesian Optimization
-	of Hyperparameters},
-  longbooktitle = {NIPS workshop on Bayesian Optimization in Theory and Practice},
-  booktitle = {BayesOpt workshop (NIPS)},
-  year = {2013},
-  owner = {rmcantin},
-  timestamp = {2014.05.19}
-}
-
-
 @ARTICLE{Igel04,
   author = {Christian Igel and Marc Toussaint},
   title = {A No-Free-Lunch theorem for non-uniform distributions of target functions},