minor changes
abhishek-ghose committed Jan 16, 2024
1 parent 260069a commit 9f4231b
Showing 3 changed files with 5 additions and 3 deletions.
2 changes: 1 addition & 1 deletion bayesopt_1_key_ideas_GPs.html
@@ -758,7 +758,7 @@ <h3 id="summary-and-challenges">Summary and Challenges</h3>

<hr style="height: 50px; border: 0; box-shadow: inset 0 12px 12px -12px rgba(0, 0, 0, 0.5);" />

- <p>We looked at one critical ingredient of BayesOpt here - the surrogate model using GPs. We’ll see the other critical component - acquisition functions - in the next post. That also contains a list of learning resources at the end. See you there!</p>
+ <p>We looked at one critical ingredient of BayesOpt here - surrogate models. We’ll see the other critical component - acquisition functions - in the next post. That also contains a list of learning resources at the end. See you there!</p>

<h2 id="references">References</h2>
<ol class="bibliography"><li><span id="bayesopt_is_superior">Turner, R., Eriksson, D., McCourt, M., Kiili, J., Laaksonen, E., Xu, Z., &amp; Guyon, I. (2021). Bayesian Optimization is Superior to Random Search for Machine Learning Hyperparameter Tuning: Analysis of the Black-Box Optimization Challenge 2020. In H. J. Escalante &amp; K. Hofmann (Eds.), <i>Proceedings of the NeurIPS 2020 Competition and Demonstration Track</i> (Vol. 133, pp. 3–26). PMLR. https://proceedings.mlr.press/v133/turner21a.html</span>
4 changes: 3 additions & 1 deletion bayesopt_2_acq_fns.html
@@ -639,7 +639,9 @@ <h2 id="minimal-code-for-bayesopt">Minimal Code for BayesOpt</h2>

<h2 id="summary-a-pet-peeve-and-resources">Summary, a Pet Peeve and Resources</h2>

- <p>We’re finally at the end of our (somewhat long) discussion of BayesOpt. One way to think of BayesOpt is as a clever and benign middleman that effectively converts black-box optimizers like CMA-ES (for auxiliary optimization) into sample-efficient ones, by standing between them and the real world, and supplying them with hallucinated functions from the surrogate model. The evaluation-hungry auxiliary optimizers can run wild on these computationally cheap-to-evaluate functions. All the while, BayesOpt observes them and relays important information to and from the expensive real-world function. Torturing this metaphor, you might wonder, “Hey, this seems like a fairly general way to compute other properties of functions, not just optima! Just replace CMA-ES with an arbitrary algorithm!”. And you’d be right - this is exactly the kind of extension <em>InfoBAX</em> <a class="citation" href="#infobax">(Neiswanger et al., 2021)</a> proposes.</p>
+ <p>We’re finally at the end of our (somewhat long) discussion of BayesOpt. One way to think of BayesOpt is as a clever and benign middleman that effectively converts black-box optimizers like CMA-ES (for auxiliary optimization) into sample-efficient ones, by standing between them and the real world, and supplying them with hallucinated functions from the surrogate model. The evaluation-hungry auxiliary optimizers can run wild on these computationally cheap-to-evaluate functions. All the while, BayesOpt observes them, inferring the value of different input regions, and relays strategic information to and from the expensive real-world function.</p>
+
+ <p>Torturing this metaphor further, you might wonder, “Hey, this seems like a fairly general way to compute other properties of functions, not just optima! Just replace CMA-ES with an arbitrary algorithm!”. And you’d be right - this is exactly the kind of extension <em>InfoBAX</em> <a class="citation" href="#infobax">(Neiswanger et al., 2021)</a> proposes.</p>

<p>While we looked at the canonical versions of BayesOpt, I’d point out that because of the plug-and-play nature of the framework - you can replace one or more of the following: surrogate model, acquisition function, kernel - BayesOpt is extremely versatile. It has been adapted to combinatorial search in discrete structured spaces <a class="citation" href="#papenmeier2023bounce">(Papenmeier et al., 2023; Baptista &amp; Poloczek, 2018; Deshwal et al., 2020; Deshwal &amp; Doppa, 2021)</a>, parallelization <a class="citation" href="#RePEc:inm:oropre:v:68:y:2020:i:6:p:1850-1865">(Wang et al., 2020; Kandasamy et al., 2018)</a>, sparsity <a class="citation" href="#10.5555/3020948.3021002">(McIntire et al., 2016; Liu et al., 2023)</a> and used with Neural Networks as the surrogate model <a class="citation" href="#pmlr-v37-snoek15">(Snoek et al., 2015; Springenberg et al., 2016)</a>. The list of applications is impressively diverse - from hyperparameter optimization to plasma control in nuclear fusion <a class="citation" href="#mehta2022an">(Mehta et al., 2022)</a>. The preamble to the part 1 post lists some more.</p>
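
[Editorial aside: to make the middleman loop in the paragraphs above concrete, here is a minimal sketch - not the post's own "Minimal Code for BayesOpt" - using a scikit-learn GP surrogate, a UCB acquisition, and plain random search standing in for CMA-ES as the auxiliary optimizer. The toy objective expensive_f, the search bounds, and the UCB coefficient are all illustrative assumptions.]

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive_f(x):
    # Toy stand-in for the expensive real-world black-box function.
    return (-np.sin(3.0 * x) - x**2 + 0.7 * x).ravel()

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 2.0, size=(3, 1))   # a few initial real evaluations
y = expensive_f(X)

for _ in range(10):
    # Surrogate model: fit a GP to everything observed so far.
    gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)

    # Auxiliary optimization: dense random search (standing in for CMA-ES)
    # runs wild on the cheap surrogate-based acquisition, never touching
    # expensive_f.
    candidates = rng.uniform(-1.0, 2.0, size=(2000, 1))
    mu, sigma = gp.predict(candidates, return_std=True)
    ucb = mu + 2.0 * sigma                 # upper confidence bound acquisition

    # Relay only the single most promising point to the real-world function.
    x_next = candidates[np.argmax(ucb)].reshape(1, 1)
    X = np.vstack([X, x_next])
    y = np.append(y, expensive_f(x_next))

print("best x:", X[np.argmax(y)], "best y:", y.max())

[The key sample-efficiency point is visible in the structure: 2000 surrogate evaluations per round, but only one call to expensive_f.]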

2 changes: 1 addition & 1 deletion feed.xml
@@ -6,7 +6,7 @@
<link>https://blog.quipu-strands.com</link>
<atom:link href="https://blog.quipu-strands.com/feed.xml" rel="self" type="application/rss+xml" />
<description>My thought-recorder.</description>
- <lastBuildDate>Mon, 15 Jan 2024 15:46:26 -0800</lastBuildDate>
+ <lastBuildDate>Tue, 16 Jan 2024 00:43:44 -0800</lastBuildDate>

<item>
<title>Bayesian Optimization, Part 2: Acquisition Functions</title>
