Commit 8c6bd02

Pushing the docs for revision for branch: master, commit e8cd756074b4ffdffddc0893647ec9b71d7b9754
1 parent 9dc8aa2 commit 8c6bd02

File tree

421 files changed (+81586, -45 lines)

Binary files changed (contents not shown); the named entries in this view are:

beta/doctrees/contents.doctree (11.3 KB)
beta/doctrees/development.doctree (3.36 KB)
beta/doctrees/environment.pickle (9.13 MB)
beta/doctrees/getting_started.doctree (10.5 KB)
beta/doctrees/index.doctree (7.89 KB)
beta/doctrees/install.doctree (4.43 KB)
beta/doctrees/modules/classes.doctree (114 KB)
beta/doctrees/modules/index.doctree (2.73 KB)
beta/doctrees/modules/plots.doctree (66.2 KB)
beta/doctrees/modules/space.doctree (143 KB)
beta/doctrees/modules/utils.doctree (61.8 KB)
beta/doctrees/tune_toc.doctree (8.79 KB)
beta/doctrees/user_guide.doctree (11.8 KB)
beta/doctrees/whats_new/v0.1.doctree (4.56 KB)
beta/doctrees/whats_new/v0.2.doctree (6.13 KB)
beta/doctrees/whats_new/v0.3.doctree (8.49 KB)
beta/doctrees/whats_new/v0.4.doctree (5.81 KB)
beta/doctrees/whats_new/v0.5.doctree (8.92 KB)
beta/doctrees/whats_new/v0.6.doctree (6.55 KB)
beta/doctrees/whats_new/v0.7.doctree (3.23 KB)

Many further binary files are not shown, and their names are not visible in this view.

beta/html/.buildinfo (new file, +4 lines)

@@ -0,0 +1,4 @@
+# Sphinx build info version 1
+# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
+config: 101315b500250a2eccd5c5cad70d536a
+tags: 645f666f9bcd5a90fca523b33c5a78b7

beta/html/.nojekyll

Whitespace-only changes.
(new notebook file, +79 lines; filename not shown in this view)

{
  "cells": [
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "%matplotlib inline"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "\n# Parallel optimization\n\n\nIaroslav Shcherbatyi, May 2017.\n\nReviewed by Manoj Kumar and Tim Head.\n\nReformatted by Holger Nahrstaedt 2020\n\n.. currentmodule:: skopt\n\nIntroduction\n============\n\nFor many practical black-box optimization problems, the expensive objective\ncan be evaluated in parallel at multiple points. This yields more objective\nevaluations per unit of time, which reduces the time necessary to reach good\nobjective values when appropriate optimization algorithms are used; see for\nexample the results in [1] and the references therein.\n\n\nOne example task is selecting the number of neurons and the activation\nfunction of a neural network that result in the highest accuracy on some\nmachine learning problem. For such a task, multiple neural networks with\ndifferent combinations of neuron count and activation function type can be\nevaluated at the same time, in parallel, on different CPU cores /\ncomputational nodes.\n\nThe \u201cask and tell\u201d API of scikit-optimize exposes functionality that makes\nit possible to obtain multiple points for evaluation in parallel. The\nintended usage of this interface is as follows:\n\n1. Initialize an instance of the `Optimizer` class from skopt\n2. Obtain n points for evaluation in parallel by calling the `ask` method of\n   the optimizer instance with the `n_points` argument set to n > 0\n3. Evaluate the points\n4. Provide the points and the corresponding objectives via the `tell` method\n   of the optimizer instance\n5. Continue from step 2 until e.g. the maximum number of evaluations is\n   reached\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "print(__doc__)\nimport numpy as np"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Example\n=======\n\nA minimal example that uses joblib to parallelize evaluation of the\nobjective function is given below.\n\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "from skopt import Optimizer\nfrom skopt.space import Real\nfrom joblib import Parallel, delayed\n# example objective taken from skopt\nfrom skopt.benchmarks import branin\n\noptimizer = Optimizer(\n    dimensions=[Real(-5.0, 10.0), Real(0.0, 15.0)],\n    random_state=1,\n    base_estimator='gp'\n)\n\nfor i in range(10):\n    x = optimizer.ask(n_points=4)  # x is a list of n_points points\n    y = Parallel(n_jobs=4)(delayed(branin)(v) for v in x)  # evaluate points in parallel\n    optimizer.tell(x, y)\n\n# takes ~ 20 sec to get here\nprint(min(optimizer.yi))  # print the best objective found"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Note that if `n_points` is set to some integer n > 0 for the `ask` method,\nthe result will be a list of points, even for `n_points=1`. If the argument\nis left at its default value of `None`, a single point (not a list of\npoints) is returned.\n\nThe default \"minimum constant liar\" [1] parallelization strategy is used in\nthe example; it makes it possible to obtain multiple points for evaluation\nwith a single call to the `ask` method, with any surrogate or acquisition\nfunction. The parallelization strategy can be set using the `strategy`\nargument of `ask`. For the supported parallelization strategies, see the\ndocumentation of scikit-optimize.\n\n[1] [https://hal.archives-ouvertes.fr/hal-00732512/document](https://hal.archives-ouvertes.fr/hal-00732512/document)\n"
      ]
    }
  ],
  "metadata": {
    "kernelspec": {
      "display_name": "Python 3",
      "language": "python",
      "name": "python3"
    },
    "language_info": {
      "codemirror_mode": {
        "name": "ipython",
        "version": 3
      },
      "file_extension": ".py",
      "mimetype": "text/x-python",
      "name": "python",
      "nbconvert_exporter": "python",
      "pygments_lexer": "ipython3",
      "version": "3.7.6"
    }
  },
  "nbformat": 4,
  "nbformat_minor": 0
}
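The "minimum constant liar" batch strategy described in the notebook can be sketched in plain Python. This is a schematic illustration of the idea only, not skopt's implementation: to propose several points without waiting for new evaluations, the optimizer pretends each already-proposed point returned the best (minimum) objective seen so far, so the next proposal is pushed away from the same region. The `ask_batch` and `propose` helpers below are invented for the sketch; `propose` is a toy stand-in for a real surrogate model.

```python
def ask_batch(observed_x, observed_y, n_points, propose):
    """Sketch of the 'minimum constant liar' batch strategy:
    propose n_points in one go by telling a copy of the model a
    'lie' (the minimum objective seen so far) for each proposal."""
    xs, ys = list(observed_x), list(observed_y)
    batch = []
    for _ in range(n_points):
        x = propose(xs, ys)            # one point from the (toy) surrogate
        batch.append(x)
        lie = min(ys) if ys else 0.0   # the "minimum constant liar" value
        xs.append(x)                   # feed the lie back so the next
        ys.append(lie)                 # proposal avoids the same region
    return batch


def propose(xs, ys):
    """Toy surrogate on [0, 1]: propose the midpoint of the largest
    gap between points seen so far (stands in for a real model)."""
    pts = sorted([0.0, 1.0] + xs)
    gaps = [(b - a, (a + b) / 2) for a, b in zip(pts, pts[1:])]
    return max(gaps)[1]


batch = ask_batch([0.2, 0.8], [3.0, 1.5], n_points=3, propose=propose)
print(batch)  # three distinct, spread-out points, thanks to the lies
```

Because each lie immediately enters the model's data, the three proposals land in different regions even though no real evaluation happened between them.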
(new Python file, +146 lines; filename not shown in this view)

"""
==========================
Comparing surrogate models
==========================

Tim Head, July 2016.
Reformatted by Holger Nahrstaedt 2020

.. currentmodule:: skopt

Bayesian optimization or sequential model-based optimization uses a surrogate
model to model the expensive-to-evaluate function `func`. There are several
choices for what kind of surrogate model to use. This notebook compares the
performance of:

* gaussian processes,
* extra trees, and
* random forests

as surrogate models. A purely random optimization strategy is also used as
a baseline.
"""

print(__doc__)
import numpy as np
np.random.seed(123)
import matplotlib.pyplot as plt

#############################################################################
# Toy model
# =========
#
# We will use the `branin` function as a toy model for the expensive function.
# In a real-world application this function would be unknown and expensive
# to evaluate.

from skopt.benchmarks import branin as _branin

def branin(x, noise_level=0.):
    return _branin(x) + noise_level * np.random.randn()

#############################################################################

from matplotlib.colors import LogNorm


def plot_branin():
    fig, ax = plt.subplots()

    x1_values = np.linspace(-5, 10, 100)
    x2_values = np.linspace(0, 15, 100)
    x_ax, y_ax = np.meshgrid(x1_values, x2_values)
    vals = np.c_[x_ax.ravel(), y_ax.ravel()]
    fx = np.reshape([branin(val) for val in vals], (100, 100))

    cm = ax.pcolormesh(x_ax, y_ax, fx,
                       norm=LogNorm(vmin=fx.min(),
                                    vmax=fx.max()))

    minima = np.array([[-np.pi, 12.275], [+np.pi, 2.275], [9.42478, 2.475]])
    ax.plot(minima[:, 0], minima[:, 1], "r.", markersize=14,
            lw=0, label="Minima")

    cb = fig.colorbar(cm)
    cb.set_label("f(x)")

    ax.legend(loc="best", numpoints=1)

    ax.set_xlabel("X1")
    ax.set_xlim([-5, 10])
    ax.set_ylabel("X2")
    ax.set_ylim([0, 15])


plot_branin()

#############################################################################
# This shows the value of the two-dimensional branin function and
# the three minima.
#
#
# Objective
# =========
#
# The objective of this example is to find one of these minima in as
# few iterations as possible. One iteration is defined as one call
# to the `branin` function.
#
# We will evaluate each model several times, using a different seed for the
# random number generator each time, and then compare the average performance
# of the models. This makes the comparison more robust against models that
# get "lucky".

from functools import partial
from skopt import gp_minimize, forest_minimize, dummy_minimize

func = partial(branin, noise_level=2.0)
bounds = [(-5.0, 10.0), (0.0, 15.0)]
n_calls = 60

#############################################################################


def run(minimizer, n_iter=5):
    return [minimizer(func, bounds, n_calls=n_calls, random_state=n)
            for n in range(n_iter)]

# Random search
dummy_res = run(dummy_minimize)

# Gaussian processes
gp_res = run(gp_minimize)

# Random forest
rf_res = run(partial(forest_minimize, base_estimator="RF"))

# Extra trees
et_res = run(partial(forest_minimize, base_estimator="ET"))

#############################################################################
# Note that this can take a few minutes.

from skopt.plots import plot_convergence

plot = plot_convergence(("dummy_minimize", dummy_res),
                        ("gp_minimize", gp_res),
                        ("forest_minimize('rf')", rf_res),
                        ("forest_minimize('et')", et_res),
                        true_minimum=0.397887, yscale="log")

plot.legend(loc="best", prop={'size': 6}, numpoints=1)

#############################################################################
# This plot shows the value of the minimum found (y axis) as a function
# of the number of iterations performed so far (x axis). The dashed red line
# indicates the true value of the minimum of the branin function.
#
# For the first ten iterations all methods perform equally well, as they all
# start by creating ten random samples before fitting their respective model
# for the first time. After iteration ten, the next point at which
# to evaluate `branin` is guided by the model, which is where differences
# start to appear.
#
# Each minimizer only has access to noisy observations of the objective
# function, so as time passes (more iterations) it will start observing
# values that are below the true value simply because they are fluctuations.