Tutorial Learner1D#

Note

Because this documentation consists of static HTML, the live_plot and live_info widgets are not live. Download the notebook in order to see the real behaviour. [1]

import adaptive

adaptive.notebook_extension()

import random
from functools import partial

import numpy as np

scalar output: f: ℝ → ℝ#

We start with the most common use-case: sampling a 1D function f: ℝ → ℝ.

We will use the following function, which is a smooth (linear) background with a sharp peak at a random location:

offset = random.uniform(-0.5, 0.5)


def f(x, offset=offset, wait=True):
    from random import random
    from time import sleep

    a = 0.01
    if wait:
        sleep(random() / 10)
    return x + a**2 / (a**2 + (x - offset) ** 2)

We start by initializing a 1D "learner", which will suggest points to evaluate, and adapt its suggestions as more and more points are evaluated.

learner = adaptive.Learner1D(f, bounds=(-1, 1))

Next we create a "runner" that will request points from the learner and evaluate f on them.

By default, on Unix-like systems the runner will evaluate the points in parallel using local processes (a concurrent.futures.ProcessPoolExecutor).

On Windows systems the runner will use a loky.get_reusable_executor instead. A ProcessPoolExecutor cannot be used on Windows because the standard pickle module cannot serialize functions defined interactively in a notebook; loky works around this by using cloudpickle.
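If you want to control the parallelization yourself, you can pass an executor to the Runner explicitly. A minimal sketch for illustration only (max_workers=4 is an arbitrary choice; the rest of this tutorial simply uses the default):

from concurrent.futures import ProcessPoolExecutor

# Evaluate the requested points with an explicit pool of 4 worker processes.
runner = adaptive.Runner(
    learner, loss_goal=0.01, executor=ProcessPoolExecutor(max_workers=4)
)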

# The end condition is when the "loss" is less than 0.01. In the context of the
# 1D learner this means that we will resolve features in f with width 0.01 or wider.
runner = adaptive.Runner(learner, loss_goal=0.01)
await runner.task  # This is not needed in a notebook environment!

When instantiated in a Jupyter notebook the runner does its job in the background and does not block the IPython kernel. We can use this to create a plot that updates as new data arrives:

runner.live_info()
runner.live_plot(update_interval=0.1)

We can now compare the adaptive sampling to a homogeneous sampling with the same number of points:

if not runner.task.done():
    raise RuntimeError(
        "Wait for the runner to finish before executing the cells below!"
    )
learner2 = adaptive.Learner1D(f, bounds=learner.bounds)

xs = np.linspace(*learner.bounds, len(learner.data))
learner2.tell_many(xs, map(partial(f, wait=False), xs))

learner.plot() + learner2.plot()

vector output: f: ℝ → ℝ^N#

Sometimes you may want to learn a function with vector output:

random.seed(0)
offsets = [random.uniform(-0.8, 0.8) for _ in range(3)]

# sharp peaks at random locations in the domain


def f_levels(x, offsets=offsets):
    a = 0.01
    return np.array(
        [offset + x + a**2 / (a**2 + (x - offset) ** 2) for offset in offsets]
    )

adaptive has you covered! The Learner1D can be used for such functions:

learner = adaptive.Learner1D(f_levels, bounds=(-1, 1))
runner = adaptive.Runner(
    learner, loss_goal=0.01
)  # continue until `learner.loss()<=0.01`
await runner.task  # This is not needed in a notebook environment!
runner.live_info()
runner.live_plot(update_interval=0.1)

Looking at curvature#

By default adaptive will sample more points where the (normalized) Euclidean distance between neighboring points is large. You may achieve better results by sampling more points in regions with high curvature. To do this, you need to tell the learner to look at the curvature by specifying loss_per_interval.

from adaptive.learner.learner1D import (
    curvature_loss_function,
    default_loss,
    uniform_loss,
)

curvature_loss = curvature_loss_function()
learner = adaptive.Learner1D(f, bounds=(-1, 1), loss_per_interval=curvature_loss)
runner = adaptive.Runner(learner, loss_goal=0.01)
await runner.task  # This is not needed in a notebook environment!
runner.live_info()
runner.live_plot(update_interval=0.1)

In the plot below we compare homogeneous sampling, the default loss (which considers one interval at a time), and the curvature loss (which also takes the nearest neighboring intervals into account). We use 100 points for each learner.

def sin_exp(x):
    from math import exp, sin

    return sin(15 * x) * exp(-(x**2) * 2)


learner_h = adaptive.Learner1D(sin_exp, (-1, 1), loss_per_interval=uniform_loss)
learner_1 = adaptive.Learner1D(sin_exp, (-1, 1), loss_per_interval=default_loss)
learner_2 = adaptive.Learner1D(sin_exp, (-1, 1), loss_per_interval=curvature_loss)

# adaptive.runner.simple is a non-parallel, blocking runner.
adaptive.runner.simple(learner_h, npoints_goal=100)
adaptive.runner.simple(learner_1, npoints_goal=100)
adaptive.runner.simple(learner_2, npoints_goal=100)

(
    learner_h.plot().relabel("homogeneous")
    + learner_1.plot().relabel("euclidean loss")
    + learner_2.plot().relabel("curvature loss")
).cols(2)

More info about using custom loss functions can be found in Custom adaptive logic for 1D and 2D.
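To give a flavor of what such a function looks like, here is a minimal sketch of a custom loss_per_interval. It reproduces the essence of the default loss by measuring the Euclidean length of an interval; the uses_nth_neighbors(0) decorator declares that the loss needs no data from neighboring intervals, and learner_custom is just an illustrative name:

from adaptive.learner.learner1D import uses_nth_neighbors


@uses_nth_neighbors(0)
def distance_loss(xs, ys):
    # xs and ys contain the two endpoints of a single interval;
    # adaptive rescales the data internally, so this is the Euclidean
    # length of the segment in the rescaled x-y plane.
    dx = xs[1] - xs[0]
    dy = ys[1] - ys[0]
    return (dx**2 + dy**2) ** 0.5


learner_custom = adaptive.Learner1D(f, bounds=(-1, 1), loss_per_interval=distance_loss)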

Exporting the data#

We can view the raw data by looking at the dictionary learner.data. Alternatively, we can view the data as a NumPy array with

learner.to_numpy()
array([[-1.00000000e+00, -9.99837596e-01],
       [-8.33333333e-01, -8.33071541e-01],
       [-6.66666667e-01, -6.66175921e-01],
       [-5.00000000e-01, -4.98767199e-01],
       [-4.16666667e-01, -4.14204927e-01],
       [-3.33333333e-01, -3.26198694e-01],
       [-2.91666667e-01, -2.74779579e-01],
       [-2.70833333e-01, -2.39352608e-01],
       [-2.60416667e-01, -2.13457265e-01],
       [-2.50000000e-01, -1.73045260e-01],
       [-2.39583333e-01, -9.39052558e-02],
       [-2.34375000e-01, -1.76035090e-02],
       [-2.29166667e-01,  1.15139783e-01],
       [-2.23958333e-01,  3.51362445e-01],
       [-2.21354167e-01,  5.14753480e-01],
       [-2.18750000e-01,  6.78540041e-01],
       [-2.16145833e-01,  7.77820386e-01],
       [-2.13541667e-01,  7.54224320e-01],
       [-2.10937500e-01,  6.25058076e-01],
       [-2.08333333e-01,  4.60704768e-01],
       [-2.03125000e-01,  1.97100177e-01],
       [-1.97916667e-01,  4.92997743e-02],
       [-1.92708333e-01, -2.96830566e-02],
       [-1.87500000e-01, -7.34168001e-02],
       [-1.77083333e-01, -1.13210733e-01],
       [-1.66666667e-01, -1.26208580e-01],
       [-1.45833333e-01, -1.25569456e-01],
       [-1.25000000e-01, -1.12902457e-01],
       [-8.33333333e-02, -7.76297414e-02],
       [-5.55111512e-17,  2.15133258e-03],
       [ 8.33333333e-02,  8.44528819e-02],
       [ 1.66666667e-01,  1.67351366e-01],
       [ 3.33333333e-01,  3.33665370e-01],
       [ 5.00000000e-01,  5.00195370e-01],
       [ 6.66666667e-01,  6.66795188e-01],
       [ 1.00000000e+00,  1.00006769e+00]])

If Pandas is installed (optional dependency), you can also run

df = learner.to_dataframe()
df
x y function.offset function.wait
0 -1.000000e+00 -0.999838 -0.215367 True
1 -8.333333e-01 -0.833072 -0.215367 True
2 -6.666667e-01 -0.666176 -0.215367 True
3 -5.000000e-01 -0.498767 -0.215367 True
4 -4.166667e-01 -0.414205 -0.215367 True
5 -3.333333e-01 -0.326199 -0.215367 True
6 -2.916667e-01 -0.274780 -0.215367 True
7 -2.708333e-01 -0.239353 -0.215367 True
8 -2.604167e-01 -0.213457 -0.215367 True
9 -2.500000e-01 -0.173045 -0.215367 True
10 -2.395833e-01 -0.093905 -0.215367 True
11 -2.343750e-01 -0.017604 -0.215367 True
12 -2.291667e-01 0.115140 -0.215367 True
13 -2.239583e-01 0.351362 -0.215367 True
14 -2.213542e-01 0.514753 -0.215367 True
15 -2.187500e-01 0.678540 -0.215367 True
16 -2.161458e-01 0.777820 -0.215367 True
17 -2.135417e-01 0.754224 -0.215367 True
18 -2.109375e-01 0.625058 -0.215367 True
19 -2.083333e-01 0.460705 -0.215367 True
20 -2.031250e-01 0.197100 -0.215367 True
21 -1.979167e-01 0.049300 -0.215367 True
22 -1.927083e-01 -0.029683 -0.215367 True
23 -1.875000e-01 -0.073417 -0.215367 True
24 -1.770833e-01 -0.113211 -0.215367 True
25 -1.666667e-01 -0.126209 -0.215367 True
26 -1.458333e-01 -0.125569 -0.215367 True
27 -1.250000e-01 -0.112902 -0.215367 True
28 -8.333333e-02 -0.077630 -0.215367 True
29 -5.551115e-17 0.002151 -0.215367 True
30 8.333333e-02 0.084453 -0.215367 True
31 1.666667e-01 0.167351 -0.215367 True
32 3.333333e-01 0.333665 -0.215367 True
33 5.000000e-01 0.500195 -0.215367 True
34 6.666667e-01 0.666795 -0.215367 True
35 1.000000e+00 1.000068 -0.215367 True

and load that data into a new learner with

new_learner = adaptive.Learner1D(learner.function, (-1, 1))  # create an empty learner
new_learner.load_dataframe(df)  # load the pandas.DataFrame's data
new_learner.plot()
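
Finally, the learner's data can also be persisted to disk: learners have save and load methods that pickle the data (the filename below is just an example):

learner.save("learner1d.pickle")  # write learner.data to disk

other_learner = adaptive.Learner1D(learner.function, (-1, 1))
other_learner.load("learner1d.pickle")  # restore the saved data
other_learner.plot()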