API¶
Computation engine¶
- class tinygp.GaussianProcess(kernel: Kernel, X: JAXArray, *, diag: Union[JAXArray, float] = 0.0, mean: Optional[Union[Mean, JAXArray]] = None)¶
An interface for designing a Gaussian Process regression model
- Parameters
kernel (Kernel) – The kernel function.
X (JAXArray) – The input coordinates. This can be any PyTree that is compatible with kernel, where the zeroth dimension is N_data, the size of the data set.
diag (JAXArray, optional) – The value to add to the diagonal of the covariance matrix, often used to capture measurement uncertainty. This should be a scalar or have the shape (N_data,).
mean (Mean, optional) – A callable or constant mean function that will be evaluated with the X as input: mean(X).
- condition(y: tinygp.types.JAXArray) tinygp.types.JAXArray ¶
Condition the process on observed data
- Parameters
y (JAXArray) – The observed data. This should have the shape (N_data,), where N_data was the zeroth axis of the X data provided when instantiating this object.
- Returns
The marginal likelihood of this model, evaluated at y.
- condition_and_predict(y: JAXArray, X_test: Optional[JAXArray] = None, *, kernel: Optional[Kernel] = None, include_mean: bool = True, return_var: bool = False, return_cov: bool = False) Tuple[JAXArray, Union[JAXArray, Tuple[JAXArray, JAXArray]]] ¶
Condition on observed data and return the predictive process
This combines GaussianProcess.condition() and GaussianProcess.predict() into a single operation, which will be somewhat more efficient than calling them both separately. See those docstrings for a description of all the arguments.
- numpyro_dist(**kwargs)¶
Get the numpyro MultivariateNormal distribution for this process
- predict(y: JAXArray, X_test: Optional[JAXArray] = None, *, kernel: Optional[Kernel] = None, include_mean: bool = True, return_var: bool = False, return_cov: bool = False) Union[JAXArray, Tuple[JAXArray, JAXArray]] ¶
Predict the GP model at new test points conditioned on observed data
- Parameters
y (JAXArray) – The observed data. This should have the shape (N_data,), where N_data was the zeroth axis of the X data provided when instantiating this object.
X_test (JAXArray, optional) – The coordinates where the prediction should be evaluated. This should have a data type compatible with the X data provided when instantiating this object. If it is not provided, X will be used by default, so the predictions will be made at the original input coordinates.
include_mean (bool, optional) – If True (default), the predicted values will include the mean function evaluated at X_test.
return_var (bool, optional) – If True, the variance of the predicted values at X_test will be returned.
return_cov (bool, optional) – If True, the covariance of the predicted values at X_test will be returned. If return_var is True, this flag will be ignored.
- Returns
The mean of the predictive model evaluated at X_test, with shape (N_test,) where N_test is the zeroth dimension of X_test. If either return_var or return_cov is True, the variance or covariance of the predicted process will also be returned with shape (N_test,) or (N_test, N_test), respectively.
- sample(key: jax._src.prng.PRNGKeyArray, shape: Optional[Sequence[int]] = None) tinygp.types.JAXArray ¶
Generate samples from the prior process
- Parameters
key – A jax random number key array.
shape (tuple, optional) – The number and shape of samples to generate.
- Returns
The sampled realizations from the process with shape (N_data,) + shape, where N_data is the zeroth dimension of the X coordinates provided when instantiating this process.
Kernels¶
- class tinygp.kernels.Kernel¶
The base class for all kernel implementations
This class provides default implementations to add and multiply kernels. Subclasses should accept parameters in their __init__ and then override Kernel.evaluate() with custom behavior.
- evaluate(X1: tinygp.types.JAXArray, X2: tinygp.types.JAXArray) tinygp.types.JAXArray ¶
Evaluate the kernel at a pair of input coordinates
This should be overridden by subclasses to return the kernel-specific value. Two things to note:
1. Users shouldn’t generally call Kernel.evaluate(). Instead, always “call” the kernel instance directly; for example, you can evaluate the Matern-3/2 kernel using Matern32(1.5)(x1, x2), for arrays of input coordinates x1 and x2.
2. When implementing a custom kernel, this method should treat X1 and X2 as single datapoints. In other words, these inputs will typically either be scalars or have shape (n_dim,), where n_dim is the number of input dimensions, rather than (n_data,) or (n_data, n_dim), and you should let the Kernel vmap magic handle all the broadcasting for you.
- class tinygp.kernels.Custom(function: Callable[[Any, Any], Any])¶
A custom kernel class implemented as a callable
- Parameters
function – A callable with a signature and behavior that matches Kernel.evaluate().
- class tinygp.kernels.Constant(value: tinygp.types.JAXArray)¶
This kernel returns the constant
\[k(\mathbf{x}_i,\,\mathbf{x}_j) = c\]where \(c\) is a parameter.
- Parameters
value – The parameter \(c\) in the above equation.
- class tinygp.kernels.DotProduct¶
The dot product kernel
\[k(\mathbf{x}_i,\,\mathbf{x}_j) = \mathbf{x}_i \cdot \mathbf{x}_j\]with no parameters.
- class tinygp.kernels.Polynomial(*, order: tinygp.types.JAXArray, scale: tinygp.types.JAXArray = DeviceArray(1.0, dtype=float32), sigma: tinygp.types.JAXArray = DeviceArray(0.0, dtype=float32))¶
A polynomial kernel
\[k(\mathbf{x}_i,\,\mathbf{x}_j) = [(\mathbf{x}_i / \ell) \cdot (\mathbf{x}_j / \ell) + \sigma^2]^P\]- Parameters
order – The power \(P\).
scale – The parameter \(\ell\).
sigma – The parameter \(\sigma\).
- class tinygp.kernels.Exp(scale: tinygp.types.JAXArray = DeviceArray(1.0, dtype=float32))¶
The exponential kernel
\[k(\mathbf{x}_i,\,\mathbf{x}_j) = \exp(-r)\]where
\[r = ||(\mathbf{x}_i - \mathbf{x}_j) / \ell||_1\]- Parameters
scale – The parameter \(\ell\).
- class tinygp.kernels.ExpSquared(scale: tinygp.types.JAXArray = DeviceArray(1.0, dtype=float32))¶
The exponential squared or radial basis function kernel
\[k(\mathbf{x}_i,\,\mathbf{x}_j) = \exp(-r^2 / 2)\]where
\[r = ||(\mathbf{x}_i - \mathbf{x}_j) / \ell||_2\]- Parameters
scale – The parameter \(\ell\).
- class tinygp.kernels.Matern32(scale: tinygp.types.JAXArray = DeviceArray(1.0, dtype=float32))¶
The Matern-3/2 kernel
\[k(\mathbf{x}_i,\,\mathbf{x}_j) = (1 + \sqrt{3}\,r)\,\exp(-\sqrt{3}\,r)\]where
\[r = ||(\mathbf{x}_i - \mathbf{x}_j) / \ell||_1\]- Parameters
scale – The parameter \(\ell\).
- class tinygp.kernels.Matern52(scale: tinygp.types.JAXArray = DeviceArray(1.0, dtype=float32))¶
The Matern-5/2 kernel
\[k(\mathbf{x}_i,\,\mathbf{x}_j) = (1 + \sqrt{5}\,r + 5\,r^2/3)\,\exp(-\sqrt{5}\,r)\]where
\[r = ||(\mathbf{x}_i - \mathbf{x}_j) / \ell||_1\]- Parameters
scale – The parameter \(\ell\).
- class tinygp.kernels.Cosine(period: tinygp.types.JAXArray)¶
The cosine kernel
\[k(\mathbf{x}_i,\,\mathbf{x}_j) = \cos(2\,\pi\,r)\]where
\[r = ||(\mathbf{x}_i - \mathbf{x}_j) / P||_1\]- Parameters
period – The parameter \(P\).
- class tinygp.kernels.ExpSineSquared(*, period: tinygp.types.JAXArray, gamma: tinygp.types.JAXArray)¶
The exponential sine squared or quasiperiodic kernel
\[k(\mathbf{x}_i,\,\mathbf{x}_j) = \exp(-\Gamma\,\sin^2 \pi r)\]where
\[r = ||(\mathbf{x}_i - \mathbf{x}_j) / P||_1\]- Parameters
period – The parameter \(P\).
gamma – The parameter \(\Gamma\).
- class tinygp.kernels.RationalQuadratic(*, alpha: JAXArray, scale: Optional[JAXArray] = None)¶
The rational quadratic kernel
\[k(\mathbf{x}_i,\,\mathbf{x}_j) = \left(1 + \frac{r^2}{2\,\alpha}\right)^{-\alpha}\]where
\[r = ||(\mathbf{x}_i - \mathbf{x}_j) / \ell||_2\]- Parameters
scale – The parameter \(\ell\).
alpha – The parameter \(\alpha\).
Transforms¶
In tinygp, a “transform” is any callable that takes an input coordinate and returns a transformed coordinate. There are some built-in implementations for standard linear transformations that can be used to handle multivariate vector inputs.
- class tinygp.transforms.Transform(transform: Callable[[Any], Any], kernel: tinygp.kernels.Kernel)¶
Apply a transformation to the input coordinates of the kernel
- Parameters
transform (Callable) – A callable object that accepts coordinates as inputs and returns transformed coordinates.
kernel (Kernel) – The kernel to use in the transformed space.
- class tinygp.transforms.Affine(scale: tinygp.types.JAXArray, kernel: tinygp.kernels.Kernel, *, variance: bool = False)¶
Apply an affine transformation to the input coordinates of the kernel
For example, the following transformed kernels are all equivalent:
>>> import numpy as np
>>> from tinygp import kernels, transforms
>>> kernel0 = kernels.Matern32(4.5)
>>> kernel1 = transforms.Affine(4.5, kernels.Matern32())
>>> kernel2 = transforms.Affine(4.5 ** 2, kernels.Matern32(), variance=True)
>>> np.testing.assert_allclose(
...     kernel0.evaluate(0.5, 0.1), kernel1.evaluate(0.5, 0.1)
... )
>>> np.testing.assert_allclose(
...     kernel0.evaluate(0.5, 0.1), kernel2.evaluate(0.5, 0.1)
... )
- Parameters
scale (JAXArray) – A 0-, 1-, or 2- dimensional array specifying the variance or covariance of the input dimensions.
kernel (Kernel) – The kernel to use in the transformed space.
variance (bool, optional) – If True, take the square root of scale before applying its inverse.
- class tinygp.transforms.Subspace(axis: Union[Sequence[int], int], kernel: tinygp.kernels.Kernel)¶
A kernel transform that selects a subset of the input dimensions
For example, the following kernel only depends on the coordinates in the second (1-th) dimension:
>>> import numpy as np
>>> from tinygp import kernels, transforms
>>> kernel = transforms.Subspace(1, kernels.Matern32())
>>> np.testing.assert_allclose(
...     kernel.evaluate(np.array([0.5, 0.1]), np.array([-0.4, 0.7])),
...     kernel.evaluate(np.array([100.5, 0.1]), np.array([-70.4, 0.7])),
... )
- Parameters
axis (int or tuple of int) – An integer or tuple of integers specifying the axes to select.
kernel (Kernel) – The kernel to use in the transformed space.
Mean functions¶
In tinygp, a mean function is specified as a callable that takes an input coordinate and returns the scalar mean value at that point. This will be vmap-ed, so it should treat its input as a single coordinate and leave broadcasting to the tinygp.GaussianProcess object.
- tinygp.means.zero_mean(X: tinygp.types.JAXArray) tinygp.types.JAXArray ¶
A mean function that returns zero for every input coordinate
- tinygp.means.constant_mean(value: tinygp.types.JAXArray) tinygp.means.Mean ¶
A factory returning a mean function that evaluates to the constant value