class tinygp.solvers.DirectSolver(kernel: kernels.Kernel, X: JAXArray, noise: Noise, *, covariance: Any | None = None)

Bases: Solver

A direct solver that uses JAX's built-in Cholesky factorization.

You generally won't instantiate this object directly, but if you do, you'll probably want to use the DirectSolver.init() method instead of the usual constructor.
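To illustrate what a direct solver does conceptually, the sketch below forms a dense covariance matrix, adds diagonal noise, and Cholesky-factors it. This is a hypothetical NumPy stand-in (the kernel choice, function name, and parameters here are illustrative assumptions), not tinygp's actual implementation, which operates on `jax.numpy` arrays via this class.

```python
import numpy as np

def build_and_factor(X, variance=1.0, length=1.0, diag=0.1):
    """Hypothetical sketch: form a squared-exponential covariance
    matrix over 1D inputs X, add diagonal noise, and Cholesky-factor
    it, which is conceptually what a direct solver does up front."""
    r2 = (X[:, None] - X[None, :]) ** 2
    K = variance * np.exp(-0.5 * r2 / length**2) + diag * np.eye(len(X))
    L = np.linalg.cholesky(K)  # lower triangular factor: K = L @ L.T
    return K, L

X = np.linspace(0.0, 1.0, 5)
K, L = build_and_factor(X)
assert np.allclose(L @ L.T, K)
```

Every downstream operation (solves, log determinants, predictive variances) can then reuse the factor L instead of re-decomposing K.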

condition(kernel: kernels.Kernel, X_test: JAXArray | None, noise: Noise) → Any

Compute the covariance matrix for a conditional GP

Parameters:

  • kernel – The kernel for the covariance between the observed and predicted data.

  • X_test – The coordinates of the predicted points. Defaults to the input coordinates.

  • noise – The noise model for the predicted process.
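The conditional covariance of a GP follows the standard conditioning identity for a multivariate normal, cov = K_test − K_cross @ K_train⁻¹ @ K_cross.T, which is computed stably through the Cholesky factor of the training covariance. A hedged sketch (the function name and the random test matrices are illustrative assumptions, not tinygp's internals):

```python
import numpy as np

def conditional_cov(K_train, K_cross, K_test):
    """Sketch of the standard GP conditioning identity:
    cov = K_test - K_cross @ inv(K_train) @ K_cross.T,
    evaluated via the Cholesky factor of K_train."""
    L = np.linalg.cholesky(K_train)
    A = np.linalg.solve(L, K_cross.T)  # A = L^{-1} @ K_cross.T
    return K_test - A.T @ A

rng = np.random.default_rng(0)
M = rng.normal(size=(6, 6))
K = M @ M.T + 6.0 * np.eye(6)  # a positive definite "kernel" matrix
K_train, K_cross, K_test = K[:4, :4], K[4:, :4], K[4:, 4:]
cov = conditional_cov(K_train, K_cross, K_test)
```

The result is the Schur complement of K_train in K, which is itself positive definite whenever K is.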

covariance() → tinygp.helpers.JAXArray

The evaluated covariance matrix

dot_triangular(y: tinygp.helpers.JAXArray) → tinygp.helpers.JAXArray

Compute a matrix product with the lower triangular linear system

If the covariance matrix is K = L @ L.T for a lower triangular matrix L, this method returns L @ y for a given vector y.
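As a small NumPy illustration of the relationship between the factor and the full matrix (an assumed stand-in for the `jax.numpy` arrays this class actually uses): applying L and then L.T recovers a product with K itself.

```python
import numpy as np

K = np.array([[4.0, 2.0],
              [2.0, 3.0]])
L = np.linalg.cholesky(K)  # K = L @ L.T
y = np.array([1.0, -1.0])

Ly = L @ y  # the product this method computes

# Composing the two triangular products recovers the full matrix:
assert np.allclose(L @ (L.T @ y), K @ y)
```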

normalization() → tinygp.helpers.JAXArray

The multivariate normal normalization constant

This should be (log_det + n*log(2*pi))/2, where n is the size of the covariance matrix, and log_det is the log determinant of the matrix.
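Given a Cholesky factor, the log determinant comes cheaply from the diagonal of L, since log det K = 2 Σ log Lᵢᵢ. A hedged NumPy sketch of the formula above (the function name is hypothetical):

```python
import numpy as np

def mvn_normalization(K):
    """Sketch: the log normalization constant (log_det + n*log(2*pi))/2,
    with log_det read off the Cholesky factor's diagonal."""
    L = np.linalg.cholesky(K)
    n = K.shape[0]
    log_det = 2.0 * np.sum(np.log(np.diag(L)))
    return 0.5 * (log_det + n * np.log(2.0 * np.pi))

K = np.array([[4.0, 2.0],
              [2.0, 3.0]])
# Cross-check against a direct log-determinant computation
sign, logdet = np.linalg.slogdet(K)
assert np.isclose(mvn_normalization(K), 0.5 * (logdet + 2 * np.log(2 * np.pi)))
```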

solve_triangular(y: tinygp.helpers.JAXArray, *, transpose: bool = False) → tinygp.helpers.JAXArray

Solve the lower triangular linear system defined by this solver

If the covariance matrix is K = L @ L.T for a lower triangular matrix L, this method solves L @ x = y for a given y. If the transpose parameter is True, it instead solves L.T @ x = y.
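Chaining the two triangular solves (forward, then transposed) applies the full inverse covariance, which is the usual way K⁻¹ y is computed from a Cholesky factor. A NumPy sketch (using the general `np.linalg.solve` for simplicity; a real solver would use a dedicated triangular routine):

```python
import numpy as np

K = np.array([[4.0, 2.0],
              [2.0, 3.0]])
L = np.linalg.cholesky(K)
y = np.array([2.0, -1.0])

# Solve L @ x = y (the transpose=False case)
x = np.linalg.solve(L, y)
assert np.allclose(L @ x, y)

# Chaining the transpose=True solve after it yields K^{-1} @ y
x2 = np.linalg.solve(L.T, x)
assert np.allclose(K @ x2, y)
```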

variance() → tinygp.helpers.JAXArray

The diagonal of the covariance matrix
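In other words, the per-point marginal variances. A minimal NumPy illustration of the equivalent operation:

```python
import numpy as np

K = np.array([[4.0, 2.0],
              [2.0, 3.0]])
var = np.diag(K)  # the diagonal: each point's marginal variance
assert np.allclose(var, [4.0, 3.0])
```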