DirectSolver#
- class tinygp.solvers.DirectSolver(kernel: kernels.Kernel, X: JAXArray, noise: Noise, *, covariance: Any | None = None)[source]#
Bases: Solver
A direct solver that uses `jax`’s built-in Cholesky factorization.

You generally won’t instantiate this object directly but, if you do, you’ll probably want to use the `DirectSolver.init()` method instead of the usual constructor.

- condition(kernel: kernels.Kernel, X_test: JAXArray | None, noise: Noise) Any [source]#
Compute the covariance matrix for a conditional GP
- Parameters:
kernel – The kernel for the covariance between the observed and predicted data.
X_test – The coordinates of the predicted points. Defaults to the input coordinates.
noise – The noise model for the predicted process.
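For concreteness, here is a minimal sketch of building the solver and computing a conditional covariance. It assumes that `DirectSolver.init()` takes the same arguments as the constructor and that `tinygp.noise.Diagonal` is an acceptable noise model; the kernel, data, and noise values are purely illustrative.

```python
import jax.numpy as jnp

from tinygp import kernels, noise
from tinygp.solvers import DirectSolver

# Hypothetical 1D setup: 50 observed points and 10 test points.
kernel = kernels.ExpSquared(scale=1.5)
X = jnp.linspace(0.0, 10.0, 50)
X_test = jnp.linspace(0.0, 10.0, 10)

# Assumption: init() mirrors the constructor arguments shown above;
# check the signature for your tinygp version.
solver = DirectSolver.init(kernel, X, noise.Diagonal(diag=1e-3 * jnp.ones(50)))

# Covariance of the conditional (predictive) process at X_test,
# here assuming a noiseless predicted process.
cond_cov = solver.condition(kernel, X_test, noise.Diagonal(diag=jnp.zeros(10)))
```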
- dot_triangular(y: tinygp.helpers.JAXArray) tinygp.helpers.JAXArray [source]#
Compute a matrix product with the lower triangular linear system
If the covariance matrix is `K = L @ L.T` for some lower triangular matrix `L`, this method returns `L @ y` for some `y`.
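One common use of this product is drawing correlated samples: if `z` is a standard normal draw, then `L @ z` is distributed as `N(0, K)`. A sketch, under the same illustrative assumptions about `DirectSolver.init()` and `tinygp.noise.Diagonal` as above:

```python
import jax
import jax.numpy as jnp

from tinygp import kernels, noise
from tinygp.solvers import DirectSolver

# Hypothetical solver over 50 points (same illustrative setup as above).
X = jnp.linspace(0.0, 10.0, 50)
solver = DirectSolver.init(
    kernels.ExpSquared(scale=1.5), X, noise.Diagonal(diag=1e-3 * jnp.ones(50))
)

# If K = L @ L.T, then L @ z with z ~ N(0, I) is a draw from N(0, K).
z = jax.random.normal(jax.random.PRNGKey(0), (50,))
sample = solver.dot_triangular(z)
```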
- normalization() tinygp.helpers.JAXArray [source]#
The multivariate normal normalization constant
This should be `(log_det + n*log(2*pi))/2`, where `n` is the size of the covariance matrix and `log_det` is the log determinant of the matrix.
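The following sketch cross-checks that quantity against a dense computation, assuming the solver’s covariance includes the noise term and using the same illustrative `init()` and `Diagonal` assumptions as above:

```python
import jax.numpy as jnp

from tinygp import kernels, noise
from tinygp.solvers import DirectSolver

# Hypothetical check of (log_det + n*log(2*pi))/2 against a dense computation.
X = jnp.linspace(0.0, 10.0, 20)
diag = 1e-3 * jnp.ones(20)
kernel = kernels.ExpSquared(scale=1.5)
solver = DirectSolver.init(kernel, X, noise.Diagonal(diag=diag))

K = kernel(X, X) + jnp.diag(diag)  # dense covariance, assuming noise is included
n = K.shape[0]
expected = 0.5 * (jnp.linalg.slogdet(K)[1] + n * jnp.log(2 * jnp.pi))
# expected should agree with solver.normalization() up to round-off
```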
- solve_triangular(y: tinygp.helpers.JAXArray, *, transpose: bool = False) tinygp.helpers.JAXArray [source]#
Solve the lower triangular linear system defined by this solver
If the covariance matrix is `K = L @ L.T` for some lower triangular matrix `L`, this method solves `L @ x = y` for some `y`. If the `transpose` parameter is `True`, this instead solves `L.T @ x = y`.
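A typical use of this solve is “whitening” the observed data: since `alpha = L^{-1} y` satisfies `sum(alpha**2) = y^T K^{-1} y`, it combines with `normalization()` to give the multivariate normal log density. A sketch, again under the illustrative assumptions about `DirectSolver.init()` and `tinygp.noise.Diagonal` used above:

```python
import jax.numpy as jnp

from tinygp import kernels, noise
from tinygp.solvers import DirectSolver

# Hypothetical evaluation of the zero-mean multivariate normal log density.
X = jnp.linspace(0.0, 10.0, 50)
y = jnp.sin(X)
solver = DirectSolver.init(
    kernels.ExpSquared(scale=1.5), X, noise.Diagonal(diag=1e-3 * jnp.ones(50))
)

# alpha = L^{-1} y, so sum(alpha**2) = y^T K^{-1} y
alpha = solver.solve_triangular(y)
log_prob = -0.5 * jnp.sum(jnp.square(alpha)) - solver.normalization()
```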