DKR
dkregression.DKR(kernel, likelihood, cross_validation)
The DKR object handles the hyperparameter fitting as well as running the model inference. It ties together the kernel, the observation likelihood, and the cross-validation configuration.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| kernel | Kernel | A kernel object that adheres to the definition of kernels as outlined in the reference section about kernels. | required |
| likelihood | Likelihood | An observation likelihood object that adheres to the definition of observation likelihoods as outlined in the reference section about observation likelihoods. | required |
| cross_validation | CrossValidation | An instance of the cross-validation configuration object. The documentation can be found in the appropriate reference section. | required |
Attributes:

| Name | Type | Description |
|---|---|---|
| kernel | Kernel | The kernel object, containing the current estimate of the kernel (hyper)parameters. |
| likelihood | Likelihood | The likelihood model that is currently used. |
| cross_validation | CrossValidation | Cross-validation configuration. The attributes can be read, but also overwritten if desired (see the sketch after this table). |
| X | Tensor | If set, the input data points of the dataset. The shape is `(n, d_input)`. |
| Y | Tensor | If set, the output data points of the dataset. The shape is `(n, d_output)`. |
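A brief, hedged sketch of working with these attributes (construction mirrors the examples further down; whether X and Y are populated before calling fit is not assumed here):

```python
import torch
from dkregression import DKR
from dkregression.kernels import RBF
from dkregression.likelihoods import PoissonLikelihood
from dkregression.cross_validation import CrossValidation

X = torch.rand((100, 2))
Y = torch.randint(0, 25, (100, 1))
model = DKR(RBF(X), PoissonLikelihood(), CrossValidation())

# the attributes listed above are plain references and can be read directly
print(model.kernel, model.likelihood)

# the cross-validation configuration can also be overwritten after construction
model.cross_validation = CrossValidation()

# X and Y, once set (e.g. after fitting), have shapes (n, d_input) and (n, d_output)
```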
Source code in src/dkregression/dkr.py
fit(X, Y, verbose=0, budget=100)
The DKR.fit method finds the optimal values for the kernel hyperparameters.
Mathematically, the fit method minimizes the average negative log-likelihood of the
held-out data points across the different cross-validation partitions. The exact
cross-validation scheme depends on the configuration in DKR.cross_validation,
but the goal is for every data point in the dataset to be part of the held-out
"test" set at least once. The optimization backend is Meta's Nevergrad, and the search
is bounded by DKR.kernel.params_lower_bound and DKR.kernel.params_upper_bound.
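As an illustration of the objective being optimized (not the library's internal code), a sketch of the cross-validated average negative log-likelihood could look like the following; `predict_fn` and `log_likelihood_fn` are hypothetical placeholders for whatever DKR.cross_validation and DKR.likelihood actually provide:

```python
import torch

def cv_average_nll(params, folds, predict_fn, log_likelihood_fn):
    """Illustrative objective: average held-out negative log-likelihood.

    params            -- candidate kernel hyperparameters proposed by Nevergrad
    folds             -- iterable of (X_train, Y_train, X_test, Y_test) partitions
    predict_fn        -- hypothetical helper returning likelihood parameters for X_test
    log_likelihood_fn -- hypothetical helper scoring Y_test under those parameters
    """
    nlls = []
    for X_train, Y_train, X_test, Y_test in folds:
        likelihood_params = predict_fn(params, X_train, Y_train, X_test)
        nlls.append(-log_likelihood_fn(Y_test, likelihood_params).mean())
    # Nevergrad minimizes this scalar within the kernel's parameter bounds
    return torch.stack(nlls).mean().item()
```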
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| X | Tensor | The "training" input values. Needs to be of shape `(n, d_input)`, even if `d_input==1`. | required |
| Y | Tensor | The "training" target values. Needs to be of shape `(n, d_output)`, even if `d_output==1`. | required |
| verbose | int | Defines how verbose the output is; this is passed through to the Nevergrad optimizer's verbosity setting. | 0 |
| budget | int | Defines how many iterations Nevergrad is allowed to run; this maps to Nevergrad's optimization budget. | 100 |
Examples:

```python
import torch
from dkregression import DKR
from dkregression.kernels import RBF
from dkregression.likelihoods import PoissonLikelihood
from dkregression.cross_validation import CrossValidation

X = torch.rand((100, 4))
Y = torch.randint(0, 25, (100, 1))

kernel = RBF(X)
likelihood = PoissonLikelihood()
cv = CrossValidation()

# initialization of the DKR model with the kernel, likelihood and cross-validation configuration
model = DKR(kernel, likelihood, cv)

# fit the kernel (hyper)parameter(s) in verbose mode with a budget of 200
model.fit(X, Y, verbose=1, budget=200)
```
Source code in src/dkregression/dkr.py
predict(Xq)
For each query point in Xq, the DKR.predict method returns the parameters of the observation likelihood, as defined in DKR.likelihood.param_names.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| Xq | Tensor | Contains all the query points. Needs to be of shape `(m, d_input)` (same input dimensionality as the training data), even if `d_input==1`. | required |
Returns:

| Name | Type | Description |
|---|---|---|
| dict |  | The keys of this dictionary correspond to DKR.likelihood.param_names; each entry contains the corresponding likelihood parameter for every query point in Xq (see the sketch after the example below). |
Examples:

```python
import torch
from dkregression import DKR
from dkregression.kernels import RBF
from dkregression.likelihoods import PoissonLikelihood
from dkregression.cross_validation import CrossValidation

X = torch.rand((100, 1))
Y = torch.randint(0, 25, (100, 1))

kernel = RBF(X)
likelihood = PoissonLikelihood()
cv = CrossValidation()

# initialization of the DKR model with the kernel, likelihood and cross-validation configuration
model = DKR(kernel, likelihood, cv)

# fit the kernel (hyper)parameter(s)
model.fit(X, Y)

# model inference for 50 points equally spaced from 0 to 1
Xq = torch.linspace(0, 1, 50).reshape(-1, 1)
Yq = model.predict(Xq)
```
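Continuing the example above: since DKR.predict returns a dictionary keyed by DKR.likelihood.param_names, the predicted parameters can be inspected per key. The actual key names depend on the chosen likelihood and are not assumed here:

```python
# iterate over the returned likelihood parameters; one entry per query point in Xq
for param_name, values in Yq.items():
    print(param_name, values.shape)
```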