Hi All,

I'm trying to train a GPLVM with a FixedNoiseGaussianLikelihood, following the example here:
https://docs.gpytorch.ai/en/stable/examples/045_GPLVM/Gaussian_Process_Latent_Variable_Models_with_Stochastic_Variational_Inference.html

I have a set of variances var_Y, one for each element of my training set Y. My setup looks like this, where I've commented out the lines from the original example that I've modified:
```python
import numpy as np
import torch
from gpytorch.likelihoods import FixedNoiseGaussianLikelihood
from gpytorch.mlls import VariationalELBO
from tqdm import trange

# Model (bGPLVM, N, data_dim, latent_dim, n_inducing, pca, Y, var_Y and
# smoke_test are all defined as in the linked tutorial)
model = bGPLVM(N, data_dim, latent_dim, n_inducing, pca=pca)

# Likelihood
# likelihood = GaussianLikelihood(batch_shape=model.batch_shape)
likelihood = FixedNoiseGaussianLikelihood(
    batch_shape=model.batch_shape,
    noise=torch.ones(model.batch_shape),
    learn_additional_noise=False,
)

# Declaring the objective to be optimised along with the optimiser
# (see models/latent_variable.py for how the additional loss terms are accounted for)
mll = VariationalELBO(likelihood, model, num_data=len(Y))
optimizer = torch.optim.Adam(
    [{'params': model.parameters()}, {'params': likelihood.parameters()}], lr=0.01
)

# Training loop - optimises the objective wrt kernel hypers, variational params
# and inducing inputs using the optimizer provided.
loss_list = []
iterator = trange(10000 if not smoke_test else 4, leave=True)
batch_size = 100
for i in iterator:
    batch_index = model._get_batch_idx(batch_size)
    optimizer.zero_grad()
    sample = model.sample_latent_variable()  # a full sample returns latent x across all N
    sample_batch = sample[batch_index]
    output_batch = model(sample_batch)
    # per-batch observation noise passed through the ELBO call
    loss = -mll(output_batch, Y[batch_index].T, noise=var_Y[batch_index].T).sum()
    # loss = -mll(output_batch, Y[batch_index].T).sum()
    loss_list.append(loss.item())
    iterator.set_description('Loss: ' + str(float(np.round(loss.item(), 2))) + ", iter no: " + str(i))
    loss.backward()
    optimizer.step()
```
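For what it's worth, I also considered updating the fixed noise on the likelihood in place each iteration instead of forwarding it through the mll call. An untested variant of the relevant loop lines:

```python
# Untested alternative: overwrite the likelihood's fixed noise each batch
# rather than passing it as a kwarg through the mll call.
likelihood.noise = var_Y[batch_index].T
loss = -mll(output_batch, Y[batch_index].T).sum()
```

I'm not sure which of the two is preferred, if either works at all.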
Can you please let me know if I'm on the right track? Also, if I wanted to handle missing data, would the best approach be to follow the example of GaussianLikelihoodWithMissingObs in https://github.com/cornellius-gp/gpytorch/blob/master/gpytorch/likelihoods/gaussian_likelihood.py, but subclass FixedNoiseGaussianLikelihood in place of GaussianLikelihood?
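Here's an untested sketch of what I have in mind; the class name is mine, and the NaN masking just mirrors what GaussianLikelihoodWithMissingObs does (I haven't checked shapes or batch dimensions):

```python
import torch
from gpytorch.likelihoods import FixedNoiseGaussianLikelihood


class FixedNoiseGaussianLikelihoodWithMissingObs(FixedNoiseGaussianLikelihood):
    """Hypothetical subclass (untested): zero out the contribution of NaN
    targets, mirroring gpytorch's GaussianLikelihoodWithMissingObs."""

    MISSING_VALUE_FILL = -999.0  # arbitrary finite fill for masked entries

    def _get_masked_obs(self, x):
        missing_idx = x.isnan()
        x_masked = x.masked_fill(missing_idx, self.MISSING_VALUE_FILL)
        return missing_idx, x_masked

    def expected_log_prob(self, target, input, *params, **kwargs):
        missing_idx, target = self._get_masked_obs(target)
        res = super().expected_log_prob(target, input, *params, **kwargs)
        return res * ~missing_idx  # drop masked entries from the ELBO

    def log_marginal(self, observations, function_dist, *params, **kwargs):
        missing_idx, observations = self._get_masked_obs(observations)
        res = super().log_marginal(observations, function_dist, *params, **kwargs)
        return res * ~missing_idx
```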
Many Thanks!!