
kernel hyperparameter learning #82

Open

Description

@martinjankowiak

hello,

i'm confused about how to make sure that all the hyperparameters of my kernel are being learned.

in particular i would like to use a matern 5/2 kernel with D+1 parameters (D length scales and one kernel scale). to my understanding it can be constructed as follows:

```julia
using KernelFunctions

kernel = transform(transform(Matern52Kernel(), ScaleTransform(1.0)), ARDTransform(ones(D)))
```
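
(for concreteness, if i understand the transform composition correctly, this kernel evaluates to

$$
k(x, x') = k_{\mathrm{Mat\,5/2}}\bigl(s\,(v \odot x),\; s\,(v \odot x')\bigr),
$$

where $s$ is the ScaleTransform parameter and $v \in \mathbb{R}^D$ is the ARDTransform vector, i.e. those are exactly the D+1 numbers i'd like to see change during training.)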

however, if i use `Flux.params(kernel)` to inspect the kernel hyperparameters after training, it seems that they haven't been updated.

note that i am using SVGP with the default optimiser, so i would expect the hyperparameters to be updated. is this the wrong way to inspect the hyperparameters? do i need to do anything else to specify that i want them to be updated?
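
for what it's worth, here is the kind of minimal check i've been trying outside of the SVGP loop, just to see whether gradients reach these parameters at all. the names (`build_kernel`, `log_scale`, `log_lengthscales`) and the stand-in loss are purely for illustration, not my actual objective, and i'm assuming Zygote as the AD backend since that's what Flux uses:

```julia
using KernelFunctions, Zygote

D = 2
θ = (log_scale = 0.0, log_lengthscales = zeros(D))  # unconstrained raw parameters

# rebuild the kernel from the raw parameters so the loss is a function of them
function build_kernel(θ)
    base = transform(Matern52Kernel(), ScaleTransform(exp(θ.log_scale)))
    return transform(base, ARDTransform(exp.(θ.log_lengthscales)))
end

x = ColVecs(randn(D, 5))  # 5 inputs of dimension D
y = ColVecs(randn(D, 4))  # a second set of inputs, so the cross kernel matrix avoids zero distances

# stand-in objective (not the ELBO), just to check that gradients flow to θ
loss(θ) = sum(kernelmatrix(build_kernel(θ), x, y))

g = only(Zygote.gradient(loss, θ))  # gradients w.r.t. log_scale and log_lengthscales
```

should the SVGP training path be doing something equivalent to this under the hood, or am i expected to manage the kernel parameters myself along these lines?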

thank you!
