
[WIP] Use scalar parameters with ParameterHandling #397

Open · wants to merge 15 commits into base: master
Conversation

devmotion (Member):

This PR implements #299 (comment). See the comment and the additional explanations in the issue for more details.

This PR is still WIP and tests will fail (I assume).
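
For context, a rough sketch of the intended workflow once kernels support ParameterHandling; the kernel construction is illustrative and the two-argument `flatten` method for kernels is what this PR adds, so treat this as an assumption rather than final API:

```julia
using KernelFunctions, ParameterHandling

# Illustrative kernel with two scalar parameters (variance and inverse lengthscale).
k = 2.0 * (SqExponentialKernel() ∘ ScaleTransform(0.5))

# Assuming this PR defines `flatten` for kernels: it returns a vector of
# unconstrained scalars plus a closure that rebuilds an equivalent kernel.
v, unflatten = ParameterHandling.flatten(Float64, k)
k2 = unflatten(v)  # reconstructed kernel with the same parameters
```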

Review threads (outdated, resolved): src/basekernels/constant.jl, src/basekernels/exponential.jl (×2), src/basekernels/polynomial.jl, src/basekernels/rational.jl, src/kernels/scaledkernel.jl, src/mokernels/independent.jl, src/test_utils.jl, src/transform/scaletransform.jl (×2)
@theogf (Member) left a comment:

Just did a first pass. Two general comments:

  • By using `only` we could get rid of a lot of the `length(v) == 1` checks.
  • In the composite kernels, instead of `vcat`ing each of the resulting flattened vectors, why not use `flatten` again to get the reconstruction for free? (See the sketch below.)
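
To make the second suggestion concrete, here is a hypothetical sketch for `ScaledKernel` that flattens a tuple of the components and lets ParameterHandling handle both the concatenation and the reconstruction; the field access (`k.kernel`, `k.σ²`), the use of `only`, and the constructor call are assumptions, not code from this PR:

```julia
using KernelFunctions, ParameterHandling

function ParameterHandling.flatten(::Type{T}, k::ScaledKernel) where {T<:Real}
    # Flatten the wrapped kernel and the variance together; `flatten` on a
    # tuple already returns the concatenated vector and its inverse, so no
    # manual `vcat` bookkeeping is needed. (`only` requires Julia >= 1.4,
    # see the discussion further down.)
    vec, unflatten = ParameterHandling.flatten(T, (k.kernel, only(k.σ²)))
    function unflatten_to_scaledkernel(v::Vector{T})
        kernel, σ² = unflatten(v)
        return ScaledKernel(kernel, σ²)
    end
    return vec, unflatten_to_scaledkernel
end
```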

Review threads (outdated, resolved): src/basekernels/constant.jl, src/basekernels/exponential.jl, src/basekernels/fbm.jl, src/basekernels/polynomial.jl

# or just `@noparams GibbsKernel` - it would be safer since there is no
# default fallback for `flatten`
function ParameterHandling.flatten(::Type{T}, k::GibbsKernel) where {T<:Real}
Member:

Is `flatten` on `Function` defined? If yes, I think it is fine to throw an error and tell the user to implement the needed method for the struct they use.
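
For illustration only, the error-throwing fallback floated here could look roughly like the following; whether a method on `Function` should exist at all is exactly the open question, so this is a sketch rather than code from the PR:

```julia
using ParameterHandling

# Sketch of the suggestion above: no silent fallback for functions, just an
# informative error pointing at the struct that needs its own `flatten` method.
function ParameterHandling.flatten(::Type{T}, f::Function) where {T<:Real}
    return throw(ArgumentError(
        "`flatten` is not defined for functions; implement " *
        "`ParameterHandling.flatten` for the kernel struct wrapping `$(f)` " *
        "(or declare that kernel parameter-free).",
    ))
end
```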


See also: [ParameterHandling.jl](https://github.com/invenia/ParameterHandling.jl)
"""
struct ParameterKernel{P,K} <: Kernel
Member:

`Parameter` or `Parametrized`?

devmotion (Member, Author):

I chose `Parameter` because I felt many kernels are parameterized, and hence `Paramet(e)rizedKernel` might be confusing.

Member:

Actually, didn't we mention another, more meaningful name like `KernelLayer` or something along those lines? Maybe `KernelModel`?

devmotion (Member, Author):

Yes, I think in my initial sketch I used `KernelLayer` (probably since we discussed its use with Flux at that point), and in the more concrete suggestion I used `KernelModel`. I thought `ParameterKernel` was a bit better, but I'm not sold on the name and will change it to whatever we think is best.

Review threads (resolved): src/kernels/parameterkernel.jl, src/kernels/scaledkernel.jl, src/transform/ardtransform.jl (outdated), src/transform/scaletransform.jl (outdated)
@devmotion (Member, Author):

> By using `only` we could get rid of a lot of the `length(v) == 1` checks.

I know, and initially I used it in some places 🙂 However, it requires Julia >= 1.4 and hence does not work with Julia 1.3.
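
For reference, the two spellings being weighed here (nothing specific to this PR):

```julia
v = [1.5]

# Julia >= 1.4: `only` throws unless `v` has exactly one element.
x = only(v)

# Equivalent pattern that still works on Julia 1.3:
length(v) == 1 || throw(ArgumentError("expected exactly one element"))
x = first(v)
```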

@theogf (Member) commented Nov 3, 2021:

Since this is going to be a (massively) breaking release, maybe we can drop support for Julia < 1.6.

@devmotion (Member, Author):

I would be fine with this, but I didn't want to make this change without prior discussion. Also, I assume Julia 1.6 might be the LTS by the time this PR is ready and merged 🙂

Review threads (outdated, resolved): src/basekernels/constant.jl, src/basekernels/exponential.jl (×2), src/basekernels/fbm.jl (×2), src/basekernels/rational.jl (×2)

Review threads (outdated, resolved): src/transform/ardtransform.jl, src/transform/scaletransform.jl, src/basekernels/matern.jl, src/utils.jl (×2)
devmotion and others added 2 commits November 11, 2021 23:09
Co-authored-by: Théo Galy-Fajou <theo.galyfajou@gmail.com>
@codecov (bot) commented Nov 11, 2021:

Codecov Report

Attention: 361 lines in your changes are missing coverage. Please review.

| Files | Coverage Δ |
| --- | --- |
| src/KernelFunctions.jl | 100.00% <ø> (ø) |
| src/basekernels/cosine.jl | 0.00% <ø> (-100.00%) ⬇️ |
| src/basekernels/exponentiated.jl | 0.00% <ø> (-80.00%) ⬇️ |
| src/basekernels/nn.jl | 0.00% <ø> (-98.00%) ⬇️ |
| src/basekernels/piecewisepolynomial.jl | 0.00% <ø> (-89.48%) ⬇️ |
| src/basekernels/wiener.jl | 0.00% <ø> (-92.86%) ⬇️ |
| src/transform/transform.jl | 0.00% <ø> (-54.55%) ⬇️ |
| src/TestUtils.jl | 0.00% <0.00%> (-93.25%) ⬇️ |
| src/transform/ardtransform.jl | 0.00% <0.00%> (-26.67%) ⬇️ |
| src/utils.jl | 4.59% <40.00%> (-87.07%) ⬇️ |
| ... and 23 more | |

... and 21 files with indirect coverage changes


Labels: none yet · Projects: none yet · 3 participants