Replace Flux with Lux in deep kernel learning example (#435)

* Initial plan
* Replace Flux with Lux in deep kernel learning example
* Improve Lux implementation with proper parameter handling
* Apply suggestions from code review
* Fix Literate.jl parsing issue in deep kernel learning example
* Clean up comments in script.jl
* docs: use more of Lux official API for training and inference (#438)
* Update examples/2-deep-kernel-learning/script.jl

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: yebai <3279477+yebai@users.noreply.github.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: Avik Pal <avik.pal.2017@gmail.com>
examples/2-deep-kernel-learning/script.jl (50 additions & 28 deletions)
@@ -1,10 +1,10 @@
-# # Deep Kernel Learning with Flux
+# # Deep Kernel Learning with Lux
 
 ## Background
 
 # This example trains a GP whose inputs are passed through a neural network.
 # This kind of model has been considered previously [^Calandra] [^Wilson], although it has been shown that some care is needed to avoid substantial overfitting [^Ober].
-# In this example we make use of the `FunctionTransform` from [KernelFunctions.jl](github.com/JuliaGaussianProcesses/KernelFunctions.jl/) to put a simple Multi-Layer Perceptron built using Flux.jl inside a standard kernel.
+# In this example we make use of the `FunctionTransform` from [KernelFunctions.jl](github.com/JuliaGaussianProcesses/KernelFunctions.jl/) to put a simple Multi-Layer Perceptron built using Lux.jl inside a standard kernel.
 
 # [^Calandra]: Calandra, R., Peters, J., Rasmussen, C. E., & Deisenroth, M. P. (2016, July). [Manifold Gaussian processes for regression.](https://ieeexplore.ieee.org/abstract/document/7727626) In 2016 International Joint Conference on Neural Networks (IJCNN) (pp. 3338-3345). IEEE.
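To make the `FunctionTransform` line above concrete, here is a minimal sketch of the pattern rather than the commit's exact code: the Lux MLP acts as a learned feature map that every input passes through before the base kernel compares them. The `mlp` name, the 1 → 5 → 2 architecture, and the `neural_net` closure are illustrative assumptions.

using KernelFunctions, Lux, Random

rng = Random.default_rng()

# Hypothetical architecture: map 1-D inputs into a 2-D feature space.
mlp = Chain(Dense(1 => 5, tanh), Dense(5 => 2))
ps, st = Lux.setup(rng, mlp)  # Lux keeps parameters and state outside the model

# A Lux model returns (output, state); keep only the output.
neural_net(v) = first(mlp(v, ps, st))

# k(x, y) evaluates the base kernel on neural_net(x) and neural_net(y).
k = SEKernel() ∘ FunctionTransform(neural_net)

x = collect(eachrow(rand(rng, 10)))  # ten 1-element input vectors, as in the example
K = kernelmatrix(k, x)               # 10×10 Gram matrix in the learned feature space

Because the network's output, not the raw input, is what the kernel sees, optimising the network parameters reshapes the kernel itself.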
@@ -17,35 +17,46 @@
 # the different hyper-parameters
 using AbstractGPs
 using Distributions
-using Flux
 using KernelFunctions
 using LinearAlgebra
+using Lux
+using Optimisers
 using Plots
+using Random
+using Zygote
 default(; legendfontsize=15.0, linewidth=3.0);
 
+Random.seed!(42) # for reproducibility
+
 # ## Data creation
 # We create a simple 1D Problem with very different variations
 
 xmin, xmax = (-3, 3) # Limits
 N = 150
 noise_std = 0.01
 x_train_vec = rand(Uniform(xmin, xmax), N) # Training dataset
-x_train = collect(eachrow(x_train_vec)) # vector-of-vectors for Flux compatibility
+x_train = collect(eachrow(x_train_vec)) # vector-of-vectors for neural network compatibility
 target_f(x) = sinc(abs(x)^abs(x)) # We use sinc with a highly varying value
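The new imports (Lux, Optimisers, Random, Zygote) reflect the explicit-parameter design the commit message refers to: in Lux, parameters and state live outside the model and flow through an optimiser state rather than being mutated implicitly as in Flux. Below is a hedged sketch of how those pieces could fit together with Lux's Training API; the data stand-ins, the architecture, the Adam(0.01) optimiser, and the 100-step loop are illustrative assumptions, not values taken from the commit.

using AbstractGPs, KernelFunctions, Lux, Optimisers, Random, Zygote

rng = Random.default_rng()
Random.seed!(rng, 42)

# Illustrative stand-ins for the example's training data.
x_train = collect(eachrow(rand(rng, 50)))  # vector-of-vectors inputs
y_train = rand(rng, 50)
noise_std = 0.01

mlp = Chain(Dense(1 => 5, tanh), Dense(5 => 2))
ps, st = Lux.setup(rng, mlp)  # explicit parameters and state

# Objective in the shape Lux's Training API expects:
# (model, ps, st, data) -> (loss, st, stats).
function loss_fn(model, ps, st, (x, y))
    nn(v) = first(model(v, ps, st))      # rebuild the feature map around the current ps
    k = SEKernel() ∘ FunctionTransform(nn)
    fx = GP(k)(x, noise_std^2)           # finite GP at the training inputs
    return -logpdf(fx, y), st, (;)       # negative log marginal likelihood
end

# AutoZygote() selects the AD backend (an ADTypes selector re-exported by recent Lux).
function train(tstate, data; epochs=100)
    for _ in 1:epochs
        _, _, _, tstate = Training.single_train_step!(AutoZygote(), loss_fn, data, tstate)
    end
    return tstate
end

tstate = Training.TrainState(mlp, ps, st, Adam(0.01))
tstate = train(tstate, (x_train, y_train))

The loop is wrapped in a function so the updated training state threads through cleanly; the actual script may organise its training differently, but the explicit ps/st flow shown here is the core of the Flux-to-Lux migration.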