Start the SciML showcase #92
Conversation
This is the start of the SciML showcase, which was mentioned in #73 (comment). Thus this PR supersedes #73.

What is the SciML Showcase? It's a place for cool demonstrations. Tutorials have to be simple and teaching-focused, so it's hard for them to really dive into the fun stuff. But the "why SciML?" is the fun stuff, so that needs to be front and center somehow. Enter the showcase.

By being separated from the "getting started" section, it's very clear (and has a note) that this is not for getting started. It's saying: hey, you might not understand this code at first glance, but this is to show you all of the cool stuff you'll learn around here! And that's its main purpose: to show off some cool stuff.

Thus, to kick off the showcase, I wanted to revive some of the core examples from the UDE paper and the Bayesian UDE paper, given that those examples seem to be some of the biggest awe drivers to the ecosystem.

That said, the showcase also serves another purpose: making sure that our best examples stay reproducible! Thus the showcase is made for the examples to be run with `@example` in strict mode, meaning that errors cause failures in the documentation build. I intend for this to be added as a downstream test to all of the major SciML packages featured in the showcase, as this will ensure that any breakage to our top examples is remedied ASAP. This will make our "core" examples much more robust, making it easier for people to share them.

For now I started with Raj's BNODE code and Scenario 1 of the UDE. That UDE code has gone through a few iterations, so I'm not sure it will "just work" when copy-pasted here. I may want some help from @AlCap23 and @Abhishek-1Bhatt to get it up to snuff, and from @RajDandekar and @Vaibhavdixit02 for the BNODE. I may merge this early just so that others can more easily poke away at it as well: this is more than a one-person effort, and this is just to get things started.

I have plans for more things in the showcase.
I think the MethodOfLines Brusselator example is a great one to show, @xtalax. We should probably put a GPU physics-informed neural network up there too. And maybe something cool from ModelingToolkit using the standard library? @YingboMa

Those will be follow-ups in issues, but note that this overview is getting the most hits in the docs, which means the showcase will be front and center. We will want to put our best foot forward right here.
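For concreteness, the strict-mode setup described above can be sketched with Documenter.jl's `makedocs` call. This is a minimal, hypothetical `docs/make.jl` fragment: the site name and page path are placeholders, not the actual build script from this PR.

```julia
# Hypothetical docs/make.jl sketch. "SciMLShowcase" and the page path are
# placeholder names, not taken from this repository.
using Documenter

makedocs(
    sitename = "SciMLShowcase",
    pages = ["Showcase" => "showcase/ude.md"],
    # strict = true makes Documenter treat errors raised while running
    # ```@example``` blocks as build failures, so a broken showcase
    # example fails CI instead of silently shipping a broken page.
    strict = true,
)
```

In the markdown page itself, each showcase snippet would sit inside an ```` ```@example ```` fenced block so that Documenter executes it on every docs build.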
Hmm, it seems hard to get that example onto Lux. @Vaibhavdixit02 @RajDandekar @avik-pal, have you tried using AdvancedHMC with Lux before? I don't know if all of the functions support component arrays.
@AlCap23 did you forget to interpolate something in the DataDrivenSparse codegen?

julia> full_res = solve(full_problem, basis, opt, maxiter = 10000, progress = true)
ERROR: UndefVarError: u not defined
Stacktrace:
[1] macro expansion
@ C:\Users\accou\.julia\packages\SymbolicUtils\qulQp\src\code.jl:444 [inlined]
[2] macro expansion
@ C:\Users\accou\.julia\packages\RuntimeGeneratedFunctions\6v5Gn\src\RuntimeGeneratedFunctions.jl:137 [inlined]
[3] macro expansion
@ .\none:0 [inlined]
[4] generated_callfunc
@ .\none:0 [inlined]
[5] (::RuntimeGeneratedFunctions.RuntimeGeneratedFunction{(:ˍ₋arg1, :ˍ₋arg2, :ˍ₋arg3, :ˍ₋arg4, :ˍ₋arg5), Symbolics.var"#_RGF_ModTag", Symbolics.var"#_RGF_ModTag", (0x20e8270c, 0x1a53fee2, 0x3908bd32, 0x2d5daea8, 0xd0cfefa2)})(::SubArray{Float64, 1, Matrix{Float64}, Tuple{Base.Slice{Base.OneTo{Int64}}, Int64}, true}, ::SubArray{Float64, 1, Matrix{Float64}, Tuple{Base.Slice{Base.OneTo{Int64}}, Int64}, true}, ::SubArray{Float64, 1, Vector{Float64}, Tuple{Base.Slice{Base.OneTo{Int64}}}, true}, ::SubArray{Float64, 0, Vector{Float64}, Tuple{Int64}, true}, ::Matrix{Float64})
@ RuntimeGeneratedFunctions C:\Users\accou\.julia\packages\RuntimeGeneratedFunctions\6v5Gn\src\RuntimeGeneratedFunctions.jl:124
[6] _apply_function(f::DataDrivenDiffEq.DataDrivenFunction{false, false, RuntimeGeneratedFunctions.RuntimeGeneratedFunction{(:ˍ₋arg1, :ˍ₋arg2, :ˍ₋arg3, :ˍ₋arg4, :ˍ₋arg5), Symbolics.var"#_RGF_ModTag", Symbolics.var"#_RGF_ModTag", (0x20e8270c, 0x1a53fee2, 0x3908bd32, 0x2d5daea8, 0xd0cfefa2)}, RuntimeGeneratedFunctions.RuntimeGeneratedFunction{(:ˍ₋out, :ˍ₋arg1, :ˍ₋arg2, :ˍ₋arg3, :ˍ₋arg4, :ˍ₋arg5), Symbolics.var"#_RGF_ModTag", Symbolics.var"#_RGF_ModTag", (0x9775d9a4, 0x72053c2f, 0x217229a6, 0xa4a5d22b, 0x9b018c42)}}, du::SubArray{Float64, 1, Matrix{Float64}, Tuple{Base.Slice{Base.OneTo{Int64}}, Int64}, true}, u::SubArray{Float64, 1, Matrix{Float64}, Tuple{Base.Slice{Base.OneTo{Int64}}, Int64}, true}, p::SubArray{Float64, 1, Vector{Float64}, Tuple{Base.Slice{Base.OneTo{Int64}}}, true}, t::SubArray{Float64, 0, Vector{Float64}, Tuple{Int64}, true}, c::Matrix{Float64})
@ DataDrivenDiffEq C:\Users\accou\.julia\packages\DataDrivenDiffEq\yx4Ta\src\basis\build_function.jl:32
[7] (::DataDrivenDiffEq.var"#1#2"{DataDrivenDiffEq.DataDrivenFunction{false, false, RuntimeGeneratedFunctions.RuntimeGeneratedFunction{(:ˍ₋arg1, :ˍ₋arg2, :ˍ₋arg3, :ˍ₋arg4, :ˍ₋arg5), Symbolics.var"#_RGF_ModTag", Symbolics.var"#_RGF_ModTag", (0x20e8270c, 0x1a53fee2, 0x3908bd32, 0x2d5daea8, 0xd0cfefa2)}, RuntimeGeneratedFunctions.RuntimeGeneratedFunction{(:ˍ₋out, :ˍ₋arg1, :ˍ₋arg2, :ˍ₋arg3, :ˍ₋arg4, :ˍ₋arg5), Symbolics.var"#_RGF_ModTag", Symbolics.var"#_RGF_ModTag", (0x9775d9a4, 0x72053c2f, 0x217229a6, 0xa4a5d22b, 0x9b018c42)}}, Matrix{Float64}, Matrix{Float64}, Vector{Float64}, Vector{Float64}, Matrix{Float64}})(i::Int64)
@ DataDrivenDiffEq C:\Users\accou\.julia\packages\DataDrivenDiffEq\yx4Ta\src\basis\build_function.jl:150
[8] iterate
@ .\generator.jl:47 [inlined]
[9] _collect(c::Base.OneTo{Int64}, itr::Base.Generator{Base.OneTo{Int64}, DataDrivenDiffEq.var"#1#2"{DataDrivenDiffEq.DataDrivenFunction{false, false, RuntimeGeneratedFunctions.RuntimeGeneratedFunction{(:ˍ₋arg1, :ˍ₋arg2, :ˍ₋arg3, :ˍ₋arg4, :ˍ₋arg5), Symbolics.var"#_RGF_ModTag", Symbolics.var"#_RGF_ModTag", (0x20e8270c, 0x1a53fee2, 0x3908bd32, 0x2d5daea8, 0xd0cfefa2)}, RuntimeGeneratedFunctions.RuntimeGeneratedFunction{(:ˍ₋out, :ˍ₋arg1, :ˍ₋arg2, :ˍ₋arg3, :ˍ₋arg4, :ˍ₋arg5), Symbolics.var"#_RGF_ModTag", Symbolics.var"#_RGF_ModTag", (0x9775d9a4, 0x72053c2f, 0x217229a6, 0xa4a5d22b, 0x9b018c42)}}, Matrix{Float64}, Matrix{Float64}, Vector{Float64}, Vector{Float64}, Matrix{Float64}}}, #unused#::Base.EltypeUnknown, isz::Base.HasShape{1})
@ Base .\array.jl:807
[10] collect_similar
@ .\array.jl:716 [inlined]
[11] map
@ .\abstractarray.jl:2933 [inlined]
[12] _apply_vec_function(f::DataDrivenDiffEq.DataDrivenFunction{false, false, RuntimeGeneratedFunctions.RuntimeGeneratedFunction{(:ˍ₋arg1, :ˍ₋arg2, :ˍ₋arg3, :ˍ₋arg4, :ˍ₋arg5), Symbolics.var"#_RGF_ModTag", Symbolics.var"#_RGF_ModTag", (0x20e8270c, 0x1a53fee2, 0x3908bd32, 0x2d5daea8, 0xd0cfefa2)}, RuntimeGeneratedFunctions.RuntimeGeneratedFunction{(:ˍ₋out, :ˍ₋arg1, :ˍ₋arg2, :ˍ₋arg3, :ˍ₋arg4, :ˍ₋arg5), Symbolics.var"#_RGF_ModTag", Symbolics.var"#_RGF_ModTag", (0x9775d9a4, 0x72053c2f, 0x217229a6, 0xa4a5d22b, 0x9b018c42)}}, du::Matrix{Float64}, u::Matrix{Float64}, p::Vector{Float64}, t::Vector{Float64}, c::Matrix{Float64})
@ DataDrivenDiffEq C:\Users\accou\.julia\packages\DataDrivenDiffEq\yx4Ta\src\basis\build_function.jl:148
[13] (::Basis{false, false})(p::DataDrivenProblem{Float64, false, DataDrivenDiffEq.Continuous})
@ DataDrivenDiffEq C:\Users\accou\.julia\packages\DataDrivenDiffEq\yx4Ta\src\problem\type.jl:337
[14] get_fit_targets
@ C:\Users\accou\.julia\packages\DataDrivenDiffEq\yx4Ta\src\utils\common_options.jl:157 [inlined]
[15] init(prob::DataDrivenProblem{Float64, false, DataDrivenDiffEq.Continuous}, basis::Basis{false, false}, alg::STLSQ{Vector{Float64}, Float64}; options::DataDrivenCommonOptions{Float64, NamedTuple{(), Tuple{}}}, kwargs::Base.Pairs{Symbol, Integer, Tuple{Symbol, Symbol}, NamedTuple{(:maxiter, :progress), Tuple{Int64, Bool}}})
@ DataDrivenDiffEq C:\Users\accou\.julia\packages\DataDrivenDiffEq\yx4Ta\src\utils\common_options.jl:175
[16] solve(::DataDrivenProblem{Float64, false, DataDrivenDiffEq.Continuous}, ::Vararg{Any}; kwargs::Base.Pairs{Symbol, Integer, Tuple{Symbol, Symbol}, NamedTuple{(:maxiter, :progress), Tuple{Int64, Bool}}})
@ CommonSolve C:\Users\accou\.julia\packages\CommonSolve\u9cNO\src\CommonSolve.jl:23
[17] top-level scope
@ REPL[166]:1
Nope. I'll have a look later tonight. Probably this is due to a missing `collect`.
Leaving that un-exampled for now, @AlCap23. The Bayesian example is fully complete. Merging with that.