Long import times for Chains during testing #171
Have you tried to downgrade MCMCChains? It would be interesting to see if one can figure out whether it is caused by a specific combination of packages and whether a specific commit caused a performance regression. |
Good point. I was able to reproduce that pattern of results with MCMCChains 0.4.1. MCMCChains 0.3.4 had the same pattern of results but ran at 51 seconds with Requires 1.0, instead of 130ish. I could not run earlier versions without errors. |
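For reference, a rough sketch of how one might install a specific older version to bisect such a regression (the shared environment name below is just illustrative, and the versions are the ones discussed here):

using Pkg
Pkg.activate("mcmcchains-bisect"; shared=true)             # named scratch environment
Pkg.add(PackageSpec(name="MCMCChains", version="0.3.4"))   # or "0.4.1", etc.
@time using MCMCChains                                     # compare load times across versions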
Could you post the full list of packages (output of ] st --manifest)? |
For MCMCChains 0.3.4:
For MCMCChains 0.4.1:
|
Are you sure those are the packages for both runs? It says |
Sorry about the confusion. It looks like I repasted the first list because I did not copy correctly. I updated the list above. |
Thanks! Just had a quick look at the lists, but I'm quite surprised that version 0.4.1 seems to use older versions of the dependencies such as AxisArrays and Distributions. That seems weird?! |
And the list of dependencies seems to have grown quite significantly... |
Maybe that's mainly since DataFrames (and KernelDensity) were added as dependencies between 0.3.4 and 0.4.1: |
Sometimes the versions depend on the order in which they are installed and the requirements of the other dependencies. That is why I installed HangTest within a new environment. So it is not uncommon to see dependency downgrades. |
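A sketch of that workflow, for concreteness (using a temporary directory is just one way to get a completely empty Project/Manifest):

using Pkg
Pkg.activate(mktempdir())               # nothing previously installed can constrain the resolver
Pkg.add("MCMCChains")
Pkg.status(mode=Pkg.PKGMODE_MANIFEST)   # inspect the versions the resolver actually picked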
I just noticed that Requires is not a direct requirement of MCMCChains - do you happen to know which of the dependencies pulls it in? I'm wondering if that indicates that it's not (only) an MCMCChains problem but (also) a problem of one of its dependencies, in particular when thinking about the different timings with Requires 0.5 and 1.0. |
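One way to check which package brings Requires in (a sketch; Pkg.dependencies needs a reasonably recent Pkg, and on older versions searching the Manifest.toml by hand works just as well):

using Pkg
for (uuid, info) in Pkg.dependencies()
    if haskey(info.dependencies, "Requires")
        println(info.name, " has Requires as a direct dependency")
    end
end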
This has been around since Julia 1.3. Over the last 2 months I've spent almost all my cycles trying to figure out what is happening and came to the conclusion that it is a combination of figuring out package versions and many dependencies. Why it suddenly degraded in Julia 1.3 is unclear to me. During the last 2 months I removed the fixed dependency on MCMCChains from CmdStan.jl and Stan.jl and replaced it by using Requires.jl. This solved the immediate problem (very long compile times, Travis timeouts and docs on Travis no longer building and even what I once erroneously called |
By the way, over the last week I tested quite a bit with DataFrames.jl and have never seen issues with that package. |
@devmotion, I'm not sure if there are dependencies in MCMCChains that currently have restrictions on Requires. However, a more general problem is that other packages that people use will probably have constraints. @goedman, I have been dealing with this issue for a while also. I just haven't had the time to pinpoint what was causing it. It took me a while to narrow down the problem to MCMCChains. I was forced to investigate the issue more thoroughly when it caused the MCMC sampler I coded for AdvancedPS to time out on Travis. I suppose some people might be ignoring the issue until it becomes an impasse. |
To test MCMCChains, I typically activate MCMCChains as a project but that didn't give a lot of new insights:
Takes a couple of minutes on my machine. Part of the issue is that MCMCChains depends on Plots.jl indirectly (via RecipesBase), which typically causes another long delay the first time a plot is needed (this is definitely being addressed, I believe). |
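For anyone who wants to reproduce that measurement, a rough sketch (the checkout path here is hypothetical):

using Pkg
Pkg.activate("path/to/MCMCChains")   # activate the package's own Project.toml
Pkg.instantiate()                    # install its dependencies
@time using MCMCChains               # rough measure of the import cost in a fresh session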
Someone on Slack recommended this post. MCMCChains shows up towards the end of the thread. I threw in some comments -- maybe Tim has some ideas. The weirdest thing to me is that the issue only happens at module scope. |
Possibly related, though more about REPL behavior: |
Good links, Cameron. To me it looks like the root cause of the problem is still unclear, and thus there are likely several factors in play. After activating HangTest, I am very disappointed that I can no longer always use MCMCChains. It worked great in CmdStan 5.x and Julia < 1.3. I wonder if there would be interest, if possible, in refactoring MCMCChains? |
Yes, there is tremendous interest, at least from me. I would basically tear the whole thing apart and start from ground zero, with some things kept in spirit or modified heavily. The diagnostic stuff should be generalized better and perhaps reduced to core functionality.

Unfortunately my initial refactoring many months ago was probably not destined to be the high-quality redesign I wanted. When I did the initial refactoring of MCMCChains, there were two factors that kept it from being as good as it could have been. First, I am a much better developer now than I was at the time -- if I sat down with it again, there would probably be some significant improvements across the board. I would drastically reduce the dependency list, cut down on unnecessary functions, and generally slim everything down. MCMCChains is riddled with poor choices. Second, there's a lot of legacy stuff in there that's left over from Klara (!) that really should have just been built from the ground up with a cohesive focus on performance. I was trying really hard to shoehorn a new thing into an old thing without thinking about how to adapt the old thing, and I think that has really started to show.

I should note that with the general methods in
We've been kicking around the idea of a |
Thanks Cameron, it looks like you and David are tackling the problem the right way, pretty much starting from scratch with what you have learned. I'll take a look at AbstractMCMC. I've actually been playing with/thinking about another possible idea, derived from MonteCarloMeasurements, of being able to construct Particles (= sample chain) + Distribution + Distribution parameters (as identified by MCMC). But over the last 2 months I clearly had other issues to look into. I feel a lot better now that it has the right visibility. |
Seems the PR is actually fixing (parts of) the issue: https://discourse.julialang.org/t/slowdown-after-package-load/33533/13 |
Just wanted to toss in here that NestedSamplers.jl completely hangs when trying to use it or test it. I believe it is because of its dependence on |
The only thing that I found to mitigate the problem in the short term is to restrict Requires to v0.5.2. |
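In case it helps, a sketch of that workaround in the Pkg API (it only resolves if none of your other dependencies require Requires 1.0):

using Pkg
Pkg.add(PackageSpec(name="Requires", version="0.5.2"))   # make Requires an explicit dependency at 0.5.2
Pkg.pin("Requires")                                       # keep the resolver from upgrading it again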
@itsdfish I am not using Requires at all, but still encounter this issue. |
Do you know if any of your dependencies are using Requires? You can check with st --manifest in the Pkg REPL. |
Yes, I'll try manually overriding it.
(NestedSamplers) pkg> st --manifest
Project NestedSamplers v0.1.0
Status `~/dev/julia/NestedSamplers/Manifest.toml`
[621f4979] AbstractFFTs v0.5.0
[80f14c24] AbstractMCMC v0.3.0
[7d9fca2a] Arpack v0.4.0
[68821587] Arpack_jll v3.5.0+2
[4fba245c] ArrayInterface v2.4.1
[13072b0f] AxisAlgorithms v1.0.0
[39de3d68] AxisArrays v0.4.2
[b99e7846] BinaryProvider v0.5.8
[324d7699] CategoricalArrays v0.7.7
[aaaa29a8] Clustering v0.13.3
[bbf7d656] CommonSubexpressions v0.2.0
[34da2185] Compat v3.4.0
[9a962f9c] DataAPI v1.1.0
[a93c6f00] DataFrames v0.20.0
[864edb3b] DataStructures v0.17.9
[e2d170a0] DataValueInterfaces v1.0.0
[163ba53b] DiffResults v1.0.2
[b552c78f] DiffRules v1.0.0
[b4f34e82] Distances v0.8.2
[31c24e10] Distributions v0.22.4
[da5c29d0] EllipsisNotation v0.4.0
[7a1cc6ca] FFTW v1.2.0
[f5851436] FFTW_jll v3.3.9+3
[1a297f60] FillArrays v0.8.4
[6a86dc24] FiniteDiff v2.2.0
[f6369f11] ForwardDiff v0.10.9
[1d5cc7b8] IntelOpenMP_jll v2018.0.3+0
[a98d9a8b] Interpolations v0.12.5
[8197267c] IntervalSets v0.4.0
[41ab1584] InvertedIndices v1.0.0
[c8e1da08] IterTools v1.3.0
[82899510] IteratorInterfaceExtensions v1.0.0
[682c06a0] JSON v0.21.0
[5ab0869b] KernelDensity v0.5.1
[d3d80556] LineSearches v7.0.1
[c7f686f2] MCMCChains v1.0.2
[856f044c] MKL_jll v2019.0.117+2
[e1d29d7a] Missings v0.4.3
[d41bc354] NLSolversBase v7.6.1
[77ba4419] NaNMath v0.3.3
[b8a86587] NearestNeighbors v0.4.4
[6fe1bfb0] OffsetArrays v1.0.2
[4536629a] OpenBLAS_jll v0.3.7+5
[efe28fd5] OpenSpecFun_jll v0.5.3+1
[429524aa] Optim v0.20.1
[bac558e1] OrderedCollections v1.1.0
[90014a1f] PDMats v0.9.11
[d96e819e] Parameters v0.12.0
[69de0a69] Parsers v0.3.11
[2dfb63ee] PooledArrays v0.5.3
[85a6dd25] PositiveFactorizations v0.2.3
[92933f4c] ProgressMeter v1.2.0
[1fd47b50] QuadGK v2.3.1
[b3c3ace0] RangeArrays v0.3.2
[c84ed2f1] Ratios v0.4.0
[3cdcf5f2] RecipesBase v0.7.0
[189a3867] Reexport v0.2.0
[ae029012] Requires v1.0.1
[79098fc4] Rmath v0.6.0
[992d4aef] Showoff v0.3.1
[a2af1166] SortingAlgorithms v0.3.1
[276daf66] SpecialFunctions v0.9.0
[90137ffa] StaticArrays v0.12.1
[2913bbd2] StatsBase v0.32.0
[4c63d2b9] StatsFuns v0.9.3
[3783bdb8] TableTraits v1.0.0
[bd369af6] Tables v0.2.11
[efce3f68] WoodburyMatrices v0.5.0
[2a0f44e3] Base64
[ade2ca70] Dates
[8bb1440f] DelimitedFiles
[8ba89e20] Distributed
[9fa8497b] Future
[b77e0a4c] InteractiveUtils
[76f85450] LibGit2
[8f399da3] Libdl
[37e2e46d] LinearAlgebra
[56ddb016] Logging
[d6f4376e] Markdown
[a63ad114] Mmap
[44cfe95a] Pkg
[de0858da] Printf
[3fa0cd96] REPL
[9a3f8284] Random
[ea8e919c] SHA
[9e88b42a] Serialization
[1a1011a3] SharedArrays
[6462fe0b] Sockets
[2f01184e] SparseArrays
[10745b16] Statistics
[4607b0f0] SuiteSparse
[8dfed614] Test
[cf7118a7] UUIDs
[4ec0a83e] Unicode |
If you are lucky, you can simply run pin Requires@0.5.2 in the Pkg REPL. |
Yep, pinning Requires to 0.5.2 solved the problems I was having! Thanks! |
Hi Chris, does the pinning solve it, or is it the fact that precompilation and final compilation work had already been completed earlier? I see similar issues with KernelDensity alone (which in turn pulls in Optim), but only when I build a new version of J1.5-DEV. On 1.3.1 and 1.4-RC1 the delays are far less pronounced after the very first time. |
As far as I can tell, pinning Requires v0.5.2 seems to solve it so long as your dependencies do not need Requires v1.0. My initial post shows the time differences on my local machine. There is a similar pattern on Travis, except MCMCChains with Requires v1.0 takes even longer, often causing a time out. |
Is this still an issue with MCMCChains 3.0? At least for me the import times are drastically reduced (although it surely can be improved further). |
I just ran all StanModels tests (after rebuilding J1.5, so all packages will be recompiled) and no noticeable delay! |
Awesome! Let's close this issue for now, but I think we've still got an eye on performance and load times moving forward. |
I am opening this issue to document long import times for Chains. In some cases, it can take upwards of an hour locally, causing timeouts on Travis.
In the following MWE with fewer dependencies, the tests complete faster, but still take a very long time.
On my local machine, I have found that calling
import MCMCChains: Chains
slows down testing significantly, and the slowdown is particularly pronounced with Requires 1.0 on the first run. This finding is particularly important for testing on Travis because it uses a fresh installation and will time out in larger packages. Here are my results. The general pattern has been reproduced on Travis:
Without the import statement: about 1 minute.
With the import statement: about 7 minutes.
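For anyone trying to reproduce the comparison locally, a minimal sketch (run each line in its own fresh Julia process, since precompilation caches make a second run much faster; Distributions here just stands in for whatever else the test suite pulls in):

@time import MCMCChains: Chains    # the slow path described above
@time import Distributions         # comparison without MCMCChains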