Conversation

@devmotion
Member

This PR addresses #55.

I'm not sure if that's the best way to solve it, but it avoids calling Pkg.build("Libtask"), which apparently fails in CI tests. Moreover, it fixes the typo that, as far as I can see, caused the package to be rebuilt every time it was loaded.

@devmotion
Member Author

I added a missing build step to the CI tests and disabled tests on Julia master again since they had not finished after more than 30 minutes.

@devmotion
Member Author

As expected with the current setup, tests fail after increasing the version number. Wouldn't it be possible to decouple the version of the binary dependency from the version of the Julia package? The tests passed, e.g., with version 0.3.3 of the binary dependency.

@devmotion
Member Author

IMO, for Julia >= 1.3 we should use Yggdrasil if possible. However, for the time being, since we still want to support Julia < 1.3, I guess the best way would be to build the binary dependencies in a separate LibtaskBuilder repository, in the same way as, e.g., SundialsBuilder builds the binaries for Sundials.jl. In that way building the binaries would be decoupled from the Julia package, and hence we wouldn't get test errors in non-breaking updates of the Julia package.

@KDr2
Member

KDr2 commented Mar 30, 2020

IMO, for Julia >= 1.3 we should use Yggdrasil if possible. However, for the time being, since we still want to support Julia < 1.3, I guess the best way would be to build the binary dependencies in a separate LibtaskBuilder repository, in the same way as, e.g., SundialsBuilder builds the binaries for Sundials.jl. In that way building the binaries would be decoupled from the Julia package, and hence we wouldn't get test errors in non-breaking updates of the Julia package.

Yeah, the current binary-providing process has flaws when we bump the version in Project.toml but do not release it. But separating it into another repo poses a maintenance burden, and considering that common users should not use an unreleased version, I added a retry mechanism to use the binaries for the previous version. That retry mechanism was rendered useless by the shallow clone of the repo; I fixed it with a new commit on this branch.

Member

@KDr2 left a comment


LGTM. I will check the support for nightly built Julia later.

@KDr2 merged commit 4e201c5 into master on Mar 30, 2020
The delete-merged-branch bot deleted the devmotion-patch-1 branch on March 30, 2020 at 11:04
@devmotion
Member Author

Yeah, the current binary-providing process has flaws when we bump the version in Project.toml but do not release it. But separating it into another repo poses a maintenance burden, and considering that common users should not use an unreleased version, I added a retry mechanism to use the binaries for the previous version.

Actually, I think the maintenance burden would be much lower, since the binaries would have to be rebuilt only occasionally and not for every update of the Julia package. For instance, the last proper commit (i.e., excluding changes to CITATION.bib and CompatHelper) in SundialsBuilder dates back to July 2018, whereas Sundials.jl has been updated and fixed many times since then.

The current setup already caused some additional problems: it was not possible to release the latest commit on the master branch with JuliaRegistrator (it failed because a tag for the same version already exists for the commit in which the package version was updated), and even releasing that commit caused test errors in the registry: JuliaRegistries/General#11820

@yebai
Member

yebai commented Mar 30, 2020

Maybe add a parameter to specify a previous version of the binary as a fallback? For example, on top of the version information (say v0.21) in Project.toml, we could specify a parameter in deps/build.jl that determines a fallback binary release (say v0.10). If the current version v0.21 is not released on GitHub yet, we could simply download the binary from v0.10. This way, we would only need to occasionally update this version parameter in deps/build.jl to a newer released version and could avoid maintaining two repos.
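
A minimal sketch of how such a fallback parameter in deps/build.jl could look (the helper names, URL scheme, and version numbers below are purely illustrative assumptions, not the actual Libtask build script):

```julia
using Pkg

# Version currently declared in Project.toml.
const pkg_version = VersionNumber(
    Pkg.TOML.parsefile(joinpath(@__DIR__, "..", "Project.toml"))["version"]
)

# Manually maintained fallback: the last version for which binaries were actually released.
const fallback_version = v"0.3.3"  # hypothetical value

# Hypothetical URL scheme for the release tarballs.
binary_url(v) = "https://github.com/TuringLang/Libtask.jl/releases/download/v$v/libtask.tar.gz"

function try_download(v)
    try
        download(binary_url(v), joinpath(@__DIR__, "libtask.tar.gz"))
        true
    catch
        false
    end
end

# Try the current version first; fall back to the pinned release if it is not published yet.
try_download(pkg_version) || try_download(fallback_version) ||
    error("no prebuilt binary found for $pkg_version or $fallback_version")
```

With something like this, only the fallback_version constant would need an occasional bump after a binary release.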

@KDr2
Member

KDr2 commented Mar 31, 2020

Maybe add a parameter to specify a previous version of the binary as a fallback? For example, on top of the version information (say v0.21) in Project.toml, we could specify a parameter in deps/build.jl that determines a fallback binary release (say v0.10). If the current version v0.21 is not released on GitHub yet, we could simply download the binary from v0.10. This way, we would only need to occasionally update this version parameter in deps/build.jl to a newer released version and could avoid maintaining two repos.

Hmm, actually we are already doing something like that, but instead of specifying a fallback version, we look for the last released version to use. The two methods have the same effect.

The issue is that when no compatible version can be found to fall back to, the release fails. That is what @devmotion pointed out.

How about this:

  1. Create a separate repo for releasing the binaries; this repo contains nothing but GitHub Actions that clone the Libtask repo, build the library, and upload it.
  2. Then we can specify the exact dylib to download in Libtask's build.jl.
  3. When the dylib's code is updated, we make a new release on the separate repo, and the new dylib is built and uploaded there.
  4. Update Libtask's build.jl to use the latest dylib URL, then tag and release Libtask.

In step 3, we can write a script to sync the tags between the two repos so that we don't need to tag the separate repo manually. We tag Libtask, the script tags the separate repo and makes a release, the dylib is built and uploaded, and then we release Libtask.
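
For step 3, such a tag-syncing helper could be as small as the following sketch (the builder directory, remote name, and tag are assumptions for illustration):

```julia
# sync_tags.jl -- hypothetical helper: mirror a tag from Libtask.jl onto the separate builder repo.
function sync_tag(tag::AbstractString; builder_dir::AbstractString = "LibtaskBuilder")
    cd(builder_dir) do
        run(`git tag $tag`)          # tag the builder checkout with the tag created on Libtask.jl
        run(`git push origin $tag`)  # pushing the tag would trigger the builder's release workflow
    end
end

sync_tag("v0.4.0")  # hypothetical tag name
```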

@KDr2
Member

KDr2 commented Mar 31, 2020

Another way to fix the release issue is to build the dylib locally if no binary is found for the current release. ThArrays uses this approach.
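
A rough sketch of that local-build fallback in a build.jl, assuming a tarball URL and a Makefile for the local build (both hypothetical here, named only for illustration):

```julia
# Prefer a prebuilt binary; build the dylib locally if downloading it fails.
function ensure_libtask(url::AbstractString)
    tarball = joinpath(@__DIR__, "libtask.tar.gz")
    try
        download(url, tarball)                    # try the prebuilt binary first
        run(`tar -xzf $tarball -C $(@__DIR__)`)
    catch
        @info "No prebuilt binary found; building the dylib locally"
        cd(@__DIR__) do
            run(`make`)                           # assumes C sources and a Makefile in deps/
        end
    end
end
```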

@devmotion
Member Author

Create a separate repo for releasing the binaries; this repo contains nothing but GitHub Actions that clone the Libtask repo, build the library, and upload it.
Then we can specify the exact dylib to download in Libtask's build.jl.
When the dylib's code is updated, we make a new release on the separate repo, and the new dylib is built and uploaded there.
Update Libtask's build.jl to use the latest dylib URL, then tag and release Libtask.

Isn't that basically what BinaryBuilder suggests doing, and what is done in the SundialsBuilder + Sundials.jl example above in a more clearly separated way? I don't really see how keeping the Libtask sources for the binaries in Libtask.jl and then cloning the repo from the other repo would be easier to handle than just keeping the binary-related things in one repo. One could still just make a new release of the binaries when needed, without having to clone (and possibly adjust) the Julia repo, and then update build.jl accordingly to point to the new release.

IMO, the process of building the binaries and making a new release of Libtask.jl is currently too intertwined, and hence it seems to fail quite regularly. I guess separating the two steps by using dedicated repos could be helpful.

@KDr2
Member

KDr2 commented Mar 31, 2020

IMO, for Julia >= 1.3 we should use Yggdrasil if possible. However, for the time being, since we still want to support Julia < 1.3, I guess the best way would be to build the binary dependencies in a separate LibtaskBuilder repository

Hmm... I think this limitation doesn't apply to us, because in our build scripts (see here and here) we download different versions of Julia and then build the binary. So we can simply submit our build script to Yggdrasil to generate a *_jll.jl package which Libtask can depend on.

@devmotion
Member Author

AFAIK the problem is not the build process but that Yggdrasil uses the new artifacts system that was only added to Pkg in Julia 1.3 (https://julialang.github.io/Pkg.jl/dev/artifacts/). So the main issue is that in Julia < 1.3 only Julia packages can be installed and added as dependencies.

@KDr2
Member

KDr2 commented Mar 31, 2020

Oh, sorry for the misunderstanding, but that doesn't matter either. What we are doing now is downloading the tarball (i.e., the artifacts) from Libtask's GitHub Releases page; we can just change the download URL to the *_jll.jl package's, rather than using the artifacts mechanism.

@devmotion
Member Author

The officially recommended way to deal with this problem if one wants to support Julia < 1.3 is to generate a build.jl file from the Yggdrasil release: https://github.com/JuliaPackaging/Yggdrasil#binaryproviderjl
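
For reference, the build.jl files generated for BinaryBuilder/Yggdrasil releases roughly follow this shape (the product name, platform list, URL, and hash below are placeholders rather than the actual Libtask artifacts):

```julia
using BinaryProvider

const verbose = "--verbose" in ARGS
const prefix = Prefix(joinpath(@__DIR__, "usr"))

# The library this build script provides.
products = [
    LibraryProduct(prefix, ["libtask"], :libtask),
]

# One (url, sha256) pair per supported platform; placeholder values shown here.
download_info = Dict(
    Linux(:x86_64, libc = :glibc) => ("https://example.com/libtask.x86_64-linux-gnu.tar.gz",
                                      "0000000000000000000000000000000000000000000000000000000000000000"),
)

if haskey(download_info, platform_key_abi())
    url, tarball_hash = download_info[platform_key_abi()]
    # Install the tarball unless the products are already satisfied.
    if !all(satisfied(p; verbose = verbose) for p in products)
        install(url, tarball_hash; prefix = prefix, force = true, verbose = verbose)
    end
    # Write deps.jl so the package can locate the library at load time.
    write_deps_file(joinpath(@__DIR__, "deps.jl"), products; verbose = verbose)
else
    error("Your platform is not supported by this build script.")
end
```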

@KDr2
Member

KDr2 commented Mar 31, 2020

Yeah, we already have and use this build file too; there's a build.jl for each release along with the binary tarballs on the release page.

@KDr2
Member

KDr2 commented Mar 31, 2020

PR to Yggdrasil filed: JuliaPackaging/Yggdrasil#699

@devmotion
Member Author

Great, hopefully that will make the release process smoother!
