
feat: publish artifacts by subdirectory #50

Closed
privatenumber opened this issue Aug 10, 2022 · 12 comments

@privatenumber

Description

To be able to set a subdirectory to publish the artifact to, while preserving files outside of the subdirectory.

Motivation

I'd like to create a "staging" environment for development branches or PRs to deploy the changes to.

By separating the deploys by branch name (per subdirectory), PRs can be deployed to a staging environment before being published to the main branch directory.

@yoannchaudet
Collaborator

Pages does not do incremental deployments today, and we do not plan to add them, so your "staging" scenario will not work.

We have better plans for preview deployments, but there's no ETA for that yet. github.community is a good place for feature requests.

@KotlinIsland

@yoannchaudet What about publishing multiple versions of documentation? It's very common for a project to need to provide and deploy documentation for multiple branches at the same time.

@yoannchaudet
Collaborator

Pages only provides primitives for making (semi-)atomic deployments. All the content must be provided at deployment time, and later deployments replace everything. We will eventually add support for ephemeral deployments too, but we are not planning on supporting long-lived deployments like the ones you have in mind.

The build process can do incremental builds if it wants, but that's really agnostic of Pages.
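
To make that concrete, here is a minimal sketch of the "whole site in one deployment" model with the standard actions; the build command, output directory, and version pins are placeholders rather than anything official:

```yaml
# Minimal sketch: build the entire site, upload it as one Pages artifact,
# and deploy it in a single atomic step. Build command and paths are placeholders.
name: Deploy Pages
on:
  push:
    branches: [main]
permissions:
  contents: read
  pages: write      # needed to create the Pages deployment
  id-token: write   # needed for the OIDC token used by deploy-pages
jobs:
  deploy:
    runs-on: ubuntu-latest
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}
    steps:
      - uses: actions/checkout@v4
      - run: ./build.sh --out _site            # placeholder: build the entire site
      - uses: actions/upload-pages-artifact@v3
        with:
          path: _site                          # everything you want served goes in here
      - id: deployment
        uses: actions/deploy-pages@v4
```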

For the scenario you describe there are a few solutions:

  1. You could create one project site per major version of your documentation. Pages project sites are routed in a path-based fashion and are available at <user or org name>.github.io/<repo name>, while a user/org site is published at <user or org name>.github.io. Having to create individual repos to "snapshot" your documentation obviously comes with some drawbacks.

  2. Make your site version-aware. I know of this Jekyll plugin that does support it: https://vsoch.github.io/docsy-jekyll/docs/versioning.

@KotlinIsland

> The build process can do incremental builds if it wants, but that's really agnostic of Pages.

What do you mean by this?

> I know of this Jekyll plugin that does support it

But how would you accomplish this with the current actions?

Is it possible to download the current deployment, add/update a directory for a specific version then re-upload it?

@yoannchaudet
Collaborator

> What do you mean by this?

You have full control over the build process if you use a custom workflow. If you want to cache some artifacts and reuse them from one build to the next, that's up to you. What I meant is that Pages no longer needs to know how you build your static assets.

> Is it possible to download the current deployment, add/update a directory for a specific version then re-upload it?

You can do that if you want. There are APIs to download artifacts.
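
For example, here is a rough sketch of fetching a previous Pages artifact with the REST API via the gh CLI; the artifact name github-pages is the default used by actions/upload-pages-artifact, and the jq filter is only illustrative:

```yaml
# Rough sketch: locate the newest artifact named "github-pages" and download
# it as a zip through the Actions artifacts API. The artifact must still be
# within its retention window.
- name: Download the previous Pages artifact
  env:
    GH_TOKEN: ${{ github.token }}
  run: |
    id=$(gh api "repos/${{ github.repository }}/actions/artifacts" \
      --jq '[.artifacts[] | select(.name == "github-pages")][0].id')
    gh api "repos/${{ github.repository }}/actions/artifacts/${id}/zip" > previous.zip
    unzip previous.zip -d previous   # the zip wraps the tarball that Pages actually deploys
```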

If you use the plugin I was referring to, the idea is for your repository to contain the documentation for all your versions at once on a given branch.

Another strategy, if you use branches to snapshot your documentation: your build process can run as many builds as you need from all your branches and piece the results together into one large artifact to upload to Pages. Sphinx works like that (or used to, if I recall correctly); it's the static site generator used by the Python docs.
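
A rough sketch of that strategy, assuming two documentation branches (main and v1.x) and placeholder build commands:

```yaml
# Check out each branch that holds a documentation snapshot, build it into a
# versioned subdirectory, and upload the combined tree as one Pages artifact.
# Branch names, build commands, and paths are assumptions about your project.
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          ref: main
          path: src-main
      - uses: actions/checkout@v4
        with:
          ref: v1.x
          path: src-v1
      - name: Build every version into one tree
        run: |
          mkdir -p _site
          make -C src-main docs && cp -r src-main/docs/_build/html _site/latest
          make -C src-v1   docs && cp -r src-v1/docs/_build/html   _site/v1
      - uses: actions/upload-pages-artifact@v3
        with:
          path: _site
```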

@jhpratt

jhpratt commented Sep 29, 2022

While it is possible to download artifacts, that assumes that the site is updated frequently enough to still have the artifact present (unless it has a needlessly long lifetime). Is it possible to download the current state of the deployment instead?

I'm trying to replace the third-party action I currently use with this GitHub-provided one. Unfortunately this feature isn't being accepted, which is a major step back in my opinion. Right now I have two different repositories that I want deployed on the same site, and I need an index page. The only way to do this is to have a single deployment, as far as I can tell. Not having the ability to deploy each part of the site separately is a major roadblock. I'd prefer not to have to use a third repository, which is my current setup.

Also it doesn't seem possible to deploy to a subdomain directly. Surely my use case isn't that extraordinary? Having the same functionality as the linked action would be wonderful, at the least. Anything else is naturally a regression.

@yoannchaudet
Collaborator

We don't have an API to download the content of a given Pages site. The best workaround I can offer is to extend the life of your artifacts (we default it to 1 day, but you can go all the way up to the maximum Actions allows, which off the top of my head is ~90 days, give or take). If your site is not built often enough, you can add a schedule trigger to your workflow to make extra builds before your artifact expires.
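
Something along these lines, where the retention value and cron expression are just examples:

```yaml
# Keep the Pages artifact around longer and rebuild on a schedule so a fresh
# artifact is always available. Retention and schedule values are examples.
name: Deploy Pages
on:
  push:
    branches: [main]
  schedule:
    - cron: '0 3 * * 1'   # weekly rebuild before the artifact can expire
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: ./build.sh --out _site            # placeholder build command
      - uses: actions/upload-pages-artifact@v3
        with:
          path: _site
          retention-days: 30                   # default is 1; Actions caps retention around 90 days
```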

Also, while we now deploy Pages sites using artifacts, there is really nothing preventing you from continuing to push your artifacts to a branch at build time if that flow works best for you. We are not making this "illegal" in any way.
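
For completeness, a hedged sketch of that branch-based flow; the gh-pages branch name, the _site directory, and the use of the workflow token (which needs contents: write) are all assumptions:

```yaml
# Force-push the built site to a gh-pages branch instead of (or alongside)
# uploading a Pages artifact. Paths and branch name are assumptions.
- name: Push _site to the gh-pages branch
  run: |
    cd _site
    git init -b gh-pages
    git config user.name  "github-actions[bot]"
    git config user.email "github-actions[bot]@users.noreply.github.com"
    git add -A
    git commit -m "Deploy site"
    git push --force \
      "https://x-access-token:${{ github.token }}@github.com/${{ github.repository }}.git" \
      gh-pages
```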

@jhpratt

jhpratt commented Sep 29, 2022

That's precisely what I found when digging into it. It's more that I want to eliminate the needless repository, but I'm unable to do so despite this action coming directly from GitHub. There are a few things that aren't supported, and I sincerely wish they were. A third-party script shouldn't be able to do more than an official one, in my opinion.

@yoannchaudet
Collaborator

Respectfully, I disagree. We are home to a lot of open source projects and really don't aim at imposing vendor lock-in or solving every problem. We provide basic primitives for the most common uses of Pages. If a third-party project expands on that, that is welcome.

To get back to your request:

> To be able to set a subdirectory to publish the artifact to, while preserving files outside of the subdirectory.

This is unfortunately not something we support out of the box today, nor something I think we will implement soon.

@jhpratt

jhpratt commented Sep 30, 2022

Naturally not all problems will be solved. I just don't see how the primitives provided here permit this, which is the point I was trying to get across. If there were a way to pull the current deployment, that would probably suffice. Without that, I don't see how these primitives are enough. It's a reasonably common use case, not something that is extremely niche.

For what it's worth, the action I linked does not build on top of GitHub's action; rather, it uses the older method that doesn't take advantage of direct deployment. So saying it's "expanding" on this action isn't really accurate.

@KotlinIsland

So you're trying to deploy two parts, each from a different repo. Could you make a workflow in one that checks out the other repo, builds both parts, and deploys everything at once?
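
Roughly like this, where the repository names, paths, and build commands are placeholders, and a PAT would be needed if the other repo is private:

```yaml
# One workflow checks out both repositories, builds each part into its own
# subdirectory, and uploads everything as a single Pages artifact.
steps:
  - uses: actions/checkout@v4              # this repo: first part of the site
    with:
      path: part-one
  - uses: actions/checkout@v4              # the other repo: second part of the site
    with:
      repository: your-org/other-repo      # hypothetical name
      path: part-two
      # token: ${{ secrets.OTHER_REPO_TOKEN }}   # only needed if that repo is private
  - name: Build both parts into one tree
    run: |
      mkdir -p _site
      ./part-one/build.sh --out _site           # placeholder build commands
      ./part-two/build.sh --out _site/docs
  - uses: actions/upload-pages-artifact@v3
    with:
      path: _site
```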

@jhpratt

jhpratt commented Sep 30, 2022

By itself, yes, but then I still have to send it back to the third repo to publish it as the subdomain directly 😄

For clarity's sake, I'm going to keep my current setup, as it's clear that this action isn't intended for a use case like mine. Take that as you will.
