
[packaging] run on a daily basis for branches only #21250

Closed

Conversation

@v1v (Member) commented Sep 23, 2020

What does this PR do?

As agreed, let's run the packaging pipeline for the master/release branches on a daily basis, so that snapshot binaries are generated at least once a day in the worst-case scenario.

Why is it important?

The current workflow consists of three different pipelines that run sequentially; if any of those pipelines breaks, the following pipelines won't be triggered.

In other words, for every commit that's merged to master, the multibranch pipeline is triggered; only if it ran successfully will the packaging pipeline be triggered, and only if that pipeline is successful will the beats-tester pipeline run.
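
As a rough illustration of the change being discussed, a daily cron trigger restricted to the master/release branches in a Jenkins declarative pipeline could look like the sketch below (the branch pattern and the placeholder stage are assumptions for illustration, not the actual Jenkinsfile of this PR):

```groovy
pipeline {
  agent any

  triggers {
    // Only master and release-style branches (e.g. 7.x, 7.9) get the daily cron spec;
    // every other branch gets an empty spec, i.e. no scheduled trigger.
    cron(env.BRANCH_NAME ==~ /^(master|\d+\.(x|\d+))$/ ? '@daily' : '')
  }

  stages {
    stage('Package') {
      steps {
        // Placeholder for the real packaging steps.
        echo "Packaging ${env.BRANCH_NAME}"
      }
    }
  }
}
```

The idea, per the description above, is that packaging then no longer depends on the commit-triggered multibranch build succeeding first.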

@v1v v1v self-assigned this Sep 23, 2020
@v1v v1v added the automation label Sep 23, 2020
@v1v v1v requested a review from a team September 23, 2020 13:31
@botelastic botelastic bot added the needs_team label Sep 23, 2020
@v1v v1v added the ci and Team:Automation labels Sep 23, 2020
@botelastic botelastic bot removed the needs_team label Sep 23, 2020
@v1v v1v marked this pull request as ready for review September 23, 2020 13:32
@elasticmachine (Collaborator) commented

💚 Build Succeeded

Build stats

  • Build Cause: [Pull request #21250 opened]

  • Start Time: 2020-09-23T13:32:43.136+0000

  • Duration: 25 min 11 sec

@kuisathaverat (Contributor) commented

If the job is building with every successful build on the Beats master branch, why do we need to run it on a schedule? It should already be triggered several times a day, so triggering it on a schedule will generate binaries for the same last successful build or, if the Beats master is broken, probably broken binaries.

@v1v (Member, Author) commented Sep 23, 2020

> If the job is building with every successful build on the Beats master branch, why do we need to run it on a schedule? It should already be triggered several times a day, so triggering it on a schedule will generate binaries for the same last successful build or, if the Beats master is broken, probably broken binaries.

We agreed in the weekly beats-ci meeting to go with this approach. A bit more context: this is a short-term proposal to generate the packages, since flakiness in the tests can cause a bunch of commit builds without a binary. For instance, for the last 6 days there was no successful build:

[screenshot: build history showing no successful builds over the last 6 days]

It's not ideal to have flaky tests, but as long as they are there, there is no way to build the packages unless we enforce this scheduled run, which on the other hand will also help to run the beats-tester.

The long-term proposal, I guess, is to avoid the flakiness.

@kuisathaverat (Contributor) commented Sep 23, 2020

I think that is the wrong way to fix the problem, and it adds a new one on top: generating invalid binaries. The flaky tests are not new; now that we have pytest-rerunfailures, the flaky tests alone can be rerun, they only have to be marked. There are no more than 4-5 of them, and the pipeline always fails in the same tests.
I will try to check the flakiest tests tomorrow and mark them all as flaky, then I will make a PR. #21260

@kuisathaverat (Contributor) left a comment


It can generate invalid binaries

@v1v (Member, Author) commented Sep 23, 2020

Is the unified release process generating binaries from the master/7.x branches? If so, we have the same issue there.

We could potentially create two folders:

  • an unstable folder, where the packaging runs and generates binaries from commits that had flaky tests.

  • a stable folder, where the packaging runs and generates binaries from commits that didn't have any flaky tests (see the sketch after this list).
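
A minimal sketch of how this two-folder split could look as a pipeline step, assuming the artifacts are uploaded with gsutil (the bucket name, local path, and the success check are assumptions for illustration only):

```groovy
// Hypothetical sketch (scripted pipeline fragment): publish every scheduled packaging
// build, but split the destination folder by build health so consumers can tell them apart.
node {
  // 'SUCCESS' stands in for "no flaky/failed tests"; anything else goes to 'unstable'.
  def quality = (currentBuild.currentResult == 'SUCCESS') ? 'stable' : 'unstable'
  // Bucket and paths are made up for illustration.
  sh "gsutil -m cp -r build/distributions/* gs://example-snapshots-bucket/${quality}/${env.BRANCH_NAME}/"
}
```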

Probably we should work on a "no merges to master allowed while master is broken" policy, but we are not the owners. Meanwhile, I'd say we enable generating binaries and testing them with the beats-tester once a day, so I can work on supporting that particular case in another pipeline/workflow.

What do you think?

@kuisathaverat (Contributor) commented

Flaky and broken tests based on the latest 21 builds: #21300. We are talking about 5 flaky tests and 2 broken tests.

TestClientPublishEventKerberosAware (21/21) and test_default_settings (16/21) have been outright broken for 6 days.

@v1v (Member, Author) commented Oct 15, 2020

I'll close this PR now, as we have agreed to fix the issues instead of working around them.

@v1v v1v closed this Oct 15, 2020
Labels
automation, ci, Team:Automation
4 participants