
Conversation

@NateBrady23 (Member) commented Apr 15, 2018

We no longer have any tests that use local dependencies. Everything pulls its dependencies from Docker Hub.

  • Remove __build_dependencies

  • Remove __gather_dependencies

  • Remove build arguments (TFB_DATABASE, MAX_CONCURRENCY)

  • Move wrk and database up and remove toolset/setup/docker and toolset/setup

  • Remove the TechEmpower database dependencies and build the database images locally (see the sketch at the end of this comment). Right now this will only minimally affect Travis runs, especially since the 3 database images are pulling from ubuntu:16.04 instead of the official database images (some of the official images had quirks I haven't worked out yet). For continuous benchmarking, this will have virtually no time impact.

    • We'll be able to see some of the time differences between the "toolset cleanup" commit and the "build database dockerfiles locally" commit
  • Yes, I hoorayed my own PR

  • Grouped more tests together with TESTLANG from #3575 (Group more languages for travis), bringing down the total time impact of building the database images locally

Edit: JavaScript, which uses all 3 databases, went from 13:52 to 15:10, and that is while using the ubuntu:16.04 base and doing apt updates. I'd say this is a win, especially once we can pull from the official base images for the databases.
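For illustration, a minimal sketch of what building one of the database images locally could look like with docker-py; the paths, tag scheme, and client setup here are assumptions, not the PR's actual code:

```python
# Hypothetical sketch: build a database image from a local dockerfile
# instead of pulling a prebuilt techempower/* image from Docker Hub.
import docker

client = docker.APIClient()  # assumes the default local Docker socket

# Assumed directory layout after the dockerfiles move up out of
# toolset/setup/docker; names below are illustrative only.
for line in client.build(
        path="toolset/databases/mysql",        # assumed path
        dockerfile="mysql.dockerfile",         # assumed filename
        tag="techempower/tfb.database.mysql",  # assumed tag scheme
        forcerm=True):
    print(line)  # stream the raw build output
```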

@NateBrady23 changed the title from "Toolset cleanup" to "Toolset cleanup & Build remove TE database dependencies" on Apr 15, 2018
@NateBrady23 changed the title from "Toolset cleanup & Build remove TE database dependencies" to "Toolset cleanup & remove TE database dependencies" on Apr 15, 2018
@NateBrady23 (Member, Author) commented Apr 16, 2018

@msmith-techempower FYI, aspnet-mono has been failing but it's not being reported by Travis as failed.

@msmith-techempower (Member) left a review:

Looking good.

```python
    for the given tests.
    '''
    docker_buildargs = {
        'MAX_CONCURRENCY': str(max(benchmarker_config.concurrency_levels)),
```
@msmith-techempower:

Is this no longer needed because we're just leveraging `nproc` now?
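For reference, a minimal sketch of the runtime alternative the question implies, assuming concurrency is now derived from the container's CPU count (the Python equivalent of shelling out to `nproc`) rather than a baked-in build arg:

```python
# Hypothetical sketch: derive max concurrency at runtime instead of
# passing a MAX_CONCURRENCY build argument at image-build time.
import multiprocessing

def max_concurrency() -> int:
    # The same count `nproc` would report inside the container.
    return multiprocessing.cpu_count()
```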

tag="techempower/tfb.test.%s" %
test_docker_file.replace(".dockerfile", ""),
buildargs=docker_buildargs,
forcerm=True):
@msmith-techempower:

Do we want forcerm anymore?

@msmith-techempower (Member) commented:

> FYI, aspnet-mono has been failing but it's not being reported by Travis as failed.

@michaelhixson and I broke Travis on Friday by decoupling the toolset's failure state from the failure state of any individual test.

This should probably be fixed by switching on the running mode: verify should return failure for the suite if any test fails in any way, while benchmark should only fail if an error is thrown. The toolset should return non-zero only if it could not successfully run; if it benchmarked everything and some tests failed, it should still return 0.
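A sketch of that policy; the structure and names here are hypothetical except for the `config.mode == "verify"` check, which appears in the diff below:

```python
# Hypothetical sketch of the proposed exit-code policy.
def run_toolset(config, benchmarker) -> int:
    try:
        all_passed = benchmarker.run()
    except Exception:
        return 1  # the toolset itself could not run successfully

    if config.mode == "verify":
        # verify: any failing test fails the whole suite
        return 0 if all_passed else 1

    # benchmark: everything was run, so report success even if
    # individual tests had failures
    return 0
```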

```diff
-        benchmarker.run()
+        all_passed = benchmarker.run()
+        if config.mode == "verify":
+            return 0 if all_passed else 1
```
@msmith-techempower:

Why not just return `all_passed`?
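One likely reason, assuming the return value ultimately feeds `sys.exit` (an assumption; the call site isn't shown here): `bool` is an `int` subclass, so returning `all_passed` directly would invert shell conventions, where 0 means success.

```python
# Hypothetical sketch showing why the boolean is mapped to 0/1.
import sys

def main() -> int:
    all_passed = True  # stand-in for benchmarker.run()
    # True == 1, and a non-zero exit status signals failure, so
    # returning all_passed as-is would report failure on success.
    return 0 if all_passed else 1

if __name__ == "__main__":
    sys.exit(main())  # sys.exit(True) would exit with status 1
```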

```python
        # Get a list of all known tests that we can run.
        all_tests = gather_remaining_tests(self.config, self.results)

        all_passed = True
```
@msmith-techempower:

Ugh, change this to something like `any_failed` and reverse the boolean logic so that you can just bubble the return value up. That `return 0 if all_passed else 1` line is so gross.
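A sketch of the suggested inversion; everything here is hypothetical apart from `any_failed`, which the comment proposes:

```python
class Benchmarker:
    # Hypothetical sketch: track failure as the exit code directly so
    # the value can bubble up through the call stack unchanged.
    def run(self) -> int:
        any_failed = 0
        for test in gather_remaining_tests(self.config, self.results):
            if not self.run_test(test):  # assumed per-test runner returning a bool
                any_failed = 1
        return any_failed
```

The verify-mode call site could then collapse to a plain `return benchmarker.run()`.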

@msmith-techempower (Member) commented Apr 16, 2018

```
finch: [info] Packaging /finch/target/scala-2.12/techempower-benchmarks-finch-assembly-0.19.0.jar ...
finch: [info] Done packaging.
...
finch: Error: Unable to access jarfile target/scala-2.12/techempower-benchmarks-finch-assembly-1.0.jar
```

These are all different for each test, too >_<

@michaelhixson (Contributor) commented:

Regarding the Scala Travis failures: I think the CMD lines of those dockerfiles need to be updated to align with the version-number changes in commit 2bb5629 (the log above shows the 0.19.0 jar being packaged while the CMD looks for the 1.0 jar).

@NateBrady23 (Member, Author) commented:

I'll open a separate PR for the Scala tests.

