Commit 379c935

Merge #6583: ci: run functional tests on GitHub Actions, upload functional test logs as artifacts
3461c14 ci: tentatively drop multiprocess and tsan functional tests (Kittywhiskers Van Gogh)
5db8fa0 ci: cache previous releases if running `linux64` variant (Kittywhiskers Van Gogh)
cca0d89 ci: add functional tests for linux builds (Kittywhiskers Van Gogh)
fc2efb0 ci: upload functional test logs as artifacts (Kittywhiskers Van Gogh)
b25e846 ci: handle space exhaustion by deleting files we don't need (Kittywhiskers Van Gogh)
0a1e635 ci: add reusable workflow for running functional tests (Kittywhiskers Van Gogh)
57cf278 ci: use helper script to bundle artifacts (Kittywhiskers Van Gogh)

Pull request description:

## Additional Information

* [`actions/cache`](https://github.com/marketplace/actions/cache) allows specifying the directories to cache using glob expressions, which extend further into defining exclusions, needed to keep cache sizes manageable ([source](https://github.com/dashpay/dash/blob/bb469687d3d936f82fd8e8fbe0934eec5e17df5e/.gitlab-ci.yml#L128-L138)).
* Unfortunately, the implementation of globbing with respect to exclusions is more or less broken (see [actions/toolkit#713](actions/toolkit#713 (comment))), with the requirement that the inclusion depth match the depth of the exclusion. Attempting to play by these rules largely fails ([build](https://github.com/kwvg/dash/actions/runs/13344612118/job/37273624710#step:5:4634)).
* Third-party actions like [`tj-actions/glob`](https://github.com/marketplace/actions/glob-match) provide a much saner experience, but they enumerate the individual files that match the patterns, not folders. This means that when we pass the result to `actions/cache`, we breach the argument length limit ([build](https://github.com/kwvg/dash/actions/runs/13343953711/job/37272121153#step:9:4409)).
* Modifying `ulimit` to get around this isn't very feasible due to odd behavior surrounding it (see [actions/runner#3421](actions/runner#3421)), and the general consensus is to save the list to a file and have the next action read from that file ([source](https://stackoverflow.com/a/71349472)). [`tj-actions/glob`](https://github.com/marketplace/actions/glob-match) graciously does this with the `paths-output-file` output, but it takes two to play and [`actions/cache`](https://github.com/marketplace/actions/cache) does not accept files (`path` must be a newline-delimited string). The path of least resistance, it seems, is to use a script to bundle our cache into a neat input and leave [`actions/cache`](https://github.com/marketplace/actions/cache) out of it entirely; this is the approach taken.
* As we aren't using self-hosted runners, we are subject to GitHub's limits on everything: runner space, the total artifact storage budget, and the total cache storage budget.
* Caches that are not **accessed** within 7 days are evicted and there's a 10 GB budget for all caches ([source](https://docs.github.com/en/actions/writing-workflows/choosing-what-your-workflow-does/caching-dependencies-to-speed-up-workflows#usage-limits-and-eviction-policy)); GitHub will evict the oldest (presumably by **creation** date?) caches to keep within that limit.
* What makes this limit troubling is the immutable nature of caches: unlike GitLab, which is more conducive to shared caches ([source](https://github.com/dashpay/dash/blob/bb469687d3d936f82fd8e8fbe0934eec5e17df5e/.gitlab-ci.yml#L55-L69)), GitHub insists on cache immutability (see [actions/toolkit#505](actions/toolkit#505)), and the only ways to "update" a cache are to structure the cache key so that updated content is reflected in the key itself, or to delete the cache and create a new one, which brings race condition concerns ([comment](#6406 (review))). Sidenote: overwriting contents is allowed for artifacts ([source](https://github.com/actions/upload-artifact/blob/65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08/README.md#overwriting-an-artifact)), just not for caches.
* This means we need to be proactive in getting rid of caches with a short shelf life to keep longer-lasting caches (like `depends-sources`) from being evicted due to old age as we breach the 10 GB limit. We cannot simply set a short retention period, as GitHub doesn't offer that for caches like it does for artifacts ([source](https://github.com/actions/upload-artifact/blob/65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08/README.md#retention-period)).
* ~~While doing this properly would require us to implement a cache reaper workflow, this still needed to be addressed now as the contents of `build-ci` need to be passed onto the functional test workflow and this creates an ephemeral cache with a short lifespan that threatens longer-living (but older) caches.~~ ~~This is currently approached by deleting `build-ci` (output) caches when the functional test runner is successful; we let the cache stick around if the build fails to allow for rerunning failed instances.~~ ~~If for whatever reason a successful build has to be rerun, the build workflow would need to be rerun (though the `ccache` cache will speed this up significantly) to generate the output cache again for the test workflow to succeed. Failing to do this will result in a cache miss and run failure.~~ **Edit:** Switched to using artifacts to mitigate cache thrashing concerns. Auto-expiration is a huge plus, too.
* Runners are limited to 14 GB of **addressable** storage space ([source](https://docs.github.com/en/actions/using-github-hosted-runners/using-github-hosted-runners/about-github-hosted-runners#standard-github-hosted-runners-for-public-repositories)) and breaching this limit will cause the runners to fail.
* Our TSan ([build](https://github.com/kwvg/dash/actions/runs/13355816205/job/37298587344#step:5:1178)) and multiprocess ([build](https://github.com/kwvg/dash/actions/runs/13355816205/job/37298658464#step:5:1190)) test variants breach this limit when collecting logs, and deleting the `build-ci` cache (see 2153b0b) doesn't make enough of a dent to help ([build](https://github.com/kwvg/dash/actions/runs/13356474530)).
* Omitting the logs from successful runs would be a regression in content for our `test_logs` artifacts and therefore wasn't considered.
* While third-party actions like [`AdityaGarg8/remove-unwanted-software`](https://github.com/marketplace/actions/maximize-build-disk-space-only-remove-unwanted-software) can bring significant space savings ([build](https://github.com/kwvg/dash/actions/runs/13357806504/job/37302970610#step:2:150)), they cannot be run in jobs that use the [`container`](https://docs.github.com/en/actions/writing-workflows/choosing-where-your-workflow-runs/running-jobs-in-a-container) context (so, all jobs after the container creation workflow), as they would be executed inside the container when we want to affect the runner underneath ([build](https://github.com/kwvg/dash/actions/runs/13357260369/job/37301757225#step:3:29), notice the step being run after "Initialize containers").
* There are no plans to implement definable "pre-steps" (see [actions/runner#812](actions/runner#812)) and the only way to implement "before" and "after" steps is by self-hosting ([source](https://docs.github.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners/running-scripts-before-or-after-a-job)). This has been tentatively sidestepped by omitting the TSan and multiprocess builds, as any attempted fix would require precision gutting for borderline space savings that could easily be nullified by future code changes that occupy more space; such measures are better reserved for future PRs.
* Artifacts share their storage quota with GitHub Packages ([source](https://docs.github.com/en/billing/managing-billing-for-your-products/managing-billing-for-github-packages/about-billing-for-github-packages#about-billing-for-github-packages)) and by default linger around for 90 days ([source](https://docs.github.com/en/actions/writing-workflows/choosing-what-your-workflow-does/storing-and-sharing-data-from-a-workflow#about-workflow-artifacts)); considering that each run can generate multiple gigabytes of artifacts in total, testing the upper limit runs the risk of being very expensive. **Edit:** It appears pricing applies to artifacts in private repos; public repos don't seem to run this risk.
* ~~All artifacts generated have an expiry of one day (compared to GitLab's three days, [source](https://github.com/dashpay/dash/blob/bb469687d3d936f82fd8e8fbe0934eec5e17df5e/.gitlab-ci.yml#L165), but they are self-hosted).~~ **Edit:** Artifacts now have an expiry of three days, matching GitLab.
* Artifacts are compressed as ZIP archives and there is no way around that as of now (see [actions/upload-artifact#109](actions/upload-artifact#109 (comment))); the permission loss this entails is acknowledged by GitHub, and their solution is... to put the files in a tarball ([source](https://github.com/actions/upload-artifact/blob/65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08/README.md#permission-loss)). To keep sizes under control, artifacts contain `zstd-5` compressed tarballs (compression level determined by benchmarks, see below) generated by bundling scripts, alongside their checksums.
<details>
<summary>Benchmarks:</summary>

```
$ zstd -b1 -e22 -T0 artifacts-linux64_multiprocess.tar
 1#_multiprocess.tar :1586411520 -> 537552492 (x2.951), 2396.2 MB/s, 1367.8 MB/s
 2#_multiprocess.tar :1586411520 -> 499098623 (x3.179), 2131.8 MB/s, 1306.6 MB/s
 3#_multiprocess.tar :1586411520 -> 474452284 (x3.344), 1371.6 MB/s, 1245.6 MB/s
 4#_multiprocess.tar :1586411520 -> 470931621 (x3.369),  620.3 MB/s, 1239.1 MB/s
 5#_multiprocess.tar :1586411520 -> 459075785 (x3.456),  457.2 MB/s, 1230.1 MB/s
 6#_multiprocess.tar :1586411520 -> 449594612 (x3.529),  415.3 MB/s, 1289.7 MB/s
 7#_multiprocess.tar :1586411520 -> 446208421 (x3.555),  282.6 MB/s, 1296.3 MB/s
 8#_multiprocess.tar :1586411520 -> 442797797 (x3.583),  254.3 MB/s, 1338.4 MB/s
 9#_multiprocess.tar :1586411520 -> 438690318 (x3.616),  210.8 MB/s, 1331.5 MB/s
10#_multiprocess.tar :1586411520 -> 437195147 (x3.629),  164.1 MB/s, 1337.4 MB/s
11#_multiprocess.tar :1586411520 -> 436501141 (x3.634),  108.2 MB/s, 1342.5 MB/s
12#_multiprocess.tar :1586411520 -> 436405679 (x3.635),  102.7 MB/s, 1344.0 MB/s
13#_multiprocess.tar :1586411520 -> 436340981 (x3.636),   65.9 MB/s, 1344.0 MB/s
14#_multiprocess.tar :1586411520 -> 435626720 (x3.642),   61.5 MB/s, 1346.9 MB/s
15#_multiprocess.tar :1586411520 -> 434882716 (x3.648),   49.4 MB/s, 1352.9 MB/s
16#_multiprocess.tar :1586411520 -> 411221852 (x3.858),   33.6 MB/s, 1049.2 MB/s
17#_multiprocess.tar :1586411520 -> 399523001 (x3.971),   26.0 MB/s, 1003.7 MB/s
18#_multiprocess.tar :1586411520 -> 379278765 (x4.183),   21.0 MB/s,  897.5 MB/s
19#_multiprocess.tar :1586411520 -> 378022246 (x4.197),   14.7 MB/s,  896.0 MB/s
20#_multiprocess.tar :1586411520 -> 375741653 (x4.222),   14.0 MB/s,  877.6 MB/s
21#_multiprocess.tar :1586411520 -> 373303486 (x4.250),   11.9 MB/s,  866.8 MB/s
22#_multiprocess.tar :1586411520 -> 358172556 (x4.429),   6.09 MB/s,  884.9 MB/s
```
</details>

> **Note:** As mentioned above, we use similar bundling scripts for the outputs cache but, unlike artifacts, we cannot disable their compression routines or even adjust compression levels (see [actions/toolkit#544](actions/toolkit#544)).

## Notes

* ~~If we add or remove binaries in terms of compile output, `bundle-build.sh` needs to be updated. Without updating it, it will fail if it cannot find the files it was looking for and it will not include files that it wasn't told to include.~~ No longer applicable.

## Breaking Changes

None expected.

## Checklist

- [x] I have performed a self-review of my own code
- [x] I have commented my code, particularly in hard-to-understand areas
- [x] I have added or updated relevant unit/integration/functional/e2e tests **(note: N/A)**
- [x] I have made corresponding changes to the documentation **(note: N/A)**
- [x] I have assigned this pull request to a milestone _(for repository code-owners and collaborators only)_

ACKs for top commit:
  PastaPastaPasta: utACK 3461c14
  UdjinM6: ACK 3461c14

Tree-SHA512: ef55bc10902c57673ffd9bee6562b362a87658e4c51e543b8553bf48c41544a302d6acad7c5a30395fbfcfd085354251a07327c3e78c93c750585496926be9f6
2 parents 00c1bf0 + 3461c14 commit 379c935

File tree

7 files changed: +268 -3 lines changed


.github/workflows/build-src.yml

+20-3
```diff
@@ -15,11 +15,17 @@ on:
         description: "Key needed to access cached depends"
         required: true
         type: string
+    outputs:
+      key:
+        description: "Key needed for restoring artifacts bundle"
+        value: ${{ jobs.build-src.outputs.key }}
 
 jobs:
   build-src:
     name: Build source
     runs-on: ubuntu-24.04
+    outputs:
+      key: ${{ steps.bundle.outputs.key }}
     container:
       image: ${{ inputs.container-path }}
       options: --user root
@@ -89,9 +95,20 @@ jobs:
           ./ci/dash/test_unittests.sh
         shell: bash
 
-      - name: Upload build artifacts
+      - name: Bundle artifacts
+        id: bundle
+        run: |
+          export BUILD_TARGET="${{ inputs.build-target }}"
+          export BUNDLE_KEY="build-${BUILD_TARGET}-$(git rev-parse --short=8 HEAD)"
+          ./ci/dash/bundle-artifacts.sh create
+          echo "key=${BUNDLE_KEY}" >> "${GITHUB_OUTPUT}"
+
+      - name: Upload artifacts
         uses: actions/upload-artifact@v4
         with:
-          name: build-artifacts-${{ inputs.build-target }}
+          name: ${{ steps.bundle.outputs.key }}
           path: |
-            /output
+            ${{ steps.bundle.outputs.key }}.tar.zst
+          compression-level: 0
+          overwrite: true
+          retention-days: 3
```
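
A hedged sketch of consuming the bundle uploaded by the step above: after downloading the artifact from the run page (it arrives as a ZIP named after the bundle key), the helper script from this PR unpacks it. The ZIP name and the `linux64` target below are placeholders, not values taken from this change.

```bash
# Hypothetical local consumption of a build bundle; names are examples only.
BUILD_TARGET="linux64"
BUNDLE_KEY="build-${BUILD_TARGET}-$(git rev-parse --short=8 HEAD)"

unzip "${BUNDLE_KEY}.zip"              # the artifact ZIP downloaded from the Actions UI
export BUILD_TARGET BUNDLE_KEY         # variables bundle-artifacts.sh expects
./ci/dash/bundle-artifacts.sh extract  # restores the build-ci/ tree from the .tar.zst
```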

.github/workflows/build.yml

+36
```diff
@@ -154,3 +154,39 @@ jobs:
       build-target: win64
       container-path: ${{ needs.container.outputs.path }}
       depends-key: ${{ needs.depends-win64.outputs.key }}
+
+  test-linux64:
+    name: linux64-test
+    uses: ./.github/workflows/test-src.yml
+    needs: [container, depends-linux64, src-linux64]
+    with:
+      bundle-key: ${{ needs.src-linux64.outputs.key }}
+      build-target: linux64
+      container-path: ${{ needs.container.outputs.path }}
+
+  test-linux64_nowallet:
+    name: linux64_nowallet-test
+    uses: ./.github/workflows/test-src.yml
+    needs: [container, depends-linux64_nowallet, src-linux64_nowallet]
+    with:
+      bundle-key: ${{ needs.src-linux64_nowallet.outputs.key }}
+      build-target: linux64_nowallet
+      container-path: ${{ needs.container.outputs.path }}
+
+  test-linux64_sqlite:
+    name: linux64_sqlite-test
+    uses: ./.github/workflows/test-src.yml
+    needs: [container, depends-linux64, src-linux64_sqlite]
+    with:
+      bundle-key: ${{ needs.src-linux64_sqlite.outputs.key }}
+      build-target: linux64_sqlite
+      container-path: ${{ needs.container.outputs.path }}
+
+  test-linux64_ubsan:
+    name: linux64_ubsan-test
+    uses: ./.github/workflows/test-src.yml
+    needs: [container, depends-linux64, src-linux64_ubsan]
+    with:
+      bundle-key: ${{ needs.src-linux64_ubsan.outputs.key }}
+      build-target: linux64_ubsan
+      container-path: ${{ needs.container.outputs.path }}
```

.github/workflows/test-src.yml

+83
```diff
@@ -0,0 +1,83 @@
+name: Test source
+
+on:
+  workflow_call:
+    inputs:
+      bundle-key:
+        description: "Key needed to access bundle of artifacts"
+        required: true
+        type: string
+      build-target:
+        description: "Target name as defined by inputs.sh"
+        required: true
+        type: string
+      container-path:
+        description: "Path to built container at registry"
+        required: true
+        type: string
+
+env:
+  INTEGRATION_TESTS_ARGS: "--extended --exclude feature_pruning,feature_dbcrash"
+
+jobs:
+  test-src:
+    name: Test source
+    runs-on: ubuntu-24.04
+    container:
+      image: ${{ inputs.container-path }}
+      options: --user root
+    steps:
+      - name: Checkout code
+        uses: actions/checkout@v4
+        with:
+          ref: ${{ github.event.pull_request.head.sha }}
+          fetch-depth: 1
+
+      - name: Download build artifacts
+        uses: actions/download-artifact@v4
+        with:
+          name: ${{ inputs.bundle-key }}
+
+      - name: Manage releases cache
+        uses: actions/cache@v4
+        if: inputs.build-target == 'linux64'
+        with:
+          path: |
+            releases
+          key: releases-${{ hashFiles('ci/test/00_setup_env_native_qt5.sh', 'test/get_previous_releases.py') }}
+
+      - name: Run functional tests
+        id: test
+        run: |
+          git config --global --add safe.directory "$PWD"
+          export BUILD_TARGET="${{ inputs.build-target }}"
+          export BUNDLE_KEY="${{ inputs.bundle-key }}"
+          ./ci/dash/bundle-artifacts.sh extract
+          ./ci/dash/slim-workspace.sh
+          source ./ci/dash/matrix.sh
+          ./ci/dash/test_integrationtests.sh ${INTEGRATION_TESTS_ARGS}
+        shell: bash
+
+      - name: Bundle test logs
+        id: bundle
+        if: success() || (failure() && steps.test.outcome == 'failure')
+        run: |
+          export BUILD_TARGET="${{ inputs.build-target }}"
+          echo "short-sha=$(git rev-parse --short=8 HEAD)" >> "${GITHUB_OUTPUT}"
+          ( [ -d "testlogs" ] && echo "upload-logs=true" >> "${GITHUB_OUTPUT}" && ./ci/dash/bundle-logs.sh ) \
+            || echo "upload-logs=false" >> "${GITHUB_OUTPUT}"
+        shell: bash
+
+      - name: Upload test logs
+        uses: actions/upload-artifact@v4
+        if: |
+          success() || (failure() && steps.test.outcome == 'failure')
+          && steps.bundle.outputs.upload-logs == 'true'
+        with:
+          name: test_logs-${{ inputs.build-target }}-${{ steps.bundle.outputs.short-sha }}
+          path: |
+            test_logs-${{ inputs.build-target }}.tar.zst
+            test_logs-${{ inputs.build-target }}.tar.zst.sha256
+          compression-level: 0
+          overwrite: true
+          retention-days: 1
```
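
For local debugging, the "Run functional tests" step above amounts to roughly the following sequence; a sketch that assumes the build bundle is already present in the working tree and that the CI container's toolchain and Python dependencies are available, with `linux64` used purely as an example target.

```bash
# Rough local equivalent of the "Run functional tests" step; assumes the
# ${BUNDLE_KEY}.tar.zst bundle is already in the working tree. Values are examples.
export BUILD_TARGET="linux64"
export BUNDLE_KEY="build-${BUILD_TARGET}-$(git rev-parse --short=8 HEAD)"

./ci/dash/bundle-artifacts.sh extract   # unpack build-ci/ from the bundle
./ci/dash/slim-workspace.sh             # drop binaries the functional tests don't need
source ./ci/dash/matrix.sh              # load the per-target environment
./ci/dash/test_integrationtests.sh --extended --exclude feature_pruning,feature_dbcrash
```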

ci/dash/bundle-artifacts.sh

+59
```diff
@@ -0,0 +1,59 @@
+#!/usr/bin/env bash
+# Copyright (c) 2024-2025 The Dash Core developers
+# Distributed under the MIT software license, see the accompanying
+# file COPYING or http://www.opensource.org/licenses/mit-license.php.
+
+export LC_ALL=C.UTF-8
+
+set -eo pipefail
+
+SH_NAME="$(basename "${0}")"
+VERB="${1}"
+
+if [ -z "${BUILD_TARGET}" ]; then
+    echo "${SH_NAME}: BUILD_TARGET not defined, cannot continue!";
+    exit 1;
+elif [ -z "${BUNDLE_KEY}" ]; then
+    echo "${SH_NAME}: BUNDLE_KEY not defined, cannot continue!";
+    exit 1;
+elif [ ! "$(command -v zstd)" ]; then
+    echo "${SH_NAME}: zstd not found, cannot continue!";
+    exit 1;
+elif [ -z "${VERB}" ]; then
+    echo "${SH_NAME}: Verb missing, acceptable values 'create' or 'extract'";
+    exit 1;
+elif [ "${VERB}" != "create" ] && [ "${VERB}" != "extract" ]; then
+    echo "${SH_NAME}: Invalid verb '${VERB}', expected 'create' or 'extract'";
+    exit 1;
+fi
+
+OUTPUT_ARCHIVE="${BUNDLE_KEY}.tar.zst"
+if [ -f "${OUTPUT_ARCHIVE}" ] && [ "${VERB}" = "create" ]; then
+    echo "${SH_NAME}: ${OUTPUT_ARCHIVE} already exists, cannot continue!";
+    exit 1;
+elif [ ! -f "${OUTPUT_ARCHIVE}" ] && [ "${VERB}" = "extract" ]; then
+    echo "${SH_NAME}: ${OUTPUT_ARCHIVE} missing, cannot continue!";
+    exit 1;
+fi
+
+if [ "${VERB}" = "create" ]; then
+    EXCLUSIONS=(
+        "*.a"
+        "*.o"
+        ".deps"
+        ".libs"
+    )
+    EXCLUSIONS_ARG=""
+    for excl in "${EXCLUSIONS[@]}"
+    do
+        EXCLUSIONS_ARG+=" --exclude=${excl}";
+    done
+
+    # shellcheck disable=SC2086
+    tar ${EXCLUSIONS_ARG} --use-compress-program="zstd -T0 -5" -cf "${OUTPUT_ARCHIVE}" "build-ci";
+elif [ "${VERB}" = "extract" ]; then
+    tar --use-compress-program="unzstd" -xf "${OUTPUT_ARCHIVE}";
+else
+    echo "${SH_NAME}: Generic error";
+    exit 1;
+fi
```
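
For reference, a round-trip with this script could look like the following; the target and key values are illustrative, not mandated by the script.

```bash
# Illustrative round-trip; BUILD_TARGET/BUNDLE_KEY values are examples only.
export BUILD_TARGET="linux64"
export BUNDLE_KEY="build-${BUILD_TARGET}-$(git rev-parse --short=8 HEAD)"

./ci/dash/bundle-artifacts.sh create    # pack build-ci/ into ${BUNDLE_KEY}.tar.zst,
                                        # skipping *.a, *.o, .deps and .libs
./ci/dash/bundle-artifacts.sh extract   # on the consuming job: unpack the archive
```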

ci/dash/bundle-logs.sh

+28
```diff
@@ -0,0 +1,28 @@
+#!/usr/bin/env bash
+# Copyright (c) 2025 The Dash Core developers
+# Distributed under the MIT software license, see the accompanying
+# file COPYING or http://www.opensource.org/licenses/mit-license.php.
+
+export LC_ALL=C.UTF-8
+
+set -eo pipefail
+
+SH_NAME="$(basename "${0}")"
+LOG_DIRECTORY="testlogs"
+
+if [ ! -d "${LOG_DIRECTORY}" ]; then
+    echo "${SH_NAME}: '${LOG_DIRECTORY}' directory missing, will skip!";
+    exit 0;
+elif [ -z "${BUILD_TARGET}" ]; then
+    echo "${SH_NAME}: BUILD_TARGET not defined, cannot continue!";
+    exit 1;
+fi
+
+LOG_ARCHIVE="test_logs-${BUILD_TARGET}.tar.zst"
+if [ -f "${LOG_ARCHIVE}" ]; then
+    echo "${SH_NAME}: ${LOG_ARCHIVE} already exists, cannot continue!";
+    exit 1;
+fi
+
+tar --use-compress-program="zstd -T0 -5" -cf "${LOG_ARCHIVE}" "${LOG_DIRECTORY}"
+sha256sum "${LOG_ARCHIVE}" > "${LOG_ARCHIVE}.sha256";
```
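
Assuming a `testlogs/` directory left behind by a test run, the archive and its checksum can be exercised like this (the target name is an example):

```bash
# Example: bundle functional test logs and verify the recorded checksum.
# Assumes testlogs/ exists; "linux64" is an example BUILD_TARGET.
export BUILD_TARGET="linux64"
./ci/dash/bundle-logs.sh                                 # writes test_logs-linux64.tar.zst (+ .sha256)

sha256sum -c "test_logs-${BUILD_TARGET}.tar.zst.sha256"  # verify integrity after download
tar --use-compress-program=unzstd -tf "test_logs-${BUILD_TARGET}.tar.zst" | head  # peek at contents
```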

ci/dash/slim-workspace.sh

+41
```diff
@@ -0,0 +1,41 @@
+#!/usr/bin/env bash
+# Copyright (c) 2025 The Dash Core developers
+# Distributed under the MIT software license, see the accompanying
+# file COPYING or http://www.opensource.org/licenses/mit-license.php.
+
+export LC_ALL=C.UTF-8
+
+set -eo pipefail
+
+SH_NAME="$(basename "${0}")"
+
+if [ -z "${BUILD_TARGET}" ]; then
+    echo "${SH_NAME}: BUILD_TARGET not defined, cannot continue!";
+    exit 1;
+elif [ -z "${BUNDLE_KEY}" ]; then
+    echo "${SH_NAME}: BUNDLE_KEY not defined, cannot continue!";
+    exit 1;
+fi
+
+TARGETS=(
+    # Bundle restored from artifact
+    "${BUNDLE_KEY}.tar.zst"
+    # Binaries not needed by functional tests
+    "build-ci/dashcore-${BUILD_TARGET}/src/dash-tx"
+    "build-ci/dashcore-${BUILD_TARGET}/src/bench/bench_dash"
+    "build-ci/dashcore-${BUILD_TARGET}/src/qt/dash-qt"
+    "build-ci/dashcore-${BUILD_TARGET}/src/qt/test/test_dash-qt"
+    "build-ci/dashcore-${BUILD_TARGET}/src/test/test_dash"
+    "build-ci/dashcore-${BUILD_TARGET}/src/test/fuzz/fuzz"
+    # Misc. files that can be heavy
+    "build-ci/dashcore-${BUILD_TARGET}/src/qt/qrc_bitcoin.cpp"
+    "build-ci/dashcore-${BUILD_TARGET}/src/qt/qrc_dash_locale.cpp"
+)
+
+# Delete directories we don't need
+for target in "${TARGETS[@]}"
+do
+    if [[ -d "${target}" ]] || [[ -f "${target}" ]]; then
+        rm -rf "${target}";
+    fi
+done
```
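
A quick, hedged way to see what the slimming buys on a given build; sizes will vary, and the target and key below are placeholders.

```bash
# Example: measure the space reclaimed by slim-workspace.sh; values are placeholders.
export BUILD_TARGET="linux64"
export BUNDLE_KEY="build-${BUILD_TARGET}-$(git rev-parse --short=8 HEAD)"

du -sh build-ci              # size before slimming
./ci/dash/slim-workspace.sh  # removes the restored bundle and unneeded binaries/generated sources
du -sh build-ci              # size after slimming
```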

contrib/containers/ci/Dockerfile

+1
```diff
@@ -184,6 +184,7 @@ RUN apt-get update && apt-get install $APT_ARGS \
     wine-stable \
     wine64 \
     zip \
+    zstd \
     && rm -rf /var/lib/apt/lists/*
 
 # Make sure std::thread and friends is available
```
