
test_runner: use LFC by default #8613

Open · bayandin wants to merge 106 commits into main from bayandin/regress-tests-use-LFC

Conversation

@bayandin (Member) commented Aug 6, 2024

Problem

We don't currently test LFC (Local File Cache) enough, because it is disabled by default in the test runner.

Summary of changes

LFC is now enabled by default; set the environment variable USE_LFC=false to disable it.
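For illustration, a minimal sketch of how such a toggle could be wired in the Python test fixtures. Only the USE_LFC=false opt-out comes from this PR; the helper names and the neon.* GUC names below are assumptions, not necessarily what the PR implements:

import os

# Hypothetical helper: decide whether LFC should be enabled for a test endpoint.
# Only the USE_LFC=false opt-out is taken from this PR; the rest is illustrative.
def lfc_enabled() -> bool:
    return os.environ.get("USE_LFC", "true").lower() != "false"

# Hypothetical helper: config lines to pass to an endpoint when LFC is enabled.
# The neon.* GUC names are assumptions modeled on Neon's LFC settings.
def default_lfc_config_lines() -> list[str]:
    if not lfc_enabled():
        return []
    return [
        "neon.max_file_cache_size='1MB'",
        "neon.file_cache_size_limit='1MB'",
    ]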

Checklist before requesting a review

  • I have performed a self-review of my code.
  • If it is a core feature, I have added thorough tests.
  • Do we need to implement analytics? If so, did you add the relevant metrics to the dashboard?
  • If this PR requires public announcement, mark it with /release-notes label and add several sentences in this section.

Checklist before merging

  • Do not forget to reformat the commit message so it does not include the above checklist.

github-actions bot commented Aug 6, 2024

6512 tests run: 6228 passed, 1 failed, 283 skipped (full report)


Failures on Postgres 17

# Run all failed tests locally:
scripts/pytest -vv -n $(nproc) -k "test_tenant_import[debug-pg17-None-local_fs]"
Flaky tests (1): Postgres 16

Test coverage report is not available

This comment gets automatically updated with the latest test results.
Last update: 4eed077 at 2024-11-06T09:23:23.366Z

@koivunej (Member) left a comment

I see no problem with enabling LFC by default; it's a great idea. Keeping the current amount of caching (1MB by default) might be the best way to avoid introducing flakiness.

Note that we have some regress and perf tests which configure shared_buffers. Perhaps those could also use a larger LFC? A sketch of what that might look like follows the search output below.

$ rg 'shared_buffers\s*=' test_runner/
test_runner/fixtures/compare_fixtures.py
196:                "shared_buffers=1MB",

test_runner/performance/test_compaction.py
38:        "main", tenant_id=tenant_id, config_lines=["shared_buffers=512MB"]
82:        "main", tenant_id=tenant_id, config_lines=["shared_buffers=512MB"]

test_runner/performance/test_startup.py
52:                    config_lines=["shared_buffers=262144"],

test_runner/performance/test_seqscans.py
55:            shared_buffers = row[0]

test_runner/performance/pageserver/pagebench/test_large_slru_basebackup.py
92:        "main", tenant_id=template_tenant, config_lines=["shared_buffers=1MB"]

test_runner/regress/test_lfc_working_set_approximation.py
21:            "shared_buffers='1MB'",
85:            "shared_buffers=1MB",

test_runner/regress/test_pageserver_catchup.py
15:        "test_pageserver_catchup_while_compute_down", config_lines=["shared_buffers=512MB"]

test_runner/regress/test_pg_query_cancellation.py
52:            "shared_buffers = 128MB",

test_runner/regress/test_local_file_cache.py
24:            "shared_buffers='1MB'",

test_runner/regress/test_wal_acceptor.py
2151:        config_lines=["shared_buffers=1MB"],

test_runner/regress/test_explain_with_lfc_stats.py
19:            "shared_buffers='1MB'"
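For the tests above that already raise shared_buffers, a correspondingly larger LFC could be set through the same config_lines mechanism. A minimal sketch, assuming the neon.* GUC names below (they are not confirmed by this PR) and the usual env/tenant_id fixtures:

# Hypothetical tweak for a test that currently only raises shared_buffers:
# also raise the LFC size via the same config_lines pattern used above.
# `env` and `tenant_id` come from the test's existing fixtures; the neon.*
# GUC names are assumptions.
def start_endpoint_with_large_lfc(env, tenant_id):
    return env.endpoints.create_start(
        "main",
        tenant_id=tenant_id,
        config_lines=[
            "shared_buffers=512MB",
            "neon.max_file_cache_size='512MB'",
            "neon.file_cache_size_limit='512MB'",
        ],
    )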

@bayandin added the run-benchmarks label (indicates to the CI that benchmarks should be run for this PR) on Aug 6, 2024
@bayandin force-pushed the bayandin/regress-tests-use-LFC branch from 878f504 to f5f761d on August 6, 2024 16:55
@bayandin (Member, Author) commented

@a-masterov is going to continue the work on this PR

@a-masterov force-pushed the bayandin/regress-tests-use-LFC branch from 5013cf4 to 865d173 on October 30, 2024 12:53
@a-masterov force-pushed the bayandin/regress-tests-use-LFC branch from a1c99d2 to 7953fd1 on October 31, 2024 16:30
@bayandin (Member, Author) commented Nov 4, 2024

Last Week:

  • LFC made tests more flaky due to higher disk IO

This Week:

  • Need to figure out what to do about the increased disk IO
  • Code review from @bayandin
