
New design of system tests #22311

Merged
31 commits merged on Mar 25, 2022
Show file tree
Hide file tree
Changes from 25 commits
31 commits
d1360e2
New design of Airflow System Tests
Mar 10, 2022
ccb1334
Remove Bigquery example dags already migrated to new system tests
Mar 10, 2022
653f90c
Merge branch 'apache:main' into new_system_tests
bhirsz Mar 10, 2022
fc16df6
Rename test files with 'example_*' as a prefix
mnojek Mar 14, 2022
b97f75a
fixup! Rename test files with 'example_*' as a prefix
potiuk Mar 14, 2022
ba9a389
fixup! fixup! Rename test files with 'example_*' as a prefix
potiuk Mar 14, 2022
abfef7d
Fix always tests to work with example_* prefix
Mar 15, 2022
d9106a1
Skip tests if required env vars are not defined
Mar 15, 2022
8342b97
Fix paths to example dags in docs
Mar 15, 2022
35689e4
Use autofixture to skip tests with missing env vars
Mar 15, 2022
ec25fea
Remove deprecated doc string
Mar 15, 2022
326a9fc
Merge branch 'main' into new_system_tests
bhirsz Mar 16, 2022
7acc96e
Auto-format BREEZE.rst
Mar 16, 2022
28c07c3
Fix missing documentation tag end
Mar 16, 2022
5c54739
Remove unnecessary conversion to list
Mar 18, 2022
f083a08
Remove redundant between word
Mar 18, 2022
efb9fc7
pytest_configure -> auto-fixture
Mar 21, 2022
b5b0d7d
Replace test_run method in system tests by common method
Mar 21, 2022
b5c05c7
Mock os.environ with AIRFLOW__CORE__EXECUTOR env
Mar 21, 2022
e44a778
Update pre-commit hook, move watcher import, small refactor
mnojek Mar 21, 2022
11be7c0
Update docs lint checks with new system tests design
Mar 23, 2022
e338bb7
Merge branch 'new_system_tests' of https://github.com/lwyszomi/airflo…
Mar 23, 2022
00999c5
Merge branch 'main' into new_system_tests
bhirsz Mar 23, 2022
a9c7bc1
Remove whitespace in docs/exts/docs_build/lint_checks.py
mnojek Mar 23, 2022
fa1e1a3
Update pytest function and pre-commit hook, improve docs
mnojek Mar 23, 2022
90813e7
Merge branch 'main' into new_system_tests
mnojek Mar 24, 2022
d321fab
fixup! Merge branch 'main' into new_system_tests
potiuk Mar 24, 2022
94309a3
Merge branch 'main' into new_system_tests
mnojek Mar 24, 2022
a5029e1
Merge branch 'main' into new_system_tests
mnojek Mar 25, 2022
3832e47
Update check_system_tests precommit pattern
Mar 25, 2022
4e37d3b
Merge branch 'main' into new_system_tests
bhirsz Mar 25, 2022
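Taken together, the commits describe a common shape for the new system tests: each test is an ordinary example DAG stored under ``tests/system/.../example_*.py``, wires all of its tasks into a ``watcher`` task so that a failed teardown fails the whole run, and exposes a pytest entry point through a shared ``get_test_run`` helper instead of a per-file ``test_run`` method. A minimal sketch of that shape; the DAG body, the resource names, and the ``SYSTEM_TESTS_ENV_ID`` variable are illustrative assumptions rather than code taken from this PR:

.. code-block:: python

    """Sketch of a new-style system test, e.g. tests/system/providers/google/bigquery/example_bigquery_dataset.py."""
    import os
    from datetime import datetime

    from airflow import models
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryCreateEmptyDatasetOperator,
        BigQueryDeleteDatasetOperator,
    )

    ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")  # assumed env var used to namespace resources
    DAG_ID = "example_bigquery_dataset"
    DATASET_NAME = f"dataset_{DAG_ID}_{ENV_ID}"

    with models.DAG(
        DAG_ID,
        schedule_interval="@once",
        start_date=datetime(2021, 1, 1),
        catchup=False,
        tags=["example", "bigquery"],
    ) as dag:
        create_dataset = BigQueryCreateEmptyDatasetOperator(
            task_id="create_dataset", dataset_id=DATASET_NAME
        )
        delete_dataset = BigQueryDeleteDatasetOperator(
            task_id="delete_dataset", dataset_id=DATASET_NAME, delete_contents=True
        )
        create_dataset >> delete_dataset

        from tests.system.utils.watcher import watcher

        # Route every task through the watcher so a failed teardown marks the run as failed.
        list(dag.tasks) >> watcher()

    from tests.system.utils import get_test_run  # noqa: E402

    # Shared helper that replaces the per-file test_run() methods mentioned in the commits.
    test_run = get_test_run(dag)

Keeping the test file identical to the documented example DAG is the main design choice here: the same file is rendered in the docs via ``exampleinclude`` and executed by pytest as a system test.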
7 changes: 7 additions & 0 deletions .pre-commit-config.yaml
@@ -590,6 +590,13 @@ repos:
pass_filenames: false
require_serial: true
additional_dependencies: ['rich']
- id: check-system-tests
name: Check if system tests have required segments of code
entry: ./scripts/ci/pre_commit/pre_commit_check_system_tests.py
language: python
files: ^tests/system/.*/example_[^/]*.py$
pass_filenames: true
additional_dependencies: ['rich']
- id: markdownlint
name: Run markdownlint
description: Checks the style of Markdown files.
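The new ``check-system-tests`` hook receives the matched ``example_*.py`` files as arguments (``pass_filenames: true``) and verifies that each one contains the code segments required by the new design. The actual script is not part of this excerpt; a rough sketch of the idea, with the required segments assumed to be the ``watcher`` hook-up and the ``get_test_run`` call:

.. code-block:: python

    #!/usr/bin/env python
    """Hypothetical sketch of scripts/ci/pre_commit/pre_commit_check_system_tests.py."""
    import re
    import sys
    from pathlib import Path

    # Assumed required segments; the real hook may check different patterns.
    REQUIRED_SEGMENTS = [
        re.compile(r"from tests\.system\.utils\.watcher import watcher"),
        re.compile(r"test_run = get_test_run\(dag\)"),
    ]

    errors = []
    for file_name in sys.argv[1:]:  # pre-commit passes the matched files as arguments
        content = Path(file_name).read_text()
        for pattern in REQUIRED_SEGMENTS:
            if not pattern.search(content):
                errors.append(f"{file_name}: missing required segment {pattern.pattern!r}")

    if errors:
        print("\n".join(errors))
        sys.exit(1)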
6 changes: 3 additions & 3 deletions BREEZE.rst
@@ -2293,9 +2293,9 @@ This is the current syntax for `./breeze <./breeze>`_:
changelog-duplicates check-apache-license check-builtin-literals
check-executables-have-shebangs check-extras-order check-hooks-apply
check-integrations check-merge-conflict check-ti-run-id-in-providers check-xml
daysago-import-check debug-statements detect-private-key docstring-params doctoc
dont-use-safe-filter end-of-file-fixer fix-encoding-pragma flake8 flynt
forbidden-xcom-get-value codespell forbid-tabs helm-lint identity
check-system-tests daysago-import-check debug-statements detect-private-key
docstring-params doctoc dont-use-safe-filter end-of-file-fixer fix-encoding-pragma
flake8 flynt forbidden-xcom-get-value codespell forbid-tabs helm-lint identity
incorrect-use-of-LoggingMixin insert-license isort json-schema language-matters
lint-dockerfile lint-openapi markdownlint mermaid migration-reference
mixed-line-ending mypy mypy-helm no-providers-in-core-examples no-relative-imports
2 changes: 2 additions & 0 deletions STATIC_CODE_CHECKS.rst
@@ -168,6 +168,8 @@ require Breeze Docker images to be installed locally.
------------------------------------ ---------------------------------------------------------------- ------------
``check-xml`` Checks XML files with xmllint
------------------------------------ ---------------------------------------------------------------- ------------
``check-system-tests`` Check if system tests have required segments of code
------------------------------------ ---------------------------------------------------------------- ------------
``daysago-import-check`` Checks if daysago is properly imported
------------------------------------ ---------------------------------------------------------------- ------------
``debug-statements`` Detects accidentally committed debug statements
1 change: 1 addition & 0 deletions breeze-complete
@@ -99,6 +99,7 @@ check-integrations
check-merge-conflict
check-ti-run-id-in-providers
check-xml
check-system-tests
daysago-import-check
debug-statements
detect-private-key
1 change: 1 addition & 0 deletions dev/breeze/src/airflow_breeze/pre_commit_ids.py
@@ -40,6 +40,7 @@
'check-hooks-apply',
'check-integrations',
'check-merge-conflict',
'check-system-tests',
'check-ti-run-id-in-providers',
'check-xml',
'codespell',
61 changes: 34 additions & 27 deletions docs/apache-airflow-providers-google/operators/cloud/bigquery.rst
@@ -29,7 +29,7 @@ data.
Prerequisite Tasks
^^^^^^^^^^^^^^^^^^

.. include::/operators/_partials/prerequisite_tasks.rst
.. include:: ../_partials/prerequisite_tasks.rst

Manage datasets
^^^^^^^^^^^^^^^
@@ -42,7 +42,7 @@ Create dataset
To create an empty dataset in a BigQuery database you can use
:class:`~airflow.providers.google.cloud.operators.bigquery.BigQueryCreateEmptyDatasetOperator`.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_operations.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_dataset.py
:language: python
:dedent: 4
:start-after: [START howto_operator_bigquery_create_dataset]
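For orientation, a minimal sketch of how this operator is typically instantiated; the dataset name and location are placeholders rather than values from the linked example:

.. code-block:: python

    from airflow.providers.google.cloud.operators.bigquery import BigQueryCreateEmptyDatasetOperator

    create_dataset = BigQueryCreateEmptyDatasetOperator(
        task_id="create_dataset",
        dataset_id="my_test_dataset",  # placeholder name
        location="US",  # optional
    )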
@@ -58,7 +58,7 @@ To get the details of an existing dataset you can use

This operator returns a `Dataset Resource <https://cloud.google.com/bigquery/docs/reference/rest/v2/datasets#resource>`__.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_operations.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_dataset.py
:language: python
:dedent: 4
:start-after: [START howto_operator_bigquery_get_dataset]
@@ -72,7 +72,7 @@ List tables in dataset
To retrieve the list of tables in a given dataset use
:class:`~airflow.providers.google.cloud.operators.bigquery.BigQueryGetDatasetTablesOperator`.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_operations.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_tables.py
:language: python
:dedent: 4
:start-after: [START howto_operator_bigquery_get_dataset_tables]
@@ -89,7 +89,7 @@ To update a table in BigQuery you can use
The update method replaces the entire Table resource, whereas the patch
method only replaces fields that are provided in the submitted Table resource.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_operations.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_tables.py
:language: python
:dedent: 4
:start-after: [START howto_operator_bigquery_update_table]
@@ -106,7 +106,7 @@ To update a dataset in BigQuery you can use
The update method replaces the entire dataset resource, whereas the patch
method only replaces fields that are provided in the submitted dataset resource.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_operations.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_dataset.py
:language: python
:dedent: 4
:start-after: [START howto_operator_bigquery_update_dataset]
@@ -120,7 +120,7 @@ Delete dataset
To delete an existing dataset from a BigQuery database you can use
:class:`~airflow.providers.google.cloud.operators.bigquery.BigQueryDeleteDatasetOperator`.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_operations.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_dataset.py
:language: python
:dedent: 4
:start-after: [START howto_operator_bigquery_delete_dataset]
@@ -143,15 +143,15 @@ ways. You may either directly pass the schema fields in, or you may point the
operator to a Google Cloud Storage object name. The object in Google Cloud
Storage must be a JSON file with the schema fields in it.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_operations.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_tables.py
:language: python
:dedent: 4
:start-after: [START howto_operator_bigquery_create_table]
:end-before: [END howto_operator_bigquery_create_table]
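A hedged sketch of the inline-schema variant; table and field names are placeholders:

.. code-block:: python

    from airflow.providers.google.cloud.operators.bigquery import BigQueryCreateEmptyTableOperator

    create_table = BigQueryCreateEmptyTableOperator(
        task_id="create_table",
        dataset_id="my_test_dataset",
        table_id="my_test_table",
        schema_fields=[
            {"name": "emp_name", "type": "STRING", "mode": "REQUIRED"},
            {"name": "salary", "type": "INTEGER", "mode": "NULLABLE"},
        ],
    )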

You can use this operator to create a view on top of an existing table.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_operations.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_tables.py
:language: python
:dedent: 4
:start-after: [START howto_operator_bigquery_create_view]
@@ -160,7 +160,7 @@ You can use this operator to create a view on top of an existing table.
You can also use this operator to create a materialized view that periodically
caches the results of a query for increased performance and efficiency.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_operations.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_tables.py
:language: python
:dedent: 4
:start-after: [START howto_operator_bigquery_create_materialized_view]
@@ -177,15 +177,22 @@ you can use

Similarly to
:class:`~airflow.providers.google.cloud.operators.bigquery.BigQueryCreateEmptyTableOperator`
you may either directly pass the schema fields in, or you may point the operator
to a Google Cloud Storage object name.
you can directly pass the schema fields in.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_operations.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_operations.py
:language: python
:dedent: 4
:start-after: [START howto_operator_bigquery_create_external_table]
:end-before: [END howto_operator_bigquery_create_external_table]

Or you may point the operator to a Google Cloud Storage object name where the schema is stored.

.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_tables.py
:language: python
:dedent: 4
:start-after: [START howto_operator_bigquery_create_table_schema_json]
:end-before: [END howto_operator_bigquery_create_table_schema_json]
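A rough sketch of the external-table case, assuming the ``table_resource`` form of the operator; the project, bucket, and schema values are placeholders:

.. code-block:: python

    from airflow.providers.google.cloud.operators.bigquery import BigQueryCreateExternalTableOperator

    create_external_table = BigQueryCreateExternalTableOperator(
        task_id="create_external_table",
        table_resource={
            "tableReference": {
                "projectId": "my-project",
                "datasetId": "my_test_dataset",
                "tableId": "external_table",
            },
            "externalDataConfiguration": {
                "sourceFormat": "CSV",
                "sourceUris": ["gs://my-source-bucket/data/sample.csv"],
                "schema": {"fields": [{"name": "emp_name", "type": "STRING"}]},
            },
        },
    )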

.. _howto/operator:BigQueryGetDataOperator:

Fetch data from table
@@ -201,7 +208,7 @@ returned list will be equal to the number of rows fetched. Each element in the
list will again be a list where elements would represent the column values for
that row.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_queries.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_queries.py
:language: python
:dedent: 8
:start-after: [START howto_operator_bigquery_get_data]
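A minimal sketch of the call; dataset, table, and field names are placeholders:

.. code-block:: python

    from airflow.providers.google.cloud.operators.bigquery import BigQueryGetDataOperator

    get_data = BigQueryGetDataOperator(
        task_id="get_data",
        dataset_id="my_test_dataset",
        table_id="my_test_table",
        max_results=10,
        selected_fields="emp_name,salary",  # comma-separated subset of columns
    )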
@@ -218,7 +225,7 @@ To upsert a table you can use
This operator either updates the existing table or creates a new, empty table
in the given dataset.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_operations.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_tables.py
:language: python
:dedent: 4
:start-after: [START howto_operator_bigquery_upsert_table]
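An illustrative sketch only; the ``table_resource`` payload below is an assumption, not the linked example:

.. code-block:: python

    import time

    from airflow.providers.google.cloud.operators.bigquery import BigQueryUpsertTableOperator

    upsert_table = BigQueryUpsertTableOperator(
        task_id="upsert_table",
        dataset_id="my_test_dataset",
        table_resource={
            "tableReference": {"tableId": "my_test_table"},
            # Expiration time is given in milliseconds since the epoch.
            "expirationTime": (int(time.time()) + 300) * 1000,
        },
    )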
@@ -235,7 +242,7 @@ To update the schema of a table you can use
This operator updates the schema field values supplied, while leaving the rest unchanged. This is useful
for instance to set new field descriptions on an existing table schema.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_operations.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_tables.py
:language: python
:dedent: 4
:start-after: [START howto_operator_bigquery_update_table_schema]
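A small sketch of such an update, assuming the ``schema_fields_updates`` parameter; field names and descriptions are placeholders:

.. code-block:: python

    from airflow.providers.google.cloud.operators.bigquery import BigQueryUpdateTableSchemaOperator

    update_table_schema = BigQueryUpdateTableSchemaOperator(
        task_id="update_table_schema",
        dataset_id="my_test_dataset",
        table_id="my_test_table",
        schema_fields_updates=[
            {"name": "emp_name", "description": "Name of the employee"},
            {"name": "salary", "description": "Monthly salary in USD"},
        ],
    )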
@@ -249,23 +256,23 @@ Delete table
To delete an existing table you can use
:class:`~airflow.providers.google.cloud.operators.bigquery.BigQueryDeleteTableOperator`.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_operations.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_tables.py
:language: python
:dedent: 4
:start-after: [START howto_operator_bigquery_delete_table]
:end-before: [END howto_operator_bigquery_delete_table]

You can also use this operator to delete a view.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_operations.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_tables.py
:language: python
:dedent: 4
:start-after: [START howto_operator_bigquery_delete_view]
:end-before: [END howto_operator_bigquery_delete_view]

You can also use this operator to delete a materialized view.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_operations.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_tables.py
:language: python
:dedent: 4
:start-after: [START howto_operator_bigquery_delete_materialized_view]
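A minimal sketch of a table deletion; the fully qualified table name is a placeholder:

.. code-block:: python

    from airflow.providers.google.cloud.operators.bigquery import BigQueryDeleteTableOperator

    delete_table = BigQueryDeleteTableOperator(
        task_id="delete_table",
        deletion_dataset_table="my-project.my_test_dataset.my_test_table",
        ignore_if_missing=True,  # do not fail if the table is already gone
    )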
@@ -278,7 +285,7 @@ Execute BigQuery jobs

Let's say you would like to execute the following query.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_queries.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_queries.py
:language: python
:dedent: 0
:start-after: [START howto_operator_bigquery_query]
@@ -288,7 +295,7 @@ To execute the SQL query in a specific BigQuery database you can use
:class:`~airflow.providers.google.cloud.operators.bigquery.BigQueryInsertJobOperator` with
proper query job configuration that can be Jinja templated.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_queries.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_queries.py
:language: python
:dedent: 8
:start-after: [START howto_operator_bigquery_insert_job]
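A minimal sketch of the job configuration; the query string is a placeholder and could equally be Jinja-templated:

.. code-block:: python

    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

    insert_query_job = BigQueryInsertJobOperator(
        task_id="insert_query_job",
        configuration={
            "query": {
                "query": "SELECT 1;",  # placeholder query
                "useLegacySql": False,
            }
        },
    )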
@@ -300,7 +307,7 @@ For more information on types of BigQuery job please check
If you want to include some files in your configuration, you can use the ``include`` clause of the Jinja
template language as follows:

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_queries.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_queries.py
:language: python
:dedent: 8
:start-after: [START howto_operator_bigquery_select_job]
@@ -329,7 +336,7 @@ This operator expects a SQL query that will return a single row. Each value on
that first row is evaluated using Python ``bool`` casting. If any of the values
return ``False``, the check fails and errors out.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_queries.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_queries.py
:language: python
:dedent: 8
:start-after: [START howto_operator_bigquery_check]
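A minimal sketch; the SQL is a placeholder:

.. code-block:: python

    from airflow.providers.google.cloud.operators.bigquery import BigQueryCheckOperator

    check_count = BigQueryCheckOperator(
        task_id="check_count",
        sql="SELECT COUNT(*) FROM my_test_dataset.my_test_table",
        use_legacy_sql=False,
    )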
@@ -347,7 +354,7 @@ This operator expects a SQL query that will return a single row. Each value on
that first row is evaluated against ``pass_value``, which can be either a string
or numeric value. If numeric, you can also specify ``tolerance``.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_queries.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_queries.py
:language: python
:dedent: 8
:start-after: [START howto_operator_bigquery_value_check]
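A minimal sketch with a numeric ``pass_value`` and tolerance; the values are placeholders:

.. code-block:: python

    from airflow.providers.google.cloud.operators.bigquery import BigQueryValueCheckOperator

    check_value = BigQueryValueCheckOperator(
        task_id="check_value",
        sql="SELECT COUNT(*) FROM my_test_dataset.my_test_table",
        pass_value=4,
        tolerance=0.1,  # accept results within 10% of pass_value
        use_legacy_sql=False,
    )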
@@ -362,7 +369,7 @@ To check that the values of metrics given as SQL expressions are within a certain
tolerance of the ones from ``days_back`` before you can use
:class:`~airflow.providers.google.cloud.operators.bigquery.BigQueryIntervalCheckOperator`.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_queries.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_queries.py
:language: python
:dedent: 8
:start-after: [START howto_operator_bigquery_interval_check]
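A minimal sketch; the table name and thresholds are placeholders:

.. code-block:: python

    from airflow.providers.google.cloud.operators.bigquery import BigQueryIntervalCheckOperator

    check_interval = BigQueryIntervalCheckOperator(
        task_id="check_interval",
        table="my_test_dataset.my_test_table",
        days_back=7,
        metrics_thresholds={"COUNT(*)": 1.5},  # today's count must stay within 1.5x of 7 days ago
        use_legacy_sql=False,
    )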
@@ -380,7 +387,7 @@ use the ``{{ ds_nodash }}`` macro as the table name suffix.

:class:`~airflow.providers.google.cloud.sensors.bigquery.BigQueryTableExistenceSensor`.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_sensors.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_sensors.py
:language: python
:dedent: 4
:start-after: [START howto_sensor_bigquery_table]
@@ -392,7 +399,7 @@ Check that a Table Partition exists
To check that a table exists and has a partition, you can use
:class:`~airflow.providers.google.cloud.sensors.bigquery.BigQueryTablePartitionExistenceSensor`.

.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_bigquery_sensors.py
.. exampleinclude:: /../../tests/system/providers/google/bigquery/example_bigquery_sensors.py
:language: python
:dedent: 4
:start-after: [START howto_sensor_bigquery_table_partition]
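A combined sketch of both sensors; project, dataset, table, and partition values are placeholders:

.. code-block:: python

    from airflow.providers.google.cloud.sensors.bigquery import (
        BigQueryTableExistenceSensor,
        BigQueryTablePartitionExistenceSensor,
    )

    check_table_exists = BigQueryTableExistenceSensor(
        task_id="check_table_exists",
        project_id="my-project",
        dataset_id="my_test_dataset",
        table_id="my_test_table",
    )

    check_partition_exists = BigQueryTablePartitionExistenceSensor(
        task_id="check_partition_exists",
        project_id="my-project",
        dataset_id="my_test_dataset",
        table_id="my_test_table",
        partition_id="{{ ds_nodash }}",  # daily partition keyed by the logical date
    )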
@@ -297,7 +297,7 @@ Configuration information defines how you want the sensitive data de-identified.

This config can either be saved and persisted in de-identification templates or defined in a :class:`~google.cloud.dlp_v2.types.DeidentifyConfig` object:

.. literalinclude:: /../../airflow/providers/google/cloud/example_dags/example_dlp.py
.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_dlp.py
:language: python
:start-after: [START dlp_deidentify_config_example]
:end-before: [END dlp_deidentify_config_example]