
Move test model folders #17034

Merged: 33 commits, May 3, 2022.
Commits (33), changes shown from all commits:
babdc23: test - to be revert (ydshieh, Apr 27, 2022)
8ff0bde: temp change to generate new cache - to be reverted (ydshieh, Apr 27, 2022)
d09b62b: temp change to generate new cache - to be reverted (ydshieh, Apr 27, 2022)
75ff03c: clean-up (ydshieh, Apr 27, 2022)
ffbcad9: move test model folders (TODO: fix imports and others) (ydshieh, Apr 26, 2022)
7851bdb: fix (potentially partially) imports (in model test modules) (ydshieh, Apr 26, 2022)
fed0b1d: fix (potentially partially) imports (in tokenization test modules) (ydshieh, Apr 26, 2022)
c439b16: fix (potentially partially) imports (in feature extraction test modules) (ydshieh, Apr 26, 2022)
f35f1c5: fix import utils.test_modeling_tf_core (ydshieh, Apr 26, 2022)
2d801d2: fix path ../fixtures/ (ydshieh, Apr 26, 2022)
7d1d1a5: fix imports about generation.test_generation_flax_utils (ydshieh, Apr 26, 2022)
ab59261: fix more imports (ydshieh, Apr 26, 2022)
ae4827b: fix fixture path (ydshieh, Apr 26, 2022)
faba7b6: fix get_test_dir (ydshieh, Apr 26, 2022)
8c9600e: update module_to_test_file (ydshieh, Apr 26, 2022)
732ca46: fix get_tests_dir from wrong transformers.utils (ydshieh, Apr 26, 2022)
16e269e: update config.yml (CircleCI) (ydshieh, Apr 26, 2022)
6b52d1f: fix style (ydshieh, Apr 26, 2022)
e971915: remove missing imports (ydshieh, Apr 26, 2022)
4b0320e: update new model script (ydshieh, Apr 26, 2022)
8b470d0: update check_repo (ydshieh, Apr 26, 2022)
3dad271: update SPECIAL_MODULE_TO_TEST_MAP (ydshieh, Apr 26, 2022)
790f0d9: fix style (ydshieh, Apr 26, 2022)
c31b9c9: add __init__ (ydshieh, Apr 27, 2022)
39966aa: update self-scheduled (ydshieh, Apr 27, 2022)
c90b35c: fix add_new_model scripts (ydshieh, Apr 27, 2022)
579cdbf: check one way to get location back (ydshieh, Apr 27, 2022)
fa9cfd4: python setup.py build install (ydshieh, Apr 27, 2022)
021ae85: fix import in test auto (ydshieh, May 1, 2022)
825581f: update self-scheduled.yml (ydshieh, May 1, 2022)
b83abd8: update slack notification script (ydshieh, May 1, 2022)
0e44f67: Add comments about artifact names (ydshieh, May 2, 2022)
9bc8e1a: fix for yolos (ydshieh, May 3, 2022)
2 changes: 1 addition & 1 deletion .circleci/config.yml
@@ -916,7 +916,7 @@ jobs:
path: ~/transformers/test_preparation.txt
- run: |
if [ -f test_list.txt ]; then
python -m pytest -n 1 tests/*layoutlmv2* --dist=loadfile -s --make-reports=tests_layoutlmv2 --durations=100
python -m pytest -n 1 tests/models/*layoutlmv2* --dist=loadfile -s --make-reports=tests_layoutlmv2 --durations=100
fi
- store_artifacts:
path: ~/transformers/tests_output.txt
18 changes: 15 additions & 3 deletions .github/workflows/self-scheduled.yml
@@ -43,13 +43,14 @@ jobs:
working-directory: /transformers
run: |
rm -rf tests/__pycache__
rm -rf tests/models/__pycache__
rm -rf reports

- id: set-matrix
name: Identify models to test
working-directory: /transformers/tests
run: |
echo "::set-output name=matrix::$(python3 -c 'import os; x = list(filter(os.path.isdir, os.listdir(os.getcwd()))); x.sort(); print(x)')"
echo "::set-output name=matrix::$(python3 -c 'import os; tests = os.getcwd(); model_tests = os.listdir(os.path.join(tests, "models")); d1 = sorted(list(filter(os.path.isdir, os.listdir(tests)))); d2 = sorted(list(filter(os.path.isdir, [f"models/{x}" for x in model_tests]))); d1.remove("models"); d = d2 + d1; print(d)')"

A contributor commented:

This is a bit difficult to read; may I propose a simplified version, which I hope is easier to follow? It also adds some line breaks and removes the __pycache__ dir, which shouldn't be in the results.

Suggested change
echo "::set-output name=matrix::$(python3 -c 'import os; tests = os.getcwd(); model_tests = os.listdir(os.path.join(tests, "models")); d1 = sorted(list(filter(os.path.isdir, os.listdir(tests)))); d2 = sorted(list(filter(os.path.isdir, [f"models/{x}" for x in model_tests]))); d1.remove("models"); d = d2 + d1; print(d)')"
echo "::set-output name=matrix::$(python3 -c '\
import os; \
tests = os.getcwd(); \
d = set(filter(os.path.isdir, os.listdir(tests) + [f"models/{x}" for x in os.listdir(f"{tests}/models")])); \
d -= set(["models", "__pycache__"]); \
print(sorted(d))')"

Here is just the python snippet:

python -c '\
import os; \
tests = os.getcwd(); \
d = set(filter(os.path.isdir, os.listdir(tests) + [f"models/{x}" for x in os.listdir(f"{tests}/models")])); \
d -= set(["models", "__pycache__"]); \
print(sorted(d))'

I tested this snippet itself; I hope I plugged it into the suggestion above correctly, i.e. the suggested change is untested.
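
For reference, here is the same folder-collection logic written out as a standalone, commented script (an illustrative sketch; like the one-liner, it assumes the current working directory is the repository's tests/ directory):

import os

# Candidate entries: the top-level test folders plus the per-model folders such as "models/bert".
tests = os.getcwd()
candidates = os.listdir(tests) + [f"models/{x}" for x in os.listdir(os.path.join(tests, "models"))]

# Keep only directories, and drop the "models" umbrella folder and caches.
folders = set(filter(os.path.isdir, candidates)) - {"models", "__pycache__"}

print(sorted(folders))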


- name: NVIDIA-SMI
run: |
@@ -76,7 +77,16 @@ jobs:
needs: setup
steps:
- name: Echo folder ${{ matrix.folders }}
run: echo "${{ matrix.folders }}"
shell: bash
# For folders like `models/bert`, set an env. var. (`matrix_folders`) to `models_bert`, which will be used to
# set the artifact folder names (because the character `/` is not allowed).
run: |
echo "${{ matrix.folders }}"
matrix_folders=${{ matrix.folders }}
echo "$matrix_folders"
matrix_folders=${matrix_folders/'models/'/'models_'}
echo "$matrix_folders"
echo "matrix_folders=$matrix_folders" >> $GITHUB_ENV

- name: Update clone
working-directory: /transformers
@@ -95,7 +105,7 @@ jobs:
if: ${{ always() }}
uses: actions/upload-artifact@v2
with:
name: ${{ matrix.machines }}_run_all_tests_gpu_${{ matrix.folders }}_test_reports
name: ${{ matrix.machines }}_run_all_tests_gpu_${{ env.matrix_folders }}_test_reports
path: /transformers/reports/${{ matrix.machines }}_tests_gpu_${{ matrix.folders }}

run_examples_gpu:
@@ -255,6 +265,8 @@ jobs:
CI_SLACK_CHANNEL_ID: ${{ secrets.CI_SLACK_CHANNEL_ID }}
CI_SLACK_CHANNEL_ID_DAILY: ${{ secrets.CI_SLACK_CHANNEL_ID_DAILY }}
CI_SLACK_CHANNEL_DUMMY_TESTS: ${{ secrets.CI_SLACK_CHANNEL_DUMMY_TESTS }}
# We pass `needs.setup.outputs.matrix` as the argument. A processing in `notification_service.py` to change
# `models/bert` to `models_bert` is required, as the artifact names use `_` instead of `/`.
run: |
pip install slack_sdk
python utils/notification_service.py "${{ needs.setup.outputs.matrix }}"
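
The artifact names replace the slash in folder names such as models/bert with an underscore, so the notification script has to apply the same mapping when it matches folders to artifacts. A minimal sketch of that mapping (hypothetical helper name; the actual handling in notification_service.py is not shown in this diff):

def to_artifact_folder(folder: str) -> str:
    # "/" is not allowed in artifact names, so "models/bert" becomes "models_bert".
    return folder.replace("models/", "models_")

assert to_artifact_folder("models/bert") == "models_bert"
assert to_artifact_folder("generation") == "generation"  # non-model folders are unchanged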
10 changes: 5 additions & 5 deletions src/transformers/commands/add_new_model.py
@@ -102,10 +102,10 @@ def run(self):

model_dir = f"{path_to_transformer_root}/src/transformers/models/{lowercase_model_name}"
os.makedirs(model_dir, exist_ok=True)
os.makedirs(f"{path_to_transformer_root}/tests/{lowercase_model_name}", exist_ok=True)
os.makedirs(f"{path_to_transformer_root}/tests/models/{lowercase_model_name}", exist_ok=True)

# Tests require submodules as they have parent imports
with open(f"{path_to_transformer_root}/tests/{lowercase_model_name}/__init__.py", "w"):
with open(f"{path_to_transformer_root}/tests/models/{lowercase_model_name}/__init__.py", "w"):
pass

shutil.move(
@@ -136,7 +136,7 @@ def remove_copy_lines(path):

shutil.move(
f"{directory}/test_modeling_{lowercase_model_name}.py",
f"{path_to_transformer_root}/tests/{lowercase_model_name}/test_modeling_{lowercase_model_name}.py",
f"{path_to_transformer_root}/tests/models/{lowercase_model_name}/test_modeling_{lowercase_model_name}.py",
)
else:
os.remove(f"{directory}/modeling_{lowercase_model_name}.py")
@@ -153,7 +153,7 @@ def remove_copy_lines(path):

shutil.move(
f"{directory}/test_modeling_tf_{lowercase_model_name}.py",
f"{path_to_transformer_root}/tests/{lowercase_model_name}/test_modeling_tf_{lowercase_model_name}.py",
f"{path_to_transformer_root}/tests/models/{lowercase_model_name}/test_modeling_tf_{lowercase_model_name}.py",
)
else:
os.remove(f"{directory}/modeling_tf_{lowercase_model_name}.py")
@@ -170,7 +170,7 @@ def remove_copy_lines(path):

shutil.move(
f"{directory}/test_modeling_flax_{lowercase_model_name}.py",
f"{path_to_transformer_root}/tests/{lowercase_model_name}/test_modeling_flax_{lowercase_model_name}.py",
f"{path_to_transformer_root}/tests/models/{lowercase_model_name}/test_modeling_flax_{lowercase_model_name}.py",
)
else:
os.remove(f"{directory}/modeling_flax_{lowercase_model_name}.py")
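
For orientation, after this PR the test tree looks roughly like the following (only a few entries shown; the shared test mixins stay at the top level, while per-model tests move under tests/models/):

tests/
    test_configuration_common.py
    test_modeling_common.py
    models/
        __init__.py
        bert/
            __init__.py
            test_modeling_bert.py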
4 changes: 2 additions & 2 deletions src/transformers/commands/add_new_model_like.py
@@ -554,7 +554,7 @@ def get_model_files(model_type: str, frameworks: Optional[List[str]] = None) ->
]
test_files = filter_framework_files(test_files, frameworks=frameworks)
# Add the test directory
test_files = [REPO_PATH / "tests" / module_name / f for f in test_files]
test_files = [REPO_PATH / "tests" / "models" / module_name / f for f in test_files]
# Filter by existing files
test_files = [f for f in test_files if f.exists()]

@@ -1227,7 +1227,7 @@ def disable_fx_test(filename: Path) -> bool:

disabled_fx_test = False

tests_folder = REPO_PATH / "tests" / new_model_patterns.model_lower_cased
tests_folder = REPO_PATH / "tests" / "models" / new_model_patterns.model_lower_cased
os.makedirs(tests_folder, exist_ok=True)
with open(tests_folder / "__init__.py", "w"):
pass
@@ -26,25 +26,25 @@
{% endif -%}
{% if cookiecutter.has_fast_class == "True" and cookiecutter.slow_tokenizer_use_sentencepiece == "True" -%}
from transformers.testing_utils import require_sentencepiece, require_tokenizers
from ..test_tokenization_common import TokenizerTesterMixin
from ...test_tokenization_common import TokenizerTesterMixin


@require_sentencepiece
@require_tokenizers
{% elif cookiecutter.slow_tokenizer_use_sentencepiece == "True" -%}
from transformers.testing_utils import require_sentencepiece
from ..test_tokenization_common import TokenizerTesterMixin
from ...test_tokenization_common import TokenizerTesterMixin


@require_sentencepiece
{% elif cookiecutter.has_fast_class == "True" -%}
from transformers.testing_utils import require_tokenizers
from ..test_tokenization_common import TokenizerTesterMixin
from ...test_tokenization_common import TokenizerTesterMixin


@require_tokenizers
{% else -%}
from ..test_tokenization_common import TokenizerTesterMixin
from ...test_tokenization_common import TokenizerTesterMixin


{% endif -%}
@@ -20,8 +20,8 @@
from transformers import is_flax_available, {{cookiecutter.camelcase_modelname}}Config
from transformers.testing_utils import require_flax, slow

from ..test_configuration_common import ConfigTester
from ..test_modeling_flax_common import FlaxModelTesterMixin, ids_tensor
from ...test_configuration_common import ConfigTester
from ...test_modeling_flax_common import FlaxModelTesterMixin, ids_tensor

if is_flax_available():
import numpy as np
@@ -345,8 +345,8 @@ def test_inference_masked_lm(self):
)
from transformers.testing_utils import require_sentencepiece, require_flax, require_tokenizers, slow

from ..test_configuration_common import ConfigTester
from ..test_modeling_flax_common import FlaxModelTesterMixin, ids_tensor
from ...test_configuration_common import ConfigTester
from ...test_modeling_flax_common import FlaxModelTesterMixin, ids_tensor


if is_flax_available():
@@ -20,8 +20,8 @@
from transformers import is_tf_available, {{cookiecutter.camelcase_modelname}}Config
from transformers.testing_utils import require_tf, slow

from ..test_configuration_common import ConfigTester
from ..test_modeling_tf_common import TFModelTesterMixin, floats_tensor, ids_tensor, random_attention_mask
from ...test_configuration_common import ConfigTester
from ...test_modeling_tf_common import TFModelTesterMixin, floats_tensor, ids_tensor, random_attention_mask


if is_tf_available():
@@ -711,8 +711,8 @@ def test_inference_masked_lm(self):
)
from transformers.testing_utils import require_sentencepiece, require_tf, require_tokenizers, slow

from ..test_configuration_common import ConfigTester
from ..test_modeling_tf_common import TFModelTesterMixin, ids_tensor
from ...test_configuration_common import ConfigTester
from ...test_modeling_tf_common import TFModelTesterMixin, ids_tensor


if is_tf_available():
@@ -18,13 +18,13 @@
{% if cookiecutter.is_encoder_decoder_model == "False" -%}
import unittest

from ..test_modeling_common import floats_tensor
from ...test_modeling_common import floats_tensor
from transformers import is_torch_available
from transformers.testing_utils import require_torch, slow, torch_device

from transformers import {{cookiecutter.camelcase_modelname}}Config
from ..test_configuration_common import ConfigTester
from ..test_modeling_common import ModelTesterMixin, ids_tensor, random_attention_mask
from ...test_configuration_common import ConfigTester
from ...test_modeling_common import ModelTesterMixin, ids_tensor, random_attention_mask


if is_torch_available():
@@ -489,9 +489,9 @@ def test_inference_masked_lm(self):
from transformers.utils import cached_property
from transformers.testing_utils import require_sentencepiece, require_tokenizers, require_torch, slow, torch_device

from ..test_configuration_common import ConfigTester
from ..generation.test_generation_utils import GenerationTesterMixin
from ..test_modeling_common import ModelTesterMixin, ids_tensor
from ...test_configuration_common import ConfigTester
from ...generation.test_generation_utils import GenerationTesterMixin
from ...test_modeling_common import ModelTesterMixin, ids_tensor


if is_torch_available():
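
The recurring import change in these templates (and in the moved test modules below) follows directly from the extra directory level: model tests now live in tests/models/<model>/ rather than tests/<model>/, so reaching the shared mixins at the top of tests/ takes one more leading dot. For example:

# before the move, in tests/bert/test_modeling_bert.py
from ..test_modeling_common import ModelTesterMixin

# after the move, in tests/models/bert/test_modeling_bert.py
from ...test_modeling_common import ModelTesterMixin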
5 changes: 3 additions & 2 deletions tests/deepspeed/test_model_zoo.py
@@ -24,6 +24,7 @@
TestCasePlus,
execute_subprocess_async,
get_gpu_count,
get_tests_dir,
require_deepspeed,
require_torch_gpu,
slow,
@@ -70,8 +71,8 @@
XLNET_TINY = "sshleifer/tiny-xlnet-base-cased"
BERT_TINY = "hf-internal-testing/tiny-bert"

FIXTURE_DIRECTORY = os.path.join(dirname(dirname(os.path.abspath(__file__))), "fixtures")
ROOT_DIRECTORY = os.path.join(dirname(dirname(dirname(os.path.abspath(__file__)))))
FIXTURE_DIRECTORY = get_tests_dir("fixtures")
ROOT_DIRECTORY = os.path.join(dirname(get_tests_dir()))

# TODO: to add:
# albert
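
get_tests_dir comes from transformers.testing_utils and resolves paths against the tests/ directory itself, so fixture paths no longer break when a test module moves one level deeper. Roughly, the idea is the following (a simplified sketch, not the library implementation):

import os

def get_tests_dir_sketch(append_path=None, caller_file=__file__):
    # Walk up from the caller's file until the enclosing "tests" directory is found.
    d = os.path.abspath(os.path.dirname(caller_file))
    while os.path.basename(d) != "tests":
        parent = os.path.dirname(d)
        if parent == d:  # reached the filesystem root without finding "tests"
            raise FileNotFoundError("no enclosing 'tests' directory")
        d = parent
    return os.path.join(d, append_path) if append_path else d

# e.g. from tests/models/albert/test_tokenization_albert.py,
# get_tests_dir("fixtures/spiece.model") points at tests/fixtures/spiece.model.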
File renamed without changes.
File renamed without changes.
@@ -20,8 +20,8 @@
from transformers.models.auto import get_values
from transformers.testing_utils import require_torch, slow, torch_device

from ..test_configuration_common import ConfigTester
from ..test_modeling_common import ModelTesterMixin, ids_tensor, random_attention_mask
from ...test_configuration_common import ConfigTester
from ...test_modeling_common import ModelTesterMixin, ids_tensor, random_attention_mask


if is_torch_available():
@@ -19,7 +19,7 @@
from transformers import AlbertConfig, is_flax_available
from transformers.testing_utils import require_flax, slow

from ..test_modeling_flax_common import FlaxModelTesterMixin, ids_tensor, random_attention_mask
from ...test_modeling_flax_common import FlaxModelTesterMixin, ids_tensor, random_attention_mask


if is_flax_available():
@@ -20,8 +20,8 @@
from transformers.models.auto import get_values
from transformers.testing_utils import require_tf, slow

from ..test_configuration_common import ConfigTester
from ..test_modeling_tf_common import TFModelTesterMixin, ids_tensor, random_attention_mask
from ...test_configuration_common import ConfigTester
from ...test_modeling_tf_common import TFModelTesterMixin, ids_tensor, random_attention_mask


if is_tf_available():
@@ -13,17 +13,15 @@
# See the License for the specific language governing permissions and
# limitations under the License.

import os
import unittest
from os.path import dirname

from transformers import AlbertTokenizer, AlbertTokenizerFast
from transformers.testing_utils import require_sentencepiece, require_tokenizers, slow
from transformers.testing_utils import get_tests_dir, require_sentencepiece, require_tokenizers, slow

from ..test_tokenization_common import TokenizerTesterMixin
from ...test_tokenization_common import TokenizerTesterMixin


SAMPLE_VOCAB = os.path.join(dirname(dirname(os.path.abspath(__file__))), "fixtures/spiece.model")
SAMPLE_VOCAB = get_tests_dir("fixtures/spiece.model")


@require_sentencepiece
File renamed without changes.
@@ -14,7 +14,6 @@
# limitations under the License.

import importlib
import os
import sys
import tempfile
import unittest
@@ -24,15 +23,15 @@
from transformers.models.auto.configuration_auto import CONFIG_MAPPING, AutoConfig
from transformers.models.bert.configuration_bert import BertConfig
from transformers.models.roberta.configuration_roberta import RobertaConfig
from transformers.testing_utils import DUMMY_UNKNOWN_IDENTIFIER
from transformers.testing_utils import DUMMY_UNKNOWN_IDENTIFIER, get_tests_dir


sys.path.append(str(Path(__file__).parent.parent.parent / "utils"))
sys.path.append(str(Path(__file__).parent.parent.parent.parent / "utils"))

A contributor commented:

if the number of parent calls is becoming too long, one could do:

Suggested change
sys.path.append(str(Path(__file__).parent.parent.parent.parent / "utils"))
sys.path.append(str(Path(__file__).parents[4] / "utils"))

but it's totally ok like it is right now.
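
As a quick illustration of the pathlib indexing involved (note that parents[0] is the immediate parent, so four chained .parent calls correspond to parents[3]):

from pathlib import PurePosixPath

p = PurePosixPath("repo/tests/models/auto/test_configuration_auto.py")
assert p.parent.parent.parent.parent == p.parents[3] == PurePosixPath("repo")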


from test_module.custom_configuration import CustomConfig # noqa E402


SAMPLE_ROBERTA_CONFIG = os.path.join(os.path.dirname(os.path.abspath(__file__)), "../fixtures/dummy-config.json")
SAMPLE_ROBERTA_CONFIG = get_tests_dir("fixtures/dummy-config.json")


class AutoConfigTest(unittest.TestCase):
@@ -14,7 +14,6 @@
# limitations under the License.

import json
import os
import sys
import tempfile
import unittest
@@ -28,20 +27,18 @@
Wav2Vec2Config,
Wav2Vec2FeatureExtractor,
)
from transformers.testing_utils import DUMMY_UNKNOWN_IDENTIFIER
from transformers.testing_utils import DUMMY_UNKNOWN_IDENTIFIER, get_tests_dir


sys.path.append(str(Path(__file__).parent.parent.parent / "utils"))
sys.path.append(str(Path(__file__).parent.parent.parent.parent / "utils"))

from test_module.custom_configuration import CustomConfig # noqa E402
from test_module.custom_feature_extraction import CustomFeatureExtractor # noqa E402


SAMPLE_FEATURE_EXTRACTION_CONFIG_DIR = os.path.join(os.path.dirname(os.path.abspath(__file__)), "../fixtures")
SAMPLE_FEATURE_EXTRACTION_CONFIG = os.path.join(
os.path.dirname(os.path.abspath(__file__)), "../fixtures/dummy_feature_extractor_config.json"
)
SAMPLE_CONFIG = os.path.join(os.path.dirname(os.path.abspath(__file__)), "../fixtures/dummy-config.json")
SAMPLE_FEATURE_EXTRACTION_CONFIG_DIR = get_tests_dir("fixtures")
SAMPLE_FEATURE_EXTRACTION_CONFIG = get_tests_dir("fixtures/dummy_feature_extractor_config.json")
SAMPLE_CONFIG = get_tests_dir("fixtures/dummy-config.json")


class AutoFeatureExtractorTest(unittest.TestCase):
@@ -32,7 +32,7 @@
from ..bert.test_modeling_bert import BertModelTester


sys.path.append(str(Path(__file__).parent.parent.parent / "utils"))
sys.path.append(str(Path(__file__).parent.parent.parent.parent / "utils"))

from test_module.custom_configuration import CustomConfig # noqa E402

@@ -36,25 +36,22 @@
Wav2Vec2FeatureExtractor,
Wav2Vec2Processor,
)
from transformers.testing_utils import PASS, USER, is_staging_test
from transformers.testing_utils import PASS, USER, get_tests_dir, is_staging_test
from transformers.tokenization_utils import TOKENIZER_CONFIG_FILE
from transformers.utils import FEATURE_EXTRACTOR_NAME, is_tokenizers_available


sys.path.append(str(Path(__file__).parent.parent.parent / "utils"))
sys.path.append(str(Path(__file__).parent.parent.parent.parent / "utils"))

from test_module.custom_configuration import CustomConfig # noqa E402
from test_module.custom_feature_extraction import CustomFeatureExtractor # noqa E402
from test_module.custom_processing import CustomProcessor # noqa E402
from test_module.custom_tokenization import CustomTokenizer # noqa E402


SAMPLE_PROCESSOR_CONFIG = os.path.join(
os.path.dirname(os.path.abspath(__file__)), "../fixtures/dummy_feature_extractor_config.json"
)
SAMPLE_VOCAB = os.path.join(os.path.dirname(os.path.abspath(__file__)), "../fixtures/vocab.json")

SAMPLE_PROCESSOR_CONFIG_DIR = os.path.join(os.path.dirname(os.path.abspath(__file__)), "../fixtures")
SAMPLE_PROCESSOR_CONFIG = get_tests_dir("fixtures/dummy_feature_extractor_config.json")
SAMPLE_VOCAB = get_tests_dir("fixtures/vocab.json")
SAMPLE_PROCESSOR_CONFIG_DIR = get_tests_dir("fixtures")


class AutoFeatureExtractorTest(unittest.TestCase):