Description
05/28/2024 15:04:17 - WARNING - my_logger - Process rank: -1, device: cuda, n_gpu: 1, distributed training: False, 16-bits training: False
/home/billy/Billy/FL_LLM/FedPepTAO/FedPepTAO-main/encoder-only-roberta-large/data/process.py:642: FutureWarning: This processor will be removed from the library soon, preprocessing should be handled with the 🤗 Datasets library. You can have a look at this example script for pointers: https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-classification/run_glue.py
  warnings.warn(DEPRECATION_WARNING.format("processor"), FutureWarning)
Traceback (most recent call last):
  File "encoder-only-roberta-large/run_fed_pers_with_optim_v2.py", line 687, in <module>
    best = main()
  File "encoder-only-roberta-large/run_fed_pers_with_optim_v2.py", line 618, in main
    config = RobertaConfig.from_pretrained(
  File "/home/billy/anaconda3/envs/jx_roberta/lib/python3.8/site-packages/transformers/configuration_utils.py", line 427, in from_pretrained
    config_dict, kwargs = cls.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/home/billy/anaconda3/envs/jx_roberta/lib/python3.8/site-packages/transformers/configuration_utils.py", line 484, in get_config_dict
    resolved_config_file = cached_path(
  File "/home/billy/anaconda3/envs/jx_roberta/lib/python3.8/site-packages/transformers/file_utils.py", line 1271, in cached_path
    output_path = get_from_cache(
  File "/home/billy/anaconda3/envs/jx_roberta/lib/python3.8/site-packages/transformers/file_utils.py", line 1494, in get_from_cache
    raise ValueError(
ValueError: Connection error, and we cannot find the requested files in the cached path. Please try again or make sure your Internet connection is on.
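
For context, the failure happens inside `RobertaConfig.from_pretrained` when Transformers can neither reach huggingface.co nor find the requested files in the local cache. Below is a minimal sketch of how the config could be loaded without network access, assuming the checkpoint has already been downloaded once; the local directory path is hypothetical, not part of this repo.

```python
from transformers import RobertaConfig, RobertaTokenizer

# Option 1: if roberta-large was downloaded successfully at least once on this
# machine, force loading from the local Hugging Face cache only.
config = RobertaConfig.from_pretrained("roberta-large", local_files_only=True)

# Option 2: point from_pretrained at a local directory containing config.json,
# tokenizer files, and the model weights (path below is just an example).
local_dir = "/home/billy/models/roberta-large"  # hypothetical local copy
config = RobertaConfig.from_pretrained(local_dir)
tokenizer = RobertaTokenizer.from_pretrained(local_dir)
```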