Move checkpoint loading before DataParallel #22

Merged
Tobias-Fischer merged 1 commit into QVPR:main from michaelschleiss:main
Aug 3, 2021
Conversation

@michaelschleiss
Contributor

Currently, `feature_extract.py` fails with the following error message if `nGPU > 1`.

=> loading checkpoint '/home/michael.schleiss/repos/Patch-NetVLAD/patchnetvlad/./pretrained_models/pittsburgh_WPCA128.pth.tar'
Traceback (most recent call last):
  File "feature_extract.py", line 174, in <module>
    main()
  File "feature_extract.py", line 160, in main
    model.load_state_dict(checkpoint['state_dict'])
  File "/home/michael.schleiss/miniconda3/envs/netvlad/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1223, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for Module:
	Missing key(s) in state_dict: "encoder.module.0.weight", "encoder.module.0.bias", "encoder.module.2.weight", "encoder.module.2.bias", "encoder.module.5.weight", "encoder.module.5.bias", "encoder.module.7.weight", "encoder.module.7.bias", "encoder.module.10.weight", "encoder.module.10.bias", "encoder.module.12.weight", "encoder.module.12.bias", "encoder.module.14.weight", "encoder.module.14.bias", "encoder.module.17.weight", "encoder.module.17.bias", "encoder.module.19.weight", "encoder.module.19.bias", "encoder.module.21.weight", "encoder.module.21.bias", "encoder.module.24.weight", "encoder.module.24.bias", "encoder.module.26.weight", "encoder.module.26.bias", "encoder.module.28.weight", "encoder.module.28.bias", "pool.module.centroids", "pool.module.conv.weight".
	Unexpected key(s) in state_dict: "encoder.0.weight", "encoder.0.bias", "encoder.2.weight", "encoder.2.bias", "encoder.5.weight", "encoder.5.bias", "encoder.7.weight", "encoder.7.bias", "encoder.10.weight", "encoder.10.bias", "encoder.12.weight", "encoder.12.bias", "encoder.14.weight", "encoder.14.bias", "encoder.17.weight", "encoder.17.bias", "encoder.19.weight", "encoder.19.bias", "encoder.21.weight", "encoder.21.bias", "encoder.24.weight", "encoder.24.bias", "encoder.26.weight", "encoder.26.bias", "encoder.28.weight", "encoder.28.bias", "pool.centroids", "pool.conv.weight".

Fixed by moving `model.load_state_dict(checkpoint['state_dict'])` before wrapping the model in `nn.DataParallel`.
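A minimal sketch of why the ordering matters, assuming PyTorch on CPU and a stand-in model (the real Patch-NetVLAD model and checkpoint are not reproduced here): `nn.DataParallel` stores the wrapped model under a `module.` attribute, so its `state_dict` keys gain a `module.` prefix that a checkpoint saved from the bare model does not have.

```python
import torch.nn as nn

# Stand-in for the Patch-NetVLAD model; keys will be "0.weight", "0.bias".
model = nn.Sequential(nn.Linear(4, 4))

# Simulate a checkpoint saved from the unwrapped model.
checkpoint = {'state_dict': model.state_dict()}

# Load BEFORE wrapping: checkpoint keys match the bare model's keys.
model.load_state_dict(checkpoint['state_dict'])

# Wrap afterwards. DataParallel prefixes every parameter name with
# "module.", which is why loading after wrapping raises the
# missing/unexpected-key RuntimeError shown in the traceback above.
parallel = nn.DataParallel(model)
assert all(k.startswith('module.') for k in parallel.state_dict())
```

Loading first and wrapping second, as this PR does, avoids any key remapping; the alternative workaround of stripping the `module.` prefix from checkpoint keys is not needed.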

@Tobias-Fischer
Contributor

Many thanks for the catch @michaelschleiss! Looks good to me - @oravus @StephenHausler let's merge?

@StephenHausler
Contributor

Yep looks good, feel free to merge

@Tobias-Fischer Tobias-Fischer merged commit 4edee2f into QVPR:main Aug 3, 2021