
fuxianghuang1/PWCF


Probability Weighted Compact Feature for Domain Adaptive Retrieval

Published in CVPR 2020
Contact : huangfuxiang@cqu.edu.cn

Usage: MATLAB R2017

Running Models: Run main_demo.m

More datasets are available at https://pan.baidu.com/s/1EVlYCz51AyDnh5y7PJ5W_Q?pwd=qyrv

Note: If you want to cite the experimental results, please pay attention to the experimental details in the paper. For the handwritten digit datasets (MNIST and USPS), following standard transfer learning practice, we select 2,000 images from MNIST as the source domain and 1,800 images from USPS as the target domain. Besides, for each dataset, we randomly select 500 target images as the test set (i.e., queries) and use the remaining images as the training set.

Note that, to investigate more samples, we use more datasets and different settings in our other paper, published in TNNLS 2021, i.e., "Domain Adaptation Preconceived Hashing for Unconstrained Visual Retrieval". Specifically, we use 60,000 images from MNIST as the source domain and 10,000 images from USPS as the target domain. For each dataset, we randomly select 10% of the target images as the test set (i.e., queries) and use the remaining images as the training set.
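The PWCF split described above can be sketched as follows. This is a minimal illustration of the sampling protocol using random index permutations, not the repository's actual MATLAB code; the variable names and the raw dataset sizes are placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder dataset sizes standing in for the full MNIST and USPS sets;
# in practice these indices would point into the real image arrays.
n_mnist, n_usps = 60000, 9298

# PWCF (CVPR 2020) setting: 2,000 source images from MNIST,
# 1,800 target images from USPS, sampled at random.
source_idx = rng.permutation(n_mnist)[:2000]
target_idx = rng.permutation(n_usps)[:1800]

# From the target domain, 500 random images serve as the test set
# (queries) and the remaining 1,300 form the training set.
shuffled = rng.permutation(target_idx)
query_idx, train_idx = shuffled[:500], shuffled[500:]

print(len(source_idx), len(query_idx), len(train_idx))  # → 2000 500 1300
```

The TNNLS 2021 setting follows the same pattern but uses all 60,000 MNIST images as the source, 10,000 USPS images as the target, and a random 10% of the target as queries.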

Cite: If you find this code useful in your research then please cite

@inproceedings{huang2020PWCF,
  title={Probability Weighted Compact Feature for Domain Adaptive Retrieval},  
  author={Huang, Fuxiang and Zhang, Lei and Yang, Yang and Zhou, Xichuan},  
  booktitle={CVPR},  
  pages={9579-9588},  
  year={2020}
}


@article{huang2021domain,
  author={Huang, Fuxiang and Zhang, Lei and Gao, Xinbo},    
  journal={IEEE Transactions on Neural Networks and Learning Systems},    
  title={Domain Adaptation Preconceived Hashing for Unconstrained Visual Retrieval},    
  year={2021},  
  pages={1-15},    
  doi={10.1109/TNNLS.2021.3071127}  
}

