Official implementation of the paper "Exploring Multi-View Pixel Contrast for General and Robust Image Forgery Localization". This repo provides code and trained weights.
Detailed illustration of the proposed image forgery localization network MPC. The training of MPC consists of two stages. In the first stage, we train only the backbone network to extract discriminative features. Each training sample is fed into the backbone network twice; because dropout is active, the two passes yield distinct multi-scale features, giving two views of the same sample. A subset of pixel embeddings is then sampled from these features and supervised with three types of contrastive losses, forming a structured feature space. In the second stage, the backbone weights are frozen, and we train only a simple localization head to produce pixel-wise localization maps.
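The first-stage idea (dropout producing two views of each sample, plus a pixel-level contrastive loss over sampled embeddings) can be sketched as follows. This is a minimal NumPy illustration, not the repo's code: the toy linear "backbone", the InfoNCE-style loss, and the negative-sampling scheme are simplified stand-ins for MPC's HRNet features and its three contrastive losses.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_with_dropout(x, w, p=0.1):
    """One pass through a toy linear 'backbone' with dropout kept active,
    so two passes on the same input yield two different views."""
    h = x @ w
    mask = rng.random(h.shape) > p
    return h * mask / (1 - p)

def pixel_info_nce(anchors, positives, negatives, tau=0.1):
    """InfoNCE-style pixel contrast: pull each anchor toward its positive,
    push it away from the sampled negatives."""
    def l2norm(v):
        return v / np.linalg.norm(v, axis=-1, keepdims=True)
    a, p, n = l2norm(anchors), l2norm(positives), l2norm(negatives)
    pos = np.exp((a * p).sum(-1) / tau)      # similarity to positive, (N,)
    neg = np.exp(a @ n.T / tau).sum(-1)      # summed similarity to negatives
    return float(-np.log(pos / (pos + neg)).mean())

# toy data: 16 "pixel" embeddings with 8-dim features
x = rng.standard_normal((16, 8))
w = rng.standard_normal((8, 8))
view1 = forward_with_dropout(x, w)   # first forward pass
view2 = forward_with_dropout(x, w)   # second pass, different dropout mask
assert not np.allclose(view1, view2)  # dropout makes the two views differ

# the second view supplies positives for the first view's anchors;
# shuffled pixels stand in for other-class negatives
loss = pixel_info_nce(view1, view2, view1[::-1])
```

In the actual network, the positives/negatives are chosen using the ground-truth forgery mask (authentic vs. tampered pixels), and the embeddings are sampled from multi-scale backbone features rather than a single linear layer.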
- albumentations 1.3.1
- fvcore 0.1.5.post20221221
- numpy 1.23.0
- opencv-python 4.8.1.78
- opencv-python-headless 4.9.0.80
- torch 2.0.0+cu117
- torchsummary 1.5.1
- torchvision 0.8.2+cu110
- python 3.8
Generate the file list:
python generate_flist.py
For example, to train, first download the pretrained hrt_base.pth, then run:
cd CATNet_dataset_train/stage1
python train.py
cd CATNet_dataset_train/stage2
python train.py
For example, to test, first download MPC_CATNet_stage2_weights.pth, then run:
cd CATNet_dataset_train/stage2
python test.py
If you want to test the MPC model trained on the CASIAv2 dataset, please download the weight file MPC_CASIAv2_stage2_weights.pth.
If you use this code for your research, please cite our paper:
@article{lou2025exploring,
title={Exploring Multi-View Pixel Contrast for General and Robust Image Forgery Localization},
author={Lou, Zijie and Cao, Gang and Guo, Kun and Yu, Lifang and Weng, Shaowei},
journal={IEEE Transactions on Information Forensics and Security},
year={2025},
publisher={IEEE}
}
Licensed under a Creative Commons Attribution-NonCommercial 4.0 International license, for non-commercial use only. Any commercial use requires formal permission in advance.
This code is based on FOCAL. Thanks for their awesome work.