This repository contains the PyTorch implementation of "NBC-Softmax", an auxiliary block contrastive loss that uses only negative samples to improve the traditional softmax loss. It achieves state-of-the-art results in author style detection. This is the official implementation of the results and work described in our paper, NBC-Softmax: https://arxiv.org/abs/2212.08184.
TL;DR: a simple negative block contrastive addition to the softmax loss.
The paper describes the NBC-Softmax loss function, which must be paired with a dataset and a network for deep metric learning.
In this repo we only show the NBC-Softmax code. Portions of the contrastive learning code are from PAMC.
The data and network are based on SYSML.
We acknowledge and thank the authors of these works for sharing their code and data.
The figure above compares the traditional softmax loss (left) with NBC-Softmax (right). We use the similarities between different classes, each represented by a class-level block, as negatives.
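As a concrete illustration of this idea, below is a minimal PyTorch sketch of such a combined loss: standard softmax cross-entropy plus a contrastive term over per-class "blocks" that uses only negative (different-class) pairs. The function name, the block construction (per-class mean embeddings), and the hyperparameters `ratio` and `temperature` are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def nbc_softmax_loss(logits, embeddings, labels, ratio=0.5, temperature=0.2):
    """Illustrative sketch only; assumes the batch contains >= 2 classes."""
    # Standard softmax cross-entropy on the classifier logits.
    ce = F.cross_entropy(logits, labels)

    # One "block" per class in the batch: the mean of its embeddings.
    classes = labels.unique()
    blocks = torch.stack([embeddings[labels == c].mean(dim=0) for c in classes])
    blocks = F.normalize(blocks, dim=1)

    # Cosine similarities between class blocks; only off-diagonal
    # (negative) pairs are penalised -- there is no positive term.
    sim = blocks @ blocks.t() / temperature
    neg_mask = ~torch.eye(len(classes), dtype=torch.bool, device=sim.device)
    neg_term = torch.logsumexp(sim[neg_mask], dim=0)

    # Weighted combination, analogous in spirit to model2_ratio below.
    return (1.0 - ratio) * ce + ratio * neg_term

# Example usage with random data (batch of 8, 4 classes, 16-dim embeddings):
emb = torch.randn(8, 16)
logits = torch.randn(8, 4)
labels = torch.randint(0, 4, (8,))
loss = nbc_softmax_loss(logits, emb, labels)
```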
- Our code was tested with CUDA 11.3.0, Python 3.6.9, and PyTorch 1.3.1. Please note that some minimal changes were needed to get the SYSML PyTorch Lightning code to run on current versions.
- All parameters are defined in SYSML. Additionally, we use the following to define the NBC-Softmax hyperparameters via `--model_params_classwise` and `--model_params_cross`:
SingleDatasetModel:

```
--batch_size 2048
--model_params_classwise "model_type='COMBO2'|model1_type='sm'|model2_type='proj_contrastiveBC1'|model2_ratio=0.5|proj_dim=0|NOTE='singleW2_0.01_G1_0.5_000_TTC_L5_NEG_0.20_z2048'"
```
MultiDatasetModel:

```
--batch_size 2048
--model_params_cross "model_type='COMBO2'|model1_type='sm'|model2_type='proj_contrastiveBC1'|model2_ratio=0.5|proj_dim=0|NOTE='mutiW2_0.01_G1_0.5_000_TTC_L5_NEG_0.30_z2048'"
```
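The `model_params_*` values are pipe-delimited `key=value` strings. As a rough illustration of this format (the real parsing lives in the SYSML code base and may behave differently), a minimal parser could look like:

```python
def parse_model_params(spec: str) -> dict:
    """Sketch of a parser for the pipe-delimited strings above (assumed format)."""
    params = {}
    for item in spec.split("|"):
        key, _, value = item.partition("=")
        params[key] = value.strip("'")  # quoted strings lose their quotes
    return params

cfg = parse_model_params(
    "model_type='COMBO2'|model1_type='sm'|model2_type='proj_contrastiveBC1'"
    "|model2_ratio=0.5|proj_dim=0"
)
# cfg["model2_ratio"] == '0.5' (values stay as strings in this sketch)
```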
```bibtex
@article{kulatilleke2022nbcsoftmax,
  title={NBC-Softmax: Darkweb Author fingerprinting and migration tracking},
  author={Kulatilleke, Gayan K and Chandra, Shekhar S and Portmann, Marius},
  journal={arXiv preprint arXiv:2212.08184},
  year={2022}
}
```