Revisiting Color-Event based Tracking: A Unified Network, Dataset, and Metric, Chuanming Tang, Xiao Wang, Ju Huang, Bo Jiang, Lin Zhu, Jianlin Zhang, Yaowei Wang, Yonghong Tian [Project]
- 🔥 [2024.03.12] A New Long-term RGB-Event based Visual Object Tracking Benchmark Dataset (termed FELT) is available at [Paper] [Code] [DemoVideo]
- 🔥 [2024.03.06] Tracking results of CEUTrack on the VisEvent dataset are available at [ceutrack_visevent_dataset_tracking_results.zip]
- 🔥 [2023.09.27] A High Definition (HD) Event based Visual Object Tracking Benchmark Dataset (termed EventVOT) is available at [arXiv] [Github] [YouTube]
Baidu download link: https://pan.baidu.com/s/12XDlKABlz3lDkJJEDvsu9A (Passcode: AHUT)
The downloaded dataset should have the following directory structure (a minimal Python loading sketch follows the tree):
├── COESOT dataset
    ├── Training Subset (827 videos, 160GB)
        ├── dvSave-2021_09_01_06_59_10
            ├── dvSave-2021_09_01_06_59_10_aps
            ├── dvSave-2021_09_01_06_59_10_dvs
            ├── dvSave-2021_09_01_06_59_10.aedat4
            ├── groundtruth.txt
            ├── absent.txt
            ├── start_end_index.txt
        ├── ...
    ├── Testing Subset (528 videos, 105GB)
        ├── dvSave-2021_07_30_11_04_12
            ├── dvSave-2021_07_30_11_04_12_aps
            ├── dvSave-2021_07_30_11_04_12_dvs
            ├── dvSave-2021_07_30_11_04_12.aedat4
            ├── groundtruth.txt
            ├── absent.txt
            ├── start_end_index.txt
        ├── ...
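For reference, the sketch below walks the sequence folders and reads the annotations. It is a minimal example, assuming the dataset is extracted under ./data/COESOT and that groundtruth.txt stores one comma- (or tab-) separated box per frame in x,y,w,h order; check a sample file before relying on these assumptions.

```python
# Minimal sketch for iterating COESOT sequence folders (illustrative only).
# Assumes groundtruth.txt holds one comma- or tab-separated box per frame (x,y,w,h).
import os

def list_sequences(subset_dir):
    """Yield (sequence_name, sequence_path) for every dvSave-* folder."""
    for name in sorted(os.listdir(subset_dir)):
        seq_path = os.path.join(subset_dir, name)
        if os.path.isdir(seq_path) and name.startswith("dvSave-"):
            yield name, seq_path

def load_groundtruth(seq_path):
    """Read groundtruth.txt into a list of [x, y, w, h] floats."""
    boxes = []
    with open(os.path.join(seq_path, "groundtruth.txt")) as f:
        for line in f:
            line = line.strip()
            if line:
                boxes.append([float(v) for v in line.replace("\t", ",").split(",")])
    return boxes

if __name__ == "__main__":
    root = "./data/COESOT/Training Subset"  # adjust to the actual folder name on disk
    for name, seq_path in list_sequences(root):
        print(name, "annotated frames:", len(load_groundtruth(seq_path)))
```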
- Unzip COESOT_eval_toolkit.zip and open it with MATLAB (R2020 or later).
- Add your tracking results and the baseline results (Passcode: siaw) to $/coesot_tracking_results/ and modify the tracker names in $/utils/config_tracker.m. We also provide results of event-only baseline trackers in [Event_only Results] (Passcode: qblp).
- Run Evaluate_COESOT_benchmark_SP_PR_only.m for the overall performance evaluation (SR, PR, NPR); a minimal illustration of how SR and PR are computed appears after this list.
- Run plot_BOC.m for BOC score evaluation and figure plotting.
- Run plot_radar.m to plot the attribute radar figure.
- Run Evaluate_COESOT_benchmark_attributes.m for attribute analysis; the figures are saved in $/res_fig/.
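For a rough sanity check outside MATLAB, the sketch below illustrates what SR (area under the success curve over IoU thresholds) and PR (fraction of frames whose center error is below a pixel threshold, commonly 20 px) typically measure in single object tracking. It is not the toolkit's code; official scores should come from the MATLAB scripts above.

```python
# Illustration of SR/PR as commonly defined for single-object tracking.
# Boxes are assumed to be (x, y, w, h) in pixels.
import numpy as np

def iou(box_a, box_b):
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    ix1, iy1 = max(ax, bx), max(ay, by)
    ix2, iy2 = min(ax + aw, bx + bw), min(ay + ah, by + bh)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def success_rate(pred_boxes, gt_boxes):
    """SR: area under the success curve over IoU thresholds in [0, 1]."""
    overlaps = np.array([iou(p, g) for p, g in zip(pred_boxes, gt_boxes)])
    thresholds = np.linspace(0, 1, 21)
    return float(np.mean([(overlaps > t).mean() for t in thresholds]))

def precision_rate(pred_boxes, gt_boxes, dist_thr=20.0):
    """PR: fraction of frames with center location error below dist_thr pixels."""
    errs = []
    for p, g in zip(pred_boxes, gt_boxes):
        pc = (p[0] + p[2] / 2.0, p[1] + p[3] / 2.0)
        gc = (g[0] + g[2] / 2.0, g[1] + g[3] / 2.0)
        errs.append(np.hypot(pc[0] - gc[0], pc[1] - gc[1]))
    return float((np.array(errs) <= dist_thr).mean())
```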
A unified framework for color-event tracking.
[Models] Passcode:0uk0 [Raw Results] Passcode:yeow [Training logs] Passcode:hnim
Install env
conda create -n event python=3.7
conda activate event
bash install.sh
Run the following command to set paths for this project
python tracking/create_default_local_file.py --workspace_dir . --data_dir ./data --save_dir ./output
After running this command, you can also modify the paths by editing these two files:
lib/train/admin/local.py # paths about training
lib/test/evaluation/local.py # paths about testing
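For reference, the generated files mostly just assign directory paths. The sketch below shows the kind of content you can expect in lib/test/evaluation/local.py; the attribute names shown are assumptions, so keep whatever create_default_local_file.py actually generates and only edit the path values.

```python
# Illustrative sketch of lib/test/evaluation/local.py; the attribute names are
# assumptions -- keep the ones generated by create_default_local_file.py.
from lib.test.evaluation.environment import EnvSettings

def local_env_settings():
    settings = EnvSettings()
    settings.coesot_path = './data/COESOT'                    # dataset location (assumed name)
    settings.results_path = './output/test/tracking_results'  # where raw results are written
    settings.save_dir = './output'
    return settings
```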
Then, put the COESOT dataset in ./data.
Download the pre-trained MAE ViT-Base weights and put them under $/pretrained_models.
Download the CEUTrack model weights and put them under $/output/checkpoints/train/ceutrack.
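Optionally, you can verify that a downloaded checkpoint deserializes before launching training or testing. The file name below is an assumption; point it at whichever .pth file you actually placed in the folder.

```python
# Quick sanity check that a downloaded checkpoint loads; the file name is an
# assumption -- replace it with the file you actually downloaded.
import torch

ckpt_path = './pretrained_models/mae_pretrain_vit_base.pth'  # hypothetical name
ckpt = torch.load(ckpt_path, map_location='cpu')
state_dict = ckpt['model'] if isinstance(ckpt, dict) and 'model' in ckpt else ckpt
print(f'{ckpt_path}: {len(state_dict)} top-level entries')
```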
- [Note] More useful scripts can be found at: https://github.com/Event-AHU/COESOT/tree/main/CEUTrack/scripts
# train
export CUDA_VISIBLE_DEVICES=0
python tracking/train.py --script ceutrack --config ceutrack_coesot \
--save_dir ./output --mode multiple --nproc_per_node 1 --use_wandb 0
# test
python tracking/test.py ceutrack ceutrack_coesot --dataset coesot --threads 4 --num_gpus 1
# eval
python tracking/analysis_results.py --dataset coesot --parameter_name ceutrack_coesot
Note: The speeds reported in our paper were tested on a single RTX 3090 GPU.
# Profiling ceutrack_coesot
python tracking/profile_model.py --script ceutrack --config ceutrack_coesot
For CAM visualization, use the script [show_CAM.py]:
from .show_CAM import getCAM
getCAM(response, curr_image, self.idx)
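For context, getCAM is called once per frame with the tracker's response map, the current image, and a running frame index. The snippet below is a hypothetical usage sketch under those assumptions; the tracker API it shows is not the repository's exact code.

```python
# Hypothetical usage sketch: call getCAM once per tracked frame. The tracker API
# below (tracker.track returning a response map) is assumed for illustration.
from show_CAM import getCAM  # use "from .show_CAM import getCAM" inside the package

def track_with_cam(tracker, frames):
    for idx, curr_image in enumerate(frames):
        response = tracker.track(curr_image)  # score/response map for this frame (assumed)
        getCAM(response, curr_image, idx)     # overlay and save the CAM visualization
```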
- Paper (arXiv) release
- COESOT dataset release
- Evaluation Toolkit release
- Source Code release
- Tracking Models release
- Thanks to OSTrack, PyTracking, and the ViT library, which enabled a quick implementation.
@article{tang2022coesot,
title={Revisiting Color-Event based Tracking: A Unified Network, Dataset, and Metric},
author={Tang, Chuanming and Wang, Xiao and Huang, Ju and Jiang, Bo and Zhu, Lin and Zhang, Jianlin and Wang, Yaowei and Tian, Yonghong},
journal={arXiv preprint arXiv:2211.11010},
year={2022}
}