# Curbing Task Interference using Representation Similarity-Guided Multi-Task Feature Sharing
OUTPUT_DIR: Directory to save output contents. <br />
DATA_DIR: Directory containing the datasets. <br />
MODEL_DIR: Directory containing the trained models. <br />

## Environment:

The conda_env_local.yml file can be used to create an Anaconda environment to run the code.
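For example, the environment can be created and activated as follows (ENV_NAME is a placeholder for whatever name is defined inside conda_env_local.yml): <br />

conda env create -f conda_env_local.yml <br />
conda activate ENV_NAME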
## Training script:

To train the One-De model on the Cityscapes dataset: <br />
python train.py --batch-size 8 --workers 8 --data-folder /DATA_DIR/Cityscapes --crop-size 512 1024 --checkname train_cs --config-file ./model_cfgs/cityscapes/one_de.yaml --epochs 140 --lr .0001 --output-dir OUTPUT_DIR --lr-strategy stepwise --lr-decay 98 126 --base-optimizer RAdam --dataset cityscapes
<br />
Other model configs can be found in the 'model_cfgs' directory.

Models can be evaluated by passing the --eval-only arg to the train script.
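For example, a trained One-De model could be evaluated with a command along the following lines; note that the --resume flag and checkpoint path are borrowed from the explain.py command below as an assumption, and the exact set of arguments may differ for your setup: <br />

python train.py --batch-size 8 --workers 8 --data-folder /DATA_DIR/Cityscapes --crop-size 512 1024 --config-file ./model_cfgs/cityscapes/one_de.yaml --output-dir OUTPUT_DIR --dataset cityscapes --resume MODEL_DIR/model_latest_140.pth --eval-only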
## Get CKA similarities and task groupings:
The following command runs grouping using the separate decoder (Sep-De) model. <br />
python explain.py --batch-size 4 --workers 0 --crop-size 480 640 --config-file ./model_cfgs/cityscapes/sep_de_group.yaml --resume MODEL_DIR/model_latest_140.pth --data-folder /DATA_DIR/NYUv2 --data-folder-1 /DATA_DIR/NYUv2/image/train --explainer-name CKA --compare-tasks --dataset cityscapes
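For reference, the similarity measure used for grouping is Centered Kernel Alignment (CKA). Below is a minimal NumPy sketch of linear CKA between two sets of task representations; it is a generic illustration of the measure, not the implementation inside explain.py, and the function name and example shapes are chosen only for illustration.

```python
import numpy as np

def linear_cka(x, y):
    """Linear CKA between two activation matrices of shape (num_examples, num_features)."""
    # Center every feature dimension around zero before comparing.
    x = x - x.mean(axis=0, keepdims=True)
    y = y - y.mean(axis=0, keepdims=True)
    # ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    numerator = np.linalg.norm(y.T @ x, ord="fro") ** 2
    denominator = np.linalg.norm(x.T @ x, ord="fro") * np.linalg.norm(y.T @ y, ord="fro")
    return numerator / denominator

# Example: similarity between hypothetical features of two tasks on the same batch.
feats_task_a = np.random.randn(64, 128)
feats_task_b = np.random.randn(64, 256)
print(linear_cka(feats_task_a, feats_task_b))
```

Loosely speaking, task pairs whose representations score a high CKA similarity are candidates for being grouped into a shared decoder.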
## Cite Our Work
If you find the code concerning Progressive Decoder Fusion (PDF) useful in your research, please consider citing our paper: <br />

Pending.

If you find the code for UniNet useful in your research, please consider citing our paper: <br />
@InProceedings {Gurulingan_2021_ICCV, <br />
author = {Gurulingan, Naresh Kumar and Arani, Elahe and Zonooz, Bahram}, <br />