
naver-ai/matchme


Jiwon Kim1,2*, Byeongho Heo1, Sangdoo Yun1, Seungryong Kim3, Dongyoon Han1
* Work done during an internship at NAVER AI Lab, currently at LG AI Research
† Corresponding author

1NAVER AI Lab, 2LG AI Research, 3KAIST

paper

Abstract

Semantic correspondence methods have advanced to obtain high-quality correspondences by employing complex networks that maximize model capacity. However, despite these performance improvements, they remain constrained by the scarcity of training keypoint pairs, a consequence of limited training images and the sparsity of keypoint annotations. This paper builds on the hypothesis that learning semantic correspondence is inherently data-hungry and shows that models can be trained further with densified training pairs. We demonstrate that a simple machine annotator reliably enriches paired keypoints from unlabeled images via machine supervision, requiring neither extra labeled keypoints nor trainable modules. Consequently, our models surpass current state-of-the-art models on semantic correspondence benchmarks such as SPair-71k, PF-PASCAL, and PF-WILLOW, and enjoy further robustness on corruption benchmarks.
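
For intuition, here is a minimal, hypothetical sketch of machine-annotated pair densification: a matcher propagates labeled keypoints to new image pairs and keeps only round-trip-consistent matches as pseudo labels. The `toy_match` stand-in and the cycle-consistency filter are illustrative assumptions, not the paper's exact procedure.

```python
# A hypothetical sketch of densifying keypoint pairs with a machine annotator.
# `match` is any function mapping source keypoints to target coordinates; the
# forward-backward (cycle) filter is a common pseudo-labeling heuristic and is
# an assumption here, not necessarily the paper's exact rule.
import numpy as np

def cycle_consistent_pseudo_labels(match, src_img, tgt_img, src_kps, pix_thresh=5.0):
    """Return keypoint pairs whose round-trip reprojection error is small."""
    fwd = match(src_img, tgt_img, src_kps)   # src -> tgt predictions
    bwd = match(tgt_img, src_img, fwd)       # tgt -> src re-projections
    err = np.linalg.norm(bwd - src_kps, axis=1)
    keep = err < pix_thresh                  # small round-trip error => reliable pair
    return src_kps[keep], fwd[keep]

if __name__ == "__main__":
    rng = np.random.default_rng(0)

    def toy_match(src_img, tgt_img, kps):
        # Stand-in matcher: identity mapping plus noise; replace with a real model.
        return kps + rng.normal(0.0, 2.0, kps.shape)

    kps = rng.uniform(0, 256, (20, 2))       # 20 random source keypoints
    src_keep, tgt_keep = cycle_consistent_pseudo_labels(toy_match, None, None, kps)
    print(f"kept {len(src_keep)}/20 machine-annotated pairs")
```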

Our Motivation

Current semantic correspondence learning suffers from data hunger during training:

  • (a) Labeled images in SPair-71k contain only sparse, manually annotated keypoint pairs.

  • (b) Unlabeled images can serve as hidden supplementary sources to increase the density of pairs.

  • (c) Newly expanded image pairs provide abundant, densified pairs that alleviate this data hunger.


New Benchmark

Visualization of corrupted images in SPair-C


Corrupted images at different severities

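
As a rough sketch of how such corrupted variants can be produced, the snippet below applies ImageNet-C-style corruptions at severities 1 to 5 using the `imagecorruptions` package. Treating SPair-C as following this common corruption protocol is an assumption here; consult the paper for the exact corruption set.

```python
# A sketch of generating corrupted images at increasing severities, assuming an
# ImageNet-C-style protocol (pip install imagecorruptions); the corruption name
# and severity range below are illustrative, not the official SPair-C recipe.
import numpy as np
from imagecorruptions import corrupt

image = np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8)  # stand-in image

for severity in range(1, 6):  # severity 1 (mild) .. 5 (strong)
    out = corrupt(image, corruption_name="gaussian_noise", severity=severity)
    print(severity, out.shape, out.dtype)
```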

Updates

  • (2024/10/11): Code is under internal review.
  • (2024/09/20): Our paper has been accepted to ACCV 2024 🎉🎉🎉
