Awesome papers & datasets specifically focused on long-term videos.
Updated Nov 15, 2024
Related work on Temporal Sentence Grounding in Videos / Natural Language Video Localization / Video Moment Retrieval
Source code of our RaNet (EMNLP 2021)
Source code of our MGPN (SIGIR 2022)
PyTorch implementation of the paper "Gaussian Mixture Proposals with Pull-Push Learning Scheme to Capture Diverse Events for Weakly Supervised Temporal Video Grounding" (AAAI 2024).
ACM Multimedia 2023 - Temporal Sentence Grounding in Streaming Videos
Paper list on Video Moment Retrieval (VMR), also known as Natural Language Video Localization (NLVL), Video Grounding (VG), or Temporal Sentence Grounding in Videos (TSGV)