Stars
✨✨Latest Advances on Multimodal Large Language Models
The official implementation of the paper "The Change You Want to See" (WACV 2023).
Green Frog: the easyScholar-data edition of https://github.com/redleafnew/zotero-updateifs. Updates impact factors and provides a range of other tools; see the Readme for details.
Code for ECCV 2022 paper "Natural Synthetic Anomalies for Self-Supervised Anomaly Detection and Localization".
Unofficial, work-in-progress PyTorch implementation of CutPaste.
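For orientation, the core CutPaste augmentation is simple enough to sketch in a few lines: cut a random rectangular patch from an image and paste it back at a random location to synthesize a local anomaly for self-supervised training. This is a minimal illustration, not the repo's code; the patch-size bounds are arbitrary choices.

```python
import random
from PIL import Image

def cutpaste(image: Image.Image) -> Image.Image:
    """Minimal CutPaste-style augmentation (illustrative sketch)."""
    w, h = image.size
    # Patch dimensions: a small, randomly chosen fraction of the image.
    pw = random.randint(w // 10, w // 4)
    ph = random.randint(h // 10, h // 4)
    # Cut a patch from a random source location.
    sx, sy = random.randint(0, w - pw), random.randint(0, h - ph)
    patch = image.crop((sx, sy, sx + pw, sy + ph))
    # Paste it at a random destination (may overlap the source).
    dx, dy = random.randint(0, w - pw), random.randint(0, h - ph)
    out = image.copy()
    out.paste(patch, (dx, dy))
    return out

# Example usage ("screw.png" is a placeholder path):
# anomalous = cutpaste(Image.open("screw.png"))
```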
A list of accepted papers in recent IJCAI about anomaly detection.
Implementation of the CVPR'23 paper "WinCLIP: Zero-/few-shot anomaly classification and segmentation". It reproduces the zero-/few-shot anomaly detection performance reported in the original paper.
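As background, here is a minimal sketch of the CLIP-based zero-shot scoring idea that WinCLIP builds on, written against Hugging Face's CLIP rather than the repo's own code. The prompts, checkpoint name, and "sample.png" path are illustrative; WinCLIP itself adds window-based aggregation and prompt ensembles on top of this basic scheme.

```python
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Two-class text prompts: index 0 = normal, index 1 = anomalous.
prompts = ["a photo of a normal object", "a photo of a damaged object"]
image = Image.open("sample.png")  # placeholder path

inputs = processor(text=prompts, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(**inputs).logits_per_image  # image-to-text similarities
# Anomaly score = probability mass on the "damaged" prompt.
anomaly_score = logits.softmax(dim=-1)[0, 1].item()
print(f"anomaly score: {anomaly_score:.3f}")
```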
A world of intelligent edge computing
Official code for "Modeling the Background for Incremental Learning in Semantic Segmentation" (https://arxiv.org/abs/2002.00718).
GRAIN is a new pretraining strategy for contrastive vision-language models that learns fine-grained visual features through grounding.
[KBS] Dual-path Frequency Discriminators for Few-shot Anomaly Detection
DINO-X: The World's Top-Performing Vision Model for Open-World Object Detection and Understanding
The LOCO-Annotations dataset is a specialized extension of the MVTec LOCO dataset, focusing on detecting and analyzing high-level semantic logical anomalies in industrial settings.
Code and data for the first-ever comprehensive benchmark for multimodal large language models in industrial anomaly detection.
A collection of papers on the topic of "Computer Vision in the Wild (CVinW)".
Project Page for "LISA: Reasoning Segmentation via Large Language Model"
Code and models for NExT-GPT: Any-to-Any Multimodal Large Language Model
Unsupervised text tokenizer for Neural Network-based text generation.
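This description matches SentencePiece; assuming that is the repo, a minimal train-and-encode sketch looks like the following, where "corpus.txt" and the 8000-token vocabulary are placeholder choices.

```python
import sentencepiece as spm

# Train a subword model directly on raw text, no pre-tokenization needed.
spm.SentencePieceTrainer.train(
    input="corpus.txt", model_prefix="m", vocab_size=8000
)

sp = spm.SentencePieceProcessor(model_file="m.model")
print(sp.encode("This is a test.", out_type=str))  # subword pieces
print(sp.decode(sp.encode("This is a test.")))     # round-trips to the input
```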
[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.
An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.
[TLLM'23] PandaGPT: One Model To Instruction-Follow Them All