Stars
[CVPR 2024] Depth Anything: Unleashing the Power of Large-Scale Unlabeled Data. Foundation Model for Monocular Depth Estimation
Fuse Whisper and pyannote results
Demonstrations of Loss of Plasticity and Implementation of Continual Backpropagation
Set of React components for PDF annotation
Semantic search engine indexing 95 million academic publications
152334H / tortoise-tts-fast
Forked from neonbjb/tortoise-tts. Fast TorToiSe inference (5x or your money back!)
[WACV2021] Foreground-aware Semantic Representations for Image Harmonization https://arxiv.org/abs/2006.00809
OpenAssistant is a chat-based assistant that understands tasks, can interact with third-party systems, and retrieve information dynamically to do so.
Cross-domain Correspondence Learning for Exemplar-based Image Translation. (CVPR 2020 Oral)
Pythonic AI generation of images and videos
An integrated solution for authoring / importing / simulating / rendering strand-based hair in Unity.
A multi-voice TTS system trained with an emphasis on quality
This package provides a PyTorch implementation of "GAN-Control: Explicitly Controllable GANs", ICCV 2021.
Code for the ICCV 2019 paper "Symmetric Cross Entropy for Robust Learning with Noisy Labels"
[CVPR 2022] StyleSwin: Transformer-based GAN for High-resolution Image Generation
An implementation of the efficient attention module.
fenollp / mediapipe
Forked from google-ai-edge/mediapipe. MediaPipe is a cross-platform framework for building multimodal applied machine learning pipelines.
Effort to incorporate MediaPipe into the Nvidia Jetson ecosystem
A collection of resources on applications of Transformers in Medical Imaging.
A WebGL viewer for UMAP or TSNE-clustered images
A list of efficient attention modules
Tensorflow implementation of An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale https://arxiv.org/abs/2010.11929v2
[TOG 2022] SofGAN: A Portrait Image Generator with Dynamic Styling
[CVPR 2020 Workshop] A PyTorch GAN library that reproduces research results for popular GANs.