Transfer Learning Library for Domain Adaptation, Task Adaptation, and Domain Generalization
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
BOND: BERT-Assisted Open-Domain Named Entity Recognition with Distant Supervision
A repository containing implementations of more than 12 common statistical machine learning algorithms, with explanations of their underlying principles.
Code for "Confidence Regularized Self-Training" (ICCV 2019, Oral)
[CVPR 2022] ST++: Make Self-training Work Better for Semi-supervised Semantic Segmentation
[NAACL 2021] This is the code for our paper "Fine-Tuning Pre-trained Language Model with Weak Supervision: A Contrastive-Regularized Self-Training Approach".
Code for "Domain Adaptation for Semantic Segmentation via Class-Balanced Self-Training" (ECCV 2018)
PromptDet: Towards Open-vocabulary Detection using Uncurated Images, ECCV2022
Dedicate 1,000 hours of focused attention and you can master anything you need.
PyTorch code for MUST
Self6D++: Occlusion-Aware Self-Supervised Monocular 6D Object Pose Estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI) 2021.
IAST: Instance Adaptive Self-training for Unsupervised Domain Adaptation (ECCV 2020) https://teacher.bupt.edu.cn/zhuchuang/en/index.htm
SLAM-Supported Semi-Supervised Learning for 6D Object Pose Estimation
[EMNLP 2021] Distantly-Supervised Named Entity Recognition with Noise-Robust Learning and Language Model Augmented Self-Training
PyTorch implementation of our paper: Adapting OCR with Limited Labels
Exploring prompt tuning with pseudo-labels for multiple modalities, learning settings, and training strategies.
Improving Human Activity Recognition through Self-training with Unlabeled Data
[IEEE TETCI] "ADAST: Attentive Cross-domain EEG-based Sleep Staging Framework with Iterative Self-Training"
[EMNLP 2022 Findings] Towards Realistic Low-resource Relation Extraction: A Benchmark with Empirical Baseline Study
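Many of the repositories above implement variants of the same core self-training loop: fit a model on labeled data, pseudo-label the unlabeled examples it is confident about, fold those into the training set, and refit. A minimal sketch of that loop, using a toy nearest-centroid classifier as the base model (an illustrative stand-in, not the method of any specific repo):

```python
import numpy as np

def nearest_centroid_fit(X, y):
    # Base model: one centroid per class label.
    classes = np.unique(y)
    centroids = np.stack([X[y == c].mean(axis=0) for c in classes])
    return classes, centroids

def predict_with_confidence(X, classes, centroids):
    # Distance to each centroid; pseudo-confidence via softmax over -distance.
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    scores = np.exp(-d)
    probs = scores / scores.sum(axis=1, keepdims=True)
    return classes[probs.argmax(axis=1)], probs.max(axis=1)

def self_train(X_l, y_l, X_u, threshold=0.6, rounds=3):
    # Iteratively pseudo-label confident unlabeled points and refit.
    X, y, remaining = X_l.copy(), y_l.copy(), X_u.copy()
    for _ in range(rounds):
        if len(remaining) == 0:
            break
        classes, centroids = nearest_centroid_fit(X, y)
        preds, conf = predict_with_confidence(remaining, classes, centroids)
        keep = conf >= threshold          # confidence thresholding
        if not keep.any():
            break
        X = np.vstack([X, remaining[keep]])
        y = np.concatenate([y, preds[keep]])
        remaining = remaining[~keep]
    return nearest_centroid_fit(X, y)
```

The confidence threshold is the knob most of the papers above refine: class-balanced selection (CBST), instance-adaptive thresholds (IAST), or confidence regularization (CRST) all replace this fixed cutoff with something less prone to confirmation bias.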