Awesome Knowledge Distillation
Updated Oct 14, 2024
MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. NeurIPS 2020 Workshop.
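MEAL V2 distills an ensemble of teachers into a single ResNet-50 using soft labels plus a discriminator loss. Below is a minimal PyTorch sketch of the underlying soft-label distillation term only, not the paper's full recipe; the temperature value is an illustrative assumption.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between temperature-softened teacher and student distributions."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

# Example: a batch of 8 images over 1000 ImageNet classes.
student_logits = torch.randn(8, 1000)
teacher_logits = torch.randn(8, 1000)
loss = distillation_loss(student_logits, teacher_logits)
```

In MEAL V2 the teacher signal comes from an averaged ensemble and is combined with a discriminator loss; the snippet shows only the soft-label term.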
[ICML 2023] Revisiting Data-Free Knowledge Distillation with Poisoned Teachers
[ICML 2024] Exploration and Anti-exploration with Distributional Random Network Distillation
Knowledge distillation for training multi-exit models
An R package providing functions for interpreting and distilling machine learning models
The aim is to remove these light constituents by distillation (flash or stripping). A preliminary study of the process operating conditions can be carried out on a pseudo-binary basis: the C7 cut is treated as n-heptane and the light components as ethane. We wish to construct the [T-x-y], [x-y], and [h-x-y] diagrams of the ethane-n-heptane binary u…
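As a sketch of how such a [T-x-y] curve can be generated, the following assumes ideal Raoult's law behavior, which is a crude approximation for the strongly asymmetric ethane/n-heptane pair, and uses approximate Antoine constants for illustration only; check a data bank before relying on them.

```python
import numpy as np
from scipy.optimize import brentq

def psat_mmHg(A, B, C, T):
    """Antoine equation: log10(Psat[mmHg]) = A - B / (C + T[degC])."""
    return 10.0 ** (A - B / (C + T))

# Approximate Antoine constants (A, B, C) -- illustrative values only.
ETHANE = (6.80266, 656.40, 256.00)
HEPTANE = (6.89385, 1264.37, 216.64)

def bubble_T(x_ethane, P_mmHg):
    """Bubble-point temperature: solve x1*Psat1(T) + x2*Psat2(T) = P."""
    f = lambda T: (x_ethane * psat_mmHg(*ETHANE, T)
                   + (1.0 - x_ethane) * psat_mmHg(*HEPTANE, T) - P_mmHg)
    return brentq(f, -100.0, 200.0)  # bracket wide enough for both pure components

P = 760.0  # 1 atm
for x in np.linspace(0.0, 1.0, 6):
    T = bubble_T(x, P)
    y = x * psat_mmHg(*ETHANE, T) / P  # equilibrium vapor mole fraction of ethane
    print(f"x = {x:.2f}   T = {T:7.1f} degC   y = {y:.3f}")
```

Each bubble-point solve fixes the liquid composition x and finds the temperature at which the total pressure is matched; the paired vapor composition y gives the second branch of the [T-x-y] diagram.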
Easily generate synthetic data for classification tasks using LLMs
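The repository above points at LLM-driven data synthesis; here is a minimal, provider-agnostic sketch of the pattern. `call_llm`, the prompt template, and the label set are hypothetical placeholders, not the repository's API.

```python
import json

LABELS = ["positive", "negative"]  # hypothetical label set

PROMPT = (
    "Write one short product review expressing {label} sentiment. "
    'Return JSON: {{"text": "...", "label": "{label}"}}'
)

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for an LLM client; replace with a real provider call."""
    raise NotImplementedError("wire this up to your provider's API")

def generate_examples(n_per_label=5):
    """Collect n_per_label synthetic examples for each class label."""
    dataset = []
    for label in LABELS:
        for _ in range(n_per_label):
            raw = call_llm(PROMPT.format(label=label))
            dataset.append(json.loads(raw))  # assumes the model emits valid JSON
    return dataset
```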
Code reproduction of the paper "Distillation Decision Tree"
This repository combines the bubble-point algorithm and the Naphtali-Sandholm algorithm to robustly compute distillation separations with partial condensers
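For context, the kernel of the bubble-point (BP) column method is the stage-temperature solve sum_i K_i(T) x_i = 1. A minimal sketch follows, assuming an ideal K-value model (K_i = Psat_i / P via Antoine); the component set and constants are illustrative, not the repository's thermodynamics.

```python
import numpy as np
from scipy.optimize import brentq

# Approximate Antoine constants (log10, mmHg, degC) -- illustrative only.
ANTOINE = {
    "propane":   (6.80398, 803.81, 246.99),
    "n-butane":  (6.80896, 935.86, 238.73),
    "n-pentane": (6.85221, 1064.63, 232.00),
}

def K_values(T, P_mmHg):
    """Ideal K-values K_i = Psat_i(T) / P from the Antoine equation."""
    return np.array([10.0 ** (A - B / (C + T)) / P_mmHg
                     for A, B, C in ANTOINE.values()])

def bubble_point_T(x, P_mmHg, T_lo=-100.0, T_hi=300.0):
    """Solve sum_i K_i(T) * x_i = 1 for the stage temperature."""
    x = np.asarray(x, dtype=float)
    return brentq(lambda T: K_values(T, P_mmHg) @ x - 1.0, T_lo, T_hi)

x = [0.3, 0.4, 0.3]                      # liquid mole fractions on a stage
T = bubble_point_T(x, 760.0)             # stage temperature at 1 atm
y = K_values(T, 760.0) * np.asarray(x)   # equilibrium vapor composition
print(T, y, y.sum())                     # y sums to 1 at the bubble point
```

In a full BP column run, this solve is repeated per stage inside the outer loop that updates component flows and energy balances.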
Improving typhoon center location models with augmented typhoon images and distillation methods
CISPA Summer Internship