Binary-Efficient

Binarizing MobileNet via Evolution-based Searching
Year: 2020

citations: 24
conference: CVPR
paper: https://openaccess.thecvf.com/content_CVPR_2020/papers/Phan_Binarizing_MobileNet_via_Evolution-Based_Searching_CVPR_2020_paper.pdf
abstract: First, we train strong baseline binary networks with a wide range of random group combinations at each convolutional layer. An evolutionary search then selects the group combination for each layer. Accuracy is quite low, though.
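A toy sketch of the per-layer group search the abstract describes. The `fitness` callback is a hypothetical stand-in: in the paper it would involve training and validating a candidate binarized network, here it is just any function scoring a list of group counts.

```python
import random

def evolve_group_config(num_layers, choices, fitness, generations=20, pop=16):
    """Toy evolutionary search over per-layer group counts.

    Keeps the fitter half of the population each generation and
    produces children by mutating one layer's group count.
    """
    random.seed(0)
    population = [[random.choice(choices) for _ in range(num_layers)]
                  for _ in range(pop)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop // 2]          # elitist selection
        children = []
        for p in parents:
            child = p[:]
            i = random.randrange(num_layers)      # mutate one layer's groups
            child[i] = random.choice(choices)
            children.append(child)
        population = parents + children
    return max(population, key=fitness)
```

With a trivial fitness like `sum`, the search just drifts toward larger group counts; the point is only the select-and-mutate loop over per-layer configurations.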


ShiftAddNet: A Hardware-Inspired Deep Network
Year: 2020

citations: 34
conference: NeurIPS
paper: https://proceedings.neurips.cc/paper/2020/file/1cf44d7975e6c86cffa70cae95b5fbb2-Paper.pdf
abstract: Multiplication can be instead performed with additions and logical bit-shifts. The idea is neat, but it doesn't really connect to quantization or binarization: it can't serve as an auxiliary technique, and using it as the main one would also be odd.
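The primitive the paper builds on is just shift-and-add multiplication. A minimal integer sketch (not the paper's layer design, only the arithmetic identity it exploits):

```python
def shift_add_mul(x: int, w: int) -> int:
    """Multiply x by w using only additions and logical bit-shifts.

    Each set bit of |w| at position `shift` contributes x << shift,
    since w = sum of 2**shift over its set bits.
    """
    neg = w < 0
    w = abs(w)
    acc = 0
    shift = 0
    while w:
        if w & 1:               # this bit of w contributes x << shift
            acc += x << shift
        w >>= 1
        shift += 1
    return -acc if neg else acc
```

E.g. `shift_add_mul(7, 13)` computes `7 + (7 << 2) + (7 << 3) = 91` without a single multiply.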


Least squares binary quantization of neural networks
Year: 2020

citations: 22
conference: CVPR
paper: https://openaccess.thecvf.com/content_CVPRW_2020/papers/w40/Pouransari_Least_Squares_Binary_Quantization_of_Neural_Networks_CVPRW_2020_paper.pdf
abstract: The optimization is convoluted; it's hard to tell what it actually consists of. The resulting accuracy is underwhelming.
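For the 1-bit case the least-squares objective has a closed form: approximating a weight vector w by a·sign(w), the scale a = mean(|w|) minimizes ||w − a·sign(w)||². A small sketch of that base case (the paper extends this to multi-bit codes):

```python
def binarize_least_squares(w):
    """1-bit least-squares quantization of a weight list.

    Approximates w by a * b with b = sign(w); the squared error
    ||w - a*b||^2 is minimized at a = mean(|w|).
    """
    a = sum(abs(x) for x in w) / len(w)
    b = [1.0 if x >= 0 else -1.0 for x in w]
    return a, b
```

So `binarize_least_squares([0.5, -1.5, 1.0])` returns scale `1.0` and signs `[1.0, -1.0, 1.0]`.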


INSTA-BNN: Binary Neural Network with INSTAnce-aware Threshold
Year: 2022

citations: 2
conference: preprint
paper: https://arxiv.org/abs/2204.07439

TODO


AdaBin: Improving Binary Neural Networks with Adaptive Binary Sets
Year: 2022

citations: ?
conference: preprint
paper: https://arxiv.org/abs/2208.08084

TODO



Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization
Year: 2019

citations: 73
conference: NeurIPS
paper: https://proceedings.neurips.cc/paper/2019/file/9ca8c9b0996bbf05ae7753d34667a6fd-Paper.pdf
abstract: They argue that the FP weights underlying the binary ones only play the role of inertia. Agreed. Adapting these ideas could be very useful for improving training in general.
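The paper's Bop optimizer follows from that view: instead of latent FP weights, keep only a gradient moving average per binary weight and flip the sign when the average is both large enough and pointing with the weight. A per-weight sketch of the update rule:

```python
def bop_step(signs, grads, momentum, gamma=1e-3, tau=1e-6):
    """One Bop step over lists of binary weights (+1/-1) and gradients.

    m is an exponential moving average of the gradient; a weight flips
    when |m| exceeds the threshold tau and m agrees in sign with the
    weight (consistent pressure to move past zero).
    """
    new_signs, new_m = [], []
    for s, g, m in zip(signs, grads, momentum):
        m = (1 - gamma) * m + gamma * g
        if abs(m) > tau and m * s > 0:   # consistent pressure -> flip
            s = -s
        new_signs.append(s)
        new_m.append(m)
    return new_signs, new_m
```

The moving average plays exactly the "inertia" role the note mentions: a single noisy gradient can't flip a weight, only a sustained trend can.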


AsymmNet: Towards ultralight convolution neural networks using asymmetrical bottlenecks
Year: 2021

citations: 5
conference: CVPR
paper: https://openaccess.thecvf.com/content_CVPR_2020/papers/Xu_Learning_in_the_Frequency_Domain_CVPR_2020_paper.pdf code: https://github.com/calmevtime/DCTNet review: reuses old features, which helps keep the signal going.


Spectral section


CAN TRY:
Learning in the Frequency Domain
Year: 2020

citations: 118
conference: CVPR
paper: https://openaccess.thecvf.com/content_CVPR_2020/papers/Xu_Learning_in_the_Frequency_Domain_CVPR_2020_paper.pdf code: https://github.com/calmevtime/DCTNet review: transforms images with the DCT (discrete cosine transform) and trains the model directly on the frequency-domain coefficients.
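The transform involved is the standard (JPEG-style) DCT-II, applied blockwise to the image before the CNN. A naive 1-D version, just to pin down the formula; the pipeline applies it per 8x8 block and per channel:

```python
import math

def dct2_1d(x):
    """Naive (unnormalized) DCT-II of a 1-D signal.

    X[k] = sum_n x[n] * cos(pi * k * (2n + 1) / (2N)).
    A constant signal has all its energy in the k = 0 coefficient.
    """
    N = len(x)
    return [sum(x[n] * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                for n in range(N))
            for k in range(N)]
```

For example, `dct2_1d([1.0, 1.0, 1.0, 1.0])` is `[4.0, 0, 0, 0]` up to rounding: a flat block compresses to one coefficient, which is why the DCT domain is a compact input representation.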


SpecNet: Spectral Domain Convolutional Neural Network
Year: 2019

citations: 10
conference: None
paper: https://arxiv.org/pdf/1905.10915.pdf
abstract:


Deep Learning Based on Fourier Convolutional Neural Network Incorporating Random Kernels
Year: 2020

citations: 9
conference: Journal MDPI
paper: https://www.mdpi.com/2079-9292/10/16/2004/htm
abstract:


Hybrid Domain Convolutional Neural Network for Memory Efficient Training
Year: 2021

citations: 9
conference: CAAI
paper: https://link.springer.com/content/pdf/10.1007/978-3-030-93046-2.pdf
abstract:


Parametric Spectral Filters for Fast Converging, Scalable Convolutional Neural Networks
Year: 2021

citations: 0
conference: preprint
paper: https://link.springer.com/content/pdf/10.1007/978-3-030-93046-2.pdf
abstract:

Beyond Filters: Compact Feature Map for Portable Deep Model
Year: 2017

citations: 57
conference: ICML
paper: https://proceedings.mlr.press/v70/wang17m.html
