An anomaly detection library comprising state-of-the-art algorithms and features such as experiment management, hyper-parameter optimization, and edge inference.
Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf
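A minimal sketch of the soft-target distillation loss this line of work builds on; in the teacher-assistant scheme (TAKD), the student distills from a mid-sized assistant rather than directly from the large teacher. The function name and the temperature/mixing defaults below are illustrative, not values from the paper.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft targets: KL between temperature-softened student and teacher
    # distributions, scaled by T^2 as in Hinton et al. In TAKD, the
    # "teacher" passed here is the mid-sized teacher assistant.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)  # ground-truth labels
    return alpha * soft + (1 - alpha) * hard
```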
[ICML 2018] "Deep k-Means: Re-Training and Parameter Sharing with Harder Cluster Assignments for Compressing Deep Convolutions"
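A rough sketch of the weight-sharing idea behind deep k-means: scalar weights are clustered and each is replaced by its cluster centroid, so a layer stores only k centroids plus small per-weight indices. This shows just the hard-assignment sharing step; the paper's re-training with a differentiable relaxation is omitted, and the helper is hypothetical.

```python
import torch

def kmeans_share(weight: torch.Tensor, k: int = 16, iters: int = 10):
    flat = weight.flatten()
    # Initialize centroids at evenly spaced quantiles of the weights.
    centroids = torch.quantile(flat, torch.linspace(0, 1, k))
    for _ in range(iters):
        # Hard assignment: nearest centroid per weight (O(N*k) memory).
        assign = (flat[:, None] - centroids[None, :]).abs().argmin(dim=1)
        for j in range(k):
            members = flat[assign == j]
            if members.numel() > 0:
                centroids[j] = members.mean()
    shared = centroids[assign].reshape(weight.shape)
    return shared, assign.reshape(weight.shape)
```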
Tools and libraries to run neural networks in Minecraft ⛏️
[ICLR 2022] "Audio Lottery: Speech Recognition Made Ultra-Lightweight, Noise-Robust, and Transferable", by Shaojin Ding, Tianlong Chen, Zhangyang Wang
Official PyTorch implementation of "LayerMerge: Neural Network Depth Compression through Layer Pruning and Merging" (ICML'24)
[ICLR 2023] Pruning Deep Neural Networks from a Sparsity Perspective
Code for testing DCT plus Sparse (DCTpS) networks
Bayesian Optimization-Based Global Optimal Rank Selection for Compression of Convolutional Neural Networks, IEEE Access
Official PyTorch implementation of "Efficient Latency-Aware CNN Depth Compression via Two-Stage Dynamic Programming" (ICML'23)
Compact representations of convolutional neural networks via weight pruning and quantization
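The two steps named above, sketched generically: magnitude pruning followed by symmetric per-tensor INT8 quantization. This is an assumed, minimal recipe for illustration, not the specific pipeline of any repository listed here.

```python
import torch

def prune_and_quantize(weight: torch.Tensor, sparsity: float = 0.5):
    # 1) Magnitude pruning: zero out the smallest |w| until the target
    #    sparsity is reached.
    k = max(1, int(weight.numel() * sparsity))
    threshold = weight.abs().flatten().kthvalue(k).values
    mask = weight.abs() > threshold
    pruned = weight * mask
    # 2) Symmetric per-tensor INT8 quantization of the survivors.
    scale = (pruned.abs().max() / 127.0).clamp(min=1e-12)
    q = (pruned / scale).round().clamp(-127, 127).to(torch.int8)
    return q, scale, mask  # dequantize via q.float() * scale
```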
Code for our WACV 2021 paper "Exploiting the Redundancy in Convolutional Filters for Parameter Reduction"
ESPN: Extreme Sparse Pruned Network
Use a meta-network to learn the importance and correlation of neural network weights
Neural network compression with SVD
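A common form of SVD compression, as a sketch: factor a linear layer's weight matrix through a truncated SVD so one layer becomes two thinner ones. The helper name and the PyTorch framing are assumptions; the listed repository may compress differently.

```python
import torch

def low_rank_compress(linear: torch.nn.Linear, rank: int) -> torch.nn.Sequential:
    # Truncated SVD of the weight matrix: W (out x in) ~ U_r @ V_r,
    # stored as two thinner layers with rank*(in+out) parameters
    # instead of in*out.
    W = linear.weight.data
    U, S, Vh = torch.linalg.svd(W, full_matrices=False)
    U_r = U[:, :rank] * S[:rank]  # fold singular values into the left factor
    V_r = Vh[:rank, :]
    first = torch.nn.Linear(W.shape[1], rank, bias=False)
    second = torch.nn.Linear(rank, W.shape[0], bias=linear.bias is not None)
    first.weight.data = V_r.contiguous()
    second.weight.data = U_r.contiguous()
    if linear.bias is not None:
        second.bias.data = linear.bias.data.clone()
    return torch.nn.Sequential(first, second)
```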
Image classification using a compressed deep neural network ported to resource-constrained platforms.
Neural Network Pruning Using Dependency Measures
Compressed CNNs for airplane classification in satellite images (APoZ-based parameter pruning, INT8 weight quantization)
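APoZ (Average Percentage of Zeros) ranks channels by how often their post-ReLU activations are exactly zero; the most-often-zero channels are pruned first. A minimal sketch, assuming NCHW conv outputs and a ReLU following the hooked layer; the function below is illustrative, not the repo's API.

```python
import torch

@torch.no_grad()
def apoz_scores(model, conv_layer, loader, device="cpu"):
    # Per-output-channel fraction of zero post-ReLU activations,
    # averaged over the data. High APoZ = least active = prune first.
    totals, batches = None, 0

    def hook(_module, _inputs, output):
        nonlocal totals, batches
        act = torch.relu(output)                       # assumes ReLU follows the conv
        frac = (act == 0).float().mean(dim=(0, 2, 3))  # per channel, NCHW
        totals = frac if totals is None else totals + frac
        batches += 1

    handle = conv_layer.register_forward_hook(hook)
    for inputs, _ in loader:
        model(inputs.to(device))
    handle.remove()
    return totals / batches
```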
This repository reproduces the results from the NNCodec ICML workshop paper and includes a demo prepared for the Neural Compression Workshop (NCW).