Deep k-Means: Re-Training and Parameter Sharing with Harder Cluster Assignments for Compressing Deep Convolutions
Junru Wu, Yue Wang, Zhenyu Wu, Zhangyang Wang, Ashok Veeraraghavan, Yingyan Lin
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:5363-5372, 2018.

Abstract
The current trend of pushing CNNs deeper with convolutions has created a pressing demand for higher compression gains on models in which convolutions dominate the computation and parameter count (e.g., GoogLeNet, ResNet, and Wide ResNet). Further, the high energy consumption of convolutions limits their deployment on mobile devices. To this end, we propose a simple yet effective scheme for compressing convolutions by applying k-means clustering to the weights: compression is achieved through weight sharing, recording only the $K$ cluster centers and the weight assignment indexes. We then introduce a novel spectrally relaxed $k$-means regularization, which tends to make hard assignments of convolutional layer weights to the $K$ learned cluster centers during re-training. We additionally propose an improved set of metrics for estimating the energy consumption of CNN hardware implementations, whose estimates are verified to be consistent with a previously proposed energy estimation tool extrapolated from actual hardware measurements. We finally evaluate Deep $k$-Means across several CNN models in terms of both compression ratio and energy consumption reduction, observing promising results without incurring accuracy loss. The code is available at https://github.com/Sandbox3aster/Deep-K-Means.
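To make the weight-sharing idea in the abstract concrete, below is a minimal, hypothetical sketch (not the authors' released code) of compressing a single convolutional layer with plain k-means: only the $K$ cluster centers and the per-weight assignment indexes need to be stored, and the layer can be rebuilt from them. The layer shape, the choice $K = 16$, and the use of scikit-learn's KMeans are illustrative assumptions; the paper's actual method additionally re-trains the network with a spectrally relaxed $k$-means regularization so that weights form harder cluster assignments before this final clustering step.

```python
# Illustrative sketch of k-means weight sharing for one conv layer.
# Assumptions (not from the paper's code): scikit-learn KMeans, K = 16,
# and an arbitrary 128x64x3x3 layer shape used purely as an example.
import numpy as np
from sklearn.cluster import KMeans

def kmeans_weight_sharing(weights: np.ndarray, k: int = 16):
    """Cluster the flattened weights into k shared values.

    Returns (centers, indexes): k float centers plus one small integer
    index per weight, which is all that needs to be stored.
    """
    flat = weights.reshape(-1, 1)                 # each scalar weight is one sample
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(flat)
    centers = km.cluster_centers_.ravel()         # K shared values
    indexes = km.labels_.astype(np.uint8)         # per-weight index (log2(K) bits each)
    return centers, indexes

def reconstruct(centers: np.ndarray, indexes: np.ndarray, shape):
    """Rebuild the quantized weight tensor from the shared centers."""
    return centers[indexes].reshape(shape)

# Hypothetical example: a 3x3 conv layer with 64 input and 128 output channels.
w = np.random.randn(128, 64, 3, 3).astype(np.float32)
centers, idx = kmeans_weight_sharing(w, k=16)
w_hat = reconstruct(centers, idx, w.shape)
print("mean reconstruction error:", np.abs(w - w_hat).mean())
```

With $K = 16$, each 32-bit weight is replaced by a 4-bit index plus a small shared codebook, which is the source of the compression gain; the paper's re-training step is what keeps accuracy from degrading under this quantization.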
Cite this Paper

BibTeX

@InProceedings{pmlr-v80-wu18h,
  title     = {Deep k-Means: Re-Training and Parameter Sharing with Harder Cluster Assignments for Compressing Deep Convolutions},
  author    = {Wu, Junru and Wang, Yue and Wu, Zhenyu and Wang, Zhangyang and Veeraraghavan, Ashok and Lin, Yingyan},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {5363--5372},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/wu18h/wu18h.pdf},
  url       = {https://proceedings.mlr.press/v80/wu18h.html}
}
Related Material
- Download PDF