Improving Capsule Networks using zero-skipping and pruning

dc.contributor.author: Sharifi, Ramin
dc.contributor.supervisor: Baniasadi, Amirali
dc.contributor.supervisor: Gulliver, T. Aaron
dc.date.accessioned: 2021-11-15T23:57:37Z
dc.date.available: 2021-11-15T23:57:37Z
dc.date.copyright: 2021
dc.date.issued: 2021-11-15
dc.degree.department: Department of Electrical and Computer Engineering
dc.degree.level: Master of Applied Science (M.A.Sc.)
dc.description.abstract: Capsule Networks are the next generation of image classifiers. Although they have several advantages over conventional Convolutional Neural Networks (CNNs), they remain computationally heavy. Since inference on Capsule Networks is time-consuming, their usage is limited to tasks in which latency is not essential. Approximation methods in Deep Learning help networks shed redundant parameters to increase speed and lower energy consumption. In the first part of this work, we examine an algorithm called zero-skipping. More than 50% of the values in a trained CNN are zeros or small enough to be considered zero. Since multiplication by zero is a trivial operation, the zero-skipping algorithm can play a massive role in speeding up the network. We investigate the eligibility of Capsule Networks for this algorithm on two different datasets. Our results suggest that Capsule Networks contain enough zeros in their Primary Capsules to benefit from this algorithm. In the second part of this thesis, we investigate pruning, one of the most popular Neural Network approximation methods. Pruning is the act of finding and removing neurons which have low or no impact on the output. We run experiments on four different datasets. Pruning Capsule Networks removes redundant Primary Capsules. The results show a significant increase in speed with a minimal drop in accuracy. We also discuss how dataset complexity affects the pruning strategy.
dc.description.scholarlevel: Graduate
dc.identifier.bibliographicCitation: Ramin Sharifi, Pouya Shiri, and Amirali Baniasadi. Zero-skipping in CapsNet: is it worth it? In International Conference on Computers and Their Applications, volume 69, pages 355–361, 2020.
dc.identifier.uri: http://hdl.handle.net/1828/13501
dc.language: English
dc.language.iso: en
dc.rights: Available to the World Wide Web
dc.subject: Capsule Network
dc.subject: CapsNet
dc.subject: Deep Learning
dc.title: Improving Capsule Networks using zero-skipping and pruning
dc.type: Thesis
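
The two techniques described in the abstract can be illustrated in a few lines. Below is a minimal Python/NumPy sketch, not the implementation from the thesis: the function names, the eps threshold, and the keep_ratio parameter are illustrative assumptions, and the magnitude-based selection in the second function stands in for the thesis's actual pruning criterion, which this record does not detail.

import numpy as np

def dot_zero_skipping(weights, activations, eps=1e-6):
    # Zero-skipping: accumulate only where the activation is
    # non-negligible; multiplications by (near-)zero are skipped.
    acc = 0.0
    for w, a in zip(weights, activations):
        if abs(a) > eps:
            acc += w * a
    return acc

def prune_primary_capsules(capsule_outputs, keep_ratio=0.5):
    # Hypothetical magnitude-based pruning: keep the capsules whose
    # output vectors have the largest L2 norms, drop the rest as
    # redundant. capsule_outputs has shape (num_capsules, dim).
    norms = np.linalg.norm(capsule_outputs, axis=1)
    k = max(1, int(keep_ratio * len(norms)))
    keep = np.argsort(norms)[-k:]
    return capsule_outputs[keep]

Because skipped multiplications never enter the loop body, the speedup from dot_zero_skipping scales with the fraction of near-zero activations, which is why the abstract's observation that Primary Capsules contain many zeros matters.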

Files

Original bundle
Name: Sharifi_Ramin_MASc_2021.pdf
Size: 1.23 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 2 KB
Format: Item-specific license agreed upon to submission