A Modified Minimal Gated Unit and Its FPGA Implementation

dc.contributor.author: Zhu, Tong
dc.contributor.supervisor: Dong, Xiaodai
dc.date.accessioned: 2020-12-24T23:28:42Z
dc.date.available: 2020-12-24T23:28:42Z
dc.date.copyright: 2020
dc.date.issued: 2020-12-24
dc.degree.department: Department of Electrical and Computer Engineering
dc.degree.level: Master of Engineering M.Eng.
dc.description.abstract: Recurrent neural networks (RNNs) are versatile structures used in a variety of sequence-data applications. The two most popular variants are long short-term memory (LSTM) and gated recurrent unit (GRU) networks. Toward the goal of building a simpler and more efficient network, the minimal gated unit (MGU) has emerged and shown promising results. In this project, we present a simple, improved MGU model, MGU_1, implemented on a scalable field-programmable gate array (FPGA). Experiments with various sequence data show that MGU_1 achieves better accuracy than the MGU. The accelerator implemented on the FPGA speeds up the inference phase using the model trained on our indoor localization data set; it consists of two MGU_1 layers, each with 32 hidden units. The accelerator runs at 142 MHz, achieves 60 GOPS on the Xilinx XC7Z020 FPGA, and outperforms an Intel i5-5350U based software solution by two orders of magnitude.
dc.description.scholarlevel: Graduate
dc.identifier.uri: http://hdl.handle.net/1828/12500
dc.language.iso: en
dc.rights: Available to the World Wide Web
dc.subject: RNN
dc.subject: FPGA
dc.subject: MGU
dc.subject: Indoor Localization
dc.title: A Modified Minimal Gated Unit and Its FPGA Implementation
dc.type: project
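
For context, the baseline MGU cell that the project builds on uses a single forget gate instead of the GRU's two gates. The sketch below is a minimal NumPy rendering of the standard MGU recurrence (Zhou et al.'s formulation), not the MGU_1 variant proposed in this project, whose modification is not described in the abstract; all weight names and dimensions are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mgu_cell(x, h_prev, Wf, Uf, bf, Wh, Uh, bh):
    """One time step of the baseline MGU: a single forget gate f
    both gates the previous state and interpolates the update."""
    f = sigmoid(Wf @ x + Uf @ h_prev + bf)               # forget gate
    h_tilde = np.tanh(Wh @ x + Uh @ (f * h_prev) + bh)   # candidate state
    return (1.0 - f) * h_prev + f * h_tilde              # new hidden state

# Toy dimensions: 8 inputs, 32 hidden units (the layer width used above).
rng = np.random.default_rng(0)
n_in, n_hid = 8, 32
params = [rng.standard_normal(s) * 0.1 for s in
          [(n_hid, n_in), (n_hid, n_hid), (n_hid,),
           (n_hid, n_in), (n_hid, n_hid), (n_hid,)]]

h = np.zeros(n_hid)
for t in range(5):                                       # short input sequence
    h = mgu_cell(rng.standard_normal(n_in), h, *params)
```

Compared with the GRU, which needs separate update and reset gates, the MGU reuses one gate for both roles, roughly halving the gate parameters per layer, which is what makes it attractive for a compact FPGA datapath.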

Files

Original bundle
Name: Zhu_Tong_MEng_2020.pdf
Size: 1.34 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 1.71 KB
Description: Item-specific license agreed upon to submission