The Derivation and Justification of a CORDIC-Implemented Artificial Neuron
Date
1995
Authors
Wedlake, Martine Bruce
Abstract
Hopfield's network[31,32] and Widrow and Hoff's Adaptive Linear Element (Adaline) network[63] demonstrated early on that artificial neural networks were versatile; however, the Hopfield network's limited information capacity and the Adaline's inability to train multiple layers confined these systems to simple networks. The field of artificial neural networks has blossomed in the past few years due, in part, to the pivotal results of Rumelhart's Back Propagation network[51]. The Back Propagation network addresses these problems by extending the single layer of neurons into a multi-layer perceptron, an architecture that improves information capacity and also allows arbitrary mapping functions between the input and output spaces[40]. With Rumelhart's training methodology we are able to construct meaningful neural networks and train them reliably.
Digital neural network implementations are not well represented in the literature compared with software implementations. To help fill this gap, this work applies the CORDIC technique to the field of neural networks to construct digital neurons.
The resulting CORDIC neuron is an enabling technology; in and of itself a single neuron is of limited interest, but the CORDIC neuron enables full neural network systems to be constructed. The CORDIC method is particularly powerful for neural networks because it can calculate the nonlinear activation function directly with shift-and-add iterations, whereas most digital neural networks store the activation function in ROMs.
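To illustrate that directness, the sketch below emulates a CORDIC unit in hyperbolic rotation mode computing tanh, a common sigmoid-like activation function. It uses double-precision C for clarity; a hardware neuron would replace the multiplications by powers of two with wired shifts and the atanh() calls with a small table of precomputed constants. The function name and iteration count are illustrative assumptions, not details taken from the thesis.

    #include <math.h>
    #include <stdio.h>

    /* Hyperbolic rotation-mode CORDIC: drive z toward 0 so that
     * x -> K*cosh(z0) and y -> K*sinh(z0). The gain K cancels in the
     * ratio y/x, giving tanh(z0). Iterations 4 and 13 are executed
     * twice, as the hyperbolic recurrence requires for convergence;
     * the input must lie within the range |z| < ~1.118. */
    double cordic_tanh(double z, int iters)
    {
        double x = 1.0, y = 0.0;
        for (int i = 1; i <= iters; i++) {
            int times = (i == 4 || i == 13) ? 2 : 1;  /* repeated steps */
            for (int k = 0; k < times; k++) {
                double t = ldexp(1.0, -i);       /* 2^-i: a shift in hardware */
                int d = (z >= 0.0) ? 1 : -1;     /* rotation direction */
                double xn = x + d * y * t;       /* shift-and-add updates */
                double yn = y + d * x * t;
                z -= d * atanh(t);               /* table lookup in hardware */
                x = xn;
                y = yn;
            }
        }
        return y / x;                            /* gain K cancels here */
    }

    int main(void)
    {
        double z = 0.5;
        printf("cordic: %f  libm: %f\n", cordic_tanh(z, 16), tanh(z));
        return 0;
    }

Because the gain K appears in both x and y, the final division removes it without any explicit scaling step, which is one reason the rotation-mode formulation suits activation-function evaluation.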
The CORDIC neuron provides a rich set of topics to examine. Of particular interest in this work is the sequence of CORDIC steps: the CORDIC schedule. Both static and dynamic CORDIC schedules are considered and evaluated; the distinction is sketched below. Static CORDIC neurons are implemented first in emulation software and later on the Xilinx XC4003 FPGA.
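To make the schedule idea concrete, the following sketch expresses a static schedule as a fixed list of shift amounts baked in at design time, so each iteration's datapath is known in advance. The array contents and names are illustrative assumptions, not the thesis's actual schedules.

    /* A static CORDIC schedule: the sequence of shift amounts is fixed
     * at design time. This example lists the standard hyperbolic
     * sequence with iterations 4 and 13 repeated for convergence.
     * (Illustrative layout, not reproduced from the thesis.) */
    static const int kStaticSchedule[] =
        { 1, 2, 3, 4, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 13, 14 };

    /* Run hyperbolic rotation-mode CORDIC under a given schedule. */
    double cordic_tanh_scheduled(double z, const int *sched, int n)
    {
        double x = 1.0, y = 0.0;
        for (int j = 0; j < n; j++) {
            double t = ldexp(1.0, -sched[j]);  /* 2^-sched[j] */
            int d = (z >= 0.0) ? 1 : -1;       /* rotation direction */
            double xn = x + d * y * t;
            y += d * x * t;                    /* uses the old x */
            x = xn;
            z -= d * atanh(t);                 /* table entry in hardware */
        }
        return y / x;
    }

A dynamic schedule would instead choose each shift amount at run time from the current residual z, trading extra control logic for potentially fewer iterations.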