TY - GEN
T1 - Analog neuromorphic system based on multi input floating gate MOS neuron model
AU - Tripathi, Ankit
AU - Arabizadeh, Mehdi
AU - Khandelwal, Sourabh
AU - Thakur, Chetan Singh
PY - 2019
Y1 - 2019
N2 - This paper introduces a novel implementation of a low-power analog artificial neural network (ANN) using the Multiple Input Floating Gate MOS (MIFGMOS) transistor for machine learning applications. The number of inputs to a neuron in an ANN is the major bottleneck in building a large-scale analog system. The proposed MIFGMOS transistor makes it possible to build a large-scale system by combining multiple inputs in a single transistor with a small silicon footprint. Here, we show an MIFGMOS-based implementation of the Extreme Learning Machine (ELM) architecture using the receptive-field approach, with transistors operating in the sub-threshold region. The MIFGMOS produces an output current as a function of the weighted combination of the voltages applied to its gate terminals. In the ELM architecture, the weights between the input and the hidden layer are random, which allows the random device mismatch arising from the fabrication process to be exploited for building Integrated Circuits (ICs) based on the ELM architecture. Thus, we use the implicit random weights present due to device mismatch, and there is no need to store the input weights. We have verified our architecture using circuit simulations on regression and various classification problems, such as the MNIST dataset and a few UCI datasets. The proposed MIFGMOS enables combining multiple inputs in a single transistor and will thus pave the way to building large-scale deep learning neural networks.
UR - http://www.scopus.com/inward/record.url?scp=85066808063&partnerID=8YFLogxK
U2 - 10.1109/ISCAS.2019.8702492
DO - 10.1109/ISCAS.2019.8702492
M3 - Conference proceeding contribution
AN - SCOPUS:85066808063
SN - 9781728103976
SP - 1
EP - 5
BT - 2019 IEEE International Symposium on Circuits and Systems (ISCAS)
PB - Institute of Electrical and Electronics Engineers (IEEE)
CY - Piscataway, NJ
T2 - 2019 IEEE International Symposium on Circuits and Systems, ISCAS 2019
Y2 - 26 May 2019 through 29 May 2019
ER -