This paper introduces a novel low-power analog artificial neural network (ANN) implementation based on the Multiple Input Floating Gate MOS (MIFGMOS) transistor for machine learning applications. The number of inputs per neuron is a major bottleneck in building large-scale analog ANN systems. The proposed MIFGMOS transistor enables large-scale systems by combining multiple inputs in a single transistor with a small silicon footprint. Here, we present a MIFGMOS-based implementation of the Extreme Learning Machine (ELM) architecture using the receptive-field approach, with transistors operating in the sub-threshold region. The MIFGMOS produces an output current as a function of the weighted combination of the voltages applied to its gate terminals. In the ELM architecture, the weights between the input and hidden layers are random, which allows the random device mismatch introduced by the fabrication process to be exploited when building Integrated Circuits (ICs) based on the ELM architecture. Thus, we use the implicit random weights arising from device mismatch, and there is no need to store the input weights. We have verified our architecture through circuit simulations on regression and classification problems, including the MNIST data-set and several UCI data-sets. By combining multiple inputs in a single transistor, the proposed MIFGMOS paves the way for building large-scale deep learning neural networks.
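The training principle described above can be illustrated with a minimal software sketch of an ELM: the input-to-hidden weights are fixed and random (analogous to the implicit weights from device mismatch, which never need to be stored), and only the hidden-to-output weights are solved for by least squares. This is a generic illustrative sketch, not the paper's circuit; all function and variable names here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, Y, n_hidden=50):
    """Train an ELM: fixed random hidden layer, least-squares output layer.

    The random W and b play the role of the unstored, mismatch-derived
    input weights; only beta (the output weights) is learned.
    """
    n_in = X.shape[1]
    W = rng.standard_normal((n_in, n_hidden))     # random input weights (never trained)
    b = rng.standard_normal(n_hidden)             # random biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)  # solve H @ beta ~= Y
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Forward pass through the random hidden layer and learned readout."""
    return np.tanh(X @ W + b) @ beta

# Toy regression example: fit y = sin(x) on [-3, 3].
X = np.linspace(-3, 3, 200).reshape(-1, 1)
Y = np.sin(X)
W, b, beta = elm_train(X, Y, n_hidden=50)
err = np.max(np.abs(elm_predict(X, W, b, beta) - Y))
```

Because only the linear readout is trained, fitting reduces to a single least-squares solve, which is what makes the random, unstored input weights practical in hardware.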