This paper proposes a modified ELM algorithm that deterministically selects the input weights and biases before training the output weights of single-hidden-layer feedforward neural networks with a sigmoidal activation function, and proves mathematically that the resulting hidden-layer output matrix maintains full column rank. The modified ELM thus avoids the randomness of the original ELM. Experimental results on both regression and classification problems show that the modified ELM algorithm performs well.
- Feedforward neural networks
- Extreme learning machine
- Moore–Penrose generalized inverse
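To place the abstract in context, the sketch below shows the *standard* ELM training step that the paper modifies: random input weights and biases, a sigmoidal hidden layer, and output weights computed via the Moore–Penrose generalized inverse. The deterministic weight-selection rule proposed in the paper is not reproduced here; the random initialization marks the step it replaces. Function names and the hidden-layer size are illustrative assumptions.

```python
import numpy as np

def elm_train(X, T, n_hidden, seed=0):
    """Standard ELM fit. The paper's modification replaces the random
    choice of (W, b) below with a deterministic selection that guarantees
    the hidden-layer output matrix H has full column rank."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # sigmoidal hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                     # Moore–Penrose generalized inverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

Because the output weights are obtained in a single least-squares solve rather than by iterative gradient descent, training is fast; the paper's contribution is ensuring this solve is well-posed by construction rather than by chance.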
Chen, Z. X., Zhu, H. Y., & Wang, Y. G. (2013). A modified extreme learning machine with sigmoidal activation functions. Neural Computing and Applications, 22(3-4), 541–550. https://doi.org/10.1007/s00521-012-0860-2