Please use this identifier to cite or link to this item: https://dspace.iiti.ac.in/handle/123456789/13638
Title: On Improving Radial Basis Function Neural Networks for Regression
Authors: Jose, Justin
Bhatia, Vimal
Keywords: RBFNN;machine learning;mean absolute error;mean squared error;R2 score;regression
Issue Date: 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
Citation: Mishra, S., Panda, S., Jose, J., Bhatia, V., & Pandey, S. K. (2023). On Improving Radial Basis Function Neural Networks for Regression. 2023 IEEE 7th Conference on Information and Communication Technology, CICT 2023. Scopus. https://doi.org/10.1109/CICT59886.2023.10455422
Abstract: Radial Basis Function Neural Networks (RBFNNs) are powerful neural network architectures known for their distinctive approach to learning and pattern recognition. RBFNNs can approximate complex functions efficiently, especially when the data distribution is not well suited to traditional feedforward neural networks. Compared to fully connected feedforward neural networks, RBFNNs can have fewer parameters, making them potentially easier to train with less data. In this work, we first propose a modified RBFNN model for regression tasks and compare it with models proposed in the literature. The effectiveness of four distinct RBFNN architectures is compared on a regression problem. We analyze performance by varying the architecture and the activation functions, in terms of the R2 score, mean absolute error, and mean squared error. The performance of the proposed work is analyzed and useful inferences are drawn. © 2023 IEEE.
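
For orientation, the sketch below shows a generic RBFNN-style regressor of the kind the abstract describes: a hidden layer of Gaussian radial basis units (centers picked by k-means) feeding a linear output layer, evaluated with the three metrics named in the abstract (R2 score, MAE, MSE). It is a minimal baseline under assumed hyperparameters and synthetic data, not the modified architecture proposed in the paper.

```python
# Minimal, generic RBF-network regressor for illustration only; this is NOT the
# modified architecture from the paper, just a common baseline formulation.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split


def rbf_features(X, centers, gamma):
    """Gaussian RBF activations exp(-gamma * ||x - c||^2) for every center c."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)


# Synthetic regression data (placeholder for whatever dataset the paper uses).
X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Hidden layer: RBF centers from k-means, a single shared width parameter gamma.
n_centers, gamma = 30, 0.01  # assumed hyperparameters, not taken from the paper
centers = KMeans(n_clusters=n_centers, n_init=10, random_state=0).fit(X_train).cluster_centers_

# Output layer: linear weights fitted on the RBF features (ridge for stability).
head = Ridge(alpha=1e-3).fit(rbf_features(X_train, centers, gamma), y_train)
y_pred = head.predict(rbf_features(X_test, centers, gamma))

# The three metrics reported in the paper: R2 score, MAE, and MSE.
print(f"R2 : {r2_score(y_test, y_pred):.3f}")
print(f"MAE: {mean_absolute_error(y_test, y_pred):.3f}")
print(f"MSE: {mean_squared_error(y_test, y_pred):.3f}")
```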
URI: https://doi.org/10.1109/CICT59886.2023.10455422
https://dspace.iiti.ac.in/handle/123456789/13638
ISBN: 979-8-3503-0517-3
Type of Material: Conference Paper
Appears in Collections:Department of Electrical Engineering

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.