Utilizing Feature Selection on Higher Order Neural Networks
Author(s): Zongyuan Zhao (University of Tasmania, Australia), Shuxiang Xu (University of Tasmania, Australia), Byeong Ho Kang (University of Tasmania, Australia), Mir Md Jahangir Kabir (University of Tasmania, Australia), Yunling Liu (China Agricultural University, China) and Rainer Wasinger (University of Tasmania, Australia)
Copyright: 2017
Pages: 16
Source title: Nature-Inspired Computing: Concepts, Methodologies, Tools, and Applications
Source Author(s)/Editor(s): Information Resources Management Association (USA)
DOI: 10.4018/978-1-5225-0788-8.ch041
Abstract
Artificial Neural Networks (ANNs) have shown impressive ability on many real-world problems such as pattern recognition, classification, and function approximation. An extension of the ANN, the higher order neural network (HONN), improves the ANN's computational and learning capabilities. However, the large number of higher-order attributes leads to long learning times and a complex network structure, and irrelevant higher-order attributes can also hinder the performance of a HONN. In this chapter, feature selection algorithms are used to simplify the HONN architecture. Comparisons of fully connected HONNs with feature-selected HONNs demonstrate that proper feature selection can be effective in decreasing the number of inputs, reducing computational time, and improving the prediction accuracy of a HONN.
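The sketch below illustrates the general idea described in the abstract, not the chapter's actual algorithm or experimental setup: select a subset of informative features first, then build higher-order (product) terms only from the selected features, so the network receives far fewer higher-order attributes. The dataset, the value of k, and the scikit-learn components are illustrative assumptions.

```python
# A minimal sketch, assuming scikit-learn is available. Higher-order
# attributes are approximated here with second-order polynomial terms,
# and feature selection uses mutual information; neither is claimed to
# match the chapter's method.
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# "Fully connected" higher-order baseline: second-order terms of all features.
full_honn = make_pipeline(
    StandardScaler(),
    PolynomialFeatures(degree=2, include_bias=False),  # higher-order attributes
    MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0),
)

# Feature-selected variant: keep the 10 most informative features before
# expanding, which sharply reduces the number of higher-order inputs.
selected_honn = make_pipeline(
    StandardScaler(),
    SelectKBest(mutual_info_classif, k=10),
    PolynomialFeatures(degree=2, include_bias=False),
    MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0),
)

for name, model in [("full higher-order", full_honn),
                    ("feature-selected higher-order", selected_honn)]:
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {score:.3f}")
```

With 30 original features, full second-order expansion yields several hundred higher-order inputs, whereas selecting 10 features first keeps the expanded input count an order of magnitude smaller, which is the kind of reduction in inputs and training cost the chapter compares.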