KAF + RSigELU: a nonlinear and kernel-based activation function for deep neural networks


Kiliçarslan S., Çelik M.

Neural Computing and Applications, vol. 34, no. 16, pp. 13909-13923, 2022 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 34 Issue: 16
  • Publication Date: 2022
  • DOI: 10.1007/s00521-022-07211-7
  • Journal Name: Neural Computing and Applications
  • Indexed In: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, PASCAL, Applied Science & Technology Source, Biotechnology Research Abstracts, Compendex, Computer & Applied Sciences, Index Islamicus, INSPEC, zbMATH
  • Pages: pp. 13909-13923
  • Keywords: Kernel-based activation function (KAF), KAF plus RSigELUS, KAF plus RSigELUD, CNN, Deep neural network
  • Kayseri University Affiliated: No

Abstract

© 2022, The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature.

Activation functions (AFs) are the basis of the neural network architectures used to accurately model and learn complex relationships between variables in real-world problems. They process the input information arriving at a neuron and produce the corresponding output. The kernel-based activation function (KAF) offers an extended version of the ReLU and sigmoid AFs. However, KAF suffers from bias shift originating from the negative region, vanishing gradients, limited adaptability and flexibility, and neuron death in its parameters during learning. In this study, the hybrid KAF + RSigELUS and KAF + RSigELUD AFs, which are extended versions of KAF, are proposed. The proposed AFs use the Gaussian kernel function and are effective in the positive, negative, and linear activation regions. Their performance was evaluated on the MNIST, Fashion-MNIST, CIFAR-10, and SVHN benchmark datasets. The experimental evaluations show that the proposed AFs overcome the existing problems and outperform the ReLU, LReLU, ELU, PReLU, and KAF AFs.
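For context, the sketch below shows a plain kernel-based activation function with a Gaussian kernel in the style described in the KAF literature (a fixed dictionary of centers with learnable per-channel mixing coefficients). The abstract does not specify how the authors combine KAF with RSigELUS/RSigELUD, so the dictionary size, bandwidth rule, and initialization here are illustrative assumptions, not the paper's actual hybrid formulation.

```python
# Minimal sketch of a Gaussian-kernel KAF layer (assumed standard KAF form,
# not the KAF + RSigELU hybrid proposed in the paper).
import torch
import torch.nn as nn


class GaussianKAF(nn.Module):
    def __init__(self, num_channels: int, dict_size: int = 20, boundary: float = 3.0):
        super().__init__()
        # Fixed dictionary of kernel centers, shared across all channels.
        centers = torch.linspace(-boundary, boundary, dict_size)
        self.register_buffer("centers", centers.view(1, 1, dict_size))
        # Bandwidth chosen from the dictionary spacing (a common rule of thumb).
        step = 2.0 * boundary / (dict_size - 1)
        self.gamma = 1.0 / (2.0 * step ** 2)
        # Learnable mixing coefficients: one set per channel (illustrative init).
        self.alpha = nn.Parameter(0.3 * torch.randn(1, num_channels, dict_size))

    def forward(self, s: torch.Tensor) -> torch.Tensor:
        # s: (batch, num_channels); compare each activation against every center.
        k = torch.exp(-self.gamma * (s.unsqueeze(-1) - self.centers) ** 2)
        return (self.alpha * k).sum(dim=-1)


# Usage: drop in as the nonlinearity of one fully connected layer.
x = torch.randn(8, 64)
act = GaussianKAF(num_channels=64)
y = act(x)  # shape: (8, 64)
```

Because the mixing coefficients are trainable, the shape of the nonlinearity in each channel adapts during learning; the paper's contribution, per the abstract, is pairing this adaptivity with RSigELU-style behavior in the positive, negative, and linear regions.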