Activation functions: comparison of trends in practice and research for deep learning

Nwankpa, Chigozie Enyinna and Ijomah, Winifred and Gachagan, Anthony and Marshall, Stephen (2021) Activation functions : comparison of trends in practice and research for deep learning. In: 2nd International Conference on Computational Sciences and Technology, 2020-12-17 - 2020-12-19, Jamshoro.

Text: Accepted Author Manuscript (285kB)


    Deep neural networks (DNNs) have been successfully used in diverse emerging domains to solve real-world complex problems, with many more deep learning (DL) architectures being developed to date. To achieve these state-of-the-art (SOTA) performances, DL architectures use activation functions (AFs) to perform diverse computations between the hidden layers and the output layers of any given DL architecture. This paper presents a survey of the existing AFs used in deep learning applications and highlights recent trends in their use for DL applications. The novelty of this paper is that it compiles the majority of the AFs used in DL and outlines the current trends in the application and usage of these functions in practical deep learning deployments against SOTA research results. This compilation will aid in making effective decisions in the choice of the most suitable and appropriate AF for a given application, ready for deployment. This paper is timely because the majority of research papers on AFs highlight similar works and results, while this paper is the first to compile the trends in AF applications in practice against the research results found in the DL literature to date.
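    As a quick illustration of the kind of functions the survey covers, the following is a minimal NumPy sketch of a few widely used activation functions (sigmoid, tanh, ReLU, and softmax). These definitions are standard textbook formulations, not taken from the paper itself, and the function names are this sketch's own.

    ```python
    import numpy as np

    def sigmoid(x):
        # Logistic sigmoid: squashes inputs into (0, 1)
        return 1.0 / (1.0 + np.exp(-x))

    def tanh(x):
        # Hyperbolic tangent: squashes inputs into (-1, 1)
        return np.tanh(x)

    def relu(x):
        # Rectified linear unit: zeroes out negative inputs
        return np.maximum(0.0, x)

    def softmax(x):
        # Softmax: normalises a score vector into a probability
        # distribution; subtracting the max improves numerical stability
        e = np.exp(x - np.max(x))
        return e / e.sum()

    x = np.array([-2.0, 0.0, 2.0])
    print(relu(x))     # negative entry clipped to zero
    print(softmax(x))  # entries sum to 1
    ```

    Sigmoid, tanh, and ReLU are typically applied element-wise in hidden layers, while softmax is commonly reserved for the output layer of a classifier.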
