  1. Mohseni SA, Tan AH
    IEEE Trans Syst Man Cybern B Cybern, 2012 Dec;42(6):1645-53.
    PMID: 22665508 DOI: 10.1109/TSMCB.2012.2197610
    Abstract: This paper proposes a new mixed training algorithm that combines error backpropagation (EBP) and variable structure systems (VSSs) to optimize the parameter updates of neural networks. To optimize the number of neurons in the hidden layer, a penalty term based on the hidden-layer output is added to the cost function, encouraging efficient use of the hidden units through the weights associated with each unit. VSS is used to control the dynamic model of the training process, whereas EBP minimizes the cost function. In addition to an analysis of the dynamics imposed by the EBP technique, the global stability of the mixed training methodology and the constraints on the design parameters are considered. The advantages of the proposed technique are guaranteed convergence, improved robustness, and lower sensitivity to the initial weights of the neural network.
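
    To make the penalized cost concrete, the following is a minimal sketch in Python/NumPy of gradient-descent (EBP-style) training of a single-hidden-layer network whose cost adds a quadratic penalty on the hidden-layer outputs. The network sizes, tanh activation, penalty form 0.5*lambda*||h||^2, and learning rate are illustrative assumptions rather than the paper's exact formulation, and the VSS component that controls the training dynamics is omitted.

import numpy as np

# Illustrative sizes, penalty weight, and learning rate (assumptions, not the paper's values).
n_in, n_hidden, n_out, lam, lr = 4, 8, 1, 1e-3, 0.05

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(n_hidden, n_in))   # input-to-hidden weights
W2 = rng.normal(scale=0.1, size=(n_out, n_hidden))  # hidden-to-output weights

def forward(x):
    h = np.tanh(W1 @ x)   # hidden-layer output
    y = W2 @ h            # linear output layer
    return h, y

def cost(x, t):
    h, y = forward(x)
    mse = 0.5 * np.sum((y - t) ** 2)
    penalty = 0.5 * lam * np.sum(h ** 2)  # assumed quadratic penalty on hidden outputs
    return mse + penalty

def ebp_step(x, t):
    # One plain gradient-descent step on the penalized cost (EBP stand-in; no VSS control).
    global W1, W2
    h, y = forward(x)
    e = y - t                               # output error
    dW2 = np.outer(e, h)                    # gradient of the MSE term w.r.t. W2
    dh = W2.T @ e + lam * h                 # back-propagated error plus penalty gradient
    dW1 = np.outer(dh * (1.0 - h ** 2), x)  # tanh'(z) = 1 - tanh(z)^2
    W2 -= lr * dW2
    W1 -= lr * dW1

x, t = rng.normal(size=n_in), np.array([1.0])
for _ in range(200):
    ebp_step(x, t)
print(cost(x, t))

    Driving hidden activations toward zero in this way indicates which hidden units contribute little and could be pruned, which is the intuition behind optimizing the number of hidden neurons described in the abstract.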