Displaying all 2 publications

  1. Masuyama N, Loo CK, Wermter S
    Int J Neural Syst, 2019 Jun;29(5):1850052.
    PMID: 30764724 DOI: 10.1142/S0129065718500521
    This paper attempts to solve the typical problems of self-organizing growing network models, namely (a) the influence of the order of input data on the self-organizing ability, (b) instability with high-dimensional data and excessive sensitivity to noise, and (c) expensive computational cost, by integrating the Kernel Bayes Rule (KBR) and the Correntropy-Induced Metric (CIM) into the Adaptive Resonance Theory (ART) framework. KBR performs a covariance-free Bayesian computation that remains fast and stable. CIM is a generalized similarity measure that maintains a high noise-reduction ability even in high-dimensional spaces. In addition, a Growing Neural Gas (GNG)-based topology construction process is integrated into the ART framework to enhance its self-organizing ability. Simulation experiments with synthetic and real-world datasets show that the proposed model has an outstandingly stable self-organizing ability across various test environments.
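The Correntropy-Induced Metric mentioned above has a simple closed form: with a Gaussian kernel, CIM(x, y) = sqrt(kappa(0) - mean_i kappa_sigma(x_i - y_i)), which is bounded and saturates for large differences, giving it robustness to noisy or outlying dimensions. A minimal NumPy sketch (the function name and the per-dimension averaging convention are illustrative, not taken from the paper's exact definition):

```python
import numpy as np

def cim(x, y, sigma=1.0):
    """Correntropy-Induced Metric between two vectors.

    CIM(x, y) = sqrt(kappa(0) - (1/d) * sum_i kappa_sigma(x_i - y_i)),
    where kappa_sigma(u) = exp(-u^2 / (2 * sigma^2)) is the Gaussian
    kernel, so kappa(0) = 1. The result lies in [0, 1).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    k = np.exp(-np.square(x - y) / (2.0 * sigma**2))
    return float(np.sqrt(1.0 - k.mean()))
```

Because each per-dimension kernel value is capped at 1, a single wildly corrupted coordinate can shift CIM only by a bounded amount, unlike Euclidean distance, which is what the abstract means by noise-reduction ability in high-dimensional spaces.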
  2. Masuyama N, Loo CK, Dawood F
    Neural Netw, 2018 Feb;98:76-86.
    PMID: 29202265 DOI: 10.1016/j.neunet.2017.11.003
    Adaptive Resonance Theory (ART) is one of the successful approaches to resolving "the plasticity-stability dilemma" in neural networks, and its supervised learning model, ARTMAP, is a powerful tool for classification. Among several improvements, such as Fuzzy- or Gaussian-based models, the state-of-the-art model is the Bayesian-based one, which solves the drawbacks of the others. However, the Bayesian approach is known to incur a high computational cost for high-dimensional data and large datasets, and the covariance matrix in its likelihood becomes unstable. This paper introduces Kernel Bayesian ART (KBA) and ARTMAP (KBAM) by integrating Kernel Bayes' Rule (KBR) and the Correntropy-Induced Metric (CIM) into Bayesian ART (BA) and ARTMAP (BAM), respectively, while maintaining the properties of BA and BAM. The kernel frameworks in KBA and KBAM are able to avoid the curse of dimensionality. In addition, the covariance-free Bayesian computation by KBR provides efficient and stable computation in KBA and KBAM. Furthermore, the correntropy-based similarity measure improves the noise-reduction ability even in high-dimensional spaces. Simulation experiments show that KBA has a superior self-organizing capability to BA, and KBAM provides superior classification ability to BAM.
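The plasticity-stability mechanism that both abstracts build on is ART's vigilance test: an input either resonates with an existing category (which is then updated, keeping learning plastic) or, if the match is too poor, triggers a new category (leaving old knowledge stable). A minimal sketch of that gating step, here using a CIM-style match criterion; the function name, threshold convention, and the choice of CIM as the match function are illustrative assumptions, not the paper's exact update rules:

```python
import numpy as np

def vigilance_test(x, prototype, rho, sigma=1.0):
    """Illustrative ART-style vigilance check.

    Computes a CIM-style distance between the input and a category
    prototype. If it is within the vigilance threshold rho, the input
    resonates with the category (the category would be updated);
    otherwise the network would create a new category instead of
    overwriting existing knowledge.
    """
    x = np.asarray(x, dtype=float)
    w = np.asarray(prototype, dtype=float)
    k = np.exp(-np.square(x - w) / (2.0 * sigma**2))
    match = float(np.sqrt(1.0 - k.mean()))
    return match <= rho  # True -> resonance; False -> spawn new category
```

A nearby input passes the test and would refine the matching category, while a distant input fails it and would be assigned a fresh category, which is how ART avoids catastrophic forgetting.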