Displaying publications 1 - 20 of 63 in total

  1. Farady I, Kuo CC, Ng HF, Lin CY
    Sensors (Basel), 2023 Jan 15;23(2).
    PMID: 36679785 DOI: 10.3390/s23020988
    Anomalies are a set of samples that do not follow the normal behavior of the majority of data. In an industrial dataset, anomalies appear in a very small number of samples. Currently, deep learning-based models have achieved important advances in image anomaly detection. However, with general models, real-world application data consisting of non-ideal images, also known as poison images, become a challenge. When the work environment is not conducive to consistently acquiring a good or ideal sample, an additional adaptive learning model is needed. In this work, we design a potential methodology to tackle poison or non-ideal images that commonly appear in industrial production lines by enhancing the existing training data. We propose Hierarchical Image Transformation and Multi-level Features (HIT-MiLF) modules for an anomaly detection network to adapt to perturbations from novelties in testing images. This approach provides a hierarchical process for image transformation during pre-processing and explores the most efficient layer of extracted features from a CNN backbone. The model generates new transformations of training samples that simulate the non-ideal condition, and it learns the normality in high-dimensional features before applying a Gaussian mixture model to detect anomalies in new data that it has never seen before. Our experimental results show that hierarchical transformation and multi-level feature exploration improve the baseline performance on industrial metal datasets.
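The final step described above (fit a Gaussian mixture to normal-sample features, then flag low-likelihood samples) can be sketched in one dimension. This is an illustrative toy, not the HIT-MiLF pipeline: a plain EM fit of a two-component mixture followed by a negative-log-likelihood anomaly score.

```python
import math
import random

def gmm_em_1d(data, iters=100):
    """Fit a 2-component 1-D Gaussian mixture with plain EM."""
    mu = [min(data), max(data)]          # spread-out initial means
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means, variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            var[k] = max(var[k], 1e-6)   # guard against variance collapse
    return w, mu, var

def score(x, w, mu, var):
    """Anomaly score: negative log-likelihood under the fitted mixture."""
    p = sum(w[k] / math.sqrt(2 * math.pi * var[k])
            * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2))
    return -math.log(max(p, 1e-300))

random.seed(0)
# Stand-in for "normal" training features with two modes.
normal_feats = [random.gauss(0, 1) for _ in range(200)] + \
               [random.gauss(5, 1) for _ in range(200)]
w, mu, var = gmm_em_1d(normal_feats)
# An in-distribution point scores low; a far-out point scores high.
print(score(0.1, w, mu, var), score(12.0, w, mu, var))
```

A threshold on this score (e.g. a high percentile of training scores) then separates anomalies from normal samples.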
    Matched MeSH terms: Normal Distribution
  2. Li H, Khang TF
    PeerJ, 2023;11:e16126.
    PMID: 37790621 DOI: 10.7717/peerj.16126
    BACKGROUND: Pathological conditions may result in certain genes having expression variance that differs markedly from that of the control. Finding such genes from gene expression data can provide invaluable candidates for therapeutic intervention. Under the dominant paradigm for modeling RNA-Seq gene counts using the negative binomial model, tests of differential variability are challenging to develop, owing to dependence of the variance on the mean.

    METHODS: Here, we describe clrDV, a statistical method for detecting genes that show differential variability between two populations. We present the skew-normal distribution for modeling the gene-wise null distribution of the centered log-ratio transformation of compositional RNA-Seq data.

    RESULTS: Simulation results show that clrDV has false discovery rate and probability of Type II error that are on par with or superior to existing methodologies. In addition, its run time is faster than its closest competitors, and remains relatively constant for increasing sample size per group. Analysis of a large neurodegenerative disease RNA-Seq dataset using clrDV successfully recovers multiple gene candidates that have been reported to be associated with Alzheimer's disease.
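The centered log-ratio (CLR) transformation that clrDV builds on divides each component of a compositional vector by the geometric mean and takes logs. A minimal sketch (illustrative only, not the clrDV implementation; counts are assumed already pseudo-counted so every entry is positive):

```python
import math

def clr(counts):
    """Centered log-ratio transform: log(x_i / geometric_mean(x))."""
    logs = [math.log(c) for c in counts]   # counts must be strictly positive
    g = sum(logs) / len(logs)              # log of the geometric mean
    return [l - g for l in logs]

sample = [120.0, 30.0, 850.0]              # illustrative gene counts for one sample
z = clr(sample)
# A defining property of the CLR: the transformed components sum to zero.
print(z, sum(z))
```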

    Matched MeSH terms: Normal Distribution
  3. Ang CYS, Chiew YS, Vu LH, Cove ME
    Comput Methods Programs Biomed, 2022 Mar;215:106601.
    PMID: 34973606 DOI: 10.1016/j.cmpb.2021.106601
    BACKGROUND: Spontaneous breathing (SB) effort during mechanical ventilation (MV) is an important metric of respiratory drive. However, SB effort varies due to a variety of factors, including evolving pathology and sedation levels. Therefore, assessment of SB efforts needs to be continuous and non-invasive. This is important to prevent both over- and under-assistance with MV. In this study, a machine learning model, Convolutional Autoencoder (CAE) is developed to quantify the magnitude of SB effort using only bedside MV airway pressure and flow waveform.

    METHOD: The CAE model was trained using 12,170,655 paired simulated spontaneous breathing (SB) and normal breathing (NB) flow data. The paired SB and NB flow data were simulated using a Gaussian Effort Model (GEM) with 5 basis functions. When the CAE model is given an SB flow input, it is capable of predicting a corresponding NB flow for that input. The magnitude of SB effort (SBEMag) is then quantified as the difference between the SB and NB flows. The CAE model was used to evaluate the SBEMag of 9 pressure control/support datasets. Results were validated using a mean squared error (MSE) fitting between clinical and training SB flows.

    RESULTS: The CAE model was able to produce NB flows from the clinical SB flows with the median SBEMag of the 9 datasets being 25.39% [IQR: 21.87-25.57%]. The absolute error in SBEMag using MSE validation yields a median of 4.77% [IQR: 3.77-8.56%] amongst the cohort. This shows the ability of the GEM to capture the intrinsic details present in SB flow waveforms. Analysis also shows both intra-patient and inter-patient variability in SBEMag.

    CONCLUSION: A Convolutional Autoencoder model was developed with simulated SB and NB flow data and is capable of quantifying the magnitude of patient spontaneous breathing effort. This provides potential application for real-time monitoring of patient respiratory drive for better management of patient-ventilator interaction.
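The abstract defines SBEMag as the difference between the SB flow and the CAE-predicted NB flow. One plausible formalisation, as a percentage of the NB flow magnitude, is sketched below; the exact normalisation used in the paper is not given here, so this definition and the sample waveforms are assumptions.

```python
def sbe_mag(sb_flow, nb_flow):
    """Effort magnitude as the absolute SB-NB flow difference,
    expressed as a percentage of NB flow magnitude (assumed definition)."""
    diff = sum(abs(s - n) for s, n in zip(sb_flow, nb_flow))
    ref = sum(abs(n) for n in nb_flow)
    return 100.0 * diff / ref

nb = [0.0, 0.4, 0.8, 0.6, 0.2, 0.0]   # illustrative normal-breathing flow samples
sb = [0.0, 0.5, 1.0, 0.7, 0.1, 0.0]   # same breath with spontaneous effort added
print(round(sbe_mag(sb, nb), 1))
```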

    Matched MeSH terms: Normal Distribution
  4. Mustafa S, Iqbal MW, Rana TA, Jaffar A, Shiraz M, Arif M, et al.
    Comput Intell Neurosci, 2022;2022:4348235.
    PMID: 35909861 DOI: 10.1155/2022/4348235
    Malignant melanoma is considered one of the deadliest skin diseases if ignored without treatment. The mortality rate caused by melanoma is more than two times that of other skin malignancy diseases. These facts encourage computer scientists to find automated methods to discover skin cancers. Nowadays, the analysis of skin images is widely used by assistant physicians to discover the first stage of the disease automatically. One of the challenges computer science researchers face when developing such a system is the poor clarity of the available images, with artefacts such as shadows, low contrast, hairs, and specular reflections complicating the detection of skin lesions in those images. This paper proposes a solution to this problem using the active contour method. However, the active contour method has a key drawback: selecting the seed point from which the segmentation process should start. This paper uses Gaussian filter-based maximum entropy and morphological processing methods to find automatic seed points for the active contour. By incorporating this, it can segment the lesion from dermoscopic images automatically. The proposed methodology was evaluated with quantitative and qualitative measures on the standard DermIS dataset to test its reliability, showing encouraging results.
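Maximum-entropy thresholding of the kind used for automatic seed selection picks the grey level that maximises the summed entropies of the background and foreground partitions of the histogram. A minimal Kapur-style sketch on a toy bimodal histogram (illustrative, not the paper's exact pipeline):

```python
import math

def max_entropy_threshold(hist):
    """Kapur's method: choose t maximising background + foreground entropy."""
    total = sum(hist)
    probs = [h / total for h in hist]
    best_t, best_h = 0, -1.0
    for t in range(1, len(hist)):
        p0 = sum(probs[:t])          # background mass, bins [0, t)
        p1 = 1.0 - p0                # foreground mass, bins [t, end)
        if p0 <= 0 or p1 <= 0:
            continue
        h0 = -sum(p / p0 * math.log(p / p0) for p in probs[:t] if p > 0)
        h1 = -sum(p / p1 * math.log(p / p1) for p in probs[t:] if p > 0)
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t

# Toy bimodal histogram: dark background in bins 0-3, bright lesion in bins 6-9.
hist = [40, 60, 50, 30, 2, 1, 25, 45, 35, 20]
t = max_entropy_threshold(hist)
print(t)   # the threshold falls in the valley between the two modes
```

Pixels above the threshold would then serve as candidate seed points for the active contour.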
    Matched MeSH terms: Normal Distribution
  5. Hu S, Hall DA, Zubler F, Sznitman R, Anschuetz L, Caversaccio M, et al.
    Hear Res, 2021 Oct;410:108338.
    PMID: 34469780 DOI: 10.1016/j.heares.2021.108338
    Recently, Bayesian brain-based models emerged as a possible composite of existing theories, providing a universal explanation of tinnitus phenomena. Yet, the involvement of multiple synergistic mechanisms complicates the identification of behavioral and physiological evidence. To overcome this, an empirically tested computational model could support the evaluation of theoretical hypotheses by intrinsically encompassing different mechanisms. The aim of this work was to develop a generative computational tinnitus perception model based on the Bayesian brain concept. The behavioral responses of 46 tinnitus subjects who underwent ten consecutive residual inhibition assessments were used for model fitting. Our model was able to replicate the behavioral responses during residual inhibition in our cohort (median linear correlation coefficient of 0.79). Using the same model, we simulated two additional tinnitus phenomena: residual excitation and occurrence of tinnitus in non-tinnitus subjects after sensory deprivation. In the simulations, the trajectories of the model were consistent with previously obtained behavioral and physiological observations. Our work introduces generative computational modeling to the research field of tinnitus. It has the potential to quantitatively link experimental observations to theoretical hypotheses and to support the search for neural signatures of tinnitus by finding correlates between the latent variables of the model and measured physiological data.
    Matched MeSH terms: Normal Distribution
  6. Yakno M, Mohamad-Saleh J, Ibrahim MZ
    Sensors (Basel), 2021 Sep 27;21(19).
    PMID: 34640769 DOI: 10.3390/s21196445
    Enhancement of captured hand vein images is essential for a number of purposes, such as accurate biometric identification and ease of medical intravenous access. This paper presents an improved hand vein image enhancement technique based on weighted average fusion of contrast limited adaptive histogram equalization (CLAHE) and fuzzy adaptive gamma (FAG). The proposed technique is applied in three stages. Firstly, CLAHE is applied locally to the grey-level intensities of image pixels for contrast enhancement. Secondly, the grey-level intensities are globally transformed into membership planes and modified with the FAG operator for the same purpose. Finally, the resultant images from CLAHE and FAG are fused using improved weighted averaging methods for clearer vein patterns. A matched filter with first-order derivative of Gaussian (MF-FODG) is then employed to segment vein patterns. The proposed technique was tested on self-acquired dorsal hand vein images as well as images from the SUAS databases. Its performance is compared with various other image enhancement techniques based on mean square error (MSE), peak signal-to-noise ratio (PSNR), and structural similarity index measure (SSIM). The enhancement technique's impact on the segmentation process has also been evaluated using sensitivity, accuracy, and Dice coefficient. The experimental results show that the proposed enhancement technique can significantly enhance hand vein patterns and improve the detection of dorsal hand veins.
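The fusion step combines the two enhanced images pixel-wise with weights. A minimal greyscale sketch of a global gamma transform (a simplified stand-in for the paper's fuzzy adaptive gamma) plus weighted-average fusion; the weights and the gamma value here are illustrative assumptions, not the paper's adaptive choices:

```python
def gamma_transform(pixels, gamma=0.6):
    """Global gamma correction on a flattened [0, 255] greyscale image."""
    return [255.0 * (p / 255.0) ** gamma for p in pixels]

def fuse(img_a, img_b, w_a=0.5, w_b=0.5):
    """Pixel-wise weighted-average fusion of two equally sized images."""
    return [(w_a * a + w_b * b) / (w_a + w_b) for a, b in zip(img_a, img_b)]

clahe_out = [30.0, 120.0, 200.0]          # stand-in for CLAHE-enhanced pixels
fag_out = gamma_transform([30.0, 120.0, 200.0])  # gamma < 1 brightens dark pixels
fused = fuse(clahe_out, fag_out, w_a=0.6, w_b=0.4)
print(fused)   # each fused pixel lies between the two source pixels
```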
    Matched MeSH terms: Normal Distribution
  7. Walters K, Cox A, Yaacob H
    Genet Epidemiol, 2021 Jun;45(4):386-401.
    PMID: 33410201 DOI: 10.1002/gepi.22375
    The Gaussian distribution is usually the default causal single-nucleotide polymorphism (SNP) effect size prior in Bayesian population-based fine-mapping association studies, but a recent study showed that the heavier-tailed Laplace prior distribution provided a better fit to breast cancer top hits identified in genome-wide association studies. We investigate the utility of the Laplace prior as an effect size prior in univariate fine-mapping studies. We consider ranking SNPs using Bayes factors and other summaries of the effect size posterior distribution, the effect of prior choice on credible set size based on the posterior probability of causality, and on the noteworthiness of SNPs in univariate analyses. Across a wide range of fine-mapping scenarios the Laplace prior generally leads to larger 90% credible sets than the Gaussian prior. These larger credible sets for the Laplace prior are due to relatively high prior mass around zero, which can yield many noncausal SNPs with relatively large Bayes factors. If using conventional credible sets, the Gaussian prior generally yields a better trade-off between including the causal SNP with high probability and keeping the set size reasonable. Interestingly, when using the less well utilised measure of noteworthiness, the Laplace prior performs well, leading to causal SNPs being declared noteworthy with high probability, whilst generally declaring fewer than 5% of noncausal SNPs as noteworthy. In contrast, the Gaussian prior leads to the causal SNP being declared noteworthy with very low probability.
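For a single SNP with estimated effect beta_hat and standard error se, the Bayes factor compares the marginal likelihood under the effect-size prior to the likelihood at zero effect. A numeric-integration sketch for both priors follows; the grid width and prior scales are illustrative assumptions, not values from the paper.

```python
import math

def normal_pdf(x, mu, sd):
    return math.exp(-(x - mu) ** 2 / (2 * sd * sd)) / (sd * math.sqrt(2 * math.pi))

def laplace_pdf(x, scale):
    return math.exp(-abs(x) / scale) / (2 * scale)

def bayes_factor(beta_hat, se, prior_pdf, lo=-2.0, hi=2.0, n=4001):
    """BF = [integral of L(beta_hat | beta) * prior(beta)] / L(beta_hat | 0),
    with the likelihood approximated as Normal(beta_hat; beta, se)."""
    step = (hi - lo) / (n - 1)
    marginal = sum(normal_pdf(beta_hat, lo + i * step, se) * prior_pdf(lo + i * step)
                   for i in range(n)) * step
    return marginal / normal_pdf(beta_hat, 0.0, se)

beta_hat, se = 0.25, 0.05   # a clearly nonzero effect estimate (illustrative)
bf_normal = bayes_factor(beta_hat, se, lambda b: normal_pdf(b, 0.0, 0.2))
bf_laplace = bayes_factor(beta_hat, se, lambda b: laplace_pdf(b, 0.2))
print(bf_normal, bf_laplace)   # both priors strongly favour a real effect here
```

Comparing such Bayes factors across SNPs, and how they differ between the two priors near zero effect, is exactly the kind of behaviour the abstract describes.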
    Matched MeSH terms: Normal Distribution
  8. Mas'ud AA, Sundaram A, Ardila-Rey JA, Schurch R, Muhammad-Sukki F, Bani NA
    Sensors (Basel), 2021 Apr 06;21(7).
    PMID: 33917472 DOI: 10.3390/s21072562
    In high-voltage (HV) insulation, electrical trees are an important degradation phenomenon strongly linked to partial discharge (PD) activity. Their initiation and development have attracted the attention of the research community, and better understanding and characterization of the phenomenon are needed. They are very damaging and develop through the insulation material, forming a discharge conduction path. Therefore, it is important to adequately measure and characterize tree growth before it can lead to complete failure of the system. In this paper, the Gaussian mixture model (GMM) has been applied to cluster and classify the different growth stages of electrical trees in epoxy resin insulation. First, tree growth experiments were conducted, and PD data were captured from the initial stage to the breakdown stage of tree growth in epoxy resin insulation. Second, the GMM was applied to categorize the different electrical tree stages into clusters. The results show that PD dynamics vary with different stress voltages and tree growth stages. The electrical tree patterns with shorter breakdown times had identical clusters throughout the degradation stages. The breakdown time can be a key factor in determining the degradation levels of PD patterns emanating from trees in epoxy resin. This is important in order to determine the severity of electrical treeing degradation and, therefore, to perform efficient asset management. The novelty of the work presented in this paper is that, for the first time, the GMM has been applied to electrical tree growth classification, and the optimal values of the hyperparameters, i.e., the number of clusters and the appropriate covariance structure, have been determined for the different electrical tree clusters.
    Matched MeSH terms: Normal Distribution
  9. Alameri M, Hasikin K, Kadri NA, Nasir NFM, Mohandas P, Anni JS, et al.
    Comput Math Methods Med, 2021;2021:6953593.
    PMID: 34497665 DOI: 10.1155/2021/6953593
    Infertility is a condition whereby pregnancy does not occur despite having unprotected sexual intercourse for at least one year. The main reason could originate from either the male or the female, and sometimes both contribute to the fertility disorder. For the male, sperm disorders were found to be the most common reason for infertility. In this paper, we proposed male infertility analysis based on automated sperm motility tracking. The proposed method worked in multiple stages, where the first stage focused on the sperm detection process using an improved Gaussian mixture model. A new optimization protocol was proposed to accurately detect the motile sperm prior to the sperm tracking process. Since the optimization protocol was imposed in the proposed system, the sperm tracking and velocity estimation processes are improved. The proposed method attained the highest average accuracy, sensitivity, and specificity of 92.3%, 96.3%, and 72.4%, respectively, when tested on 10 different samples. Our proposed method showed better sperm detection quality when observed qualitatively, compared to other state-of-the-art techniques.
    Matched MeSH terms: Normal Distribution
  10. Usman OL, Muniyandi RC, Omar K, Mohamad M
    PLoS One, 2021;16(2):e0245579.
    PMID: 33630876 DOI: 10.1371/journal.pone.0245579
    Achieving biologically interpretable neural-biomarkers and features from neuroimaging datasets is a challenging task in an MRI-based dyslexia study. This challenge becomes more pronounced when the needed MRI datasets are collected from multiple heterogeneous sources with inconsistent scanner settings. This study presents a method of improving the biological interpretation of dyslexia's neural-biomarkers from MRI datasets sourced from publicly available open databases. The proposed system utilized a modified histogram normalization (MHN) method to improve dyslexia neural-biomarker interpretations by mapping the pixel intensities of low-quality input neuroimages to the range between the low-intensity region of interest (ROIlow) and high-intensity region of interest (ROIhigh) of the high-quality image. This was achieved after initial image smoothing using the Gaussian filter method with an isotropic kernel of size 4 mm. The performance of the proposed smoothing and normalization methods was evaluated in three image post-processing experiments: ROI segmentation, gray matter (GM) tissue volume estimation, and deep learning (DL) classification using the Computational Anatomy Toolbox (CAT12) and pre-trained models in a MATLAB working environment. The three experiments were preceded by some pre-processing tasks such as image resizing, labelling, patching, and non-rigid registration. Our results showed that the best smoothing was achieved at a scale value, σ = 1.25, with a 0.9% increment in the peak signal-to-noise ratio (PSNR). Results from the three image post-processing experiments confirmed the efficacy of the proposed methods. Evidence emanating from our analysis showed that using the proposed MHN and Gaussian smoothing methods can improve the comparability of image features and neural-biomarkers of dyslexia, with a statistically significantly higher Dice similarity coefficient (DSC) index, lower mean square error (MSE), and improved tissue volume estimations. After 10 repeated 10-fold cross-validations, the highest accuracy achieved by the DL models was 94.7% at a 95% confidence interval (CI) level. Finally, our findings confirmed that the proposed MHN method significantly outperformed the state-of-the-art histogram-matching normalization method.
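The mapping step described above rescales input intensities into the [ROIlow, ROIhigh] band of the reference image. A linear min-max version is sketched below as a simplification; the paper's modified histogram normalization is more involved, so this is illustrative only.

```python
def map_to_roi(pixels, roi_low, roi_high):
    """Linearly rescale intensities into [roi_low, roi_high] (simplified MHN)."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        return [roi_low for _ in pixels]   # degenerate flat image
    scale = (roi_high - roi_low) / (hi - lo)
    return [roi_low + (p - lo) * scale for p in pixels]

raw = [12.0, 40.0, 90.0, 200.0]            # low-quality input intensities
mapped = map_to_roi(raw, roi_low=50.0, roi_high=180.0)
print(mapped)   # the extreme input intensities land on the ROI bounds
```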
    Matched MeSH terms: Normal Distribution
  11. Kadir MA, Abdul Razak FI, Haris NSH
    Data Brief, 2020 Oct;32:106263.
    PMID: 32905010 DOI: 10.1016/j.dib.2020.106263
    The data in this article provide information on spectroscopic and theoretical data for p-chlorocalix[4]arene when combined with selected drugs, such as paracetamol, ibuprofen, and cetirizine. The present spectroscopic data are generated from Fourier Transform Infrared (FTIR), Nuclear Magnetic Resonance (1H NMR and 13C NMR), and Ultraviolet-Visible spectroscopy (UV-Vis) as the key tools for molecular characterization. The measurement of the optimization energy, interaction energy, and the band gap energy between the molecules was calculated by Gaussian 09 software. It is interesting to note that of the three titled drugs identified, p-chlorocalix[4]arene showed the highest interaction energy with paracetamol, followed by ibuprofen and cetirizine.
    Matched MeSH terms: Normal Distribution
  12. Wicaksono FD, Arshad YB, Sihombing H
    Heliyon, 2020 Apr;6(4):e03607.
    PMID: 32346625 DOI: 10.1016/j.heliyon.2020.e03607
    This paper presents the novel approach of the Norm-dist Monte-Carlo fuzzy analytic hierarchy process (NMCFAHP) to incorporate probabilistic and epistemic uncertainty due to vagueness in human judgment in multi-criteria decision analysis. The normal distribution is applied as the most appropriate distribution model to approximate the probability distribution function of the criteria and alternatives within the Monte-Carlo simulation. To test the applicability of the proposed NMCFAHP, a case study of non-destructive test (NDT) technology selection is performed at a petroleum company in Borneo, Indonesia. When compared with the conventional triangular fuzzy-AHP, the proposed NMCFAHP method reduces the standard error of the mean values by 90.4-99.8% at the final evaluation scores. This means that the proposed NMCFAHP involves significantly fewer errors when dealing with fuzzy uncertainty and stochastic randomness. The proposed NMCFAHP delivers reliable performance in overcoming probabilistic uncertainty and epistemic vagueness in the group decision-making process.
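The standard error of the mean that the comparison above is based on falls as 1/sqrt(n) with the number of Monte-Carlo draws. A quick sketch of estimating a normally distributed evaluation score and its standard error from simulation draws (all numbers illustrative, not from the case study):

```python
import math
import random

def mc_mean_and_sem(mu, sigma, n, seed=1):
    """Monte-Carlo estimate of a normally distributed score and its SEM."""
    rng = random.Random(seed)
    draws = [rng.gauss(mu, sigma) for _ in range(n)]
    mean = sum(draws) / n
    var = sum((d - mean) ** 2 for d in draws) / (n - 1)  # sample variance
    return mean, math.sqrt(var / n)                      # SEM = s / sqrt(n)

_, sem_small = mc_mean_and_sem(0.62, 0.1, 100)
_, sem_large = mc_mean_and_sem(0.62, 0.1, 10000)
print(sem_small, sem_large)   # SEM shrinks roughly tenfold with 100x the draws
```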
    Matched MeSH terms: Normal Distribution
  13. Arunachalam GR, Chiew YS, Tan CP, Ralib AM, Nor MBM
    Comput Methods Programs Biomed, 2020 Jan;183:105103.
    PMID: 31606559 DOI: 10.1016/j.cmpb.2019.105103
    BACKGROUND AND OBJECTIVE: Mechanical ventilation therapy of respiratory failure patients can be guided by monitoring patient-specific respiratory mechanics. However, the patient's spontaneous breathing effort during controlled ventilation changes airway pressure waveform and thus affects the model-based identification of patient-specific respiratory mechanics parameters. This study develops a model to estimate respiratory mechanics in the presence of patient effort.

    METHODS: The Gaussian effort model (GEM) is a derivative of the single-compartment model with basis functions. The GEM model uses a linear combination of basis functions to model the nonlinear pressure waveform of spontaneously breathing patients. It estimates respiratory mechanics such as elastance and resistance along with the magnitudes of the basis functions, which account for patient inspiratory effort.

    RESULTS AND DISCUSSION: The GEM model was tested using both simulated data and retrospective observational clinical trial patient data. GEM model fitting to the original airway pressure waveform is better than that of existing models when reverse triggering asynchrony is present. The fitting error of the GEM model was less than 10% for both simulated data and clinical trial patient data.

    CONCLUSION: GEM can capture respiratory mechanics in the presence of patient effort in volume control ventilation mode and can also be used to assess patient-ventilator interaction. The model determines basis function magnitudes, which can be used to simulate any waveform of patient effort pressure for future studies. The parameter identification of the GEM model can be further improved by constraining the parameters within a physiologically plausible range during least-squares nonlinear regression.
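The single-compartment model with Gaussian basis functions described above can be written as paw(t) = E*V(t) + R*Q(t) + P0 + sum_i b_i * exp(-(t - c_i)^2 / (2w^2)). A sketch with illustrative centres, width, and magnitudes (these values are assumptions, not the paper's fitted parameters):

```python
import math

def gem_pressure(t, volume, flow, E, R, p0, centres, width, mags):
    """Airway pressure from the single-compartment model plus a linear
    combination of Gaussian basis functions representing patient effort."""
    effort = sum(b * math.exp(-(t - c) ** 2 / (2 * width ** 2))
                 for b, c in zip(mags, centres))
    return E * volume + R * flow + p0 + effort

# Five evenly spaced basis functions across a 1-second inspiration (assumed).
centres = [0.1, 0.3, 0.5, 0.7, 0.9]
passive = gem_pressure(0.5, volume=0.4, flow=0.5, E=25.0, R=10.0, p0=5.0,
                       centres=centres, width=0.1, mags=[0.0] * 5)
with_effort = gem_pressure(0.5, volume=0.4, flow=0.5, E=25.0, R=10.0, p0=5.0,
                           centres=centres, width=0.1, mags=[0, -1, -3, -1, 0])
print(passive, with_effort)   # negative effort pulls pressure below passive
```

Fitting the magnitudes b_i (together with E and R) to a measured waveform is what yields the effort estimate.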

    Matched MeSH terms: Normal Distribution
  14. Rajagopal H, Mokhtar N, Tengku Mohmed Noor Izam TF, Wan Ahmad WK
    PLoS One, 2020;15(5):e0233320.
    PMID: 32428043 DOI: 10.1371/journal.pone.0233320
    Image Quality Assessment (IQA) is essential for the accuracy of systems for automatic recognition of tree species from wood samples. In this study, a No-Reference IQA (NR-IQA) metric for wood images, WNR-IQA, was proposed to assess the quality of wood images. Support Vector Regression (SVR) was trained using Generalized Gaussian Distribution (GGD) and Asymmetric Generalized Gaussian Distribution (AGGD) features, which were measured for wood images. Meanwhile, the Mean Opinion Score (MOS) was obtained from subjective evaluation. This was followed by a comparison between the proposed WNR-IQA metric, three established NR-IQA metrics, namely Blind/Referenceless Image Spatial Quality Evaluator (BRISQUE), deepIQA, and Deep Bilinear Convolutional Neural Networks (DB-CNN), and five Full-Reference IQA (FR-IQA) metrics known as MSSIM, SSIM, FSIM, IWSSIM, and GMSD. The proposed WNR-IQA metric, BRISQUE, deepIQA, DB-CNN, and the FR-IQAs were then compared against MOS values to evaluate the performance of the automatic IQA metrics. As a result, the WNR-IQA metric exhibited higher performance than the BRISQUE, deepIQA, DB-CNN, and FR-IQA metrics. The highest-quality images may not be routinely available due to logistic factors, such as dust, poor illumination, and the hot environment present in the timber industry. Moreover, motion blur can occur due to relative motion between the camera and the wood slice. Therefore, the advantage of WNR-IQA can be seen in its independence from a "perfect" reference image for image quality evaluation.
    Matched MeSH terms: Normal Distribution
  15. Walters K, Cox A, Yaacob H
    Genet Epidemiol, 2019 Sep;43(6):675-689.
    PMID: 31286571 DOI: 10.1002/gepi.22212
    The default causal single-nucleotide polymorphism (SNP) effect size prior in Bayesian fine-mapping studies is usually the Normal distribution. This choice is often based on computational convenience, rather than evidence that it is the most suitable prior distribution. The choice of prior is important because previous studies have shown considerable sensitivity of causal SNP Bayes factors to the form of the prior. In some well-studied diseases there are now considerable numbers of genome-wide association study (GWAS) top hits along with estimates of the number of yet-to-be-discovered causal SNPs. We show how the effect sizes of the top hits and estimates of the number of yet-to-be-discovered causal SNPs can be used to choose between the Laplace and Normal priors, to estimate the prior parameters and to quantify the uncertainty in this estimation. The methodology can readily be applied to other priors. We show that the top hits available from breast cancer GWAS provide overwhelming support for the Laplace over the Normal prior, which has important consequences for variant prioritisation. The work in this paper enables practitioners to derive more objective priors than are currently being used and could lead to prioritisation of different variants.
    Matched MeSH terms: Normal Distribution
  16. Faust O, Razaghi H, Barika R, Ciaccio EJ, Acharya UR
    Comput Methods Programs Biomed, 2019 Jul;176:81-91.
    PMID: 31200914 DOI: 10.1016/j.cmpb.2019.04.032
    BACKGROUND AND OBJECTIVE: Sleep is an important part of our life. That importance is highlighted by the multitude of health problems which result from sleep disorders. Detecting these sleep disorders requires an accurate interpretation of physiological signals. Prerequisite for this interpretation is an understanding of the way in which sleep stage changes manifest themselves in the signal waveform. With that understanding it is possible to build automated sleep stage scoring systems. Apart from their practical relevance for automating sleep disorder diagnosis, these systems provide a good indication of the amount of sleep stage related information communicated by a specific physiological signal.

    METHODS: This article provides a comprehensive review of automated sleep stage scoring systems, which were created since the year 2000. The systems were developed for Electrocardiogram (ECG), Electroencephalogram (EEG), Electrooculogram (EOG), and a combination of signals.

    RESULTS: Our review shows that all of these signals contain information for sleep stage scoring.

    CONCLUSIONS: The result is important, because it allows us to shift our research focus away from information extraction methods to systemic improvements, such as patient comfort, redundancy, safety and cost.

    Matched MeSH terms: Normal Distribution
  17. Yeap ZX, Sim KS, Tso CP
    Microsc Res Tech, 2019 Apr;82(4):402-414.
    PMID: 30575192 DOI: 10.1002/jemt.23181
    Image processing is introduced to remove or reduce the noise and unwanted signals that deteriorate the quality of an image. Here, a single-level two-dimensional wavelet transform is applied to the image in order to obtain the wavelet transform sub-band signals of the image. An estimation technique to predict the noise variance in an image is proposed, which is then fed into a Wiener filter to filter away the noise from the sub-bands of the image. The proposed filter is called adaptive tuning piecewise cubic Hermite interpolation with Wiener filter in the wavelet domain. Its performance is compared with four existing filters: the median filter, the Gaussian smoothing filter, the two-level wavelet transform with Wiener filter, and the adaptive noise Wiener filter. Based on the results, the proposed filter has better performance than the other four methods.
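The core Wiener step on a wavelet sub-band shrinks each coefficient toward the local mean by the ratio of estimated signal variance to total variance. A minimal sketch using one global mean and variance per sub-band (the paper's filter is adaptive and interpolation-based; this illustrates only the Wiener gain):

```python
def wiener_subband(coeffs, noise_var):
    """Apply the Wiener gain max(0, var - noise_var) / var to a sub-band."""
    n = len(coeffs)
    mean = sum(coeffs) / n
    var = sum((c - mean) ** 2 for c in coeffs) / n
    gain = max(0.0, var - noise_var) / var if var > 0 else 0.0
    return [mean + gain * (c - mean) for c in coeffs]

band = [4.0, -2.0, 7.0, 0.0, -5.0]   # illustrative detail coefficients
denoised = wiener_subband(band, noise_var=2.0)
print(denoised)   # coefficients shrink toward the sub-band mean
```

With the estimated noise variance set to zero the gain is 1 and the sub-band passes through unchanged, which is the sanity check below.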
    Matched MeSH terms: Normal Distribution
  18. Mohd Khairul Bazli Mohd Aziz, Fadhilah Yusof, Zalina Mohd Daud, Zulkifli Yusop, Mohammad Afif Kasno
    MATEMATIKA, 2019;35(2):157-170.
    The well-known geostatistics method (variance-reduction method) is commonly used to determine the optimal rain gauge network. The main problem in the geostatistics method is determining the best semivariogram model to use in estimating the variance. An optimal choice of the semivariogram model is an important point for a good data evaluation process. Three different semivariogram models, namely spherical, Gaussian and exponential, are used and their performances are compared in this study. The cross-validation technique is applied to compute the errors of the semivariograms. Rainfall data for the period 1975-2008 from the 84 existing rain gauge stations covering the state of Johor are used in this study. The results show that the exponential model is the best semivariogram model, and it is chosen to determine the optimal number and locations of rain gauge stations.
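The three candidate semivariogram models compared above have standard closed forms in nugget, sill, and range parameters. A sketch with illustrative parameter values (note that conventions for the Gaussian/exponential range factor vary between texts; the simple forms are used here):

```python
import math

def spherical(h, nugget, sill, rng):
    """Spherical model: polynomial rise, exactly flat beyond the range."""
    if h >= rng:
        return nugget + sill
    return nugget + sill * (1.5 * h / rng - 0.5 * (h / rng) ** 3)

def gaussian(h, nugget, sill, rng):
    """Gaussian model: parabolic near the origin, asymptotic sill."""
    return nugget + sill * (1.0 - math.exp(-(h / rng) ** 2))

def exponential(h, nugget, sill, rng):
    """Exponential model: linear near the origin, asymptotic sill."""
    return nugget + sill * (1.0 - math.exp(-h / rng))

# Semivariance grows with separation distance h and levels off near the sill.
for model in (spherical, gaussian, exponential):
    print(model.__name__,
          [round(model(h, 0.0, 1.0, 50.0), 3) for h in (0, 25, 50, 200)])
```

Cross-validation then compares, for each model, the kriging errors at held-out stations.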
    Matched MeSH terms: Normal Distribution
  19. Masuyama N, Loo CK, Dawood F
    Neural Netw, 2018 Feb;98:76-86.
    PMID: 29202265 DOI: 10.1016/j.neunet.2017.11.003
    Adaptive Resonance Theory (ART) is one of the successful approaches to resolving "the plasticity-stability dilemma" in neural networks, and its supervised learning model called ARTMAP is a powerful tool for classification. Among several improvements, such as Fuzzy or Gaussian based models, the state-of-the-art model is the Bayesian-based one, which resolves the drawbacks of the others. However, it is known that the Bayesian approach for high-dimensional data and large numbers of samples requires high computational cost, and the covariance matrix in the likelihood becomes unstable. This paper introduces Kernel Bayesian ART (KBA) and ARTMAP (KBAM) by integrating Kernel Bayes' Rule (KBR) and the Correntropy Induced Metric (CIM) into Bayesian ART (BA) and ARTMAP (BAM), respectively, while maintaining the properties of BA and BAM. The kernel frameworks in KBA and KBAM are able to avoid the curse of dimensionality. In addition, the covariance-free Bayesian computation by KBR provides efficient and stable computational capability to KBA and KBAM. Furthermore, correntropy-based similarity measurement improves the noise reduction ability even in high-dimensional space. The simulation experiments show that KBA exhibits a better self-organizing capability than BA, and KBAM provides superior classification ability compared to BAM.
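The Correntropy Induced Metric used for similarity measurement has a simple closed form with a Gaussian kernel: CIM(x, y) = sqrt(kappa(0) - mean_i kappa(x_i - y_i)). A sketch for vectors (the kernel bandwidth sigma is an illustrative assumption; the unnormalised kernel with kappa(0) = 1 is used):

```python
import math

def cim(x, y, sigma=1.0):
    """Correntropy Induced Metric with a Gaussian kernel:
    sqrt(1 - mean_i exp(-(x_i - y_i)^2 / (2 sigma^2)))."""
    v = sum(math.exp(-(a - b) ** 2 / (2 * sigma ** 2))
            for a, b in zip(x, y)) / len(x)
    return math.sqrt(1.0 - v)

a = [0.0, 1.0, 2.0]
print(cim(a, a))                 # identical vectors -> distance 0
print(cim(a, [0.1, 1.1, 1.9]))   # small perturbation -> small distance
print(cim(a, [0.0, 1.0, 50.0]))  # a single outlier saturates: CIM stays below 1
```

The saturation on the outlier coordinate is what gives correntropy-based measures their robustness to noise.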
    Matched MeSH terms: Normal Distribution
  20. Bako Sunday Samuel, Mohd Bakri Adam, Anwar Fitrianto
    MATEMATIKA, 2018;34(2):365-380.
    Recent studies have shown that independent identically distributed Gaussian random variables are not suitable for modelling extreme values observed during extremal events. However, many real-life data on extreme values are dependent and stationary rather than the conventional independent identically distributed data. We propose a stationary autoregressive (AR) process with Gumbel distributed innovation and characterise the short-term dependence among maxima of an AR process over a range of sample sizes with varying degrees of dependence. We obtain maximum likelihood estimates of the parameters of the Gumbel AR process and its residuals, and evaluate the performance of the parameter estimates. The AR process is fitted to the Gumbel-generalised Pareto (GPD) distribution, and we evaluate the performance of the parameter estimates fitted to the cluster maxima and the original series. Ignoring the effect of dependence leads to overestimation of the location parameter of the Gumbel-AR(1) process. The estimate of the location parameter of the AR process using the residuals gives a better estimate. Estimates of the scale parameter perform marginally better for the original series than for the residuals. The degree of clustering increases as dependence is enhanced for the AR process. The Gumbel-AR(1) fitted to the threshold exceedances shows that the estimates of the scale and shape parameters fitted to the cluster maxima perform better as sample size increases; however, ignoring the effect of dependence leads to underestimation of the scale parameter. The shape parameter of the original series gives a superior estimate compared to the threshold excesses fitted to the Gumbel distributed generalised Pareto distribution.
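An AR(1) process with Gumbel-distributed innovation, as proposed above, can be simulated with inverse-CDF sampling: eps = mu - beta * ln(-ln(U)) for U uniform on (0, 1). A sketch (phi and the Gumbel parameters are illustrative choices, not the paper's):

```python
import math
import random

def gumbel_draw(rng, mu=0.0, beta=1.0):
    """Inverse-CDF sample from the Gumbel(mu, beta) distribution."""
    u = rng.random()
    return mu - beta * math.log(-math.log(u))

def simulate_gumbel_ar1(n, phi=0.6, mu=0.0, beta=1.0, seed=42, burn=500):
    """x_t = phi * x_{t-1} + eps_t with Gumbel innovations; stationary for |phi| < 1.
    A burn-in period is discarded so the returned series is near-stationary."""
    rng = random.Random(seed)
    x, series = 0.0, []
    for t in range(n + burn):
        x = phi * x + gumbel_draw(rng, mu, beta)
        if t >= burn:
            series.append(x)
    return series

series = simulate_gumbel_ar1(5000)
mean = sum(series) / len(series)
# Stationary mean is E[eps] / (1 - phi) = euler_gamma / 0.4, roughly 1.44.
print(round(mean, 2))
```

Runs of consecutive large values in such a series are the clustering of maxima that the abstract analyses.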
    Matched MeSH terms: Normal Distribution