Displaying all 2 publications

  1. Al-Timemy AH, Mosa ZM, Alyasseri Z, Lavric A, Lui MM, Hazarbassanov RM, et al.
    Transl Vis Sci Technol, 2021 Dec 1;10(14):16.
    PMID: 34913952 DOI: 10.1167/tvst.10.14.16
    Purpose: To develop and assess the accuracy of a hybrid deep learning construct for detecting keratoconus (KCN) based on corneal topographic maps.

    Methods: We collected 3794 corneal images from 542 eyes of 280 subjects and developed seven deep learning models based on anterior and posterior eccentricity, anterior and posterior elevation, anterior and posterior sagittal curvature, and corneal thickness maps to extract deep corneal features. An independent subset of 1050 images collected from 150 eyes of 85 subjects at a separate center was used to validate the models. We developed a hybrid deep learning model to detect KCN. We visualized deep features of corneal parameters to assess the quality of learning subjectively and computed the area under the receiver operating characteristic curve (AUC), confusion matrices, accuracy, and F1 score to evaluate the models objectively (an illustrative sketch of such a construct follows this abstract).

    Results: In the development dataset, 204 eyes were normal, 123 eyes were suspected KCN, and 215 eyes had KCN. In the independent validation dataset, 50 eyes were normal, 50 eyes were suspected KCN, and 50 eyes had KCN. Images were annotated by three corneal specialists. The AUCs of the models for the two-class and three-class problems on the development set were 0.99 and 0.93, respectively.

    Conclusions: The hybrid deep learning model achieved high accuracy in identifying KCN based on corneal maps and provided a time-efficient framework with low computational complexity.

    Translational Relevance: Deep learning can detect KCN from non-invasive corneal images with high accuracy, suggesting potential application in research and clinical practice to identify KCN.
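
    The Methods above describe extracting deep features from several corneal map types, fusing them in a hybrid model, and evaluating with AUC, confusion matrices, accuracy, and F1 score. The abstract does not give the architecture, so the following is only a minimal sketch under assumed names (MapBranch, HybridKCNModel), assumed layer sizes, and dummy data; it is not the authors' implementation.

    import numpy as np
    import torch
    import torch.nn as nn
    from sklearn.metrics import roc_auc_score, confusion_matrix, accuracy_score, f1_score

    class MapBranch(nn.Module):
        """Small CNN that turns one corneal map (a 1-channel image) into a feature vector."""
        def __init__(self, feat_dim: int = 64):
            super().__init__()
            self.conv = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            )
            self.fc = nn.Linear(32, feat_dim)

        def forward(self, x):
            return self.fc(self.conv(x).flatten(1))

    class HybridKCNModel(nn.Module):
        """Concatenates per-map deep features and classifies normal / suspect KCN / KCN."""
        def __init__(self, n_maps: int = 7, feat_dim: int = 64, n_classes: int = 3):
            super().__init__()
            self.branches = nn.ModuleList(MapBranch(feat_dim) for _ in range(n_maps))
            self.head = nn.Linear(n_maps * feat_dim, n_classes)

        def forward(self, maps):  # maps: list of (B, 1, H, W) tensors, one per map type
            feats = [branch(m) for branch, m in zip(self.branches, maps)]
            return self.head(torch.cat(feats, dim=1))

    # Dummy evaluation showing the metrics named in the abstract.
    model = HybridKCNModel()
    dummy_maps = [torch.randn(9, 1, 64, 64) for _ in range(7)]   # 9 eyes, 7 map types
    y_true = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])               # normal / suspect / KCN
    probs = torch.softmax(model(dummy_maps), dim=1).detach().numpy()
    y_pred = probs.argmax(axis=1)

    print("AUC (one-vs-rest):", roc_auc_score(y_true, probs, multi_class="ovr"))
    print("Confusion matrix:\n", confusion_matrix(y_true, y_pred))
    print("Accuracy:", accuracy_score(y_true, y_pred))
    print("F1 (macro):", f1_score(y_true, y_pred, average="macro"))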

  2. Abdi Alkareem Alyasseri Z, Alomari OA, Al-Betar MA, Awadallah MA, Hameed Abdulkareem K, Abed Mohammed M, et al.
    Comput Intell Neurosci, 2022;2022:5974634.
    PMID: 35069721 DOI: 10.1155/2022/5974634
    Recently, the electroencephalogram (EEG) signal has shown excellent potential as the basis for a new person identification technique. Several studies have described EEG as having unique features, universality, and natural robustness, making it a promising way to prevent spoofing attacks. EEG signals are recordings of the brain's electrical activity, measured by placing electrodes (channels) at various positions on the scalp. However, traditional EEG-based systems that use many channels are highly complex, and only some channels carry information that is critical for identification. Several studies have proposed single-objective approaches to EEG channel selection for person identification. Unfortunately, these studies focused only on increasing the accuracy rate without balancing accuracy against the total number of selected EEG channels. The novelty of this paper is to propose a multiobjective binary version of the cuckoo search algorithm (MOBCS-KNN) to find optimal EEG channel selections for person identification. The proposed method uses a weighted-sum technique to implement the multiobjective approach and a KNN classifier for EEG-based biometric person identification. It is worth mentioning that this is the first investigation of a multiobjective technique for the EEG channel selection problem. A standard EEG motor-imagery dataset is used to evaluate the performance of MOBCS-KNN. The experiments show that MOBCS-KNN obtained an accuracy of 93.86% using only 24 sensors with AR20 autoregressive coefficients. Another important point is that MOBCS-KNN selects channels that are not too close to each other, capturing relevant information from across the whole scalp. In conclusion, the MOBCS-KNN algorithm achieves the best results compared with other metaheuristic algorithms. Finally, the recommended approach may guide future work applying it to different research areas.
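
    The abstract above describes a multiobjective binary cuckoo search that selects EEG channels using a weighted-sum fitness (classification accuracy versus number of channels) scored by a KNN classifier. As an illustration only, here is a simplified sketch of that idea on synthetic data; the parameter values, the heavy-tailed step used in place of a full Levy flight, and helper names such as fitness and binarize are assumptions, not the authors' MOBCS-KNN implementation.

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_subjects, trials_per_subject, n_channels, feats_per_channel = 5, 20, 16, 4

    # Synthetic "AR-coefficient" features: rows are trials, columns are per-channel features.
    X = rng.normal(size=(n_subjects * trials_per_subject, n_channels * feats_per_channel))
    y = np.repeat(np.arange(n_subjects), trials_per_subject)

    def fitness(mask, w_acc=0.9, w_ch=0.1):
        """Weighted sum of (1 - KNN accuracy) and fraction of channels used; lower is better."""
        if mask.sum() == 0:
            return 1.0
        cols = np.flatnonzero(np.repeat(mask, feats_per_channel))
        acc = cross_val_score(KNeighborsClassifier(n_neighbors=3), X[:, cols], y, cv=3).mean()
        return w_acc * (1 - acc) + w_ch * mask.sum() / n_channels

    def binarize(step):
        """Map real-valued positions to a binary channel mask via a sigmoid transfer function."""
        return (1 / (1 + np.exp(-step)) > rng.random(step.shape)).astype(int)

    # Binary cuckoo search: each nest is a channel mask; heavy-tailed steps explore new masks.
    n_nests, n_iter, pa = 10, 30, 0.25
    nests = rng.integers(0, 2, size=(n_nests, n_channels))
    scores = np.array([fitness(m) for m in nests])

    for _ in range(n_iter):
        best = nests[scores.argmin()]
        steps = rng.standard_cauchy(size=nests.shape) * (nests - best)  # Levy-like random walk
        candidates = binarize(nests + steps)
        cand_scores = np.array([fitness(m) for m in candidates])
        improved = cand_scores < scores
        nests[improved], scores[improved] = candidates[improved], cand_scores[improved]
        # Abandon a fraction pa of nests and replace them with random masks.
        abandon = rng.random(n_nests) < pa
        nests[abandon] = rng.integers(0, 2, size=(int(abandon.sum()), n_channels))
        scores[abandon] = [fitness(m) for m in nests[abandon]]

    best_mask = nests[scores.argmin()]
    print("Selected channels:", np.flatnonzero(best_mask), "fitness:", scores.min())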