Displaying publications 81 - 100 of 1459 in total

  1. Supakar R, Satvaya P, Chakrabarti P
    Comput Biol Med, 2022 Dec;151(Pt A):106225.
    PMID: 36306576 DOI: 10.1016/j.compbiomed.2022.106225
    Normal life can be ensured for schizophrenic patients if the condition is diagnosed early. The electroencephalogram (EEG) carries information about brain network connectivity that can be used to detect brain anomalies indicative of schizophrenia. Since deep learning can automatically extract significant features and perform classification, the authors proposed a deep-learning-based model using an RNN-LSTM to analyze EEG signal data and diagnose schizophrenia. The proposed model used three dense layers on top of a 100-dimensional LSTM. EEG signal data from 45 schizophrenic patients and 39 healthy subjects were used in the study. A dimensionality reduction algorithm was applied to obtain an optimal feature set, and the classifier was run with both sets of data. Accuracies of 98% and 93.67% were obtained with the complete feature set and the reduced feature set, respectively. The robustness of the model was evaluated using the model performance measure and the combined performance measure. Outcomes were compared with those obtained with traditional machine learning classifiers such as Random Forest, SVM, FURIA, and AdaBoost, and the proposed model was found to perform better with the complete dataset. Compared with the results of researchers who worked with the same dataset using either CNN or RNN, the proposed model's accuracy was better or comparable.
    Matched MeSH terms: Algorithms
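    A minimal Keras sketch of the RNN-LSTM architecture described in this entry (a 100-unit LSTM with three dense layers on top); the input dimensions, dense-layer widths, and training settings are illustrative assumptions, not the authors' reported configuration.

```python
# Sketch of a 100-dimensional LSTM with three dense layers for binary
# EEG classification; random arrays stand in for real EEG segments.
import numpy as np
from tensorflow.keras import layers, models

def build_model(time_steps=256, channels=19):   # assumed input shape
    model = models.Sequential([
        layers.LSTM(100, input_shape=(time_steps, channels)),
        layers.Dense(64, activation="relu"),    # three dense layers on top
        layers.Dense(32, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # schizophrenia vs. healthy
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

X = np.random.randn(84, 256, 19).astype("float32")   # 45 + 39 subjects
y = np.random.randint(0, 2, size=84)
build_model().fit(X, y, epochs=1, batch_size=8, verbose=0)
```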
  2. Teo BG, Dhillon SK, Lim LH
    PLoS One, 2013;8(10):e77650.
    PMID: 24204903 DOI: 10.1371/journal.pone.0077650
    In this paper, a digital 3D model that allows visualisation in three dimensions and interactive manipulation is explored as a tool to help us understand the structural morphology and elucidate the functions of morphological structures of fragile microorganisms that defy live study. We developed a deformable generic 3D model of the haptoral anchor of dactylogyridean monogeneans that can subsequently be deformed into different desired anchor shapes using a direct manipulation deformation technique. We used point primitives to construct the rectangular building blocks of our deformable 3D model. Point primitives are manually marked on a 2D illustration of an anchor on Cartesian graph paper, and a set of Cartesian coordinates for each point primitive is manually extracted from the graph paper. A Python script is then written in Blender to construct 3D rectangular building blocks based on the Cartesian coordinates. The rectangular building blocks are stacked on top of, or beside, each other according to the Cartesian coordinates of their point primitives. More point primitives are added at the sites in the 3D model where more structural variation is likely to occur, in order to generate complex anchor structures. We used the Catmull-Clark subdivision surface modifier to smooth the surfaces and edges of the generic 3D model, giving a smoother and more natural 3D shape, and the antialiasing option to reduce the jagged edges of the 3D model. This deformable generic 3D model can be deformed into different desired 3D anchor shapes through the direct manipulation deformation technique by aligning the vertices (pilot points) of the newly developed deformable generic 3D model onto 2D illustrations of the desired shapes and moving the vertices until the desired 3D shapes are formed. In this generic 3D model, all the vertices present are deployed for displacement during deformation.
    Matched MeSH terms: Algorithms
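    A short Blender Python sketch of the construction step described above: cubes are placed at manually extracted point-primitive coordinates and smoothed with a Catmull-Clark subdivision surface modifier. The coordinates and block size are placeholders, and the script must be run inside Blender, where the bpy module is available.

```python
# Run inside Blender: place unit cubes at point-primitive coordinates and
# add a Catmull-Clark subdivision surface modifier to each block.
import bpy

point_primitives = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (2, 1, 1)]  # placeholders

for x, y, z in point_primitives:
    bpy.ops.mesh.primitive_cube_add(size=1.0, location=(x, y, z))
    block = bpy.context.active_object
    subsurf = block.modifiers.new(name="Subsurf", type='SUBSURF')
    subsurf.levels = 2            # Catmull-Clark smoothing in the viewport
    subsurf.render_levels = 2     # and at render time
```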
  3. Noor NM, Rijal OM, Yunus A, Abu-Bakar SA
    Comput Med Imaging Graph, 2010 Mar;34(2):160-6.
    PMID: 19758785 DOI: 10.1016/j.compmedimag.2009.08.005
    This paper presents a statistical method for the detection of lobar pneumonia using digitized chest X-ray films. Each region of interest was represented by a vector of wavelet texture measures, which was then multiplied by the orthogonal matrix Q(2). The first two elements of the transformed vectors were shown to have a bivariate normal distribution. Misclassification probabilities were estimated using probability ellipsoids and discriminant functions. This study recommends detecting pneumonia by constructing probability ellipsoids or discriminant functions using the maximum energy and maximum column sum energy texture measures, for which misclassification probabilities were less than 0.15.
    Matched MeSH terms: Algorithms
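    A small sketch of the implied decision rule: the first two transformed texture features of each class are modeled as bivariate normal, and a new film is assigned to the class with the higher Gaussian density (the level sets of this rule are the probability ellipsoids). The means, covariances, and data here are synthetic stand-ins.

```python
# Fit a bivariate normal per class on two texture features; classify by
# the larger density (equivalently, by the probability ellipsoids).
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
normal_feats = rng.multivariate_normal([2.0, 1.0], [[1.0, 0.2], [0.2, 0.5]], 50)
pneumonia_feats = rng.multivariate_normal([4.0, 2.5], [[0.8, 0.1], [0.1, 0.6]], 50)

def fit_gaussian(x):
    return multivariate_normal(mean=x.mean(axis=0), cov=np.cov(x, rowvar=False))

g_normal = fit_gaussian(normal_feats)
g_pneumonia = fit_gaussian(pneumonia_feats)

def classify(feature_pair):
    return "pneumonia" if g_pneumonia.pdf(feature_pair) > g_normal.pdf(feature_pair) \
        else "normal"

print(classify([3.9, 2.4]))
```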
  4. Amini A, Saboohi H, Wah TY, Herawan T
    ScientificWorldJournal, 2014;2014:926020.
    PMID: 25110753 DOI: 10.1155/2014/926020
    Data streams are continuously generated over time from Internet of Things (IoT) devices. The faster all of this data is analyzed, its hidden trends and patterns discovered, and new strategies created, the faster action can be taken, creating greater value for organizations. Density-based methods are a prominent class of data-stream clustering: they can detect clusters of arbitrary shape, handle outliers, and do not need the number of clusters in advance. A density-based clustering algorithm is therefore a proper choice for clustering IoT streams. Recently, several density-based algorithms have been proposed for clustering data streams. However, density-based clustering within limited time remains a challenging issue. In this paper, we propose a density-based clustering algorithm for IoT streams. The method has a fast processing time, making it applicable to real-time IoT applications. Experimental results show that the proposed approach obtains high-quality results with low computation time on real and synthetic datasets.
    Matched MeSH terms: Algorithms*
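    A minimal sketch of density-based clustering over a stream, re-running DBSCAN on a sliding window of recent points; this illustrates the class of method discussed in the abstract, not the paper's specific algorithm.

```python
# Re-run DBSCAN over a sliding window of recent stream points; label -1
# marks the newest point as an outlier.
from collections import deque

import numpy as np
from sklearn.cluster import DBSCAN

window = deque(maxlen=500)                  # only recent points are kept

def process_point(point, eps=0.3, min_samples=5):
    window.append(point)
    if len(window) < min_samples:
        return None
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(np.array(window))
    return int(labels[-1])                  # cluster id of the newest point

rng = np.random.default_rng(1)
for _ in range(100):
    cluster = process_point(rng.normal(size=2))
print("latest point cluster:", cluster)
```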
  5. Mostafa SA, Mustapha A, Mohammed MA, Ahmad MS, Mahmoud MA
    Int J Med Inform, 2018 04;112:173-184.
    PMID: 29500017 DOI: 10.1016/j.ijmedinf.2018.02.001
    Autonomous agents are widely used in many systems, such as ambient assisted-living systems, to perform tasks on behalf of humans. However, these systems usually operate in complex environments that entail uncertain, highly dynamic, or irregular workloads. In such environments, autonomous agents tend to make decisions that lead to undesirable outcomes. In this paper, we propose a fuzzy-logic-based adjustable autonomy (FLAA) model to manage the autonomy of multi-agent systems operating in complex environments. The model aims to facilitate the autonomy management of agents and help them make competent autonomous decisions. The FLAA model employs fuzzy logic to quantitatively measure and distribute autonomy among several agents based on their performance. We implement and test this model in the Automated Elderly Movements Monitoring (AEMM-Care) system, which uses agents to monitor the daily movement activities of elderly users and perform fall detection and prevention tasks in a complex environment. The test results show that the FLAA model improves the accuracy and performance of these agents in detecting and preventing falls.
    Matched MeSH terms: Algorithms
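    A toy sketch of the fuzzy-logic idea above: an agent's performance score is mapped to an autonomy level using hand-written triangular fuzzy sets with centroid defuzzification. The membership shapes are illustrative assumptions, not the FLAA model's actual rule base.

```python
# Triangular fuzzy sets map a performance score in [0, 1] to an autonomy
# level via product implication, max aggregation, centroid defuzzification.
import numpy as np

def tri(x, a, b, c):                         # triangular membership function
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

def autonomy_level(performance):
    levels = np.linspace(0.0, 1.0, 101)      # candidate autonomy values
    low = tri(performance, -0.1, 0.0, 0.5) * tri(levels, -0.1, 0.0, 0.5)
    med = tri(performance, 0.2, 0.5, 0.8) * tri(levels, 0.2, 0.5, 0.8)
    high = tri(performance, 0.5, 1.0, 1.1) * tri(levels, 0.5, 1.0, 1.1)
    weights = np.maximum.reduce([low, med, high])
    return float((weights * levels).sum() / (weights.sum() + 1e-9))

print(autonomy_level(0.9))                   # strong performer -> high autonomy
```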
  6. Hu Y, Loo CK
    ScientificWorldJournal, 2014;2014:240983.
    PMID: 24778580 DOI: 10.1155/2014/240983
    A novel quantum-inspired approach to decision making for intelligent agents is proposed, and a formal, generalized solution to the problem is given. Mathematically, the proposed model is capable of modeling higher-dimensional decision problems than previous research. Four experiments were conducted, and for each, both the empirical results and the proposed model's results are reported. The experiments showed that the results of the proposed model agree with the empirical results perfectly. The proposed model provides a new direction for researchers seeking to resolve the cognitive basis of intelligent agent design.
    Matched MeSH terms: Algorithms
  7. Kamel N, Yusoff MZ
    Conf Proc IEEE Eng Med Biol Soc, 2008.
    PMID: 19163891 DOI: 10.1109/IEMBS.2008.4650388
    A "single-trial" signal subspace approach for extracting visual evoked potential (VEP) from the ongoing 'colored' electroencephalogram (EEG) noise is proposed. The algorithm applies the generalized eigendecomposition on the covariance matrices of the VEP and noise to transform them jointly into diagonal matrices in order to avoid a pre-whitening stage. The proposed generalized subspace approach (GSA) decomposes the corrupted VEP space into a signal subspace and noise subspace. Enhancement is achieved by removing the noise subspace and estimating the clean VEPs only from the signal subspace. The validity and effectiveness of the proposed GSA scheme in estimating the latencies of P100's (used in objective assessment of visual pathways) are evaluated using real data collected from Selayang Hospital in Kuala Lumpur. The performance of GSA is compared with the recently proposed single-trial technique called the Third Order Correlation (TOC).
    Matched MeSH terms: Algorithms*
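    A sketch of the generalized-subspace idea in this entry: the signal and noise covariance matrices are jointly diagonalized by a generalized eigendecomposition (avoiding explicit pre-whitening), the noise subspace is discarded, and the signal is reconstructed. The covariances and the retained rank are illustrative.

```python
# Jointly diagonalize signal/noise covariances, keep the components with
# the largest signal-to-noise eigenvalues, reconstruct the clean signal.
import numpy as np
from scipy.linalg import eigh

def gsa_denoise(x, R_signal, R_noise, keep=3):
    eigvals, V = eigh(R_signal, R_noise)   # solves R_s v = lambda R_n v
    order = np.argsort(eigvals)[::-1]      # strongest SNR components first
    V = V[:, order]
    coeffs = V.T @ x                       # transform into the joint basis
    coeffs[keep:] = 0.0                    # discard the noise subspace
    return np.linalg.inv(V.T) @ coeffs     # reconstruct from signal subspace

rng = np.random.default_rng(0)
A, B = rng.normal(size=(8, 8)), rng.normal(size=(8, 8))
R_s = A @ A.T + 8 * np.eye(8)              # toy SPD covariance estimates
R_n = B @ B.T + np.eye(8)
print(gsa_denoise(rng.normal(size=8), R_s, R_n))
```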
  8. Othman RM, Deris S, Illias RM
    J Biomed Inform, 2008 Feb;41(1):65-81.
    PMID: 17681495
    A genetic similarity algorithm is introduced in this study to find groups of semantically similar Gene Ontology terms. The genetic similarity algorithm combines a semantic similarity measure with a parallel genetic algorithm. The semantic similarity measure is used to compute the strength of similarity between Gene Ontology terms, while the parallel genetic algorithm is employed to perform batch retrieval and to accelerate the search in the large search space of the Gene Ontology graph. The genetic similarity algorithm is implemented in a Gene Ontology browser named basic UTMGO to overcome the weaknesses of existing Gene Ontology browsers, which use a conventional approach based on keyword matching. To show the applicability of the basic UTMGO, we extend its structure to develop a Gene Ontology-based protein sequence annotation tool named extended UTMGO. The objective of the extended UTMGO is to provide a simple and practical tool capable of producing better results in a reasonable running time with low computing cost, specifically for offline usage. Computational results and comparisons with other related tools are presented to show the effectiveness of the proposed algorithm and tools.
    Matched MeSH terms: Algorithms*
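    A toy sketch of the combination described above: a simple genetic algorithm searches a miniature GO-like DAG for a group of terms most similar to a query, with a Jaccard index over ancestor sets standing in for the paper's semantic similarity measure; the parallelism of the real system is omitted.

```python
# Toy GA over a miniature GO-style DAG: find a group of terms most
# semantically similar to a query term. Similarity = Jaccard index of
# ancestor sets (a stand-in for the paper's measure).
import random

ancestors = {                        # term -> set of ancestor terms
    "GO:1": {"root"}, "GO:2": {"root"}, "GO:3": {"root", "GO:1"},
    "GO:4": {"root", "GO:1"}, "GO:5": {"root", "GO:2"}, "GO:6": {"root", "GO:2"},
}
terms = list(ancestors)

def sim(a, b):                       # similarity of two GO-like terms
    sa, sb = ancestors[a] | {a}, ancestors[b] | {b}
    return len(sa & sb) / len(sa | sb)

def fitness(group, query="GO:3"):
    return sum(sim(t, query) for t in group) / len(group)

def ga(pop_size=20, group_size=2, generations=30):
    pop = [random.sample(terms, group_size) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]           # truncation selection
        children = [random.sample(list(set(a + b)), group_size)
                    for a, b in zip(survivors, reversed(survivors))]
        pop = survivors + children
    return max(pop, key=fitness)

print(ga())                          # e.g. ['GO:3', 'GO:4']
```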
  9. Wan Alwi SR, Manan ZA, Samingin MH, Misran N
    J Environ Manage, 2008 Jul;88(2):219-52.
    PMID: 17449168
    Water pinch analysis (WPA) is a well-established tool for the design of a maximum water recovery (MWR) network. MWR, which is primarily concerned with water recovery and regeneration, only partly addresses the water minimization problem. Strictly speaking, WPA can only lead to maximum water recovery targets, as opposed to the minimum water targets widely claimed by researchers over the years. The minimum water targets can be achieved only when all water minimization options, including elimination, reduction, reuse/recycling, outsourcing, and regeneration, have been holistically applied. Even though WPA is well established for the synthesis of MWR networks, research towards holistic water minimization has lagged behind. This paper describes a new holistic framework for designing a cost-effective minimum water network (CEMWN) for industry and urban systems. The framework consists of five key steps: (1) specify the limiting water data; (2) determine MWR targets; (3) screen process changes using the water management hierarchy (WMH); (4) apply the Systematic Hierarchical Approach for Resilient Process Screening (SHARPS) strategy; and (5) design the water network. Three key contributions have emerged from this work. First is a hierarchical approach for the systematic screening of process changes guided by the WMH. Second is a set of four new heuristics for implementing process changes that consider the interactions among process change options as well as among equipment, and the implications of applying each process change on utility targets. Third is the SHARPS cost-screening technique to customize process changes and ultimately generate a minimum water utilization network that is cost-effective and affordable. The CEMWN holistic framework has been successfully implemented on semiconductor and mosque case studies and yielded results within the designer's payback period criterion.
    Matched MeSH terms: Algorithms
  10. Yap KS, Lim CP, Abidin IZ
    IEEE Trans Neural Netw, 2008 Sep;19(9):1641-6.
    PMID: 18779094 DOI: 10.1109/TNN.2008.2000992
    In this brief, a new neural network model called generalized adaptive resonance theory (GART) is introduced. GART is a hybrid model comprising a modified Gaussian adaptive resonance theory (MGA) and the generalized regression neural network (GRNN). It is an enhanced version of the GRNN that preserves the online learning properties of adaptive resonance theory (ART). A series of empirical studies assessing the effectiveness of GART in classification, regression, and time series prediction tasks is conducted. The results demonstrate that GART produces good performance compared with other methods, including the online sequential extreme learning machine (OSELM) and sequential learning radial basis function (RBF) neural network models.
    Matched MeSH terms: Algorithms*
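    A minimal sketch of the GRNN half of the hybrid described above: a generalized regression neural network is essentially Nadaraya-Watson kernel regression with a Gaussian kernel. The ART component is omitted and the smoothing parameter is illustrative.

```python
# GRNN prediction as Gaussian-kernel-weighted averaging of training targets.
import numpy as np

def grnn_predict(X_train, y_train, x, sigma=0.5):
    d2 = np.sum((X_train - x) ** 2, axis=1)        # squared distances
    w = np.exp(-d2 / (2.0 * sigma ** 2))           # Gaussian kernel weights
    return float(np.dot(w, y_train) / (np.sum(w) + 1e-12))

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
print(grnn_predict(X, y, np.array([1.0])))         # close to sin(1.0)
```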
  11. Zamli KZ, Din F, Ahmed BS, Bures M
    PLoS One, 2018;13(5):e0195675.
    PMID: 29771918 DOI: 10.1371/journal.pone.0195675
    The sine-cosine algorithm (SCA) is a new population-based meta-heuristic algorithm. In addition to exploiting sine and cosine functions to perform local and global searches (hence the name sine-cosine), the SCA introduces several random and adaptive parameters to facilitate the search process. Although it shows promising results, the search process of the SCA is vulnerable to local minima/maxima due to the adoption of a fixed switch probability and the bounded magnitude of the sine and cosine functions (from -1 to 1). In this paper, we propose a new hybrid Q-learning sine-cosine-based strategy, called the Q-learning sine-cosine algorithm (QLSCA). Within the QLSCA, we eliminate the switching probability. Instead, we rely on the Q-learning algorithm (based on the penalty and reward mechanism) to dynamically identify the best operation during runtime. Additionally, we integrate two new operations (Lévy flight motion and crossover) into the QLSCA to facilitate jumping out of local minima/maxima and enhance the solution diversity. To assess its performance, we adopt the QLSCA for the combinatorial test suite minimization problem. Experimental results reveal that the QLSCA is statistically superior with regard to test suite size reduction compared to recent state-of-the-art strategies, including the original SCA, the particle swarm test generator (PSTG), adaptive particle swarm optimization (APSO) and the cuckoo search strategy (CS) at the 95% confidence level. However, concerning the comparison with discrete particle swarm optimization (DPSO), there is no significant difference in performance at the 95% confidence level. On a positive note, the QLSCA statistically outperforms the DPSO in certain configurations at the 90% confidence level.
    Matched MeSH terms: Algorithms*
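    An illustrative sketch of the QLSCA mechanism: a tabular Q-learning agent chooses which operation (sine update, cosine update, Lévy flight, or crossover) to apply at each iteration and is rewarded when the best fitness improves. The objective function and all hyperparameters are assumptions for demonstration.

```python
# Single-state Q-learning picks among SCA-style operations; reward is
# +1 when the candidate improves the best-so-far, -1 otherwise.
import numpy as np

rng = np.random.default_rng(0)
OPS = ["sine", "cosine", "levy", "crossover"]
Q = np.zeros(len(OPS))                       # one-state Q-table

def sphere(x):                               # toy minimization objective
    return float(np.sum(x ** 2))

def apply_op(op, x, best, t, T):
    r1 = 2.0 * (1 - t / T)                   # amplitude decays over time
    r2, r3 = rng.uniform(0, 2 * np.pi), rng.uniform(0, 2)
    if op == "sine":
        return x + r1 * np.sin(r2) * np.abs(r3 * best - x)
    if op == "cosine":
        return x + r1 * np.cos(r2) * np.abs(r3 * best - x)
    if op == "levy":                         # heavy-tailed jump escapes local optima
        return x + 0.1 * rng.standard_cauchy(x.shape)
    mask = rng.random(x.shape) < 0.5         # crossover with the best solution
    return np.where(mask, best, x)

x = best = rng.uniform(-5, 5, size=5)
T, alpha, gamma, eps = 200, 0.1, 0.9, 0.2
for t in range(T):
    a = rng.integers(len(OPS)) if rng.random() < eps else int(np.argmax(Q))
    cand = apply_op(OPS[a], x, best, t, T)
    reward = 1.0 if sphere(cand) < sphere(best) else -1.0
    Q[a] += alpha * (reward + gamma * Q.max() - Q[a])   # Q-learning update
    x = cand
    if sphere(cand) < sphere(best):
        best = cand
print(round(sphere(best), 4))
```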
  12. Aghabozorgi S, Ying Wah T, Herawan T, Jalab HA, Shaygan MA, Jalali A
    ScientificWorldJournal, 2014;2014:562194.
    PMID: 24982966 DOI: 10.1155/2014/562194
    Time series clustering is an important solution to various problems in numerous fields of research, including business, medical science, and finance. However, conventional clustering algorithms are not practical for time series data because they are essentially designed for static data. This impracticality results in poor clustering accuracy in several systems. In this paper, a new hybrid clustering algorithm is proposed based on the similarity in shape of time series data. Time series data are first grouped as subclusters based on similarity in time. The subclusters are then merged using the k-Medoids algorithm based on similarity in shape. This model has two contributions: (1) it is more accurate than other conventional and hybrid approaches and (2) it determines the similarity in shape among time series data with low complexity. To evaluate the accuracy of the proposed model, the model is tested extensively using synthetic and real-world time series datasets.
    Matched MeSH terms: Algorithms*
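    A sketch of the merge step described above: k-Medoids clustering over a shape-based distance, with correlation distance standing in for the paper's shape similarity measure and a simple PAM-style update loop.

```python
# k-Medoids (PAM-style) over a correlation-based "shape" distance.
import numpy as np

def shape_dist(a, b):                  # 1 - Pearson r: ignores offset/scale
    return 1.0 - np.corrcoef(a, b)[0, 1]

def k_medoids(series, k=2, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    n = len(series)
    D = np.array([[shape_dist(a, b) for b in series] for a in series])
    medoids = list(rng.choice(n, size=k, replace=False))
    for _ in range(iters):
        labels = np.argmin(D[:, medoids], axis=1)     # assign to nearest medoid
        for c in range(k):
            members = np.where(labels == c)[0]
            if len(members):                          # medoid = min total distance
                costs = D[np.ix_(members, members)].sum(axis=1)
                medoids[c] = int(members[np.argmin(costs)])
    return labels

rng = np.random.default_rng(1)
t = np.linspace(0, 2 * np.pi, 50)
series = [np.sin(t) + 0.1 * rng.normal(size=50) for _ in range(5)] + \
         [np.cos(t) + 0.1 * rng.normal(size=50) for _ in range(5)]
print(k_medoids(series))               # sine-shaped vs. cosine-shaped groups
```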
  13. Wangsoh N, Watthayu W, Sukawat D
    Sains Malaysiana, 2017;46:2541-2547.
    The hybrid climate model (HCM) is a novel model based on the combination of a self-organizing map (SOM) and the analog method (AM). Its main purpose is to improve the accuracy of rainfall forecasting. In the HCM, the SOM algorithm maps high-dimensional input data onto several disjoint low-dimensional clusters in which similar inputs are grouped, while the AM searches for a past day whose conditions are similar to those of the forecast day. The analog day is then mapped to a SOM cluster to estimate rainfall. In this study, geopotential height at 850 hPa from the Climate Forecast System Reanalysis (CFSR) was used as the training data, together with complete rainfall records from 30 meteorological stations of the Thai Meteorological Department (TMD). To assess the capability of rainfall forecasting, three different performance measures were evaluated. The experimental results showed that the performance of the HCM is better than that of the traditional AM, illustrating that the HCM can forecast rainfall proficiently.
    Matched MeSH terms: Algorithms
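    A toy sketch of the HCM combination: a small self-organizing map clusters past days, and the analog step returns the rainfall of the most similar past day within the forecast day's cluster. The data shapes are illustrative; the real inputs are 850-hPa geopotential height fields.

```python
# A 1-D SOM clusters past days; forecast = rainfall of the closest analog
# day inside the forecast day's SOM cluster.
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, n_nodes=4, iters=500, lr=0.5):
    weights = data[rng.choice(len(data), n_nodes, replace=False)].copy()
    for t in range(iters):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best-matching unit
        weights[bmu] += lr * (1 - t / iters) * (x - weights[bmu])
    return weights

past_days = rng.normal(size=(300, 10))        # past atmospheric patterns
rainfall = rng.gamma(2.0, 3.0, size=300)      # rainfall observed on each day
som = train_som(past_days)
day_cluster = np.argmin(((past_days[:, None, :] - som[None]) ** 2).sum(-1), axis=1)

def forecast(today):
    cluster = int(np.argmin(((som - today) ** 2).sum(axis=1)))
    members = np.where(day_cluster == cluster)[0]
    if len(members) == 0:                     # fall back to all past days
        members = np.arange(len(past_days))
    analog = members[np.argmin(((past_days[members] - today) ** 2).sum(axis=1))]
    return rainfall[analog]                   # analog day's rainfall

print(forecast(rng.normal(size=10)))
```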
  14. Maktabdar Oghaz M, Maarof MA, Zainal A, Rohani MF, Yaghoubyan SH
    PLoS One, 2015;10(8):e0134828.
    PMID: 26267377 DOI: 10.1371/journal.pone.0134828
    Color is one of the most prominent features of an image and is used in many skin and face detection applications. Color space transformation is widely used by researchers to improve face and skin detection performance. Despite substantial research efforts in this area, choosing a proper color space, in terms of skin and face classification performance, that can address issues such as illumination variations, diverse camera characteristics, and the diversity of skin color tones has remained an open problem. This research proposes a new three-dimensional hybrid color space, termed SKN, derived by employing a Genetic Algorithm heuristic and Principal Component Analysis to find the optimal representation of human skin color in over seventeen existing color spaces. The Genetic Algorithm heuristic is used to find the optimal color component combination in terms of skin detection accuracy, while Principal Component Analysis projects the optimal Genetic Algorithm solution onto a less complex dimension. Pixel-wise skin detection was used to evaluate the performance of the proposed color space. We employed four classifiers, Random Forest, Naïve Bayes, Support Vector Machine, and Multilayer Perceptron, to generate the human skin color predictive model. The proposed color space was compared with existing color spaces and shows superior results in terms of pixel-wise skin detection accuracy. Experimental results show that, using the Random Forest classifier, the proposed SKN color space obtained an average F-score and True Positive Rate of 0.953 and a False Positive Rate of 0.0482, outperforming the existing color spaces. The results also indicate that, among the classifiers used in this study, Random Forest is the most suitable for pixel-wise skin detection applications.
    Matched MeSH terms: Algorithms
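    An illustrative sketch of the two-stage idea above: a tiny genetic algorithm selects a subset of colour components, scoring candidates by cross-validated skin/non-skin accuracy, and PCA projects the winning subset to three dimensions. The synthetic "pixel" data and GA settings are stand-ins for the paper's setup.

```python
# Tiny GA selects 6 colour components by cross-validated accuracy; PCA
# then projects the winner to a 3-D hybrid space. Data are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
pool = 17 * 3                                # components from ~17 colour spaces
X = rng.normal(size=(300, pool))
y = (X[:, 0] + 0.5 * X[:, 7] - X[:, 20] > 0).astype(int)   # toy skin labels

def fitness(subset):
    clf = RandomForestClassifier(n_estimators=25, random_state=0)
    return cross_val_score(clf, X[:, subset], y, cv=3).mean()

pop = [rng.choice(pool, size=6, replace=False) for _ in range(10)]
for _ in range(8):                           # selection + point mutation
    pop.sort(key=fitness, reverse=True)
    parents = pop[:5]
    children = []
    for p in parents:
        child = p.copy()
        child[rng.integers(6)] = rng.integers(pool)
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
skn = PCA(n_components=3).fit_transform(X[:, best])   # 3-D hybrid space
print(sorted(best.tolist()), skn.shape)
```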
  15. Tayan O, Kabir MN, Alginahi YM
    ScientificWorldJournal, 2014;2014:514652.
    PMID: 25254247 DOI: 10.1155/2014/514652
    This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues have been largely addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover media to achieve their goal. While many such complex schemes with resource redundancies are sufficient for offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents, in order to achieve content originality and integrity verification without physically modifying the cover text in any way. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate and non-deliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints.
    Matched MeSH terms: Algorithms*
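    A minimal sketch of the zero-watermarking idea: the watermark is derived from the cover text (here, a keyed hash of its normalized content) and registered separately, so the text itself is never modified; verification recomputes and compares. The hashing recipe is an assumption, not the paper's exact scheme.

```python
# Derive a watermark from the text without embedding anything in it;
# verification detects any change to the registered content.
import hashlib
import hmac

SECRET_KEY = b"registration-authority-key"   # hypothetical shared secret

def generate_watermark(text: str) -> str:
    normalized = " ".join(text.split())      # whitespace-insensitive content
    return hmac.new(SECRET_KEY, normalized.encode("utf-8"),
                    hashlib.sha256).hexdigest()

def verify(text: str, registered_watermark: str) -> bool:
    return hmac.compare_digest(generate_watermark(text), registered_watermark)

original = "Sensitive text must remain unmodified in any way."
wm = generate_watermark(original)            # lodged with a trusted authority
print(verify(original, wm))                  # True: content intact
print(verify(original.replace("must", "may"), wm))   # False: tampering found
```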
  16. Ng KH, Ho CK, Phon-Amnuaisuk S
    PLoS One, 2012;7(10):e47216.
    PMID: 23071763 DOI: 10.1371/journal.pone.0047216
    Clustering is a key step in the processing of Expressed Sequence Tags (ESTs). The primary goal of clustering is to put ESTs from the same transcript of a single gene into a unique cluster. Recent EST clustering algorithms mostly adopt alignment-free distance measures, which tend to yield acceptable clustering accuracies within reasonable computational time. Although these clustering methods work satisfactorily on a majority of EST datasets, they share a common weakness: they are prone to deliver unsatisfactory clustering results when dealing with ESTs from genes of the same family. The root cause is that the distance measures they apply are not sensitive enough to separate these closely related genes.
    Matched MeSH terms: Algorithms
  17. Khan MNA, Yunus RM
    Nutrition, 2023 Apr;108:111947.
    PMID: 36641887 DOI: 10.1016/j.nut.2022.111947
    BACKGROUND: The proper intake of nutrients is essential to the growth and maturation of youngsters. In sub-Saharan Africa, 1 in 7 children dies before age 5 y, and more than a third of these deaths are attributed to malnutrition. The main purpose of this study was to develop a majority voting-based hybrid ensemble (MVBHE) learning model to accelerate the prediction accuracy of malnutrition data of under-five children in sub-Saharan Africa.

    METHODS: This study used available under-five nutritional secondary data from the Demographic and Health Surveys performed in sub-Saharan African countries. The research used bagging, boosting, and voting algorithms, such as random forest, decision tree, eXtreme Gradient Boosting, and k-nearest neighbors machine learning methods, to generate the MVBHE model.

    RESULTS: We evaluated the model performances in contrast to each other using different measures, including accuracy, precision, recall, and the F1 score. The results of the experiment showed that the MVBHE model (96%) was better at predicting malnutrition than the random forest (81%), decision tree (60%), eXtreme Gradient Boosting (79%), and k-nearest neighbors (74%).

    CONCLUSIONS: The random forest algorithm demonstrated the highest prediction accuracy (81%) compared with the decision tree, eXtreme Gradient Boosting, and k-nearest neighbors algorithms. The accuracy was then enhanced to 96% using the MVBHE model. The MVBHE model is recommended by the present study as the best way to predict malnutrition in under-five children.

    Matched MeSH terms: Algorithms
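    A sketch of a majority-voting hybrid ensemble in the spirit of MVBHE, combining random forest, decision tree, gradient boosting (standing in for eXtreme Gradient Boosting), and k-nearest neighbors with hard voting; synthetic data replaces the DHS survey data.

```python
# Hard majority voting over four base classifiers on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import (GradientBoostingClassifier, RandomForestClassifier,
                              VotingClassifier)
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=15, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("dt", DecisionTreeClassifier(random_state=0)),
                ("gb", GradientBoostingClassifier(random_state=0)),
                ("knn", KNeighborsClassifier())],
    voting="hard",                      # majority vote across the four models
)
ensemble.fit(X_tr, y_tr)
print("ensemble accuracy:", ensemble.score(X_te, y_te))
```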
  18. Liu J, Yinchai W, Siong TC, Li X, Zhao L, Wei F
    Sci Rep, 2022 Dec 01;12(1):20770.
    PMID: 36456582 DOI: 10.1038/s41598-022-23765-x
    To generate an interpretable deep architecture for identifying intrusion patterns, this study proposes an approach that combines ANFIS (Adaptive Network-based Fuzzy Inference System) and DT (Decision Tree) models for interpreting the deep patterns of intrusion detection. Meanwhile, to improve the efficiency of training and prediction, Pearson correlation analysis, the standard deviation, and a new adaptive k-means method are used to select attributes and make fuzzy interval decisions. The proposed algorithm was trained, validated, and tested on the NSL-KDD (National Security Lab - Knowledge Discovery and Data Mining) dataset. Using 22 attributes highly related to the target, the proposed method achieves a 99.86% detection rate and a 0.14% false alarm rate on the KDDTrain+ dataset, and a 77.46% detection rate on the KDDTest+ dataset, which is better than many classifiers. In addition, the interpretable model can help demonstrate the complex and overlapping patterns of intrusions and analyze the patterns of various intrusions.
    Matched MeSH terms: Algorithms
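    A sketch of the attribute-selection step mentioned above: features are ranked by absolute Pearson correlation with the target and the top k are kept (the paper retains 22 NSL-KDD attributes). Synthetic data is used here.

```python
# Keep the k features most correlated (in absolute value) with the target.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 40))
y = (X[:, 3] - 2 * X[:, 10] + rng.normal(scale=0.5, size=500) > 0).astype(float)

def top_k_by_pearson(X, y, k=22):
    corrs = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    return np.argsort(np.abs(corrs))[::-1][:k]      # indices of best attributes

selected = top_k_by_pearson(X, y)
print(selected[:5])   # features 3 and 10 should rank near the top
```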
  19. Parida S, Dehuri S, Cho SB, Cacha LA, Poznanski RR
    J Integr Neurosci, 2015 Sep;14(3):355-68.
    PMID: 26455882 DOI: 10.1142/S0219635215500223
    Functional magnetic resonance imaging (fMRI) makes it possible to detect brain activity in order to elucidate cognitive states. The complex nature of fMRI data requires an understanding of the analyses applied, so as to produce possible avenues for developing models of cognitive-state classification and for improving brain activity prediction. While many models for the classification of fMRI data have been developed, in this paper we present a novel hybrid technique, combining the best attributes of genetic algorithms (GAs) and an ensemble decision tree technique, that consistently outperforms the other methods being used for cognitive-state classification. Specifically, this paper illustrates the combined use of a decision-tree ensemble and GAs for feature selection through an extensive simulation study, and discusses the classification performance with respect to fMRI data. We show that our proposed method significantly reduces the number of features while holding a clear edge in classification accuracy over an ensemble of decision trees alone.
    Matched MeSH terms: Algorithms
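    A sketch of the GA-plus-ensemble combination: a bit-string genetic algorithm selects features, with the cross-validated accuracy of a decision-tree ensemble (a random forest here) as the fitness, lightly penalized by feature count. Dimensions and GA settings are illustrative stand-ins for real fMRI data.

```python
# Bit-string GA for feature selection; fitness = CV accuracy of a
# decision-tree ensemble minus a small penalty per selected feature.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 60))                       # subjects x features
y = (X[:, 5] + X[:, 17] > 0).astype(int)             # synthetic cognitive state

def fitness(mask):
    if not mask.any():
        return 0.0
    clf = RandomForestClassifier(n_estimators=40, random_state=0)
    acc = cross_val_score(clf, X[:, mask], y, cv=3).mean()
    return acc - 0.001 * mask.sum()                  # reward smaller feature sets

pop = rng.random((16, X.shape[1])) < 0.2             # initial random masks
for _ in range(10):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[::-1][:8]]      # truncation selection
    cut = X.shape[1] // 2
    children = np.vstack([np.concatenate([a[:cut], b[cut:]])[None]
                          for a, b in zip(parents, np.roll(parents, 1, axis=0))])
    flip = rng.random(children.shape) < 0.02         # mutation: bit flips
    pop = np.vstack([parents, children ^ flip])

best = pop[np.argmax([fitness(m) for m in pop])]
print("features kept:", int(best.sum()))
```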
  20. Biswas K, Nazir A, Rahman MT, Khandaker MU, Idris AM, Islam J, et al.
    PLoS One, 2022;17(1):e0261427.
    PMID: 35085239 DOI: 10.1371/journal.pone.0261427
    Cost and safety are critical factors in the oil and gas industry for optimizing wellbore trajectory, which is a constrained, nonlinear optimization problem. In this work, the wellbore trajectory is optimized with respect to true measured depth, well profile energy, and torque. Numerous metaheuristic algorithms have been employed to optimize these objectives by tuning 17 constrained variables, with notable drawbacks including reduced exploitation/exploration capability, trapping in local optima, non-uniform distribution of non-dominated solutions, and the inability to track isolated minima. The purpose of this work is to propose a modified multi-objective cellular spotted hyena algorithm (MOCSHOPSO) for optimizing true measured depth, well profile energy, and torque. To overcome the aforementioned difficulties, the modification incorporates cellular automata (CA) and particle swarm optimization (PSO): adding CA enhances the SHO's exploration phase, and the SHO's hunting mechanism is modified with PSO's velocity update property. Several geophysical and operational constraints were applied during trajectory optimization, and data were collected from the Gulf of Suez oil field. The proposed algorithm was compared with standard methods (MOCPSO, MOSHO, MOCGWO) and showed significant improvements in terms of a better distribution of non-dominated solutions, better searching capability, fewer isolated minima, and a better Pareto-optimal front. These improvements were validated using statistical measures such as IGD, MS, SP, and ER: the proposed algorithm obtained the lowest values of IGD, SP, and ER and the highest values of MS. Finally, an adaptive neighbourhood mechanism is proposed, which showed better performance than fixed neighbourhood topologies such as L5, L9, C9, C13, C21, and C25. Hopefully, this newly proposed modified algorithm will pave the way for better wellbore trajectory optimization.
    Matched MeSH terms: Algorithms