
  1. Ahmad M, Jung LT, Bhuiyan AA
    Comput Methods Programs Biomed, 2017 Oct;149:11-17.
    PMID: 28802326 DOI: 10.1016/j.cmpb.2017.06.021
    BACKGROUND AND OBJECTIVE: Digital signal processing techniques commonly employ fixed-length window filters to process signal contents. DNA signals differ in character from common digital signals because their content is a sequence of nucleotides, which carry genetic code context and exhibit fuzzy behaviour owing to their particular structure and order in the DNA strand. Applying conventional fixed-length window filters to DNA signal processing produces spectral leakage and hence introduces signal noise. A biologically context-aware adaptive window filter is therefore required to process DNA signals.

    METHODS: This paper introduces a biologically inspired fuzzy adaptive window median filter (FAWMF) that computes the fuzzy membership strength of the nucleotides in each window slide and filters them by median filtering with a combination of s-shaped and z-shaped membership functions. Since coding regions exhibit 3-base periodicity caused by an unbalanced nucleotide distribution, which produces a relatively high bias in nucleotide usage, this fundamental characteristic is exploited in FAWMF to suppress signal noise.

    RESULTS: Along with the adaptive response of FAWMF, a strong correlation between the median nucleotides and the Π-shaped filter was observed, which produced enhanced discrimination between coding and non-coding regions compared with fixed-length conventional window filters. The proposed FAWMF achieves a significant improvement in coding-region identification, of 40% to 125%, compared with other conventional window filters tested over more than 250 benchmarked and randomly selected DNA datasets from different organisms.

    CONCLUSION: This study shows that conventional fixed-length window filters applied to DNA signals do not achieve significant results, since the nucleotides carry genetic code context. The proposed FAWMF algorithm is adaptive and significantly outperforms them in processing DNA signal content. Applied to a variety of DNA datasets, the algorithm produced noteworthy discrimination between coding and non-coding regions compared with fixed-length conventional window filters.

    Matched MeSH terms: Algorithms
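    For orientation, the sketch below shows the kind of fixed-length window median filtering over DNA binary indicator sequences that the paper argues against and improves upon; it is not the authors' FAWMF. The example strand, window length, and helper names are illustrative assumptions.

```python
# A plain fixed-window median filter over DNA indicator sequences, the
# conventional baseline the adaptive FAWMF is compared against in the paper.
import numpy as np

def indicator_sequences(dna):
    """Map a DNA string to four 0/1 indicator sequences (A, C, G, T)."""
    dna = dna.upper()
    return {b: np.array([1 if ch == b else 0 for ch in dna]) for b in "ACGT"}

def sliding_median(signal, window=9):
    """Median-filter a 1-D signal with a fixed-length window."""
    half = window // 2
    padded = np.pad(signal.astype(float), half, mode="edge")
    return np.array([np.median(padded[i:i + window]) for i in range(len(signal))])

dna = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"   # arbitrary example strand
for base, seq in indicator_sequences(dna).items():
    print(base, np.round(sliding_median(seq, window=9), 2))
```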
  2. Mohd Faizal AS, Hon WY, Thevarajah TM, Khor SM, Chang SW
    Med Biol Eng Comput, 2023 Oct;61(10):2527-2541.
    PMID: 37199891 DOI: 10.1007/s11517-023-02841-y
    Acute myocardial infarction (AMI), or heart attack, is a significant global health threat and one of the leading causes of death. The evolution of machine learning has greatly revamped the risk stratification and death prediction of AMI. In this study, an integrated feature selection and machine learning approach was used to identify potential biomarkers for early detection and treatment of AMI. First, feature selection was conducted and evaluated before all classification tasks with machine learning. Full classification models (using all 62 features) and reduced classification models (using various feature selection methods ranging from 5 to 30 features) were built and evaluated using six machine learning classification algorithms. The results showed that the reduced models generally performed better than the full models (mean AUPRC via RF: 0.8044): the mean AUPRC via the random forest (RF) algorithm ranged from 0.8048 to 0.8260 for the recursive feature elimination (RFE) method and from 0.8301 to 0.8505 for the random forest importance (RFI) method. The most notable finding of this study was the identification of a five-feature model, comprising cardiac troponin I, HDL cholesterol, HbA1c, anion gap, and albumin, which achieved results (mean AUPRC via RF: 0.8462) comparable to those of models containing more features. These five features have been shown in previous studies to be significant risk factors for AMI or cardiovascular disease and could be used as potential biomarkers to predict the prognosis of AMI patients. From the medical point of view, using fewer features for diagnosis or prognosis could reduce patient cost and time, since fewer clinical and pathological tests are needed.
    Matched MeSH terms: Algorithms*
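    A hedged sketch of the general workflow described above, recursive feature elimination wrapped around a random forest and scored by AUPRC (average precision), is given below. The clinical dataset is not public, so the synthetic data, feature counts, and estimator settings are stand-in assumptions.

```python
# Recursive feature elimination with a random forest, evaluated by mean AUPRC.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the 62-feature clinical dataset.
X, y = make_classification(n_samples=500, n_features=62, n_informative=10,
                           random_state=0)

for n_features in (5, 10, 20, 30):
    selector = RFE(RandomForestClassifier(n_estimators=200, random_state=0),
                   n_features_to_select=n_features)
    X_reduced = selector.fit_transform(X, y)
    auprc = cross_val_score(RandomForestClassifier(n_estimators=200, random_state=0),
                            X_reduced, y, cv=5, scoring="average_precision")
    print(f"{n_features:>2} features: mean AUPRC = {auprc.mean():.4f}")
```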
  3. Soleymani A, Nordin MJ, Sundararajan E
    ScientificWorldJournal, 2014;2014:536930.
    PMID: 25258724 DOI: 10.1155/2014/536930
    The rapid evolution of imaging and communication technologies has transformed images into a widespread data type. Different types of data, such as personal medical information, official correspondence, or governmental and military documents, are saved and transmitted in the form of images over public networks. Hence, a fast and secure cryptosystem is needed for high-resolution images. In this paper, a novel encryption scheme is presented for securing images based on the Arnold cat and Henon chaotic maps. The scheme uses the Arnold cat map for bit- and pixel-level permutations on plain and secret images, while the Henon map creates the secret images and the specific parameters for the permutations. Both the encryption and decryption processes are explained, formulated, and graphically presented. The results of a security analysis of five different images demonstrate the strength of the proposed cryptosystem against statistical, brute-force and differential attacks. The evaluated running times for both the encryption and decryption processes indicate that the cryptosystem can work effectively in real-time applications.
    Matched MeSH terms: Algorithms*
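    The toy sketch below illustrates only the two chaotic maps named in the abstract, an Arnold cat map pixel permutation and a Henon map keystream, not the paper's full bit- and pixel-level cryptosystem. The example image, initial conditions, and keystream construction are illustrative assumptions.

```python
# Arnold cat map permutation plus a Henon map keystream, combined by XOR.
import numpy as np

def arnold_cat(image, iterations=1):
    """Permute the pixels of a square image with the Arnold cat map."""
    n = image.shape[0]
    result = image.copy()
    for _ in range(iterations):
        shuffled = np.empty_like(result)
        for x in range(n):
            for y in range(n):
                shuffled[(x + y) % n, (x + 2 * y) % n] = result[x, y]
        result = shuffled
    return result

def henon_keystream(length, x0=0.1, y0=0.3, a=1.4, b=0.3):
    """Generate a byte keystream from iterates of the Henon map."""
    x, y = x0, y0
    stream = np.empty(length, dtype=np.uint8)
    for i in range(length):
        x, y = 1 - a * x * x + y, b * x
        stream[i] = int(abs(x) * 1e6) % 256
    return stream

img = np.arange(64, dtype=np.uint8).reshape(8, 8)      # tiny example "image"
cipher = arnold_cat(img, iterations=3) ^ henon_keystream(img.size).reshape(img.shape)
print(cipher)
```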
  4. Noor Rodi NS, Malek MA, Ismail AR, Ting SC, Tang CW
    Water Sci Technol, 2014;70(10):1641-7.
    PMID: 25429452 DOI: 10.2166/wst.2014.420
    This study applies the clonal selection algorithm (CSA) of an artificial immune system (AIS) as an alternative method for predicting future rainfall data. Stochastic and artificial neural network techniques are commonly used in hydrology; in this study, however, a novel technique for forecasting rainfall was established. Results from this study show that the theory of biological immune systems can be applied to time series data, since biological immune systems are nonlinear and chaotic in nature, similar to daily rainfall data. The proposed CSA was able to predict the daily rainfall data with an accuracy of 90% during the model training stage. In the testing stage, the agreement between the actual and the generated data was within the range of 75% to 92%. Thus, the CSA approach offers a new method for rainfall data prediction.
    Matched MeSH terms: Algorithms*
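    For readers unfamiliar with the method, a generic clonal selection algorithm skeleton for a continuous minimisation problem is sketched below. The objective, population sizes, and mutation scales are placeholder assumptions; the paper's rainfall-prediction formulation is not reproduced.

```python
# Generic clonal selection algorithm (CSA) skeleton: select the best
# antibodies, clone and hypermutate them, then reselect and add fresh recruits.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    return float(np.sum(x ** 2))            # placeholder fitness (lower is better)

def clonal_selection(dim=5, pop_size=20, n_select=5, n_clones=4,
                     generations=100, bounds=(-5.0, 5.0)):
    low, high = bounds
    population = rng.uniform(low, high, size=(pop_size, dim))
    for _ in range(generations):
        fitness = np.array([objective(p) for p in population])
        best = population[np.argsort(fitness)[:n_select]]
        clones = []
        for rank, antibody in enumerate(best):
            scale = 0.1 * (rank + 1)         # better antibodies mutate less
            for _ in range(n_clones):
                clones.append(np.clip(antibody + rng.normal(0, scale, dim),
                                      low, high))
        pool = np.vstack([population, np.array(clones)])
        pool = pool[np.argsort([objective(p) for p in pool])]
        # Keep the best and replace the tail with random newcomers (diversity).
        population = np.vstack([pool[:pop_size - 2],
                                rng.uniform(low, high, size=(2, dim))])
    return population[0]

print(np.round(clonal_selection(), 4))
```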
  5. Chun TS, Malek MA, Ismail AR
    Water Sci Technol, 2015;71(4):524-8.
    PMID: 25746643 DOI: 10.2166/wst.2014.451
    The development of effluent removal prediction is crucial in providing a planning tool necessary for the future development and construction of a septic sludge treatment plant (SSTP), especially in developing countries. In order to investigate the expected functionality against the required standard, the effluent quality of an SSTP, namely biological oxygen demand, chemical oxygen demand and total suspended solids, was modelled using an artificial intelligence approach. In this paper, we adopt the clonal selection algorithm (CSA) to set up a prediction model, with a well-established method, the least-squares support vector machine (LS-SVM), as a baseline model. The test results of the case study showed that the CSA-based SSTP model predicted well and provided performance as satisfactory as that of the LS-SVM model. The CSA approach requires fewer control and training parameters for model simulation than the LS-SVM approach. The ability of the CSA approach to handle limited data samples, non-linear sample functions and multidimensional pattern recognition makes it a powerful tool for modelling the prediction of effluent removals in an SSTP.
    Matched MeSH terms: Algorithms*
  6. Nordin N, Zainol Z, Mohd Noor MH, Lai Fong C
    Health Informatics J, 2021 Mar 23;27(1):1460458221989395.
    PMID: 33745355 DOI: 10.1177/1460458221989395
    Current suicide risk assessments for predicting suicide attempts are time consuming, of low predictive value and of inadequate reliability. This paper aims to develop a predictive model for suicide attempts among patients with depression using machine learning algorithms, and presents a comparative study of single predictive models against ensemble predictive models for differentiating depressed patients with suicide attempts from non-attempters. We applied and trained eight different machine learning algorithms using a dataset of 75 patients diagnosed with a depressive disorder. Recursive feature elimination with three-fold cross-validation was used to reduce the features. The ensemble predictive models outperformed the single predictive models, with voting and bagging achieving the highest accuracy of 92% compared with the other machine learning algorithms. Our findings indicate that history of suicide attempt, religion, race, suicide ideation and severity of clinical depression are useful factors for the prediction of suicide attempts.
    Matched MeSH terms: Algorithms
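    The sketch below mirrors the comparison described above, single classifiers versus voting and bagging ensembles under cross-validation, on a synthetic stand-in dataset, since the clinical data are not public. The chosen base learners and ensemble settings are assumptions, not the paper's exact configuration.

```python
# Single models versus voting and bagging ensembles, scored by accuracy.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=75, n_features=20, random_state=1)

models = {
    "logistic": LogisticRegression(max_iter=1000),
    "svm": SVC(probability=True),
    "tree": DecisionTreeClassifier(random_state=1),
    "voting": VotingClassifier([("lr", LogisticRegression(max_iter=1000)),
                                ("svm", SVC(probability=True)),
                                ("dt", DecisionTreeClassifier(random_state=1))],
                               voting="soft"),
    "bagging": BaggingClassifier(DecisionTreeClassifier(random_state=1),
                                 n_estimators=50, random_state=1),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=3, scoring="accuracy")
    print(f"{name:>8}: mean accuracy = {scores.mean():.3f}")
```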
  7. Palaniappan R, Sundaraj K, Sundaraj S
    BMC Bioinformatics, 2014;15:223.
    PMID: 24970564 DOI: 10.1186/1471-2105-15-223
    Pulmonary acoustic parameters extracted from recorded respiratory sounds provide valuable information for the detection of respiratory pathologies. The automated analysis of pulmonary acoustic signals can serve as a differential diagnosis tool for medical professionals, a learning tool for medical students, and a self-management tool for patients. In this context, we evaluate and compare the performance of the support vector machine (SVM) and K-nearest neighbour (K-nn) classifiers in diagnosing respiratory pathologies using respiratory sounds from the R.A.L.E database.
    Matched MeSH terms: Algorithms*
  8. Hong, Choon Ong, Tilahun, Surafel Luleseged, Tang, Suey Shya
    MyJurnal
    Many studies have been carried out using different metaheuristic algorithms on optimisation problems in various fields such as engineering design, economics and route planning. In the real world, resources and time are scarce, so the goal of optimisation algorithms is to make the best use of the available resources. Many metaheuristic algorithms are available. The firefly algorithm is one of the more recent metaheuristic algorithms and is used in many applications; it has also been modified and hybridised to improve its performance. In this paper, we compare the Standard Firefly Algorithm, the Elitist Firefly Algorithm (also called the Modified Firefly Algorithm) and the Chaotic Firefly Algorithm, which embeds chaotic maps in the Standard Firefly Algorithm. The Modified Firefly Algorithm differs from the Standard Firefly Algorithm in that the global optimum solution at a particular iteration does not move randomly but in a direction chosen from randomly generated directions that can improve its performance; if none of these directions improves its performance, the solution is not updated. The Chaotic Firefly Algorithm, on the other hand, tunes the parameters of the algorithm to increase global search mobility, i.e. to improve the attractiveness of the fireflies. In our study, we found that the Chaotic Firefly Algorithms using three different chaotic maps do not perform as well as the Modified Firefly Algorithm; however, at least one or two of the Chaotic Firefly Algorithms outperform the Standard Firefly Algorithm under the given accuracy and efficiency tests.
    Matched MeSH terms: Algorithms
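    As a baseline for the variants compared in this paper, a minimal Standard Firefly Algorithm for a continuous minimisation problem is sketched below. The objective function and parameter values are illustrative assumptions; the elitist and chaotic modifications are not implemented here.

```python
# Standard firefly algorithm: dimmer fireflies move toward brighter ones with
# attractiveness beta0 * exp(-gamma * r^2) plus a small random step.
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    return float(np.sum(x ** 2))             # placeholder objective (minimise)

def firefly(dim=2, n=15, generations=100, alpha=0.2, beta0=1.0, gamma=1.0):
    pos = rng.uniform(-5, 5, size=(n, dim))
    light = np.array([sphere(p) for p in pos])        # lower = brighter
    for _ in range(generations):
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:               # j is brighter, so i moves
                    r2 = np.sum((pos[i] - pos[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    pos[i] += beta * (pos[j] - pos[i]) + alpha * rng.uniform(-0.5, 0.5, dim)
                    light[i] = sphere(pos[i])
    best = int(np.argmin(light))
    return pos[best], light[best]

print(firefly())
```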
  9. Chew KM, Seman N, Sudirman R, Yong CY
    Biomed Mater Eng, 2014;24(6):2161-7.
    PMID: 25226914 DOI: 10.3233/BME-141027
    The development of a human-like brain phantom is important for data acquisition in microwave imaging. The characteristics of the phantom should be based on the dielectric properties of the real human body, such as relative permittivity. The phantom includes grey matter and white matter regions, with relative permittivities of 38 and 28, respectively, at a frequency of 10 GHz. Results were compared with the values obtained from the standard library of the Computer Simulation Technology (CST) simulation application and with the existing research by Fernandez and Gabriel. Our experimental results show a positive outcome, in which the proposed mixture was adequate to represent the real human brain for data acquisition.
    Matched MeSH terms: Algorithms*
  10. Asghar A, Abdul Raman AA, Daud WM
    ScientificWorldJournal, 2014;2014:869120.
    PMID: 25258741 DOI: 10.1155/2014/869120
    In the present study, a comparison of central composite design (CCD) and the Taguchi method was established for Fenton oxidation. [Dye]ini, Dye:Fe(+2), H2O2:Fe(+2), and pH were identified as control variables, while COD and decolorization efficiency were selected as responses. An L9 orthogonal array and a face-centered CCD were used for the experimental design. A maximum of 99% decolorization and 80% COD removal efficiency were obtained under optimum conditions. R-squared values of 0.97 and 0.95 for the CCD and Taguchi methods, respectively, indicate that both models are statistically significant and in good agreement with each other. Furthermore, Prob > F values of less than 0.0500 and the ANOVA results indicate that the selected models fit the experimental results well. Nevertheless, the possibility of ranking the input variables in terms of percent contribution to the response value makes the Taguchi method a suitable approach for scrutinizing the operating parameters. In the present case, pH, with percent contributions of 87.62% and 66.2%, was ranked as the most significant contributing factor. This finding of the Taguchi method was also verified by the 3D contour plots of the CCD. Therefore, from this comparative study, it is concluded that the Taguchi method, with 9 experimental runs and simple interaction plots, is a suitable alternative to CCD for several chemical engineering applications.
    Matched MeSH terms: Algorithms*
  11. Meselhy Eltoukhy M, Faye I, Belhaouari Samir B
    Comput Biol Med, 2010 Apr;40(4):384-91.
    PMID: 20163793 DOI: 10.1016/j.compbiomed.2010.02.002
    This paper presents a comparative study of the wavelet and curvelet transforms for breast cancer diagnosis in digital mammograms. Using multiresolution analysis, mammogram images are decomposed into different resolution levels, which are sensitive to different frequency bands. A set of the largest coefficients from each decomposition level is extracted. Then a supervised classifier system based on Euclidean distance is constructed. The performance of the classifier is evaluated using 2 x 5-fold cross validation followed by a statistical analysis. The experimental results suggest that the curvelet transform outperforms the wavelet transform and that the difference is statistically significant.
    Matched MeSH terms: Algorithms
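    A sketch of the wavelet branch of the pipeline described above is given below: multilevel decomposition, retention of the largest-magnitude coefficients as features, and a nearest-centroid classifier based on Euclidean distance. The synthetic images, wavelet choice, and feature count are assumptions standing in for the mammogram data.

```python
# Wavelet features (largest coefficients) plus a Euclidean nearest-centroid rule.
import numpy as np
import pywt

def wavelet_features(image, wavelet="db4", level=3, n_coeffs=100):
    """Decompose an image and keep the n largest-magnitude coefficients."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    flat = np.hstack([coeffs[0].ravel()] +
                     [d.ravel() for detail in coeffs[1:] for d in detail])
    idx = np.argsort(np.abs(flat))[::-1][:n_coeffs]
    return flat[idx]

def nearest_centroid(train_feats, train_labels, test_feat):
    """Assign the class whose mean feature vector is closest in Euclidean distance."""
    classes = np.unique(train_labels)
    centroids = [train_feats[train_labels == c].mean(axis=0) for c in classes]
    return classes[int(np.argmin([np.linalg.norm(test_feat - c) for c in centroids]))]

rng = np.random.default_rng(0)
images = rng.random((20, 64, 64))                 # synthetic stand-in images
labels = np.array([0] * 10 + [1] * 10)
feats = np.array([wavelet_features(img) for img in images])
print(nearest_centroid(feats[:-1], labels[:-1], feats[-1]))
```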
  12. Kausar AS, Reza AW, Wo LC, Ramiah H
    ScientificWorldJournal, 2014;2014:601729.
    PMID: 25202733 DOI: 10.1155/2014/601729
    Although ray tracing based propagation prediction models are popular for indoor radio wave propagation characterization, most of them do not provide an integrated approach for achieving optimum coverage, which is a key part of designing a wireless network. In this paper, an accelerated three-dimensional ray tracing technique is presented in which rough surface scattering is included to make the ray tracing more accurate. Here, the rough surface scattering is represented by microfacets, which make it possible to compute the scattered field in all possible directions. New optimization techniques, namely dual quadrant skipping (DQS) and closest object finder (COF), are implemented for fast characterization of wireless communications and to make the ray tracing technique more efficient. In conjunction with the ray tracing technique, a probability-based coverage optimization algorithm is combined with it to form a compact solution for indoor propagation prediction. The proposed technique decreases the ray tracing time by omitting unnecessary objects using the DQS technique and by decreasing the ray-object intersection time using the COF technique. The coverage optimization algorithm, in turn, is based on probability theory and finds the minimum number of transmitters and their corresponding positions needed to achieve optimal indoor wireless coverage. Both the space and time complexities of the proposed algorithm improve on those of existing algorithms. For verification of the proposed ray tracing technique and coverage algorithm, detailed simulation results for different scattering factors, different antenna types, and different operating frequencies are presented. Furthermore, the proposed technique is verified by experimental results.
    Matched MeSH terms: Algorithms*
  13. Khoje SA, Bodhe SK
    Crit Rev Food Sci Nutr, 2015;55(12):1658-71.
    PMID: 24915312 DOI: 10.1080/10408398.2012.698662
    It is said that agriculture is the backbone of the Indian economy. The contribution of the agriculture sector to the national GDP (Gross Domestic Product) was 14.6% in the year 2010. To attain a growth rate equivalent to that of industry (about 9%), it is imperative for Indian agriculture to modernize and use automation at various stages of cultivation and in post-harvesting techniques. The use of computers in assessing the quality of fruits is one of the major activities in post-harvesting technology. At present, this assessment is largely done manually, except for a few fruits. Fruit quality assessment by machine vision in India is currently still at the research level. Major research has been carried out in countries such as China, Malaysia, the UK, and the Netherlands. To suit the Indian market and the outlook of Indian farmers, it is necessary to develop indigenous technology. This paper is a first step toward evaluating the research carried out by the research community worldwide for tropical fruits. For the purposes of the survey, we have concentrated on the tropical fruits of the state of Maharashtra, while keeping image processing algorithms as the focus of the review.
    Matched MeSH terms: Algorithms
  14. Senanayake C, Senanayake SM
    Comput Methods Biomech Biomed Engin, 2011 Oct;14(10):863-74.
    PMID: 20924859 DOI: 10.1080/10255842.2010.499866
    In this paper, a gait event detection algorithm is presented that uses computational intelligence (fuzzy logic) to identify seven gait phases in walking gait. Two inertial measurement units and four force-sensitive resistors were used to obtain knee angle and foot pressure patterns, respectively. Fuzzy logic is used to address the complexity of distinguishing gait phases based on discrete events. A novel application of the seven-dimensional vector analysis method to estimate the amount of abnormality detected, based on the two gait parameters, was also investigated. Experiments were carried out to validate the application of the two proposed algorithms in providing accurate feedback in rehabilitation. The algorithm responses were tested for two cases, normal and abnormal gait. The large amount of data required for reliable gait-phase detection necessitates the use of computer methods to store and manage the data; therefore, a database management system and an interactive graphical user interface were developed so that the overall system can be used in a clinical environment.
    Matched MeSH terms: Algorithms
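    The toy sketch below illustrates the fuzzy-logic idea only: combining knee-angle and foot-pressure membership values to score candidate gait phases. The membership ranges and the two example rules are invented for illustration; the paper's full seven-phase rule base and sensor processing are not reproduced.

```python
# Triangular fuzzy memberships on knee angle and heel pressure, combined with
# a min (fuzzy AND) over a tiny example rule base.
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return float(np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0))

def gait_phase(knee_angle_deg, heel_pressure):
    extended = trimf(knee_angle_deg, -10, 0, 20)      # knee "extended"
    flexed = trimf(knee_angle_deg, 30, 60, 90)        # knee "flexed"
    loaded = trimf(heel_pressure, 0.4, 1.0, 1.6)      # heel "loaded" (0-1 scale)
    unloaded = trimf(heel_pressure, -0.6, 0.0, 0.6)   # heel "unloaded"
    # Two example rules; a real system would cover all seven phases.
    rules = {
        "initial contact": min(extended, loaded),
        "swing": min(flexed, unloaded),
    }
    return max(rules, key=rules.get), rules

print(gait_phase(5.0, 0.9))
```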
  15. Tiong TJ, Price GJ, Kanagasingam S
    Ultrason Sonochem, 2014 Sep;21(5):1858-65.
    PMID: 24735986 DOI: 10.1016/j.ultsonch.2014.03.024
    One of the uses of ultrasound in dentistry is in the field of endodontics (i.e. root canal treatment), where it enhances cleaning efficiency during the treatment. The acoustic pressures generated by the oscillation of files in narrow channels have been calculated using the COMSOL simulation package. Acoustic pressures in excess of the cavitation threshold can be generated, and higher values were found in narrower channels; this parallels experimental observations of sonochemiluminescence. The effects of varying the channel width and length and the dimensions and shape of the file are reported. As well as explaining experimental observations, the work provides a basis for the further development and optimisation of the design of endosonic files.
    Matched MeSH terms: Algorithms
  16. Razmara J, Deris SB, Parvizpour S
    Comput Biol Med, 2013 Oct;43(10):1614-21.
    PMID: 24034753 DOI: 10.1016/j.compbiomed.2013.07.022
    The structural comparison of proteins is a vital step in structural biology that is used to predict and analyse the function of a new, unknown protein. Although a number of different techniques have been explored, the development of new alternative methods remains an active research area. The present paper introduces a text-modelling-based technique for the structural comparison of proteins. The method models the secondary and tertiary structure of a protein as two linear sequences and then applies them to the comparison of two structures. The technique used for pairwise comparison of the sequences is adopted from computational linguistics and its well-known techniques for analysing and quantifying textual sequences. To this end, an n-gram modelling technique is used to capture regularities between sequences, and the cross-entropy concept is then employed to measure their similarity. Several experiments were conducted to evaluate the performance of the method and compare it with other commonly used programs. The assessments for information retrieval evaluation demonstrate that the technique has a high running speed, similar to other linear encoding methods such as 3D-BLAST, SARST, and TS-AMIR, whereas its accuracy is comparable to that of CE and TM-align, which are high-accuracy comparison tools. Accordingly, the results demonstrate that the algorithm is highly efficient compared with other state-of-the-art methods.
    Matched MeSH terms: Algorithms
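    The sketch below shows the two computational-linguistics ingredients named in the abstract: an n-gram model built from one linearly encoded structure string, and the cross-entropy of a second string under that model. The example strings, n-gram order, and smoothing are illustrative assumptions, not the paper's encoding.

```python
# Add-one-smoothed n-gram model and cross-entropy between two encoded strings.
import math
from collections import Counter

def ngram_model(sequence, n=2, smoothing=1.0):
    """Return smoothed n-gram probabilities and a default for unseen n-grams."""
    alphabet = sorted(set(sequence))
    counts = Counter(sequence[i:i + n] for i in range(len(sequence) - n + 1))
    total = sum(counts.values()) + smoothing * len(alphabet) ** n
    probs = {gram: (count + smoothing) / total for gram, count in counts.items()}
    return probs, smoothing / total

def cross_entropy(sequence, model, default, n=2):
    """Average negative log2 probability of the sequence's n-grams under the model."""
    grams = [sequence[i:i + n] for i in range(len(sequence) - n + 1)]
    return -sum(math.log2(model.get(g, default)) for g in grams) / len(grams)

model, default = ngram_model("HHHEEECCHHHEEE")          # reference structure string
print(cross_entropy("HHHEECCCHHHEEC", model, default))  # similar string: lower value
print(cross_entropy("CCCCCCCCCCCCCC", model, default))  # dissimilar string: higher value
```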
  17. Sim KS, Kho YY, Tso CP, Nia ME, Ting HY
    Scanning, 2013 Mar-Apr;35(2):75-87.
    PMID: 22777599 DOI: 10.1002/sca.21037
    Detection of cracks in stainless steel pipe images is performed using a contrast stretching technique. The technique is based on an image filter implemented through mathematical morphology that can expose the cracks. The cracks are highlighted and noise removal is carried out efficiently while still retaining the edges. An automated crack detection system with a camera platform has been successfully implemented. We compare the crack extraction, in terms of quality measures, with that of Otsu's threshold technique and another technique (Iyer and Sinha, 2005). The algorithm achieves good results and performs better than these other techniques.
    Matched MeSH terms: Algorithms
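    A generic sketch of the building blocks named above is given below: contrast stretching, a morphological filter that exposes thin dark defects, and Otsu's threshold as the comparison baseline. The synthetic image, percentile limits, and structuring element are assumptions; the authors' exact filter is not reproduced.

```python
# Contrast stretching + black top-hat + Otsu threshold on a synthetic "pipe" image.
import numpy as np
from skimage import exposure, filters, morphology

rng = np.random.default_rng(0)
image = rng.normal(0.5, 0.05, size=(128, 128))
image[60:62, 10:120] -= 0.3                       # a thin dark "crack"
image = np.clip(image, 0.0, 1.0)

# Contrast stretching between the 2nd and 98th percentiles.
p2, p98 = np.percentile(image, (2, 98))
stretched = exposure.rescale_intensity(image, in_range=(p2, p98), out_range=(0.0, 1.0))

# Black top-hat highlights thin dark structures such as cracks.
tophat = morphology.black_tophat(stretched, morphology.disk(3))
crack_mask = tophat > filters.threshold_otsu(tophat)

# Baseline for comparison: Otsu's threshold applied directly to the raw image.
otsu_mask = image < filters.threshold_otsu(image)
print(crack_mask.sum(), otsu_mask.sum())
```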
  18. Mohammad Lutfi Othman, Mahmood Khalid Hadi, Noor Izzri Abdul Wahab
    MyJurnal
    Special Protection Schemes (SPSs) are corrective action schemes designed to protect power systems against severe contingency conditions. In the planning of SPSs, protecting the transmission network from overloading due to critical situations has become a serious challenge that needs to be taken into account. In this paper, a Special Protection and Control Scheme (SPCS) based on the Differential Evolution (DE) algorithm for optimal generation rescheduling is applied to mitigate transmission line overloading under system contingency conditions. An N-1 contingency analysis was performed for different single line outages under base and increased load, and a generation rescheduling strategy was undertaken to overcome the overloading problem. Simulation results are presented for both pre- and post-emergency system situations. The IEEE 30-bus test system was used to validate the effectiveness of the proposed method.
    Matched MeSH terms: Algorithms
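    A hedged toy example of differential evolution applied to a generation-rescheduling-style problem is sketched below: minimise a quadratic generation cost while meeting a fixed demand, with a penalty for imbalance. The cost coefficients, demand, and generator limits are invented; the paper's SPCS formulation and the IEEE 30-bus model are not reproduced.

```python
# Differential evolution for a toy three-generator rescheduling problem.
import numpy as np
from scipy.optimize import differential_evolution

cost_coeff = np.array([0.02, 0.035, 0.05])     # $/MW^2 for three generators
demand = 250.0                                 # MW to be supplied
bounds = [(20, 150), (20, 120), (20, 100)]     # generator output limits (MW)

def objective(p):
    generation_cost = np.sum(cost_coeff * np.asarray(p) ** 2)
    imbalance_penalty = 1e4 * (np.sum(p) - demand) ** 2   # enforce supply = demand
    return generation_cost + imbalance_penalty

result = differential_evolution(objective, bounds, seed=0, tol=1e-8)
print(np.round(result.x, 1), round(result.fun, 2))
```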
  19. Reza AW, Eswaran C
    J Med Syst, 2011 Feb;35(1):17-24.
    PMID: 20703589 DOI: 10.1007/s10916-009-9337-y
    The increasing number of diabetic retinopathy (DR) cases worldwide demands the development of an automated decision support system for quick and cost-effective screening of DR. We present an automatic screening system for detecting the early stage of DR, known as non-proliferative diabetic retinopathy (NPDR). The proposed system involves processing fundus images to extract abnormal signs, such as hard exudates, cotton wool spots, and large plaques of hard exudates. A rule-based classifier is used to classify the DR into two classes, namely normal and abnormal. Abnormal NPDR is further classified into three levels, namely mild, moderate, and severe. To evaluate the performance of the proposed decision support framework, the algorithms were tested on images from the STARE database. The results show that the proposed system can detect the bright lesions with an average accuracy of about 97%. The study further shows promising results in classifying the bright lesions correctly according to NPDR severity levels.
    Matched MeSH terms: Algorithms
  20. Jain S, Seal A, Ojha A, Yazidi A, Bures J, Tacheci I, et al.
    Comput Biol Med, 2021 Oct;137:104789.
    PMID: 34455302 DOI: 10.1016/j.compbiomed.2021.104789
    Wireless capsule endoscopy (WCE) is one of the most efficient methods for the examination of gastrointestinal tracts. Computer-aided intelligent diagnostic tools alleviate the challenges faced during manual inspection of long WCE videos. Several approaches have been proposed in the literature for the automatic detection and localization of anomalies in WCE images. Some of them focus on specific anomalies such as bleeding, polyps, or lesions; however, relatively few generic methods have been proposed to detect all of these common anomalies simultaneously. In this paper, a deep convolutional neural network (CNN) based model, 'WCENet', is proposed for anomaly detection and localization in WCE images. The model works in two phases. In the first phase, a simple and efficient attention-based CNN classifies an image into one of four categories: polyp, vascular, inflammatory, or normal. If the image is classified into one of the abnormal categories, it is processed in the second phase for anomaly localization. A fusion of Grad-CAM++ and a custom SegNet is used for anomalous-region segmentation in the abnormal image. The WCENet classifier attains an accuracy of 98% and an area under the receiver operating characteristic curve of 99%. The WCENet segmentation model obtains a frequency-weighted intersection over union of 81% and an average Dice score of 56% on the KID dataset. WCENet outperforms nine different state-of-the-art conventional machine learning and deep learning models on the KID dataset. The proposed model demonstrates potential for clinical applications.
    Matched MeSH terms: Algorithms