Displaying publications 1 - 20 of 85 in total

  1. Hussain M, Al-Haiqi A, Zaidan AA, Zaidan BB, Kiah ML, Anuar NB, et al.
    Comput Methods Programs Biomed, 2015 Dec;122(3):393-408.
    PMID: 26412009 DOI: 10.1016/j.cmpb.2015.08.015
    To survey researchers' efforts in response to the new and disruptive technology of smartphone medical apps, mapping the research landscape from the literature into a coherent taxonomy, and identifying the basic characteristics of this emerging field in terms of: the motivations for using smartphone apps in medicine and healthcare, the open challenges that hinder their utility, and the recommendations in the literature for improving the acceptance and use of medical apps.
  2. Fallahpoor M, Chakraborty S, Pradhan B, Faust O, Barua PD, Chegeni H, et al.
    Comput Methods Programs Biomed, 2024 Jan;243:107880.
    PMID: 37924769 DOI: 10.1016/j.cmpb.2023.107880
    Positron emission tomography/computed tomography (PET/CT) is increasingly used in oncology, neurology, cardiology, and emerging medical fields. The success stems from the cohesive information that hybrid PET/CT imaging offers, surpassing the capabilities of the individual modalities when used in isolation for different malignancies. However, manual image interpretation requires extensive disease-specific knowledge, and it is a time-consuming aspect of physicians' daily routines. Deep learning algorithms, akin to a practitioner during training, extract knowledge from images to support the diagnosis process through symptom detection and image enhancement. Available review papers on PET/CT imaging either include additional modalities or examine a broad range of AI applications; a comprehensive investigation focused specifically on the use of deep learning with PET/CT images has been lacking. This review aims to fill that gap by investigating the characteristics of approaches used in papers that employed deep learning for PET/CT imaging. Within the review, we identified 99 studies published between 2017 and 2022 that applied deep learning to PET/CT images. We also identified the best pre-processing algorithms and the most effective deep learning models reported for PET/CT while highlighting the current limitations. Our review underscores the potential of deep learning (DL) in PET/CT imaging, with successful applications in lesion detection, tumor segmentation, and disease classification in both sinogram and image spaces. Common and specific pre-processing techniques are also discussed. DL algorithms excel at extracting meaningful features and enhancing accuracy and efficiency in diagnosis. However, limitations arise from the scarcity of annotated datasets and challenges in explainability and uncertainty. Recent DL models, such as attention-based models, generative models, multi-modal models, graph convolutional networks, and transformers, are promising for improving PET/CT studies. Additionally, radiomics has garnered attention for tumor classification and predicting patient outcomes. Ongoing research is crucial to explore new applications and improve the accuracy of DL models in this rapidly evolving field.
  3. Faust O, Hagiwara Y, Hong TJ, Lih OS, Acharya UR
    Comput Methods Programs Biomed, 2018 Jul;161:1-13.
    PMID: 29852952 DOI: 10.1016/j.cmpb.2018.04.005
    BACKGROUND AND OBJECTIVE: We have cast the net into the ocean of knowledge to retrieve the latest scientific research on deep learning methods for physiological signals. We found 53 research papers on this topic, published from 01.01.2008 to 31.12.2017.

    METHODS: An initial bibliometric analysis shows that the reviewed papers focused on Electromyogram (EMG), Electroencephalogram (EEG), Electrocardiogram (ECG), and Electrooculogram (EOG). These four categories were used to structure the subsequent content review.

    RESULTS: During the content review, we understood that deep learning performs better for big and varied datasets than classic analysis and machine classification methods. Deep learning algorithms try to develop the model by using all the available input.

    CONCLUSIONS: This review paper depicts the application of various deep learning algorithms used to date; in the future, such algorithms are expected to be applied in more healthcare areas to improve the quality of diagnosis.

  4. Adam M, Oh SL, Sudarshan VK, Koh JE, Hagiwara Y, Tan JH, et al.
    Comput Methods Programs Biomed, 2018 Jul;161:133-143.
    PMID: 29852956 DOI: 10.1016/j.cmpb.2018.04.018
    Cardiovascular diseases (CVDs) are the leading cause of death worldwide. The rising mortality rate can be reduced by early detection and treatment interventions. Clinically, the electrocardiogram (ECG) signal provides useful information about cardiac abnormalities and is hence employed as a diagnostic modality for the detection of various CVDs. However, the changes in these time series that indicate a particular disease are subtle. Therefore, it may be monotonous, time-consuming and stressful to inspect ECG beats manually. In order to overcome this limitation of manual ECG signal analysis, this paper uses a novel discrete wavelet transform (DWT) method combined with nonlinear features for automated characterization of CVDs. ECG signals of normal subjects and of dilated cardiomyopathy (DCM), hypertrophic cardiomyopathy (HCM) and myocardial infarction (MI) patients are subjected to five levels of DWT. Relative wavelet nonlinear features, namely fuzzy entropy, sample entropy, fractal dimension and signal energy, are extracted from the DWT coefficients. These features are fed to a sequential forward selection (SFS) technique and then ranked using the ReliefF method. Our proposed methodology achieved a maximum classification accuracy (acc) of 99.27%, sensitivity (sen) of 99.74%, and specificity (spec) of 98.08% with a K-nearest neighbor (kNN) classifier using 15 features ranked by the ReliefF method. Our proposed methodology can be used by clinical staff to make a faster and more accurate diagnosis of CVDs. Thus, the chances of survival can be significantly increased by early detection and treatment of CVDs.
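
    A minimal sketch of the pipeline described above (five-level DWT, simple per-sub-band nonlinear features, and a kNN classifier); the wavelet name, the feature surrogates, and the classifier settings are illustrative assumptions, not the authors' implementation:

      import numpy as np
      import pywt
      from sklearn.neighbors import KNeighborsClassifier

      def subband_features(beat, wavelet="db4", level=5):
          coeffs = pywt.wavedec(beat, wavelet, level=level)    # [cA5, cD5, ..., cD1]
          feats = []
          for c in coeffs:
              energy = np.sum(c ** 2)                          # signal energy per sub-band
              p = c ** 2 / (energy + 1e-12)
              entropy = -np.sum(p * np.log(p + 1e-12))         # simple entropy surrogate
              feats.extend([energy, entropy])
          return np.array(feats)

      def train_knn(beats, labels, k=10):
          # beats: list of equal-length, segmented ECG beats; labels: normal/DCM/HCM/MI
          X = np.vstack([subband_features(b) for b in beats])
          return KNeighborsClassifier(n_neighbors=k).fit(X, labels)
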
  5. Hagiwara Y, Koh JEW, Tan JH, Bhandary SV, Laude A, Ciaccio EJ, et al.
    Comput Methods Programs Biomed, 2018 Oct;165:1-12.
    PMID: 30337064 DOI: 10.1016/j.cmpb.2018.07.012
    BACKGROUND AND OBJECTIVES: Glaucoma is an eye condition which leads to permanent blindness when the disease progresses to an advanced stage. It occurs due to inappropriate intraocular pressure within the eye, resulting in damage to the optic nerve. Glaucoma does not exhibit any symptoms in its nascent stage; thus, it is important to diagnose it early to prevent blindness. Fundus photography is widely used by ophthalmologists to assist in the diagnosis of glaucoma and is cost-effective.

    METHODS: The morphological features of the disc that are characteristic of glaucoma are clearly seen in the fundus images. However, manual inspection of the acquired fundus images may be prone to inter-observer variation. Therefore, a computer-aided detection (CAD) system is proposed to make an accurate, reliable and fast diagnosis of glaucoma based on the optic nerve features of fundus imaging. In this paper, we reviewed existing techniques to automatically diagnose glaucoma.

    RESULTS: The use of CAD is very effective in the diagnosis of glaucoma and can assist clinicians by significantly alleviating their workload. We have also discussed the advantages of employing state-of-the-art techniques, including deep learning (DL), when developing the automated system. The DL methods are effective in glaucoma diagnosis.

    CONCLUSIONS: Novel DL algorithms with big data availability are required to develop a reliable CAD system. Such techniques can be employed to diagnose other eye diseases accurately.

  6. Faust O, Razaghi H, Barika R, Ciaccio EJ, Acharya UR
    Comput Methods Programs Biomed, 2019 Jul;176:81-91.
    PMID: 31200914 DOI: 10.1016/j.cmpb.2019.04.032
    BACKGROUND AND OBJECTIVE: Sleep is an important part of our life. That importance is highlighted by the multitude of health problems which result from sleep disorders. Detecting these sleep disorders requires an accurate interpretation of physiological signals. A prerequisite for this interpretation is an understanding of the way in which sleep stage changes manifest themselves in the signal waveform. With that understanding, it is possible to build automated sleep stage scoring systems. Apart from their practical relevance for automating sleep disorder diagnosis, these systems provide a good indication of the amount of sleep stage related information communicated by a specific physiological signal.

    METHODS: This article provides a comprehensive review of automated sleep stage scoring systems, which were created since the year 2000. The systems were developed for Electrocardiogram (ECG), Electroencephalogram (EEG), Electrooculogram (EOG), and a combination of signals.

    RESULTS: Our review shows that all of these signals contain information for sleep stage scoring.

    CONCLUSIONS: The result is important, because it allows us to shift our research focus away from information extraction methods to systemic improvements, such as patient comfort, redundancy, safety and cost.

  7. Yildirim O, Baloglu UB, Tan RS, Ciaccio EJ, Acharya UR
    Comput Methods Programs Biomed, 2019 Jul;176:121-133.
    PMID: 31200900 DOI: 10.1016/j.cmpb.2019.05.004
    BACKGROUND AND OBJECTIVE: For diagnosis of arrhythmic heart problems, electrocardiogram (ECG) signals should be recorded and monitored. The long-term signal records obtained are analyzed by expert cardiologists. Devices such as the Holter monitor have limited hardware capabilities. For improved diagnostic capacity, it would be helpful to detect arrhythmic signals automatically. In this study, a novel approach is presented as a candidate solution for these issues.

    METHODS: A convolutional auto-encoder (CAE) based nonlinear compression structure is implemented to reduce the signal size of arrhythmic beats. Long short-term memory (LSTM) classifiers are employed to automatically recognize arrhythmias using ECG features, which are deeply coded with the CAE network.

    RESULTS: Based upon the coded ECG signals, both the storage requirement and the classification time were considerably reduced. In experimental studies conducted with the MIT-BIH arrhythmia database, ECG signals were compressed with an average percentage root mean square difference (PRD) of 0.70%, and an accuracy of over 99.0% was observed.

    CONCLUSIONS: One of the significant contributions of this study is that the proposed approach can significantly reduce the processing time when using LSTM networks for data analysis. Thus, a novel and effective approach was proposed for both ECG signal compression and high-performance automatic recognition, with very low computational cost.
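
    The coupling of a convolutional auto-encoder with an LSTM classifier can be sketched as follows; the layer counts, filter sizes, and beat length are assumptions for illustration and do not reproduce the authors' architecture:

      from tensorflow.keras import layers, models

      BEAT_LEN, N_CLASSES = 260, 5

      # Auto-encoder: the encoder compresses each beat to a short coded sequence.
      inp = layers.Input(shape=(BEAT_LEN, 1))
      x = layers.Conv1D(16, 5, padding="same", activation="relu")(inp)
      x = layers.MaxPooling1D(2)(x)
      x = layers.Conv1D(8, 5, padding="same", activation="relu")(x)
      code = layers.MaxPooling1D(2)(x)                       # (BEAT_LEN/4, 8) coded features

      y = layers.UpSampling1D(2)(code)                       # decoder, used only to train the CAE
      y = layers.Conv1D(16, 5, padding="same", activation="relu")(y)
      y = layers.UpSampling1D(2)(y)
      out = layers.Conv1D(1, 5, padding="same")(y)
      autoencoder = models.Model(inp, out)
      autoencoder.compile(optimizer="adam", loss="mse")

      # LSTM classifier operates on the coded sequence produced by the encoder
      # instead of the raw beat, which shrinks storage and classification time.
      encoder = models.Model(inp, code)
      classifier = models.Sequential([
          layers.Input(shape=(BEAT_LEN // 4, 8)),
          layers.LSTM(32),
          layers.Dense(N_CLASSES, activation="softmax"),
      ])
      classifier.compile(optimizer="adam",
                         loss="sparse_categorical_crossentropy",
                         metrics=["accuracy"])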

  8. Abbasian Ardakani A, Bureau NJ, Ciaccio EJ, Acharya UR
    Comput Methods Programs Biomed, 2022 Mar;215:106609.
    PMID: 34990929 DOI: 10.1016/j.cmpb.2021.106609
    Radiomics is a relatively new field that has opened new windows for precision medicine. It involves the extraction of a large number of quantitative features from medical images, features that may be difficult to detect visually. Underlying tumor biology can change the physical properties of tissues, which affect the patterns of image pixels and hence radiomics features. The main advantage of radiomics is that it can characterize the whole tumor non-invasively, even from a single sampling of an image; it can therefore be regarded as a "digital biopsy". Physicians need to know about radiomics features to determine how their values correlate with the appearance of lesions and diseases. Indeed, physicians need practical references that convey the basics and concepts of each radiomics feature without requiring knowledge of its sophisticated mathematical formula. In this review, commonly used radiomics features are illustrated with practical examples to help physicians in their routine diagnostic procedures.
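
    As an illustration of what a radiomics feature is in practice, the sketch below computes a few first-order features from a segmented lesion; the mask convention and bin count are assumptions, and the feature set is far smaller than a typical radiomics panel:

      import numpy as np
      from scipy import stats

      def first_order_features(image, mask, n_bins=64):
          voxels = image[mask > 0].astype(float)          # intensities inside the lesion
          hist, _ = np.histogram(voxels, bins=n_bins)
          p = hist / hist.sum()
          return {
              "mean": voxels.mean(),
              "variance": voxels.var(),
              "skewness": stats.skew(voxels),
              "kurtosis": stats.kurtosis(voxels),
              "entropy": -np.sum(p[p > 0] * np.log2(p[p > 0])),  # intensity-histogram entropy
          }
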
  9. Xu S, Deo RC, Soar J, Barua PD, Faust O, Homaira N, et al.
    Comput Methods Programs Biomed, 2023 Nov;241:107746.
    PMID: 37660550 DOI: 10.1016/j.cmpb.2023.107746
    BACKGROUND AND OBJECTIVE: Obstructive airway diseases, including asthma and Chronic Obstructive Pulmonary Disease (COPD), are among the most common chronic respiratory health problems. Both of these conditions require health professional expertise in making a diagnosis. Hence, this process is time intensive for healthcare providers and the diagnostic quality is subject to intra- and inter-operator variability. In this study we investigate the role of automated detection of obstructive airway diseases to reduce cost and improve diagnostic quality.

    METHODS: We investigated the existing body of evidence and applied the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines to search records in the IEEE, Google Scholar, and PubMed databases. We identified 65 papers that were published from 2013 to 2022, and these papers cover 67 different studies. The review process was structured according to the medical data that were used for disease detection. We identified six main categories, namely air flow, genetic, imaging, signals, medical records, and miscellaneous. For each of these categories, we report both the disease detection methods and their performance.

    RESULTS: We found that medical imaging was used in 14 of the reviewed studies as data for automated obstructive airway disease detection. Genetics and physiological signals were used in 13 studies. Medical records and air flow were used in 9 and 7 studies, respectively. Most papers were published in 2020 and we found three times more work on Machine Learning (ML) when compared to Deep Learning (DL). Statistical analysis shows that DL techniques achieve higher Accuracy (ACC) when compared to ML. Convolutional Neural Network (CNN) is the most common DL classifier and Support Vector Machine (SVM) is the most widely used ML classifier. During our review, we discovered only two publicly available asthma and COPD datasets. Most studies used private clinical datasets, so data size and data composition are inconsistent.

    CONCLUSIONS: Our review results indicate that Artificial Intelligence (AI) can improve both the decision quality and the efficiency of health professionals during COPD and asthma diagnosis. However, we found several limitations in this review, such as a lack of dataset consistency, limited dataset sizes, and insufficient exploration of remote monitoring. We appeal to society to accept and trust computer-aided diagnosis of obstructive airway diseases, and we encourage health professionals to work closely with AI scientists to promote automated detection in clinical practice and hospital settings.

  10. Elhaj FA, Salim N, Harris AR, Swee TT, Ahmed T
    Comput Methods Programs Biomed, 2016 Apr;127:52-63.
    PMID: 27000289 DOI: 10.1016/j.cmpb.2015.12.024
    Arrhythmia is a cardiac condition caused by abnormal electrical activity of the heart, and the electrocardiogram (ECG) is the non-invasive method used to detect arrhythmias or heart abnormalities. Due to the presence of noise, the non-stationary nature of the ECG signal (i.e. the changing morphology of the ECG signal with respect to time) and the irregularity of the heartbeat, physicians face difficulties in the diagnosis of arrhythmias. The computer-aided analysis of ECG results assists physicians in detecting cardiovascular diseases. The development of many existing arrhythmia systems has depended on findings from linear experiments on ECG data, which achieve high performance on noise-free data. However, nonlinear methods characterize the ECG signal more effectively, extract hidden information from the signal, and achieve good performance under noisy conditions. This paper investigates the representation ability of linear and nonlinear features and proposes a combination of such features in order to improve the classification of ECG data. In this study, five types of arrhythmia beat classes recommended by the Association for the Advancement of Medical Instrumentation are analyzed: non-ectopic beats (N), supra-ventricular ectopic beats (S), ventricular ectopic beats (V), fusion beats (F) and unclassifiable and paced beats (U). Nonlinear features, such as higher-order statistics and cumulants, and nonlinear feature-reduction methods, such as independent component analysis, are combined with linear features, namely the principal component analysis of discrete wavelet transform coefficients. The features are tested for their ability to differentiate the different classes of data using different classifiers, namely the support vector machine and neural network methods, with tenfold cross-validation. Our proposed method is able to classify the N, S, V, F and U arrhythmia classes with high accuracy (98.91%) using a support vector machine with a radial basis function kernel.
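
    A minimal sketch of fusing linear and nonlinear feature-reduction outputs and classifying beats with an RBF-kernel SVM; the wavelet, component counts, and SVM parameters are assumptions, and higher-order statistics are omitted for brevity:

      import numpy as np
      import pywt
      from sklearn.decomposition import PCA, FastICA
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      def dwt_coeffs(beat, wavelet="db2", level=4):
          return np.concatenate(pywt.wavedec(beat, wavelet, level=level))

      def build_features(beats, n_components=12):
          C = np.vstack([dwt_coeffs(b) for b in beats])                     # equal-length beats assumed
          linear = PCA(n_components=n_components).fit_transform(C)          # PCA of DWT coefficients
          nonlinear = FastICA(n_components=n_components).fit_transform(C)   # ICA-reduced features
          return np.hstack([linear, nonlinear])

      def evaluate(beats, labels):
          X = build_features(beats)
          clf = SVC(kernel="rbf", C=10.0, gamma="scale")
          return cross_val_score(clf, X, labels, cv=10)                     # tenfold cross-validation
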
  11. Akhbar MFA
    Comput Methods Programs Biomed, 2023 Apr;231:107361.
    PMID: 36736133 DOI: 10.1016/j.cmpb.2023.107361
    BACKGROUND AND OBJECTIVE: Conventional surgical drill bits suffer from several drawbacks, including extreme heat generation, breakage, jamming, and undesired breakthrough. Understanding the impact of the drill margin on bone damage can provide insights that lay the foundation for improving the existing surgical drill bit. However, research on drill margins in bone drilling is lacking. This work assesses the influence of margin height and width on thermomechanical damage in bone drilling.

    METHODS: Thermomechanical damage indicators (maximum bone temperature, osteonecrosis diameter, osteonecrosis depth, maximum thrust force, and torque) were calculated using the finite element method under various margin heights (0.05-0.25 mm) and widths (0.02-0.26 mm). The simulation results were validated with experimental tests and previous research data.

    RESULTS: Increasing the margin height raised the maximum bone temperature, osteonecrosis diameter, and osteonecrosis depth by at least 19.1%, 41.9%, and 59.6%, respectively. The thrust force and torque are highly sensitive to margin height. A higher margin height (0.21-0.25 mm) reduced the thrust force by 54.0% but increased the drilling torque by 142.2%. The bone temperature, osteonecrosis diameter, and depth were 16.5%, 56.5%, and 81.4% lower, respectively, with increasing margin width. The minimum thrust force (11.1 N) and torque (41.9 Nmm) were produced with the highest margin width (0.26 mm). A margin height of 0.05-0.13 mm and a margin width of 0.22-0.26 mm produced the highest sum of weightage.

    CONCLUSIONS: A surgical drill bit with a margin height of 0.05-0.13 mm and a margin width of 0.22-0.26 mm can produce minimum thermomechanical damage in cortical bone drilling. The insights regarding the suitable ranges for margin height and width from this study could be adopted in future research devoted to optimizing the margin of the existing surgical drill bit.

  12. Cheong JK, Ooi EH, Chiew YS, Menichetti L, Armanetti P, Franchini MC, et al.
    Comput Methods Programs Biomed, 2023 Mar;230:107363.
    PMID: 36720181 DOI: 10.1016/j.cmpb.2023.107363
    BACKGROUND AND OBJECTIVES: Gold nanorod-assisted photothermal therapy (GNR-PTT) is a cancer treatment whereby GNRs incorporated into the tumour act as photo-absorbers to elevate the thermal destruction effect. In the case of the bladder, there are a few possible routes to target the tumour with GNRs, namely peri/intra-tumoural injection and intravesical instillation of GNRs. These two approaches lead to different GNR distributions inside the tumour and can affect the treatment outcome.

    METHODOLOGY: The present study investigates the effects of heterogeneous GNR distribution in a typical GNR-PTT setup. Three cases were considered. Case 1 considered the GNRs at the tumour centre, while Case 2 represents a hypothetical scenario where GNRs are distributed at the tumour periphery; these two cases represent intratumoural accumulation with different degrees of GNR spread inside the tumour. Case 3 is achieved when GNRs target the exposed tumoural surface that is invading the bladder wall, as occurs when they are delivered by intravesical instillation.

    RESULTS: Results indicate that for a laser power of 0.6 W and a GNR volume fraction of 0.01%, Cases 2 and 3 were successful in achieving complete tumour eradication after 330 and 470 s of laser irradiation, respectively. Case 1 failed to produce complete tumour damage when the GNRs were concentrated at the tumour centre but managed to produce complete tumour damage if the spread of GNRs was wider. Results from Case 2 also demonstrated a different heating profile from Case 1, suggesting that thermal ablation during GNR-PTT is dependent on the GNR distribution inside the tumour. Case 3 shows similar results to Case 2, whereby gradual but uniform heating is observed. Cases 2 and 3 show that uniformly heating the tumour can reduce damage to the surrounding tissues.

    CONCLUSIONS: The different GNR distributions associated with the different methods of introducing GNRs to the bladder during GNR-PTT affect the treatment outcome of bladder cancer in mice. Insufficient spreading during intratumoural injection of GNRs can render the treatment ineffective. The GNR distribution achieved through intravesical instillation presents some advantages over intratumoural injection and is worthy of further exploration.

  13. Mak NL, Ng WH, Ooi EH, Lau EV, Pamidi N, Foo JJ, et al.
    Comput Methods Programs Biomed, 2024 Jan;243:107866.
    PMID: 37865059 DOI: 10.1016/j.cmpb.2023.107866
    BACKGROUND AND OBJECTIVES: Thermochemical ablation (TCA) is a cancer treatment that utilises the heat released from the neutralisation of acid and base to raise tissue temperature to levels sufficient to induce thermal coagulation. Computational studies have demonstrated that the coagulation volume produced by sequential injection is smaller than that produced by simultaneous injection. When the reagents are injected one after the other, the region of contact between acid and base is limited to a thin contact layer sandwiched between the distributions of acid and base. It is hypothesised that increasing the frequency of acid-base injections into the tissue, by shortening the injection interval for each reagent, can increase the effective area of contact between acid and base, thereby intensifying neutralisation and the exothermic heat released into the tissue.

    METHODS: To verify this hypothesis, a computational model was developed to simulate the thermochemical processes involved during TCA with sequential injection. Four major processes that take place during TCA were considered, i.e., the flow of acid and base, their neutralisation, the release of exothermic heat and the formation of thermal damage inside the tissue. Equimolar acid and base at 7.5 M were injected into the tissue intermittently. Six injection intervals, namely 3, 6, 15, 20, 30 and 60 s, were investigated.

    RESULTS: Shortening the injection interval led to enlargement of the coagulation volume. If one considers only the coagulation volume as the determining factor, then a 15 s injection interval was found to be optimum. Conversely, if one places priority on safety, then a 3 s injection interval would result in the lowest amount of reagent residue inside the tissue after treatment. With a 3 s injection interval, the coagulation volume was found to be larger than that of simultaneous injection with the same treatment parameters. Moreover, the volume also surpassed that of radiofrequency ablation (RFA), a conventional thermal ablation technique commonly used for liver cancer treatment.

    CONCLUSION: The numerical results verified the hypothesis that shortening the injection interval will lead to the formation of larger thermal coagulation zone during TCA with sequential injection. More importantly, a 3 s injection interval was found to be optimum for both efficacy (large coagulation volume) and safety (least amount of reagent residue).

  14. Alsalem MA, Zaidan AA, Zaidan BB, Hashim M, Madhloom HT, Azeez ND, et al.
    Comput Methods Programs Biomed, 2018 May;158:93-112.
    PMID: 29544792 DOI: 10.1016/j.cmpb.2018.02.005
    CONTEXT: Acute leukaemia diagnosis is a field requiring automated solutions, tools and methods, and the ability to facilitate early detection and even prediction. Many studies have focused on the automatic detection and classification of acute leukaemia and its subtypes to enable highly accurate diagnosis.

    OBJECTIVE: This study aimed to review and analyse the literature related to the detection and classification of acute leukaemia. To improve understanding of the field's various contextual aspects and characteristics, the factors considered in the published studies were the motivations, the open challenges that confronted researchers, and the recommendations presented to researchers to enhance this vital research area.

    METHODS: We systematically searched all articles about the classification and detection of acute leukaemia, as well as their evaluation and benchmarking, in three main databases: ScienceDirect, Web of Science and IEEE Xplore from 2007 to 2017. These indices were considered to be sufficiently extensive to encompass our field of literature.

    RESULTS: Based on our inclusion and exclusion criteria, 89 articles were selected. Most studies (58/89) focused on the methods or algorithms of acute leukaemia classification, a number of papers (22/89) covered the developed systems for the detection or diagnosis of acute leukaemia and few papers (5/89) presented evaluation and comparative studies. The smallest portion (4/89) of articles comprised reviews and surveys.

    DISCUSSION: Acute leukaemia diagnosis, which is a field requiring automated solutions, tools and methods, entails the ability to facilitate early detection or even prediction. Many studies have been performed on the automatic detection and classification of acute leukaemia and their subtypes to promote accurate diagnosis.

    CONCLUSIONS: Research areas on medical-image classification vary, but they are all equally vital. We expect this systematic review to help emphasise current research opportunities and thus extend and create additional research fields.

  15. Abidemi A, Aziz NAB
    Comput Methods Programs Biomed, 2020 Nov;196:105585.
    PMID: 32554024 DOI: 10.1016/j.cmpb.2020.105585
    BACKGROUND: Dengue is a vector-borne viral disease endemic in Malaysia. The disease is presently a public health issue in the country. Hence, the use of mathematical models to gain insights into the transmission dynamics and to derive optimal control strategies for minimizing the spread of the disease is of great importance.

    METHODS: A model involving eight mutually exclusive compartments, with the introduction of personal protection, larvicide and adulticide control strategies, describing dengue fever transmission dynamics is presented. The control-induced basic reproduction number (R˜0) related to the model is computed using the next generation matrix method. A comparison theorem is used to analyse the global dynamics of the model. The model is fitted to data from the 2012 dengue outbreak in Johor, Malaysia, using the least-squares method. In a bid to optimally curtail dengue fever propagation, we apply optimal control theory to investigate the effect of several control strategies combining optimal personal protection, larvicide and adulticide controls on dengue fever dynamics. The resulting optimality system is simulated in MATLAB using a fourth-order Runge-Kutta scheme based on the forward-backward sweep method. In addition, cost-effectiveness analysis is performed to determine the most cost-effective strategy among the various control strategies analysed.

    RESULTS: Analysis of the model with control parameters shows that the model has two disease-free equilibria, namely a trivial equilibrium and a biologically realistic disease-free equilibrium, and one endemic equilibrium point. It also reveals that the biologically realistic disease-free equilibrium is both locally and globally asymptotically stable whenever the inequality R˜0 < 1 holds. In the case of the model with time-dependent control functions, the optimality levels of the three control functions required to optimally control dengue disease transmission are derived.

    CONCLUSION: We conclude that dengue fever transmission can be curtailed by adopting any of the several control strategies analysed in this study. Furthermore, a strategy which combines personal protection and adulticide controls is found to be the most cost-effective control strategy.
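
    A minimal sketch of the kind of forward simulation that sits inside such an optimal control study: a host-vector dengue model integrated with a Runge-Kutta ODE solver. The compartment structure, parameter values, and the constant protection control u are illustrative assumptions, not the eight-compartment model or the fitted Johor parameters of the paper:

      import numpy as np
      from scipy.integrate import solve_ivp

      def dengue_rhs(t, y, beta_h=0.30, beta_v=0.25, gamma=0.14, mu_v=0.07, u=0.0):
          # S_h, I_h, R_h: human compartments; S_v, I_v: mosquito compartments
          S_h, I_h, R_h, S_v, I_v = y
          N_h = S_h + I_h + R_h
          N_v = S_v + I_v
          infection_h = (1 - u) * beta_h * S_h * I_v / N_h   # u models personal protection
          infection_v = (1 - u) * beta_v * S_v * I_h / N_h
          return [-infection_h,
                  infection_h - gamma * I_h,
                  gamma * I_h,
                  mu_v * N_v - infection_v - mu_v * S_v,
                  infection_v - mu_v * I_v]

      y0 = [5000.0, 10.0, 0.0, 20000.0, 50.0]                # assumed initial populations
      sol = solve_ivp(dengue_rhs, (0, 180), y0, method="RK45", dense_output=True)
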
  16. Ahmad M, Jung LT, Bhuiyan AA
    Comput Methods Programs Biomed, 2017 Oct;149:11-17.
    PMID: 28802326 DOI: 10.1016/j.cmpb.2017.06.021
    BACKGROUND AND OBJECTIVE: Digital signal processing techniques commonly employ fixed-length window filters to process signal contents. DNA signals differ in characteristics from common digital signals since they carry nucleotides as contents. Nucleotides carry genetic code context and exhibit fuzzy behaviours due to their special structure and order in the DNA strand. Employing conventional fixed-length window filters for DNA signal processing produces spectral leakage and hence results in signal noise. A biological-context-aware adaptive window filter is required to process DNA signals.

    METHODS: This paper introduces a biologically inspired fuzzy adaptive window median filter (FAWMF) which computes the fuzzy membership strength of nucleotides in each sliding window and filters nucleotides based on median filtering with a combination of s-shaped and z-shaped filters. Since coding regions exhibit 3-base periodicity arising from an unbalanced nucleotide distribution, which produces a relatively high bias in nucleotide usage, this fundamental characteristic of nucleotides has been exploited in FAWMF to suppress the signal noise.

    RESULTS: Along with the adaptive response of FAWMF, a strong correlation between the median nucleotides and the Π-shaped filter was observed, which produced enhanced discrimination between coding and non-coding regions compared with fixed-length conventional window filters. The proposed FAWMF attains a significant enhancement in coding-region identification, i.e. 40% to 125%, compared with other conventional window filters tested over more than 250 benchmarked and randomly selected DNA datasets of different organisms.

    CONCLUSION: This study proves that conventional fixed-length window filters applied to DNA signals do not achieve significant results, since the nucleotides carry genetic code context. The proposed FAWMF algorithm is adaptive and significantly outperforms them in processing DNA signal contents. Applied to a variety of DNA datasets, the algorithm produced noteworthy discrimination between coding and non-coding regions, in contrast to fixed-length conventional window filters.
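
    The 3-base periodicity that FAWMF exploits can be illustrated with the standard spectral measure below (binary indicator sequences and a sliding-window DFT at period 3); this is not the FAWMF algorithm itself, and the window length is an assumption:

      import numpy as np

      def periodicity_profile(seq, window=351):
          bases = "ACGT"
          indicators = np.array([[1.0 if ch == b else 0.0 for ch in seq] for b in bases])
          k = window // 3                              # DFT bin corresponding to period 3
          n = np.arange(window)
          basis = np.exp(-2j * np.pi * k * n / window)
          power = np.zeros(len(seq) - window + 1)
          for start in range(len(power)):
              segment = indicators[:, start:start + window]
              power[start] = np.sum(np.abs(segment @ basis) ** 2)  # summed over the four bases
          return power                                 # peaks suggest coding regions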

  17. Pang T, Wong JHD, Ng WL, Chan CS
    Comput Methods Programs Biomed, 2021 May;203:106018.
    PMID: 33714900 DOI: 10.1016/j.cmpb.2021.106018
    BACKGROUND AND OBJECTIVE: The capability of deep learning radiomics (DLR) to extract high-level medical imaging features has promoted the use of computer-aided diagnosis of breast masses detected on ultrasound. Recently, generative adversarial networks (GANs) have aided in tackling a general issue in DLR, i.e., obtaining a sufficient number of medical images. However, GAN methods require pairs of input and labeled images, which require an exhaustive human annotation process that is very time-consuming. The aim of this paper is to develop a radiomics model based on a semi-supervised GAN method to perform data augmentation on breast ultrasound images.

    METHODS: A total of 1447 ultrasound images, including 767 benign masses and 680 malignant masses, were acquired from a tertiary hospital. A semi-supervised GAN model was developed to augment the breast ultrasound images. The synthesized images were subsequently used to classify breast masses using a convolutional neural network (CNN). The model was validated using a 5-fold cross-validation method.

    RESULTS: The proposed GAN architecture generated high-quality breast ultrasound images, verified by two experienced radiologists. The improved performance of semi-supervised learning increased the quality of the synthetic data produced in comparison to the baseline method. We achieved more accurate breast mass classification results (accuracy 90.41%, sensitivity 87.94%, specificity 85.86%) with our synthetic data augmentation compared to other state-of-the-art methods.

    CONCLUSION: The proposed radiomics model has demonstrated a promising potential to synthesize and classify breast masses on ultrasound in a semi-supervised manner.
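
    The semi-supervised GAN idea (a discriminator that predicts the real classes plus one "fake" class, so synthetic and unlabeled images both contribute to training the classifier) can be sketched as below; the image size, layer sizes, and training details are assumptions, and the adversarial training loop is omitted:

      from tensorflow.keras import layers, models

      IMG, K, LATENT = 64, 2, 100        # image side, real classes (benign/malignant), noise dim

      generator = models.Sequential([
          layers.Input(shape=(LATENT,)),
          layers.Dense(16 * 16 * 64, activation="relu"),
          layers.Reshape((16, 16, 64)),
          layers.Conv2DTranspose(32, 4, strides=2, padding="same", activation="relu"),
          layers.Conv2DTranspose(1, 4, strides=2, padding="same", activation="tanh"),
      ])                                  # outputs a 64 x 64 synthetic ultrasound patch

      discriminator = models.Sequential([
          layers.Input(shape=(IMG, IMG, 1)),
          layers.Conv2D(32, 4, strides=2, padding="same", activation="relu"),
          layers.Conv2D(64, 4, strides=2, padding="same", activation="relu"),
          layers.Flatten(),
          layers.Dense(K + 1, activation="softmax"),   # benign, malignant, or fake
      ])
      discriminator.compile(optimizer="adam", loss="sparse_categorical_crossentropy")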

  18. Mohd Faizal AS, Thevarajah TM, Khor SM, Chang SW
    Comput Methods Programs Biomed, 2021 Aug;207:106190.
    PMID: 34077865 DOI: 10.1016/j.cmpb.2021.106190
    Cardiovascular disease (CVD) is the leading cause of death worldwide and is a global health issue. Traditionally, statistical models are commonly used in the risk prediction and assessment of CVD. However, the adoption of artificial intelligence (AI) approaches is rapidly taking hold in the current era of technology to evaluate patient risks and predict the outcome of CVD. In this review, we outline various conventional risk scores and prediction models and compare them with the AI approach. The strengths and limitations of both conventional and AI approaches are discussed. Besides that, biomarker discovery related to CVD is also elucidated, as biomarkers can be used in risk stratification as well as early detection of the disease. Moreover, problems and challenges involved in current CVD studies are explored. Lastly, future prospects for CVD risk prediction and assessment using multi-modal, big-data integrative approaches are proposed.
  19. Damanhuri NS, Chiew YS, Othman NA, Docherty PD, Pretty CG, Shaw GM, et al.
    Comput Methods Programs Biomed, 2016 Jul;130:175-85.
    PMID: 27208532 DOI: 10.1016/j.cmpb.2016.03.025
    BACKGROUND: Respiratory system modelling can aid clinical decision making during mechanical ventilation (MV) in intensive care. However, spontaneous breathing (SB) efforts can produce entrained "M-wave" airway pressure waveforms that inhibit identification of accurate values for respiratory system elastance and airway resistance. A pressure wave reconstruction method is proposed to accurately identify respiratory mechanics, assess the level of SB effort, and quantify the incidence of SB effort without uncommon measuring devices or interruption to care.

    METHODS: Data from 275 breaths aggregated from all mechanically ventilated patients at Christchurch Hospital were used in this study. The breath-specific respiratory elastance is calculated using a time-varying elastance model. A pressure reconstruction method is proposed to reconstruct pressure waves identified as being affected by SB effort. The area under the curve of the time-varying respiratory elastance (AUC Edrs) is calculated and compared, where unreconstructed waves yield lower AUC Edrs. The difference between the reconstructed and unreconstructed pressure is used as a surrogate measure of SB effort.

    RESULTS: The pressure reconstruction method yielded a median AUC Edrs of 19.21 [IQR: 16.30-22.47] cmH2Os/l. In contrast, the median AUC Edrs for unreconstructed M-wave data was 20.41 [IQR: 16.68-22.81] cmH2Os/l. The pressure reconstruction method had the least variability in AUC Edrs as assessed by the robust coefficient of variation (RCV = 0.04 versus 0.05 for unreconstructed data). Each patient exhibited different levels of SB effort, independent of the MV settings, indicating the need for non-invasive, real-time assessment of SB effort.

    CONCLUSION: A simple reconstruction method enables more consistent real-time estimation of the true, underlying respiratory system mechanics of a SB patient and provides the surrogate of SB effort, which may be clinically useful for clinicians in determining optimal ventilator settings to improve patient care.
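
    The kind of model fit underlying the elastance estimates above can be sketched with a single-compartment equation, Paw(t) = E*V(t) + R*Q(t) + P0, identified by least squares over one breath; this is an illustrative simplification, not the time-varying elastance model or the reconstruction method of the paper:

      import numpy as np

      def identify_mechanics(paw, flow, dt):
          # paw: airway pressure samples, flow: flow samples, dt: sampling interval
          volume = np.cumsum(flow) * dt                 # integrate flow to obtain volume
          A = np.column_stack([volume, flow, np.ones_like(volume)])
          (E, R, P0), *_ = np.linalg.lstsq(A, paw, rcond=None)
          return E, R, P0                               # elastance, resistance, offset pressure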

  20. Redmond DP, Chiew YS, Major V, Chase JG
    Comput Methods Programs Biomed, 2019 Apr;171:67-79.
    PMID: 27697371 DOI: 10.1016/j.cmpb.2016.09.011
    Monitoring of respiratory mechanics is required for guiding patient-specific mechanical ventilation settings in critical care. Many models of respiratory mechanics perform poorly in the presence of variable patient effort. Typical modelling approaches either attempt to mitigate the effect of the patient effort on the airway pressure waveforms, or attempt to capture the size and shape of the patient effort. This work analyses a range of methods to identify respiratory mechanics in volume-controlled ventilation modes when there is patient effort. The models are compared using four datasets, each with a sample of 30 breaths before, and 2-3 minutes after, sedation has been administered. The sedation will reduce patient efforts, but the underlying pulmonary mechanical properties are unlikely to change during this short time. Model-identified parameters from breathing cycles with patient effort are compared to breathing cycles that do not have patient effort. All models have advantages and disadvantages, so model selection may be specific to the respiratory mechanics application. However, in general, the combined method of iterative interpolative pressure reconstruction and stacking multiple consecutive breaths together has the best performance over the datasets. The variability of identified elastance when there is patient effort is lowest with this method, and there is little systematic offset in identified mechanics when sedation is administered.
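
    The breath-stacking idea mentioned above can be sketched as follows: the regression matrices of several consecutive breaths are concatenated so that a single elastance/resistance pair is identified over all of them, which dilutes the influence of effort in any one breath. The single-compartment model and the data layout are illustrative assumptions, not the authors' iterative interpolative reconstruction:

      import numpy as np

      def identify_stacked(breaths, dt):
          # breaths: list of (paw, flow) sample arrays for consecutive breathing cycles
          rows, targets = [], []
          for paw, flow in breaths:
              volume = np.cumsum(flow) * dt
              rows.append(np.column_stack([volume, flow, np.ones_like(volume)]))
              targets.append(paw)
          A, b = np.vstack(rows), np.concatenate(targets)
          (E, R, P0), *_ = np.linalg.lstsq(A, b, rcond=None)
          return E, R, P0
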