Displaying publications 1 - 20 of 85 in total

  1. Ahmad M, Jung LT, Bhuiyan AA
    Comput Methods Programs Biomed, 2017 Oct;149:11-17.
    PMID: 28802326 DOI: 10.1016/j.cmpb.2017.06.021
    BACKGROUND AND OBJECTIVE: Digital signal processing techniques commonly employ fixed-length window filters to process signal contents. DNA signals differ in characteristics from common digital signals since they carry nucleotides as contents. The nucleotides carry genetic code context and exhibit fuzzy behaviours owing to their particular structure and order in the DNA strand. Employing conventional fixed-length window filters for DNA signal processing produces spectral leakage and hence results in signal noise. A biological-context-aware adaptive window filter is required to process DNA signals.

    METHODS: This paper introduces a biologically inspired fuzzy adaptive window median filter (FAWMF) which computes the fuzzy membership strength of nucleotides in each sliding window and filters nucleotides based on median filtering with a combination of s-shaped and z-shaped filters. Since coding regions exhibit 3-base periodicity arising from an unbalanced nucleotide distribution that biases nucleotide usage, this fundamental characteristic of nucleotides has been exploited in FAWMF to suppress signal noise.

    RESULTS: Along with the adaptive response of FAWMF, a strong correlation between the median nucleotides and the Π-shaped filter was observed, which produced enhanced discrimination between coding and non-coding regions compared with fixed-length conventional window filters. The proposed FAWMF attains a significant enhancement in coding region identification, i.e. 40% to 125%, compared with other conventional window filters, tested on more than 250 benchmark and randomly selected DNA datasets of different organisms.

    CONCLUSION: This study proves that conventional fixed-length window filters applied to DNA signals do not achieve significant results, since the nucleotides carry genetic code context. The proposed FAWMF algorithm is adaptive and significantly outperforms them in processing DNA signal contents. Applied to a variety of DNA datasets, the algorithm produced noteworthy discrimination between coding and non-coding regions in contrast to conventional fixed-length window filters.
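
    The following Python sketch is an illustration only, not the authors' FAWMF: it combines a sliding-window median filter with z-shaped and s-shaped fuzzy membership functions of the kind the abstract mentions, applied to a numeric DNA indicator sequence. The window length and membership breakpoints are hypothetical choices.

        # Hedged sketch: fuzzy-weighted sliding-window median filtering of a numeric
        # DNA indicator signal; all parameters below are illustrative, not the paper's.
        import numpy as np

        def z_shaped(x, a=0.3, b=0.7):
            # z-shaped membership: 1 below a, 0 above b, smooth in between
            t = np.clip((x - a) / (b - a), 0.0, 1.0)
            return 1.0 - (3 * t**2 - 2 * t**3)

        def s_shaped(x, a=0.3, b=0.7):
            # s-shaped membership: complement of the z-shaped curve
            return 1.0 - z_shaped(x, a, b)

        def fuzzy_median_filter(signal, window=9, a=0.3, b=0.7):
            half = window // 2
            padded = np.pad(np.asarray(signal, dtype=float), half, mode="edge")
            out = np.empty(len(signal))
            for i in range(len(signal)):
                med = np.median(padded[i:i + window])
                # attenuate values with weak "coding-like" membership to suppress noise
                out[i] = med * s_shaped(med, a, b)
            return out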

  2. Cheong JKK, Yap S, Ooi ET, Ooi EH
    Comput Methods Programs Biomed, 2019 Jul;176:17-32.
    PMID: 31200904 DOI: 10.1016/j.cmpb.2019.04.028
    BACKGROUND AND OBJECTIVES: Recently, there have been calls for RFA to be implemented in the bipolar mode for cancer treatment due to the benefits it offers over the monopolar mode. These include the ability to prevent skin burns at the grounding pad and to avoid tumour track seeding. However, the use of bipolar RFA in clinical practice remains uncommon, as few research studies have been carried out on bipolar RFA. As such, there is still uncertainty in understanding the effects of different RF probe configurations on the treatment outcome of RFA. This paper demonstrates that the electrode lengths have a strong influence on the mechanics of bipolar RFA. The information obtained here may lead to further optimization of the system for subsequent use in hospitals.

    METHODS: A 2D model in the axisymmetric coordinates was developed to simulate the electro-thermophysiological responses of the tissue during a single probe bipolar RFA. Two different probe configurations were considered, namely the configuration where the active electrode is longer than the ground and the configuration where the ground electrode is longer than the active. The mathematical model was first verified with an existing experimental study found in the literature.

    RESULTS: Results from the simulations showed that heating is confined only to the region around the shorter electrode, regardless of whether the shorter electrode is the active or the ground. Consequently, thermal coagulation also occurs in the region surrounding the shorter electrode. This opened up the possibility for a better customized treatment through the development of RF probes with adjustable electrode lengths.

    CONCLUSIONS: The electrode length was found to play a significant role in the outcome of single-probe bipolar RFA. In particular, the length of the shorter electrode becomes the limiting factor that influences the mechanics of single-probe bipolar RFA. Results from this study can be used to further develop and optimize bipolar RFA as an effective and reliable cancer treatment technique.

  3. Yildirim O, Baloglu UB, Tan RS, Ciaccio EJ, Acharya UR
    Comput Methods Programs Biomed, 2019 Jul;176:121-133.
    PMID: 31200900 DOI: 10.1016/j.cmpb.2019.05.004
    BACKGROUND AND OBJECTIVE: For diagnosis of arrhythmic heart problems, electrocardiogram (ECG) signals should be recorded and monitored. The long-term signal records obtained are analyzed by expert cardiologists. Devices such as the Holter monitor have limited hardware capabilities. For improved diagnostic capacity, it would be helpful to detect arrhythmic signals automatically. In this study, a novel approach is presented as a candidate solution for these issues.

    METHODS: A convolutional auto-encoder (CAE) based nonlinear compression structure is implemented to reduce the signal size of arrhythmic beats. Long short-term memory (LSTM) classifiers are employed to automatically recognize arrhythmias using ECG features that are deeply coded with the CAE network.

    RESULTS: Based upon the coded ECG signals, both the storage requirement and the classification time were considerably reduced. In experimental studies conducted with the MIT-BIH arrhythmia database, ECG signals were compressed with an average percentage root-mean-square difference (PRD) of 0.70%, and an accuracy of over 99.0% was observed.

    CONCLUSIONS: One of the significant contributions of this study is that the proposed approach can significantly reduce the time required when using LSTM networks for data analysis. Thus, a novel and effective approach was proposed for both ECG signal compression and high-performance automatic recognition, at very low computational cost.
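
    As a hedged illustration of the kind of pipeline described above (not the authors' architecture), the Keras sketch below builds a small 1-D convolutional auto-encoder whose coded features feed an LSTM classifier. The beat length, layer sizes and class count are hypothetical.

        # Sketch only: CAE for beat compression + LSTM classifier on the coded features.
        from tensorflow.keras import layers, models

        beat_len, n_classes = 260, 5          # assumed beat length and class count

        # Encoder: compress each beat into a short coded representation
        inp = layers.Input(shape=(beat_len, 1))
        x = layers.Conv1D(16, 5, padding="same", activation="relu")(inp)
        x = layers.MaxPooling1D(2)(x)
        x = layers.Conv1D(8, 5, padding="same", activation="relu")(x)
        code = layers.MaxPooling1D(2)(x)      # shape (beat_len // 4, 8)

        # Decoder: used only to train the auto-encoder to reconstruct the beat
        y = layers.Conv1D(8, 5, padding="same", activation="relu")(code)
        y = layers.UpSampling1D(2)(y)
        y = layers.Conv1D(16, 5, padding="same", activation="relu")(y)
        y = layers.UpSampling1D(2)(y)
        out = layers.Conv1D(1, 5, padding="same")(y)
        autoencoder = models.Model(inp, out)
        autoencoder.compile(optimizer="adam", loss="mse")

        # LSTM classifier operating on the coded representation
        encoder = models.Model(inp, code)
        clf_in = layers.Input(shape=(beat_len // 4, 8))
        h = layers.LSTM(32)(clf_in)
        clf_out = layers.Dense(n_classes, activation="softmax")(h)
        classifier = models.Model(clf_in, clf_out)
        classifier.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                           metrics=["accuracy"])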

  4. Ibrahim RW, Hasan AM, Jalab HA
    Comput Methods Programs Biomed, 2018 Sep;163:21-28.
    PMID: 30119853 DOI: 10.1016/j.cmpb.2018.05.031
    BACKGROUND AND OBJECTIVES: Segmentation of brain tumors in MRI is challenging due to variations in the size, shape, location and feature intensity of the tumor. Active contours have been applied in MRI scan image segmentation due to their ability to produce regions with boundaries. The main difficulty encountered in active contour segmentation is boundary tracking, which is controlled by minimization of an energy function. Hence, this study proposes a novel fractional Wright function (FWF) as an energy-minimization technique to improve the performance of the active contour without edge method.

    METHOD: In this study, we implement the FWF as an energy-minimization function to replace the standard gradient-descent method in the Chan-Vese segmentation technique. The proposed FWF is used to find the boundaries of an object by controlling the inside and outside values of the contour. An objective evaluation is used to distinguish the differences between the processed segmented images and the ground truth using a set of statistical parameters: true positives, true negatives, false positives, and false negatives.

    RESULTS: The FWF as a minimization of energy was successfully implemented on BRATS 2013 image dataset. The achieved overall average sensitivity score of the brain tumors segmentation was 94.8 ± 4.7%.

    CONCLUSIONS: The results demonstrate that the proposed FWF method minimized the energy function better than the gradient-descent method used in the original three-dimensional active contour without edge (3DACWE) method.

  5. Hariharan M, Polat K, Sindhu R
    Comput Methods Programs Biomed, 2014 Mar;113(3):904-13.
    PMID: 24485390 DOI: 10.1016/j.cmpb.2014.01.004
    Elderly people are commonly affected by Parkinson's disease (PD), one of the most common neurodegenerative disorders, which is caused by the loss of dopamine-producing brain cells. People with PD (PWP) may have difficulty in walking, talking or completing other simple tasks. A variety of medications is available to treat PD. Recently, researchers have found that voice signals recorded from PWP are a useful tool to differentiate them from healthy controls. Several dysphonia features, feature reduction/selection techniques and classification algorithms have been proposed in the literature to detect PD. In this paper, a hybrid intelligent system is proposed which includes feature pre-processing using model-based clustering (Gaussian mixture model); feature reduction/selection using principal component analysis (PCA), linear discriminant analysis (LDA), sequential forward selection (SFS) and sequential backward selection (SBS); and classification using three supervised classifiers: least-squares support vector machine (LS-SVM), probabilistic neural network (PNN) and general regression neural network (GRNN). The PD dataset from the University of California, Irvine (UCI) machine learning repository was used. The strength of the proposed method has been evaluated through several performance measures. The experimental results show that the combination of feature pre-processing, feature reduction/selection methods and classification gives a maximum classification accuracy of 100% for the Parkinson's dataset.
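
    A hedged scikit-learn sketch of one branch of the pipeline above: standardisation, PCA-based feature reduction and a supervised classifier evaluated with cross-validation on the UCI Parkinson's voice features. LS-SVM, PNN and GRNN are not available in scikit-learn, so a standard RBF support vector machine stands in, and the model-based (GMM) pre-processing step is omitted.

        # Sketch under the assumptions above: PCA + RBF-SVM with 10-fold cross-validation.
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        def evaluate(X, y):
            # X: dysphonia feature matrix, y: PD / healthy labels
            model = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
            return cross_val_score(model, X, y, cv=10).mean()   # mean 10-fold accuracy
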
  6. Abdar M, Książek W, Acharya UR, Tan RS, Makarenkov V, Pławiak P
    Comput Methods Programs Biomed, 2019 Oct;179:104992.
    PMID: 31443858 DOI: 10.1016/j.cmpb.2019.104992
    BACKGROUND AND OBJECTIVE: Coronary artery disease (CAD) is one of the most common diseases around the world. An early and accurate diagnosis of CAD allows timely administration of appropriate treatment and helps to reduce mortality. Herein, we describe an innovative machine learning methodology that enables an accurate detection of CAD and apply it to data collected from Iranian patients.

    METHODS: We first tested ten traditional machine learning algorithms, and then the three best-performing algorithms (three types of SVM) were used in the rest of the study. To improve the performance of these algorithms, data preprocessing with normalization was carried out. Moreover, a genetic algorithm and particle swarm optimization, coupled with stratified 10-fold cross-validation, were used twice: for optimization of classifier parameters and for parallel selection of features.

    RESULTS: The presented approach enhanced the performance of all traditional machine learning algorithms used in this study. We also introduced a new optimization technique called the N2Genetic optimizer (a new genetic training). Our experiments demonstrated that N2Genetic-nuSVM provided an accuracy of 93.08% and an F1-score of 91.51% when predicting CAD outcomes among the patients included in the well-known Z-Alizadeh Sani dataset. These results are competitive with and comparable to the best results in the field.

    CONCLUSIONS: We showed that machine learning techniques optimized by the proposed approach can lead to highly accurate models intended for both clinical and research use.
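
    The sketch below is only an approximation of the approach described above: a nu-SVM evaluated with stratified 10-fold cross-validation, with a plain grid search standing in for the N2Genetic / particle swarm optimisation of classifier parameters. The parameter ranges are illustrative.

        # Sketch only: grid search replaces the paper's GA/PSO parameter optimisation.
        from sklearn.svm import NuSVC
        from sklearn.model_selection import GridSearchCV, StratifiedKFold

        def tune_nu_svm(X, y):
            grid = {"nu": [0.1, 0.25, 0.5], "gamma": ["scale", 0.01, 0.1]}
            cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
            search = GridSearchCV(NuSVC(), grid, cv=cv, scoring="f1")
            search.fit(X, y)                     # X: features, y: CAD / no-CAD labels
            return search.best_estimator_, search.best_score_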

  7. Ibrahim F, Taib MN, Abas WA, Guan CC, Sulaiman S
    Comput Methods Programs Biomed, 2005 Sep;79(3):273-81.
    PMID: 15925426
    Dengue fever (DF) is an acute febrile viral disease frequently presenting with headache, bone, joint and muscular pains, and rash. A significant percentage of DF patients develop a more severe form of the disease, known as dengue haemorrhagic fever (DHF). DHF is the complication of DF. The main pathophysiology of DHF is the development of plasma leakage from the capillaries, resulting in haemoconcentration, ascites, and pleural effusion that may lead to shock following defervescence of fever. Therefore, accurate prediction of the day of defervescence of fever is critical for clinicians to decide on a patient management strategy. To date, no known literature describes any attempt to predict the day of defervescence of fever in DF patients. This paper describes a non-invasive system for predicting the day of defervescence of fever in dengue patients using an artificial neural network. The developed system bases its prediction solely on clinical symptoms and signs and uses a multilayer feed-forward neural network (MFNN). The results show that the proposed system is able to predict the day of defervescence in dengue patients with 90% prediction accuracy.
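
    A minimal sketch of a multilayer feed-forward network of the kind described above, using scikit-learn's MLPClassifier on coded clinical symptoms and signs; the hidden-layer size, train/test split and feature encoding are assumptions, not the authors' configuration.

        # Hedged sketch: small feed-forward network on clinical symptom/sign features.
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split

        def train_defervescence_model(X, y):
            # X: coded symptoms and signs per patient-day, y: defervescence indicator
            X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
            net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
            net.fit(X_tr, y_tr)
            return net, net.score(X_te, y_te)    # held-out accuracy
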
  8. Khan DM, Yahya N, Kamel N, Faye I
    Comput Methods Programs Biomed, 2023 Jan;228:107242.
    PMID: 36423484 DOI: 10.1016/j.cmpb.2022.107242
    BACKGROUND AND OBJECTIVE: Brain connectivity plays a pivotal role in understanding the brain's information processing functions by providing various details including the magnitude, direction, and temporal dynamics of inter-neuron connections. While connectivity may be classified as structural, functional and causal, a complete in-vivo directional analysis is guaranteed only by the latter, which is referred to as Effective Connectivity (EC). The two most widely used EC techniques are the Directed Transfer Function (DTF) and Partial Directed Coherence (PDC), which are based on multivariate autoregressive models. The drawbacks of these techniques include poor frequency resolution and the requirement for an experimental approach to determine the signal normalization and thresholding techniques used in identifying significant connectivities between multivariate sources.

    METHODS: In this study, the drawbacks of DTF and PDC are addressed by proposing a novel technique, termed Efficient Effective Connectivity (EEC), for the estimation of EC between multivariate sources using AR spectral estimation and the Granger causality principle. In EEC, a linear predictive filter with AR coefficients obtained via multivariate EEG is used for signal prediction. This leads to the estimation of full-length signals, which are then transformed into the frequency domain using the Burg spectral estimation method. Furthermore, the newly proposed normalization method addresses the effect on each source in EEC using the sum of the maximum connectivity values over the entire frequency range. Lastly, the proposed dynamic thresholding works by subtracting the first moment of the causal effects of all sources on one source from the individual connections present for that source.

    RESULTS: The proposed method is evaluated using synthetic and real resting-state EEG of 46 healthy controls. A 3D-Convolutional Neural Network is trained and tested using the PDC and EEC samples. The result indicates that compared to PDC, EEC improves the EEG eye-state classification accuracy, sensitivity and specificity by 5.57%, 3.15% and 8.74%, respectively.

    CONCLUSION: Correct identification of all connections in synthetic data and improved resting-state classification performance using EEC proved that EEC gives a better estimation of directed causality, and indicate that it can be used for a reliable understanding of brain mechanisms. In conclusion, the proposed technique may open up new research dimensions for the clinical diagnosis of mental disorders.

  9. Faust O, Razaghi H, Barika R, Ciaccio EJ, Acharya UR
    Comput Methods Programs Biomed, 2019 Jul;176:81-91.
    PMID: 31200914 DOI: 10.1016/j.cmpb.2019.04.032
    BACKGROUND AND OBJECTIVE: Sleep is an important part of our life. That importance is highlighted by the multitude of health problems which result from sleep disorders. Detecting these sleep disorders requires an accurate interpretation of physiological signals. Prerequisite for this interpretation is an understanding of the way in which sleep stage changes manifest themselves in the signal waveform. With that understanding it is possible to build automated sleep stage scoring systems. Apart from their practical relevance for automating sleep disorder diagnosis, these systems provide a good indication of the amount of sleep stage related information communicated by a specific physiological signal.

    METHODS: This article provides a comprehensive review of automated sleep stage scoring systems, which were created since the year 2000. The systems were developed for Electrocardiogram (ECG), Electroencephalogram (EEG), Electrooculogram (EOG), and a combination of signals.

    RESULTS: Our review shows that all of these signals contain information for sleep stage scoring.

    CONCLUSIONS: The result is important, because it allows us to shift our research focus away from information extraction methods to systemic improvements, such as patient comfort, redundancy, safety and cost.

  10. Al-Qaysi ZT, Zaidan BB, Zaidan AA, Suzani MS
    Comput Methods Programs Biomed, 2018 Oct;164:221-237.
    PMID: 29958722 DOI: 10.1016/j.cmpb.2018.06.012
    CONTEXT: Intelligent wheelchair technology has recently been utilised to address several mobility problems. Techniques based on brain-computer interface (BCI) are currently used to develop electric wheelchairs. Using human brain control in wheelchairs for people with disability has elicited widespread attention due to its flexibility.

    OBJECTIVE: This study aims to determine the background of recent studies on wheelchair control based on BCI for disability and map the literature survey into a coherent taxonomy. The study intends to identify the most important aspects in this emerging field as an impetus for using BCI for disability in electric-powered wheelchair (EPW) control, which remains a challenge. The study also attempts to provide recommendations for solving other existing limitations and challenges.

    METHODS: We systematically searched all articles about EPW control based on BCI for disability in three popular databases: ScienceDirect, IEEE and Web of Science. These databases contain numerous articles that considerably influenced this field and cover most of the relevant theoretical and technical issues.

    RESULTS: We selected 100 articles on the basis of our inclusion and exclusion criteria. A large set of articles (55) discussed the development of real-time wheelchair control systems based on BCI for disability signals. Another set of articles (25) focused on analysing BCI for disability signals for wheelchair control. The third set of articles (14) considered the simulation of wheelchair control based on BCI for disability signals. Four articles designed a framework for wheelchair control based on BCI for disability signals. Finally, one article reviewed concerns regarding wheelchair control based on BCI for disability signals.

    DISCUSSION: Since 2007, researchers have pursued the possibility of using BCI for disability in EPW control through different approaches. Regardless of type, articles have focused on addressing limitations that impede the full efficiency of BCI for disability and recommended solutions for these limitations.

    CONCLUSIONS: Studies on wheelchair control based on BCI for disability considerably influence society due to the large number of people with disability. Therefore, we aim to provide researchers and developers with a clear understanding of this platform and highlight the challenges and gaps in the current and future studies.

  11. Faizal WM, Ghazali NNN, Badruddin IA, Zainon MZ, Yazid AA, Ali MAB, et al.
    Comput Methods Programs Biomed, 2019 Oct;180:105036.
    PMID: 31430594 DOI: 10.1016/j.cmpb.2019.105036
    Obstructive sleep apnea is one of the most common breathing disorders. Undiagnosed sleep apnea is a hidden health crisis for the patient and could raise the risk of heart disease, high blood pressure, depression and diabetes. Relaxation of the throat muscles (i.e., the tongue and soft palate) narrows the airway and causes blockage of the airway during breathing. To understand this phenomenon, the computational fluid dynamics (CFD) method has emerged as a handy tool for modeling and analysing airflow characteristics. The comprehensive fluid-structure interaction method provides realistic visualization of the airflow and its interaction with the throat muscles. Thus, this paper reviews the scientific work related to fluid-structure interaction (FSI) for the evaluation of obstructive sleep apnea using computational techniques. In total, 102 articles were analyzed; each article was evaluated based on elements related to the fluid-structure interaction of sleep apnea via computational techniques. In this review, the significance of FSI for the evaluation of obstructive sleep apnea is critically examined. Then the flow properties, boundary conditions and validation of the models are given due consideration to present a broad perspective of CFD applied to the study of sleep apnea. Finally, the challenges of FSI simulation methods are also highlighted.
  12. Mohd Faizal AS, Thevarajah TM, Khor SM, Chang SW
    Comput Methods Programs Biomed, 2021 Aug;207:106190.
    PMID: 34077865 DOI: 10.1016/j.cmpb.2021.106190
    Cardiovascular disease (CVD) is the leading cause of death worldwide and is a global health issue. Traditionally, statistical models are commonly used in the risk prediction and assessment of CVD. However, the adoption of artificial intelligence (AI) approaches is rapidly taking hold in the current era of technology to evaluate patient risks and predict the outcome of CVD. In this review, we outline various conventional risk scores and prediction models and compare them with the AI approach. The strengths and limitations of both conventional and AI approaches are discussed. Besides that, biomarker discovery related to CVD is also elucidated, as biomarkers can be used in risk stratification as well as early detection of the disease. Moreover, problems and challenges involved in current CVD studies are explored. Lastly, future prospects of CVD risk prediction and assessment using multi-modal, big-data integrative approaches are proposed.
  13. Alsalem MA, Zaidan AA, Zaidan BB, Hashim M, Madhloom HT, Azeez ND, et al.
    Comput Methods Programs Biomed, 2018 May;158:93-112.
    PMID: 29544792 DOI: 10.1016/j.cmpb.2018.02.005
    CONTEXT: Acute leukaemia diagnosis is a field requiring automated solutions, tools and methods and the ability to facilitate early detection and even prediction. Many studies have focused on the automatic detection and classification of acute leukaemia and its subtypes to enable highly accurate diagnosis.

    OBJECTIVE: This study aimed to review and analyse the literature related to the detection and classification of acute leukaemia. The factors considered to improve understanding of the field's various contextual aspects were the motivations and characteristics of published studies, the open challenges that confronted researchers, and the recommendations presented to researchers to enhance this vital research area.

    METHODS: We systematically searched all articles about the classification and detection of acute leukaemia, as well as their evaluation and benchmarking, in three main databases: ScienceDirect, Web of Science and IEEE Xplore from 2007 to 2017. These indices were considered to be sufficiently extensive to encompass our field of literature.

    RESULTS: Based on our inclusion and exclusion criteria, 89 articles were selected. Most studies (58/89) focused on the methods or algorithms of acute leukaemia classification, a number of papers (22/89) covered systems developed for the detection or diagnosis of acute leukaemia, and a few papers (5/89) presented evaluation and comparative studies. The smallest portion (4/89) of articles comprised reviews and surveys.

    DISCUSSION: Acute leukaemia diagnosis, which is a field requiring automated solutions, tools and methods, entails the ability to facilitate early detection or even prediction. Many studies have been performed on the automatic detection and classification of acute leukaemia and their subtypes to promote accurate diagnosis.

    CONCLUSIONS: Research areas on medical-image classification vary, but they are all equally vital. We expect this systematic review to help emphasise current research opportunities and thus extend and create additional research fields.

  14. Palaniappan R, Sundaraj K, Sundaraj S
    Comput Methods Programs Biomed, 2017 Jul;145:67-72.
    PMID: 28552127 DOI: 10.1016/j.cmpb.2017.04.013
    BACKGROUND: The monitoring of the respiratory rate is vital in several medical conditions, including sleep apnea because patients with sleep apnea exhibit an irregular respiratory rate compared with controls. Therefore, monitoring the respiratory rate by detecting the different breath phases is crucial.

    OBJECTIVES: This study aimed to segment the breath cycles from pulmonary acoustic signals using the newly developed adaptive neuro-fuzzy inference system (ANFIS) based on breath phase detection and to subsequently evaluate the performance of the system.

    METHODS: The normalised averaged power spectral density for each segment was fuzzified, and a set of fuzzy rules was formulated. The ANFIS was developed to detect the breath phases and subsequently perform breath cycle segmentation. To evaluate the performance of the proposed method, the root mean square error (RMSE) and correlation coefficient values were calculated and analysed, and the proposed method was then validated using data collected at KIMS Hospital and the RALE standard dataset.

    RESULTS: The analysis of the correlation coefficient of the neuro-fuzzy model, which was performed to evaluate its performance, revealed a correlation strength of r = 0.9925, and the RMSE for the neuro-fuzzy model was found to equal 0.0069.

    CONCLUSION: The proposed neuro-fuzzy model performs better than the fuzzy inference system (FIS) in detecting the breath phases and segmenting the breath cycles, and requires fewer rules than the FIS.
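
    The sketch below illustrates only the front-end step mentioned in the METHODS above: an averaged power spectral density per signal segment, normalised before fuzzification. The sampling rate and segment parameters are placeholders, and the ANFIS rule base itself is not reproduced.

        # Hedged sketch: normalised averaged PSD of a pulmonary acoustic segment.
        from scipy.signal import welch

        def normalised_psd(segment, fs=8000):
            # Welch's method averages periodograms over sub-windows of the segment
            freqs, psd = welch(segment, fs=fs, nperseg=256)
            psd = psd / psd.max()     # scale to [0, 1] so membership functions share one range
            return freqs, psd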

  15. Sidibé D, Sankar S, Lemaître G, Rastgoo M, Massich J, Cheung CY, et al.
    Comput Methods Programs Biomed, 2017 Feb;139:109-117.
    PMID: 28187882 DOI: 10.1016/j.cmpb.2016.11.001
    This paper proposes a method for automatic classification of spectral domain OCT data for the identification of patients with retinal diseases such as Diabetic Macular Edema (DME). We address this issue as an anomaly detection problem and propose a method that not only allows the classification of the OCT volume, but also allows the identification of the individual diseased B-scans inside the volume. Our approach is based on modeling the appearance of normal OCT images with a Gaussian Mixture Model (GMM) and detecting abnormal OCT images as outliers. The classification of an OCT volume is based on the number of detected outliers. Experimental results with two different datasets show that the proposed method achieves a sensitivity and a specificity of 80% and 93% on the first dataset, and 100% and 80% on the second one. Moreover, the experiments show that the proposed method achieves better classification performance than other recently published works.
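
    A hedged sketch of GMM-based outlier detection along the lines described above: fit a Gaussian mixture on features of normal B-scans, flag low-likelihood B-scans as outliers, and label a volume from the outlier count. The component count, threshold percentile and outlier cut-off are placeholders, not the authors' settings, and the feature extraction step is assumed to happen elsewhere.

        # Sketch under the assumptions above, using scikit-learn's GaussianMixture.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        def fit_normal_model(normal_features, n_components=4):
            gmm = GaussianMixture(n_components=n_components, random_state=0).fit(normal_features)
            threshold = np.percentile(gmm.score_samples(normal_features), 5)  # low-likelihood cut
            return gmm, threshold

        def classify_volume(gmm, threshold, volume_features, max_outliers=2):
            # one feature vector per B-scan; outliers are B-scans below the likelihood threshold
            outliers = gmm.score_samples(volume_features) < threshold
            label = "diseased" if outliers.sum() > max_outliers else "normal"
            return label, np.where(outliers)[0]
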
  16. Saleh MD, Eswaran C
    Comput Methods Programs Biomed, 2012 Oct;108(1):186-96.
    PMID: 22551841 DOI: 10.1016/j.cmpb.2012.03.004
    Diabetic retinopathy (DR) has become a serious threat in our society, causing 45% of legal blindness among diabetic patients. Early detection as well as periodic screening of DR helps in reducing the progression of this disease and in preventing the subsequent loss of visual capability. This paper provides an automated diagnosis system for DR integrated with a user-friendly interface. The grading of the severity level of DR is based on detecting and analyzing the early clinical signs associated with the disease, such as microaneurysms (MAs) and hemorrhages (HAs). The system extracts some retinal features, such as the optic disc, fovea, and retinal tissue, for easier segmentation of dark-spot lesions in the fundus images. This is followed by classification of the correctly segmented spots into MAs and HAs. Based on the number and location of MAs and HAs, the system quantifies the severity level of DR. A database of 98 color images is used in order to evaluate the performance of the developed system. From the experimental results, it is found that the proposed system achieves sensitivities of 84.31% and 87.53% for the detection of MAs and HAs respectively. In terms of specificity, the system achieves 93.63% and 95.08% for the detection of MAs and HAs respectively. Also, the proposed system achieves kappa coefficients of 68.98% and 74.91% for the detection of MAs and HAs respectively. Moreover, the system yields sensitivity and specificity values of 89.47% and 95.65% for the classification of DR versus normal.
  17. Mak NL, Ooi EH, Lau EV, Ooi ET, Pamidi N, Foo JJ, et al.
    Comput Methods Programs Biomed, 2022 Dec;227:107195.
    PMID: 36323179 DOI: 10.1016/j.cmpb.2022.107195
    BACKGROUND AND OBJECTIVES: Thermochemical ablation (TCA) is a thermal ablation technique involving the injection of acid and base, either sequentially or simultaneously, into the target tissue. TCA remains at the conceptual stage with existing studies unable to provide recommendations on the optimum injection rate, and reagent concentration and volume. Limitations in current experimental methodology have prevented proper elucidation of the thermochemical processes inside the tissue during TCA. Nevertheless, the computational TCA framework developed recently by Mak et al. [Mak et al., Computers in Biology and Medicine, 2022, 145:105494] has opened new avenues in the development of TCA. Specifically, a recommended safe dosage is imperative in driving TCA research beyond the conceptual stage.

    METHODS: The aforesaid computational TCA framework for sequential injection was applied and adapted to simulate TCA with simultaneous injection of acid and base at equimolar concentrations and equal volumes. The developed framework, which describes the flow of acid and base, their neutralisation, the rise in tissue temperature and the formation of thermal damage, was solved numerically using the finite element method. The framework was used to investigate the effects of injection rate, reagent concentration, volume and type (weak/strong acid-base combination) on temperature rise and thermal coagulation formation.

    RESULTS: A higher injection rate resulted in higher temperature rise and larger thermal coagulation. Reagent concentration of 7500 mol/m3 was found to be optimum in producing considerable thermal coagulation without the risk of tissue overheating. Thermal coagulation volume was found to be consistently larger than the total volume of acid and base injected into the tissue, which is beneficial as it reduces the risk of chemical burn injury. Three multivariate second-order polynomials that express the targeted coagulation volume as functions of injection rate and reagent volume, for the weak-weak, weak-strong and strong-strong acid-base combinations were also derived based on the simulated data.

    CONCLUSIONS: A guideline for a safe and effective implementation of TCA with simultaneous injection of acid and base was recommended based on the numerical results of the computational model developed. The guideline correlates the coagulation volume with the reagent volume and injection rate, and may be used by clinicians in determining the safe dosage of reagents and optimum injection rate to achieve a desired thermal coagulation volume during TCA.
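
    As a rough illustration of the dose-guideline idea in the RESULTS above, the sketch below fits a multivariate second-order polynomial expressing coagulation volume as a function of injection rate and reagent volume with scikit-learn; the data arrays are placeholders and the paper's fitted coefficients are not reproduced.

        # Sketch only: quadratic response surface, coagulation volume vs (rate, reagent volume).
        import numpy as np
        from sklearn.preprocessing import PolynomialFeatures
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline

        def fit_dose_surface(injection_rate, reagent_volume, coag_volume):
            X = np.column_stack([injection_rate, reagent_volume])
            model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
            model.fit(X, coag_volume)
            # model.predict(np.array([[rate, volume]])) estimates the coagulation volume
            return model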

  18. Jamil DF, Saleem S, Roslan R, Al-Mubaddel FS, Rahimi-Gorji M, Issakhov A, et al.
    Comput Methods Programs Biomed, 2021 May;203:106044.
    PMID: 33756187 DOI: 10.1016/j.cmpb.2021.106044
    BACKGROUND AND OBJECTIVE: Arterial diseases can lead to several serious disorders in the cardiovascular system, such as atherosclerosis. These disorders are mainly caused by the presence of fatty deposits, cholesterol and lipoproteins inside the blood vessels. This paper deals with the analysis of non-Newtonian magnetic blood flow in an inclined stenosed artery.

    METHODS: The Casson fluid was used to model blood flowing under the influence of a uniformly distributed magnetic field and an oscillating pressure gradient. The governing fractional differential equations were expressed using the Caputo-Fabrizio fractional derivative without singular kernel.

    RESULTS: The analytical solutions of the velocities for the non-Newtonian model were then calculated by means of Laplace and finite Hankel transforms. These velocities were then presented graphically. The results show that the velocity increases with the Reynolds number and the Casson parameter, while it decreases when the Hartmann number increases.

    CONCLUSIONS: Casson blood was treated as the non-Newtonian fluid. The MHD blood flow was accelerated by the pressure gradient. These findings are beneficial for studying atherosclerosis therapy and the diagnosis and therapeutic treatment of some medical problems.

  19. Acharya UR, Faust O, Ciaccio EJ, Koh JEW, Oh SL, Tan RS, et al.
    Comput Methods Programs Biomed, 2019 Jul;175:163-178.
    PMID: 31104705 DOI: 10.1016/j.cmpb.2019.04.018
    BACKGROUND AND OBJECTIVE: Complex fractionated atrial electrograms (CFAE) may contain information concerning the electrophysiological substrate of atrial fibrillation (AF); therefore they are of interest for guiding catheter ablation treatment of AF. Electrogram signals are shaped by activation events, which are dynamical in nature. This makes it difficult to establish which signal properties can provide insight into the ablation site location. Nonlinear measures may provide additional information. To test this hypothesis, we used nonlinear measures to analyze CFAE.

    METHODS: CFAE from several atrial sites, recorded for a duration of 16 s, were acquired from 10 patients with persistent and 9 patients with paroxysmal AF. These signals were appraised using non-overlapping windows of 1-, 2- and 4-s durations. The resulting data sets were analyzed with Recurrence Plots (RP) and Recurrence Quantification Analysis (RQA). The data was also quantified via entropy measures.

    RESULTS: RQA exhibited unique plots for persistent versus paroxysmal AF. Similar patterns were observed to be repeated throughout the RPs. Trends were consistent for signal segments of 1, 2 and 4 s in duration. This suggests that the underlying signal-generation process is also repetitive, and that this repetitiveness can be detected even in 1-s sequences. The results also showed that most entropy metrics exhibited higher measurement values (closer to equilibrium) for persistent AF data. It was also found that Determinism (DET), Trapping Time (TT), and Modified Multiscale Entropy (MMSE), extracted from signals acquired at the posterior atrial free wall, are highly discriminative of persistent versus paroxysmal AF data.

    CONCLUSIONS: Short data sequences are sufficient to provide information to discern persistent versus paroxysmal AF data with a significant difference, and can be useful to detect repeating patterns of atrial activation.
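
    For readers unfamiliar with Recurrence Quantification Analysis, the sketch below computes a recurrence plot and the Determinism (DET) measure for a scalar series in a simplified, non-embedded form; standard RQA uses delay embedding and further measures, and this is not the authors' analysis pipeline. The threshold and minimum line length are illustrative.

        # Hedged sketch: recurrence matrix and DET for a scalar series.
        import numpy as np

        def recurrence_plot(x, eps):
            # 1 where two samples are within eps of each other, else 0
            x = np.asarray(x, dtype=float)
            return (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

        def determinism(R, l_min=2):
            # fraction of recurrence points lying on diagonal lines of length >= l_min
            lengths = []
            for k in range(1, R.shape[0]):                       # upper-triangle diagonals
                run = 0
                for v in np.append(np.diagonal(R, offset=k), 0): # trailing 0 closes last run
                    if v:
                        run += 1
                    elif run:
                        lengths.append(run)
                        run = 0
            lengths = np.array(lengths)
            total = lengths.sum()
            return float(lengths[lengths >= l_min].sum() / total) if total else 0.0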

  20. Kazemipoor M, Hajifaraji M, Radzi CW, Shamshirband S, Petković D, Mat Kiah ML
    Comput Methods Programs Biomed, 2015 Jan;118(1):69-76.
    PMID: 25453384 DOI: 10.1016/j.cmpb.2014.10.006
    This research examines the precision of an adaptive neuro-fuzzy computing technique in estimating the anti-obesity property of a potent medicinal plant in a clinical dietary intervention. Even though a number of statistical approaches, such as SPSS analyses, have been proposed for modeling the estimation of anti-obesity properties in terms of reduction in body mass index (BMI), body fat percentage, and body weight, these models still have disadvantages, such as being very demanding in terms of calculation time. Since this is a crucial problem, in this paper a process was constructed which simulates the anti-obesity activity of caraway (Carum carvi), a traditional medicine, on obese women using the adaptive neuro-fuzzy inference system (ANFIS) method. The ANFIS results are compared with support vector regression (SVR) results using the root-mean-square error (RMSE) and coefficient of determination (R2). The experimental results show that an improvement in predictive accuracy and capability of generalization can be achieved by the ANFIS approach. The following statistical characteristics are obtained for BMI loss estimation: RMSE = 0.032118 and R2 = 0.9964 in ANFIS testing, and RMSE = 0.47287 and R2 = 0.361 in SVR testing. For fat loss estimation: RMSE = 0.23787 and R2 = 0.8599 in ANFIS testing, and RMSE = 0.32822 and R2 = 0.7814 in SVR testing. For weight loss estimation: RMSE = 0.00000035601 and R2 = 1 in ANFIS testing, and RMSE = 0.17192 and R2 = 0.6607 in SVR testing. Therefore, it can be applied for practical purposes.
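
    The two reported error measures can be computed for any pair of observed and predicted values (e.g. when comparing ANFIS and SVR outputs) with the short, generic helper below; it reproduces the metrics only, not the models or data of the study.

        # Generic helper: RMSE and coefficient of determination for predicted vs observed values.
        import numpy as np
        from sklearn.metrics import mean_squared_error, r2_score

        def report_errors(y_true, y_pred):
            rmse = np.sqrt(mean_squared_error(y_true, y_pred))
            return rmse, r2_score(y_true, y_pred)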