METHODS: We first tested ten traditional machine learning algorithms; the three best-performing algorithms (three types of SVM) were then used in the rest of the study. To improve the performance of these algorithms, data preprocessing with normalization was carried out. Moreover, a genetic algorithm and particle swarm optimization, coupled with stratified 10-fold cross-validation, were used twice: to optimize the classifier parameters and to select features in parallel.
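The abstract gives no implementation details, but the two ingredients of this step can be sketched in plain Python: a stratified k-fold splitter and a minimal real-valued genetic algorithm. The GA here maximizes a toy fitness function standing in for the SVM cross-validation score; all function names, operators and settings are illustrative assumptions, not the authors' N2Genetic optimizer.

```python
import random

def stratified_kfold(labels, k=10, seed=0):
    """Yield k (train_idx, test_idx) splits that preserve class ratios."""
    rng = random.Random(seed)
    by_class = {}
    for i, y in enumerate(labels):
        by_class.setdefault(y, []).append(i)
    folds = [[] for _ in range(k)]
    for idxs in by_class.values():
        rng.shuffle(idxs)
        for j, i in enumerate(idxs):
            folds[j % k].append(i)          # deal samples round-robin per class
    for t in range(k):
        test = sorted(folds[t])
        train = sorted(i for f in range(k) if f != t for i in folds[f])
        yield train, test

def genetic_search(fitness, bounds, pop=20, gens=30, seed=0):
    """Minimal real-valued GA: tournament selection, blend crossover,
    Gaussian mutation, with elitist tracking of the best individual."""
    rng = random.Random(seed)
    P = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    best = max(P, key=fitness)
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            a = max(rng.sample(P, 3), key=fitness)   # tournament of 3
            b = max(rng.sample(P, 3), key=fitness)
            child = [(x + y) / 2 for x, y in zip(a, b)]
            child = [min(max(c + rng.gauss(0, 0.1 * (hi - lo)), lo), hi)
                     for c, (lo, hi) in zip(child, bounds)]
            nxt.append(child)
        P = nxt
        cand = max(P, key=fitness)
        if fitness(cand) > fitness(best):
            best = cand
    return best
```

In the study's setting, `fitness` would train a nu-SVM with the candidate parameters on each stratified split and return the mean validation accuracy.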
RESULTS: The presented approach enhanced the performance of all traditional machine learning algorithms used in this study. We also introduced a new optimization technique called the N2Genetic optimizer (a new genetic training). Our experiments demonstrated that N2Genetic-nuSVM achieved an accuracy of 93.08% and an F1-score of 91.51% when predicting CAD outcomes among the patients in the well-known Z-Alizadeh Sani dataset. These results are competitive with the best results in the field.
CONCLUSIONS: We showed that machine-learning techniques optimized by the proposed approach can lead to highly accurate models intended for both clinical and research use.
METHODS: The proposed method uses a 2D contourlet transform and a set of texture features that are efficiently extracted from the transformed image. Then, the combination of a kernel discriminant analysis (KDA)-based feature reduction technique and analysis of variance (ANOVA)-based feature ranking technique was used, and the images were then classified into various stages of liver fibrosis.
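As a hedged illustration of the ANOVA-based ranking step, a one-way F-statistic can be computed per feature in plain Python. This is the generic textbook formulation (between-class over within-class variance), not the paper's exact pipeline, and the function name is illustrative.

```python
from statistics import mean

def anova_f_scores(X, y):
    """Rank features by the one-way ANOVA F-statistic.
    X: list of feature rows; y: class label per row.
    Higher F means the feature separates the classes better."""
    classes = sorted(set(y))
    n, d = len(X), len(X[0])
    scores = []
    for j in range(d):
        col = [row[j] for row in X]
        grand = mean(col)
        groups = [[col[i] for i in range(n) if y[i] == c] for c in classes]
        ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
        ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
        df_b, df_w = len(classes) - 1, n - len(classes)
        f = (ss_between / df_b) / (ss_within / df_w) if ss_within > 0 else float("inf")
        scores.append(f)
    return scores
```

Features would then be kept in descending order of score, with only the top-ranked ones fed to the classifier.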
RESULTS: Our 2D contourlet transform and texture-feature analysis approach achieved 91.46% accuracy in classifying the five stages of liver fibrosis, using only four features as input to the probabilistic neural network classifier. The same model achieved 92.16% sensitivity and 88.92% specificity. The evaluation was performed on a database of 762 ultrasound images spanning the five stages of liver fibrosis.
CONCLUSIONS: The findings suggest that the proposed method can be useful to automatically detect and classify liver fibrosis, which would greatly assist clinicians in making an accurate diagnosis.
METHODS: CFAE from several atrial sites, each recorded for 16 s, were acquired from 10 patients with persistent and 9 patients with paroxysmal AF. These signals were analysed using non-overlapping windows of 1-, 2- and 4-s duration. The resulting data sets were analyzed with Recurrence Plots (RP) and Recurrence Quantification Analysis (RQA), and also quantified via entropy measures.
RESULTS: RQA exhibited unique plots for persistent versus paroxysmal AF. Similar patterns were observed to repeat throughout the RPs. Trends were consistent for signal segments of 1-, 2- and 4-s duration. This suggests that the underlying signal-generation process is itself repetitive and that this repetitiveness can be detected even in 1-s sequences. The results also showed that most entropy metrics exhibited higher values (closer to equilibrium) for persistent AF data. It was also found that Determinism (DET), Trapping Time (TT), and Modified Multiscale Entropy (MMSE), extracted from signals acquired at the posterior atrial free wall, are highly discriminative of persistent versus paroxysmal AF data.
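A minimal sketch of the RQA quantities involved, assuming a scalar signal and a simple threshold recurrence rule with no phase-space embedding; neither assumption is specified in the abstract, and real RQA pipelines typically embed the signal first.

```python
def recurrence_matrix(x, eps):
    """Binary recurrence matrix: R[i][j] = 1 when |x[i] - x[j]| <= eps."""
    n = len(x)
    return [[1 if abs(x[i] - x[j]) <= eps else 0 for j in range(n)]
            for i in range(n)]

def determinism(R, lmin=2):
    """DET: fraction of recurrent points lying on diagonal lines of
    length >= lmin (main diagonal excluded, as is usual in RQA)."""
    n = len(R)
    total = in_lines = 0
    for k in range(1, n):                 # diagonals above the main one
        diag = [R[i][i + k] for i in range(n - k)]
        total += sum(diag)
        run = 0
        for v in diag + [0]:              # trailing sentinel closes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    in_lines += run
                run = 0
    return in_lines / total if total else 0.0
```

High DET indicates deterministic, repetitive dynamics; a perfectly periodic signal yields DET = 1, while a signal with no recurrences yields 0.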
CONCLUSIONS: Short data sequences are sufficient to discern persistent from paroxysmal AF with a significant difference, and can be useful for detecting repeating patterns of atrial activation.
METHODS: This paper introduces a biologically inspired fuzzy adaptive window median filter (FAWMF), which computes the fuzzy membership strength of nucleotides in each sliding window and filters nucleotides via median filtering with a combination of s-shaped and z-shaped filters. Since coding regions exhibit 3-base periodicity caused by an unbalanced nucleotide distribution (a relatively high bias in nucleotide usage), this fundamental characteristic has been exploited in FAWMF to suppress signal noise.
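For illustration, the 3-base periodicity cue and a fixed-length median baseline (the kind of conventional window filter FAWMF is compared against) can be sketched as follows. Using a single A-indicator mapping instead of the full four-indicator Voss mapping, and the chosen window sizes, are simplifying assumptions, not the paper's FAWMF.

```python
import cmath
from statistics import median

def period3_track(seq, win=30):
    """Sliding-window power of a binary A-indicator signal at the 1/3
    frequency; peaks in this track are the classic 3-base-periodicity
    marker of coding regions."""
    x = [1.0 if b == 'A' else 0.0 for b in seq]
    w = cmath.exp(-2j * cmath.pi / 3)     # unit root for period-3 component
    track = []
    for s in range(len(x) - win + 1):
        X = sum(x[s + n] * w ** n for n in range(win))
        track.append(abs(X) ** 2)
    return track

def median_smooth(track, k=5):
    """Fixed-length sliding median filter: the conventional baseline that
    the adaptive FAWMF is contrasted with."""
    h = k // 2
    return [median(track[max(0, i - h):i + h + 1]) for i in range(len(track))]
```

A repeat such as 'ACG' placed every codon produces a strong period-3 peak, while a uniform sequence does not; the median filter then suppresses isolated spikes in the track.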
RESULTS: Along with the adaptive response of FAWMF, a strong correlation between median nucleotides and the Π-shaped filter was observed, which produced enhanced discrimination between coding and non-coding regions compared with fixed-length conventional window filters. The proposed FAWMF attains a significant enhancement in coding-region identification, i.e. 40% to 125%, compared with other conventional window filters, tested on more than 250 benchmark and randomly selected DNA datasets from different organisms.
CONCLUSION: This study shows that conventional fixed-length window filters applied to DNA signals do not achieve significant results, since the nucleotides carry genetic-code context. The proposed FAWMF algorithm is adaptive and significantly outperforms them in processing DNA signal content. Applied to a variety of DNA datasets, the algorithm produced noteworthy discrimination between coding and non-coding regions compared with fixed-length conventional window filters.
METHODS: Eight scientific databases were selected, and the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) method was employed as the basis for conducting this systematic review and meta-analysis. In line with the main objective of this research, inclusion and exclusion criteria were applied to limit the investigation. To achieve a structured meta-analysis, all eligible articles were classified by authors, publication year, journal or conference, applied fuzzy methods, main objectives, problems and research gaps, tools used to model the fuzzy system, medical discipline, sample size, system inputs and outputs, findings and results, and finally the impact of the applied fuzzy methods on improving diagnosis. We then analyzed the results of these classifications to indicate the effect of fuzzy methods in decreasing the complexity of diagnosis.
RESULTS: The results of this study confirmed the effectiveness of applying different fuzzy methods to the disease-diagnosis process and offer new insights for researchers into which diseases have received the most attention. This will help identify diagnostic aspects of medical disciplines that are being neglected.
CONCLUSIONS: Overall, this systematic review provides an appropriate platform for further research by identifying the research needs in the domain of disease diagnosis.
METHODS: Thermomechanical damage indicators (maximum bone temperature, osteonecrosis diameter, osteonecrosis depth, maximum thrust force, and torque) were calculated using the finite element method under various margin heights (0.05-0.25 mm) and widths (0.02-0.26 mm). The simulation results were validated with experimental tests and previous research data.
RESULTS: Increasing the margin height raised the maximum bone temperature, osteonecrosis diameter, and osteonecrosis depth by at least 19.1%, 41.9%, and 59.6%, respectively. Thrust force and torque are highly sensitive to margin height: a higher margin height (0.21-0.25 mm) reduced the thrust force by 54.0% but increased the drilling torque by 142.2%. With increasing margin width, bone temperature, osteonecrosis diameter, and depth were 16.5%, 56.5%, and 81.4% lower, respectively. The minimum thrust force (11.1 N) and torque (41.9 Nmm) were produced with the largest margin width (0.26 mm). A margin height of 0.05-0.13 mm and a margin width of 0.22-0.26 mm produced the highest sum of weightage.
CONCLUSIONS: A surgical drill bit with a margin height of 0.05-0.13 mm and a margin width of 0.22-0.26 mm can produce minimum thermomechanical damage in cortical bone drilling. The insights regarding the suitable ranges for margin height and width from this study could be adopted in future research devoted to optimizing the margin of the existing surgical drill bit.
OBJECTIVE: This study aims to determine the background of recent studies on wheelchair control based on BCI for disability and map the literature survey into a coherent taxonomy. The study intends to identify the most important aspects in this emerging field as an impetus for using BCI for disability in electric-powered wheelchair (EPW) control, which remains a challenge. The study also attempts to provide recommendations for solving other existing limitations and challenges.
METHODS: We systematically searched all articles about EPW control based on BCI for disability in three popular databases: ScienceDirect, IEEE and Web of Science. These databases contain numerous articles that considerably influenced this field and cover most of the relevant theoretical and technical issues.
RESULTS: We selected 100 articles on the basis of our inclusion and exclusion criteria. A large set of articles (55) discussed the development of real-time wheelchair control systems based on BCI for disability signals. Another set of articles (25) focused on analysing BCI for disability signals for wheelchair control. A third set of articles (14) considered the simulation of wheelchair control based on BCI for disability signals. Four articles designed a framework for wheelchair control based on BCI for disability signals. Finally, one article reviewed concerns regarding wheelchair control based on BCI for disability signals.
DISCUSSION: Since 2007, researchers have pursued the possibility of using BCI for disability in EPW control through different approaches. Regardless of type, articles have focused on addressing limitations that impede the full efficiency of BCI for disability and recommended solutions for these limitations.
CONCLUSIONS: Studies on wheelchair control based on BCI for disability considerably influence society due to the large number of people with disability. Therefore, we aim to provide researchers and developers with a clear understanding of this platform and highlight the challenges and gaps in the current and future studies.
METHODS: These models utilized experimental data of wavelengths and hemoglobin concentrations in building highly accurate Genetic Algorithm/Support Vector Regression model (GA-SVR).
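The two validation metrics reported for the GA-SVR model, root mean square error and the correlation coefficient r, have straightforward reference implementations (function names are illustrative):

```python
from math import sqrt

def rmse(y_true, y_pred):
    """Root mean square error between measured and predicted values."""
    return sqrt(sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true))

def pearson_r(y_true, y_pred):
    """Pearson correlation coefficient between predictions and measurements."""
    n = len(y_true)
    mx, my = sum(y_true) / n, sum(y_pred) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(y_true, y_pred))
    sx = sqrt(sum((a - mx) ** 2 for a in y_true))
    sy = sqrt(sum((b - my) ** 2 for b in y_pred))
    return cov / (sx * sy)
```

An r near 1 (the abstract reports 99.85% and 99.84%) together with a small RMSE indicates close agreement between predicted and experimental hemoglobin concentrations.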
RESULTS: The developed methodology showed high accuracy, as indicated by the low root mean square error values of 4.65 × 10⁻⁴ and 4.62 × 10⁻⁴ for oxygenated and deoxygenated hemoglobin, respectively. In addition, the models exhibited correlation coefficients (r) of 99.85% and 99.84% for oxygenated and deoxygenated hemoglobin, validating the strong agreement between the predicted and experimental results.
CONCLUSIONS: Due to the accuracy and relative simplicity of the proposed models, we envisage that they will serve as important references for future studies on the optical properties of blood.
OBJECTIVE: This paper presents a rescue framework for the transfusion of the best convalescent plasma (CP) to the most critical patients with COVID-19 on the basis of biological requirements, using machine learning and novel MCDM methods.
METHOD: The proposed framework comprises two distinct and consecutive phases (testing and development). In the testing phase, ABO compatibility is assessed after classifying donors into the four blood types (A, B, AB and O) to indicate the suitability and safety of plasma for administration, thereby refining the tested-CP list repository. The development phase covers the patient and donor sides. On the patient side, prioritisation is performed using a contracted patient decision matrix constructed between the 'serological/protein biomarkers and ratio of the partial pressure of oxygen in arterial blood to fractional inspired oxygen' criteria and the patient list, based on a novel MCDM method known as the subjective and objective decision by opinion score method. The patients with the most urgent need are then classified into the four blood types and matched, on the donor side, with the tested-CP list from the testing phase. Thereafter, the tested-CP list is prioritised using the contracted CP decision matrix.
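The opinion-score MCDM method itself is not detailed in the abstract; as a hedged stand-in, the general shape of prioritisation over a decision matrix can be illustrated with simple additive weighting, i.e. normalise each criterion, then weight and sum (function name, weights and criteria are illustrative, not the paper's method):

```python
def saw_rank(matrix, weights, benefit):
    """Simple additive weighting over a decision matrix.
    matrix[i][j]: score of alternative i on criterion j
    weights[j]:   importance of criterion j
    benefit[j]:   True if larger is better for criterion j (else cost)."""
    cols = list(zip(*matrix))
    norm = []
    for j, col in enumerate(cols):
        lo, hi = min(col), max(col)
        span = (hi - lo) or 1.0
        if benefit[j]:
            norm.append([(v - lo) / span for v in col])   # larger is better
        else:
            norm.append([(hi - v) / span for v in col])   # smaller is better
    scores = [sum(w * norm[j][i] for j, w in enumerate(weights))
              for i in range(len(matrix))]
    order = sorted(range(len(matrix)), key=lambda i: -scores[i])
    return order, scores
```

In this setting a lower PaO2/FiO2 ratio would be a cost criterion (more urgent), while a serological biomarker could be a benefit criterion.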
RESULT: An intelligence-integrated concept is proposed to identify the most appropriate CP for the corresponding prioritised patients with COVID-19, helping doctors hasten treatment.
DISCUSSION: The proposed framework offers the benefit of providing effective care and of preventing the extremely rapidly spreading COVID-19 from further affecting patients and the medical sector.
METHODS: In this paper, we analyze four widespread deep learning models designed for the segmentation of three retinal fluids, outputting dense predictions on the RETOUCH challenge data. We aim to demonstrate how a patch-based approach can push the performance of each method. We also evaluate the methods on the OPTIMA challenge dataset to assess how well the networks generalize. The analysis is divided into two sections: a comparison of the four approaches, and the significance of patching the images.
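The patch-based approach rests on tiling each image into fixed-size windows before training; a minimal sketch, where the patch size and stride are illustrative and not the values used in the experiments:

```python
def extract_patches(img, ph, pw, stride):
    """Slide a ph x pw window over a 2D image (list of rows) with the
    given stride, returning the list of patches in row-major order."""
    H, W = len(img), len(img[0])
    patches = []
    for r in range(0, H - ph + 1, stride):
        for c in range(0, W - pw + 1, stride):
            patches.append([row[c:c + pw] for row in img[r:r + ph]])
    return patches
```

Training on such patches exposes the network to many more shape variations per scan than whole-image training, which is the effect the study attributes its gains to.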
RESULTS: The networks trained on the RETOUCH dataset exceed human performance. The analysis further generalized the best network by fine-tuning it, achieving a mean Dice similarity coefficient (DSC) of 0.85. Of the three fluid types, intraretinal fluid (IRF) is recognized best, with the highest DSC value of 0.922 achieved on the Spectralis dataset. Additionally, the highest average DSC score, 0.84, is achieved by the PaDeeplabv3+ model on the Cirrus dataset.
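The Dice similarity coefficient used throughout these results is twice the overlap between the predicted and reference masks divided by their total size:

```python
def dice(pred, truth):
    """Dice similarity coefficient between two binary masks
    (given here as flattened 0/1 lists)."""
    inter = sum(p and t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    return 2 * inter / total if total else 1.0   # both empty: perfect match
```
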
CONCLUSIONS: The proposed method segments the three retinal fluids with high DSC values. Fine-tuning networks trained on the RETOUCH dataset makes them perform better and train faster than training from scratch. Exposing the networks to a variety of shapes by extracting patches helped segment the fluids better than using full images.
OBJECTIVE: This study aimed to review and analyse literature related to the detection and classification of acute leukaemia. To improve understanding of the field's various contextual aspects, the published studies were characterised in terms of their motivation, the open challenges confronting researchers, and the recommendations offered to researchers to enhance this vital research area.
METHODS: We systematically searched all articles about the classification and detection of acute leukaemia, as well as their evaluation and benchmarking, in three main databases: ScienceDirect, Web of Science and IEEE Xplore, from 2007 to 2017. These databases were considered sufficiently extensive to encompass the relevant literature.
RESULTS: Based on our inclusion and exclusion criteria, 89 articles were selected. Most studies (58/89) focused on methods or algorithms for acute leukaemia classification, a number of papers (22/89) covered systems developed for the detection or diagnosis of acute leukaemia, and a few papers (5/89) presented evaluation and comparative studies. The smallest portion (4/89) comprised reviews and surveys.
DISCUSSION: Acute leukaemia diagnosis, which is a field requiring automated solutions, tools and methods, entails the ability to facilitate early detection or even prediction. Many studies have been performed on the automatic detection and classification of acute leukaemia and their subtypes to promote accurate diagnosis.
CONCLUSIONS: Research areas on medical-image classification vary, but they are all equally vital. We expect this systematic review to help emphasise current research opportunities and thus extend and create additional research fields.
METHOD: The CAE model was trained using 12,170,655 simulated spontaneous breathing (SB) and normal breathing (NB) flow data. The paired SB and NB flow data were simulated using a Gaussian Effort Model (GEM) with 5 basis functions. Given an SB flow input, the CAE model predicts the corresponding NB flow. The magnitude of SB effort (SBEMag) is then quantified as the difference between the SB and NB flows. The CAE model was used to evaluate the SBEMag of 9 pressure control/support datasets. Results were validated using mean squared error (MSE) fitting between clinical and training SB flows.
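The abstract defines SBEMag only as "the difference between the SB and NB flows"; the sketch below shows one way to express that as a percentage, alongside the MSE used for validation. The normalisation by the NB flow's absolute area is an assumption for illustration, not the paper's exact formula.

```python
def sbe_magnitude(sb_flow, nb_flow):
    """Quantify spontaneous-breathing effort as the area between the SB
    flow and the predicted normal flow, as a percentage of the NB flow's
    absolute area (assumed reference; the paper's exact normalisation is
    not given in the abstract)."""
    diff = sum(abs(s - n) for s, n in zip(sb_flow, nb_flow))
    ref = sum(abs(n) for n in nb_flow) or 1.0
    return 100.0 * diff / ref

def mse(a, b):
    """Mean squared error, used to match clinical SB flows to training flows."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
```

Identical SB and NB flows give an SBEMag of 0%, and larger deviations of the SB flow from the predicted normal flow give proportionally larger effort magnitudes.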
RESULTS: The CAE model was able to produce NB flows from the clinical SB flows, with a median SBEMag across the 9 datasets of 25.39% [IQR: 21.87-25.57%]. The absolute error in SBEMag under MSE validation had a median of 4.77% [IQR: 3.77-8.56%] across the cohort. This shows the ability of the GEM to capture the intrinsic details present in SB flow waveforms. The analysis also shows both intra-patient and inter-patient variability in SBEMag.
CONCLUSION: A Convolutional Autoencoder model was developed with simulated SB and NB flow data and is capable of quantifying the magnitude of patient spontaneous breathing effort. This provides potential application for real-time monitoring of patient respiratory drive for better management of patient-ventilator interaction.