Displaying publications 1 - 20 of 51 in total

  1. Abdul Aziz Jemain
    This paper offers a technique for constructing a development index for districts. Entropy theory was used to determine the weight of each criterion in the index, and two approaches to criteria normalization were also suggested. Data on the basic amenities available in the districts of Peninsular Malaysia, obtained from the 1991 census, were used as an illustration.
    Matched MeSH terms: Entropy
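The entropy-weighting step described in the abstract above can be sketched compactly. The snippet below is a generic illustration of entropy-based criterion weights (the toy district matrix is hypothetical, and the paper's two normalization variants are not reproduced):

```python
import numpy as np

def entropy_weights(X):
    """Entropy-based criteria weights for an (alternatives x criteria) matrix.

    A generic sketch of the entropy weighting idea, not the paper's exact
    procedure.
    """
    X = np.asarray(X, dtype=float)
    n = X.shape[0]
    # Column-normalize so each criterion's values form a probability distribution.
    P = X / X.sum(axis=0)
    # Shannon entropy per criterion, scaled to [0, 1] by 1/ln(n).
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logP).sum(axis=0) / np.log(n)
    # Degree of diversification: criteria with lower entropy discriminate more.
    d = 1.0 - e
    return d / d.sum()

# Hypothetical district data: rows = districts, columns = amenity criteria.
X = [[10, 2, 30],
     [20, 2, 10],
     [30, 2, 20]]
w = entropy_weights(X)
# The second criterion is identical across districts (entropy = 1),
# so it receives essentially zero weight.
```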
  2. Nor Hasliza Mat Desa, Maznah Mat Kasim, Abdul Aziz Jemain
    Sains Malaysiana, 2015;44:239-247.
    The issue of age differences in hospital admission should be given special attention, since it affects the structure of hospital care and treatment. Patients of different age groups should be given different priority in service provision. Given time and resource constraints, healthcare managers need to make wise decisions when identifying priorities by age of admission. This paper proposes the construction of a daily composite hospital admission index (CHAI) as an indicator that captures relevant information about the overall performance of hospital admission over time. It involves five age groups of patients admitted to seven major public hospitals in the Klang Valley, Malaysia for respiratory and cardiovascular diseases over a period of three years, 2008-2010. The criteria weights were predetermined by aggregating subjective weights based on the rank order centroid (ROC) method and objective weights based on the entropy-kernel method. The highest and lowest CHAI scores were marked, while the groups of patients were prioritized according to the criteria weight ranking order.
    Matched MeSH terms: Entropy
  3. Suradi SH, Abdullah KA
    Curr Med Imaging, 2021 Jan 26.
    PMID: 33504312 DOI: 10.2174/1573405617666210127101101
    BACKGROUND: Digital mammograms with appropriate image enhancement techniques will improve breast cancer detection, and thus increase the survival rates. The objectives of this study were to systematically review and compare various image enhancement techniques in digital mammograms for breast cancer detection.

    METHODS: A literature search was conducted with the use of three online databases namely, Web of Science, Scopus, and ScienceDirect. Developed keywords strategy was used to include only the relevant articles. A Population Intervention Comparison Outcomes (PICO) strategy was used to develop the inclusion and exclusion criteria. Image quality was analyzed quantitatively based on peak signal-noise-ratio (PSNR), Mean Squared Error (MSE), Absolute Mean Brightness Error (AMBE), Entropy, and Contrast Improvement Index (CII) values.

    RESULTS: Nine studies covering four types of image enhancement techniques were included in this study. Two studies used histogram-based, three frequency-based, one fuzzy-based, and three filter-based techniques. All studies reported PSNR values, whilst only four studies reported MSE, AMBE, Entropy, and CII values. Filter-based techniques achieved the highest PSNR value (78.93) among the four types. For MSE, AMBE, Entropy, and CII, the highest values were frequency-based (7.79), fuzzy-based (93.76), filter-based (7.92), and frequency-based (6.54), respectively.

    CONCLUSION: In summary, image quality varies across image enhancement techniques, particularly for breast cancer detection. In this study, the frequency-based Fast Discrete Curvelet Transform (FDCT) via the UnequiSpaced Fast Fourier Transform (USFFT) proved superior to the other image enhancement techniques.

    Matched MeSH terms: Entropy
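The image-quality metrics compared in this review (PSNR, MSE, and histogram entropy) have standard definitions that can be sketched directly; the toy images below are illustrative, and AMBE/CII follow analogous per-image formulas not shown here:

```python
import numpy as np

def mse(ref, img):
    """Mean Squared Error between a reference and a processed image."""
    ref, img = np.asarray(ref, float), np.asarray(img, float)
    return float(np.mean((ref - img) ** 2))

def psnr(ref, img, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB for 8-bit images."""
    m = mse(ref, img)
    return float("inf") if m == 0 else float(10 * np.log10(peak ** 2 / m))

def shannon_entropy(img, bins=256):
    """Shannon entropy (bits) of the grayscale histogram."""
    hist, _ = np.histogram(np.asarray(img), bins=bins, range=(0, bins))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Toy 8-bit images: a flat patch, a brightened copy, and a checkerboard.
ref = np.full((8, 8), 100)
enhanced = ref + 10
checker = (np.indices((8, 8)).sum(axis=0) % 2) * 255
```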
  4. Dehdasht G, Ferwati MS, Zin RM, Abidin NZ
    PLoS One, 2020;15(2):e0228746.
    PMID: 32023306 DOI: 10.1371/journal.pone.0228746
    Successful implementation of the lean concept as a sustainable approach in the construction industry requires the identification of critical drivers in lean construction. Despite this significance, the number of in-depth studies toward understanding the considerable drivers of lean construction implementation is quite limited. There is also a shortage of methodologies for identifying key drivers. To address these challenges, this paper presents a list of essential drivers within three aspects of sustainability (social, economic, and environmental) and proposes a novel methodology to rank the drivers and identify the key drivers for successful and sustainable lean construction implementation. To this end, the entropy-weighted Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) was employed. Subsequently, an empirical study was conducted within the Malaysian construction industry to demonstrate the proposed method. Moreover, sensitivity analysis and comparison with an existing method were used to validate the stability and accuracy of the achieved results. The significant results of this study are as follows: presenting, verifying, and ranking 63 important drivers; identifying 22 key drivers; and proposing an MCDM model of the key drivers. The outcomes show that the proposed method is an effective and accurate tool that can help managers make better decisions.
    Matched MeSH terms: Entropy*
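The TOPSIS ranking step used in the entry above can be sketched generically (toy data and weights, not the authors' driver set or model; entropy-derived weights would simply be passed in as `w`):

```python
import numpy as np

def topsis(X, w, benefit):
    """Rank alternatives with TOPSIS given criteria weights.

    X: (alternatives x criteria) matrix; w: weights summing to 1;
    benefit: True for criteria to maximize, False to minimize.
    A generic sketch of the standard method, not the paper's exact model.
    """
    X = np.asarray(X, float)
    # Vector normalization, then weighting.
    V = w * X / np.linalg.norm(X, axis=0)
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)  # closeness to the ideal; higher is better

# Hypothetical example: three drivers scored on two benefit criteria.
X = np.array([[5.0, 5.0], [3.0, 3.0], [1.0, 1.0]])
scores = topsis(X, np.array([0.5, 0.5]), np.array([True, True]))
# The first alternative dominates, so it is ranked first.
```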
  5. Maheshwari S, Pachori RB, Kanhangad V, Bhandary SV, Acharya UR
    Comput Biol Med, 2017 Sep 01;88:142-149.
    PMID: 28728059 DOI: 10.1016/j.compbiomed.2017.06.017
    Glaucoma is one of the leading causes of permanent vision loss. It is an ocular disorder caused by increased fluid pressure within the eye. The clinical methods available for the diagnosis of glaucoma require skilled supervision; they are manual, time-consuming, and out of reach of common people. Hence, there is a need for an automated glaucoma diagnosis system for mass screening. In this paper, we present a novel method for automated diagnosis of glaucoma using digital fundus images. The variational mode decomposition (VMD) method is used in an iterative manner for image decomposition. Various features, namely Kapur entropy, Renyi entropy, Yager entropy, and fractal dimensions, are extracted from the VMD components. The ReliefF algorithm is used to select the discriminatory features, which are then fed to a least squares support vector machine (LS-SVM) for classification. Our proposed method achieved classification accuracies of 95.19% and 94.79% using three-fold and ten-fold cross-validation strategies, respectively. This system can aid ophthalmologists in confirming their manual reading of classes (glaucoma or normal) from fundus images.
    Matched MeSH terms: Entropy
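One of the features named above, Renyi entropy, is easy to sketch from a histogram estimate of a component's amplitude distribution (a generic illustration, not the authors' implementation; Kapur and Yager entropies follow the same histogram pattern with different functionals):

```python
import numpy as np

def renyi_entropy(signal, alpha=2.0, bins=64):
    """Renyi entropy of order `alpha` from a histogram estimate.

    Reduces to Shannon entropy in the limit alpha -> 1. Bin count and
    estimator are illustrative choices, not the paper's.
    """
    hist, _ = np.histogram(np.asarray(signal, float), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    if alpha == 1.0:  # limit case: Shannon entropy (bits)
        return float(-(p * np.log2(p)).sum())
    return float(np.log2((p ** alpha).sum()) / (1.0 - alpha))

# A signal uniform over 4 amplitude levels has 2 bits of entropy;
# a constant signal has 0.
flat = np.repeat(np.arange(4), 10)
```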
  6. Sharma M, Agarwal S, Acharya UR
    Comput Biol Med, 2018 09 01;100:100-113.
    PMID: 29990643 DOI: 10.1016/j.compbiomed.2018.06.011
    Obstructive sleep apnea (OSA) is a sleep disorder caused by interruption of breathing, resulting in insufficient oxygen to the human body and brain. If OSA is detected and treated at an early stage, the possibility of severe health impairment can be mitigated. Therefore, an accurate automated OSA detection system is indispensable. Generally, an OSA-based computer-aided diagnosis (CAD) system employs multi-channel, multi-signal physiological signals. However, there is a great need for a low-power, portable OSA-CAD system based on a single-channel bio-signal that can be used at home. In this study, we propose a single-channel electrocardiogram (ECG) based OSA-CAD system using a new class of optimal biorthogonal antisymmetric wavelet filter banks (BAWFBs). In this class of filter bank, all filters are of even length. The filter bank design problem is transformed into a constrained optimization problem wherein the objective is to minimize either frequency-spread for a given time-spread or time-spread for a given frequency-spread. The optimization problem is formulated as a semi-definite programming (SDP) problem, in which the objective function (time-spread or frequency-spread) and the constraints of perfect reconstruction (PR) and zero moments (ZM) are incorporated in their time-domain matrix formulations. The global solution of the SDP is obtained using an interior point algorithm. The newly designed BAWFB is used for the classification of OSA using ECG signals taken from PhysioNet's Apnea-ECG database. ECG segments of 1 min duration are decomposed into six wavelet subbands (WSBs) by employing the proposed BAWFB. Then, the fuzzy entropy (FE) and log-energy (LE) features are computed from all six WSBs. The FE and LE features are classified into normal and OSA groups using a least squares support vector machine (LS-SVM) with a 35-fold cross-validation strategy. The proposed OSA detection model achieved average classification accuracy, sensitivity, specificity, and F-score of 90.11%, 90.87%, 88.88%, and 0.92, respectively. The performance of the model is better than that of existing works detecting OSA on the same database. Thus, the proposed automated OSA detection system is accurate, cost-effective, and ready to be tested on a larger database.
    Matched MeSH terms: Entropy
  7. Sharma M, Tan RS, Acharya UR
    Comput Biol Med, 2018 11 01;102:341-356.
    PMID: 30049414 DOI: 10.1016/j.compbiomed.2018.07.005
    Myocardial infarction (MI), also referred to as heart attack, occurs when there is an interruption of blood flow to parts of the heart due to the acute rupture of atherosclerotic plaque, which leads to damage of the heart muscle. The heart muscle damage produces changes in the recorded surface electrocardiogram (ECG). The identification of MI by visual inspection of the ECG requires expert interpretation and is difficult, as the ECG signal changes associated with MI can be short in duration and low in magnitude. Hence, errors in diagnosis can delay the initiation of appropriate medical treatment. To lessen the burden on doctors, an automated ECG-based system can be installed in hospitals to help identify MI changes on the ECG. In this study, we develop a single-channel, single-lead ECG based MI diagnostic system validated using noisy and clean datasets. The raw ECG signals are taken from the Physikalisch-Technische Bundesanstalt database. We design a novel two-band optimal biorthogonal filter bank (FB) for analysis of the ECG signals, presenting a method to design a novel class of two-band optimal biorthogonal FBs in which not only the product filter but also the analysis lowpass filter is a halfband filter. The filter design problem is posed as a constrained convex optimization problem in which the objective function is a convex combination of multiple quadratic functions, and the regularity and perfect reconstruction conditions are imposed in the form of linear equalities. ECG signals are decomposed into six subbands (SBs) using the newly designed wavelet FB. Following this, discriminating features, namely fuzzy entropy (FE), signal fractal dimensions (SFD), and Renyi entropy (RE), are computed from all six SBs. The features are fed to a k-nearest neighbor (KNN) classifier. The proposed system yields an accuracy of 99.62% for the noisy dataset and 99.74% for the clean dataset using a 10-fold cross-validation (CV) technique. Our MI identification system is robust and highly accurate, and can thus be installed in clinics for detecting MI.
    Matched MeSH terms: Entropy
  8. Azareh A, Rahmati O, Rafiei-Sardooi E, Sankey JB, Lee S, Shahabi H, et al.
    Sci Total Environ, 2019 Mar 10;655:684-696.
    PMID: 30476849 DOI: 10.1016/j.scitotenv.2018.11.235
    Gully erosion susceptibility mapping is a fundamental tool for land-use planning aimed at mitigating land degradation. However, the capabilities of some state-of-the-art data-mining models for developing accurate maps of gully erosion susceptibility have not yet been fully investigated. This study assessed and compared the performance of two different types of data-mining models for accurately mapping gully erosion susceptibility at a regional scale in Chavar, Ilam, Iran. The two methods evaluated were: Certainty Factor (CF), a bivariate statistical model; and Maximum Entropy (ME), an advanced machine learning model. Several geographic and environmental factors that can contribute to gully erosion were considered as predictor variables of gully erosion susceptibility. Based on an existing differential GPS survey inventory of gully erosion, a total of 63 eroded gullies were spatially randomly split in a 70:30 ratio for use in model calibration and validation, respectively. Accuracy assessments completed with the receiver operating characteristic curve method showed that the ME-based regional gully susceptibility map has an area under the curve (AUC) value of 88.6% whereas the CF-based map has an AUC of 81.8%. According to jackknife tests that were used to investigate the relative importance of predictor variables, aspect, distance to river, lithology and land use are the most influential factors for the spatial distribution of gully erosion susceptibility in this region of Iran. The gully erosion susceptibility maps produced in this study could be useful tools for land managers and engineers tasked with road development, urbanization and other future development.
    Matched MeSH terms: Entropy
  9. Ali BH, Sulaiman N, Al-Haddad SAR, Atan R, Hassan SLM, Alghrairi M
    Sensors (Basel), 2021 Sep 27;21(19).
    PMID: 34640773 DOI: 10.3390/s21196453
    One of the most dangerous kinds of attacks affecting computers is the distributed denial of service (DDoS) attack. The main goal of this attack is to bring the targeted machine down and make its services unavailable to legitimate users. This can be accomplished mainly by directing many machines to send a very large number of packets toward the specified machine to consume its resources and stop it from working. We implemented a method in Java based on entropy and the sequential probability ratio test (ESPRT) to identify malicious flows and the switch interfaces that aid them in passing through. Entropy (E) is the first technique, and the sequential probability ratio test (SPRT) is the second. The entropy method alone compares its results with a certain threshold in order to make a decision, so the accuracy and F-scores of entropy results change when the threshold values change. Using both entropy and SPRT removes the uncertainty associated with the entropy threshold, and the false positive rate is also reduced when combining both techniques. Entropy-based detection methods divide incoming traffic into groups of traffic of the same size; the size of these groups is determined by a parameter called the window size. The Defense Advanced Research Projects Agency (DARPA) 1998, DARPA 2000, and Canadian Institute for Cybersecurity (CIC-DDoS2019) datasets were used to evaluate the implementation of this method. Confusion matrix metrics were used to compare the ESPRT results with those of other methods. The accuracy and F-scores for the DARPA 1998 dataset were 0.995 and 0.997, respectively, for the ESPRT method when the window size was set to 50 or 75 packets. The detection rate of ESPRT for the same dataset was 0.995 when the window size was set to 10 packets. The average accuracy for the DARPA 2000 dataset for ESPRT was 0.905, and the detection rate was 0.929. Finally, ESPRT was scalable to a multiple-domain topology application.
    Matched MeSH terms: Entropy
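The entropy-per-window detection step described above can be sketched as follows (the window size, threshold, and addresses are illustrative; the paper's SPRT stage and dataset-tuned parameters are not reproduced). During a flood, most packets target one victim, so the entropy of destination addresses collapses:

```python
import math
from collections import Counter

def window_entropy(dst_ips):
    """Shannon entropy (bits) of destination addresses in one traffic window."""
    counts = Counter(dst_ips)
    n = len(dst_ips)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def flag_windows(packets, window_size=50, threshold=1.0):
    """Split traffic into fixed-size windows and flag low-entropy ones.

    Threshold and window size are illustrative; the paper tunes them
    (and adds SPRT to remove threshold sensitivity).
    """
    flags = []
    for i in range(0, len(packets) - window_size + 1, window_size):
        flags.append(window_entropy(packets[i:i + window_size]) < threshold)
    return flags

# Hypothetical traffic: varied destinations vs. a flood toward one host.
normal = [f"10.0.0.{i % 20}" for i in range(50)]
attack = ["192.168.0.1"] * 48 + ["192.168.0.2", "192.168.0.3"]
```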
  10. Mustafa S, Iqbal MW, Rana TA, Jaffar A, Shiraz M, Arif M, et al.
    Comput Intell Neurosci, 2022;2022:4348235.
    PMID: 35909861 DOI: 10.1155/2022/4348235
    Malignant melanoma is considered one of the deadliest skin diseases if left untreated, with a mortality rate more than twice that of other skin malignancies. These facts encourage computer scientists to find automated methods to detect skin cancers. Nowadays, the analysis of skin images is widely used by assistant physicians to discover the first stage of the disease automatically. One of the challenges researchers face when developing such a system is the lack of clarity of the available images, with noise such as shadows, low contrast, hairs, and specular reflections complicating the detection of skin lesions in these images. This paper proposes a solution to this problem using the active contour method. However, seed selection is the main drawback of the active contour method, namely where the segmentation process should start. This paper uses Gaussian filter-based maximum entropy and morphological processing methods to find automatic seed points for the active contour, allowing lesions to be segmented from dermoscopic images automatically. The proposed methodology was tested with quantitative and qualitative measures on the standard DermIS dataset to assess its reliability, and shows encouraging results.
    Matched MeSH terms: Entropy
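The maximum-entropy step above can be sketched with Kapur's classic histogram method, which picks the threshold that maximizes the combined entropy of the two resulting classes (a generic illustration; the Gaussian filtering and morphological steps from the abstract are omitted, and the bimodal test image is hypothetical):

```python
import numpy as np

def max_entropy_threshold(image, bins=256):
    """Kapur's maximum-entropy threshold for a grayscale image."""
    hist, _ = np.histogram(np.asarray(image), bins=bins, range=(0, bins))
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, bins):
        p0, p1 = p[:t], p[t:]
        w0, w1 = p0.sum(), p1.sum()
        if w0 == 0 or w1 == 0:
            continue
        # Entropy of each class's renormalized distribution.
        q0, q1 = p0[p0 > 0] / w0, p1[p1 > 0] / w1
        h = -(q0 * np.log(q0)).sum() - (q1 * np.log(q1)).sum()
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# A bimodal toy image: lesion-like dark cluster and bright background.
bimodal = np.concatenate([np.arange(40, 61).repeat(20),
                          np.arange(190, 211).repeat(20)])
t = max_entropy_threshold(bimodal)  # lands between the two clusters
```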
  11. Zhang K, Ting HN, Choo YM
    Comput Methods Programs Biomed, 2024 Mar;245:108043.
    PMID: 38306944 DOI: 10.1016/j.cmpb.2024.108043
    BACKGROUND AND OBJECTIVE: Conflict may happen when more than one classifier is used to perform prediction or classification, as recognition model errors lead to conflicting evidence. These conflicts can cause decision errors in baby cry recognition and further decrease its accuracy. Thus, the objective of this study is to propose a method that can effectively minimize the conflict among deep learning models and improve the accuracy of baby cry recognition.

    METHODS: An improved Dempster-Shafer evidence theory (DST) based on Wasserstein distance and Deng entropy was proposed to reduce conflicts among results by combining the credibility degree between pieces of evidence and the uncertainty degree of the evidence. To validate the effectiveness of the proposed method, examples were analyzed, and the method was applied to baby cry recognition. The whale optimization algorithm-variational mode decomposition (WOA-VMD) was used to optimally decompose the baby cry signals. The deep features of the decomposed components were extracted using the VGG16 model. Long Short-Term Memory (LSTM) models were used to classify the baby cry signals. The improved DST decision method was then used to obtain the decision fusion.

    RESULTS: The proposed fusion method achieves an accuracy of 90.15% in classifying three types of baby cry. Improvement between 2.90% and 4.98% was obtained over the existing DST fusion methods. Recognition accuracy was improved by between 5.79% and 11.53% when compared to the latest methods used in baby cry recognition.

    CONCLUSION: The proposed method optimally decomposes baby cry signal, effectively reduces the conflict among the results of deep learning models and improves the accuracy of baby cry recognition.

    Matched MeSH terms: Entropy
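Deng entropy, the uncertainty measure the proposed fusion combines with Wasserstein distance, has a short closed form: for a mass function m, E_d = -Σ m(A) log2(m(A) / (2^|A| - 1)). The sketch below illustrates only this measure (the credibility/Wasserstein weighting and the fusion rule are not reproduced, and the example mass functions are hypothetical):

```python
import math

def deng_entropy(masses):
    """Deng entropy of a Dempster-Shafer mass function.

    `masses` maps focal elements (frozensets) to mass values. For
    singleton-only evidence this reduces to Shannon entropy; multi-element
    focal sets contribute extra uncertainty via the 2^|A| - 1 term.
    """
    return -sum(m * math.log2(m / (2 ** len(A) - 1))
                for A, m in masses.items() if m > 0)

# Evidence split between two singletons: behaves like Shannon entropy (1 bit).
singletons = {frozenset({"a"}): 0.5, frozenset({"b"}): 0.5}
# Totally vague evidence on the two-element set: higher uncertainty, log2(3).
vague = {frozenset({"a", "b"}): 1.0}
```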
  12. Uddin J, Ghazali R, Deris MM
    PLoS One, 2017;12(1):e0164803.
    PMID: 28068344 DOI: 10.1371/journal.pone.0164803
    Clustering a set of objects into homogeneous groups is a fundamental operation in data mining. Recently, much attention has been paid to categorical data clustering, where data objects are made up of non-numerical attributes. For categorical data clustering, rough set based approaches such as Maximum Dependency Attribute (MDA) and Maximum Significance Attribute (MSA) have outperformed predecessor approaches like Bi-Clustering (BC), Total Roughness (TR), and Min-Min Roughness (MMR). This paper presents the limitations and issues of the MDA and MSA techniques on special types of data sets where both techniques fail to select, or face difficulty in selecting, their best clustering attribute. This analysis motivates the need for a better and more general rough set theory approach that can cope with the issues of MDA and MSA. Hence, an alternative technique named Maximum Indiscernible Attribute (MIA), for clustering categorical data using rough set indiscernibility relations, is proposed. The novelty of the proposed approach is that, unlike other rough set theory techniques, it uses the domain knowledge of the data set. It is based on the concept of the indiscernibility relation combined with the number of clusters. To show the significance of the proposed approach, the effect of the number of clusters on rough accuracy, purity, and entropy is described in the form of propositions. Moreover, ten different data sets from previously utilized research cases and the UCI repository are used for experiments. The results, produced in tabular and graphical forms, show that the proposed MIA technique provides better performance in selecting the clustering attribute in terms of purity, entropy, iterations, time, accuracy, and rough accuracy.
    Matched MeSH terms: Entropy
  13. Al-Shamasneh AR, Jalab HA, Palaiahnakote S, Obaidellah UH, Ibrahim RW, El-Melegy MT
    Entropy (Basel), 2018 May 05;20(5).
    PMID: 33265434 DOI: 10.3390/e20050344
    Kidney image enhancement is challenging due to the unpredictable quality of MRI images, as well as the nature of kidney diseases. The focus of this work is kidney image enhancement through a new Local Fractional Entropy (LFE)-based model. The proposed model estimates the probability of pixels that represent edges based on the entropy of the neighboring pixels, which results in a local fractional entropy. When there is a small change in the intensity values (indicating the presence of an edge in the image), the local fractional entropy gives fine image details. Similarly, when no change in intensity values is present (indicating smooth texture), the LFE does not provide fine details, based on the fact that there is no edge information. Tests were conducted on a large dataset of different, poor-quality kidney images to show that the proposed model is useful and effective. A comparative study with classical methods, as well as the latest enhancement methods, shows that the proposed model outperforms the existing methods.
    Matched MeSH terms: Entropy
  14. Al-Qazzaz NK, Ali SHBM, Ahmad SA, Islam MS, Escudero J
    Med Biol Eng Comput, 2018 Jan;56(1):137-157.
    PMID: 29119540 DOI: 10.1007/s11517-017-1734-7
    Stroke survivors are more prone to developing cognitive impairment and dementia. Dementia detection is a challenge for supporting personalized healthcare. This study analyzes the electroencephalogram (EEG) background activity of 5 vascular dementia (VaD) patients, 15 stroke-related patients with mild cognitive impairment (MCI), and 15 healthy control subjects during a working memory (WM) task. The objective of this study is twofold. First, it aims to enhance the discrimination of VaD patients, stroke-related MCI patients, and control subjects using fuzzy neighborhood preserving analysis with QR-decomposition (FNPAQR); second, it aims to extract and investigate the spectral features that characterize post-stroke dementia patients compared to control subjects. Nineteen channels were recorded and analyzed using the independent component analysis and wavelet transform (ICA-WT) denoising technique. Using ANOVA, linear spectral power measures, including relative powers (RP) and power ratios, were calculated to test whether the dominant EEG frequencies were slowed down in VaD and stroke-related MCI patients. Non-linear features, including permutation entropy (PerEn) and fractal dimension (FD), were used to test the degree of irregularity and complexity, which was significantly lower in patients with VaD and stroke-related MCI than in control subjects (ANOVA; p < 0.05). This study is the first to use the FNPAQR dimensionality reduction technique with the EEG background activity of dementia patients. The impairment of post-stroke patients was detected using support vector machine (SVM) and k-nearest neighbors (kNN) classifiers, and a comparative study was performed to check the effectiveness of the FNPAQR technique with each classifier. With FNPAQR, SVM and kNN obtained 91.48% and 89.63% accuracy, respectively, whereas without FNPAQR they achieved 70% and 67.78% accuracy, respectively, in classifying VaD patients, stroke-related MCI patients, and controls. Therefore, EEG could be a reliable index for identifying concise markers that are sensitive to VaD and stroke-related MCI patients compared to healthy control subjects.
    Matched MeSH terms: Entropy
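The permutation entropy (PerEn) feature used above follows the standard Bandt-Pompe ordinal-pattern recipe, sketched here generically (not the authors' code; the order and delay parameters are illustrative). Regular signals yield low values and irregular signals high values:

```python
import math

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Permutation entropy of a 1-D signal (Bandt-Pompe).

    Counts the ordinal patterns of `order` samples spaced `delay` apart
    and returns the Shannon entropy of their distribution, optionally
    normalized to [0, 1] by log2(order!).
    """
    n = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n):
        window = tuple(x[i + j * delay] for j in range(order))
        # Ordinal pattern: argsort of the window (ties broken by position).
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    h = -sum(c / n * math.log2(c / n) for c in counts.values())
    return h / math.log2(math.factorial(order)) if normalize else h

# A monotone ramp has a single ordinal pattern (entropy 0);
# an irregular sequence exercises many patterns (entropy > 0).
ramp = list(range(100))
irregular = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9, 3, 2, 3, 8, 4]
```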
  15. Tamjidy M, Baharudin BTHT, Paslar S, Matori KA, Sulaiman S, Fadaeifard F
    Materials (Basel), 2017 May 15;10(5).
    PMID: 28772893 DOI: 10.3390/ma10050533
    The development of Friction Stir Welding (FSW) has provided an alternative approach for producing high-quality welds in a fast and reliable manner. This study focuses on the mechanical properties of the dissimilar friction stir welding of AA6061-T6 and AA7075-T6 aluminum alloys. FSW process parameters such as tool rotational speed, tool traverse speed, tilt angle, and tool offset significantly influence the mechanical properties of friction stir welded joints. A mathematical regression model is developed to determine the empirical relationship between the FSW process parameters and the mechanical properties, and the results are validated. In order to obtain the optimal values of the process parameters that simultaneously optimize the ultimate tensile strength, elongation, and minimum hardness in the heat affected zone (HAZ), a metaheuristic, multi-objective algorithm based on biogeography-based optimization is proposed. The Pareto optimal frontiers for triple and dual objective functions are obtained, and the best optimal solution is selected using two different decision making techniques: the technique for order of preference by similarity to ideal solution (TOPSIS) and Shannon's entropy.
    Matched MeSH terms: Entropy
  16. Jawed S, Amin HU, Malik AS, Faye I
    PMID: 31133829 DOI: 10.3389/fnbeh.2019.00086
    This study analyzes the learning styles of subjects based on their electroencephalography (EEG) signals. The goal is to identify how the EEG features of a visual learner differ from those of a non-visual learner. The idea is to measure the students' EEGs during the resting states (eyes open and eyes closed conditions) and while performing learning tasks. For this purpose, 34 healthy subjects are recruited. The subjects have no background knowledge of the animated learning content, which is shown to them in a video format. The experiment consists of two sessions, and each session comprises two parts: (1) a learning task, in which the subjects are shown the animated learning content for 8-10 min; and (2) a memory retrieval task. The EEG signals are measured during the learning task and the memory retrieval task in both sessions. The retention time is 30 min for the first session and 2 months for the second session. The analysis is performed on the EEG measured during the memory retrieval tasks. The study characterizes and differentiates visual learners from non-visual learners using extracted EEG features, such as the power spectral density (PSD), power spectral entropy (PSE), and discrete wavelet transform (DWT). The EEG PSD and DWT features are computed for the recorded EEG in the alpha and gamma frequency bands over 128 scalp sites. The alpha and gamma bands for the frontal, occipital, and parietal regions are analyzed, as these regions are activated during learning. The extracted PSD and DWT features are then reduced to 8 and 15 optimum features, respectively, using principal component analysis (PCA). The optimum features are used as inputs to a k-nearest neighbor (k-NN) classifier using the Mahalanobis distance metric and a support vector machine (SVM) classifier using a linear kernel, each with 10-fold cross-validation. For PSD features, the k-NN classifier achieved accuracies of 97% and 94% in the first session and 96% and 93% in the second session in the alpha and gamma bands for visual and non-visual learners, respectively; for DWT features, k-NN achieved 68% and 100% in the first session and 100% in the second session. For PSD features, the SVM classifier achieved 97% and 96% in the first session and 100% and 95% in the second session; for DWT features, SVM achieved 79% and 82% in the first session and 56% and 74% in the second session. The results showed that the PSDs in the alpha and gamma bands represent distinct and stable EEG signatures for visual and non-visual learners during the retrieval of learned content.
    Matched MeSH terms: Entropy
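The power spectral entropy (PSE) feature named above can be sketched from a simple periodogram: the Shannon entropy of the normalized power spectral density (a generic illustration; the study's exact PSD estimator, frequency bands, and channel handling are not specified here, and the test signals are hypothetical):

```python
import numpy as np

def spectral_entropy(x, normalize=True):
    """Power spectral entropy: Shannon entropy of the normalized PSD.

    Uses a plain periodogram (|rfft|^2); a narrowband signal concentrates
    power in few bins (low entropy), broadband noise spreads it (high).
    """
    psd = np.abs(np.fft.rfft(np.asarray(x, float))) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    h = float(-(p * np.log2(p)).sum())
    return h / np.log2(len(psd)) if normalize else h

# A pure tone vs. white noise of the same length.
t = np.arange(256)
tone = np.sin(2 * np.pi * 8 * t / 256)               # energy in one bin
noise = np.random.default_rng(0).standard_normal(256)  # energy spread out
```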
  17. Malviya R, Jha S, Fuloria NK, Subramaniyan V, Chakravarthi S, Sathasivam K, et al.
    Polymers (Basel), 2021 Feb 18;13(4).
    PMID: 33670569 DOI: 10.3390/polym13040610
    The rheological properties of tamarind seed polymer are characterized for its possible commercialization in the food and pharmaceutical industries. The seed polymer was extracted using water as a solvent and ethyl alcohol as a precipitating agent. The effect of temperature on the rheological behavior of the polymeric solution was studied. In addition, the temperature coefficient, viscosity, surface tension, activation energy, Gibbs free energy, Reynolds number, and entropy of fusion were calculated using the Arrhenius, Gibbs-Helmholtz, Frenkel-Eyring, and Eotvos equations, respectively. The activation energy of the gum was found to be 20.46 ± 1.06 kJ/mol. The changes in entropy and enthalpy were found to be 23.66 ± 0.97 and -0.10 ± 0.01 kJ/mol, respectively. The calculated entropy of fusion was 0.88 kJ/mol. A considerable decrease in apparent viscosity and surface tension was produced when the temperature was raised. The present study concludes that the tamarind seed polymer solution is less sensitive to temperature change than Albizia lebbeck gum, Ficus glumosa gum, and A. marcocarpa gum. It also concludes that the attainment of the transition state of viscous flow for tamarind seed gum is accompanied by bond breaking. The excellent physicochemical properties of tamarind seed polymers make them promising excipients for future drug formulations and make their application in the food and cosmetics industries possible.
    Matched MeSH terms: Entropy
  18. Sudarshan VK, Acharya UR, Ng EY, Tan RS, Chou SM, Ghista DN
    Comput Biol Med, 2016 Apr 1;71:231-40.
    PMID: 26898671 DOI: 10.1016/j.compbiomed.2016.01.028
    Cross-sectional view echocardiography is an efficient non-invasive diagnostic tool for characterizing Myocardial Infarction (MI) and the stages of expansion leading to heart failure. An automated computer-aided technique for cross-sectional echocardiography feature assessment can aid clinicians in earlier and more reliable detection of MI before subsequent catastrophic post-MI medical conditions. Therefore, this paper proposes a novel Myocardial Infarction Index (MII) to discriminate infarcted from normal myocardium using features extracted from apical cross-sectional views of echocardiograms. The cross-sectional views of normal and MI echocardiography images are represented as textons using Maximum Responses (MR8) filter banks. Fractal Dimension (FD), Higher-Order Statistics (HOS), Hu's moments, Gabor Transform features, Fuzzy Entropy (FEnt), Energy, Local Binary Pattern (LBP), Renyi's Entropy (REnt), Shannon's Entropy (ShEnt), and Kapur's Entropy (KEnt) features are extracted from the textons. These features are ranked using the t-test and fuzzy Max-Relevancy and Min-Redundancy (mRMR) ranking methods. Combinations of the highly ranked features are then used in the formulation and development of an integrated MII, which is used to accurately and quickly detect infarcted myocardium using one numerical value. The highly ranked features are also subjected to classification using different classifiers for the characterization of normal and MI LV ultrasound images using a minimum number of features. Our technique is able to characterize MI with an average accuracy of 94.37%, sensitivity of 91.25%, and specificity of 97.50% with 8 apical four-chamber view features extracted from only a single frame per patient, making the classification more reliable and accurate.
    Matched MeSH terms: Entropy
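The entropy features named in the abstract above (Shannon's, Renyi's and Kapur's entropies) are standard measures computed from a normalized histogram. A minimal sketch, assuming an 8-bit image patch and common parameterizations (the `alpha` and `beta` values here are illustrative defaults, not taken from the paper):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (bits) of a normalized histogram; zero bins ignored."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def renyi_entropy(p, alpha=2.0):
    """Renyi entropy of order alpha (alpha != 1)."""
    p = p[p > 0]
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

def kapur_entropy(p, alpha=0.5, beta=2.0):
    """Kapur entropy of order alpha and type beta (alpha != beta)."""
    p = p[p > 0]
    return np.log2(np.sum(p ** alpha) / np.sum(p ** beta)) / (beta - alpha)

# Normalized intensity histogram of a hypothetical 8-bit texton patch
patch = np.random.randint(0, 256, size=(64, 64))
hist, _ = np.histogram(patch, bins=256, range=(0, 256))
p = hist / hist.sum()
features = [shannon_entropy(p), renyi_entropy(p), kapur_entropy(p)]
```

As a sanity check, all three measures reduce to 1 bit on a uniform two-bin distribution.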
  19. Sudarshan VK, Acharya UR, Ng EY, Tan RS, Chou SM, Ghista DN
    Comput Biol Med, 2016 Apr 1;71:241-51.
    PMID: 26897481 DOI: 10.1016/j.compbiomed.2016.01.029
    Early expansion of the infarcted zone after Acute Myocardial Infarction (AMI) has serious short- and long-term consequences and contributes to increased mortality. Thus, identifying the moderate and severe phases of AMI before they lead to other catastrophic post-MI medical conditions is essential for aggressive treatment and management. Advanced image processing techniques together with a robust classifier applied to two-dimensional (2D) echocardiograms may aid automated classification of the extent of infarcted myocardium. Therefore, this paper proposes novel algorithms, namely Curvelet Transform (CT) and Local Configuration Pattern (LCP), for automated detection of normal, moderately infarcted and severely infarcted myocardium using 2D echocardiograms. The methodology extracts LCP features from the CT coefficients of echocardiograms. The obtained features are subjected to the Marginal Fisher Analysis (MFA) dimensionality reduction technique, followed by a fuzzy-entropy-based ranking method. Different classifiers are used to separate the ranked features into three classes (normal, moderately infarcted and severely infarcted) based on the extent of damage to the myocardium. The developed algorithm achieved an accuracy of 98.99%, sensitivity of 98.48% and specificity of 100% with a Support Vector Machine (SVM) classifier using only six features. Furthermore, we developed an integrated index called the Myocardial Infarction Risk Index (MIRI) to detect normal, moderately and severely infarcted myocardium using a single number. The proposed system may aid clinicians in faster identification and quantification of the extent of infarcted myocardium from 2D echocardiograms, and may also help identify persons at risk of developing heart failure based on the extent of infarcted myocardium.
    Matched MeSH terms: Entropy
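Both abstracts above build an integrated index (MII, MIRI) that collapses a few highly ranked features into a single discriminative number. A generic sketch of that idea follows; the z-score normalization and the weights are illustrative assumptions, not the formulations published in the papers:

```python
import numpy as np

def integrated_index(features, weights):
    """Collapse ranked features into one scalar index per sample.

    features: (n_samples, n_features) array of extracted features.
    weights:  (n_features,) importance weights, e.g. from a ranking step.
    """
    # z-normalize each feature so no single scale dominates the index
    z = (features - features.mean(axis=0)) / features.std(axis=0)
    # weighted sum, with weights normalized to unit absolute mass
    return z @ (weights / np.abs(weights).sum())

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))          # 10 samples, 3 ranked features
idx = integrated_index(X, np.array([0.5, 0.3, 0.2]))
```

A threshold on the resulting scalar then separates the classes, which is what lets such an index flag infarcted myocardium "using one numerical value".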
  20. Abas FS, Shana'ah A, Christian B, Hasserjian R, Louissaint A, Pennell M, et al.
    Cytometry A, 2017 Jun;91(6):609-621.
    PMID: 28110507 DOI: 10.1002/cyto.a.23049
    The advance of high-resolution digital scans of pathology slides has allowed the development of computer-based image analysis algorithms that may help pathologists quantify IHC stains. While very promising, these methods require further refinement before they are implemented in a routine clinical setting. It is particularly critical to evaluate algorithm performance in a setting similar to current clinical practice. In this article, we present a pilot study that evaluates the use of a computerized cell quantification method in the clinical estimation of CD3-positive (CD3+) T cells in follicular lymphoma (FL). Our goal is to demonstrate the degree to which computerized quantification is comparable to estimation by a panel of expert pathologists. The computerized quantification method uses entropy-based histogram thresholding to separate brown (CD3+) and blue (CD3-) regions after a color space transformation. A panel of four board-certified hematopathologists evaluated a database of 20 FL images using two different reading methods: visual estimation and manual marking of each CD3+ cell in the images. These image data and the readings provided a reference standard and the range of variability among readers. Sensitivity and specificity of the computer's segmentation of CD3+ and CD3- T cells were recorded. Across the four pathologists, mean sensitivity and specificity were 90.97% and 88.38%, respectively. The computerized quantification method agrees more closely with the manual cell marking than with the visual estimations. Statistical comparison between the computerized quantification method and the pathologist readings demonstrated good agreement, with correlation coefficients of 0.81 and 0.96 in terms of Lin's concordance correlation and Spearman's correlation coefficient, respectively. These values are higher than most of those calculated among the pathologists. In the future, the computerized quantification method may be used to investigate the relationship between the overall architectural pattern (i.e., interfollicular vs. follicular) and outcome measures (e.g., overall survival and time to treatment). © 2017 International Society for Advancement of Cytometry.
    Matched MeSH terms: Entropy