Displaying publications 1 - 20 of 1459 in total

  1. Naseer S, Ali RF, Khan YD, Dominic PDD
    J Biomol Struct Dyn, 2022;40(22):11691-11704.
    PMID: 34396935 DOI: 10.1080/07391102.2021.1962738
    Lysine glutarylation is a post-translational modification which plays an important regulatory role in a variety of physiological and enzymatic processes, including mitochondrial functions and metabolic processes, in both eukaryotic and prokaryotic cells. This post-translational modification influences chromatin structure and thereby results in global regulation of transcription, defects in cell-cycle progression, DNA damage repair, and telomere silencing. To better understand the mechanism of lysine glutarylation, its identification in a protein is necessary; however, experimental methods are time-consuming and labor-intensive. Herein, we propose a new computational approach to supplement experimental methods for the prediction of lysine glutarylation sites using deep neural networks and Chou's Pseudo Amino Acid Composition (PseAAC). We employed well-known deep neural networks for feature representation learning and classification of peptide sequences. Our approach operates on raw pseudo amino acid compositions and removes the need to separately perform costly and cumbersome feature extraction and selection. Among the developed deep learning-based predictors, the standard neural network-based predictor demonstrated the highest scores in terms of accuracy and all other performance evaluation measures, and it outperforms the majority of previously reported predictors without requiring an expensive feature extraction process. (iGluK-Deep: Computational identification of lysine glutarylation sites using deep neural networks with general Pseudo Amino Acid Compositions. Communicated by Ramaswamy H. Sarma.) A brief illustrative sketch follows this entry.
    Matched MeSH terms: Algorithms
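    The sketch below is only a hedged illustration of the general workflow this abstract describes (composition-style features for a lysine-centred peptide window fed to a small feed-forward network); it is not the authors' iGluK-Deep model, and the peptide windows, labels and network sizes are hypothetical.

```python
# Illustrative sketch only -- not the authors' iGluK-Deep model. It shows the
# general idea of turning a peptide window centred on a lysine into a simple
# composition feature vector and classifying it with a small dense network.
import numpy as np
from sklearn.neural_network import MLPClassifier

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition_features(peptide):
    """Fraction of each of the 20 amino acids in the peptide window."""
    counts = np.array([peptide.count(a) for a in AMINO_ACIDS], dtype=float)
    return counts / max(len(peptide), 1)

# Hypothetical toy data: peptide windows and glutarylation labels (1 = site).
windows = ["MKAVLKEDGK", "AAKKLLGGEE", "PPKQRSTVKA", "GGDDEEKKMM"]
labels  = [1, 0, 1, 0]

X = np.array([composition_features(w) for w in windows])
y = np.array(labels)

# A small feed-forward network as a stand-in for the deep models in the paper.
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.predict(X))
```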
  2. Yusuf Dauda Jikantoro, Fudziah Ismail, Norazak Senu
    Sains Malaysiana, 2015;44:473-482.
    In this paper, an improved trigonometrically fitted zero-dissipative explicit two-step hybrid method with fifth algebraic order is derived. The method is applied to several problems whereby the solutions are oscillatory in nature. The numerical results obtained are compared with those of existing methods in the scientific literature. The comparison shows that the new method is more effective and efficient than the existing methods of the same order.
    Matched MeSH terms: Algorithms
  3. Al-Saiagh W, Tiun S, Al-Saffar A, Awang S, Al-Khaleefa AS
    PLoS One, 2018;13(12):e0208695.
    PMID: 30571777 DOI: 10.1371/journal.pone.0208695
    Word sense disambiguation (WSD) is the process of identifying an appropriate sense for an ambiguous word. Given the complexity of human languages, in which a single word can yield different meanings, WSD has been utilized by several domains of interest such as search engines and machine translation. The literature shows a vast number of techniques used for the process of WSD. Recently, researchers have focused on the use of meta-heuristic approaches to identify the best solutions that reflect the best sense. However, the application of meta-heuristic approaches remains limited and thus requires efficient exploration and exploitation of the problem space. Hence, the current study aims to propose a hybrid meta-heuristic method that consists of particle swarm optimization (PSO) and simulated annealing to find the global best meaning of a given text. Different semantic measures have been utilized in this model as objective functions for the proposed hybrid PSO. These measures consist of the JCN and extended Lesk methods, which are combined effectively in this work. The proposed method is tested using three benchmark datasets (SemCor 3.0, SensEval-2, and SensEval-3). Results show that the proposed method has superior performance in comparison with state-of-the-art approaches. A brief illustrative sketch of a PSO-simulated annealing hybrid follows this entry.
    Matched MeSH terms: Algorithms*
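    The following is a minimal sketch of how a PSO update can be hybridised with a simulated-annealing acceptance step. It uses a toy continuous objective rather than the JCN or extended Lesk semantic measures from the paper, and all parameter values are illustrative assumptions.

```python
# Minimal sketch of hybridising PSO with a simulated-annealing acceptance step,
# shown on a toy continuous objective rather than the semantic similarity
# objectives (JCN / extended Lesk) used in the paper.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):          # toy function to minimise (stands in for -similarity)
    return np.sum(x ** 2)

n_particles, dim, iters = 20, 5, 200
w, c1, c2, temp = 0.7, 1.5, 1.5, 1.0

pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(iters):
    r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    # Simulated-annealing step: perturb the global best and accept a worse
    # candidate with a temperature-dependent probability.
    candidate = gbest + rng.normal(0, 0.1, dim)
    delta = objective(candidate) - objective(gbest)
    if delta < 0 or rng.random() < np.exp(-delta / temp):
        gbest = candidate
    if pbest_val.min() < objective(gbest):
        gbest = pbest[np.argmin(pbest_val)].copy()
    temp *= 0.98

print("best value:", objective(gbest))
```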
  4. Al-Samman AM, Azmi MH, Rahman TA, Khan I, Hindia MN, Fattouh A
    PLoS One, 2016;11(12):e0164944.
    PMID: 27992445 DOI: 10.1371/journal.pone.0164944
    This work proposes channel impulse response (CIR) prediction for time-varying ultra-wideband (UWB) channels by exploiting the fast movement of channel taps within delay bins. Considering the sparsity of UWB channels, we introduce a window-based CIR (WB-CIR) to approximate the high temporal resolutions of UWB channels. A recursive least square (RLS) algorithm is adopted to predict the time evolution of the WB-CIR. For predicting the future WB-CIR tap of window w_k, three RLS filter coefficients are computed from the observed WB-CIRs of the left (w_{k-1}), current (w_k) and right (w_{k+1}) windows. The filter coefficient with the lowest RLS error is used to predict the future WB-CIR tap. To evaluate our proposed prediction method, UWB CIRs are collected through measurement campaigns in outdoor environments considering line-of-sight (LOS) and non-line-of-sight (NLOS) scenarios. Under similar computational complexity, our proposed method provides an improvement in prediction errors of approximately 80% for LOS and 63% for NLOS scenarios compared with a conventional method. A short RLS prediction sketch follows this entry.
    Matched MeSH terms: Algorithms
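    Below is a hedged sketch of a standard recursive least squares one-step-ahead predictor applied to a single synthetic tap series. It illustrates the RLS core only; the paper's window-based CIR construction and three-window filter selection are not reproduced, and the filter order, forgetting factor and test signal are assumptions.

```python
# Minimal recursive least squares (RLS) one-step-ahead predictor for a single
# channel-tap time series; a simplified stand-in for the window-based CIR
# prediction described above (the three-window filter selection is omitted).
import numpy as np

def rls_predict(series, order=3, lam=0.98, delta=100.0):
    """Predict series[t] from the previous `order` samples with RLS."""
    w = np.zeros(order)                 # filter coefficients
    P = delta * np.eye(order)           # inverse correlation matrix
    preds = np.zeros_like(series)
    for t in range(order, len(series)):
        x = series[t - order:t][::-1]   # most recent sample first
        preds[t] = w @ x
        e = series[t] - preds[t]        # a priori prediction error
        k = P @ x / (lam + x @ P @ x)   # gain vector
        w = w + k * e
        P = (P - np.outer(k, x @ P)) / lam
    return preds

rng = np.random.default_rng(1)
tap = np.sin(0.05 * np.arange(500)) + 0.05 * rng.standard_normal(500)
pred = rls_predict(tap)
print("mean squared prediction error:", np.mean((tap[3:] - pred[3:]) ** 2))
```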
  5. Barteit S, Sié A, Zabré P, Traoré I, Ouédraogo WA, Boudo V, et al.
    Front Public Health, 2023;11:1153559.
    PMID: 37304117 DOI: 10.3389/fpubh.2023.1153559
    BACKGROUND: Climate change significantly impacts health in low- and middle-income countries (LMICs), exacerbating vulnerabilities. Comprehensive data for evidence-based research and decision-making are crucial but scarce. Health and Demographic Surveillance Sites (HDSSs) in Africa and Asia provide a robust infrastructure with longitudinal population cohort data, yet they lack climate-health specific data. Acquiring this information is essential for understanding the burden of climate-sensitive diseases on populations and guiding targeted policies and interventions in LMICs to enhance mitigation and adaptation capacities.

    OBJECTIVE: The objective of this research is to develop and implement the Change and Health Evaluation and Response System (CHEERS) as a methodological framework, designed to facilitate the generation and ongoing monitoring of climate change and health-related data within existing Health and Demographic Surveillance Sites (HDSSs) and comparable research infrastructures.

    METHODS: CHEERS uses a multi-tiered approach to assess health and environmental exposures at the individual, household, and community levels, utilizing digital tools such as wearable devices, indoor temperature and humidity measurements, remotely sensed satellite data, and 3D-printed weather stations. The CHEERS framework utilizes a graph database to efficiently manage and analyze diverse data types, leveraging graph algorithms to understand the complex interplay between health and environmental exposures.

    RESULTS: The Nouna CHEERS site, established in 2022, has yielded significant preliminary findings. By using remotely-sensed data, the site has been able to predict crop yield at a household level in Nouna and explore the relationships between yield, socioeconomic factors, and health outcomes. The feasibility and acceptability of wearable technology have been confirmed in rural Burkina Faso for obtaining individual-level data, despite the presence of technical challenges. The use of wearables to study the impact of extreme weather on health has shown significant effects of heat exposure on sleep and daily activity, highlighting the urgent need for interventions to mitigate adverse health consequences.

    CONCLUSION: Implementing the CHEERS framework in research infrastructures can advance climate change and health research, as large and longitudinal datasets have been scarce for LMICs. These data can inform health priorities, guide resource allocation to address climate change and health exposures, and protect vulnerable communities in LMICs from these exposures.

    Matched MeSH terms: Algorithms
  6. Saeed F, Ahmed A, Shamsir MS, Salim N
    J Comput Aided Mol Des, 2014 Jun;28(6):675-84.
    PMID: 24830925 DOI: 10.1007/s10822-014-9750-2
    Cluster-based compound selection is used in the lead identification process of drug discovery and design. Many clustering methods have been used for chemical databases, but no single clustering method obtains the best results under all circumstances. However, little attention has been focused on the use of combination methods for chemical structure clustering, which is known as consensus clustering. Recently, consensus clustering has been used in many areas including bioinformatics, machine learning and information theory. This process can improve the robustness, stability, consistency and novelty of clustering. For chemical databases, different consensus clustering methods have been used, including co-association matrix-based, graph-based, hypergraph-based and voting-based methods. In this paper, a weighted cumulative voting-based aggregation algorithm (W-CVAA) was developed. The MDL Drug Data Report (MDDR) benchmark chemical dataset was used in the experiments and represented by the AlogP and ECPF_4 descriptors. The results from the clustering methods were evaluated by the ability of the clustering to separate biologically active molecules in each cluster from inactive ones using different criteria, and the effectiveness of the consensus clustering was compared to that of Ward's method, which is the current standard clustering method in chemoinformatics. This study indicated that weighted voting-based consensus clustering can overcome the limitations of the existing voting-based methods and improve the effectiveness of combining multiple clusterings of chemical structures. A small consensus-clustering sketch follows this entry.
    Matched MeSH terms: Algorithms
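    As a hedged illustration of consensus clustering in general, the sketch below uses the co-association-matrix variant mentioned in the abstract rather than the authors' W-CVAA: several base clusterings are combined into a co-association matrix, which is then re-clustered with average-linkage hierarchical clustering. The toy labelings are hypothetical.

```python
# Sketch of the co-association variant of consensus clustering mentioned in the
# abstract (not the authors' W-CVAA): combine several base clusterings into a
# co-association matrix and re-cluster it with average-linkage clustering.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def consensus_cluster(labelings, n_clusters):
    labelings = np.asarray(labelings)          # shape: (n_runs, n_samples)
    n = labelings.shape[1]
    co = np.zeros((n, n))
    for labels in labelings:                   # co-association: fraction of runs
        co += (labels[:, None] == labels[None, :])
    co /= len(labelings)
    dist = 1.0 - co                            # turn similarity into distance
    np.fill_diagonal(dist, 0.0)
    Z = linkage(squareform(dist, checks=False), method="average")
    return fcluster(Z, t=n_clusters, criterion="maxclust")

# Hypothetical base clusterings of 6 compounds from different algorithms.
runs = [[0, 0, 0, 1, 1, 1],
        [1, 1, 0, 0, 0, 0],
        [0, 0, 0, 1, 1, 0]]
print(consensus_cluster(runs, n_clusters=2))
```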
  7. Manogaran G, Shakeel PM, Fouad H, Nam Y, Baskar S, Chilamkurti N, et al.
    Sensors (Basel), 2019 Jul 09;19(13).
    PMID: 31324070 DOI: 10.3390/s19133030
    According to surveys of various health centres, a smart log-based multi-access physical monitoring system determines the health conditions of humans and the associated problems present in their lifestyle. At present, deficiency in significant nutrients leads to deterioration of organs, which creates various health problems, particularly for infants, children, and adults. Due to the importance of a multi-access physical monitoring system, children's and adolescents' physical activities should be continuously monitored to eliminate difficulties in their life using a smart environment system. Nowadays, meeting the real-time information requirements of multi-access physical monitoring systems and achieving effective diagnosis of health conditions are challenging tasks in practice. In this research, a wearable smart-log patch with Internet of Things (IoT) sensors has been designed and developed with multimedia technology. Further, the data computation in the smart-log patch has been analysed using edge computing on a Bayesian deep learning network (EC-BDLN), which helps to infer and identify various physical data collected from humans in an accurate manner to monitor their physical activities. Then, the efficiency of this wearable IoT system with multimedia technology is evaluated using experimental results and discussed in terms of accuracy, efficiency, mean residual error, delay, and energy consumption. This state-of-the-art smart-log patch is considered one of the evolutionary research directions in health checking for multi-access physical monitoring systems with multimedia technology.
    Matched MeSH terms: Algorithms
  8. Saffor A, bin Ramli AR, Ng KH
    Australas Phys Eng Sci Med, 2003 Jun;26(2):39-44.
    PMID: 12956184
    Wavelet-based image coding algorithms (lossy and lossless) use a fixed perfect-reconstruction filter bank built into the algorithm for coding and decoding of images. However, no systematic study has been performed to evaluate the coding performance of wavelet filters on medical images. We evaluated which types of filters are best suited to medical images in providing a low bit rate and low computational complexity. In this study a variety of wavelet filters were used to compress and decompress computed tomography (CT) brain and abdomen images. We applied two-dimensional wavelet decomposition, quantization and reconstruction using several families of filter banks to a set of CT images. The Discrete Wavelet Transform (DWT), which provides an efficient framework for multi-resolution frequency analysis, was used. Compression was accomplished by applying threshold values to the wavelet coefficients. Statistical indices such as the mean square error (MSE), maximum absolute error (MAE) and peak signal-to-noise ratio (PSNR) were used to quantify the effect of wavelet compression on the selected images. The code was written using the wavelet and image processing toolboxes of MATLAB (version 6.1). The results show that no specific wavelet filter performs uniformly better than the others, except for the Daubechies and biorthogonal filters, which are the best among all. MAE values achieved by these filters were 5 x 10^-14 to 12 x 10^-14 for both CT brain and abdomen images at different decomposition levels. This indicated that, using these filters, a very small error (approximately 7 x 10^-14) can be achieved between the original and the filtered image. The PSNR values obtained were higher for the brain than for the abdomen images. For both lossy and lossless compression, the 'most appropriate' wavelet filter should be chosen adaptively depending on the statistical properties of the image being coded to achieve a higher compression ratio. A short thresholding sketch follows this entry.
    Matched MeSH terms: Algorithms
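    The sketch below is a hedged illustration of the workflow described above (2-D wavelet decomposition, coefficient thresholding, reconstruction, and MSE/MAE/PSNR computation), using PyWavelets on a synthetic image instead of MATLAB on CT data; the wavelet, level and threshold are arbitrary example choices.

```python
# Hedged sketch of the wavelet thresholding workflow described above, applied
# to a synthetic 2-D image with PyWavelets instead of MATLAB's toolbox.
import numpy as np
import pywt

rng = np.random.default_rng(2)
image = rng.random((128, 128)) * 255.0          # stand-in for a CT slice

coeffs = pywt.wavedec2(image, wavelet="bior4.4", level=3)
threshold = 20.0
# Keep the approximation band, hard-threshold every detail band.
new_coeffs = [coeffs[0]] + [
    tuple(pywt.threshold(band, threshold, mode="hard") for band in detail)
    for detail in coeffs[1:]
]
reconstructed = pywt.waverec2(new_coeffs, wavelet="bior4.4")[:image.shape[0], :image.shape[1]]

mse = np.mean((image - reconstructed) ** 2)
mae = np.max(np.abs(image - reconstructed))     # maximum absolute error
psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")
print(f"MSE={mse:.3f}  MAE={mae:.3f}  PSNR={psnr:.2f} dB")
```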
  9. Al-Busaidi AM, Khriji L, Touati F, Rasid MF, Mnaouer AB
    J Med Syst, 2017 Sep 12;41(10):166.
    PMID: 28900815 DOI: 10.1007/s10916-017-0817-1
    One of the major issues in time-critical medical applications using wireless technology is the size of the payload packet, which is generally designed to be very small to improve the transmission process. Using small packets to transmit continuous ECG data is still costly. Thus, data compression is commonly used to reduce the huge amount of ECG data transmitted through telecardiology devices. In this paper, a new ECG compression scheme is introduced to ensure that the compressed ECG segments fit into the available limited payload packets, while maintaining a fixed compression ratio (CR) to preserve the diagnostic information. The scheme automatically divides the ECG block into segments, while keeping the other compression parameters fixed. This scheme adopts the discrete wavelet transform (DWT) method to decompose the ECG data, a bit-field preserving (BFP) method to preserve the quality of the DWT coefficients, and a modified run-length encoding (RLE) scheme to encode the coefficients. The proposed dynamic compression scheme showed promising results with a percentage packet reduction (PR) of about 85.39% at low percentage root-mean-square difference (PRD) values of less than 1%. ECG records from the MIT-BIH Arrhythmia Database were used to test the proposed method. The simulation results showed promising performance that satisfies the needs of portable telecardiology systems, such as the limited payload size and low power consumption. A small encoding sketch follows this entry.
    Matched MeSH terms: Algorithms
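    Below is a hedged sketch of plain run-length encoding applied to coarsely quantised wavelet coefficients of a synthetic signal. It only illustrates why RLE suits sparse DWT coefficients; the paper's bit-field preserving step, modified RLE and packet segmentation are not reproduced, and the signal and quantisation step are assumptions.

```python
# Simple run-length encoding / decoding of quantised wavelet coefficients, as a
# hedged stand-in for the modified RLE stage described above (the bit-field
# preserving step and packet segmentation are not reproduced here).
import numpy as np
import pywt

def rle_encode(values):
    """Return (value, run_length) pairs for consecutive repeats."""
    encoded, run = [], 1
    for prev, cur in zip(values, values[1:]):
        if cur == prev:
            run += 1
        else:
            encoded.append((prev, run))
            run = 1
    encoded.append((values[-1], run))
    return encoded

def rle_decode(pairs):
    return [v for v, n in pairs for _ in range(n)]

# Toy "ECG" segment: after DWT and coarse quantisation most detail
# coefficients collapse to zero, which is what makes RLE effective.
t = np.linspace(0, 1, 256)
ecg = np.sin(2 * np.pi * 5 * t) + 0.1 * np.sin(2 * np.pi * 50 * t)
coeffs = np.concatenate(pywt.wavedec(ecg, "db4", level=4))
quantised = list(np.round(coeffs / 0.5).astype(int))

pairs = rle_encode(quantised)
assert rle_decode(pairs) == quantised
print(f"{len(quantised)} coefficients -> {len(pairs)} (value, run) pairs")
```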
  10. Zainuddin Z, Wan Daud WR, Pauline O, Shafie A
    Bioresour Technol, 2011 Dec;102(23):10978-86.
    PMID: 21996481 DOI: 10.1016/j.biortech.2011.09.080
    In the organosolv pulping of oil palm fronds, the influence of the operational variables of the pulping reactor (viz. cooking temperature and time, and ethanol and NaOH concentrations) on the properties of the resulting pulp (yield and kappa number) and paper sheets (tensile index and tear index) was investigated using a wavelet neural network model. Experimental results with an error of less than 0.0965 (in terms of MSE) were produced and were then compared with those obtained from response surface methodology. Performance assessment indicated that the neural network model possessed superior predictive ability compared with the polynomial model, since very close agreement between the experimental and predicted values was obtained.
    Matched MeSH terms: Algorithms
  11. Achuthan A, Rajeswari M, Ramachandram D, Aziz ME, Shuaib IL
    Comput Biol Med, 2010 Jul;40(7):608-20.
    PMID: 20541182 DOI: 10.1016/j.compbiomed.2010.04.005
    This paper introduces an approach to perform segmentation of regions in computed tomography (CT) images that exhibit intra-region intensity variations and at the same time have similar intensity distributions with surrounding/adjacent regions. In this work, we adapt a feature computed from wavelet transform called wavelet energy to represent the region information. The wavelet energy is embedded into a level set model to formulate the segmentation model called wavelet energy-guided level set-based active contour (WELSAC). The WELSAC model is evaluated using several synthetic and CT images focusing on tumour cases, which contain regions demonstrating the characteristics of intra-region intensity variations and having high similarity in intensity distributions with the adjacent regions. The obtained results show that the proposed WELSAC model is able to segment regions of interest in close correspondence with the manual delineation provided by the medical experts and to provide a solution for tumour detection.
    Matched MeSH terms: Algorithms*
  12. Syed Ahmad SM, Loo LY, Wan Adnan WA, Md Anwar R
    J Forensic Sci, 2017 Mar;62(2):374-381.
    PMID: 28000207 DOI: 10.1111/1556-4029.13303
    This study presents a wavelet analysis of resultant velocity features belonging to genuine and forged groups of signature samples. Signatures of individuals were initially classified based on visual human perceptions of the relative sizes, complexities, and legibilities of the genuine counterparts. Then, the resultant velocity was extracted from each sample and modeled through wavelet analysis. The wavelet signal was decomposed into several layers based on the maximum overlap discrete wavelet transform (MODWT). Next, the zero-crossing-rate features were calculated from all the high wavelet sub-bands. A total of seven hypotheses were then tested using a two-way ANOVA testing methodology. Of these, four hypotheses tested for significant differences between distributions. In addition, three hypotheses tested for interaction between the two factors of signature authentication versus perceived classification. The results demonstrated that the two feature distributions belonging to genuine and forged groups of samples cannot be distinguished by themselves. Instead, they were significantly different under the influence of two other inherent factors, namely perceived size and legibility. These new findings provide useful information, particularly as a basis for forensic justification in establishing the authenticity of handwritten signature specimens. A brief feature-extraction sketch follows this entry.
    Matched MeSH terms: Algorithms
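    The sketch below is a hedged illustration of the zero-crossing-rate feature computed on wavelet detail sub-bands. PyWavelets' stationary wavelet transform (pywt.swt) is used as a stand-in for MODWT, and the velocity signal is synthetic rather than a real signature trace.

```python
# Sketch of the zero-crossing-rate feature described above, computed on the
# detail bands of an undecimated wavelet transform (pywt.swt is used here as a
# stand-in for MODWT; the velocity signal is synthetic, not a real signature).
import numpy as np
import pywt

def zero_crossing_rate(x):
    """Fraction of consecutive samples whose signs differ."""
    signs = np.sign(x)
    return np.mean(signs[:-1] * signs[1:] < 0)

t = np.linspace(0, 1, 256)
velocity = np.abs(np.sin(2 * np.pi * 3 * t)) \
    + 0.05 * np.random.default_rng(3).standard_normal(256)

levels = 3
bands = pywt.swt(velocity, "db4", level=levels)   # list of (approx, detail) pairs
features = [zero_crossing_rate(detail) for _, detail in bands]
print("zero-crossing rates per wavelet sub-band:", np.round(features, 3))
```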
  13. Wu Diyi, Zulaiha Ali Othman, Suhaila Zainudin, Ayman Srour
    MyJurnal
    The water flow-like algorithm (WFA) is a relatively new metaheuristic algorithm which has shown good solutions for the Travelling Salesman Problem (TSP), comparable to state-of-the-art results. The basic WFA for the TSP uses a 2-opt searching method to decide a water-flow splitting decision. Previous algorithms, such as the Ant Colony System for the TSP, have shown that using k-opt (k>2) improves the solution but increases its complexity exponentially. Therefore, this paper presents the performance of the WFA-TSP using 3-opt and 4-opt, respectively, and compares them with the basic WFA-TSP using 2-opt and with state-of-the-art algorithms. The algorithms are evaluated using 16 benchmark TSP datasets. The experimental results show that the proposed WFA-TSP-4opt outperforms the others in solution quality, owing to its greater exploration capacity and slower convergence. A minimal 2-opt sketch follows this entry.
    Matched MeSH terms: Algorithms
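    As a hedged illustration of the kind of k-opt move these WFA variants embed, the sketch below runs a plain 2-opt local search on a random TSP instance; the water-flow dynamics and the 3-opt/4-opt variants themselves are not reproduced.

```python
# Minimal 2-opt local search on a random TSP instance, illustrating the kind of
# k-opt move the WFA variants above embed in the flow-splitting decision (the
# water-flow dynamics themselves are not reproduced here).
import numpy as np

rng = np.random.default_rng(4)
cities = rng.random((30, 2))
dist = np.linalg.norm(cities[:, None] - cities[None, :], axis=-1)

def tour_length(tour):
    return sum(dist[tour[i], tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt(tour):
    """Repeatedly reverse segments while the reversal shortens the tour."""
    tour = list(tour)
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 2):
            for j in range(i + 1, len(tour) - 1):
                a, b = tour[i - 1], tour[i]
                c, d = tour[j], tour[j + 1]
                # Reversing tour[i..j] swaps edges (a,b),(c,d) for (a,c),(b,d).
                if dist[a, c] + dist[b, d] < dist[a, b] + dist[c, d]:
                    tour[i:j + 1] = reversed(tour[i:j + 1])
                    improved = True
    return tour

initial = list(range(len(cities)))
optimised = two_opt(initial)
print(f"tour length: {tour_length(initial):.3f} -> {tour_length(optimised):.3f}")
```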
  14. Mohd Hanid MH, Abd Rahim SZ, Gondro J, Sharif S, Al Bakri Abdullah MM, Zain AM, et al.
    Materials (Basel), 2021 Mar 10;14(6).
    PMID: 33802032 DOI: 10.3390/ma14061326
    It is quite challenging to control both the quality and productivity of products produced using the injection moulding process. Although many previous researchers have used different types of optimisation approaches to obtain the best configuration of parameter settings to control the quality of the moulded part, optimisation approaches for maximising the performance of cooling channels to enhance process productivity by decreasing the mould cycle time remain lacking. In this study, optimisation approaches, namely Response Surface Methodology (RSM), Genetic Algorithm (GA) and Glowworm Swarm Optimisation (GSO), were employed on a front panel housing moulded using Acrylonitrile Butadiene Styrene (ABS). Each optimisation method was analysed for both straight-drilled and Milled Groove Square Shape (MGSS) conformal cooling channel moulds. Results from experimental work showed that the performance of MGSS conformal cooling channels could be enhanced by employing the optimisation approach. Therefore, this research provides useful scientific knowledge and an alternative solution for the plastic injection moulding industry to improve the quality of moulded parts in terms of deformation using the proposed optimisation approaches with conformal cooling channel moulds.
    Matched MeSH terms: Algorithms
  15. Tamizi NAMA, Rahim SZA, Abdellah AE, Abdullah MMAB, Nabiałek M, Wysłocki JJ, et al.
    Materials (Basel), 2021 Mar 15;14(6).
    PMID: 33804036 DOI: 10.3390/ma14061416
    Many studies have been done using recycled waste materials to minimise environmental problems. It is a great opportunity to explore mechanical recycling and the use of recycled and virgin blends as a material to produce new products with minimal defects. In this study, appropriate processing parameters were considered to mould the front panel housing part using R0% (virgin), R30% (30% virgin: 70% recycled), R40% (40% virgin: 60% recycled) and R50% (50% virgin: 50% recycled) Polycarbonate (PC). The manufacturability and quality during the preliminary stage can be predicted through simulation analysis using Autodesk Moldflow Insight 2012 software. The recommended processing parameters and the values of warpage in the x and y directions can also be obtained using this software. No warpage value was obtained from the simulation studies for the x direction on the front panel housing; therefore, this study focused only on reducing the warpage in the y direction. Response Surface Methodology (RSM) and Genetic Algorithm (GA) optimisation methods were used to find the optimal processing parameters. As a result, the optimal ratio of recycled PC material was found to be R30%, followed by R40% and R50%, using the RSM and GA methods, as compared with the average warpage value of the moulded part using R0%. The most influential processing parameter contributing to the warpage defect was packing pressure for all materials used in this study.
    Matched MeSH terms: Algorithms
  16. Saffian SM, Duffull SB, Wright D
    Clin Pharmacol Ther, 2017 Aug;102(2):297-304.
    PMID: 28160278 DOI: 10.1002/cpt.649
    There is preliminary evidence to suggest that some published warfarin dosing algorithms produce biased maintenance dose predictions in patients who require higher than average doses. We conducted a meta-analysis of warfarin dosing algorithms to determine if there exists a systematic under- or overprediction of dose requirements for patients requiring ≥7 mg/day across published algorithms. Medline and Embase databases were searched up to September 2015. We quantified the proportion of over- and underpredicted doses in patients whose observed maintenance dose was ≥7 mg/day. The meta-analysis included 47 evaluations of 22 different warfarin dosing algorithms from 16 studies. The meta-analysis included data from 1,492 patients who required warfarin doses of ≥7 mg/day. All 22 algorithms were found to underpredict warfarin dosing requirements in patients who required ≥7 mg/day by an average of 2.3 mg/day with a pooled estimate of underpredicted doses of 92.3% (95% confidence interval 90.3-94.1, I² = 24%).
    Matched MeSH terms: Algorithms*
  17. Mohd Amran Mohd Radzi, Zai Peng Goh, Hashim Hizam
    MyJurnal
    This paper presents a voltage flicker estimation method based on the analysis of a pair of inter-harmonics. The proposed algorithm is able to estimate the flicker frequency and the amplitude changes of a voltage waveform. The correlation between the pair of inter-harmonics, the flicker frequency and the amplitude changes is presented and the corresponding formulas are highlighted. Experimental results indicate that the amplitudes of the pair of inter-harmonics can detect the voltage flicker. Furthermore, the experimental results are compared with measurement results obtained using a Fluke power analyzer (Pst). A short spectral sketch follows this entry.
    Matched MeSH terms: Algorithms
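    The sketch below is a hedged numerical illustration of the inter-harmonic pair idea: an amplitude-modulated 50 Hz voltage produces a pair of spectral components at 50 Hz ± the flicker frequency, whose amplitudes reveal the modulation. The sampling rate, flicker frequency and modulation depth are hypothetical example values, and this is not the paper's estimation algorithm.

```python
# Illustrative sketch: an amplitude-modulated 50 Hz voltage produces a pair of
# inter-harmonics at 50 +/- f_flicker, whose amplitudes reveal the flicker.
# Frequencies and modulation depth below are hypothetical example values.
import numpy as np

fs, f0, f_flicker, depth = 3200.0, 50.0, 8.0, 0.05
t = np.arange(0, 1.0, 1 / fs)                         # one-second window
voltage = (1 + depth * np.cos(2 * np.pi * f_flicker * t)) * np.cos(2 * np.pi * f0 * t)

spectrum = np.abs(np.fft.rfft(voltage)) / len(t) * 2  # single-sided amplitude
freqs = np.fft.rfftfreq(len(t), 1 / fs)

for f in (f0 - f_flicker, f0, f0 + f_flicker):
    idx = np.argmin(np.abs(freqs - f))
    print(f"{freqs[idx]:5.1f} Hz  amplitude = {spectrum[idx]:.4f}")
# Each sideband carries depth/2 of the carrier amplitude, so the flicker
# frequency (the spacing from 50 Hz) and its depth can be estimated from them.
```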
  18. Wali SB, Abdullah MA, Hannan MA, Hussain A, Samad SA, Ker PJ, et al.
    Sensors (Basel), 2019 May 06;19(9).
    PMID: 31064098 DOI: 10.3390/s19092093
    The automatic traffic sign detection and recognition (TSDR) system is a very important research topic in the development of advanced driver assistance systems (ADAS). Investigations on vision-based TSDR have received substantial interest in the research community, mainly motivated by three factors: detection, tracking and classification. During the last decade, a substantial number of techniques have been reported for TSDR. This paper provides a comprehensive survey on traffic sign detection, tracking and classification. The details of the algorithms and methods, and their specifications for detection, tracking and classification, are investigated and summarized in tables along with the corresponding key references. A comparative study of each section has been provided to evaluate the TSDR data, performance metrics and their availability. Current issues and challenges of the existing technologies are illustrated with brief suggestions and a discussion on the progress of driver assistance system research in the future. This review will hopefully lead to increasing efforts towards the development of future vision-based TSDR systems.
    Matched MeSH terms: Algorithms
  19. Voon PT, Ng TK, Lee VK, Nesaretnam K
    Eur J Clin Nutr, 2015 Jun;69(6):712-6.
    PMID: 25804278 DOI: 10.1038/ejcn.2015.26
    Effects of high-protein diets that are rich in saturated fats on cell adhesion molecules, thrombogenicity and other nonlipid markers of atherosclerosis in humans have not been firmly established. We aim to investigate the effects of high-protein Malaysian diets prepared separately with virgin olive oil (OO), palm olein (PO) and coconut oil (CO) on cell adhesion molecules, lipid inflammatory mediators and thrombogenicity indices in healthy adults.
    Matched MeSH terms: Algorithms
  20. Dawood F, Loo CK
    PLoS One, 2016;11(3):e0152003.
    PMID: 26998923 DOI: 10.1371/journal.pone.0152003
    Mirror neurons are visuo-motor neurons found in primates and thought to be significant for imitation learning. The proposition that mirror neurons result from associative learning while the neonate observes his own actions has received noteworthy empirical support. Self-exploration is regarded as a procedure by which infants become perceptually observant of their own body and engage in perceptual communication with themselves. We assume that a crude sense of self is the prerequisite for social interaction. However, the contribution of mirror neurons to encoding the perspective from which the motor acts of others are seen has not been addressed in relation to humanoid robots. In this paper we present a computational model for the development of a mirror neuron system (MNS) for a humanoid, based on the hypothesis that infants acquire an MNS by sensorimotor associative learning through self-exploration capable of sustaining early imitation skills. The purpose of our proposed model is to take into account the view-dependency of neurons as a probable outcome of the associative connectivity between motor and visual information. In our experiment, a humanoid robot stands in front of a mirror (represented through a self-image using a camera) in order to obtain the associative relationship between its own motor-generated actions and its own visual body-image. In the learning process, the network first forms a mapping from each motor representation onto a visual representation from the self-exploratory perspective. Afterwards, the representation of the motor commands is learned to be associated with all possible visual perspectives. The complete architecture was evaluated by simulation experiments performed on the DARwIn-OP humanoid robot.
    Matched MeSH terms: Algorithms