Displaying publications 1 - 20 of 171 in total

  1. Yang XS, Chien SF, Ting TO
    ScientificWorldJournal, 2014;2014:425853.
    PMID: 25610904 DOI: 10.1155/2014/425853
    Matched MeSH terms: Artificial Intelligence*
  2. Kalatehjari R, Rashid AS, Ali N, Hajihassani M
    ScientificWorldJournal, 2014;2014:973093.
    PMID: 24991652 DOI: 10.1155/2014/973093
    Over the last few years, particle swarm optimization (PSO) has been extensively applied to various geotechnical engineering problems, including slope stability analysis. However, these contributions have been limited to two-dimensional (2D) slope stability analysis. This paper applies PSO to the three-dimensional (3D) slope stability problem to determine the critical slip surface (CSS) of soil slopes. A detailed description of the adopted PSO is presented to provide a basis for further contributions of this technique to the field of 3D slope stability problems. A general rotating ellipsoid shape is introduced as the specific particle for 3D slope stability analysis. A detailed sensitivity analysis was designed and performed to find the optimum values of the PSO parameters. Example problems were used to evaluate the applicability of PSO in determining the CSS of 3D slopes. The first example compares the results of PSO with the PLAXIS-3D finite element software, and the second compares the ability of PSO to determine the CSS of 3D slopes against other optimization methods from the literature. The results demonstrate the efficiency and effectiveness of PSO in determining the CSS of 3D soil slopes.
    Matched MeSH terms: Artificial Intelligence*
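The PSO procedure referenced in the entries above can be illustrated with a minimal, generic sketch. This is not the authors' 3D slope-stability implementation (which encodes a rotating ellipsoid slip surface as the particle); it only shows the standard velocity and position updates on a toy two-variable objective, with all parameter values chosen for illustration.

```python
import random

def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization: minimize `objective` over R^dim."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # each particle's best position
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # swarm's best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Standard velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy objective: sphere function, minimum 0 at the origin.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=2)
```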
  3. Nurhafizah Jamain, Ismail Musirin, Mohd Helmi Mansor, Muhammad Murtadha Othman, Siti Aliyah Mohd Salleh
    This paper presents adaptive particle swarm optimization for solving non-convex economic dispatch problems. In this study, a new technique known as adaptive particle swarm optimization (APSO) was developed to alleviate the problems experienced with traditional particle swarm optimization (PSO), which has been reported to become trapped in local minima. In APSO, the economic dispatch problem is considered with valve-point effects. Search efficiency was improved by inserting a new parameter into the velocity term, which helps the search escape local minima. To show the effectiveness of the proposed technique, this study examined two case studies, with and without contingency.
    Matched MeSH terms: Artificial Intelligence
  4. Olayiwola Babarinsa, Hailiza Kamarulhaili
    MATEMATIKA, 2019;35(1):25-38.
    The proposed modified methods of Cramer's rule consider the column vector and the coefficient matrix concurrently in the linear system. The modified methods are applicable because Cramer's rule is typically used to solve the linear systems arising in $WZ$ factorization to yield the Z-matrix. We then present results showing that there is no tangible difference in performance time between Cramer's rule and the modified methods in the factorization across recent versions of MATLAB. Additionally, the Frobenius norm obtained with the modified methods in the factorization is better than that obtained with Cramer's rule, irrespective of the MATLAB version used.
    Matched MeSH terms: Artificial Intelligence
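Cramer's rule itself, which the modified methods above build on, can be sketched in a few lines: each unknown x_i is the ratio det(A_i)/det(A), where A_i is the coefficient matrix with column i replaced by the right-hand-side vector. The sketch below is generic and illustrative; it does not implement the authors' modified methods or the $WZ$ factorization.

```python
def det(m):
    # Recursive cofactor expansion; fine for the small systems Cramer's rule targets.
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def cramer_solve(A, b):
    # x_i = det(A_i) / det(A), where A_i is A with column i replaced by b.
    d = det(A)
    if d == 0:
        raise ValueError("singular coefficient matrix")
    return [det([row[:i] + [b[k]] + row[i + 1:] for k, row in enumerate(A)]) / d
            for i in range(len(A))]

A = [[2.0, 1.0], [1.0, 3.0]]
b = [5.0, 10.0]
x = cramer_solve(A, b)  # solves A x = b
```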
  5. Jawahar N, Ponnambalam SG, Sivakumar K, Thangadurai V
    ScientificWorldJournal, 2014;2014:458959.
    PMID: 24790568 DOI: 10.1155/2014/458959
    Products such as cars, trucks, and heavy machinery are assembled on two-sided assembly lines. Assembly line balancing has a significant impact on the performance and productivity of flow-line manufacturing systems and has been an active research area for several decades. This paper addresses the line balancing problem of a two-sided assembly line in which tasks are to be assigned to the left (L) side, the right (R) side, or either side (denoted E). Two objectives, a minimum number of workstations and minimum unbalance time among workstations, are considered for balancing the assembly line. There are two approaches to solving a multiobjective optimization problem: the first combines all the objectives into a single composite function or moves all but one objective to the constraint set; the second determines the Pareto-optimal solution set. This paper proposes two heuristics to evolve the optimal Pareto front for the TALBP under consideration: an Enumerative Heuristic Algorithm (EHA) for problems of small and medium size and a Simulated Annealing Algorithm (SAA) for large-sized problems. The proposed approaches are illustrated with example problems, and their performance is compared on a set of test problems.
    Matched MeSH terms: Artificial Intelligence*
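The simulated-annealing idea behind the SAA heuristic can be sketched on a simplified version of the problem: assigning task times to workstations to minimize an unbalance measure. The neighborhood move, cooling schedule, task times, and unbalance measure below are illustrative assumptions, not the authors' formulation (which also handles L/R/E side constraints and task precedence).

```python
import math
import random

def sa_balance(times, n_stations, iters=20000, t0=10.0, seed=1):
    """Assign task times to stations, minimizing the spread between the most
    and least loaded station (a simple unbalance measure)."""
    rng = random.Random(seed)
    assign = [rng.randrange(n_stations) for _ in times]

    def unbalance(a):
        loads = [0.0] * n_stations
        for t, s in zip(times, a):
            loads[s] += t
        return max(loads) - min(loads)

    cost = unbalance(assign)
    for step in range(iters):
        temp = t0 * (1 - step / iters) + 1e-9       # linear cooling schedule
        i = rng.randrange(len(times))
        old = assign[i]
        assign[i] = rng.randrange(n_stations)       # move: reassign one task
        new_cost = unbalance(assign)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / temp):
            cost = new_cost
        else:
            assign[i] = old                         # revert rejected move
    return assign, cost

assign, spread = sa_balance([4, 7, 3, 5, 6, 2, 8, 1], n_stations=3)
```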
  6. Mak KK, Pichika MR
    Drug Discov Today, 2019 03;24(3):773-780.
    PMID: 30472429 DOI: 10.1016/j.drudis.2018.11.014
    Artificial intelligence (AI) uses personified knowledge and learns from the solutions it produces to address not only specific but also complex problems. Remarkable improvements in computational power coupled with advancements in AI technology could be utilised to revolutionise the drug development process. At present, the pharmaceutical industry is facing challenges in sustaining their drug development programmes because of increased R&D costs and reduced efficiency. In this review, we discuss the major causes of attrition rates in new drug approvals, the possible ways that AI can improve the efficiency of the drug development process and collaboration of pharmaceutical industry giants with AI-powered drug discovery firms.
    Matched MeSH terms: Artificial Intelligence*
  7. Yigitcanlar T, Butler L, Windle E, Desouza KC, Mehmood R, Corchado JM
    Sensors (Basel), 2020 May 25;20(10).
    PMID: 32466175 DOI: 10.3390/s20102988
    In recent years, artificial intelligence (AI) has started to manifest itself at an unprecedented pace. With highly sophisticated capabilities, AI has the potential to dramatically change our cities and societies. Despite its growing importance, the urban and social implications of AI are still an understudied area. In order to contribute to the ongoing efforts to address this research gap, this paper introduces the notion of an artificially intelligent city as the potential successor of the popular smart city brand-where the smartness of a city has come to be strongly associated with the use of viable technological solutions, including AI. The study explores whether building artificially intelligent cities can safeguard humanity from natural disasters, pandemics, and other catastrophes. All of the statements in this viewpoint are based on a thorough review of the current status of AI literature, research, developments, trends, and applications. This paper generates insights and identifies prospective research questions by charting the evolution of AI and the potential impacts of the systematic adoption of AI in cities and societies. The generated insights inform urban policymakers, managers, and planners on how to ensure the correct uptake of AI in our cities, and the identified critical questions offer scholars directions for prospective research and development.
    Matched MeSH terms: Artificial Intelligence*
  8. Kubicek J, Penhaker M, Krejcar O, Selamat A
    Sensors (Basel), 2021 Jan 27;21(3).
    PMID: 33513910 DOI: 10.3390/s21030847
    There are various modern systems for the measurement and consequent acquisition of valuable patient's records in the form of medical signals and images, which are supposed to be processed to provide significant information about the state of biological tissues [...].
    Matched MeSH terms: Artificial Intelligence*
  9. Dikshit A, Pradhan B
    Sci Total Environ, 2021 Dec 20;801:149797.
    PMID: 34467917 DOI: 10.1016/j.scitotenv.2021.149797
    Accurate prediction of any type of natural hazard is a challenging task. Among the various hazards, drought prediction is particularly challenging, as drought lacks a universal definition and is worsening as climate change affects drought events both spatially and temporally. The problem becomes more complex because drought occurrence depends on a multitude of factors ranging from hydro-meteorological to climatic variables. A paradigm shift occurred in this field when it was found that including climatic variables in a data-driven prediction model improves accuracy. However, this understanding has been based primarily on the statistical metrics used to measure model accuracy. The present work explores this finding using an explainable artificial intelligence (XAI) model. The explainable deep learning model development and comparative analysis were performed using known understandings drawn from physics-based models. The work also explores how the model achieves specific results at different spatio-temporal intervals, enabling us to understand the local interactions among the predictors for different drought conditions and drought periods. The drought index used in the study is the Standardized Precipitation Index (SPI) at a 12-month scale, applied to five different regions in New South Wales, Australia, with SHapley Additive exPlanations (SHAP) as the explainability algorithm. The conclusions drawn from the SHAP plots depict the importance of climatic variables at a monthly scale and over varying ranges at an annual scale. We observe that the results obtained from SHAP align with the physical model interpretations, suggesting the need to add climatic variables as predictors in the prediction model.
    Matched MeSH terms: Artificial Intelligence*
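SHAP's attributions are Shapley values from cooperative game theory; for a handful of predictors they can be computed exactly by averaging each feature's marginal contribution over all feature orderings (the SHAP library approximates this efficiently for real models). The toy value function below, with a hypothetical "rain"/"temp" interaction, is purely illustrative and unrelated to the authors' drought model.

```python
from itertools import permutations

def shapley_values(value_fn, features):
    """Exact Shapley values by averaging marginal contributions over all
    orderings of the features; feasible only for a handful of predictors."""
    phi = {f: 0.0 for f in features}
    perms = list(permutations(features))
    for order in perms:
        coalition = set()
        for f in order:
            before = value_fn(coalition)
            coalition.add(f)
            phi[f] += value_fn(coalition) - before   # marginal contribution of f
    for f in features:
        phi[f] /= len(perms)
    return phi

# Toy "model output" as a function of which predictors are present:
# rain contributes 3, temp contributes 1, and they interact (+2 together).
def v(s):
    out = 0.0
    if "rain" in s:
        out += 3.0
    if "temp" in s:
        out += 1.0
    if {"rain", "temp"} <= s:
        out += 2.0
    return out

phi = shapley_values(v, ["rain", "temp"])
```

The attributions sum to the full-coalition output, the efficiency property that makes SHAP plots additive decompositions of a prediction.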
  10. Tahir GA, Loo CK
    Comput Biol Med, 2021 12;139:104972.
    PMID: 34749093 DOI: 10.1016/j.compbiomed.2021.104972
    Food recognition systems have recently garnered much research attention due to their ability to obtain objective measurements of dietary intake, a feature that contributes to the management of various chronic conditions. Challenges such as inter- and intraclass variation, alongside the practical constraints of smart glasses, wearable cameras, and mobile devices, require resource-efficient food recognition models with high classification performance. Furthermore, explainable AI is crucial in health-related domains, as it characterizes model performance, enhancing its transparency and objectivity. Our proposed architecture attempts to address these challenges by drawing on the strengths of transfer learning, initializing MobileNetV3 with weights from a model pre-trained on ImageNet. MobileNetV3 achieves superior performance using the squeeze-and-excitation strategy, assigning unequal weights to different input channels, in contrast to the equal weights used in other variants. Despite being fast and efficient, it can become stuck in local optima like other deep neural networks, reducing the desired classification performance. We overcome this issue by applying the snapshot ensemble approach, which yields an ensemble of models from a single training process without any increase in the required training time. As a result, each snapshot in the ensemble visits a different local minimum before converging to the final solution, which enhances recognition performance. On the challenge of explainability, we argue that explanations cannot be monolithic, since each stakeholder perceives the results and explanations according to different objectives and aims. We therefore propose a user-centered explainable artificial intelligence (AI) framework to increase the trust of the involved parties by inferring and rationalizing the results according to the needs and profile of the user.
    Our framework is comprehensive as a dietary assessment app, as it detects Food/Non-Food, food categories, and ingredients. Experimental results on standard food benchmarks and a newly contributed Malaysian food dataset for ingredient detection demonstrate superior performance on an integrated set of measures over other methodologies.
    Matched MeSH terms: Artificial Intelligence*
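The snapshot-ensemble approach mentioned above typically relies on a cyclic cosine-annealing learning-rate schedule: the rate is repeatedly annealed toward zero and then restarted, and model weights are saved at the end of each cycle so that one training run yields several ensemble members. A minimal sketch of such a schedule (illustrative constants, not the authors' training settings):

```python
import math

def snapshot_lr(step, total_steps, n_cycles, lr_max=0.1):
    """Cyclic cosine-annealing schedule from the snapshot-ensemble idea:
    the LR is annealed to ~0 within each cycle, then restarted at lr_max;
    a model snapshot is saved at the end of each cycle (each local minimum)."""
    steps_per_cycle = total_steps // n_cycles
    pos = (step % steps_per_cycle) / steps_per_cycle  # position within the cycle
    return lr_max / 2 * (math.cos(math.pi * pos) + 1)

total, cycles = 1000, 5
lrs = [snapshot_lr(s, total, cycles) for s in range(total)]
```

The schedule restarts at full rate every 200 steps here, so five snapshots would be collected over the 1000-step run.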
  11. Silalahi DD, Midi H, Arasan J, Mustafa MS, Caliman JP
    Heliyon, 2020 Jan;6(1):e03176.
    PMID: 32042959 DOI: 10.1016/j.heliyon.2020.e03176
    In practice, the collected spectra very often comprise complex overtones and many overlapping peaks, which may lead to misinterpretation because of their significant nonlinear characteristics; a linear solution might not be appropriate. In addition, with a high-dimensional dataset arising from a large number of observations and data points, classical multiple regression will fail to fit. These complexities commonly lead to a multicollinearity problem, and the risk of contamination by multiple outliers and high-leverage points also increases. To address these problems, a new method called Kernel Partial Diagnostic Robust Potential (KPDRGP) is introduced. The method allows a nonlinear solution that maps the original input matrix nonlinearly into a higher-dimensional feature space corresponding to a Reproducing Kernel Hilbert Space (RKHS). In the dimension-reduction step, the method replaces the dot-product calculation between elements of the mapped data with a nonlinear function evaluated in the original input space. To prevent contamination by multiple outliers and high-leverage points, a robust procedure using the Diagnostic Robust Generalized Potentials (DRGP) algorithm was used. Results on simulated and real data verified that the proposed KPDRGP method is superior to non-kernel methods and to some other robust methods with kernel solutions.
    Matched MeSH terms: Artificial Intelligence
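The kernel trick described in the KPDRGP abstract, replacing dot products of mapped data with a kernel function evaluated in the original input space, can be illustrated with the Gaussian (RBF) kernel, whose implicit feature space is an RKHS. The kernel choice and bandwidth below are illustrative assumptions, not the method's actual configuration.

```python
import math

def rbf_kernel(x, y, gamma=0.5):
    """Gaussian (RBF) kernel: an inner product in an implicit, infinite-
    dimensional feature space (an RKHS), computed directly in input space."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

def kernel_matrix(X, gamma=0.5):
    # Replaces the X @ X.T of a linear method: entry (i, j) is <phi(x_i), phi(x_j)>.
    return [[rbf_kernel(xi, xj, gamma) for xj in X] for xi in X]

X = [[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]]
K = kernel_matrix(X)
```

Any algorithm expressed purely in terms of dot products (e.g., kernel PLS or kernel PCA) can then operate on K without ever forming the mapped data explicitly.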
  12. Tsoi K, Yiu K, Lee H, Cheng HM, Wang TD, Tay JC, et al.
    J Clin Hypertens (Greenwich), 2021 03;23(3):568-574.
    PMID: 33533536 DOI: 10.1111/jch.14180
    The prevalence of hypertension is increasing along with an aging population, causing millions of premature deaths annually worldwide. Low awareness of blood pressure (BP) elevation and suboptimal hypertension diagnosis are the major hurdles in effective hypertension management. The advent of artificial intelligence (AI), however, sheds light on new strategies for hypertension management, such as remote support through telemedicine and big-data-derived prediction. There is considerable evidence demonstrating the feasibility of AI applications in hypertension management. A foreseeable trend is the integration of BP measurements with various wearable sensors and smartphones, permitting continuous and convenient monitoring. In the meantime, further investigations are advised to validate the novel prediction and prognostic tools. These revolutionary developments mark a stride toward a future model for the digital management of chronic diseases.
    Matched MeSH terms: Artificial Intelligence
  13. Mohd Faizal AS, Thevarajah TM, Khor SM, Chang SW
    Comput Methods Programs Biomed, 2021 Aug;207:106190.
    PMID: 34077865 DOI: 10.1016/j.cmpb.2021.106190
    Cardiovascular disease (CVD) is the leading cause of death worldwide and is a global health issue. Traditionally, statistical models have been commonly used in CVD risk prediction and assessment. However, artificial intelligence (AI) approaches are rapidly taking hold in the current era of technology to evaluate patient risks and predict CVD outcomes. In this review, we outline various conventional risk scores and prediction models and compare them with AI approaches. The strengths and limitations of both conventional and AI approaches are discussed. In addition, biomarker discovery related to CVD is elucidated, as biomarkers can be used in risk stratification as well as early detection of the disease. Moreover, problems and challenges in current CVD studies are explored. Lastly, future prospects for CVD risk prediction and assessment through multi-modal big-data integrative approaches are proposed.
    Matched MeSH terms: Artificial Intelligence
  14. Rahman MM, Khatun F, Uzzaman A, Sami SI, Bhuiyan MA, Kiong TS
    Int J Health Serv, 2021 10;51(4):446-461.
    PMID: 33999732 DOI: 10.1177/00207314211017469
    The novel coronavirus disease (COVID-19) has spread over 219 countries of the globe as a pandemic, creating alarming impacts on health care, socioeconomic environments, and international relationships. The principal objective of the study is to provide the current technological aspects of artificial intelligence (AI) and other relevant technologies and their implications for confronting COVID-19 and preventing the pandemic's dreadful effects. This article presents AI approaches that have significant contributions in the fields of health care, then highlights and categorizes their applications in confronting COVID-19, such as detection and diagnosis, data analysis and treatment procedures, research and drug development, social control and services, and the prediction of outbreaks. The study addresses the link between the technologies and the epidemics as well as the potential impacts of technology in health care with the introduction of machine learning and natural language processing tools. It is expected that this comprehensive study will support researchers in modeling health care systems and drive further studies in advanced technologies. Finally, we propose future directions in research and conclude that persuasive AI strategies, probabilistic models, and supervised learning are required to tackle future pandemic challenges.
    Matched MeSH terms: Artificial Intelligence
  15. Hermawan A, Amrillah T, Riapanitra A, Ong WJ, Yin S
    Adv Healthc Mater, 2021 10;10(20):e2100970.
    PMID: 34318999 DOI: 10.1002/adhm.202100970
    A fully integrated, flexible, and functional sensing device for exhaled-breath analysis drastically transforms conventional medical diagnosis into non-invasive, low-cost, real-time, and personalized health care. 2D materials based on MXenes offer multiple advantages for accurately detecting various breath biomarkers compared with conventional semiconducting oxides. High surface sensitivity, a large surface-to-weight ratio, room-temperature detection, and easy-to-assemble structures are vital parameters for such sensing devices, and MXenes have demonstrated all these properties both experimentally and theoretically. So far, MXene-based flexible sensors have been successfully fabricated at lab scale and are predicted to be translated into clinical practice within the next few years. This review presents the potential application of MXenes as emerging materials for flexible and wearable sensor devices. The biomarkers in exhaled breath are described first, with emphasis on the metabolic processes and diseases indicated by abnormal biomarkers. Then, the biomarker-sensing performance of the MXene families and strategies for enhancing it are discussed. Fabrication methods for integrating MXenes into various flexible substrates are summarized. Finally, the fundamental challenges and prospects, including portable integration with the Internet of Things (IoT) and artificial intelligence (AI), are addressed with a view to commercialization.
    Matched MeSH terms: Artificial Intelligence
  16. Nagaki K, Furuta T, Yamaji N, Kuniyoshi D, Ishihara M, Kishima Y, et al.
    Chromosome Res, 2021 12;29(3-4):361-371.
    PMID: 34648121 DOI: 10.1007/s10577-021-09676-z
    Observing chromosomes is a time-consuming and labor-intensive process, and chromosomes have been analyzed manually for many years. In the last decade, automated acquisition systems for microscopic images have advanced dramatically thanks to advances in their controlling computer systems, and it is now possible to automatically acquire sets of tiling images consisting of a large number (more than 1000) of images covering large areas of specimens. However, there has been no simple and inexpensive system for efficiently selecting the images that contain mitotic cells. In this paper, a deep learning artificial intelligence (AI) system for classifying chromosomal images, one that can be easily handled by non-data scientists, was applied. With this system, models suited to one's own samples can be easily built on a Macintosh computer with Create ML. As examples, models trained on chromosome images from various plant species were able to classify images containing mitotic cells in samples from plant species not used for training, in addition to samples from the species that were. The system also worked for cells in tissue sections and for tetrads. Since this system is inexpensive and can be easily trained via deep learning using scientists' own samples, it can be used not only for chromosomal image analysis but also for the analysis of other biology-related images.
    Matched MeSH terms: Artificial Intelligence
  17. Bhagat SK, Tiyasha T, Awadh SM, Tung TM, Jawad AH, Yaseen ZM
    Environ Pollut, 2021 Jan 01;268(Pt B):115663.
    PMID: 33120144 DOI: 10.1016/j.envpol.2020.115663
    Hybrid artificial intelligence (AI) models are developed for sediment lead (Pb) prediction at two bay stations, Bramble (BB) and Deception (DB), Australia. A feature selection (FS) algorithm based on extreme gradient boosting (XGBoost) is proposed to extract the correlated input parameters for Pb prediction and is validated against principal component analysis (PCA), recursive feature elimination (RFE), and the genetic algorithm (GA). The XGBoost model is applied with a grid-search strategy (Grid-XGBoost) for predicting Pb and is validated against the commonly used AI models, artificial neural network (ANN) and support vector machine (SVM). The input parameter selection approaches reduced the 21 parameters to between 5 and 9 parameters without losing the information learned during the models' training phase. At the BB station, mean absolute percentage error (MAPE) values of 0.06, 0.32, 0.34, and 0.33 were achieved for the XGBoost-SVM, XGBoost-ANN, XGBoost-Grid-XGBoost, and Grid-XGBoost models, respectively. At the DB station, the lowest MAPE values, 0.25 and 0.24, were attained for the XGBoost-Grid-XGBoost and Grid-XGBoost models, respectively. Overall, the proposed hybrid AI models provide a reliable and robust computer-aided technology for sediment Pb prediction that contributes to the knowledge base for environmental pollution monitoring and assessment.
    Matched MeSH terms: Artificial Intelligence*
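The grid-search strategy underlying Grid-XGBoost can be sketched generically: enumerate every combination of candidate hyperparameter values and keep the best-scoring one. The scoring function and parameter grid below are hypothetical stand-ins for cross-validated model evaluation, not the authors' actual setup.

```python
from itertools import product

def grid_search(score_fn, grid):
    """Exhaustive grid search: evaluate every hyperparameter combination and
    keep the best-scoring one (higher is better)."""
    keys = list(grid)
    best_params, best_score = None, float("-inf")
    for values in product(*(grid[k] for k in keys)):
        params = dict(zip(keys, values))
        s = score_fn(params)
        if s > best_score:
            best_params, best_score = params, s
    return best_params, best_score

# Hypothetical scoring surface standing in for cross-validated accuracy:
# peaks at max_depth = 4 and learning_rate = 0.1.
def score(p):
    return -((p["max_depth"] - 4) ** 2) - abs(p["learning_rate"] - 0.1)

grid = {"max_depth": [2, 4, 6, 8], "learning_rate": [0.01, 0.1, 0.3]}
best, _ = grid_search(score, grid)
```

The cost grows multiplicatively with the grid sizes, which is why grid search is usually paired with a coarse grid or a small set of tuned parameters.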
  18. Hammond ER, Foong AKM, Rosli N, Morbeck DE
    Hum Reprod, 2020 05 01;35(5):1045-1053.
    PMID: 32358601 DOI: 10.1093/humrep/deaa060
    STUDY QUESTION: What is the inter-observer agreement among embryologists for decision to freeze blastocysts of borderline morphology and can it be improved with a modified grading system?

    SUMMARY ANSWER: The inter-observer agreement among embryologists deciding whether to freeze blastocysts of marginal morphology was low and was not improved by a modified grading system.

    WHAT IS KNOWN ALREADY: While inter-observer agreement on which embryo to transfer from a cohort of blastocysts has previously been shown to be good, the impact of grading variability on the decision to freeze borderline blastocysts has not been investigated. Agreement on inner cell mass (ICM) and trophectoderm (TE) grades is only fair, and these factors contribute to the overall grade that influences the decision to freeze.

    STUDY DESIGN, SIZE, DURATION: This was a prospective study involving 18 embryologists working at four different IVF clinics within a single organisation between January 2019 and July 2019.

    PARTICIPANTS/MATERIALS, SETTING, METHODS: All embryologists currently practicing blastocyst grading at a multi-site organisation were invited to participate. The survey was comprised of blastocyst images in three planes and asked (i) the likelihood of freezing and (ii) whether the blastocyst would be frozen based on visual assessment. Blastocysts varied by quality and were categorised as either top (n = 20), borderline (n = 60) or non-viable/degenerate quality (n = 20). A total of 1800 freeze decisions were assessed. To assess the impact of grading criteria on inter-observer agreement for decision to freeze, the survey was taken once when the embryologists used the Gardner criteria and again 6 months after transitioning to a modified Gardner criterion with four grades for ICM and TE. The fourth grade was introduced with the aim to promote higher levels of agreement for the clinical usability decision when the blastocyst was of marginal quality.

    MAIN RESULTS AND THE ROLE OF CHANCE: The inter-observer agreement for the decision to freeze was near perfect (kappa 1.0) for top-quality and non-viable/degenerate blastocysts, and this was not affected by the blastocyst grading criteria used (top quality: P = 0.330; non-viable/degenerate quality: P = 0.18). In contrast, the cohort of borderline blastocysts received a mixed freeze rate (average 52.7%) during the first survey, indicative of blastocysts of uncertain viability and producing significant disagreement on the decision to freeze among the embryologists (kappa 0.304). After transitioning to the modified Gardner criteria with an additional grading tier, the average freeze rate increased (64.8%; P 

    Matched MeSH terms: Artificial Intelligence*
  19. Tiyasha T, Tung TM, Bhagat SK, Tan ML, Jawad AH, Mohtar WHMW, et al.
    Mar Pollut Bull, 2021 Sep;170:112639.
    PMID: 34273614 DOI: 10.1016/j.marpolbul.2021.112639
    Dissolved oxygen (DO) is an important indicator for environmental engineers and ecological scientists seeking to understand the state of river health. This study aims to evaluate the reliability of four feature-selector algorithms, Boruta, genetic algorithm (GA), multivariate adaptive regression splines (MARS), and extreme gradient boosting (XGBoost), in selecting the best-suited predictors from the applied water quality (WQ) parameters, and to compare four tree-based predictive models, namely random forest (RF), conditional random forests (cForest), RANdom forest GEneRator (Ranger), and XGBoost, in predicting changes in DO in the Klang River, Malaysia. The features comprised 15 WQ parameters from monitoring-site data and 7 hydrological components from remote-sensing data. All predictive models performed well on the features selected by the XGBoost and MARS algorithms in terms of the applied statistical evaluators. The best performance among all applied predictive models was noted for the XGBoost predictive model with features selected by the MARS and XGBoost algorithms, with coefficient of determination (R2) values of 0.84 and 0.85, respectively, while the Boruta-XGBoost model performed only marginally in this scenario.
    Matched MeSH terms: Artificial Intelligence*
  20. Bhagat SK, Pyrgaki K, Salih SQ, Tiyasha T, Beyaztas U, Shahid S, et al.
    Chemosphere, 2021 Aug;276:130162.
    PMID: 34088083 DOI: 10.1016/j.chemosphere.2021.130162
    Copper (Cu) ion in wastewater is considered one of the crucial hazardous elements to be quantified. This research was established to predict the adsorption (Ad) of copper ions by attapulgite clay from aqueous solutions using computer-aided models. Three artificial intelligence (AI) models are developed for this purpose: grid-optimization-based random forest (Grid-RF), artificial neural network (ANN), and support vector machine (SVM). Principal component analysis (PCA) is used to select model inputs from different variables, including the initial concentration of Cu (IC), the dosage of attapulgite clay (Dose), contact time (CT), pH, and the addition of NaNO3 (SN). The ANN model is found to predict Ad with minimum root mean square error (RMSE = 0.9283) and maximum coefficient of determination (R2 = 0.9974) when all the variables (i.e., IC, Dose, CT, pH, SN) are considered as input. The prediction accuracy of the Grid-RF model is similar to that of the ANN model when a small number of predictors is used. By prediction accuracy, the models can be ranked ANN-M5 > Grid-RF-M5 > Grid-RF-M4 > ANN-M4 > SVM-M4 > SVM-M5. Overall, the statistical analysis of the results indicates that the ANN and Grid-RF models can be employed as computer-aided models for monitoring and simulating adsorption from aqueous solutions by attapulgite clay.
    Matched MeSH terms: Artificial Intelligence*