Displaying publications 1 - 20 of 51 in total

  1. Nematzadeh H, Motameni H, Mohamad R, Nematzadeh Z
    ScientificWorldJournal, 2014;2014:847930.
    PMID: 25110748 DOI: 10.1155/2014/847930
    Workflow-based web service compositions (WB-WSCs) are one of the main composition categories in service-oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the business process execution language (BPEL) are the main techniques in the WB-WSC category. As web services mature, measuring the quality of composite web services developed with these different techniques has become one of the most important challenges in today's web environments. Businesses should ensure that a composed web service provides the quality their customers require. Quality of service (QoS), which refers to nonfunctional parameters, must therefore be measured so that the quality of a given web service composition can be established. This paper proposes a deterministic analytical method for dependability and performance measurement using Colored Petri nets (CPNs) with explicit routing constructs and the theory of probability. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation.
    Matched MeSH terms: Workflow*
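    The abstract above does not spell out its aggregation rules, so the following is only a minimal sketch, under assumed textbook probability-based rules, of how per-service QoS values are often combined over the routing constructs (sequence, parallel split, probabilistic choice) that such workflow compositions use; it is not the paper's WSET tool or its CPN model.

# Illustrative QoS aggregation over common workflow routing constructs.
# Assumed textbook rules only -- not the paper's WSET tool or CPN model.
from math import prod

def sequence(reliabilities, latencies):
    """Services invoked one after another: reliabilities multiply, latencies add."""
    return prod(reliabilities), sum(latencies)

def parallel(reliabilities, latencies):
    """AND-split: all branches run; the composition waits for the slowest branch."""
    return prod(reliabilities), max(latencies)

def choice(probabilities, reliabilities, latencies):
    """XOR-split: exactly one branch is taken, with known branching probabilities."""
    rel = sum(p * r for p, r in zip(probabilities, reliabilities))
    lat = sum(p * t for p, t in zip(probabilities, latencies))
    return rel, lat

# Two services in sequence, followed by a 70/30 choice between two alternatives.
r_seq, t_seq = sequence([0.99, 0.98], [120, 80])            # latencies in ms
r_cho, t_cho = choice([0.7, 0.3], [0.97, 0.95], [200, 50])
print(sequence([r_seq, r_cho], [t_seq, t_cho]))             # end-to-end estimate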
  2. Abd Elaziz M, Abualigah L, Ibrahim RA, Attiya I
    Comput Intell Neurosci, 2021;2021:9114113.
    PMID: 34976046 DOI: 10.1155/2021/9114113
    Instead of being sent to the cloud, Internet of Things (IoT) workloads can be offloaded to fog computing to boost the quality of service (QoS) needed by many applications. However, the limited availability of continuous computing resources on fog servers constrains IoT applications, since transmitting the large amounts of data generated by IoT devices creates network traffic and increases computational overhead. Task scheduling is therefore the main problem that needs to be solved efficiently. This study proposes an energy-aware model using an enhanced arithmetic optimization algorithm (AOA), called AOAM, which addresses the fog computing task scheduling problem and improves users' QoS by minimizing the makespan. In the proposed AOAM, the search capability of the conventional AOA is enhanced with marine predators algorithm (MPA) search operators to improve solution diversity and avoid local optima. AOAM is validated using several parameters, including various clients, data centers, hosts, virtual machines, and tasks, and standard evaluation measures, including energy consumption and makespan. The results, compared with other state-of-the-art methods, show that AOAM is promising and solves the task scheduling problem effectively.
    Matched MeSH terms: Workflow
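    The makespan that the abstract above says AOAM minimizes can be evaluated for any candidate task-to-node assignment; the sketch below is only an assumed fitness function of that kind, not the AOA/MPA metaheuristic itself, and the run times used are hypothetical.

# Makespan of a candidate task-to-machine assignment -- the quantity an
# energy/makespan-aware scheduler would try to minimize. Illustrative only;
# this is not the paper's AOAM algorithm.

def makespan(exec_time, assignment):
    """exec_time[i][j]: run time of task i on machine j (seconds);
    assignment[i]: index of the machine chosen for task i."""
    load = {}
    for task, machine in enumerate(assignment):
        load[machine] = load.get(machine, 0.0) + exec_time[task][machine]
    return max(load.values())

exec_time = [[4.0, 6.0], [3.0, 2.0], [5.0, 5.0]]   # 3 tasks on 2 fog nodes
print(makespan(exec_time, [0, 1, 0]))              # -> 9.0 (node 0 finishes last)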
  3. Zulkapli NA, Sobi S, Mohd Zubaidi NA, Abdullah JM
    Malays J Med Sci, 2016 Jul;23(4):1-4.
    PMID: 27660539 DOI: 10.21315/mjms2016.23.4.1
    The Malaysian Journal of Medical Sciences (MJMS) has conducted a simple analysis of its scholarly publication activity, based on the auto-generated data compiled from ScholarOne Manuscripts™, an innovative, web-based submission and peer-review workflow solution for scholarly publishers. The performance of the MJMS from 2014 to 2015 is reported in this editorial, with a focus on the pattern of manuscript submission, geographical contributors and the acceptance-rejection rate. The total number of manuscript submissions increased from 264 in 2014 to 272 in 2015. Malaysians are the main contributors to the MJMS. The total number of manuscript rejections following the review process was 79 (29.9%) in 2014, increasing to 92 (33.8%) the following year, in accordance with the exacting quality control criteria applied by the journal's editor to the submitted manuscripts.
    Matched MeSH terms: Workflow
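    As a quick check, the rejection rates quoted above follow directly from the submission and rejection counts:

    $$\frac{79}{264} \approx 29.9\%, \qquad \frac{92}{272} \approx 33.8\%$$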
  4. Olufisayo O, Mohd Yusof M, Ezat Wan Puteh S
    Stud Health Technol Inform, 2018;255:112-116.
    PMID: 30306918
    Despite the widespread use of clinical decision support systems (CDSS) and their alert functions, there has been an increase in medical errors and adverse events, as well as issues regarding patient safety, quality and efficiency. The appropriateness of CDSS must be properly evaluated by ensuring that the CDSS provides clinicians with useful information at the point of care. Inefficient clinical workflow affects clinical processes; hence, it is necessary to identify processes in the healthcare system that affect providers' workflows. The Lean method was used to eliminate waste (non-value-added activities) that affects the appropriate use of CDSS. Ohno's seven-waste model was used to categorize waste in the context of healthcare and information technology.
    Matched MeSH terms: Workflow*
  5. Srivastava G, Padhiary SK, Mohanty N, Patil PG, Panda S, Cobo-Vazquez C, et al.
    Acta Odontol Scand, 2024 Jun 19;83:392-403.
    PMID: 38895776 DOI: 10.2340/aos.v83.40870
    OBJECTIVES: To evaluate the current evidence of digital workflow feasibility based on the data acquisition methods and the software tools used to fabricate intraoral prostheses for patients with partial or total maxillary and mandibular defects.

    MATERIALS AND METHODS: An electronic search was performed in PubMed, SCOPUS, and Web of Science using a combination of relevant keywords: digital workflow, digital designing, computer-assisted design-computer aided manufacturing, 3D printing, maxillectomy, and mandibulectomy. The Joanna Briggs Institute Critical Appraisal Tool was used to assess the quality of evidence in the studies reviewed.

    RESULTS: From a total of 542 references, 33 articles were selected, including 25 on maxillary prostheses and 8 on mandibular prostheses. The use of digital workflows was limited to one or two steps of the fabrication of the prostheses, and only four studies described a complete digital workflow. The most preferred method for data acquisition was intraoral scanning with or without a cone beam computed tomography combination.

    CONCLUSION: Currently, the fabrication process of maxillofacial prostheses requires combining digital and conventional methods. Simplifying the data acquisition methods and providing user-friendly and affordable software may encourage clinicians to use the digital workflow more frequently for patients requiring maxillofacial prostheses.

    Matched MeSH terms: Workflow*
  6. Zairina Ibrahim, Md Gapar Md Johar
    MyJurnal
    The software development life cycle (SDLC) is an important element of the phases involved in developing an application. There is a need to improve the sequence of methodologies used in software development, and the SDLC is therefore crucial for ensuring that the right skills are placed at the right points in the workflow. This research contributes a new approach to the system development workflow, with the aim of properly managing system development projects. It begins with background data on the previous mode of operation in the sampled teams, as shared by the stakeholders of the transformation projects and by team members using the newly proposed Analysis System Development Framework (ASDF) method. Key findings are then presented for the steps of software development: (1) input for the User Requirement Specification (URS), (2) the System Requirement Specification (SRS), (3) the module development process, (4) the database development process, (5) the User Acceptance Testing (UAT) process, and (6) output for Final Acceptance Testing (FAT), with empowerment at every level based on the ASDF method. This paper contributes significantly to supporting the perception of high-quality skills within a team, resulting in better software development performance.
    Matched MeSH terms: Workflow
  7. Khalid H, Hashim SJ, Ahmad SMS, Hashim F, Chaudhary MA
    Sensors (Basel), 2021 Feb 18;21(4).
    PMID: 33670675 DOI: 10.3390/s21041428
    The development of the industrial Internet of Things (IIoT) promotes the integration of cross-platform systems in fog computing, which enables users to access multiple applications located in different geographical locations. Fog users at the network's edge communicate with many fog servers in different fogs and with newly joined servers that they have never contacted before. This communication complexity brings enormous security challenges and potential vulnerability to malicious threats. An attacker may replace an edge device with a fake one and authenticate it as a legitimate device. Therefore, to prevent unauthorized users from accessing fog servers, we propose a new secure and lightweight multi-factor authentication scheme for cross-platform IoT systems (SELAMAT). The proposed scheme extends the Kerberos workflow and utilizes an AES-ECC algorithm for efficient encryption key management and secure communication between the edge nodes and the fog node servers, establishing secure mutual authentication. The scheme's security was analysed through formal verification with the widely accepted AVISPA tool, and Burrows-Abadi-Needham logic (BAN logic) was used to prove secure mutual authentication. The results show that the SELAMAT scheme provides better security and functionality, and lower communication and computation costs, than existing schemes.
    Matched MeSH terms: Workflow
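    The abstract above pairs ECC with AES but gives no implementation detail; the sketch below is only an assumed illustration of that general pairing (an ECDH key agreement stretched into an AES-GCM session key with the third-party pyca/cryptography package), not the SELAMAT protocol, its key schedule, or its multi-factor steps.

# Hedged sketch: ECDH key agreement feeding an AES-GCM session key, using the
# pyca/cryptography package. Illustrates the generic AES + ECC pairing only;
# it is NOT the SELAMAT protocol or its Kerberos-style ticket exchange.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

edge_priv = ec.generate_private_key(ec.SECP256R1())   # edge-node key pair
fog_priv = ec.generate_private_key(ec.SECP256R1())    # fog-server key pair

# Each side would derive the same shared secret from its own private key and
# the peer's public key, then stretch it into a 256-bit AES key with HKDF.
shared = edge_priv.exchange(ec.ECDH(), fog_priv.public_key())
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"session").derive(shared)

nonce = os.urandom(12)
token = AESGCM(key).encrypt(nonce, b"auth-request", None)   # confidential + authenticated
print(AESGCM(key).decrypt(nonce, token, None))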
  8. Agbolade O, Nazri A, Yaakob R, Ghani AAA, Cheah YK
    PeerJ Comput Sci, 2020;6:e249.
    PMID: 33816901 DOI: 10.7717/peerj-cs.249
    Over the years, neuroscientists and psychophysicists have been asking whether data acquisition for facial analysis should be performed holistically or with local feature analysis. This has led to various advanced methods of face recognition being proposed, especially techniques using facial landmarks. Current 3D facial landmark methods rely on a mathematically complex and time-consuming workflow of semi-landmark sliding tasks. This paper proposes a homologous multi-point warping for 3D facial landmarking, which is verified experimentally on each of the target objects in a given dataset using 500 landmarks (16 anatomical fixed points and 484 sliding semi-landmarks). This is achieved by building a template mesh as a reference object and applying this template to each of the targets in three datasets using an artificial deformation approach. The semi-landmarks are slid along tangents to the curves or surfaces until the bending energy between the template and a target form is minimal. The results indicate that our method can be used to investigate shape variation for multiple datasets, as demonstrated on three databases (Stirling, FRGC and Bosphorus).
    Matched MeSH terms: Workflow
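    The bending energy minimized in the sliding step above is, in geometric morphometrics, usually the thin-plate spline bending energy; its familiar two-dimensional form is shown below for reference (in practice a discrete bending-energy matrix computed from the landmark configuration is minimized, and the 3D case uses the analogous higher-dimensional expression). The paper's exact formulation is not given in the abstract.

    $$E_b(f) = \iint_{\mathbb{R}^2} \left[ \left(\frac{\partial^2 f}{\partial x^2}\right)^2 + 2\left(\frac{\partial^2 f}{\partial x\,\partial y}\right)^2 + \left(\frac{\partial^2 f}{\partial y^2}\right)^2 \right] dx\,dy$$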
  9. Yusof MM
    Stud Health Technol Inform, 2019;257:508-512.
    PMID: 30741248
    The evaluation of Health Information System (HIS)-induced medication errors is crucial in efforts to understand their causes, impact and mitigation measures when trying to minimize errors and increase patient safety. A review of evaluation studies on HIS-induced medication errors was carried out, which indicated the need to further structure the complex socio-technical aspects of the subject. To satisfy this requirement, a new framework was introduced for the evaluation of HIS-induced error management in clinical settings. The proposed HOPT-fit framework (Human, Organization, Process and Technology-fit) was developed after critically appraising existing findings in HIS-related evaluation studies. It also builds on previous models related to HIS evaluation, in particular the HOT-fit (Human, Organization and Technology-fit) framework, error models, business process management, the Lean method, and medication workflow. HOPT-fit incorporates the concept of fit between the four factors. The framework has the potential to be used as a tool to conduct a structured, systematic, and comprehensive HIS evaluation.
    Matched MeSH terms: Workflow
  10. Ahmad Z, Jehangiri AI, Ala'anzy MA, Othman M, Umar AI
    Sensors (Basel), 2021 Oct 30;21(21).
    PMID: 34770545 DOI: 10.3390/s21217238
    Cloud computing is a fully fledged, mature and flexible computing paradigm that provides services to scientific and business applications in a subscription-based environment. Scientific applications such as Montage and CyberShake are organized scientific workflows with data- and compute-intensive tasks and some special characteristics: their tasks are executed in patterns of integration, disintegration, pipelining, and parallelism, and thus require special attention to task management and data-oriented resource scheduling and management. Tasks executed in a pipeline are considered bottleneck executions; their failure renders the whole execution futile, so fault-tolerance-aware execution is required. Tasks executed in parallel require similar instances of cloud resources, so cluster-based execution may improve system performance in terms of makespan and execution cost. This research work therefore presents a cluster-based, fault-tolerant, data-intensive (CFD) scheduling strategy for scientific applications in cloud environments. The CFD strategy addresses the data intensiveness of scientific workflow tasks with cluster-based, fault-tolerant mechanisms. The Montage scientific workflow was used as the simulation case, and the results of the CFD strategy were compared with three well-known heuristic scheduling policies: (a) MCT, (b) Max-min, and (c) Min-min. The simulation results showed that the CFD strategy reduced the makespan by 14.28%, 20.37%, and 11.77%, respectively, compared with the three existing policies. Similarly, CFD reduced the execution cost by 1.27%, 5.3%, and 2.21%, respectively. With the CFD strategy, the service-level agreement (SLA) is not violated with regard to time and cost constraints, whereas the existing policies violate it numerous times.
    Matched MeSH terms: Workflow
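    For reference, the percentage reductions quoted above are of the standard relative-improvement form; the worked numbers below are hypothetical, not the paper's raw makespan values.

    $$\text{reduction}\,(\%) = \frac{M_{\text{baseline}} - M_{\text{CFD}}}{M_{\text{baseline}}} \times 100, \qquad \text{e.g. } M_{\text{baseline}} = 500\ \text{s},\ M_{\text{CFD}} = 425\ \text{s} \;\Rightarrow\; 15\%.$$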
  11. Belhaj AF, Elraies KA, Alnarabiji MS, Abdul Kareem FA, Shuhli JA, Mahmood SM, et al.
    Chem Eng J, 2021 Feb 15;406:127081.
    PMID: 32989375 DOI: 10.1016/j.cej.2020.127081
    Throughout the application of enhanced oil recovery (EOR), surfactant adsorption is considered the leading constraint on both the successful implementation and the economic viability of the process. In this study, a comprehensive investigation of the adsorption behaviour of individual nonionic and anionic surfactants, namely alkyl polyglucoside (APG) and alkyl ether carboxylate (AEC), was performed using static adsorption experiments, isotherm modelling (Langmuir, Freundlich, Sips, and Temkin models), adsorption simulation using a state-of-the-art method, binary mixture prediction using the modified extended Langmuir (MEL) model, and artificial neural network (ANN) prediction. Static adsorption experiments revealed a higher adsorption capacity for APG than for AEC, with Sips being the best-fitting model (R² of 0.9915 and 0.9926 for APG and AEC, respectively). Both monolayer and multilayer adsorption took place in a heterogeneous adsorption system with a non-uniform distribution of surfactant molecules, which was in remarkable agreement with the simulation results. The APG/AEC binary mixture prediction contradicted the individual experimental behaviour, showing that AEC had a greater affinity to adsorb when competing with APG for adsorption sites on the rock surface. The adopted ANN model showed good agreement with the experimental data, and the simulated adsorption values for APG and AEC showed a decreasing trend as temperature increased. Simulating binary surfactant adsorption offers the considerable advantage of demonstrating the binary system's behaviour with less experimental data, and using an ANN for such predictions can reduce experimental time and operating cost while giving feasible predictions compared with other computational methods. The integrated workflow followed in this study is innovative, as it has not been employed before for surfactant adsorption studies.
    Matched MeSH terms: Workflow
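    For reference, commonly used forms of the four isotherms named above are given below, where q_e is the equilibrium adsorption capacity and C_e the equilibrium concentration; parameter notation varies between papers, and the paper's exact parameterisation is not stated in the abstract.

    $$\text{Langmuir: } q_e = \frac{q_{\max} K_L C_e}{1 + K_L C_e}, \qquad \text{Freundlich: } q_e = K_F\, C_e^{1/n},$$
    $$\text{Sips: } q_e = \frac{q_{\max} (K_S C_e)^{n_S}}{1 + (K_S C_e)^{n_S}}, \qquad \text{Temkin: } q_e = \frac{RT}{b_T}\ln(K_T C_e).$$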
  12. Tilley A, Dos Reis Lopes J, Wilkinson SP
    PLoS One, 2020;15(11):e0234760.
    PMID: 33186386 DOI: 10.1371/journal.pone.0234760
    Small-scale fisheries are responsible for landing half of the world's fish catch, yet there are very sparse data on these fishing activities and the associated fisheries production in time and space. Fisheries-dependent data underpin scientific guidance of management and conservation of fisheries systems, but it is inherently difficult to generate robust and comprehensive data for small-scale fisheries, particularly given their dispersed and diverse nature. In tackling this challenge, we use open source software components, including the Shiny R package, to build PeskAAS: an adaptable and scalable digital application that enables the collation, classification, analysis and visualisation of small-scale fisheries catch and effort data. We piloted and refined this system in Timor-Leste, a small island developing nation. The features that make PeskAAS fit for purpose are that it is: (i) fully open-source and free to use; (ii) component-based, flexible and able to integrate vessel tracking data with catch records; (iii) able to perform spatial and temporal filtering of fishing productivity by fishing method and habitat; (iv) integrated with species-specific length-weight parameters from FishBase; and (v) controlled through a click-button dashboard, co-designed with fisheries scientists and government managers, that enables easy-to-read data summaries and interpretation of context-specific fisheries data. With limited training and code adaptation, the PeskAAS workflow has been used as a framework on which to build and adapt systematic, standardised data collection for small-scale fisheries in other contexts. Automated analytics of these data can provide fishers, managers and researchers with insights into fishing effort, fisheries status, catch rates, economic efficiency and geographic preferences and limits that can potentially guide management and livelihood investments.
    Matched MeSH terms: Workflow
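    The species-specific length-weight parameters from FishBase mentioned in item (iv) above are conventionally applied through the relationship W = a·L^b (W in grams, L in centimetres); the sketch below illustrates that relationship only and is not code from PeskAAS (which is built in R/Shiny), and the parameter values are made-up placeholders rather than real FishBase entries.

# Estimating catch weight from length records via the standard FishBase
# length-weight relationship W = a * L**b. Illustrative only -- not PeskAAS
# code; the species parameters below are hypothetical placeholders.

def estimated_weight_g(length_cm: float, a: float, b: float) -> float:
    return a * length_cm ** b

catch_lengths_cm = [24.5, 31.0, 28.2]     # hypothetical landing records
a, b = 0.012, 3.05                        # placeholder species parameters
total_kg = sum(estimated_weight_g(L, a, b) for L in catch_lengths_cm) / 1000.0
print(round(total_kg, 2))                 # total estimated landed weight (kg)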
  13. Olakotan OO, Yusof MM
    J Eval Clin Pract, 2021 Aug;27(4):868-876.
    PMID: 33009698 DOI: 10.1111/jep.13488
    RATIONALE, AIMS, AND OBJECTIVES: Clinical decision support (CDS) generates excessive alerts that disrupt the workflow of clinicians. Therefore, the inefficient clinical processes that contribute to the misfit between CDS alerts and workflow must be evaluated. This study evaluates the appropriateness of CDS alerts in supporting clinical workflow from a socio-technical perspective.

    METHOD: A qualitative case study evaluation was conducted at a 620-bed public teaching hospital in Malaysia using interview, observation, and document analysis to investigate the features and functions of alert appropriateness and workflow-related issues in cardiological and dermatological settings. The current-state map for the medication prescribing process was also modelled to identify problems pertinent to CDS alert appropriateness.

    RESULTS: The main findings showed that CDS was not well designed to fit into a clinician's workflow owing to technology (usability, alert content, and alert timing), human (training, perception, knowledge, and skills), organizational (rules and regulations, privacy, and security), and process (documenting patient information, overriding the default option, waste, and delay) factors that impede the use of CDS and its alert function. We illustrate how alerts affect workflow in clinical processes using a Lean tool known as value stream mapping. This study also proposes how CDS alerts should be integrated into clinical workflows to optimize their potential to enhance patient safety.

    CONCLUSION: The design and implementation of CDS alerts should be aligned with and incorporate socio-technical factors. Process improvement methods such as Lean can be used to enhance the appropriateness of CDS alerts by identifying inefficient clinical processes that impede the fit of these alerts into clinical workflow.

    Matched MeSH terms: Workflow
  14. Olakotan OO, Mohd Yusof M
    Health Informatics J, 2021 Apr 16;27(2):14604582211007536.
    PMID: 33853395 DOI: 10.1177/14604582211007536
    A CDSS generates a high number of inappropriate alerts that interrupt the clinical workflow. As a result, clinicians silence, disable, or ignore alerts, thereby undermining patient safety. The effectiveness and appropriateness of CDSS alerts therefore need to be evaluated. A systematic review was carried out to identify the factors that affect CDSS alert appropriateness in supporting clinical workflow. Seven electronic databases (PubMed, Scopus, ACM, Science Direct, IEEE, Ovid Medline, and Ebscohost) were searched for English-language articles published between 1997 and 2018. Seventy-six papers met the inclusion criteria, of which 26, 24, 15, and 11 are retrospective cohort, qualitative, quantitative, and mixed-method studies, respectively. The review highlights various factors influencing the appropriateness and efficiency of CDSS alerts. These factors are categorized into technology, human, organization, and process aspects using a combination of approaches, including a socio-technical framework, the five rights of CDSS, and Lean. Most CDSS alerts were not properly designed based on human factors methods and principles, which explains the high rate of alert overrides in clinical practice. The identified factors and recommendations from the review may offer valuable insights into how CDSS alerts can be designed appropriately to support clinical workflow.
    Matched MeSH terms: Workflow
  15. Farook TH, Jamayet NB, Abdullah JY, Asif JA, Rajion ZA, Alam MK
    Comput Biol Med, 2020 Mar;118:103646.
    PMID: 32174323 DOI: 10.1016/j.compbiomed.2020.103646
    OBJECTIVE: To design and compare the outcome of commercial (CS) and open source (OS) software-based 3D prosthetic templates for rehabilitation of maxillofacial defects using a low powered personal computer setup.

    METHOD: Medical image data for five types of defects were selected, segmented, converted and decimated to 3D polygon models on a personal computer. The models were transferred to computer-aided design (CAD) software, which aided in designing the prosthesis according to the virtual models. Two templates were designed for each defect, one with an OS (free) system and one with CS. The parameters for analysis were the virtual volume, the Dice similarity coefficient (DSC) and the Hausdorff distance (HD), computed with the OS point-cloud comparison tool.

    RESULT: There was no significant difference (p > 0.05) between CS and OS when comparing the volume of the template outputs. While HD was within 0.05-4.33 mm, evaluation of the percentage similarity and spatial overlap following the DSC showed an average similarity of 67.7% between the two groups. The highest similarity was with orbito-facial prostheses (88.5%) and the lowest with facial plate prosthetics (28.7%).

    CONCLUSION: Although CS and OS pipelines are capable of producing templates which are aesthetically and volumetrically similar, there are slight comparative discrepancies in the landmark position and spatial overlap. This is dependent on the software, associated commands and experienced decision-making. CAD-based templates can be planned on current personal computers following appropriate decimation.

    Matched MeSH terms: Workflow
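    For reference, the two comparison metrics used in the study above are conventionally defined as follows, where A and B are the compared template volumes or point sets and d is Euclidean distance; the point-cloud tool's exact implementation may differ.

    $$\mathrm{DSC}(A,B) = \frac{2\,|A \cap B|}{|A| + |B|}, \qquad d_H(A,B) = \max\left\{ \sup_{a \in A}\,\inf_{b \in B} d(a,b),\; \sup_{b \in B}\,\inf_{a \in A} d(a,b) \right\}$$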
  16. Goo CL, Tan KB
    Case Rep Dent, 2017;2017:9373818.
    PMID: 28396807 DOI: 10.1155/2017/9373818
    This report describes the clinical and technical aspects of the oral rehabilitation of an edentulous patient with a knife-edge ridge at the mandibular anterior edentulous region, using implant-retained overdentures. The application of computer-aided design and computer-aided manufacturing (CAD/CAM) in the fabrication of the overdenture framework simplifies the laboratory process for the implant prostheses. The Nobel Procera CAD/CAM System was utilised to produce a lightweight titanium overdenture bar with locator attachments. It is proposed that the digital workflow of a CAD/CAM-milled implant overdenture bar avoids numerous technical steps and the possibility of casting errors involved in the conventional casting of such bars.
    Matched MeSH terms: Workflow
  17. Elpa DP, Prabhu GRD, Wu SP, Tay KS, Urban PL
    Talanta, 2020 Feb 01;208:120304.
    PMID: 31816721 DOI: 10.1016/j.talanta.2019.120304
    The developments in mass spectrometry (MS) in the past few decades reveal the power and versatility of this technology. MS methods are utilized in routine analyses as well as research activities involving a broad range of analytes (elements and molecules) and countless matrices. However, manual MS analysis is gradually becoming a thing of the past. In this article, the available MS automation strategies are critically evaluated. Automation of analytical workflows culminating in MS detection encompasses automated operations in any of the steps related to sample handling/treatment before MS detection, sample introduction, MS data acquisition, and MS data processing. Automated MS workflows help to overcome the intrinsic limitations of MS methodology regarding reproducibility, throughput, and the expertise required to operate MS instruments. Such workflows often comprise automated off-line and on-line steps such as sampling, extraction, derivatization, and separation. The most common instrumental tools include autosamplers, multi-axis robots, flow injection systems, and lab-on-a-chip devices. Prototyping customized automated MS systems is a way to introduce non-standard automated features to MS workflows. The review highlights the enabling role of automated MS procedures in various sectors of academic research and industry. Examples include applications of automated MS workflows in bioscience, environmental studies, and the exploration of outer space.
    Matched MeSH terms: Workflow
  18. Mohd Yusof M, Takeda T, Mihara N, Matsumura Y
    Stud Health Technol Inform, 2020 Jun 16;270:1036-1040.
    PMID: 32570539 DOI: 10.3233/SHTI200319
    Health information systems (HIS) and clinical workflows generate medication errors that affect the quality of patient care. Rigorous evaluation of the medication process's error risk, control, and impact on clinical practice enables an understanding of the latent and active factors that contribute to HIS-induced errors. This paper reports the preliminary findings of an evaluation case study of a 1000-bed Japanese secondary care teaching hospital using observation, interview, and document analysis methods. Findings were analysed from a process perspective by adopting a recently introduced framework known as Human, Organisation, Process, and Technology-fit. Process factors influencing medication error risk include template- and calendar-based systems, intuitive design, barcode checks, ease of use, alerts, policy, systematic task organisation, and safety culture. Approaches for managing medication errors also play an important role in error reduction and clinical workflow.
    Matched MeSH terms: Workflow
  19. Olakotan O, Mohd Yusof M, Ezat Wan Puteh S
    Stud Health Technol Inform, 2020 Jun 16;270:906-910.
    PMID: 32570513 DOI: 10.3233/SHTI200293
    Clinical decision support systems (CDSSs) provide vital information for managing patients by advising clinicians, through alerts or reminders, about adverse events and medication errors. Clinicians receive a high number of alerts, resulting in alert override and workflow disruptions. A systematic review was carried out to identify factors affecting CDSS alert appropriateness in supporting clinical workflows, using a recently introduced framework. The review identified several factors influencing CDSS alert appropriateness: technology (usability, alert presentation, workload and data entry), human (training, knowledge and skills, attitude and behavior), organization (rules and regulation, privacy and security) and process (waste, delay, tuning and optimization). The findings can be used to guide the design of CDSS alerts and minimise potential safety hazards associated with CDSS use.
    Matched MeSH terms: Workflow
  20. Tan YC, Mustangin M, Rosli N, Wan Ahmad Kammal WSE, Md Isa N, Low TY, et al.
    Cryobiology, 2024 Mar;114:104843.
    PMID: 38158171 DOI: 10.1016/j.cryobiol.2023.104843
    Coolant-assisted liquid nitrogen (LN) flash freezing of tissues has been widely adopted to preserve tissue morphology for histopathological annotation in mass spectrometry-based spatial proteomics techniques. However, existing coolants pose health risks upon inhalation and are expensive. To overcome this challenge, we present a pilot study introducing the EtOH-LN workflow, which demonstrates the feasibility of using 95% ethanol as a safer and easily accessible alternative to existing coolants for LN-based cryoembedding of frozen tissues. Our study reveals that both the EtOH-LN and LN-only cryoembedding workflows exhibit significantly reduced freezing artifacts compared with cryoembedding in a cryostat, and that the EtOH-LN workflow successfully restored the tissue architecture from freezing artifacts; the impact of the EtOH-LN workflow on the molecular profiles of the tissues was also examined.
    Matched MeSH terms: Workflow