Displaying publications 81 - 100 of 754 in total

  1. Goh A, Kum YL, Mak SY, Quek YT
    PMID: 11187482
    Health Level 7 (HL7) message semantics allows effective functional implementation of Electronic Medical Record (EMR) interchange systems--encompassing both clinical and administrative (i.e. demographic and financial) information--at the expense of complexity with respect to the Protocol Data Unit (PDU) structure and the client-side application architecture. In this paper we feature the use of Extensible Markup Language (XML) document-object modelling and Java client-server connectivity towards the implementation of a Web-based system for EMR transaction processing. Our solution features an XML-based description of EMR templates, which are subsequently transcribed into Hypertext Markup Language (HTML)-JavaScript forms. This allows the client-side user interface and the server-side functionality--i.e. message validation, authentication and database connectivity--to be handled through standard Web client-server mechanisms, the primary assumption being the availability of a browser capable of rendering XML documents and the associated stylesheets. We assume usage of the Internet as the interchange medium, hence the necessity for authentication and data privacy mechanisms, both of which can be constructed using standard Java-based building blocks.
    Matched MeSH terms: Software*
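The XML-to-HTML transcription step described in entry 1 can be pictured with a minimal, hypothetical sketch: an XML description of an EMR template is walked with Python's standard xml.etree library and emitted as an HTML form. The template schema (emr-template, field, and their attributes) is invented here for illustration and is not taken from the paper.

```python
# Minimal sketch of transcribing an XML EMR template into an HTML form.
# The template schema (field/@name, field/@type, field/@label) is hypothetical.
import xml.etree.ElementTree as ET

TEMPLATE = """
<emr-template name="demographics">
  <field name="patient_id" type="text" label="Patient ID"/>
  <field name="dob" type="date" label="Date of Birth"/>
</emr-template>
"""

def template_to_html(xml_text: str) -> str:
    root = ET.fromstring(xml_text)
    rows = []
    for field in root.findall("field"):
        rows.append(
            '<label>{label}: <input name="{name}" type="{type}"/></label>'.format(
                label=field.get("label"), name=field.get("name"), type=field.get("type")
            )
        )
    return '<form method="post">\n  ' + "\n  ".join(rows) + "\n</form>"

print(template_to_html(TEMPLATE))
```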
  2. Sundaram A, Subramaniam H, Ab Hamid SH, Mohamad Nor A
    PeerJ, 2024;12:e17133.
    PMID: 38563009 DOI: 10.7717/peerj.17133
    BACKGROUND: In the current era of rapid technological innovation, our lives are becoming more closely intertwined with digital systems. Consequently, every human action generates a valuable repository of digital data. In this context, data-driven architectures are pivotal for organizing, manipulating, and presenting data to facilitate positive computing through ensemble machine learning models. Moreover, the COVID-19 pandemic underscored a substantial need for a flexible mental health care architecture. This architecture, inclusive of machine learning predictive models, has the potential to benefit a larger population by identifying individuals at a heightened risk of developing various mental disorders.

    OBJECTIVE: Therefore, this research aims to create a flexible mental health care architecture that leverages data-driven methodologies and ensemble machine learning models. The objective is to proficiently structure, process, and present data for positive computing. The adaptive data-driven architecture facilitates customized interventions for diverse mental disorders, fostering positive computing. Consequently, improved mental health care outcomes and enhanced accessibility for individuals with varied mental health conditions are anticipated.

    METHOD: Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, the researchers conducted a systematic literature review in databases indexed in Web of Science to identify the existing strengths and limitations of software architecture relevant to our adaptive design. The systematic review was registered in PROSPERO (CRD42023444661). Additionally, a mapping process was employed to derive essential paradigms serving as the foundation for the research architectural design. To validate the architecture based on its features, professional experts utilized a Likert scale.

    RESULTS: Through the review, the authors identified six fundamental paradigms crucial for designing architecture. Leveraging these paradigms, the authors crafted an adaptive data-driven architecture, subsequently validated by professional experts. The validation resulted in a mean score exceeding four for each evaluated feature, confirming the architecture's effectiveness. To further assess the architecture's practical application, a prototype architecture for predicting pandemic anxiety was developed.

    Matched MeSH terms: Software
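As a rough illustration of the kind of ensemble machine learning model such an architecture would host (entry 2), the sketch below trains a soft-voting ensemble with scikit-learn on synthetic data. The features and the binary label are placeholders, not the study's mental health dataset.

```python
# Hedged sketch of an ensemble classifier of the kind the architecture hosts;
# synthetic data stands in for real mental-health screening features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("rf", RandomForestClassifier(random_state=0))],
    voting="soft",  # average predicted probabilities across member models
)
ensemble.fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, ensemble.predict(X_te)))
```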
  3. Dabbagh M, Lee SP
    ScientificWorldJournal, 2014;2014:737626.
    PMID: 24982987 DOI: 10.1155/2014/737626
    Due to budgetary deadlines and time-to-market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of the requirements that need to be considered first during the software development process. To achieve a high-quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed so far, no particular method or approach considers both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the prioritization of functional and nonfunctional requirements. Applying the proposed approach produces two separate prioritized lists of functional and nonfunctional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment comparing it with two state-of-the-art approaches, the analytic hierarchy process (AHP) and the hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in terms of actual time consumption while preserving a high level of agreement with the results produced by the other two approaches.
    Matched MeSH terms: Software*
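For context on one of the baselines in entry 3, the analytic hierarchy process derives priority weights for requirements from a pairwise-comparison matrix via its principal eigenvector. A minimal sketch, with illustrative comparison values that are not taken from the paper's experiment:

```python
# AHP priority weights from a pairwise-comparison matrix (illustrative values).
import numpy as np

# A[i, j] = how much more important requirement i is than requirement j
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()   # normalized priority vector
print("priority weights:", np.round(weights, 3))
```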
  4. Teo BG, Dhillon SK
    BMC Bioinformatics, 2019 Dec 24;20(Suppl 19):658.
    PMID: 31870297 DOI: 10.1186/s12859-019-3210-x
    BACKGROUND: Studying the structural and functional morphology of small organisms such as monogeneans is difficult due to the lack of visualization in three dimensions. One possible way to resolve this visualization issue is to create digital 3D models which may aid researchers in studying the morphology and function of the monogenean. However, the development of 3D models is a tedious procedure, as one has to repeat an entire complicated modelling process for every new target 3D shape using comprehensive 3D modelling software. This study was designed to develop an alternative 3D modelling approach to build 3D models of monogenean anchors, which can be used to understand these morphological structures in three dimensions. This alternative 3D modelling approach aims to avoid repeating the tedious modelling procedure for every single target 3D model from scratch.

    RESULT: An automated 3D modelling pipeline empowered by an Artificial Neural Network (ANN) was developed. This automated 3D modelling pipeline enables automated deformation of a generic 3D model of a monogenean anchor into another target 3D anchor. The 3D modelling pipeline empowered by the ANN managed to automate the generation of the 8 target 3D models (representing 8 species: Dactylogyrus primaries, Pellucidhaptor merus, Dactylogyrus falcatus, Dactylogyrus vastator, Dactylogyrus pterocleidus, Dactylogyrus falciunguis, Chauhanellus auriculatum and Chauhanellus caelatus) of the monogenean anchor from the respective 2D illustration inputs without repeating the tedious modelling procedure.

    CONCLUSIONS: Despite some constraints and limitations, the automated 3D modelling pipeline developed in this study has demonstrated a working application of a machine learning approach to 3D modelling. This study has not only developed an automated 3D modelling pipeline but has also demonstrated a cross-disciplinary research design that integrates machine learning into a specific domain of study, such as 3D modelling of biological structures.

    Matched MeSH terms: Software
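A toy stand-in for the idea in entry 4, assuming a supervised regressor can learn the mapping from flattened 2D landmark coordinates to 3D vertex displacements that deform a generic anchor mesh; the network size, data shapes, and synthetic data are all placeholders, not the paper's actual pipeline.

```python
# Toy sketch: learn a mapping from 2D landmark coordinates (as extracted from
# an illustration) to 3D vertex displacements that deform a generic anchor
# mesh. All shapes and training data below are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_samples, n_landmarks, n_vertices = 200, 6, 50
X = rng.normal(size=(n_samples, n_landmarks * 2))        # flattened 2D landmarks
true_map = rng.normal(size=(n_landmarks * 2, n_vertices * 3))
Y = X @ true_map * 0.01                                  # flattened 3D offsets

net = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000, random_state=0)
net.fit(X, Y)

generic_mesh = rng.normal(size=(n_vertices, 3))          # generic anchor vertices
offsets = net.predict(X[:1]).reshape(n_vertices, 3)      # predicted deformation
deformed_mesh = generic_mesh + offsets
print("deformed mesh shape:", deformed_mesh.shape)
```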
  5. May Z, Alam MK, Husain K, Hasan MK
    PLoS One, 2020;15(8):e0238073.
    PMID: 32845901 DOI: 10.1371/journal.pone.0238073
    Transmission opportunity (TXOP) is a key factor in enabling efficient channel bandwidth utilization over wireless campus networks (WCN) for interactive multimedia (IMM) applications. It facilitates resource allocation for transmitting multiple packets of the same category until the allocated time expires. Static TXOP limits are defined for the various categories of IMM traffic in the IEEE 802.11e standard. Due to the variation of traffic load in WCN, the static TXOP limits are not sufficient to guarantee quality of service (QoS) for IMM traffic flows. To address this issue, several existing works allocate TXOP limits dynamically to ensure QoS for IMM traffic based on the current associated queue size and preset threshold values. However, existing works do not take into account all the medium access control (MAC) overheads while estimating the current queue size, which in turn is required for dynamic TXOP limit allocation. Hence, not considering MAC overheads appropriately results in inaccurate queue size estimation, thereby leading to inappropriate allocation of dynamic TXOP limits. In this article, an enhanced dynamic TXOP (EDTXOP) scheme is proposed that takes into account all the MAC overheads while estimating the current queue size, thereby allocating appropriate dynamic TXOP limits within the preset threshold values. In addition, the article presents an analytical estimation of the EDTXOP scheme to compute the dynamic TXOP limits for the current high-priority traffic queues. Simulations were carried out by varying the traffic load in terms of packet size and packet arrival rate. The results show that the proposed EDTXOP scheme achieves overall performance gains in the range of 4.41%-8.16%, 8.72%-11.15%, 14.43%-32% and 26.21%-50.85% for throughput, PDR, average end-to-end delay and average jitter, respectively, when compared to the existing work, thus offering a better TXOP limit allocation solution.
    Matched MeSH terms: Software
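The general flavour of dynamic TXOP sizing discussed in entry 5 can be sketched as the airtime needed to drain the current queue, counting per-frame MAC overheads and clipping to a preset threshold. All constants below are illustrative assumptions, not the EDTXOP scheme's actual values or equations.

```python
# Generic sketch of dynamic TXOP sizing: time to drain the current queue,
# counting per-frame MAC overheads, clipped to a preset threshold.
# Every constant here is an illustrative assumption, not from the paper.
def txop_limit_us(queue_frames: int, payload_bytes: int,
                  phy_rate_mbps: float = 54.0,
                  per_frame_overhead_us: float = 88.0,   # e.g. SIFS + ACK + headers
                  max_txop_us: float = 6016.0) -> float:
    per_frame_tx_us = payload_bytes * 8 / phy_rate_mbps  # payload airtime (us)
    need = queue_frames * (per_frame_tx_us + per_frame_overhead_us)
    return min(need, max_txop_us)                        # respect preset threshold

print(txop_limit_us(queue_frames=20, payload_bytes=1500))
```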
  6. Faisal, M., Moniruddin, C., Alauddin, A.B.M.C.
    JUMMEC, 2017;20(2):1-7.
    MyJurnal
    Tuberculosis (TB) is a major public health problem worldwide. It is estimated that 2 billion people, a third of the world population, have TB infection but have not developed the disease. Globally, incident cases of TB showed a rising trend, with 6.6 million reported in 1990, 8.3 million in 2000, 9.24 million in 2004, and an estimated 9.27 million incident cases in 2007. The aim of this study was to evaluate the treatment outcome of TB patients in the state of Jigawa, Nigeria. A cross-sectional retrospective study was conducted to evaluate the treatment outcome of directly observed treatment, short course, for tuberculosis (TB DOTS) in facilities in the state between the years 2010 and 2014. The study population comprised all the patients with TB who had access to DOTS therapy. Data were collected from the various local government area tuberculosis control (LGA TB) registers. The LGA TB control registers contained basic information on the patients, and the statistical software SPSS v22.0 was used to analyse the data. A total of 963 TB patients were studied. More than half (57.4%) of the patients were male, and nearly three-fourths (71.2%) of the patients accessed care from urban local government areas in the state. The great majority (96.3%) of the cases had pulmonary tuberculosis (PTB). Among the patients, more than two-fifths (45%) were cured, and a little over one-fifth (20.6%) of them were HIV positive. This study revealed that the treatment success rate (TSR) in the Jigawa State of Nigeria was higher than the overall TSR of Nigeria, and the defaulter rate in this state was lower than the Nigerian average.
    Matched MeSH terms: Software
  7. Jahanirad M, Wahab AW, Anuar NB
    Forensic Sci Int, 2016 May;262:242-75.
    PMID: 27060542 DOI: 10.1016/j.forsciint.2016.03.035
    Camera attribution plays an important role in digital image forensics by providing the evidence and distinguishing characteristics of the origin of a digital image. It allows the forensic analyser to find the possible source camera which captured the image under investigation. However, in real-world applications, these approaches face many challenges due to the large set of multimedia data publicly available through photo sharing and social network sites, captured under uncontrolled conditions and subjected to a variety of hardware and software post-processing operations. Moreover, the legal system only accepts the forensic analysis of digital image evidence if the applied camera attribution techniques are unbiased, reliable, nondestructive and widely accepted by experts in the field. The aim of this paper is to investigate the evolutionary trend of image source camera attribution approaches from fundamentals to practice, in particular with the application of image processing and data mining techniques. Extracting implicit knowledge from images using intrinsic image artifacts for source camera attribution requires a structured image mining process. In this paper, we attempt to provide an introductory tutorial on the image processing pipeline, to determine the general classification of the features corresponding to different components for source camera attribution. The article also reviews techniques of source camera attribution more comprehensively in the domain of image forensics, in conjunction with a classification of ongoing developments within the specified area. The classification of the existing source camera attribution approaches is presented based on specific parameters, such as the colour image processing pipeline, hardware- and software-related artifacts, and the methods to extract such artifacts. The more recent source camera attribution approaches, which have not yet gained sufficient attention among image forensics researchers, are also critically analysed and further categorised into four different classes, namely, optical aberrations based, sensor camera fingerprints based, processing statistics based and processing regularities based. Furthermore, this paper investigates the challenging problems and the proposed strategies of such schemes based on the suggested taxonomy, to plot an evolution of the source camera attribution approaches with respect to the subjective optimisation criteria over the last decade. The optimisation criteria were determined based on the strategies proposed to increase the detection accuracy, robustness and computational efficiency of source camera brand, model or device attribution.
    Matched MeSH terms: Software
  8. Aris, A.Z., Ismail, F.A., Ng, H.Y., Praveena, S.M.
    MyJurnal
    This study was conducted using crab shells as a biosorbent to remove Cu and Cd at different initial concentrations of 1, 5, 10, 15, and 20 mg/L in a biosorption treatment process. Crab shells were selected as biosorbents due to their abundance in the environment and ready availability as waste products from the marketplace. This study aimed to determine the ability of Scylla serrata shells to remove Cu and Cd from an aqueous solution, as well as to compare the removal rates of the two metals. The data were incorporated into the hydrochemical software PHREEQC to investigate the chemical speciation distribution of each heavy metal. The shells of S. serrata were found to have a significant (p < 0.05) ability to remove Cu and Cd following the treatment. After six hours of treatment, the crab shells had removed 60 to 80% of both metals. The highest removal percentage for Cu was up to 94.7% at an initial concentration of 5 mg/L, while 85.1% of Cd was removed from the 1 mg/L initial solution. It can be concluded that the shells of S. serrata can remove Cu and Cd with significant results (p < 0.05).
    Matched MeSH terms: Software
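The removal percentages reported in entry 8 follow the usual biosorption bookkeeping: the fraction of metal taken out of solution relative to its initial concentration. A one-liner sketch, with a final concentration chosen here purely to reproduce the paper's reported 94.7% Cu figure:

```python
# Removal efficiency as used in biosorption studies: percent of metal taken
# out of solution after treatment. The final concentration below is assumed,
# back-calculated from the reported 94.7% Cu removal, not measured data.
def removal_percent(c_initial: float, c_final: float) -> float:
    return 100.0 * (c_initial - c_final) / c_initial

print(removal_percent(5.0, 0.265))   # ~94.7%, matching the best Cu case
```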
  9. Alsultaney, Hazem K., Mohd Khairol Anuar Mohd Ariffin, B.T. Hang Tuah Baharudin, Aidy Ali, Faizal Mustapha
    MyJurnal
    This work was carried out with the aim of optimising the tool path by simulating the removal of material in a finite element environment controlled by a genetic algorithm (GA). To simulate the physical removal of material during machining, a finite element model was designed to represent a thin-walled workpiece. The target was to develop models which mimic the actual cutting process using the finite element method (FEM), to validate the developed tool path strategy algorithm against the actual machining process, and to programme the developed algorithm into the software. The workpiece was modelled using CAD (ABAQUS CAE) software to create a basic geometry and coordinate system, which could then be used to build the finite element model and the settings required by ABAQUS, such as the boundary conditions, the material type, and the element type.
    Matched MeSH terms: Software
  10. Al-Bashiri H, Abdulgabber MA, Romli A, Kahtan H
    PLoS One, 2018;13(10):e0204434.
    PMID: 30286123 DOI: 10.1371/journal.pone.0204434
    This paper describes an approach for improving the accuracy of memory-based collaborative filtering, based on the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS). Recommender systems are used to filter the huge amount of data available online based on user-defined preferences. Collaborative filtering (CF) is a commonly used recommendation approach that generates recommendations based on correlations among user preferences. Although several enhancements have increased the accuracy of memory-based CF through the development of improved similarity measures for finding successful neighbors, there has been less investigation into prediction score methods, in which rating/preference scores are assigned to items that have not yet been selected by a user. A TOPSIS solution for evaluating multiple alternatives based on more than one criterion is proposed as an alternative to prediction score methods for evaluating and ranking items based on the results from similar users. The recommendation accuracy of the proposed TOPSIS technique is evaluated by applying it to various common CF baseline methods, which are then used to analyze the MovieLens 100K and 1M benchmark datasets. The results show that CF based on the TOPSIS method is more accurate than baseline CF methods across a number of common evaluation metrics.
    Matched MeSH terms: Software
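The TOPSIS ranking step described in entry 10 can be sketched in a few lines of numpy: normalize the decision matrix, weight it, locate the ideal and anti-ideal points, and rank items by relative closeness. The ratings and weights below are toy numbers, not the MovieLens experiment.

```python
# TOPSIS ranking of candidate items from neighbour ratings (toy numbers).
# Rows = items, columns = criteria (e.g. ratings from similar users);
# all criteria are treated as benefit criteria here.
import numpy as np

ratings = np.array([[4.0, 5.0, 3.0],
                    [2.0, 3.0, 4.0],
                    [5.0, 4.0, 5.0]])
weights = np.array([0.5, 0.3, 0.2])

norm = ratings / np.linalg.norm(ratings, axis=0)    # vector-normalize each criterion
v = norm * weights                                  # weighted normalized matrix
ideal, anti = v.max(axis=0), v.min(axis=0)          # ideal / anti-ideal points
d_pos = np.linalg.norm(v - ideal, axis=1)           # distance to ideal
d_neg = np.linalg.norm(v - anti, axis=1)            # distance to anti-ideal
closeness = d_neg / (d_pos + d_neg)                 # 1 = best, 0 = worst
print("item ranking (best first):", np.argsort(-closeness))
```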
  11. Abdullah A, Deris S, Mohamad MS, Anwar S
    PLoS One, 2013;8(4):e61258.
    PMID: 23593445 DOI: 10.1371/journal.pone.0061258
    One of the key aspects of computational systems biology is the investigation of the dynamic biological processes within cells. Computational models are often required to elucidate the mechanisms and principles driving the processes because of their nonlinearity and complexity. The models usually incorporate a set of parameters that signify the physical properties of the actual biological systems. In most cases, these parameters are estimated by fitting the model outputs to the corresponding experimental data. However, this is a challenging task because the available experimental data are frequently noisy and incomplete. In this paper, a new hybrid optimization method is proposed to estimate these parameters from the noisy and incomplete experimental data. The proposed method, called Swarm-based Chemical Reaction Optimization, integrates the evolutionary searching strategy employed by Chemical Reaction Optimization into the neighbouring search strategy of the Firefly Algorithm. The effectiveness of the method was evaluated using a simulated nonlinear model and two biological models: synthetic transcriptional oscillators, and extracellular protease production models. The results showed that the accuracy and computational speed of the proposed method were better than those of the existing Differential Evolution, Firefly Algorithm and Chemical Reaction Optimization methods. The reliability of the estimated parameters was statistically validated, which suggests that the model outputs produced by these parameters were valid even when noisy and incomplete experimental data were used. Additionally, the Akaike Information Criterion was employed to evaluate the model selection, which highlighted the capability of the proposed method in choosing a plausible model based on the experimental data. In conclusion, this paper presents the effectiveness of the proposed method for parameter estimation and model selection problems using noisy and incomplete experimental data. It is hoped that this study provides new insight into developing more accurate and reliable biological models based on limited and low-quality experimental data.
    Matched MeSH terms: Software*
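Entry 11 uses the Akaike Information Criterion for model selection. Under a Gaussian least-squares error model, AIC reduces to n ln(RSS/n) + 2k, so a better fit must justify its extra parameters. A sketch with fabricated residuals:

```python
# Akaike Information Criterion for competing model fits, as in the paper's
# model-selection step; the residuals below are fabricated for illustration.
import numpy as np

def aic_least_squares(residuals: np.ndarray, n_params: int) -> float:
    """AIC under a Gaussian error model: n*ln(RSS/n) + 2k."""
    n = residuals.size
    rss = float(np.sum(residuals ** 2))
    return n * np.log(rss / n) + 2 * n_params

rng = np.random.default_rng(1)
res_simple = rng.normal(scale=0.8, size=100)    # worse fit, 3 parameters
res_complex = rng.normal(scale=0.5, size=100)   # better fit, 7 parameters
print("AIC simple :", round(aic_least_squares(res_simple, 3), 1))
print("AIC complex:", round(aic_least_squares(res_complex, 7), 1))
# The lower AIC wins: the fit improvement must justify the extra parameters.
```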
  12. Nima Khoshsirat, Nurul Amziah Md Yunus, Mohd Nizar Hamidon, Suhaidi Shafie, Nowshad Amin
    MyJurnal
    A numerical simulation and analysis was performed to investigate the effect of absorber and buffer layer band gap grading on a Copper-Indium-Gallium-Diselenide (CIGS) solar cell. The software used is the Solar Cell Capacitance Simulator (SCAPS). The effects of the absorber and buffer layer energy band structures on the cell's output parameters, such as open circuit voltage, short circuit current density, fill factor and efficiency, were extensively simulated. Two structures of the energy band gap were simulated and studied for each of the absorber and buffer layers. The simulation was first done on the uniform structure, in which the energy band gap is constant throughout the layer. It was then continued on the cell with a graded band structure, where the energy band gap of the material varies throughout the layer. It was found that the cell with a graded band structure in the absorber and buffer layers demonstrated higher efficiency and better performance in comparison with the cell with a uniform band gap structure.
    Matched MeSH terms: Software
  13. Mohammed Taher Alfates, Biak, Dayang Radiah Awang
    MyJurnal
    Transport of fuel is essential to ensure supplies are delivered as requested by industrial sites or other consumers. Numerous accidents have been reported and recorded in which loss of containment of hazardous chemicals occurred and led to disastrous outcomes. This paper presents an analysis of Boiling Liquid Expanding Vapour Explosion (BLEVE) due to loss of containment from Liquefied Petroleum Gas (LPG) road tankers. The main objective of this paper is to evaluate the potential consequences to people and the surroundings resulting from the overpressure blast and thermal radiation of tankers carrying LPG. A further aim is to compare the outcomes obtained from the PHAST software simulator 8.11 with those of an established mathematical model. Malaysia's North-South Expressway (NSE) was selected as the location of the incident. The volume, weather parameters and properties of LPG were identified. It was found that the effect of BLEVE on people and structures was catastrophic. The results obtained from the mathematical model were similar to those modelled using the PHAST software simulator.
    Matched MeSH terms: Software
  14. Lund LA, Omar Z, Khan I
    Heliyon, 2019 Mar;5(3):e01345.
    PMID: 30949601 DOI: 10.1016/j.heliyon.2019.e01345
    This study investigates numerical solutions of the MHD boundary layer and heat transfer of Williamson fluid flow on an exponentially vertical shrinking sheet with variable thickness and thermal conductivity, under the effects of velocity and thermal slip parameters. It is also assumed that the shrinking/stretching velocity, as well as the wall temperature, has an exponential function form. In this study, the continuity, momentum and energy equations with buoyancy parameter and Hartmann number are incorporated, especially in the Williamson fluid flow case. Similarity transformation variables have been employed to formulate ordinary differential equations (ODEs) from the partial differential equations (PDEs). The resultant ODEs are solved by the shooting method with the fourth-order Runge-Kutta method in the Maple software. The effects of the different applied non-dimensional physical parameters on the boundary layer and heat transfer flow problems are presented in graphs. The effects of the Williamson parameter, Prandtl number, and slip parameters on velocity and temperature profiles have been thoroughly demonstrated and discussed. The numerical results show that the buoyancy force and the slip parameters contribute to the occurrence of dual solutions on the boundary layer and heat transfer flow problems. Furthermore, the stability analysis suggests that the first solution is stable and physically possible.
    Matched MeSH terms: Software
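The numerical scheme cited in entry 14, a shooting method driven by a Runge-Kutta integrator, can be illustrated on a toy boundary-value problem (y'' = y, y(0) = 0, y(1) = 1); the paper's Williamson fluid ODEs are far more involved, so this only shows the mechanics of the method.

```python
# Shooting method with a Runge-Kutta integrator on a toy boundary-value
# problem y'' = y, y(0) = 0, y(1) = 1. This illustrates the numerical scheme
# the paper cites, not its actual similarity-transformed equations.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

def rhs(x, state):
    y, dy = state
    return [dy, y]             # y'' = y

def boundary_miss(slope):
    """Integrate from x = 0 with a guessed slope; return the error at x = 1."""
    sol = solve_ivp(rhs, (0.0, 1.0), [0.0, slope], method="RK45")
    return sol.y[0, -1] - 1.0  # want y(1) = 1

slope = brentq(boundary_miss, 0.1, 5.0)  # slope that satisfies the far boundary
print("required initial slope y'(0) =", round(slope, 6))
# exact answer: 1/sinh(1) ~ 0.850918
```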
  15. Najiy Rizal Suriani Rizal, Azuddin Mamat, Aidah Jumahat
    MyJurnal
    In recent years, the injection moulding process has become one of the most advanced and efficient manufacturing processes for the mass production of plastic bottles. However, a good-quality parison is difficult to achieve due to uncontrollable humidity, inlet pressure and water inlet velocity. This paper investigates the effect of using multiple mould cavities to improve the fill time and injection pressure in the production of PET plastic bottles using MoldFlow software. The parison was modelled using CATIA, taking every part of the parison into consideration. MoldFlow software was used to analyse the flow of a 20 g parison with different cavity numbers (1, 8, 16 and 24 cavities), as well as the corresponding runner sizes, with respect to fill time and injection pressure. Other important parameters that affect the production of the parison, such as melting temperature, mould temperature, atmospheric temperature and cooling time, were kept constant. The fill time required to fill 24 cavities was improved by 60% compared to using only 8 cavities, enabling the production of more plastic bottles per day. Therefore, fill time and injection pressure are two important parameters to be considered in the injection moulding process, especially to reduce parison defects and increase the production rate.
    Matched MeSH terms: Software
  16. Vasudevan R, Ismail P, Jaafar N, Mohamad N, Etemad E, Wan Aliaa W, et al.
    Balkan J. Med. Genet., 2014 Jun;17(1):37-40.
    PMID: 25741213 DOI: 10.2478/bjmg-2014-0023
    The aim of this study was to determine the association of the c.894G>T (p.Glu298Asp) and variable number tandem repeat (VNTR) polymorphisms of the endothelial nitric oxide synthase (eNOS) gene, and the c.181C>T polymorphism of the bradykinin type 2 receptor (B2R) gene, in Malaysian end-stage renal disease (ESRD) subjects. A total of 150 ESRD patients were recruited from the National Kidney Foundation's (NKF) dialysis centers in Malaysia and compared with 150 normal healthy individuals. Genomic DNA was extracted from buccal cells of all the subjects. The polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP) method was carried out to amplify the products, and the restricted fragments were separated by agarose gel electrophoresis. Statistical analyses were carried out using statistical software, with p < 0.05 considered statistically significant. The genotypic and allelic frequencies of the B2R gene (c.181C>T) and eNOS gene (c.894G>T, 4b/a) polymorphisms did not differ significantly (p > 0.05) from those of the control subjects. The B2R and eNOS gene polymorphisms may therefore not be considered genetic susceptibility markers for Malaysian ESRD subjects.
    Matched MeSH terms: Software
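The genotype-frequency comparison in entry 16 is the kind of analysis a chi-squared test of a contingency table performs. A sketch with made-up genotype counts (the paper's actual counts are not given in the abstract):

```python
# Chi-squared test of genotype counts between cases and controls, the kind of
# comparison reported in the paper; the counts below are made up.
from scipy.stats import chi2_contingency

#            CC   CT   TT   (hypothetical c.181C>T genotype counts)
cases    = [ 70,  60,  20]
controls = [ 75,  58,  17]

chi2, p, dof, _ = chi2_contingency([cases, controls])
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
# p > 0.05 would match the paper's finding of no significant association.
```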
  17. Hossain MI, Faruque MR, Islam MT
    Prog Biophys Mol Biol, 2015 Nov;119(2):103-10.
    PMID: 25863147 DOI: 10.1016/j.pbiomolbio.2015.03.008
    The aim of this paper is to investigate the effects of the distance between the human head and an internal cellular device antenna on the specific absorption rate (SAR). This paper also analyzes the effects of the inclination angle between the user's head and the mobile terminal antenna on SAR values. The effects of the metal-glass casing of the mobile phone on SAR values were observed in the vicinity of the human head model. Moreover, the return losses were investigated in all cases to assess antenna performance. This analysis was performed by adopting the finite-difference time-domain (FDTD) method in Computer Simulation Technology (CST) Microwave Studio. The results indicate that increasing the distance between the user's head and the antenna decreases SAR values, but increasing the inclination angle does not reduce SAR values in all cases. Additionally, this investigation provides some useful indications for the future design of low-SAR mobile terminal antennas.
    Matched MeSH terms: Software
  18. Ahmed AA, Xue Li C
    J Forensic Sci, 2018 Jan;63(1):112-121.
    PMID: 28397244 DOI: 10.1111/1556-4029.13506
    Cloud storage services allow users to store their data online, so that they can remotely access, maintain, manage, and back up data from anywhere via the Internet. Although helpful, this storage creates a challenge for digital forensic investigators and practitioners in collecting, identifying, acquiring, and preserving evidential data. This study proposes an investigation scheme for analyzing data remnants and determining probative artifacts in a cloud environment. Using pCloud as a case study, this research collected the data remnants available on end-user device storage following the storing, uploading, and accessing of data in the cloud storage. Data remnants were collected from several sources, including client software files, directory listings, prefetch, registry, network PCAP, browser, and memory and link files. Results demonstrate that the collected remnant data are beneficial in determining a sufficient number of artifacts about the investigated cybercrime.
    Matched MeSH terms: Software
  19. Mustafa NS, Akhmal NH, Izman S, Ab Talib MH, Shaiful AIM, Omar MNB, et al.
    Polymers (Basel), 2021 May 14;13(10).
    PMID: 34069101 DOI: 10.3390/polym13101584
    The design of a bone tissue engineering scaffold plays an important role in ensuring cell viability and cell growth. Therefore, it is necessary to produce an ideal scaffold by predicting and simulating its properties. Hence, computational methods should be adopted, since they have huge potential to be used in the implementation of bone tissue engineering scaffolds. To explore the use of computational methods in the area of bone tissue engineering, this paper provides an overview of their usage in designing a unit cell of a bone tissue engineering scaffold. To design a unit cell of the scaffold, we discuss two categories of unit cells that can be used to design a feasible scaffold: non-parametric and parametric designs. These designs are described and categorised into multiple types according to their characteristics, such as circular structures and Triply Periodic Minimal Surface (TPMS) structures, and their advantages and disadvantages are discussed. Moreover, this paper also presents some software that is used in simulating and designing bone tissue scaffolds. Challenges and recommendations for future work have also been included.
    Matched MeSH terms: Software
  20. Lim H, Mat Jafri M, Abdullah K, Sultan Alsultan
    Sains Malaysiana, 2012;41:841-846.
    This study was conducted to retrieve the land surface temperature (LST) from Landsat ETM+ data for Al Qassim, Saudi Arabia. The proposed technique employed a mono-window LST algorithm for retrieving surface temperature from Landsat ETM+. The land surface emissivity and solar angle values were needed as inputs to the proposed algorithm. The surface emissivity values were computed based on the NDVI values. The LST values derived from ATCOR2_T in the PCI Geomatica image processing software were used for algorithm calibration. The results showed a high correlation coefficient (R) and a low root-mean-square error (RMS) between the LST values retrieved from the proposed algorithm and ATCOR2_T. This study indicated that the proposed algorithm is capable of retrieving accurate LST values, and the derived information can be used in environmental impact assessment for the Al Qassim area.
    Matched MeSH terms: Software
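Entry 20 computes emissivity from NDVI before feeding it to a mono-window LST algorithm. One common NDVI-based emissivity relation (Van de Griend & Owe, 1993) is sketched below; the paper's exact formulation and coefficients may differ.

```python
# NDVI and an NDVI-based emissivity estimate of the kind the study feeds into
# its mono-window LST algorithm. The log relation (Van de Griend & Owe, 1993)
# is one common choice; the paper's exact coefficients may differ.
import numpy as np

red = np.array([0.10, 0.12, 0.08])   # hypothetical ETM+ band 3 reflectance
nir = np.array([0.40, 0.35, 0.45])   # hypothetical ETM+ band 4 reflectance

ndvi = (nir - red) / (nir + red)
emissivity = 1.0094 + 0.047 * np.log(ndvi)
print("NDVI:      ", np.round(ndvi, 3))
print("emissivity:", np.round(emissivity, 3))
```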