Displaying publications 41 - 60 of 754 in total

  1. Dzulkarnain AAA, Rahmat S, Ismail AW, Musa R, Badzis M, Tengku Zam Zam TZH
    Med J Malaysia, 2019 04;74(2):168-173.
    PMID: 31079129
    INTRODUCTION: This paper describes the development and evaluation of a new two-dimensional (2D) computer-based (CB) Simulated Learning Environment (SLE) software for routine audiology tests that comes with learning assistance for audiology students. The aim of the study was to conduct a preliminary evaluation of the effectiveness of the new 2D CB SLE audiology software among audiology students.

    MATERIALS AND METHODS: The development process of the new 2D CB SLE included (i) the identification of common errors made by students in the audiology clinic, (ii) the development of five case simulations covering four routine audiology tests and incorporating learning assistance derived from the errors commonly made by audiology students, and (iii) the development of the 2D CB SLE from a technical perspective. A preliminary evaluation of the use of the 2D CB SLE software was conducted among twenty-six second-year undergraduate audiology students.

    RESULTS: The pre-analysis evaluation of the new 2D CB SLE showed that the majority of the students perceived the new 2D CB SLE software as realistic and helpful for them in achieving the course learning outcomes and in improving their clinical skills. The mean overall scores among the twenty-six students using the self-reported questionnaire were significantly higher when using the 2D CB SLE software than with the existing software typically used in their SLE training.

    CONCLUSIONS: This new 2D CB SLE software has the potential for use by audiology students for enhancing their learning.

    Matched MeSH terms: Software
  2. Arifin MH, Kayode JS, Ismail MKI, Abdullah AM, Embrandiri A, Nazer NSM, et al.
    MethodsX, 2021;8:101182.
    PMID: 33365262 DOI: 10.1016/j.mex.2020.101182
    A novel methodological approach was developed to quantify the volume of an industrial waste disposal (IWD) site, combined with municipal waste materials (MWM), through the integration of a non-invasive, fast, and less expensive RES2-D Electrical Resistivity Technique (ERT), using the Wenner-Schlumberger electrode array geophysical method with Oasis Montaj software. Underground water-bearing structures and the ecosystem are being contaminated through seepage of the plumes emanating from the mixtures of industrial waste materials (IWM), made of moist cemented soil with municipal solid wastes (MSW), dumped at the site. The distribution of the contaminant hazardous plumes emanating from the waste materials' mixtures within the subsurface structural lithological layers was clearly mapped and delineated within the near-surface structures, using the triplicate technique to collect samples of the soil with the waste mixtures, and water analysis for the presence of dissolved ions. The deployed method helped to monitor the seepage of the contaminant leachate plumes to the groundwater aquifer units via the ground surface, through the subsurface stratum lithological layers; hence, the volume of the waste materials was approximated at 312,000 m3. In summary, the novel method adopted is as presented below: •The novel method is transferable, reproducible, and most importantly, an unambiguous technique for the quantification of environmental, industrial and municipal waste materials. •It helps to map the distribution of the plumes emanating from the waste materials' mixtures within the subsurface structural lithological layers, which was clearly delineated within the near-surface structures underlying the study site. •The procedure helped in the monitoring of leachate contaminant plume seepage into the surface water bodies and the groundwater aquifer units, via the ground surface, through the porous subsurface stratum lithological layers.
    Matched MeSH terms: Software
  3. Mohamed Moubark A, Ali SH
    ScientificWorldJournal, 2014;2014:107831.
    PMID: 25197687 DOI: 10.1155/2014/107831
    This paper presents a new practical QPSK receiver that uses digitized samples of the incoming QPSK analog signal to determine the phase of the QPSK symbol. The proposed technique is more robust to phase noise and consumes up to 89.6% less power for signal detection in the demodulation operation. In contrast, the conventional QPSK demodulation process, which uses a coherent detection technique, requires the exact incoming signal frequency; thus, any variation in the frequency of the local oscillator or incoming signal will cause phase noise. A software simulation of the proposed design was successfully carried out using the MATLAB Simulink software platform. In the conventional system, at least 10 dB signal-to-noise ratio (SNR) is required to achieve a bit error rate (BER) of 10^-6, whereas, in the proposed technique, the same BER value can be achieved with only 5 dB SNR. Since some of the power-consuming elements such as the voltage-controlled oscillator (VCO), mixer, and low-pass filter (LPF) are no longer needed, the proposed QPSK demodulator will consume almost 68.8% to 99.6% less operational power compared to a conventional QPSK demodulator. (A minimal illustrative BER-versus-SNR sketch follows this entry.)
    Matched MeSH terms: Software*
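    The following is a minimal, illustrative Python/NumPy sketch of estimating QPSK bit error rate over an additive white Gaussian noise channel, only to make the BER-versus-SNR comparison above concrete. It is not the paper's receiver: the authors' simulation was done in MATLAB Simulink, and their digitised-sample phase detector is not reproduced here.

```python
# Minimal QPSK bit-error-rate estimate over an AWGN channel (NumPy).
# Illustrative sketch only; snr_db is treated as Es/N0 for the unit-energy
# Gray-mapped constellation below.
import numpy as np

def qpsk_ber(snr_db, n_bits=1_000_000, rng=np.random.default_rng(0)):
    bits = rng.integers(0, 2, n_bits).reshape(-1, 2)
    # Gray-mapped QPSK: one bit on the in-phase and one on the quadrature arm.
    symbols = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
    snr = 10 ** (snr_db / 10)
    noise_std = np.sqrt(1 / (2 * snr))          # per-dimension noise std
    noisy = symbols + noise_std * (rng.standard_normal(symbols.shape)
                                   + 1j * rng.standard_normal(symbols.shape))
    decided = np.column_stack((noisy.real > 0, noisy.imag > 0)).astype(int)
    return np.mean(decided != bits)

for snr_db in (0, 5, 10):
    print(f"{snr_db} dB -> BER = {qpsk_ber(snr_db):.2e}")
```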
  4. Dzulkarnain AA, Wan Mhd Pandi WM, Wilson WJ, Bradley AP, Sapian F
    Int J Audiol, 2014 Aug;53(8):514-21.
    PMID: 24702636 DOI: 10.3109/14992027.2014.897763
    To determine if a computer simulation can be used to improve the ability of audiology students to analyse ABR waveforms.
    Matched MeSH terms: Software
  5. Wong, Lai Hong, Balkis Bashuri, Atiah Ayunni Abdul Ghani, Khairul Osman, Nor Atika Md Ashar
    MyJurnal
    Identification of an unknown suspect through bite marks has always been challenging. Narrowing the list of suspects through sex and race markers is always recommended but rarely utilised due to limited publication in this area. Thus, this preliminary research aimed to study the differences in bite marks made on dental wax between sexes and races. A sample size of 40 UKM undergraduates comprising Malay (male = 10, female = 10) and Chinese (male = 10, female = 10) subjects was used in this study. The bite mark of each subject was obtained on dental wax, digitally scanned and analysed using Image-J software. Parameters measured were anterior teeth size, intercanine width and anterior teeth relative rotation. Results indicated that mandibular left canine tooth size showed significant sexual dimorphism (p < 0.05) in differentiating sex. The means for males and females were 4.63 ± 1.05 mm and 5.35 ± 0.87 mm respectively. In addition, the tooth sizes of the maxillary left canine and mandibular left lateral incisor were significantly different (p < 0.05) between races. The means of the mandibular left canine for Malay and Chinese subjects were 5.27 ± 1.01 mm and 4.50 ± 1.22 mm respectively. Furthermore, the mandibular left lateral incisor had means of 5.15 ± 0.87 mm and 4.60 ± 0.74 mm for Malay and Chinese subjects respectively. Unfortunately, there were no significant differences in intercanine width or anterior teeth relative rotation between the two major races in Malaysia. In conclusion, this research has demonstrated the possibility of using the tooth sizes of the mandibular left canine, maxillary left canine and mandibular left lateral incisor to discriminate sex and race. (A minimal illustrative significance-testing sketch follows this entry.)
    Matched MeSH terms: Software
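    As a purely illustrative sketch of the kind of significance testing reported above, the following Python/SciPy snippet runs Welch's two-sample t-test on hypothetical tooth-size measurements. The numbers are synthetic placeholders, not the study's Image-J data.

```python
# Illustrative two-sample t-test on hypothetical tooth-size measurements (mm).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
male = rng.normal(loc=4.6, scale=1.0, size=20)    # hypothetical male canine sizes
female = rng.normal(loc=5.3, scale=0.9, size=20)  # hypothetical female canine sizes

t_stat, p_value = stats.ttest_ind(male, female, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Significant difference at the 5% level")
```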
  6. Alias Mahmud, Nor Hayati Alwi, Tajularipin Sulaiman
    MyJurnal
    Objective: The study aimed to obtain the perspective and teaching practice of novice lecturers serving at the training institutions, Ministry of Health Malaysia (MOH).

    Method: A qualitative study was conducted on four novice lecturers at the Medical Assistant College, Seremban. Data were obtained from interviews and observations of their teaching in the lecture rooms. The data analysis was performed using NVivo 9 software.

    Result: In the aspect of the teaching perspective, the findings showed two main themes: the teaching concept and the teaching method. As far as the teaching concept is concerned, respondents perceived that lecturers were the source of knowledge and the ones who transferred knowledge to the students. The second perspective related to the teaching approach, in which lecturers need to draw on their experience and to be knowledgeable and creative in their teaching. The integration of these themes formed the main perspective, which was lecturer-centred teaching. In turn, the teaching practice was consistent with their perspective, in that the teaching approach was lecturer-centred.

    Conclusion: This study showed that new lecturers tend to employ a lecturer-centred approach. Apart from that, they also lacked skills in class control and value inculcation. The deficiency in both these aspects needs to be overcome, as it can affect the effectiveness of the teaching as well as the quality of the graduates produced.
    Matched MeSH terms: Software
  7. Shuib, A., Alwadood, Z.
    MyJurnal
    This paper presents a mathematical approach to solving railway rescheduling problems. The approach assumes that the trains are able to resume their journey after a given time frame of disruption, whereby the train that experiences the disruption and the trains affected by the incident are rescheduled. The approach employs a mathematical model to prioritise certain types of trains according to the railway operator's requirements. A pre-emptive goal programming model was adapted to find an optimal solution that satisfies the operational constraints and the company's stated goals. Initially, the model minimises the total service delay of all trains while adhering to the minimum headway requirement and track capacity. Subsequently, it maximises train service reliability by considering only the trains with a delay time window of five minutes or less. The model was implemented in MATLAB R2014a, which automatically generates the optimal solution of the problem based on the input matrix of constraints. An experiment with three incident scenarios on a double-track railway of a local network was conducted to evaluate the performance of the proposed model. The new provisional timetable was produced in a short computing time and the model was able to prioritise the desired train schedules. (A minimal illustrative delay-minimisation sketch follows this entry.)
    Matched MeSH terms: Software
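    The following SciPy sketch is a toy delay-minimisation linear programme in the spirit of the first goal described above (minimising total service delay subject to a minimum headway). It is an illustration with two hypothetical trains under assumed values, not the paper's pre-emptive goal programming model, which was implemented in MATLAB R2014a.

```python
# Toy delay-minimisation LP for two hypothetical trains after a disruption.
from scipy.optimize import linprog

headway = 3                      # minimum separation between trains (min), assumed
sched = [0, 4]                   # scheduled departure times (min), assumed
earliest_after_disruption = 10   # train 1 cannot depart before t = 10 min, assumed

# Decision variables: d1, d2 = delays added to each train's departure.
c = [1, 1]                       # objective: minimise total delay d1 + d2
# Headway: (sched[1] + d2) - (sched[0] + d1) >= headway  ->  d1 - d2 <= 1
A_ub = [[1, -1]]
b_ub = [sched[1] - sched[0] - headway]
bounds = [(earliest_after_disruption - sched[0], None), (0, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("delays (min):", res.x, "| total delay:", res.fun)
```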
  8. Almogahed A, Mahdin H, Omar M, Zakaria NH, Gu YH, Al-Masni MA, et al.
    PLoS One, 2023;18(11):e0293742.
    PMID: 37917752 DOI: 10.1371/journal.pone.0293742
    Refactoring, a widely adopted technique, has proven effective in facilitating and reducing maintenance activities and costs. Nonetheless, the effects of applying refactoring techniques on software quality exhibit inconsistencies and contradictions, leading to conflicting evidence on their overall benefit. Consequently, software developers face challenges in leveraging these techniques to improve software quality. Moreover, the absence of a categorization model hampers developers' ability to decide the most suitable refactoring techniques for improving software quality, considering specific design goals. Thus, this study aims to propose a novel refactoring categorization model that categorizes techniques based on their measurable impacts on internal quality attributes. Initially, the most common refactoring techniques used by software practitioners were identified. Subsequently, an experimental study was conducted using five case studies to measure the impacts of refactoring techniques on internal quality attributes. A subsequent multi-case analysis was conducted to analyze these effects across the case studies. The proposed model was developed based on the experimental study results and the subsequent multi-case analysis. The model categorizes refactoring techniques into green, yellow, and red categories. The proposed model, by acting as a guideline, assists developers in understanding the effects of each refactoring technique on quality attributes, allowing them to select appropriate techniques to improve specific quality attributes. Compared to existing studies, the proposed model emerges superior by offering a more granular categorization (green, yellow, and red categories), and its range is wide (including ten refactoring techniques and eleven internal quality attributes). Such granularity not only equips developers with an in-depth understanding of each technique's impact but also fosters informed decision-making. In addition, the proposed model outperforms current studies and offers a more nuanced understanding, explicitly highlighting areas of strength and concern for each refactoring technique. This enhancement aids developers in better grasping the implications of each refactoring technique on quality attributes. As a result, the model simplifies the decision-making process for developers, saving time and effort that would otherwise be spent weighing the benefits and drawbacks of various refactoring techniques. Furthermore, it has the potential to help reduce maintenance activities and associated costs.
    Matched MeSH terms: Software*
  9. Tan CS, Ting WS, Mohamad MS, Chan WH, Deris S, Shah ZA
    Biomed Res Int, 2014;2014:213656.
    PMID: 25250315 DOI: 10.1155/2014/213656
    When gene expression data are too large to be processed, they are transformed into a reduced representation set of genes. Transforming large-scale gene expression data into a set of genes is called feature extraction. If the genes are carefully chosen, this gene set retains the relevant information from the large-scale gene expression data, allowing further analysis to use this reduced representation instead of the full-size data. In this paper, we review numerous software applications that can be used for feature extraction. The software reviewed is mainly for Principal Component Analysis (PCA), Independent Component Analysis (ICA), Partial Least Squares (PLS), and Local Linear Embedding (LLE). A summary and sources of the software are provided in the last section for each feature extraction method. (A minimal illustrative feature-extraction sketch follows this entry.)
    Matched MeSH terms: Software*; Software Design
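    As a small illustration of the PCA-style feature extraction surveyed above, the following sketch reduces a synthetic gene-expression matrix with scikit-learn. scikit-learn is simply one convenient implementation and is not necessarily among the packages covered by the review.

```python
# PCA-based feature extraction on a synthetic gene-expression matrix
# (rows = samples, columns = genes).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2000))        # 50 samples, 2000 genes (synthetic)

pca = PCA(n_components=10)             # keep 10 extracted features
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)                            # (50, 10)
print(pca.explained_variance_ratio_.round(3))     # variance captured per component
```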
  10. Mohamad Arif J, Ab Razak MF, Awang S, Tuan Mat SR, Ismail NSN, Firdaus A
    PLoS One, 2021;16(9):e0257968.
    PMID: 34591930 DOI: 10.1371/journal.pone.0257968
    The evolution of malware is causing mobile devices to crash with increasing frequency. Therefore, adequate security evaluations that detect Android malware are crucial. Two techniques can be used in this regard: static analysis, which meticulously examines the full code of applications, and dynamic analysis, which monitors malware behaviour. While both perform security evaluations successfully, there is still room for improvement. The goal of this research is to examine the effectiveness of static analysis in detecting Android malware by using permission-based features. This study applied machine learning with different sets of classifiers to evaluate Android malware detection. A feature selection method was applied to determine which features were most capable of distinguishing malware. A total of 5,000 Drebin malware samples and 5,000 AndroZoo benign samples were utilised. The performances of the different sets of classifiers were then compared. The results indicated that, with a TPR value of 91.6%, the Random Forest algorithm achieved the highest level of accuracy in malware detection. (A minimal illustrative classification sketch follows this entry.)
    Matched MeSH terms: Software*
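    The following scikit-learn sketch illustrates permission-based Random Forest classification and the TPR metric mentioned above. The permission vectors and labels are synthetic, and the study's feature-selection step and Drebin/AndroZoo datasets are not reproduced.

```python
# Permission-based malware classification with a Random Forest (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
n_apps, n_permissions = 2000, 100
X = rng.integers(0, 2, size=(n_apps, n_permissions))  # requested-permission flags
y = rng.integers(0, 2, size=n_apps)                   # 1 = malware, 0 = benign (synthetic)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print("True positive rate (TPR):", tp / (tp + fn))
```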
  11. Mohajeri L, Aziz HA, Isa MH, Zahed MA
    Bioresour Technol, 2010 Feb;101(3):893-900.
    PMID: 19773160 DOI: 10.1016/j.biortech.2009.09.013
    This work studied the bioremediation of weathered crude oil (WCO) in coastal sediment samples using a central composite face-centred design (CCFD) under response surface methodology (RSM). Initial oil concentration, biomass, nitrogen and phosphorus concentrations were used as independent variables (factors) and oil removal as the dependent variable (response) in a 60-day trial. A statistically significant model for WCO removal was obtained. The coefficient of determination (R^2 = 0.9732) and probability value (P < 0.0001) demonstrated the significance of the regression model. Numerical optimization based on a desirability function was carried out for initial oil concentrations of 2, 16 and 30 g per kg sediment, and removals of 83.13, 78.06 and 69.92 per cent were observed respectively, compared to 77.13, 74.17 and 69.87 per cent removal for the un-optimized results. (A minimal illustrative response-surface sketch follows this entry.)
    Matched MeSH terms: Software
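    The following sketch fits a second-order (quadratic) response surface by least squares, which is the core of RSM, on hypothetical two-factor data. The study's actual four-factor central composite face-centred design and desirability-based optimisation are not reproduced here.

```python
# Fitting a quadratic response surface to hypothetical removal data.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
oil = rng.uniform(2, 30, 30)           # initial oil concentration (g/kg), hypothetical
nitrogen = rng.uniform(0.1, 2.0, 30)   # nitrogen dose, hypothetical units
removal = 80 - 0.5 * oil + 5 * nitrogen - 1.2 * nitrogen**2 + rng.normal(0, 1, 30)

X = np.column_stack([oil, nitrogen])
X_quad = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)
model = LinearRegression().fit(X_quad, removal)

print("R^2:", round(model.score(X_quad, removal), 3))
print("coefficients:", model.coef_.round(3))
```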
  12. Ng, C. S., Leman, A. M., N. Asmuin
    MyJurnal
    Local exhaust ventilation (LEV) is used in industries to capture contaminants such as gases, dusts, mists, vapours or fumes at workstations to protect occupants from exposure to contaminants. LEV is allocated and installed by employers; however, it does not always work as intended. LEV design is often overlooked and underappreciated. The effectiveness of an LEV system can be achieved if more attention is focused on the proper design of the LEV system. To address this issue, computational fluid dynamics (CFD) can be used. CFD is a software tool to predict and simulate fluid dynamic phenomena. CFD is used to forecast or reconstruct the behaviour of an engineering product under assumed or measured boundary conditions. However, CFD is just a prediction tool, which can lead to inaccuracy in predicting airflow due to problems with pre-processing, the solver and post-processing relative to parameters from actual experimental results. Therefore, validation is needed to help minimise the percentage error of CFD methods. In this research, measurements of the airflow parameters of the LEV system at the National Institute of Occupational Safety and Health (NIOSH) Bangi, Selangor were conducted. The control speed panel at NIOSH Bangi, which is used to increase or decrease the fan speed, was set to control speeds of 20%, 40%, 60% and 80%. Upon validation, the average absolute error obtained from the four different control speeds ranged from 3.372% to 4.862%. The validity of the CFD modelling is acceptable, as the error is less than 5%, and good agreement was achieved between the actual experimental results and the CFD simulation results. Therefore, it can be concluded that the CFD software tool can be used to simulate air velocity in an LEV system. CFD methods can save labour costs and time when used during the earliest stage of LEV design, before actual construction is implemented. The outcome of this paper can be used as a baseline for factories equipped with LEV systems to protect occupants from exposure to contaminants. (A minimal illustrative validation-error sketch follows this entry.)
    Matched MeSH terms: Software
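    The validation metric described above can be illustrated with a short NumPy sketch that computes the average absolute percentage error between measured and CFD-simulated air velocities; the velocity values used here are hypothetical placeholders.

```python
# Average absolute percentage error between measured and simulated velocities.
import numpy as np

measured = np.array([6.2, 8.1, 10.4, 12.0])    # m/s, hypothetical LEV measurements
simulated = np.array([6.0, 8.5, 10.1, 12.5])   # m/s, hypothetical CFD results

abs_pct_error = np.abs(simulated - measured) / measured * 100
print("per-point error (%):", abs_pct_error.round(2))
print("average absolute error (%):", abs_pct_error.mean().round(2))
```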
  13. Kathiravan, Yamunah, Mohd Fahmi Mohamad Amran, Noor Afiza Mat Razali, Mohd Afizi Mohd Shukran, Norshahriah Abdul Wahab, Mohammad Adib Khairuddin, et al.
    MyJurnal
    Privacy has always been a constant concern for many people. Internet users are often worried about the browsing information that is left on their storage media. Web browsers were later introduced with a new feature called private browsing to overcome this issue. The private browsing mode is expected to behave as a normal browsing session but without storing any data such as browser cookies, history, cache and passwords on the local machine. Unfortunately, previous researchers concluded that web browsers often failed to provide the intended privacy protection to their users. Through this review process, the weaknesses and shortcomings of previous web browser vendors have been identified.
    Matched MeSH terms: Software
  14. Khalid Z, Fisal N, Rozaini M
    Sensors (Basel), 2014;14(12):24046-97.
    PMID: 25615737 DOI: 10.3390/s141224046
    Wireless Sensor Networks (WSNs) are leading to a new paradigm of the Internet of Everything (IoE). WSNs have a wide range of applications but are usually deployed for a particular application. However, the future of WSNs lies in the aggregation and allocation of resources, serving diverse applications. WSN virtualization by middleware is an emerging concept that enables the aggregation of multiple independent heterogeneous devices, networks, radios and software platforms, and enhances application development. WSN virtualization middleware can further be categorized into sensor virtualization and network virtualization. Middleware for WSN virtualization poses several challenges, such as the efficient decoupling of networks, devices and software. In this paper, efforts have been put forward to present an overview of previous and current middleware designs for WSN virtualization, their design goals, software architectures, abstracted services, testbeds and programming techniques. Furthermore, the paper also presents a proposed model, challenges and future opportunities for further research in middleware designs for WSN virtualization.
    Matched MeSH terms: Software
  15. Elhag AA, Mohamad R, Aziz MW, Zeshan F
    PLoS One, 2015;10(4):e0123086.
    PMID: 25928358 DOI: 10.1371/journal.pone.0123086
    The composite service design modeling is an essential process of the service-oriented software development life cycle, where the candidate services, composite services, operations and their dependencies are required to be identified and specified before their design. However, a systematic service-oriented design modeling method for composite services is still in its infancy, as most of the existing approaches provide the modeling of atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling the concept of service-oriented design to increase the reusability and decrease the complexity of the system while keeping service composition considerations in mind. Furthermore, the ComSDM method provides a mathematical representation of the components of service-oriented design using graph-based theory to facilitate design quality measurement. To demonstrate that the ComSDM method is also suitable for composite service design modeling of distributed embedded real-time systems along with enterprise software development, it is implemented in the case study of a smart home. The results of the case study not only check the applicability of ComSDM, but can also be used to validate the complexity and reusability of ComSDM. This also guides future research towards design quality measurement, such as using the ComSDM method to measure the quality of composite service design in a service-oriented software system. (A minimal illustrative graph-representation sketch follows this entry.)
    Matched MeSH terms: Software*
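    As a loose illustration of representing a composite-service design as a graph, the following networkx sketch models hypothetical services as nodes and dependencies as directed edges, then reports a crude coupling indicator. The service names and the measure are assumptions for illustration only; they are not ComSDM's notation or metrics.

```python
# Composite-service design as a directed dependency graph (hypothetical names).
import networkx as nx

design = nx.DiGraph()
# Edges point from a composite service to the services it depends on.
design.add_edges_from([
    ("SmartHomeController", "LightingService"),
    ("SmartHomeController", "ClimateService"),
    ("ClimateService", "TemperatureSensorService"),
    ("LightingService", "PresenceSensorService"),
])

for service in design.nodes:
    # Out-degree used here as a crude per-service coupling indicator.
    print(service, "depends on", design.out_degree(service), "service(s)")
```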
  16. Modi R, Kohli S, Rajeshwari K, Bhatia S
    Eur J Dent, 2015 6 4;9(2):255-261.
    PMID: 26038660 DOI: 10.4103/1305-7456.156847
    OBJECTIVE: The aim of the study is to evaluate the stress distribution in tooth supported 5-unit fixed partial denture (FPD) having tooth as pier abutment using rigid and nonrigid connectors respectively, under simultaneous and progressive loading.

    MATERIAL AND METHODS: A three-dimensional (3D) finite element program (ANSYS software) was used to construct the mathematical model. Two 5-unit FPDs were simulated, one with a rigid connector and the other with a nonrigid connector. For analysis, each of these models was subjected to axial and oblique forces under progressive loading (180, 180, 120, 120 and 80 N force on the first and second molars, premolars and canine respectively) and simultaneous loading (100, 100, 100, 100 and 100 N force on the first and second molars, premolars and canine respectively).

    RESULTS: The rigid and nonrigid connector design have effect on stress distribution in 5-unit FPDs with pier abutments.

    CONCLUSION: Oblique forces produce more stresses than vertical forces. Nonrigid connector resulted in decrease in stress at the level of prosthesis and increase in stress at the level of alveolar crest.

    Matched MeSH terms: Software
  17. Marufuzzaman M, Reaz MB, Ali MA, Rahman LF
    Methods Inf Med, 2015;54(3):262-70.
    PMID: 25604028 DOI: 10.3414/ME14-01-0061
    OBJECTIVES: The goal of smart homes is to create an intelligent environment that adapts to the inhabitants' needs and assists persons who need special care and safety in their daily lives. This can be achieved by collecting ADL (activities of daily living) data and performing further analysis within existing computing elements. In this research, a recent algorithm named sequence prediction via enhanced episode discovery (SPEED) is modified, and a time component is included in order to improve accuracy.

    METHODS: The modified SPEED, or M-SPEED, is a sequence prediction algorithm that modifies the previous SPEED algorithm by using the time duration of an appliance's ON-OFF states to decide the next state. M-SPEED discovered periodic episodes of inhabitant behaviour, was trained with the learned episodes, and made decisions based on the obtained knowledge. (A minimal illustrative sequence-prediction sketch follows this entry.)

    RESULTS: The results showed that M-SPEED achieves 96.8% prediction accuracy, which is better than other time prediction algorithms like PUBS, ALZ with temporal rules and the previous SPEED.

    CONCLUSIONS: Since human behavior shows natural temporal patterns, duration times can be used to predict future events more accurately. This inhabitant activity prediction system will certainly improve the smart homes by ensuring safety and better care for elderly and handicapped people.

    Matched MeSH terms: Software Design
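    The following Python sketch is a deliberately simple frequency-based next-event predictor over an appliance ON/OFF event history, included only to illustrate the general flavour of sequence prediction. M-SPEED's episode discovery and its use of state durations are considerably more sophisticated and are not reproduced here.

```python
# Frequency-based next-event predictor over a hypothetical ADL event history.
from collections import Counter, defaultdict

history = ["kettle_ON", "kettle_OFF", "toaster_ON", "toaster_OFF",
           "kettle_ON", "kettle_OFF", "tv_ON"]          # hypothetical events

# Count how often each event follows each other event.
transitions = defaultdict(Counter)
for prev, nxt in zip(history, history[1:]):
    transitions[prev][nxt] += 1

def predict_next(event):
    """Return the most frequently observed follower of `event`, if any."""
    followers = transitions.get(event)
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("kettle_ON"))    # -> 'kettle_OFF'
```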
  18. Nik Fadzly, Asyraf Mansor, Rahmad Zakaria, Syed Ahmad Edzham
    Sains Malaysiana, 2014;43:973-976.
    Rattans are among the most unique and economically important plants for most tropical countries. There is, however, a lack of interest in the specific study of rattan spines. In this paper, we tested a new hypothesis concerning the functional role of rattan spines. We proposed that rattan spines also serve as a visual deterrent against herbivores or seed predators. In our proposed method we used imaging software, ImageJ, to measure the spine area of four species of rattan (Calamus insignis, Myrialepis schortechinii, Plectocomiopsis geminiflorus and Calamus caesius) from two different orientations (root to shoot and vice versa). Our results showed that rattan spines were very heterogeneous and highly variable between different species. One common trait that the rattan spines share is that spine area measurements in the shoot-to-root (ShR) orientation are larger than in the root-to-shoot (RH) orientation. We propose that the downward spine angle might be specifically designed to discourage climbing leaf and seed predators.
    Matched MeSH terms: Software
  19. Ali A, Logeswaran R
    Comput Biol Med, 2007 Aug;37(8):1141-7.
    PMID: 17126314
    3D ultrasound systems produce much better reproductions than 2D ultrasound, but their prohibitively high cost deprives many less affluent organizations of this benefit. This paper proposes using the conventional 2D ultrasound equipment readily available in most hospitals, along with a single conventional digital camera, to construct 3D ultrasound images. The proposed system applies computer vision to extract position information of the ultrasound probe while the scanning takes place. The probe, calibrated in order to calculate the offset of the ultrasound scan from the position of the marker attached to it, was used to scan a number of geometrical objects. Using the proposed system, the 3D volumes of the objects were successfully reconstructed. The system was also tested in clinical situations where human body parts were scanned. The results presented, and confirmed by medical staff, are very encouraging for cost-effective implementation of computer-aided 3D ultrasound using a simple setup with 2D ultrasound equipment and a conventional digital camera.
    Matched MeSH terms: Software Design
  20. Assunta Malar Patrick Vincent, Hassilah Salleh
    MyJurnal
    A wide range of studies have been conducted on deep learning to forecast time series data. However, very few have discussed the optimal number of hidden layers and the number of nodes in each hidden layer of the architecture. It is crucial to study the number of hidden layers and nodes in each hidden layer, as these control the performance of the architecture. Apart from that, depending on the activation function, different computations take place between the hidden layers and the output layer. Therefore, in this study, a multilayer perceptron (MLP) architecture is developed using the Python software to forecast time series data. The developed architecture is then applied to the Apple Inc. stock price due to its volatile characteristics. Using historical prices, the accuracy of the forecast is measured across different activation functions, numbers of hidden layers and sizes of data. The Keras deep learning library, available in Python, is used to develop the MLP architecture to forecast the Apple Inc. stock price. The developed model is then applied to different cases, namely different sizes of data, different activation functions, different numbers of hidden layers of up to nine layers, and different numbers of nodes in each hidden layer. The metrics mean squared error (MSE), mean absolute error (MAE) and root-mean-square error (RMSE) are employed to test the accuracy of the forecast. It is found that the architecture with the rectified linear unit (ReLU) in every hidden layer outperformed the others with the highest accuracy in each case. To conclude, the optimal number of hidden layers differs in every case, as there are other influencing factors. (A minimal illustrative Keras MLP sketch follows this entry.)
    Matched MeSH terms: Software
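    The following is a minimal Keras MLP for one-step-ahead forecasting from lagged values, in the spirit of the architecture search described above. The data are synthetic, and the layer sizes and activation are illustrative assumptions, not the study's optimal configuration.

```python
# Minimal Keras MLP forecasting the next value of a synthetic series from lags.
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=500))            # synthetic "price" series
lags = 5
X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:]

model = keras.Sequential([
    keras.layers.Input(shape=(lags,)),
    keras.layers.Dense(32, activation="relu"),      # two ReLU hidden layers (assumed)
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),                          # one-step-ahead forecast
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.fit(X[:400], y[:400], epochs=20, verbose=0)

mse, mae = model.evaluate(X[400:], y[400:], verbose=0)
print(f"test MSE = {mse:.3f}, MAE = {mae:.3f}")
```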