Displaying publications 21 - 40 of 313 in total

  1. Tavana M, Khosrojerdi G, Mina H, Rahman A
    Eval Program Plann, 2019 Dec;77:101703.
    PMID: 31442587 DOI: 10.1016/j.evalprogplan.2019.101703
    The primary goal in project portfolio management is to select and manage the optimal set of projects that contributes the maximum business value. However, selecting Information Technology (IT) projects is a difficult task due to the complexities and uncertainties inherent in the strategic-operational nature of the process, and the existence of both quantitative and qualitative criteria. We propose a two-stage process to select an optimal project portfolio with the aim of maximizing project benefits and minimizing project risks. We construct a two-stage hybrid mathematical programming model by integrating the Fuzzy Analytic Hierarchy Process (FAHP) with a Fuzzy Inference System (FIS). This hybrid framework provides the ability to consider both quantitative and qualitative criteria while accounting for budget constraints and project risks. We also present a real-world case study in the cybersecurity industry to exhibit the applicability and demonstrate the efficacy of the proposed method.
    Matched MeSH terms: Models, Statistical*
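    The selection stage in the preceding entry can be illustrated with a minimal, hypothetical Python sketch: a brute-force search for the best portfolio under a budget cap, with made-up benefit and risk scores standing in for the FAHP/FIS outputs. The authors solve a mathematical program rather than enumerating subsets; this only shows the objective and constraint shape.

      from itertools import combinations

      # Hypothetical projects: (name, benefit score, risk score, cost).
      # The scores stand in for FAHP/FIS outputs described in the abstract.
      projects = [("A", 0.80, 0.30, 40), ("B", 0.60, 0.10, 25),
                  ("C", 0.90, 0.50, 55), ("D", 0.40, 0.20, 15)]
      budget = 80.0

      def portfolio_value(subset, risk_weight=0.5):
          # Maximize benefit while penalizing risk, as in the abstract's aim.
          return sum(b - risk_weight * r for _, b, r, _ in subset)

      best = max((s for k in range(len(projects) + 1)
                  for s in combinations(projects, k)
                  if sum(p[3] for p in s) <= budget),
                 key=portfolio_value)
      print([p[0] for p in best])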
  2. Awajan AM, Ismail MT, Al Wadi S
    PLoS One, 2018;13(7):e0199582.
    PMID: 30016323 DOI: 10.1371/journal.pone.0199582
    Many researchers have documented that stock market data are nonstationary and nonlinear time series. In this study, we use the EMD-HW bagging method for nonstationary and nonlinear time series forecasting. The EMD-HW bagging method is based on empirical mode decomposition (EMD), the moving block bootstrap and the Holt-Winters method. The stock market time series of six countries are used to compare the EMD-HW bagging method with fourteen selected methods. This comparison is based on five forecasting error measurements. The comparison shows that the forecasts of EMD-HW bagging are more accurate than those of the fourteen selected methods.
    Matched MeSH terms: Models, Statistical*
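    The Holt-Winters component of the EMD-HW method above can be sketched with statsmodels on a synthetic series. This is not the authors' pipeline: EMD-HW bagging would fit this model to each bootstrapped, EMD-decomposed replica and combine the forecasts, whereas the sketch fits a single additive-trend model.

      import numpy as np
      from statsmodels.tsa.holtwinters import ExponentialSmoothing

      # Synthetic stand-in for a stock index (the study used six markets).
      rng = np.random.default_rng(0)
      prices = 100 + np.cumsum(rng.normal(0.1, 1.0, 500))

      # Additive-trend Holt-Winters fit and a 10-step-ahead forecast.
      fit = ExponentialSmoothing(prices, trend="add").fit()
      print(fit.forecast(10))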
  3. Segun OE, Shohaimi S, Nallapan M, Lamidi-Sarumoh AA, Salari N
    PMID: 32429373 DOI: 10.3390/ijerph17103474
    Background: despite the increase in malaria control and elimination efforts, weather patterns and ecological factors continue to serve as important drivers of malaria transmission dynamics. This study examined the statistical relationship between weather variables and malaria incidence in Abuja, Nigeria. Methodology/Principal Findings: monthly data on malaria incidence and weather variables were collected in Abuja from the year 2000 to 2013. The analysis of count outcomes was based on generalized linear models, while Pearson correlation analysis was undertaken at the bivariate level. The results showed more malaria incidence in the months with the highest rainfall recorded (June-August). Based on the negative binomial model, every unit increase in humidity corresponds to about a 1.010 (95% confidence interval (CI), 1.005-1.015) times increase in malaria cases, while the odds of having malaria decrease by 5.8% for every extra unit increase in temperature: 0.942 (95% CI, 0.928-0.956). At lag 1 month, there was a significant positive effect of rainfall on malaria incidence, while at lag 4, temperature and humidity had significant influences. Conclusions: malaria remains a widespread infectious disease among the local subjects in the study area. Relative humidity was identified as one of the factors that influence a malaria epidemic at lag 0, while the biggest significant influence of temperature was observed at lag 4. Therefore, emphasis should be given to vector control activities and to creating public health awareness on the proper usage of intervention measures such as indoor residual sprays to reduce the epidemic, especially during peak periods with suitable weather conditions.
    Matched MeSH terms: Models, Statistical*
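    A minimal sketch of the kind of negative binomial GLM reported above, using statsmodels with simulated (not the study's) data. The exponentiated coefficients are the per-unit multiplicative effects analogous to the 1.010 and 0.942 figures in the abstract.

      import numpy as np
      import statsmodels.api as sm

      # Hypothetical monthly records: humidity (%), temperature (C), case counts.
      rng = np.random.default_rng(1)
      humidity = rng.uniform(40, 90, 168)
      temperature = rng.uniform(22, 35, 168)
      cases = rng.poisson(np.exp(2 + 0.01 * humidity - 0.06 * temperature))

      X = sm.add_constant(np.column_stack([humidity, temperature]))
      fit = sm.GLM(cases, X, family=sm.families.NegativeBinomial()).fit()
      # exp(coefficient) = multiplicative change in expected cases per unit
      # increase in each covariate.
      print(np.exp(fit.params))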
  4. Sim SZ, Gupta RC, Ong SH
    Int J Biostat, 2018 Jan 09;14(1).
    PMID: 29306919 DOI: 10.1515/ijb-2016-0070
    In this paper, we study the zero-inflated Conway-Maxwell Poisson (ZICMP) distribution and develop a regression model. Score and likelihood ratio tests are also implemented for testing the inflation/deflation parameter. Simulation studies are carried out to examine the performance of these tests. A data example is presented to illustrate the concepts. In this example, the proposed model is compared to the well-known zero-inflated Poisson (ZIP) and the zero-inflated generalized Poisson (ZIGP) regression models. It is shown that the fit by ZICMP is comparable or better than these models.
    Matched MeSH terms: Models, Statistical*
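    ZICMP is not available in mainstream Python libraries, but the ZIP benchmark the paper compares against can be fitted with statsmodels. A minimal sketch on simulated zero-inflated counts:

      import numpy as np
      from statsmodels.discrete.count_model import ZeroInflatedPoisson

      # Simulated data: structural zeros mixed with a Poisson count process.
      rng = np.random.default_rng(2)
      x = rng.normal(size=300)
      y = np.where(rng.random(300) < 0.3, 0,
                   rng.poisson(np.exp(0.5 + 0.4 * x)))

      exog = np.column_stack([np.ones(300), x])        # count-model design
      fit = ZeroInflatedPoisson(y, exog,
                                exog_infl=np.ones((300, 1))).fit(disp=0)
      print(fit.summary())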
  5. Stephan BCM, Pakpahan E, Siervo M, Licher S, Muniz-Terrera G, Mohan D, et al.
    Lancet Glob Health, 2020 Apr;8(4):e524-e535.
    PMID: 32199121 DOI: 10.1016/S2214-109X(20)30062-0
    BACKGROUND: To date, dementia prediction models have been exclusively developed and tested in high-income countries (HICs). However, most people with dementia live in low-income and middle-income countries (LMICs), where dementia risk prediction research is almost non-existent and the ability of current models to predict dementia is unknown. This study investigated whether dementia prediction models developed in HICs are applicable to LMICs.

    METHODS: Data were from the 10/66 Study. Individuals aged 65 years or older and without dementia at baseline were selected from China, Cuba, the Dominican Republic, Mexico, Peru, Puerto Rico, and Venezuela. Dementia incidence was assessed over 3-5 years, with diagnosis according to the 10/66 Study diagnostic algorithm. Discrimination and calibration were tested for five models: the Cardiovascular Risk Factors, Aging and Dementia risk score (CAIDE); the Study on Aging, Cognition and Dementia (AgeCoDe) model; the Australian National University Alzheimer's Disease Risk Index (ANU-ADRI); the Brief Dementia Screening Indicator (BDSI); and the Rotterdam Study Basic Dementia Risk Model (BDRM). Models were tested with use of Cox regression. The discriminative accuracy of each model was assessed using Harrell's concordance (c)-statistic, with a value of 0·70 or higher considered to indicate acceptable discriminative ability. Calibration (model fit) was assessed statistically using the Grønnesby and Borgan test.

    FINDINGS: 11 143 individuals without baseline dementia and with available follow-up data were included in the analysis. During follow-up (mean 3·8 years [SD 1·3]), 1069 people progressed to dementia across all sites (incidence rate 24·9 cases per 1000 person-years). Performance of the models varied. Across countries, the discriminative ability of the CAIDE (0·52≤c≤0·63) and AgeCoDe (0·57≤c≤0·74) models was poor. By contrast, the ANU-ADRI (0·66≤c≤0·78), BDSI (0·62≤c≤0·78), and BDRM (0·66≤c≤0·78) models showed similar levels of discriminative ability to those of the development cohorts. All models showed good calibration, especially at low and intermediate levels of predicted risk. The models validated best in Peru and poorest in the Dominican Republic and China.

    INTERPRETATION: Not all dementia prediction models developed in HICs can be simply extrapolated to LMICs. Further work defining what number and which combination of risk variables works best for predicting risk of dementia in LMICs is needed. However, models that transport well could be used immediately for dementia prevention research and targeted risk reduction in LMICs.

    FUNDING: National Institute for Health Research, Wellcome Trust, WHO, US Alzheimer's Association, and European Research Council.

    Matched MeSH terms: Models, Statistical*
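    The discrimination measure used above, Harrell's c-statistic, can be computed with the lifelines package. A minimal sketch on hypothetical validation data (not the 10/66 cohort):

      import numpy as np
      from lifelines.utils import concordance_index

      # Hypothetical data: follow-up time (years), dementia indicator, and a
      # model's predicted risk score per participant.
      rng = np.random.default_rng(3)
      risk = rng.normal(size=500)
      time = rng.exponential(5 / np.exp(0.5 * risk))
      event = rng.random(500) < 0.3

      # concordance_index treats higher scores as predicting longer survival,
      # so a higher-risk-is-worse score is negated.
      c = concordance_index(time, -risk, event_observed=event)
      print(f"Harrell's c = {c:.2f}  (>= 0.70 deemed acceptable in the study)")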
  6. Waheeb W, Ghazali R, Herawan T
    PLoS One, 2016;11(12):e0167248.
    PMID: 27959927 DOI: 10.1371/journal.pone.0167248
    Time series forecasting has gained much attention due to its many practical applications. A higher-order neural network with recurrent feedback is a powerful technique that has been used successfully for time series forecasting. It maintains fast learning and the ability to learn the dynamics of the time series over time. Network output feedback is the most common recurrent feedback in recurrent neural network models. However, not much attention has been paid to the use of network error feedback instead of network output feedback. In this study, we propose a novel model, called Ridge Polynomial Neural Network with Error Feedback (RPNN-EF), that incorporates higher-order terms, recurrence and error feedback. To evaluate the performance of RPNN-EF, we used four univariate time series with different forecasting horizons, namely star brightness, monthly smoothed sunspot numbers, the daily Euro/Dollar exchange rate, and the Mackey-Glass time-delay differential equation. We compared the forecasting performance of RPNN-EF with the ordinary Ridge Polynomial Neural Network (RPNN) and the Dynamic Ridge Polynomial Neural Network (DRPNN). Simulation results showed an average 23.34% improvement in Root Mean Square Error (RMSE) with respect to RPNN and an average 10.74% improvement with respect to DRPNN. This means that using network error feedback during training helps enhance the overall forecasting performance of the network.
    Matched MeSH terms: Models, Statistical*
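    The improvement measure behind the 23.34% and 10.74% figures is a simple relative-RMSE comparison; a small self-contained sketch:

      import numpy as np

      def rmse(actual, predicted):
          a, p = np.asarray(actual, float), np.asarray(predicted, float)
          return float(np.sqrt(np.mean((a - p) ** 2)))

      def pct_improvement(rmse_baseline, rmse_candidate):
          # Percentage RMSE improvement of a candidate model over a baseline.
          return 100.0 * (rmse_baseline - rmse_candidate) / rmse_baseline

      print(pct_improvement(rmse([1, 2, 3], [1.2, 1.9, 3.3]),
                            rmse([1, 2, 3], [1.1, 2.0, 3.1])))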
  7. Liang SN, Lan BL
    PLoS One, 2012;7(5):e36430.
    PMID: 22606259 DOI: 10.1371/journal.pone.0036430
    The Newtonian and special-relativistic statistical predictions for the mean, standard deviation and probability density function of the position and momentum are compared for the periodically-delta-kicked particle at low speed. Contrary to expectation, we find that the statistical predictions, which are calculated from the same parameters and initial Gaussian ensemble of trajectories, do not always agree if the initial ensemble is sufficiently well-localized in phase space. Moreover, the breakdown of agreement is very fast if the trajectories in the ensemble are chaotic, but very slow if the trajectories in the ensemble are non-chaotic. The breakdown of agreement implies that special-relativistic mechanics must be used, instead of the standard practice of using Newtonian mechanics, to correctly calculate the statistical predictions for the dynamics of a low-speed system.
    Matched MeSH terms: Models, Statistical
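    One common dimensionless form of the delta-kicked particle map (m = 1) illustrates the comparison: both dynamics share the kick, and differ only in the free drift between kicks. The exact form used in the paper may differ; this is an assumed textbook-style sketch.

      import numpy as np

      def kicked_trajectory(x0, p0, K=1.2, c=1.0, n=1000, relativistic=False):
          # Kick: p -> p + K sin(x). Drift: x -> x + p (Newtonian) or
          # x -> x + p / sqrt(1 + (p/c)^2) (relativistic velocity).
          x, p = x0, p0
          xs = np.empty(n)
          for i in range(n):
              p = p + K * np.sin(x)
              v = p / np.sqrt(1 + (p / c) ** 2) if relativistic else p
              x = (x + v) % (2 * np.pi)
              xs[i] = x
          return xs

      # Identical low-speed initial conditions; per the abstract, chaotic
      # ensembles drive the two predictions apart quickly.
      print(kicked_trajectory(1.0, 1e-3)[-1],
            kicked_trajectory(1.0, 1e-3, relativistic=True)[-1])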
  8. Zulkifley MA, Moran B, Rawlinson D
    Sensors (Basel), 2012;12(5):5623-49.
    PMID: 22778605 DOI: 10.3390/s120505623
    Foreground detection has been used extensively in many applications such as people counting, traffic monitoring and face recognition. However, most of the existing detectors can only work under limited conditions. This happens because of the inability of the detector to distinguish foreground and background pixels, especially in complex situations. Our aim is to improve the robustness of foreground detection under sudden and gradual illumination change, the colour similarity issue, moving backgrounds and shadow noise. Since it is hard to achieve robustness using a single model, we have combined several methods into an integrated system. The masked grey world algorithm is introduced to handle sudden illumination change. Colour co-occurrence modelling is then fused with probabilistic edge-based background modelling. Colour co-occurrence modelling is good at filtering moving backgrounds and robust to gradual illumination change, while edge-based modelling is used to solve the colour similarity problem. Finally, an extended conditional random field approach is used to filter out shadow and afterimage noise. Simulation results show that our algorithm performs better than the existing methods, which makes it suitable for higher-level applications.
    Matched MeSH terms: Models, Statistical
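    The plain (unmasked) grey world step can be sketched in a few lines of numpy: rescale each channel so its mean matches the global mean, which suppresses a sudden global illumination shift. The paper's masked variant additionally restricts the means to reliable pixels, which this sketch omits.

      import numpy as np

      def grey_world(img):
          # img: H x W x 3 array. Equalize per-channel means to the global mean.
          img = img.astype(float)
          channel_means = img.reshape(-1, 3).mean(axis=0)
          gain = channel_means.mean() / channel_means
          return np.clip(img * gain, 0, 255).astype(np.uint8)

      frame = (np.random.default_rng(4).random((4, 4, 3)) * 255).astype(np.uint8)
      print(grey_world(frame).reshape(-1, 3).mean(axis=0))  # near-equal means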
  9. Jeong J
    Sensors (Basel), 2011;11(7):6816-41.
    PMID: 22163987 DOI: 10.3390/s110706816
    This paper presents an acoustic noise cancelling technique using an inverse kepstrum system as an innovations-based whitening application for an adaptive finite impulse response (FIR) filter in a beamforming structure. The inverse kepstrum method derives an innovations-whitened form from one acoustic path transfer function between a reference microphone sensor and a noise source, so that the rear-end reference signal becomes a whitened sequence for a cascaded adaptive FIR filter in the beamforming structure. By using an inverse kepstrum filter as a whitening filter together with a delay filter, the cascaded adaptive FIR filter estimates only the numerator of the polynomial part of the ratio of overall combined transfer functions. The test results have shown that the adaptive FIR filter is more effective in a beamforming structure than in an adaptive noise cancelling (ANC) structure in terms of signal distortion in the desired signal and noise reduction for noise with nonminimum-phase components. In addition, the inverse kepstrum method shows almost the same convergence level in estimating noise statistics while using fewer adaptive FIR filter weights than the kepstrum method, hence it provides better computational simplicity in processing. Furthermore, the rear-end inverse kepstrum method in the beamforming structure has shown less signal distortion in the desired signal than the front-end kepstrum method and the front-end inverse kepstrum method in the beamforming structure.
    Matched MeSH terms: Models, Statistical
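    The adaptive FIR stage can be illustrated with a plain LMS loop in numpy. This is a generic sketch, not the paper's structure: the inverse-kepstrum stage would whiten the reference signal before this loop, and the paper embeds the filter in a beamformer.

      import numpy as np

      def lms_fir(reference, desired, n_taps=16, mu=0.005):
          # Adapt FIR weights to predict the noise in `desired` from
          # `reference`; the error signal is the enhanced output.
          w = np.zeros(n_taps)
          out = np.zeros(len(desired))
          for n in range(n_taps, len(desired)):
              u = reference[n - n_taps:n][::-1]   # most recent sample first
              e = desired[n] - w @ u              # error = desired - output
              w += 2 * mu * e * u                 # gradient-descent update
              out[n] = e
          return out, w

      rng = np.random.default_rng(5)
      noise = rng.normal(size=2000)
      primary = (np.convolve(noise, [0.6, 0.3, 0.1], mode="same")
                 + 0.1 * rng.normal(size=2000))
      print(np.var(lms_fir(noise, primary)[0][500:]))  # residual power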
  10. Han LM, Haron Z, Yahya K, Bakar SA, Dimon MN
    PLoS One, 2015;10(4):e0120667.
    PMID: 25875019 DOI: 10.1371/journal.pone.0120667
    Strategic noise mapping provides important information for noise impact assessment and noise abatement. However, producing reliable strategic noise mapping in a dynamic, complex working environment is difficult. This study proposes the implementation of the random walk approach as a new stochastic technique to simulate noise mapping and to predict the noise exposure level in a workplace. A stochastic simulation framework and software, namely RW-eNMS, were developed to facilitate the random walk approach in noise mapping prediction. This framework considers the randomness and complexity of machinery operation and noise emission levels. Also, it assesses the impact of noise on the workers and the surrounding environment. For data validation, three case studies were conducted to check the accuracy of the predictions and to determine the efficiency and effectiveness of this approach. The results showed high accuracy of the predictions, with a majority of absolute differences less than 2 dBA; also, the predicted noise doses were mostly within the range of the measurements. Therefore, the random walk approach is effective in dealing with environmental noise and can predict strategic noise mapping to facilitate noise monitoring and noise control in workplaces.
    Matched MeSH terms: Models, Statistical
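    A toy illustration of the random-walk idea: walk a worker across the floor, compute the received level from fixed point sources, and energy-average in decibels. The hemispherical spreading law Lp = Lw - 20*log10(r) - 8 and the step model are standard free-field assumptions, not the RW-eNMS implementation.

      import numpy as np

      rng = np.random.default_rng(42)

      def worker_leq(sources, n_steps=1000):
          # sources: list of ((x, y), sound power level Lw in dB).
          pos = np.zeros(2)
          levels = []
          for _ in range(n_steps):
              pos += rng.normal(0, 0.5, size=2)   # random-walk step (m)
              lp = [lw - 20 * np.log10(max(np.linalg.norm(pos - xy), 0.5)) - 8
                    for xy, lw in sources]
              # Combine simultaneous sources on an energy basis.
              levels.append(10 * np.log10(np.sum(10 ** (np.array(lp) / 10))))
          # Equivalent continuous level over the walk (Leq, dBA).
          return 10 * np.log10(np.mean(10 ** (np.array(levels) / 10)))

      print(worker_leq([((5.0, 5.0), 95.0), ((-8.0, 2.0), 100.0)]))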
  11. Radin UR, Mackay MG, Hills BL
    Accid Anal Prev, 1996 May;28(3):325-32.
    PMID: 8799436
    Preliminary analysis of the short-term impact of a running headlights intervention revealed a significant drop in conspicuity-related motorcycle accidents in the pilot areas, Seremban and Shah Alam, Malaysia. This paper looks in more detail at conspicuity-related accidents involving motorcycles. The aim of the analysis was to establish a statistical model describing the relationship between the frequency of conspicuity-related motorcycle accidents and a range of explanatory variables, so that new insights can be obtained into the effects of introducing a running headlight campaign and regulation. The exogenous variables in this analysis include the influence of time trends, changes in the recording and analysis system, the effect of fasting activities during Ramadhan and the "Balik Kampong" culture, a seasonal cultural-religious holiday activity unique to Malaysia. The model developed revealed that the running headlight intervention reduced conspicuity-related motorcycle accidents by about 29%. It is concluded that the intervention has been successful in reducing conspicuity-related motorcycle accidents in Malaysia.
    Matched MeSH terms: Models, Statistical
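    A count-regression model with an intervention dummy is one standard way to estimate such an effect; a hedged sketch on simulated data (the paper's actual model specification is not reproduced here):

      import numpy as np
      import statsmodels.api as sm

      # Hypothetical monthly accident counts with a time trend, a Ramadhan
      # indicator, and a running-headlight intervention dummy.
      rng = np.random.default_rng(6)
      months = np.arange(60)
      ramadhan = (months % 12 == 4).astype(float)
      intervention = (months >= 36).astype(float)
      counts = rng.poisson(np.exp(3.0 + 0.002 * months + 0.15 * ramadhan
                                  - 0.34 * intervention))

      X = sm.add_constant(np.column_stack([months, ramadhan, intervention]))
      fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
      # 1 - exp(intervention coefficient) is the estimated percentage
      # reduction; the paper reports about 29%.
      print(1 - np.exp(fit.params[-1]))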
  12. Zulkifli Yusop, Harisaweni, Fadhilah Yusof
    Sains Malaysiana, 2016;45:87-97.
    Rainfall intensity is the main input variable in various hydrological analyses and modeling. Unfortunately, the quality of rainfall data is often poor, and reliable records are available only at coarse intervals such as yearly, monthly and daily. Short-interval rainfall records are scarce because of the high cost and low reliability of the measurement and monitoring systems. One way to solve this problem is to disaggregate the coarse intervals into shorter ones using a stochastic method. This paper describes the use of the Bartlett-Lewis Rectangular Pulse (BLRP) model. The method was used to disaggregate 10 years of daily data to generate hourly data for 5 rainfall stations in Kelantan, a representative area affected by the monsoon period, and 5 rainfall stations in Damansara, an area affected by the inter-monsoon period. The models were evaluated on their ability to reproduce standard and extreme rainfall statistics derived from the historical record in the disaggregation simulation results. The disaggregation of daily to hourly rainfall produced monthly and daily means and variances that closely match the historical records. However, the standard deviation values of the disaggregated hourly rainfall are lower than the historical ones. Despite the marked differences in standard deviation, both data series exhibit similar patterns, and the model adequately preserves the trends of all the properties used in evaluating its performance.
    Matched MeSH terms: Models, Statistical
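    The mass-conservation idea behind disaggregation can be shown with a deliberately naive sketch: split each daily total over a few randomly placed wet hours. The BLRP model instead generates clustered rectangular rain pulses with calibrated parameters; only the preserve-the-total principle carries over.

      import numpy as np

      def random_disaggregate(daily_mm, wet_hours=6, seed=0):
          # Spread each daily total over `wet_hours` random hours with
          # Dirichlet weights, so hourly values sum back to the daily total.
          rng = np.random.default_rng(seed)
          hourly = np.zeros((len(daily_mm), 24))
          for d, total in enumerate(daily_mm):
              if total <= 0:
                  continue
              hours = rng.choice(24, size=wet_hours, replace=False)
              weights = rng.dirichlet(np.ones(wet_hours))
              hourly[d, hours] = total * weights
          return hourly

      daily = np.array([0.0, 12.5, 3.2, 0.0, 25.0])
      print(random_disaggregate(daily).sum(axis=1))  # matches daily totals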
  13. Masseran N, Safari MAM
    PMID: 34201763 DOI: 10.3390/ijerph18136754
    This article proposes a novel data selection technique called the mixed peak-over-threshold-block-maxima (POT-BM) approach for modeling unhealthy air pollution events. The POT technique is employed to obtain a group of blocks containing data points satisfying extreme-event criteria that are greater than a particular threshold u. The selected groups are defined as POT blocks. In parallel with that, a declustering technique is used to overcome the problem of dependency behaviors that occurs among adjacent POT blocks. Finally, the BM concept is integrated to determine the maximum data points for each POT block. Results show that the extreme data points determined by the mixed POT-BM approach satisfy the independent properties of extreme events, with satisfactory fitted model precision results. Overall, this study concludes that the mixed POT-BM approach provides a balanced tradeoff between bias and variance in the statistical modeling of extreme-value events. A case study was conducted by modeling an extreme event based on unhealthy air pollution events with a threshold u > 100 in Klang, Malaysia.
    Matched MeSH terms: Models, Statistical
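    The mixed POT-BM extraction described above can be sketched directly: keep runs of threshold exceedances (POT blocks), decluster by requiring a minimum gap between runs, then take each run's maximum. The gap-based declustering rule here is an assumption, not necessarily the paper's exact criterion.

      import numpy as np

      def pot_bm(series, u=100.0, gap=3):
          # POT step: runs of consecutive values above u form POT blocks.
          # Declustering: runs separated by >= `gap` below-threshold points.
          # BM step: keep each block's maximum.
          maxima, run, below = [], [], 0
          for v in series:
              if v > u:
                  run.append(v)
                  below = 0
              else:
                  below += 1
                  if run and below >= gap:
                      maxima.append(max(run))
                      run = []
          if run:
              maxima.append(max(run))
          return np.array(maxima)

      api = (100 + 30 * np.sin(np.linspace(0, 20, 500))
             + np.random.default_rng(11).normal(0, 10, 500))
      print(pot_bm(api, u=100.0))   # u > 100 as in the Klang case study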
  14. Siah, W. M., Aminah, A., Ishak, A.
    The effects of soaking conditions on the quality characteristics of seaweed paste of the Kappaphycus alvarezii species were studied. Response Surface Methodology (RSM) with a 2-factor, 5-level central composite design (CCD) was conducted to determine the optimum soaking conditions. The interactive effect of the dry seaweed to soaking water ratio (X1 = 1:15-50) and soaking duration (X2 = 30-120 min) on the gel strength (g), whiteness, expansion (%), moisture content (%) and protein content (g/100 g) of the paste was determined. Results showed that the experimental data could be adequately fitted to a second-order polynomial model, with multiple regression coefficients (R2) of 0.8141, 0.9245, 0.9118, 0.9113 and 0.9271 for gel strength, whiteness, expansion, moisture content and protein content, respectively. The gel strength, whiteness, expansion, moisture content and protein content of the seaweed paste were dependent on the ratio of dry seaweed to soaking water and on the soaking duration. The proposed optimum soaking conditions for the production of seaweed paste are a ratio of 1:15 (dry seaweed : soaking water) and a soaking duration of 117.06 min. Based on the results obtained, RSM is a suitable approach for optimizing the processing of Kappaphycus alvarezii paste.
    Matched MeSH terms: Models, Statistical
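    The second-order polynomial fit at the heart of RSM can be sketched with an ordinary least-squares fit in numpy, on hypothetical design points rather than the paper's CCD runs:

      import numpy as np

      # Hypothetical (ratio, duration) design points and a measured response.
      rng = np.random.default_rng(7)
      x1 = rng.uniform(15, 50, 13)      # soaking-water ratio level
      x2 = rng.uniform(30, 120, 13)     # soaking duration (min)
      y = 5 + 0.2 * x1 + 0.05 * x2 - 0.002 * x1**2 + rng.normal(0, 0.3, 13)

      # Full second-order model:
      # b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2.
      X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      fitted = X @ beta
      r2 = 1 - np.sum((y - fitted) ** 2) / np.sum((y - y.mean()) ** 2)
      print(beta, r2)   # r2 plays the role of the reported R2 values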
  15. Mohd. Izhan Mohd. Yusoff, Mohd. Rizam Abu Bakar, Abu Hassan Shaari Mohd. Nor
    The Expectation Maximization (EM) algorithm has seen a significant increase in usage across many fields of study. In this paper, the performance of the algorithm in finding the maximum likelihood estimates for Gaussian Mixture Models (GMM), a probabilistic model commonly used in fraud detection and in speaker recognition, is shown and discussed. At the end of the paper, some suggestions for future research are given.
    Matched MeSH terms: Models, Statistical
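    scikit-learn's GaussianMixture runs exactly this EM-based maximum-likelihood fit; a minimal sketch on a synthetic two-component mixture:

      import numpy as np
      from sklearn.mixture import GaussianMixture

      # Synthetic mixture of two Gaussians; EM recovers means and weights.
      rng = np.random.default_rng(8)
      X = np.concatenate([rng.normal(-2, 0.5, 300),
                          rng.normal(3, 1.0, 700)]).reshape(-1, 1)

      gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
      print(gmm.means_.ravel(), gmm.weights_, gmm.lower_bound_)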
  16. Nur Hafiza, Z., Maskat, M.Y., Liew, S.L., Mamot, S.
    A study was carried out to observe the fermentation process of noni (Morinda citrifolia L.) extract by Saccharomyces cerevisiae. The experiment was based on a central composite rotatable design (CCRD) employing 5 center points with augmented axial and factorial points, resulting in 30 runs. The M. citrifolia extract was fermented with different combinations of substrate concentration (40, 50, 60, 70 and 80% w/v), inoculum size (0, 1.5, 3, 4.5 and 6% v/v), temperature (30, 33.5, 37, 40.5 and 44°C) and fermentation time (0, 1.5, 3, 4.5 and 6 days). Five physico-chemical characteristics, namely pH, titratable acidity, turbidity, total soluble solids and total polyphenol content, were measured. Results showed that all the responses could be well represented using statistical models. For pH, only fermentation time was found to be not significant, while for titratable acidity and total polyphenol content, the effects of substrate concentration and fermentation time were significant. The effects of inoculum size and temperature were found to be significant for turbidity. For total soluble solids, only the effects of substrate concentration and inoculum size were found to be significant.
    Matched MeSH terms: Models, Statistical
  17. Rakhimov SI, Mohamed Othman
    Iterative methods, particularly over-relaxation methods, are efficiently and frequently used to solve large systems of linear equations, because such systems arise when partial differential equations are discretized under various schemes. In this paper we formulate an accelerated over-relaxation (AOR) method with the quarter-sweep iterative scheme applied to the Poisson equation. To benchmark the new method, we conducted experiments comparing it with the previous AOR methods based on full- and half-sweep iterative schemes. The results of the experiments and the estimates of the computational complexity of the methods demonstrated the superiority of the new method.
    Matched MeSH terms: Models, Statistical
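    The classic (full-sweep) AOR iteration can be sketched in numpy; the paper's quarter-sweep variant applies the same splitting to a reduced set of grid points, which this sketch does not implement.

      import numpy as np

      def aor_solve(A, b, r=0.9, w=1.0, tol=1e-10, max_iter=10_000):
          # AOR splitting: A = D - L - U,
          # x_{k+1} = (D - rL)^{-1} [((1 - w)D + (w - r)L + wU) x_k + w b].
          # r is the acceleration parameter, w the relaxation parameter;
          # r = w recovers SOR, r = 0 and w = 1 recovers Jacobi.
          D = np.diag(np.diag(A))
          L = -np.tril(A, -1)
          U = -np.triu(A, 1)
          M = D - r * L
          N = (1 - w) * D + (w - r) * L + w * U
          x = np.zeros_like(b, dtype=float)
          for _ in range(max_iter):
              x_new = np.linalg.solve(M, N @ x + w * b)
              if np.linalg.norm(x_new - x, np.inf) < tol:
                  return x_new
              x = x_new
          return x

      n = 20
      A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # 1-D Poisson matrix
      b = np.ones(n)
      x = aor_solve(A, b)
      print(np.linalg.norm(A @ x - b))   # residual near zero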
  18. Annazirin Eli, Mardhiyyah Shaffie, Wan Zawiah W
    Sains Malaysiana, 2012;41:1403-1410.
    Statistical modeling of extreme rainfall is essential since the results can help civil engineers and planners estimate the ability of building structures to survive under the most extreme conditions. Data comprising the annual maximum series (AMS) of extreme rainfall in Alor Setar were fitted to the Generalized Extreme Value (GEV) distribution using the method of maximum likelihood (ML) and Bayesian Markov Chain Monte Carlo (MCMC) simulation. The Bayesian MCMC simulation is used in this study to address the weakness of the ML method in handling small samples. To obtain the posterior densities, non-informative and independent priors were employed. The performance of the parameter estimates was verified by conducting several goodness-of-fit tests. The results showed that the Bayesian MCMC method was slightly better than the ML method in estimating the GEV parameters.
    Matched MeSH terms: Models, Statistical
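    The ML side of the comparison can be sketched with scipy on a synthetic annual-maximum series (not the Alor Setar data); a Bayesian MCMC treatment would instead sample the posterior of the three parameters under non-informative priors.

      import numpy as np
      from scipy.stats import genextreme

      # Synthetic AMS data. Note scipy's shape c corresponds to -xi in the
      # usual GEV parameterization.
      rng = np.random.default_rng(9)
      ams = genextreme.rvs(-0.1, loc=80, scale=20, size=40, random_state=rng)

      shape, loc, scale = genextreme.fit(ams)       # maximum-likelihood fit
      print(shape, loc, scale)
      # 100-year return level from the fitted distribution.
      print(genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale))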
  19. Ahmad Mahir Razali, Nurulkamal Masseran, Noriszura Ismail, Malina Zulkifli
    Sains Malaysiana, 2015;44:1363-1370.
    The aim of this paper was to identify the determinants that influence vehicle theft by applying a negative binomial regression model. The identification of these determinants is very important to policy-makers, car-makers and car owners, as it can be used to establish practical steps for preventing or at least limiting vehicle thefts. In addition, this paper also proposed a crime mapping application that allows the areas most at risk of vehicle theft to be identified. The results of this study can be utilized by local authorities, as well as by the internal resource planning of insurance companies, in planning effective strategies to reduce vehicle theft. Indirectly, this paper combines information obtained from the database of Jabatan Perangkaan Malaysia and insurance companies to pioneer the development of a location map of vehicle theft in Malaysia.
    Matched MeSH terms: Models, Statistical
  20. Nor Aishah Ahad, Sharipah Soaad Syed Yahaya, Abdul Rahman Othman
    Sains Malaysiana, 2012;41:1149-1154.
    This article investigates the performance of a two-sample pseudo-median based procedure in testing differences between groups. The procedure is a modification of the one-sample Wilcoxon procedure using the pseudo-median of differences between group values as the central measure of location. The test was conducted on two groups with moderate sample sizes from symmetric and asymmetric distributions. The performance of the procedure was measured in terms of Type I error and power rates computed via Monte Carlo methods, and compared against the t-test and the Mann-Whitney-Wilcoxon test. The findings from this study revealed that the pseudo-median procedure performed very well in controlling Type I error rates close to the nominal value. The pseudo-median procedure outperformed the Mann-Whitney-Wilcoxon test and is comparable to the t-test in controlling Type I error and maintaining adequate power.
    Matched MeSH terms: Models, Statistical
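    The Monte Carlo scheme for estimating Type I error can be sketched with scipy: draw both groups from the same distribution and count rejections at the nominal level. The t-test and Mann-Whitney comparators are shown; the pseudo-median procedure itself is not implemented here.

      import numpy as np
      from scipy.stats import ttest_ind, mannwhitneyu

      rng = np.random.default_rng(10)
      alpha, reps, rej_t, rej_mw = 0.05, 5000, 0, 0
      for _ in range(reps):
          # Both groups from the same skewed (asymmetric) null distribution.
          a, b = rng.exponential(1, 20), rng.exponential(1, 20)
          rej_t += ttest_ind(a, b).pvalue < alpha
          rej_mw += mannwhitneyu(a, b).pvalue < alpha
      # Rates close to 0.05 indicate good Type I error control.
      print(rej_t / reps, rej_mw / reps)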