Mass valuation of properties is important for purposes such as property taxation, the construction of price indices, and understanding market dynamics, and it can be carried out in several ways. This paper reviews the conventional multiple regression analysis (MRA) and several more advanced methods: spatial autoregression (SAR), Kriging, geographically weighted regression (GWR), and moving window regression (MWR). SAR and Kriging are suited to modeling spatial dependence, while GWR and MWR are suited to modeling spatial heterogeneity. SAR and Kriging differ in how the weights are calculated: Kriging weights are derived from the spatial dependence of the price data via semi-variogram analysis, whereas SAR weights are based on the spatial contiguity between the sample data. MWR and GWR are local forms of regression in which the study region is subdivided into local sections to improve prediction accuracy by neutralizing the heterogeneity of spatial autocorrelation. MWR assigns equal weights to observations within a window, while GWR uses distance-decay functions. The merits and drawbacks of each method are discussed.
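The contrast between MWR's equal window weights and GWR's distance decay can be sketched as follows; this is a minimal illustration, not from the paper, and the Gaussian kernel with a 2 km bandwidth is an assumed, common choice:

```python
import math

def gwr_weight(d, bandwidth):
    """GWR: Gaussian distance-decay kernel, so nearby observations count more."""
    return math.exp(-(d / bandwidth) ** 2)

def mwr_weight(d, window):
    """MWR: equal weight for every observation inside the moving window."""
    return 1.0 if d <= window else 0.0

# Weights for sales located 0, 1, 2 and 4 km from the regression point
gwr_ws = [gwr_weight(d, 2.0) for d in (0.0, 1.0, 2.0, 4.0)]
mwr_ws = [mwr_weight(d, 2.0) for d in (0.0, 1.0, 2.0, 4.0)]
```

With these weights, each local regression is an ordinary weighted least squares fit; GWR's weights fade smoothly with distance, while MWR's drop abruptly to zero at the window edge.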
Soil properties are crucial for civil engineers, both to differentiate one type of soil from another and to predict its mechanical behavior. However, it is not practical to measure soil properties at every location on a site. In this paper, an estimator is derived to estimate unknown soil property values at locations where soil samples were not collected. The estimator is obtained by combining the concept of the 'Inverse Distance Method' with the technique of 'Kriging', using the method of Lagrange multipliers. It is shown that the derived estimator is unbiased: its expected deviation from the true value is zero, so that, on average, the estimated value equals the true value of the soil property. It is also shown that the variance of the estimation error is minimized; hence the distribution of this minimum-variance unbiased estimator spreads the least about the true value. With these properties, highly accurate estimates of soil properties can be obtained.
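The role of the Lagrange multiplier in enforcing unbiasedness can be sketched with ordinary kriging weights; the exponential covariance model and the one-dimensional two-sample layout below are illustrative assumptions, not the paper's derivation or data:

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def cov(h):
    """Hypothetical exponential covariance model for the soil property."""
    return math.exp(-abs(h))

# Ordinary kriging at x0 = 0.5 from samples at x = 0 and x = 1.
xs, x0 = [0.0, 1.0], 0.5
n = len(xs)
# Kriging system with a Lagrange-multiplier row: the last equation forces
# the weights to sum to 1, which is exactly the unbiasedness condition.
A = [[cov(xi - xj) for xj in xs] + [1.0] for xi in xs] + [[1.0] * n + [0.0]]
b = [cov(xi - x0) for xi in xs] + [1.0]
sol = solve(A, b)
weights, lagrange = sol[:n], sol[n]
```

The estimate itself is then the weighted sum of the sampled values; because the weights sum to one, the expected value of the estimate equals the expected value of the true property.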
Postural movements can affect aiming stability in archery, contributing to the chance of inconsistent hits. According to the expertise-novice paradigm, the factor that sets winners apart from ordinary athletes is the former's ability to control minute changes in their performance. The present study seeks to determine the relationship between postural sway and shooting performance among skilled Malaysian recurve archers. Twenty-one skilled Malaysian archers participated in this study, with performance level measured by International Archery Federation (FITA) ranking-tournament score. Postural sway was assessed in terms of anterior deviation (positive values) and posterior deviation (negative values) using the ZEPHYR Bio-Harness. Postural sway was analysed at three phases: (i) setup, (ii) aiming, and (iii) release. Participants shot 12 arrows at a 30-meter target. The data yielded a significant relationship between postural sway and shooting performance. The correlation coefficients between shooting performance and postural sway ranged from r = -0.021 to r = 0.248, with the highest correlation recorded at the release phase and the lowest at the aiming phase. The setup phase was the only phase to show anterior deviation throughout the test. During the setup and release phases, the correlation between postural sway and shooting performance was significant (p < 0.001). Multiple regression analysis showed that postural sway during the setup and release phases was a significant indicator of shooting performance, accounting for approximately 17% and 24% of the variance, respectively. In sum, the results indicate that reducing postural sway during the release phase can improve the shooting performance of skilled archery athletes, establishing a significant relationship between postural sway and the shooting performance of skilled archers.
The attractiveness of food as a tourism product derives partly from its gastronomic aspect. The ingredients, the preparation, the end products, and the eating circumstances are cultural, educational, and entertaining. However, little research has empirically demonstrated whether first-time and repeat visitors differ in their food experience at a destination, or how various food-experience attributes influence visitors' overall satisfaction while visiting a destination. This study was undertaken to address that gap. Data were collected via an on-site survey questionnaire administered to a random sample of visitors at the Kuala Lumpur International Airport (KLIA) and various touristic areas around Kuala Lumpur. The results indicated significant differences between first-time and repeat visitors in terms of their food experience. In addition, multiple regression analysis revealed that traditional food preparation was an important factor in overall satisfaction for both first-time and repeat visitors. In sum, this study is the first to examine the effect of food-experience attributes on first-time and repeat visitors separately.
It is now evident that the estimation of logistic regression parameters using the Maximum Likelihood Estimator (MLE) suffers a huge drawback in the presence of outliers. An alternative approach is to use robust logistic regression estimators, such as the Mallows-type leverage-dependent weights estimator (MALLOWS), the Conditionally Unbiased Bounded Influence Function estimator (CUBIF), the Bianco and Yohai estimator (BY), and the Weighted Bianco and Yohai estimator (WBY). This paper investigates the robustness of these estimators using real data sets and Monte Carlo simulations. The results indicate that the MLE behaves poorly in the presence of outliers, while the WBY estimator is more efficient than the other existing robust estimators. It is therefore suggested that the WBY estimator be employed when outliers are present in the data, to obtain a reliable estimate.
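The idea behind Mallows-type estimators, downweighting observations with outlying covariates before maximizing a weighted likelihood, can be sketched as follows; the robust z-score proxy for leverage and the cutoff c are illustrative simplifications, not the exact weights of the estimators compared in the paper:

```python
import statistics

def mallows_weights(x, c=2.5):
    """Downweight observations whose covariate value is outlying.
    Leverage is proxied here by a robust z-score |x - median| / MAD
    (a sketch; the actual MALLOWS estimator uses hat-matrix leverages)."""
    med = statistics.median(x)
    mad = statistics.median(abs(xi - med) for xi in x) or 1.0
    weights = []
    for xi in x:
        z = abs(xi - med) / mad
        weights.append(1.0 if z <= c else c / z)
    return weights

# An extreme covariate value gets a small weight; ordinary points keep weight 1
ws = mallows_weights([1.0, 2.0, 3.0, 2.0, 1.0, 50.0])
```

These weights would then multiply each observation's contribution to the logistic log-likelihood, bounding the influence of high leverage points on the fitted coefficients.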
This study analyses cure rate estimation based on the Bounded Cumulative Hazard (BCH) model using interval-censored data, given that the exact distribution of the data set is unknown. Non-parametric estimation methods are therefore employed by means of the EM algorithm. The Turnbull and Kaplan-Meier estimators were proposed to estimate the survival function, even though the Kaplan-Meier estimator faces some restrictions in terms of interval survival data. The cure rate estimates based on the two estimators were compared through a simulation study.
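A minimal sketch of the Kaplan-Meier product-limit estimator for right-censored data, the simpler of the two estimators (the Turnbull estimator generalizes this idea to interval censoring); the example data are hypothetical:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survival function S(t).
    times: observed times; events: 1 = event occurred, 0 = right-censored."""
    s = 1.0
    surv = {}
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)   # number still at risk at t
        if d:
            s *= 1.0 - d / n
        surv[t] = s
    return surv

# Hypothetical data: events at t = 1, 2, 3; censored observations at t = 2, 4
surv = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
```

In a BCH cure model, the plateau of the estimated survival curve is what carries the information about the cured fraction.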
A gamma spectrometry counting system requires similar counting geometries for the calibration source, reference materials, and samples. The objectives of this study were to determine the effect of sample density on measured ¹³⁷Cs activities and to propose reasonable corrections. The study found that the measured activity of the samples decreased as sample density increased. Therefore, to obtain more accurate estimates of sample activities, density corrections should be applied, either by performing a mathematical correction using an equation or by increasing the expanded uncertainty when sample densities deviate from that of the calibration source.
This study focuses on biochar formation and the torrefaction performance of sugarcane bagasse, predicted using bilinear interpolation (BLI), inverse distance weighting (IDW) interpolation, and regression analysis. It is found that biomass torrefied at 275°C for 60 min, or at 300°C for 30 min or longer, is appropriate for producing biochar as an alternative fuel to coal with a low carbon footprint, but the energy yield from torrefaction at 300°C is too low. Based on the biochar yield, the enhancement factor of the higher heating value (HHV), and the energy yield, the results suggest that all three methods are feasible for predicting performance, especially the enhancement factor. A power parameter of unity in the IDW method provides the best predictions, with errors below 5%. Second-order regression gives a more reasonable approximation than first-order regression and is recommended for the predictions.
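The IDW prediction with power parameter p can be sketched as follows; the temperature-yield pairs are hypothetical placeholders, not measurements from the study:

```python
def idw(x, xs, ys, p=1.0):
    """Inverse distance weighting: a weighted mean of the known values with
    weights 1/d**p; p = 1 is the power of unity reported to predict best."""
    num = den = 0.0
    for xi, yi in zip(xs, ys):
        d = abs(x - xi)
        if d == 0.0:
            return yi              # exact interpolation at a sample point
        w = 1.0 / d ** p
        num += w * yi
        den += w
    return num / den

# Hypothetical biochar yields (%) at two torrefaction temperatures (deg C);
# the midpoint prediction reduces to the simple mean of the two yields
y_mid = idw(287.5, [275.0, 300.0], [60.0, 45.0])
```

Larger p concentrates weight on the nearest measured condition; p = 1 spreads it most evenly among neighbors.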
Management constantly faces rapid change in business, including in inventory management. For fast-moving inventories especially, correct stocking, controlling, checking, and safety-stock calculation are needed to maintain sound inventory management and to reduce the possibility of running out of inventory, which leads to an inability to meet demand. One way to achieve this is through excellent and appropriate forecasting. The objective of this concept paper is therefore to analyse and recommend tools to improve inventory management using an appropriate time-series forecasting method. The firm studied serves its employees as customers, who demand routine items, including stationery and other routine products, to support their jobs as auditors and consultants for the firm's clients. However, out-of-stock situations occur for fast-moving items, especially in the peak-season period, and the firm replenishes only on the basis of the inventory used in the previous month. This study therefore suggests eliminating out-of-stock situations through precautionary initiatives such as time-series forecasting. The study plans to employ 10 time-series forecasting methods, such as moving average, exponential smoothing, regression analysis, Holt-Winters analysis, seasonal analysis, and the Autoregressive Integrated Moving Average (ARIMA), using Risk Simulator software. By simulating these methods, the most appropriate one is selected based on forecast-accuracy measures.
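Of the candidate methods, single exponential smoothing is the simplest to sketch; selecting its smoothing constant by in-sample error mirrors, in miniature, the paper's plan to pick among methods by forecast accuracy (the demand figures below are hypothetical):

```python
def ses_forecast(series, alpha):
    """Single exponential smoothing: forecast for the next period."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

def mape(series, alpha):
    """In-sample mean absolute percentage error of one-step-ahead forecasts."""
    level = series[0]
    errors = []
    for y in series[1:]:
        errors.append(abs(y - level) / y)
        level = alpha * y + (1 - alpha) * level
    return 100.0 * sum(errors) / len(errors)

# Hypothetical monthly usage of a fast-moving stationery item
demand = [100.0, 120.0, 90.0, 130.0, 110.0, 125.0]
# Pick the smoothing constant with the lowest in-sample error
best = min((0.1, 0.3, 0.5, 0.7, 0.9), key=lambda a: mape(demand, a))
```

The same accuracy measure (or MAD, RMSE, etc.) can then rank entirely different methods, which is the selection logic the concept paper proposes.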
Clustering is one of the major primary data-mining tools. It helps researchers understand the natural grouping of attributes in data sets. Clustering is an unsupervised classification method whose major aim is partitioning, such that objects in the same cluster are similar while objects belonging to different clusters differ significantly with respect to their attributes. However, the classical standardized Euclidean distance, which uses the standard deviation to down-weight the maximum points of the ith feature in the distance computation, has been criticized by many scholars: it is sensitive to outliers, lacks robustness, has a 0% breakdown point, and has low efficiency at the normal distribution. To remedy these problems, we suggest two statistical estimators with 50% breakdown points, namely the Sn and Qn estimators, with 58% and 82% efficiency, respectively. The proposed methods evidently outperformed the existing method in down-weighting the maximum points of the ith features in distance-based clustering.
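A naive version of the Sn estimator and its use in a robustly standardized Euclidean distance can be sketched as follows; this is an O(n²) simplification using ordinary medians, whereas the exact Sn estimator uses a high/low-median pair and admits an O(n log n) algorithm:

```python
import statistics

def sn_scale(x, c=1.1926):
    """Naive version of the Sn robust scale estimator: the median over i of
    the median over j of |x_i - x_j|, rescaled for consistency at the normal
    distribution. A robust stand-in for the standard deviation."""
    meds = [statistics.median(abs(xi - xj) for xj in x) for xi in x]
    return c * statistics.median(meds)

def robust_standardized_distance(a, b, scales):
    """Euclidean distance with each feature divided by a robust scale
    estimate (e.g. Sn) instead of the standard deviation."""
    return sum(((ai - bi) / s) ** 2 for ai, bi, s in zip(a, b, scales)) ** 0.5
```

On data with one gross outlier, the Sn estimate stays near the scale of the bulk of the points while the standard deviation explodes, so distances standardized by Sn are not distorted by the outlier.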
This study examines the influence of plans for further study, career growth, and discriminatory treatment on turnover intention among technicians in the electronics industry in Malaysia. The objectives are: (i) to identify the relationship between plans for further study and turnover intention among factory technicians, (ii) to identify the relationship between career growth and turnover intention among factory technicians, and (iii) to identify the relationship between discriminatory-treatment factors and turnover intention among factory technicians. The population involved in this study comprised manufacturing technicians at an electronics factory. Survey questionnaires were used to collect data, and a total of 110 questionnaires were analyzed. The Pearson correlation coefficient and regression analysis were used to measure the degree of relationship between variables. The findings showed that all independent variables (plans for further study, career growth, and discriminatory treatment) were moderately positively correlated with turnover intention.
Dwarf bamboo is recognized as a significant determinant of structure and dynamics in temperate forests. Quantitative relationships between the abundance (density and coverage) of the dwarf bamboo Fargesia nitida and the micro-environment and floor species diversity were estimated in a pure Abies faxoniana forest in southwest China. Understory micro-environmental conditions (daily ranges of temperature and moisture, relative photosynthetic photon flux density (RPPFD) under the bamboo layer, and ground cover) changed dramatically with bamboo density. Stepwise multiple regression analyses indicated that the abundance of F. nitida was mainly correlated with canopy characteristics and disturbance factors in the A. faxoniana pure forest. All richness indices decreased significantly with bamboo density and with RPPFD under the bamboo layer. Importance values (IV) of understory species at different bamboo densities and detrended canonical correspondence analysis (DCCA) indicated three understory plant groups: resistant to high bamboo abundance (Group A), resistant to intermediate bamboo abundance (Group B), and sensitive to bamboo abundance (Group C). These groups mainly responded to bamboo abundance and to RPPFD under the bamboo layer, resulting from the combined characteristics of the bamboo, canopy, and topographic factors. Different bamboo abundances had different influences on understory species diversity and groups: the dense F. nitida condition (> 10 culms/m²) had a significant negative effect on understory species diversity in the A. faxoniana forest, whereas the 0-5 culms/m² condition did not. We suggest that fine-scale analysis of the effects of bamboo abundance should be taken into account when considering heterogeneous patches during the succession and regeneration of natural forests.
Outliers in the X-direction, or high leverage points, are the most recently identified source of multicollinearity. Multicollinearity is non-orthogonality among two or more explanatory variables in a multiple regression model, which can have an important influence on the interpretation of a fitted regression model. In this paper, we performed Monte Carlo simulation studies with two main objectives. The first was to study how the magnitude and percentage of high leverage points, two important factors in turning high leverage points into collinearity-enhancing observations, affect the multicollinearity pattern of the data. The second was to investigate in which situations these points induce different degrees of multicollinearity, such as moderate or severe. According to the simulation results, high leverage points must be of large magnitude in at least two explanatory variables to guarantee that they cause multicollinearity problems. We also propose practical Lower Bound (LB) and Upper Bound (UB) values for the High Leverage Collinearity-Influential Measure (HLCIM), an essential measure for detecting the degree of multicollinearity. A well-known example is used to confirm the simulation results.
This study was carried out to determine the influence of LMX on the SCB worker dimension in a national automotive company in Malaysia. There were 360 respondents from the automotive company involved in this study. Data collected from respondents were analysed using descriptive statistics (demographic frequencies) and inferential statistics (correlation and regression analysis). The results showed that one of the four independent variables had a positive influence on SCB. In terms of demographic factors, none of the variables (age, gender, and period of service) made any significant difference to LMX or SCB, except position category. Further suggestions regarding LMX and SCB are discussed based on the findings.
Multicollinearity among explanatory variables in a regression model can make the regression coefficients insignificant and difficult to interpret. Principal component regression (PCR) is an effective way of dealing with multicollinearity in regression analysis. The existence of multicollinearity may or may not be induced by the presence of influential observations. This paper discusses some diagnostic methods for identifying influential observations in PCR. A data set on the water quality of New York rivers is used to illustrate the methods.
Multicollinearity among the explanatory variables of a regression model can render the regression coefficients insignificant and difficult to interpret. Principal component regression (PCR) is an effective way of solving the multicollinearity problem in regression analysis. The existence of multicollinearity may be caused by influential outlying observations. This paper discusses several methods for identifying influential observations in PCR. Data on the water quality of several rivers in New York are used to illustrate the proposed identification methods.
This paper forecasts the daily closing prices of stock markets. We propose a two-stage technique that combines empirical mode decomposition (EMD) with the nonparametric method of local linear quantile (LLQ) regression. We use the proposed technique, EMD-LLQ, to forecast two stock index time series. Detailed experiments compare the proposed EMD-LLQ method with the EMD and Holt-Winters methods. The proposed EMD-LLQ model is found to be superior to the EMD and Holt-Winters methods in predicting stock closing prices.
Pharmacokinetic-pharmacodynamic information regarding warfarin is used to produce a predictive model based on the idea that pharmacodynamic variability is more important than pharmacokinetic variability in the overall dose-response variability to warfarin. A modification of the maximum effect model is tested on a group of patients initiating oral anticoagulation with warfarin. Results indicate that the model can account for at least half of the total variation in maintenance doses observed (sample coefficient of determination, 0.53) and offer the physician a framework for dose requirements at the onset of therapy. The basic prediction equation is as follows: Maintenance dose = (11/international normalized ratio)-1, with a coefficient of correlation of 0.73 (95% confidence limits, 0.46-0.88). Application of this model may improve on the traditional empiric approach to warfarin dose adjustment.
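The reported prediction equation is simple enough to state directly in code; this reproduces only the point equation from the abstract (dose units as in the source), not its confidence limits:

```python
def predicted_maintenance_dose(inr):
    """Basic prediction equation from the abstract:
    maintenance dose = (11 / international normalized ratio) - 1."""
    return 11.0 / inr - 1.0

# For an observed INR of 2.5, the predicted maintenance dose is 11/2.5 - 1 = 3.4
dose = predicted_maintenance_dose(2.5)
```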
Responses were recorded from normal healthy subjects and age-related macular degeneration (AMD) patients and evaluated using a new variant of the multifocal visual evoked potential (mfVEP). Subjects' responses were recorded using 64 EEG channels with a computer-based acquisition system. The stimulus layout was an 84-region, cortically scaled dartboard comprising 12 sectors and seven concentric rings subtending a diameter of 23°, presented dichoptically at 60 Hz. Data from the control subjects and AMD patients were statistically compared when fitted concurrently in a multiple regression analysis. The pattern-pulse mfVEP technique could distinguish between normal eyes and those with a definite diagnosis of dry or wet AMD when responses from the macula were considered.
Rheology is the science of the deformation and flow behavior of fluids. Knowledge of the rheological properties of fluid foods, and of their variation with temperature and concentration, is important throughout food technology: for quality control, understanding texture, process-engineering applications, correlation with sensory evaluation, the design of transport systems and equipment (heat exchangers and evaporators), and determining pump capacity and the power required for mixing. The aim of this study was to determine the rheological behavior of pomelo juice at different concentrations (20-60.4%) and temperatures (23-60°C) using a rotational Haake Rheostress 600 rheometer. Pomelo juice was found to exhibit both Newtonian and non-Newtonian behavior: Newtonian behavior was observed at lower concentrations and non-Newtonian behavior at higher concentrations. The standard error (SE) was used to assess which model fitted best. For the four models considered, the SE values show that the Herschel-Bulkley and Power Law models perform better than the Bingham and Casson models, with the Herschel-Bulkley model most appropriate at higher concentrations. A rheological model of pomelo juice incorporating the effects of concentration and temperature was developed. A master curve was constructed for comparing data from different products at a reference temperature of 40°C. Multiple regression analysis indicated that the master curve agrees well with the pomelo juice data at all concentrations studied, with R² > 0.8.
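Of the four models compared, the Power Law (Ostwald-de Waele) model is the simplest to fit; a common approach, sketched below with synthetic data rather than the study's measurements, is ordinary least squares on the log-transformed model:

```python
import math

def fit_power_law(shear_rates, stresses):
    """Fit the power-law model tau = K * gamma_dot**n by ordinary least
    squares on log(tau) = log(K) + n * log(gamma_dot)."""
    xs = [math.log(g) for g in shear_rates]
    ys = [math.log(t) for t in stresses]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    n = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    K = math.exp(my - n * mx)
    return K, n

# Synthetic consistency check: data generated from K = 2, n = 0.8
rates = [1.0, 2.0, 4.0, 8.0]
K, n = fit_power_law(rates, [2.0 * g ** 0.8 for g in rates])
```

A fitted flow behavior index n < 1 indicates shear-thinning (non-Newtonian) behavior, while n close to 1 with negligible yield stress is consistent with the Newtonian behavior seen at lower concentrations; the Herschel-Bulkley model adds a yield-stress term to this same form.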