Quality Function Deployment (QFD) is a structured methodology that translates customer and technical requirements into design targets, enabling designers and manufacturers to deliver better products. Many researchers combine or integrate QFD with other methodologies, such as the Theory of Inventive Problem Solving (TRIZ) or Design for Manufacture and Assembly (DFMA), to optimise product design innovation and improvement. The combined methodologies are even used to solve process problems. An initial review of the literature on stand-alone QFD applications revealed several problems. Combining QFD with other techniques, such as TRIZ and DFMA, has helped to address these issues and forms the basis of future research. The integrated methods can resolve the main contradictions more precisely, from product demand analysis through product design, production and application. A review of the literature, specifically that on the research and development of QFD, TRIZ and DFMA, showed that these methodologies have been widely and successfully implemented in several practical applications, such as resolving conflicts between customer and technical/engineering requirements and reducing production cost. This review provides an in-depth analysis of the strengths, weaknesses and outcomes of QFD combined with TRIZ, and of QFD integrated with DFMA.
This paper reviews the fatigue approaches and analyses of fibre-reinforced composite materials that have been reported by researchers worldwide. The aim of this review is to provide a clearer picture of the analytical approaches presently available for predicting fatigue life in composite materials. The review also proposes a new interpretation of the available theories and identifies gaps in the fatigue study of natural fibre reinforced composite materials. It is concluded that studies on the fatigue analysis of natural fibre reinforced composites remain very limited, especially those using non-destructive testing (NDT) methods, and that new mathematical models of fatigue behaviour should be formulated.
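To give one concrete example of the class of analytical models such a review covers (a standard stress-life baseline, not a model proposed by this paper), the classical Basquin relation links stress amplitude to cycles to failure:

```latex
% Basquin stress-life relation (standard baseline; notation assumed here):
%   \sigma_a  : applied stress amplitude
%   \sigma_f' : fatigue strength coefficient (material constant)
%   b         : fatigue strength exponent (material constant, typically negative)
%   N_f       : number of cycles to failure
\sigma_a = \sigma_f' \,(2N_f)^{b}
```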
Craniofacial superimposition methods are employed for the identification of unknown skulls or living persons. There are many such methods; of particular interest is the technique developed by Furue, which is inexpensive to set up. A study was undertaken to ascertain the validity of this technique and to correlate our findings with those of other researchers.
There are two main reasons that motivate people to detect outliers: the first is the researcher's intention (see the example of Mr Haldum's cases in Barnett and Lewis); the second is the effect of outliers on analyses. This article does not differentiate between the various justifications for outlier detection; the aim is to alert the analyst to observations that are isolated from the other observations in the data set. In this article, we introduce an eigenstructure-based angle for outlier detection. This method is simple and effective in dealing with masking and swamping problems. The proposed method is illustrated and compared with the Mahalanobis distance using several data sets.
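The abstract does not spell out the eigenstructure-based angle computation, but the Mahalanobis benchmark it compares against is standard; a minimal Python sketch of that benchmark (the function name and chi-square cutoff are illustrative choices, not from the paper):

```python
import numpy as np
from scipy import stats

def mahalanobis_flags(X, alpha=0.975):
    """Classical benchmark: flag rows whose squared Mahalanobis distance
    exceeds a chi-square quantile. Because the mean and covariance are
    non-robust, clustered outliers can mask one another -- the weakness
    the eigenstructure-based method is designed to address."""
    diff = X - X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d2 = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)  # squared distances
    return d2 > stats.chi2.ppf(alpha, df=X.shape[1])

# Example: 50 well-behaved 2-D points plus one gross outlier at (8, 8)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(size=(50, 2)), [[8.0, 8.0]]])
print(np.where(mahalanobis_flags(X))[0])  # expected to include index 50
```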
Many times, data are given only at discrete points; the problem of constructing a continuous function through them is called data fitting. With interpolation, we seek a function that allows us to approximate f(x) so that functional values between the original data points may be determined. The process of finding such a polynomial is called interpolation, and one of the most important approaches is the Lagrange interpolating formula. In this study, the researcher determines the interpolating polynomial using the Lagrange formula. A mathematical model was then built in MATLAB to determine the polynomial interpolant for a given set of points using the Lagrange method. The results show that the manual calculation and the MATLAB model give the same answer for the evaluated x and the same graph.
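The Lagrange formula the study implements is standard; a minimal sketch in Python (the paper itself used MATLAB, so this is a parallel illustration, not the study's code):

```python
def lagrange_eval(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial
    P(x) = sum_i y_i * prod_{j != i} (x - x_j) / (x_i - x_j)
    at the point x, given distinct nodes xs and values ys."""
    total = 0.0
    for i in range(len(xs)):
        basis = 1.0
        for j in range(len(xs)):
            if j != i:
                basis *= (x - xs[j]) / (xs[i] - xs[j])
        total += ys[i] * basis
    return total

# Example: three points sampled from f(x) = x^2; P(2.5) recovers 6.25 exactly
print(lagrange_eval([1.0, 2.0, 3.0], [1.0, 4.0, 9.0], 2.5))
```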
Sustainable construction and demolition waste management relies heavily on the attitudes and actions of its constituents; nevertheless, deep analysis for identifying the best estimator is rarely attempted. The main objective of this study is to perform a comparative analysis of different estimation approaches for Structural Equation Modeling (SEM) in Construction and Demolition Waste Management (C&DWM) modeling, based on an Extended Theory of Planned Behaviour (Extended TPB). The research model includes twelve latent variables, six independent variables, one mediator, three control variables, and one dependent variable. Maximum likelihood (ML), partial least squares (PLS), and Bayesian estimators were considered in this study. The explained variation (R-square) of SEM with the Bayesian estimator was 85.8%, and of the six main variables affecting C&DWM behaviour (the dependent variable), five had significant relationships. With the ML estimator, the explained variation was 78.2%, and four correlations with the dependent variable were significant. Finally, the R-square of SEM with the PLS estimator was 73.4%, and three correlations with the dependent variable were significant. At the same time, the values of three statistical indices, root mean square error (RMSE), mean absolute percentage error (MAPE), and mean absolute error (MAE), were lower with the Bayesian estimator than with either ML or PLS. Therefore, compared to both PLS and ML, the predicted values of the Bayesian estimator are closer to the observed values. Lower values of MAPE, RMSE, and MAE and higher values of R-square indicate better goodness of fit, and the SEM with a Bayesian estimator accordingly revealed a better data fit than both the PLS and ML estimators. The pattern shows that the relationships between research variables can change with different estimators. Hence, researchers using the SEM technique must carefully consider the primary estimator for their data analysis; this precaution is necessary because higher error implies different regression coefficients in the research model.
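The three fit indices used to rank the estimators are standard; a minimal sketch of how they are computed from observed and predicted values (the numbers shown are illustrative, not the study's data):

```python
import numpy as np

def fit_indices(y_obs, y_pred):
    """RMSE, MAE and MAPE: lower values mean predictions closer to
    observations, which is how the Bayesian estimator was judged best."""
    err = y_obs - y_pred
    rmse = np.sqrt(np.mean(err ** 2))          # root mean square error
    mae = np.mean(np.abs(err))                 # mean absolute error
    mape = np.mean(np.abs(err / y_obs)) * 100  # mean absolute percentage error
    return rmse, mae, mape

# Hypothetical observed vs predicted scores, for illustration only
y_obs = np.array([3.0, 4.0, 5.0, 4.5])
y_pred = np.array([2.8, 4.1, 4.7, 4.6])
print(fit_indices(y_obs, y_pred))
```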
Drucker's knowledge-worker productivity theory and the knowledge-based view of the firm are widely employed in many disciplines, but there has been little application of these theories to knowledge-based innovation among academic researchers. Therefore, this study evaluates the effects of the knowledge management process on knowledge-based innovation, alongside the mediating role of Malaysian academic researchers' productivity, during the COVID-19 pandemic. Using a random sampling technique, data were collected from 382 academic researchers. Questionnaires were self-administered and the data were analyzed via Smart PLS-SEM. The knowledge management process and knowledge workers' productivity have a positive and significant relationship with knowledge-based innovation among academic researchers during the pandemic. In addition, knowledge workers' productivity mediates the relationship between the knowledge management process (knowledge creation, knowledge acquisition, knowledge sharing, and knowledge utilization) and knowledge-based innovation. The results also identify knowledge sharing as the key factor in knowledge-based innovation and a stimulating task for the management discipline around the world during the pandemic. This study provides interesting insights into Malaysian academic researchers' productivity by evaluating the effects of knowledge creation, acquisition, sharing, and application on knowledge-based innovation; these insights should enable policymakers to develop more effective educational strategies. By assimilating the literature on the defined variables, the main contribution of this study is the evaluation of knowledge creation, acquisition, sharing, and utilization in knowledge-based innovation, alongside the mediating role of knowledge workers' productivity, in the higher education sector of Malaysia during the COVID-19 pandemic.
This article seeks to address and dispel some of the popular myths and misconceptions surrounding authorship of a scientific publication, as authorship is often misconstrued by beginners in academia, especially those in the developing world. While ethical issues in publishing related to authorship have been increasingly discussed, not much has been written about the myths and misconceptions of who might be an author. Dispelling these myths and misconceptions would go a long way in shaping the thoughts and plans of students, junior faculty and researchers in academia, especially in the developing world.
Among palm oil millers, the ripeness of oil palm Fresh Fruit Bunches (FFB) is determined through visual inspection. To increase the millers' productivity, many researchers have proposed new detection methods to replace the conventional one. The sensitivity of the sensor plays a crucial role in determining the effectiveness of any such method. In our preliminary study, a novel oil palm fruit sensor for detecting the maturity of oil palm fruit bunches was proposed. The design of the proposed air coil sensor, based on an inductive sensor, is further investigated here to improve its sensitivity. This paper reports the effects of the air coil structure of the oil palm fruit sensor, considering copper wire diameters ranging from 0.10 mm to 0.18 mm with 60 turns. A flat-type air coil was used on twenty fruitlet samples from two categories, ripe and unripe, tested at frequencies ranging from 20 Hz to 120 MHz. The sensitivity of the sensor between air and fruitlet samples increases as the coil diameter increases. As for the sensitivity difference between ripe and unripe samples, the 5 mm air coil length with the 0.12 mm wire diameter provides the highest percentage difference between samples, and among the highest deviation values between samples. These results are important for improving the sensitivity of the inductive oil palm fruit sensor, particularly with regard to the design of the air coil structure, so that the sensor's efficiency in determining the maturity of oil palm FFB and the ripening of the fruitlets can be further enhanced.
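The abstract reports sensitivity as a percentage difference between ripe and unripe readings; assuming the measured quantity is the coil response R (e.g. inductance, which the abstract does not state explicitly), the usual form of such a comparison is:

```latex
% Assumed form of the reported percentage difference; R denotes the
% measured coil response (e.g. inductance), not a value from the paper:
\Delta R\,(\%) = \frac{\lvert R_{\mathrm{ripe}} - R_{\mathrm{unripe}} \rvert}{R_{\mathrm{unripe}}} \times 100
```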
Tissue engineering (TE) research serves to overcome major obstacles in organ transplantation. This paper summarizes the progress of TE in Malaysia. The online database Elsevier’s SCOPUS was accessed, and publications related to TE from 1960 to 2017 were scrutinized. The results show an increasing trend in tissue engineering research and development in Malaysia. The search identified 264 original articles, which were then examined. It is hoped that the outcomes of this study can serve as a point of reference for researchers on the status of TE research and development in Malaysia, and assist TE researchers in Malaysia in identifying the strengths, weaknesses, opportunities and obstacles to further enhancing their activities. Consolidating, realigning and re-strategizing those initiatives should also be seen within the context of nurturing potential and budding researchers in TE.
Scoring well in co-curricular activities is important for students at the Centre of Foundation Studies, Universiti Teknologi MARA Dengkil, because co-curricular marks contribute 10% of the merit in students' applications for degree programmes, as set by Bahagian Pengurusan Kemasukan Pelajar (UPU). In each new cohort, almost 5,000 students register at this academic centre. Previously, the grading process was done manually, based on the number of activity coupons collected by students in their co-curricular activity cards. The activity cards posed several challenges: high cost, lost coupons and cards, stolen cards, and difficulty for students in monitoring their achievement throughout their studies. The process of recording thousands of students' marks took place almost every day, so it had to be improved. To solve these problems, the CRS developers came up with the idea of using QR codes to ease and speed up the scoring process. The system helps supervisors reduce human error and eases the work of users (supervisors and administrators); students can also monitor their co-curricular total score from time to time to see how well they are doing. As a result, it has benefited almost 5,000 students, supervisors and the administrators at the Student Affairs Division (HEP), as well as reducing the cost of buying co-curricular activity cards.
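The abstract does not describe the internals of the CRS; as a hedged sketch of the general approach, each activity could carry a QR code whose payload maps to a mark when scanned (the package, payload format and names below are illustrative assumptions):

```python
# Illustrative sketch only: the CRS design is not specified in the abstract.
# Assumes the third-party 'qrcode' package (pip install qrcode[pil]).
import qrcode

def make_activity_qr(activity_id: str, marks: int, path: str) -> None:
    """Encode an activity identifier and its mark value as a QR image;
    scanning this image would replace collecting a paper coupon."""
    payload = f"CRS|{activity_id}|{marks}"  # hypothetical payload format
    qrcode.make(payload).save(path)

make_activity_qr("DENGKIL-2020-CHESS-01", 5, "chess01.png")
```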
This paper discusses methodological dilemmas that arise in qualitative research, specifically in the field of education. It outlines the broad principles that underpin good qualitative research and the aspects of practice that qualitative researchers should consider when designing, conducting, and disseminating their research. The two primary methodological dilemmas are (i) the lack of objectivity and (ii) the issue of generalizability in qualitative research. The aim of this paper is to examine these dilemmas and encourage researchers to consider the relevance of such qualitative issues to their own research. These dilemmas are important considerations for others who wish to conduct qualitative research in education.
Predatory journals and conferences have little or no peer review. Their raison d'être is to make money through article processing charges and conference registration fees. Without critical evaluation, predatory journals publishing flawed results and conclusions would cloud the existing scientific literature. Predatory conferences are an offshoot of predatory publishing: they are organised not by learned societies but by profit-making event organisers. There is a need for awareness among researchers and clinicians regarding predatory publishing, and the scourge of predatory publishing and conferencing should be highlighted more often during scientific meetings and publication courses.
The finite element method is gaining acceptance for predicting the mechanical response of various loading configurations and material orientations in the failure analysis of composite laminates. Fabrication of laminate samples and experimental procedures are often expensive and time consuming, and hence impractical, especially during the initial design stage. Finite element analyses require minimal input data, and the resulting stress and strain distributions can be determined throughout each individual ply. Using ANSYS™, a commercially available finite element package, failure loads were predicted by simulating uniaxial tensile loading on an HTS40/977-2 carbon/epoxy composite with a [±45]2s lamination scheme. Two failure theories built into ANSYS™, viz. Maximum Stress and Tsai-Wu, were applied in the simulation. The stress-strain and load-extension curves from the actual testing and the FEA were then compared, and the results are in good agreement. This paper is intended for researchers who have used or are considering using ANSYS™ for the prediction of failure in composite materials.
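The Tsai-Wu criterion applied in the simulation has a standard plane-stress form; the notation below is the textbook one (strengths X_t, X_c, Y_t, Y_c, S), not values from the study:

```latex
% Tsai-Wu failure criterion, plane stress (standard textbook form).
% Failure of a ply is predicted when the left-hand side reaches 1:
F_1\sigma_1 + F_2\sigma_2 + F_{11}\sigma_1^2 + F_{22}\sigma_2^2
  + F_{66}\tau_{12}^2 + 2F_{12}\sigma_1\sigma_2 \ge 1
% Strength coefficients (X_t, X_c: longitudinal tensile/compressive
% strengths; Y_t, Y_c: transverse; S: in-plane shear strength):
F_1 = \frac{1}{X_t} - \frac{1}{X_c}, \quad F_{11} = \frac{1}{X_t X_c}, \quad
F_2 = \frac{1}{Y_t} - \frac{1}{Y_c}, \quad F_{22} = \frac{1}{Y_t Y_c}, \quad
F_{66} = \frac{1}{S^2}, \quad F_{12} \approx -\tfrac{1}{2}\sqrt{F_{11}F_{22}}
```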
Numerical analysis has been used to investigate piled embankments over soft soil since the 1990s, and several investigators have extended it to model ground improvement using soil-columns to support embankments or structures. This paper presents a numerical analysis of Pulverized Fuel Ash (PFA) column-treated peat, compared against field static-loading test results. Back analysis was performed to determine the material parameters and the stiffness of the surrounding soil and the soil-columns. Two geometrical models were used: (a) a block (Model A), and (b) a column group (Model B). The problem was analyzed using the commercially available finite element package PLAXIS 2D ver. 8.2. Both models are found to be reliable for simulating the field static-loading test on column-treated peat, with Model B showing higher stability against failure than Model A.
Forgery is an act of modifying a document, product, image or video, among other media. Video tampering detection research requires an inclusive database of video modifications. This paper discusses a comprehensive proposal to create a dataset of modified videos for forensic investigation, in order to standardize existing techniques for detecting video tampering. The primary purpose of developing and designing this new video library is for use in video forensics, where it can be associated with reliable verification using dynamic and static camera recognition. To the best of the authors' knowledge, no similar library exists in the research community. Videos were sourced from YouTube and from extensive exploration of social networking sites, observing posted videos and their feedback ratings. The video tampering dataset (VTD) comprises a total of 33 videos, divided among three video tampering categories: (1) copy-move, (2) splicing, and (3) frame swapping. Compared to existing datasets, this is a higher number of tampered videos, with longer durations: every video lasts 16 s, with a 1280×720 resolution and a frame rate of 30 frames per second, and all videos share the same format and quality (720p (HD) .avi). Both temporal and spatial video features were considered carefully during the selection of the videos, and complete information on the doctored regions is provided for every modified video in the VTD dataset. The database has been made publicly available for research on splicing, frame swapping, and copy-move tampering, providing ground truth for various video tampering detection problems, and has been utilised by many international researchers and research groups.
Among the most important aspects of conducting a clinical trial are the random sampling and random allocation of subjects. These processes are easier if done with the familiar software already used for data entry and analysis, instead of relying on other programs or methods. The objective of this article is to demonstrate random sampling and allocation using SPSS in a step-by-step manner, using examples most relevant to clinicians and researchers in the health sciences.
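The article's steps are demonstrated in SPSS; for readers without SPSS, the same two operations look like this in Python (the sampling frame and group sizes below are hypothetical):

```python
# The article demonstrates these steps in SPSS; this Python sketch shows
# the same two operations on a hypothetical sampling frame of 100 subjects.
import random

random.seed(2024)  # fix the seed so the draw is reproducible

subjects = [f"S{i:03d}" for i in range(1, 101)]

# 1) Simple random sampling: draw 20 subjects without replacement
sample = random.sample(subjects, k=20)

# 2) Random allocation: shuffle the sample, then split into two equal arms
random.shuffle(sample)
treatment, control = sample[:10], sample[10:]
print("Treatment:", treatment)
print("Control:  ", control)
```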