This paper considers a Monte Carlo simulation-based method for estimating cycle stocks (production lot-sizing stocks) in a typical batch production system, where a variety of products is scheduled for production at determined periods of time. In the method, delivery time is defined as the maximum of the lead times and pre-assembly processing times of the product's raw materials. The product's final assembly cycle and delivery time, obtained from the production schedule and from supply chain simulation respectively, were both used to estimate the product's demand distribution over the total duration. Efficient random-variate generators were applied to model the lead times of the supply chain's stages. To support the performance reliability of the proposed method, a real case study is conducted and numerically analyzed.
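The simulation idea above can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: lognormal stage lead times, the three-material product, the demand rate, and all parameter values are hypothetical assumptions chosen for the example.

```python
import random

random.seed(42)

def sample_delivery_time(stage_params, n=10_000):
    """Delivery time = max over raw materials of (lead time + pre-assembly
    processing time). stage_params is a list of (log-mean, log-sd, processing
    time) tuples; the lognormal choice is an illustrative assumption."""
    samples = []
    for _ in range(n):
        samples.append(max(random.lognormvariate(mu, sigma) + proc
                           for mu, sigma, proc in stage_params))
    return samples

# Hypothetical three-material product: (log-mean, log-sd, processing days)
stages = [(1.0, 0.3, 2.0), (1.5, 0.2, 1.0), (0.8, 0.5, 3.0)]
delivery_times = sample_delivery_time(stages)
mean_dt = sum(delivery_times) / len(delivery_times)

# Demand over the total duration (final assembly cycle + delivery time),
# assuming a constant daily demand rate purely for illustration.
assembly_cycle = 5.0   # days
daily_demand = 40.0    # units/day
cycle_stock_estimate = daily_demand * (assembly_cycle + mean_dt)
```

In practice the demand over the total duration would itself be a distribution (demand rate being random as well), from which a safety or cycle stock level can be read off at a chosen service level.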
The aim of the study was to validate the Malay version of the General Health Questionnaire (GHQ-12) in patients with psychiatric morbidity secondary to urological disorders. Validity and reliability were studied in patients with lower urinary tract symptoms (LUTS) and patients without LUTS. A high degree of internal consistency was observed for each of the 12 items and for the total scores (Cronbach's alpha values of 0.50 and higher, and 0.65, respectively). The test-retest correlation coefficient for the 12 item scores was highly significant, and the intraclass correlation coefficient was high (ICC = 0.47 and above). A significant difference between baseline and post-treatment scores was observed across 3 items in the surgical group. The Mal-GHQ-12 is suitable, reliable, valid, and sensitive to clinical change in the Malaysian population.
Numerous applications of artificial olfaction arising from research in many branches of science have generated considerable interest in the enhancement of these systems. In this paper, we propose an architecture suitable for critical applications, such as medical diagnosis, where reliability and precision are paramount. The proposed architecture is able to tolerate failures in the sensors of the array. In this study, the discriminating ability of the proposed architecture in detecting complex odors, as well as its performance when encountering sensor failure, were investigated and compared with the generic architecture. The results demonstrated that, with the proposed architecture, the performance of the artificial olfactory system in the healthy mode was identical to that of the classic structure. In the faulty situation, however, the proposed architecture retained high identification ability for odor samples, while the generic architecture performed very poorly. Based on these results, high odor identification could be achieved through the developed artificial olfactory system using the proposed architecture.
In this study, a double-negative triangular metamaterial (TMM) structure, which exhibits a resonant electric response at microwave frequencies, was developed by etching two concentric triangular rings of conducting material. A finite-difference time-domain method in conjunction with the lossy-Drude model was used in this study. Simulations were performed using the CST Microwave Studio. The specific absorption rate (SAR) reduction technique is discussed, and the effects of the position of attachment, the distance, and the size of the metamaterials on the SAR reduction are explored. The performance of the double-negative TMMs in cellular phones was also measured in the cheek and tilted positions using the COMOSAR system. The TMMs achieved a 52.28% reduction for the 10 g SAR. These results provide a guideline for determining the triangular metamaterial design with the maximum SAR-reducing effect for a mobile phone.
BACKGROUND: The Perceived Stress Scale 10 (PSS-10) is a validated and reliable instrument to measure global levels of perceived stress. This study aims to assess the internal consistency, reliability, and factor structure of the Malay version of the PSS-10 for use among medical students.
METHODS: The original English version of the PSS-10 was translated and back-translated into Malay language. The Malay version was distributed to 242 Bachelor of Medical Science students in a private university in Malaysia. Test-retest reliability was assessed in 70 students. An exploratory principal component factor analysis with varimax rotation was performed. Reliability was tested using the intraclass correlation coefficient (ICC).
RESULTS: All 242 students participated in the initial questionnaire study (validity and factor structure), and 70 students participated in the test-retest reliability of the study. Exploratory factor analysis yielded 2 factors that accounted for 57.8% of the variance. Cronbach's alpha coefficients for the 2 factors were 0.85 and 0.70, respectively. The reliability test showed an ICC of 0.82 (95% CI: 0.70, 0.89).
CONCLUSION: The Malay version of the PSS-10 showed adequate psychometric properties. It is a useful instrument for measuring stress among medical students in Malaysia.
KEYWORDS: Malaysia; medical; psychological; reliability and validity; stress; students
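The internal-consistency statistic reported above (Cronbach's alpha) can be computed directly from item-by-respondent scores. Below is a minimal, dependency-free sketch; the 10-item, 6-respondent toy scores are entirely hypothetical and are not the study's data.

```python
import random

random.seed(0)

def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances)/variance(totals)).
    `items` is a list of item-score columns (one list of respondent scores
    per item)."""
    k = len(items)
    n = len(items[0])

    def svar(xs):  # sample variance (n-1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(svar(col) for col in items) / svar(totals))

# Toy data: 10 items, 6 respondents, every item tracking one underlying
# trait plus noise, so the scale should be highly internally consistent.
trait = [0, 1, 2, 3, 4, 3]
items = [[t + random.gauss(0, 0.5) for t in trait] for _ in range(10)]
alpha = cronbach_alpha(items)
```

Because all toy items follow the same underlying trait, alpha comes out close to 1; uncorrelated items would drive it toward 0.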
A number of techniques have been proposed during the last three decades for noise variance and signal-to-noise ratio (SNR) estimation in digital images. While some methods have shown reliability and accuracy in SNR and noise variance estimation, other methods depend on the nature of the images and perform well only on a limited number of image types. In this article, we demonstrate the accuracy and efficiency of the image noise cross-correlation estimation model, versus other existing estimators, when applied to different types of scanning electron microscope images.
INTRODUCTION AND OBJECTIVE:
Most of the important variables measured in medicine are numerical or continuous in nature. New instruments and tests are constantly being developed to measure various variables, with the aim of providing cheaper, non-invasive, more convenient, and safer methods. When a new method of measurement or instrument is invented, its quality has to be assessed. Agreement and reliability are both important parameters in determining the quality of an instrument. This article discusses some issues related to method comparison studies in medicine for the benefit of medical professionals and researchers.
This narrative review covers the most common statistical methods used to assess the agreement and reliability of medical instruments that measure the same continuous outcome. The two methods discussed in detail are the Bland-Altman limits of agreement and the intraclass correlation coefficient (ICC). The article also discusses issues related to method comparison studies, including the application of inappropriate statistical methods, the use of multiple statistical methods, and the strengths and weaknesses of each method. The importance of appropriate statistical methods in the analysis of agreement and reliability in medicine is also highlighted.
There is no single perfect method to assess agreement and reliability; however, researchers should be aware of the inappropriate methods they should avoid when analysing data in method comparison studies. Inappropriate analysis will lead to invalid conclusions, and thus a supposedly validated instrument might not in fact be accurate or reliable. Consequently, this will affect the quality of care given to patients.
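The Bland-Altman limits of agreement discussed above are straightforward to compute: the mean of the paired differences (the bias) plus or minus 1.96 standard deviations of those differences. A minimal sketch follows; the paired readings are hypothetical values for illustration only.

```python
import math

def bland_altman(method_a, method_b):
    """Return (bias, lower limit, upper limit) for paired measurements
    from two methods: bias +/- 1.96 * SD of the differences."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired readings from two instruments measuring the same
# continuous outcome (e.g. blood pressure in mmHg)
a = [120, 130, 142, 118, 125, 133, 140, 128]
b = [122, 128, 145, 115, 127, 130, 138, 129]
bias, lower, upper = bland_altman(a, b)
```

If 95% of the differences fall within the limits and the limits are clinically acceptable, the two methods may be used interchangeably; the judgment of "acceptable" is clinical, not statistical.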
The objective of this paper is to report on the reliability and validity of a knowledge, attitude and practice instrument used among young primary school children. The instrument was developed as an evaluation tool in the HELIC study and consisted of 23 knowledge, 11 attitude and 10 practice items. A total of 335 Year 2 students from 4 randomly selected primary schools in Selangor and Wilayah Persekutuan participated in the HELIC study. Students were divided into small groups, and an enumerator verbally administered the instrument to each group. Reliability for each construct (knowledge, attitude and practice) was estimated as the item-to-total score correlation and internal consistency (Cronbach's alpha). Construct validity was determined through factor analysis and Pearson correlation. Results indicated that 3 attitude and 3 practice items did not correlate significantly with the total score (p > 0.05); however, the deletion of these items did not significantly alter the Cronbach's alpha coefficients. Internal consistency was good for the knowledge construct (α = 0.68) but low for the attitude (α = 0.37) and practice (α = 0.36) constructs. Based on factor analysis, 5-factor solutions emerged for knowledge and 4-factor solutions for attitude and practice. Sufficient variance was obtained for the factors in knowledge (51.7%), attitude (51.2%) and practice (51.0%). There were also significant positive correlations among the constructs. In conclusion, the instrument was valid and reliable, especially for the knowledge construct. Further improvements, particularly on the attitude and practice constructs, are needed for the instrument to be an effective assessment or evaluation tool in various settings.
Behaviour models are the most commonly used input for predicting the reliability of a software system at the early design stage. A component behaviour model reveals the structure and behaviour of the component during the execution of system-level functionalities. There are various challenges related to component reliability prediction at the early design stage based on behaviour models. For example, most of the current reliability techniques do not provide fine-grained sequential behaviour models of individual components and fail to consider the loop entry and exit points in the reliability computation. Moreover, some of the current techniques do not tackle the problem of operational data unavailability and the lack of analysis results that can be valuable for software architects at the early design stage. This paper proposes a reliability prediction technique that, pragmatically, synthesizes system behaviour in the form of a state machine, given a set of scenarios and corresponding constraints as input. The state machine is utilized as a base for generating the component-relevant operational data. The state machine is also used as a source for identifying the nodes and edges of a component probabilistic dependency graph (CPDG). Based on the CPDG, a stack-based algorithm is used to compute the reliability. The proposed technique is evaluated by a comparison with existing techniques and the application of sensitivity analysis to a robotic wheelchair system as a case study. The results indicate that the proposed technique is more relevant at the early design stage compared to existing works, and can provide a more realistic and meaningful prediction.
Setting a question paper for a test, quiz, or examination is one of a teacher's tasks. The factors usually taken into consideration in this task are the level of difficulty of the questions and the level of the students' ability. In addition, teachers also have to consider the number of questions, which has an impact on the examination. This research applies model-based test theory to study the confidence intervals for the projected number of items of a test, given the reliability of the test, the difficulty of the questions, and the students' ability. Using simulated data, the confidence intervals of the projected number of items were examined, and their probability coverage and length were used to evaluate them. The results showed that normally distributed data with a variance-component ratio of 4:1:5 and a reliability of 0.80 gave the best confidence interval for the projected number of items.
Stone Mastic Asphalt (SMA) is a type of asphalt mixture that is highly dependent on the method of compaction compared with conventional Hot Mix Asphalt (HMA) mixtures. A suitable laboratory compaction method that can closely simulate field compaction is evidently needed, as the future trend in the asphalt pavement industry all over the world is gradually changing over to SMA due to its excellent performance characteristics. This study was conducted to evaluate SMA slab mixtures compacted using the newly developed Turamesin roller compactor, designed to cater for laboratory compaction under field simulation conditions. As Turamesin is a newly developed compaction device, there is a need to evaluate the compacted slab dimensions (length, width, and thickness), to analyze the consistency of the measured parameters so as to verify the homogeneity of the compacted slabs, and to determine the reliability of Turamesin. A total of 15 slabs from three different types of asphalt mixtures were compacted, measured, and analyzed for consistency in terms of length, width, and thickness. Based on the study conducted, the compacted slabs were found to have problems in terms of an improperly compacted section of about 30 mm in length at both ends of the slabs and differences in thickness between the left and right sides of the slab, which were due to unequal load distribution from the roller compactor. The results obtained from this study have led to the development of Turamesin as an improved laboratory compaction device.
The plurality voter is one of the commonest voting methods for decision making in highly reliable applications in which the reliability and safety of the system are critical. To resolve the problem associated with the sequential plurality voter in dealing with a large number of inputs, this paper introduces a new generation of plurality voter based on parallel algorithms. Since parallel algorithms normally have high processing speed and are especially appropriate for large-scale systems, they are used here to obtain a new parallel plurality voting algorithm using (n/log n) processors on an EREW shared-memory PRAM. Asymptotic analysis of the proposed algorithm demonstrates that it has a time complexity of O(log n), which is less than the O(n log n) time complexity of the sequential plurality algorithm.
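For reference, the sequential plurality voter that the parallel algorithm improves upon can be sketched as follows. This is an illustrative baseline, not the paper's PRAM algorithm; note that hash-based counting runs in O(n) expected time, while the O(n log n) bound cited for the sequential voter corresponds to a comparison/sort-based variant.

```python
from collections import Counter

def plurality_vote(outputs):
    """Sequential plurality voter: return the value produced by the
    largest group of redundant module outputs, together with the size
    of that group (its support)."""
    value, support = Counter(outputs).most_common(1)[0]
    return value, support

# Five redundant modules disagree; the largest agreeing group wins.
value, support = plurality_vote([3, 3, 5, 3, 7])
```

Unlike a majority voter, a plurality voter can still produce an output when no value is backed by more than half of the modules, which is why it is favoured in large-scale redundant systems.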
The 1 MW TRIGA MARK II research reactor at the Malaysian Nuclear Agency achieved initial criticality on June 28, 1982. The reactor is designed to effectively support various fields of basic nuclear research, manpower training, and the production of radioisotopes. This paper describes the reactor parameter calculations for the PUSPATI TRIGA REACTOR (RTP), focusing on the application of the developed 3D reactor model for criticality calculation, analysis of power and neutron flux distribution, and a depletion study of TRIGA fuel. The 3D continuous-energy Monte Carlo code MCNP was used to develop a versatile and accurate full model of the TRIGA reactor. The consistency and accuracy of the developed RTP MCNP model were established by comparing calculations with the experimental results and with the TRIGLAV code. The MCNP and TRIGLAV criticality predictions of the critical core loading are in very good agreement with the experimental results. Power peaking factors calculated with TRIGLAV are systematically higher than those from MCNP, but the trends are the same. Depletion calculations by both codes show differences, especially at high burnup. The results are conservative and can be applied to show the reliability of the MCNP code and the model, both for design and verification of the reactor core and for future calculation of its neutronic parameters.
Database forensics (DBF) is a widespread area of knowledge. It has many complex features and is well known amongst database investigators and practitioners. Several models and frameworks have been created specifically to enable knowledge sharing and effective DBF activities. However, these are often narrow in focus and address only specific database incident types. We analysed 60 such models in an attempt to uncover how much DBF activities really have in common even when the actions vary, and then generated a unified abstract view of DBF in the form of a metamodel. We identified and extracted common concepts and reconciled the concept definitions to propose the metamodel, and applied a metamodelling process to ensure that the metamodel is comprehensive and consistent.
Several incidents around the world involving power failures caused by unscheduled line outages have been identified as major contributors to power failures and cascading blackouts in electric power systems. With the advancement of computer technologies, artificial intelligence (AI) has been widely accepted as one method that can be applied to predict the occurrence of unscheduled disturbances. This paper presents the development of an automatic contingency analysis and ranking (ACAR) algorithm for application with an Artificial Neural Network (ANN). The ANN is developed to predict the post-outage severity index from a set of pre-outage data. Data were generated using the newly developed ACAR algorithm, and tests were conducted on the IEEE 24-bus Reliability Test System. The results showed that the developed technique is feasible for practical implementation, and agreement was achieved among the results obtained from the tests. The developed ACAR can be utilised for further testing and implementation on other IEEE RTS test systems, particularly in systems that require fast computation time. The developed ANN, in turn, can be used to predict the post-outage severity index, so that system stability can be evaluated.
This study reviews maintenance-related issues arising during the design and construction stages within the construction industry. The paper highlights the causes of errors made during the design and construction stages, their impact during the operation/production/occupancy stage, and the maintenance costs associated with them. The study identifies mistakes in the working processes within the design and construction stages that lead to errors affecting the durability, performance, reliability, maintainability, availability, and safety of systems. The paper presents a comprehensive review of the published literature, journals, and technical papers in related areas of the construction field. The review highlights new approaches and decision frameworks linking designers and construction personnel that could reduce the errors and defects in construction that lead to maintenance and asset management issues. The factors of accessibility, materials, design, and documentation standardization are discussed thoroughly for a better understanding of how to improve maintenance and physical asset management in project commissioning.
Semiconductor metal oxide (SMO) has been widely used as a sensing layer for gas detection, and much research has been performed to enhance sensing performance, including sensitivity, reliability, and selectivity. Electrical sensors that use resistivity as the sensing indicator are popular and well established. However, optical-based sensing for gas detection remains much less explored. By integrating optical sensing with SMO, such a sensor offers a good alternative that overcomes some of the drawbacks of electrical sensors.
Negative bias temperature instability (NBTI) is the most concerning reliability issue in CMOS devices as CMOS technologies are scaled down. The NBTI effect contributes to p-MOSFET device degradation, which in turn reduces the performance and reliability of CMOS circuits. This paper presents a reliability simulation study based on the R-D model for a CMOS inverter circuit. The HSPICE MOSRA model, together with the Predictive Technology Model (PTM), was used to incorporate the NBTI model into the circuit reliability simulation for different technology nodes. PTM High Performance (HP) models for the 16 nm, 22 nm, 32 nm, and 45 nm nodes were used, and the atomic-hydrogen-based model was integrated into the simulation. The results show that, in a CMOS inverter circuit, the threshold voltage shift of the p-MOSFET under NBTI stress increased over the years of operation, rising by up to 45.1% after 10 years. The time exponent of the threshold voltage shift, n ≈ 0.232, indicates that the defect mechanism contributing to the degradation is atomic hydrogen. The propagation delay increased by up to 19.5% from the start of operation to 10 years of operation, and the delay increase grew year by year as the technology nodes became smaller. These findings are important for understanding reliability issues in advanced technology nodes in CMOS circuit studies.
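A time exponent of n ≈ 0.232 implies the familiar power-law degradation form ΔVth(t) = A·t^n. The sketch below illustrates that form only; the prefactor A is a hypothetical fitting constant and is not taken from the study.

```python
def delta_vth(t_hours, A=1e-3, n=0.232):
    """Power-law NBTI degradation: dVth(t) = A * t**n.
    n ~ 0.232 is the time exponent reported in the study;
    A is a hypothetical fitting constant for illustration."""
    return A * t_hours ** n

one_year = 365 * 24  # hours
shift_1y = delta_vth(one_year)
shift_10y = delta_vth(10 * one_year)
ratio = shift_10y / shift_1y  # equals 10**0.232 regardless of A
```

A useful property of the power law is that the ratio of shifts at two times depends only on n, so the exponent can be read directly off a log-log plot of measured ΔVth versus stress time.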
State estimation plays a vital role in the security analysis of a power system. The weighted least squares method is one of the conventional techniques used to estimate the unknown state vector of the power system. The existence of bad data can distort the reliability of the estimated state vector. A new algorithm based on the technique of quality control charts is developed in this paper for detection of bad data. The IEEE 6-bus power system data are utilised for the implementation of the proposed algorithm. The output of the study shows that this method is practically applicable for the separation of bad data in the problem of power system state estimation.
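The weighted least squares step described above solves x = (HᵀWH)⁻¹HᵀWz, after which the measurement residuals feed the bad-data test. A minimal linear sketch follows for a toy two-state, three-measurement system; the matrices and values are illustrative assumptions, not the IEEE 6-bus data used in the paper.

```python
# Toy linear measurement model z = H x + noise, with diagonal weights
# w = 1/variance for each measurement (all values hypothetical).
H = [[1.0, 0.0],
     [0.0, 1.0],
     [1.0, 1.0]]
z = [1.02, 0.98, 2.05]
w = [100.0, 100.0, 50.0]

m, s = len(z), len(H[0])  # measurements, states

# Normal equations: A = H^T W H, b = H^T W z
A = [[sum(w[k] * H[k][i] * H[k][j] for k in range(m)) for j in range(s)]
     for i in range(s)]
b = [sum(w[k] * H[k][i] * z[k] for k in range(m)) for i in range(s)]

# Solve the 2x2 system directly by Cramer's rule
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
x = [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
     (b[1] * A[0][0] - b[0] * A[1][0]) / det]

# Residuals r = z - H x; bad-data detection (e.g. the paper's
# control-chart approach) flags measurements whose residuals fall
# outside statistical control limits.
residuals = [z[k] - sum(H[k][j] * x[j] for j in range(s)) for k in range(m)]
```

Real power system state estimation is nonlinear (H becomes a Jacobian updated in a Gauss-Newton loop), but the residual-screening logic for bad data is the same at each iteration.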