The role of caregivers is very important in the management of persons with dementia, and it is not uncommon for caregivers to experience psychological distress. However, the level of distress can be managed and reduced through strategic educational intervention. A systematic review was conducted by searching the Medline, ScienceDirect, Cochrane Library and EMBASE databases to provide a narrative synthesis that elaborates on the methods and outcomes of educational interventions among informal caregivers of persons with dementia. From a total of 5125 records, eight studies were selected and included in this review. The results show that educational interventions can be implemented either individually or in groups. Group intervention methods mainly focus on training programs such as workshops and lectures, as well as group-based discussions, while for individual interventions, most activities were implemented through self-learning using technology or computer-based systems. In conclusion, based on the outcomes of the studies, both methods of implementation are found to be useful in reducing the psychological distress of informal caregivers.
Postquantum cryptography for elevating security against attacks by quantum computers in the Internet of Everything (IoE) is still in its infancy. Most postquantum-based cryptosystems have longer keys and signature sizes and require more computations, spanning several orders of magnitude in energy consumption and computation time; hence, key and signature sizes are considered another aspect of security in green design. To address these issues, security solutions should migrate to advanced and potent methods of protection against quantum attacks that offer energy-efficient and faster cryptocomputations. In this context, a novel security framework, the Lightweight Postquantum ID-based Signature (LPQS), for secure communication in the IoE environment is presented. The proposed LPQS framework incorporates a supersingular isogeny curve to provide a quantum-resistant digital signature with small key sizes. To reduce the size of the keys, compressed curves are used, and the validation of the signature depends on the commutative property of the curves. The unforgeability of LPQS under an adaptively chosen message attack is proved. Security analysis and experimental validation of LPQS are performed in a realistic software simulation environment to assess its lightweight performance on embedded nodes. It is evident that the key and signature sizes of LPQS are smaller than those of existing signature-based postquantum security techniques for the IoE. It is robust in the postquantum environment and efficient in terms of energy and computation.
Free-piston engine generators (FPEGs) have huge potential to be the principal energy conversion device for generating electricity from fuel as part of a hybrid-electric vehicle (EV) powertrain system. The principal advantages lie in the fact that they are theoretically more efficient, more compact, and lighter than other competing EV hybrid and range-extender solutions (internal combustion engines, rotary engines, fuel cells, etc.). However, this potential has yet to be realized. This article details a novel dual-piston FPEG configuration, presents the full layout of a system, and provides technical evidence of a commercial FPEG system's likely size and weight. The work also presents the first results from a project that set out to realize an operational FPEG system in hardware through the development and testing of a flexible prototype test platform. The work presents the performance and control system characteristics of a first-of-a-kind system; these show great technical potential, with stable and repeatable combustion events achieved at around 700 W per cylinder and 26% indicated efficiency.
DNA computing, or more generally molecular computing, is a recent development in computation that uses biological molecules instead of traditional silicon chips. Several computational models based on different operations of DNA molecules have been developed using concepts from formal language theory. The operations of DNA molecules inspire various formal language tools, including sticker systems, grammars and automata. Recently, the grammar counterparts of Watson-Crick automata, known as Watson-Crick grammars and consisting of regular, linear and context-free variants, were defined as grammar models that generate double-stranded strings using the important feature of the Watson-Crick complementarity rule. In this research, a new variant, the static Watson-Crick linear grammar, is introduced as an extension of the static Watson-Crick regular grammar. A static Watson-Crick linear grammar is a grammar counterpart of sticker systems that generates double-stranded strings and uses rules as in a linear grammar. The main result of the paper is the determination of some computational properties of static Watson-Crick linear grammars. Next, the hierarchy between static Watson-Crick languages, Watson-Crick languages, Chomsky languages and the families of languages generated by sticker systems is presented.
A life jacket is one of the safety appliances found on board a ship that provides buoyancy and prevents drowning. Before a ship can sail, every element of vessel safety should be confirmed. Despite the establishment of both local and international standards for life jackets, there have been cases of drowning associated with the use of life jackets by passengers of open-deck passenger boats/vessels. Moreover, a deficiency of safety instructions means that passengers lack personal safety information while on board. Thus, the evaluation of the safety standards of life jackets and passenger vessels is vital for assessing the provision of life jackets on board passenger vessels with respect to the compatibility between life jacket and vessel. In this paper, a Vessel Life Jacket Compatibility Mobile App (VELIT) was developed using the Rational Unified Process (RUP) software development methodology to automate the safety assessment process based on a model called the LCI (Life Jacket Compatibility Index). The VELIT app synchronizes the safety assessment aspects and allows the user to compute the elements of the model and produce the result of the safety assessment in real time. The VELIT app is expected to be used in the maritime sector, especially in the ship safety assessment process.
The application of computers and machines to agricultural production has been one of the outstanding developments in Malaysian agriculture, especially in overcoming labour shortages in oil palm plantations. The online automated weedicide sprayer system was developed at Universiti Putra Malaysia to locate the existence and intensity of weeds in a real-time environment and to spray weedicides automatically and precisely. At the start of the spraying operation, the web camera initially captures an image of the weeds. The computer programme computes the red, green and blue (RGB) values in the form of pixel values. These values are used as reference RGB values to be compared with the RGB values of weeds captured in real time during the spraying operation. The sprayer nozzle is turned ‘on’ or ‘off’ depending on the percentage, or intensity, of the green-colour pixel values of the weeds. The sprayer valve opens the nozzle(s) when the camera detects the presence of weeds. The purpose is to reduce wastage, labour and cost, and to control environmental hazards.
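The green-pixel decision rule described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the dominance test for "green" and the trigger threshold are assumptions, and a real system would compare against the calibrated reference RGB values captured before spraying.

```python
# Hypothetical sketch of a green-intensity spraying rule; the threshold
# and the green test are assumptions, not taken from the paper.
GREEN_THRESHOLD = 0.20  # assumed fraction of green pixels that triggers the nozzle

def is_green(r, g, b):
    # a pixel counts as weed-green when the green channel dominates both others
    return g > r and g > b

def nozzle_should_open(frame):
    # frame: iterable of (r, g, b) pixel tuples from the web camera
    pixels = list(frame)
    green = sum(1 for (r, g, b) in pixels if is_green(r, g, b))
    return green / len(pixels) >= GREEN_THRESHOLD
```

In use, the controller would evaluate this rule on each captured frame and open the sprayer valve only while it returns true, which is what keeps weedicide wastage low.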
Recent rootkit-attack mitigation work has neglected the integrity of the mitigation tool itself. Both the detection and prevention arms of current rootkit-attack mitigation solutions can be credited with advancing multiple methodologies for rootkit defense, but if the defense system itself is compromised, how is the defense system to be trusted? Another deficiency not addressed is how platform integrity can be preserved without the availability of current RIDS or RIPS solutions, which operate only upon the loading of the kernel, i.e. without the availability of a trusted boot environment. To address these deficiencies, we present our architecture for solving rootkit persistence: Rootkit Guard (RG). RG is a marriage between TrustedGRUB (providing trusted boot), IMA (Integrity Measurement Architecture, serving as the RIDS) and SELinux (serving as the RIPS). TPM hardware is utilised to provide total platform integrity via storage of the aggregate of the clean snapshot of the platform OS kernel in TPM hardware registers (i.e. the PCRs), against which no software attacks have been demonstrated to date. RG solves rootkit persistence by leveraging one vital but simple strategy: mounting the rootkit defense by preventing the execution of configuration binaries or build initialisation scripts. We adopted the technique of preventing rootkit persistence by thwarting the initialisation of a rootkit's installation procedure; if the rootkit is nevertheless successfully installed, its proper deployment is prevented by thwarting its configuration. We subjected RG to 8 real-world Linux 2.6 rootkits, and RG was successful in solving rootkit persistence for all 8 evaluated rootkits. In terms of performance, RG introduced a maximum overhead of 11% and an average of 4%, hence permitting deployment in production environments.
The sample size calculation for a prevalence study requires only a simple formula. However, there are a number of practical issues in selecting values for the parameters required by the formula. Several practical issues are addressed and appropriate recommendations are given. The paper also suggests the use of a software calculator that checks the normal approximation assumption and incorporates the finite population correction in the sample size calculation.
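The "simple formula" referred to is the standard normal-approximation sample size for a single proportion, with the usual finite population correction. The sketch below is illustrative only; the function name and parameter choices are ours, not the paper's calculator.

```python
import math

def sample_size_prevalence(p, d, z=1.96, N=None):
    """Sample size for estimating a prevalence.

    p -- anticipated prevalence (0-1)
    d -- desired absolute precision (half-width of the confidence interval)
    z -- z-score for the confidence level (1.96 for 95%)
    N -- finite population size; None means an effectively infinite population
    """
    # standard normal-approximation formula: n0 = z^2 * p * (1 - p) / d^2
    n0 = (z ** 2) * p * (1 - p) / d ** 2
    if N is not None:
        # finite population correction: n = n0 / (1 + (n0 - 1) / N)
        n0 = n0 / (1 + (n0 - 1) / N)
    return math.ceil(n0)
```

For example, with p = 0.5 and d = 0.05 at 95% confidence this gives the familiar n = 385, which shrinks to 278 when the target population is only 1000. A practical calculator would additionally verify the normal approximation (e.g. that expected counts n·p and n·(1 − p) are not too small), as the abstract notes.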
A 360° twisted helical capacitance sensor was developed for holdup measurement in horizontal two-phase stratified flow. Instead of suppressing the nonlinear response, the sensor was optimized in such a way that a 'sine-like' function was displayed on top of the linear function. This design concept was implemented and verified in both software and hardware. Good agreement was achieved between the finite element model of the proposed design and the approximation model (a pure sinusoidal function), with a maximum difference of ±1.2%. In addition, the design parameters of the sensor were analysed and investigated. It was found that the error in the symmetry of the sinusoidal function could be minimized by adjusting the pitch of the helix. Experiments on air-water and oil-water stratified flows were carried out and validated the sinusoidal relationship, with maximum differences of ±1.2% and ±1.3% respectively for water holdup in the range 0.15 to 0.85. The proposed design concept may therefore be a promising alternative for the optimization of capacitance sensor design.
Behaviour models are the most commonly used input for predicting the reliability of a software system at the early design stage. A component behaviour model reveals the structure and behaviour of the component during the execution of system-level functionalities. There are various challenges related to component reliability prediction at the early design stage based on behaviour models. For example, most of the current reliability techniques do not provide fine-grained sequential behaviour models of individual components and fail to consider the loop entry and exit points in the reliability computation. Moreover, some of the current techniques do not tackle the problem of operational data unavailability and the lack of analysis results that can be valuable for software architects at the early design stage. This paper proposes a reliability prediction technique that, pragmatically, synthesizes system behaviour in the form of a state machine, given a set of scenarios and corresponding constraints as input. The state machine is utilized as a base for generating the component-relevant operational data. The state machine is also used as a source for identifying the nodes and edges of a component probabilistic dependency graph (CPDG). Based on the CPDG, a stack-based algorithm is used to compute the reliability. The proposed technique is evaluated by a comparison with existing techniques and the application of sensitivity analysis to a robotic wheelchair system as a case study. The results indicate that the proposed technique is more relevant at the early design stage compared to existing works, and can provide a more realistic and meaningful prediction.
Remote measurements of the radiation level at an identified location are not only important for collecting data or monitoring the radiation level per se, but also crucial for workers who deal with radiation sources. A device for checking the on-site radiation level was developed long ago under the name Geiger-Muller and is widely known as a Geiger counter. The output reading can be seen on the device on-site and in real time. Nowadays, with the fast evolution of computer and networking technology, those readings can be taken not only in real time but also from a remote location, enabling workers to enter risky areas more safely. The collected readings can also be analyzed to predict future trends. The data is transferred from the monitoring devices to a server through a network. This paper discusses several critical issues in the design, implementation and deployment relating to the devices, interface programs, hardware and software that allow all parameters, such as the radiation level readings and the timestamp of the data logging, to be collected and stored in central storage for further processing. The compatibility issues arising from the technology change from the previous system are also discussed. The system has many advantages over the previous system and the conventional method of area monitoring in terms of sustainability and availability.
The aim of the study was to evaluate the post-polymerization of a resin composite by measuring nano-hardness (H), Young's modulus (E) and degree of conversion (DC) using nanoindentation and micro-Raman spectroscopy. For this purpose, a computer-controlled NanoIndenter™ and a Renishaw 1000 Raman spectrometer fitted with an Olympus microscope attachment, operated at 638 nm, were used. A light-activated resin composite (Z250, 3M ESPE) was used in this study. Sub-groups (n=3) of specimens were irradiated for 20, 30 and 40 s. All samples for nanoindentation were polished metallographically, and typically 50 nanoindentations were performed per specimen. After curing and polishing, half of the samples were tested immediately (Group 1); the others were tested after being stored dry at 37 °C for 7 days (Group 2) to allow scope for post-polymerization. H values ranged from 1.08 to 1.40 GPa for Group 1, and from 1.64 to 1.71 GPa for Group 2. E values in Group 1 ranged from 19.60 to 19.94 GPa and, for Group 2, from 21.42 to 22.05 GPa. DC values ranged from 55 to 66.39% and from 60.90 to 66.47% for Groups 1 and 2 respectively. All values obtained showed significant differences between Groups 1 and 2 (p
Data transmission in fieldwork, especially in the industrial, gas and chemical sectors, is of paramount importance to ensure data accuracy and delivery time. The development of a wireless detector system for remote data acquisition, to be applied in industrial fieldwork, is described in this paper. The wireless communication applied in the project is a viable and cost-effective method of transmitting data from the detector to a laptop on site, facilitating automatic data storage and analysis, and can be used in various applications such as column scanning. The project involves hardware design for the detector and electronics, as well as programming for the control board and user interface. A prototype wireless gamma scintillation detector was developed with the capability of transmitting data to a computer via radio frequency (RF) and recording the data within the 433 MHz band at a baud rate of 19200.
Skin detection has gained popularity and importance in the computer vision community. It is an essential step in important vision tasks such as the detection, tracking and recognition of faces, hand segmentation for gesture analysis, person identification, video surveillance and the filtering of objectionable web images. All these applications are based on the assumption that the regions of human skin have already been located. In the recent past, numerous techniques for skin colour modeling and recognition have been proposed. The aims of this paper are to compile the published pixel-based skin colour detection techniques, to describe their key concepts, and to summarize their advantages, disadvantages and characteristic features.
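As an illustration of the pixel-based family surveyed here, one well-known explicit RGB rule (a uniform-daylight variant often cited in the skin detection literature) classifies a pixel by a handful of channel inequalities. This sketch is only one representative of the many rules the survey compares, not the paper's own method.

```python
def is_skin_rgb(r, g, b):
    # classic explicit RGB skin rule (uniform daylight variant) from the
    # pixel-based skin detection literature; thresholds are the commonly
    # cited values, not tuned here
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15   # enough colour spread
            and abs(r - g) > 15                     # red and green separated
            and r > g and r > b)                    # red channel dominates
```

Rules like this are fast and need no training data, which is why they serve as a common baseline, but they are brittle under illumination change, which motivates the parametric and non-parametric colour models the survey also covers.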
Muhammad Akmal Asraf Mohamad Sharom, Zainol Abidin Ibrahim, Wan Ahmad Tajuddin Wan Abdullah, Megat Harun Al Rashid Megat Ahmad, Faridah Mohamad Idris, Abdul Aziz Mohamed
Small angle neutron scattering (SANS) is used for probing the microstructure of materials in the 1-100 nm range. The neutrons scattered from the target material were detected by a 128 × 128 array, area-sensitive, helium gas-filled proportional counter, known as a Position Sensitive Detector (PSD). The SANS facility at the Malaysian Nuclear Agency has been under development since 1995. The data acquisition system of this prototype facility consists of the two-dimensional Position Sensitive Detector (2D-PSD) and a neutron monitor as data grabbers, a TDC histogram module as the memory processor, two ORTEC 994 units as counter and timer, and a computer as the data acquisition controller via the GPIB interfacing protocol. This paper describes the development of the GPIB interface for data acquisition of the SANS instrument on a Windows-based platform. The GPIB device interface and graphical user interface (GUI) for this data acquisition system were developed using WaveMetrics Igor software.
In this study, a numerical simulation of a mixing vessel agitated by a six-bladed Rushton turbine was carried out to investigate the effects of key parameters on the mixing process. The study is intended to screen the potential parameters that affect the optimization process and to provide detailed insight into the process. A three-dimensional, steady-state flow simulation was performed using the fully predictive Multiple Reference Frame (MRF) technique for the impeller and tank geometry. Process optimization is always used to ensure that the optimum conditions are fulfilled to meet industry needs (i.e. increased profit, lower cost, higher yields, etc.). In this study, the recommended speeds to accelerate optimization are 100, 150 and 200 rpm, and the recommended clearances are 50, 75 and 100 mm, for the dual Rushton impeller. Computational fluid dynamics (CFD) was therefore introduced to screen the suitable parameters efficiently and to accelerate optimization.
All relevant and essential data of an existing vehicle seat assembly line, such as the operating time and processes, material handling system, workstation layout, bill of materials, equipment and hand tools, were collected and analyzed. The time standards for each of the vehicle seat assembly elements were established using work study techniques. A simulation approach was used to determine the productivity and efficiency of the existing and proposed lines. The simulation technique was also used to identify bottlenecks in both the existing and proposed systems. A comparison of the existing and proposed assembly lines in terms of productivity and efficiency is also highlighted.
Computer vision is applied in many software systems and devices. The detection and reconstruction of the human skeletal structure is one area of interest, where the camera identifies the human body parts and constructs the joints of the person standing in front of it. Three-dimensional pose estimation has been solved using various learning approaches, such as Support Vector Machines and Gaussian processes. However, difficulties are encountered in cluttered scenarios, requiring additional input data, such as silhouettes, or controlled camera settings. This paper focuses on estimating the three-dimensional pose of a person without requiring background information, in a way that is robust to camera variations. Each joint has a three-dimensional position and an orientation matrix with respect to the sensor. Matlab Simulink was utilized to provide communication tools with the depth camera, using a Kinect device for skeletal detection. Results of skeletal detection using the Kinect sensor are analysed to measure the ability to detect the skeletal structure accurately, and it is shown that the system is able to detect a human skeleton performing non-complex basic motions of daily life.
Demonstration of the access cavity preparation procedures to dental students is
challenging due to the limited operating field and detailed nature of the procedures. It is especially
difficult to visualize how instruments are functioning inside the pulp space. The aim of this study
was to develop and compare two different views of video demonstration in teaching access cavity
preparation.
Feature descriptors for image retrieval have emerged as an important part of computer vision and image analysis applications. In recent decades, researchers have used algorithms to generate effective, efficient and stable methods in image processing, particularly for shape representation, matching and leaf retrieval. Existing leaf retrieval methods are insufficient to achieve an adequate retrieval rate due to the inherent difficulties of the shape descriptors available for different leaf images. Shape analysis and comparison for plant leaf retrieval are investigated in this study. Different image features may result in different interpretations of significance, even for almost similarly shaped images. A new image transform, known as the harmonic mean projection transform (HMPT), is proposed in this study as a feature descriptor method for extracting leaf features. By using the harmonic mean function, the signal carrying information of greater importance is emphasized in signal acquisition. Features are extracted from the whole region of the selected image, with all pixels considered. Results indicate better classification rates when compared with other classification methods.
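The abstract does not define the HMPT itself, so as orientation only, here is a minimal sketch of one plausible reading: projecting an image onto harmonic means of its rows and columns. The function names, the row/column projection choice, and the epsilon guard are all our assumptions, not the paper's transform.

```python
def harmonic_mean(values, eps=1e-9):
    # harmonic mean n / sum(1/x); eps guards against zero-valued pixels
    return len(values) / sum(1.0 / (v + eps) for v in values)

def hmpt_features(image):
    # image: 2D list of grayscale pixel intensities; every pixel of the
    # whole region contributes, as the abstract describes.
    # Project each row and each column to its harmonic mean and
    # concatenate both projections into one feature vector.
    rows = [harmonic_mean(r) for r in image]
    cols = [harmonic_mean(c) for c in zip(*image)]
    return rows + cols
```

The harmonic mean is dominated by the smallest values in a set, which is one way low-intensity (high-importance) structure such as leaf veins could be emphasized relative to an arithmetic mean projection.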