AIM: To identify the effect of triage training on the skills and accuracy of triage decisions for adult trauma patients.
METHOD: A randomized controlled trial was conducted in ten emergency departments of public hospitals. A total of 143 registered nurses and medical officer assistants who performed triage roles were recruited into a control group (n = 74) and an intervention group (n = 69). The skill and accuracy of triage decisions were measured two weeks and four weeks after the intervention group received the intervention.
RESULTS: There was a significant effect on the skill of triage decision-making between the control and the intervention groups p
Method: EEG signals were recorded for seven simple tasks using the designed data acquisition procedure. These seven tasks are intended to control wheelchair movement and to support interaction with others through an oddball paradigm. The proposed system records EEG signals from 10 individuals at eight channel locations while each individual executes the seven mental tasks. The acquired brainwave patterns were processed to eliminate noise, including artifacts and powerline interference, and were then partitioned into six frequency bands. The proposed cross-correlation procedure then employs the segmented frequency bands from each channel to extract features: cross-correlation coefficients are obtained in the frequency domain from consecutive frame samples, and the statistical measures (minimum, mean, maximum, and standard deviation) are derived from the cross-correlated signals. Finally, the extracted feature sets were validated with an online sequential extreme learning machine (OS-ELM) algorithm.
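As an illustration only, a minimal sketch of this feature-extraction step is given below, assuming time-domain cross-correlation of consecutive frames from one band-limited channel (the published procedure computes the coefficients in the frequency domain); the frame length, synthetic data, and function name are placeholders, not the authors' implementation.

import numpy as np
from scipy.signal import correlate

def cross_corr_features(frames):
    """Summarise cross-correlation of consecutive frames with four statistics.

    frames: 2-D array (n_frames, frame_len) holding one frequency band of one channel.
    Returns one [min, mean, max, std] feature vector per consecutive frame pair.
    """
    feats = []
    for prev, curr in zip(frames[:-1], frames[1:]):
        xc = correlate(prev, curr, mode="full")  # cross-correlation coefficients
        feats.append([xc.min(), xc.mean(), xc.max(), xc.std()])
    return np.asarray(feats)

# Synthetic data standing in for one segmented frequency band (10 frames x 256 samples)
rng = np.random.default_rng(0)
band_frames = rng.standard_normal((10, 256))
print(cross_corr_features(band_frames).shape)  # (9, 4)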
Results and Conclusion: The classification results were compared across the feature sets, and the μ(r) feature set based on the cross-correlated signals achieved the best performance, with a recognition rate of 91.93%.
MATERIALS AND METHODS: A total of 30 healthy female Boer-cross goats aged 4 months, with an average initial live body weight (BW) of 20.05±0.5 kg, were used in an on-farm feeding trial to evaluate growth performance in preparation for breeding. The experimental goats were divided into two groups of 15 animals each, labeled control and treatment, and kept under an intensive farming system. Goats in the control group were fed the farmer's normal routine feeding protocol, while goats in the treatment group were fed a new feed formulation. Throughout the experimental period, on-farm monitoring and data collection were carried out. Initial BW and body condition score (BCS) were recorded before the start of the experiment, while final BW and BCS were obtained after 7 months of the experimental period. Average daily gain (ADG) was calculated at the end of the experiment. Data on BW, ADG, and BCS were recorded from both groups every 2 weeks and reported monthly. Feed intake was 2.8 kg/animal/day for the control group, as practiced by the farmer, and 3.2 kg/animal/day of the new feed formulation for the treatment group.
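For illustration, a minimal sketch of the ADG calculation is shown below; the weights and trial length used here are placeholder values, not the recorded trial data.

def average_daily_gain(initial_bw_kg, final_bw_kg, days_on_trial):
    """Average daily gain in g/day from start and end body weights."""
    return (final_bw_kg - initial_bw_kg) * 1000.0 / days_on_trial

# Placeholder example: a goat gaining 19 kg over a 210-day (about 7-month) trial
print(round(average_daily_gain(20.0, 39.0, 210), 1))  # about 90.5 g/day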
RESULTS: After 7 months of the experimental period, final BW showed an improvement in the treatment group (39.1±1.53 kg) compared with the control group (32.3±1.23 kg). The ADG in the treatment group was also promising compared with the control group: goats in the treatment group attained a significantly better ADG than the control group, at 126.7 g/day versus 83.3 g/day, respectively. For BCS, the treatment group showed an improvement, with 86.67% (13 out of 15) of the group reaching a BCS ≥3 (1-5 scoring scale), compared with only 66.67% (10 out of 15) of the control group.
CONCLUSION: Implementation of a proper feeding program, as shown in the treatment group, gives promising results for improving the growth performance of replacement breeder goats and can be adopted by farmers to improve farm productivity.
METHOD: A pre-school was provided with an interactive hand hygiene application for two months. The device features an online administrator dashboard for data collection and for monitoring the children's hand washing steps and duration. A good hand wash is defined as one that comprises all of the steps outlined in the World Health Organization (WHO) guidelines.
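A minimal sketch of how a recorded wash could be scored against this definition is given below; the step names, the 20-second duration threshold, and the data structure are assumptions for illustration (the abstract's definition mentions only the steps), not the application's actual schema.

# Assumed WHO-style step list; the real application may name or group steps differently.
WHO_STEPS = {
    "palm_to_palm", "back_of_hands", "between_fingers",
    "back_of_fingers", "thumbs", "fingernails", "wrists",
}

def is_good_wash(steps_performed, duration_s, min_duration_s=20):
    """Count a wash as 'good' when all WHO steps are covered and it lasts long enough."""
    return WHO_STEPS.issubset(steps_performed) and duration_s >= min_duration_s

print(is_good_wash({"palm_to_palm", "between_fingers"}, 25))  # False: steps missing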
RESULTS: The prototype captured 6882 hand washes, with an average of 20.85 seconds per hand wash. Washing hands palm to palm was the most frequently performed step (79.9%), whereas scrubbing the fingernails and wrists were the least frequently performed steps (56%).
CONCLUSIONS: The device is a good prototype for educating, stimulating and monitoring good hand hygiene practices. However, other measures should be undertaken to ensure the sustainability of these practices.
METHODS: Data were analysed from patients in a multinational longitudinal cohort with known anti-dsDNA results from 2013 to 2021. Patients were categorized based on their anti-dsDNA results as persistently negative, fluctuating or persistently positive. Cox regression models were used to examine longitudinal associations of anti-dsDNA results with flare.
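As a hedged illustration of this modelling step, the sketch below fits a Cox proportional hazards model with the lifelines Python package, using hypothetical columns (time to flare, flare indicator, anti-dsDNA titre expressed as a ratio to the normal cut-off, and one example covariate); it is not the study's analysis code, and the real models adjusted for additional covariates.

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical per-patient records for illustration only
df = pd.DataFrame({
    "time_to_flare_days": [120, 340, 85, 400, 220, 150, 60, 365, 275, 30],
    "flare":              [1,   0,   1,  0,   1,   0,   1,  0,   1,   1],
    "dsdna_ratio":        [2.4, 0.8, 3.1, 1.5, 0.9, 2.0, 4.2, 1.1, 1.8, 3.5],  # titre / cut-off
    "age":                [34,  45,  29,  51,  38,  42,  47,  33,  56,  26],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_to_flare_days", event_col="flare")
cph.print_summary()  # hazard ratios with 95% confidence intervals per covariate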
RESULTS: Data from 37 582 visits of 3484 patients were analysed. Of the patients 1029 (29.5%) had persistently positive anti-dsDNA and 1195 (34.3%) had fluctuating results. Anti-dsDNA expressed as a ratio to the normal cut-off was associated with the risk of subsequent flare, including in the persistently positive cohort (adjusted hazard ratio [HR] 1.56; 95% CI: 1.30, 1.87; P 3. Both increases and decreases in anti-dsDNA more than 2-fold compared with the previous visit were associated with increased risk of flare in the fluctuating cohort (adjusted HR 1.33; 95% CI: 1.08, 1.65; P = 0.008) and the persistently positive cohort (adjusted HR 1.36; 95% CI: 1.08, 1.71; P = 0.009).
CONCLUSION: Absolute value and change in anti-dsDNA titres predict flares, including in persistently anti-dsDNA positive patients. This indicates that repeat monitoring of dsDNA has value in routine testing.