METHODS: Edible parts of MO (n = 146) and MS (n = 50), co-occurring cereals/vegetables and soils (n = 95) underneath their canopy were sampled from localities in southern Ethiopia and Kenya. The concentrations of seven mineral elements, namely, calcium (Ca), copper (Cu), iodine (I), iron (Fe), magnesium (Mg), selenium (Se), and zinc (Zn) in edible parts and soils were determined using inductively coupled plasma-mass spectrometry.
RESULTS: In Ethiopian crops, MS leaves contained the highest median concentrations of all elements except Cu and Zn, which were greater in enset (false banana). In Kenya, MO flowers and MS leaves had the highest median Se concentrations, at 1.56 mg kg-1 and 3.96 mg kg-1, respectively. The median Se concentration in MS leaves was 7-fold, 10-fold, 23-fold, 117-fold and 147-fold greater than in brassica leaves, amaranth leaves, baobab fruits, sorghum grain and maize grain, respectively. The median Se concentration was 78-fold and 98-fold greater in MO seeds than in sorghum and maize grain, respectively. Both soil total Se and potassium dihydrogen phosphate (KH2PO4)-extractable Se were strongly related to the Se concentration in the leaves of MO and MS.
CONCLUSION: This study confirms previous findings that Moringa is a good source of several of the measured mineral nutrients, and it provides the first wide-ranging assessment of Se and I concentrations in the edible parts of MO and MS grown across various localities. Increasing the consumption of MO and MS, especially the leaves as a fresh vegetable or in powdered form, could reduce the prevalence of MNDs, most notably Se deficiency.
MATERIALS AND METHODS: The research used a quasi-experimental design. Cluster sampling was used to recruit 76 respondents into the intervention group and 76 respondents into the control group. The research was conducted in the working area of a Public Health Center in Malang Regency. Data were analysed using the Wilcoxon signed-rank test and the Mann-Whitney U test.
RESULTS: There were differences in mothers' ability to fulfil the nutritional needs of stunted children between the intervention group and the control group (p < 0.001). In the intervention group, mothers' ability increased from before to after the intervention on the indicators of breastfeeding, food preparation and processing, complementary feeding and responsive feeding (p < 0.001). In the control group, however, there were no differences in mothers' ability on the indicators of breastfeeding (p = 0.462), food preparation and processing (p = 0.721), complementary feeding (p = 0.054), responsive feeding (p = 0.465) or adherence to stunting therapy (p = 0.722).
CONCLUSION: The women's empowerment model based on self-regulated learning is shaped by individual maternal factors, family factors, health service system factors, and child factors, and it can increase mothers' ability to fulfil the nutritional needs of stunted children aged 6-24 months. This empowerment is a learning process covering breastfeeding, food hygiene, infant and young child feeding, and responsive feeding, undertaken by mothers to meet the nutritional needs of children with stunting, with the goal of improving both mothers' ability and children's nutritional status.
METHODS: Among 477 312 participants, intakes of 23 nutrients were estimated from validated dietary questionnaires. Using results from a previous principal component (PC) analysis, four major nutrient patterns were identified. Hazard ratios (HRs) and 95% confidence intervals (CIs) were computed for the association of each of the four patterns and CRC incidence using multivariate Cox proportional hazards models with adjustment for established CRC risk factors.
RESULTS: During an average of 11 years of follow-up, 4517 incident cases of CRC were documented. A nutrient pattern characterised by high intakes of vitamins and minerals was inversely associated with CRC (HR per 1 s.d.=0.94, 95% CI: 0.92-0.98) as was a pattern characterised by total protein, riboflavin, phosphorus and calcium (HR (1 s.d.)=0.96, 95% CI: 0.93-0.99). The remaining two patterns were not significantly associated with CRC risk.
CONCLUSIONS: Analysing nutrient patterns may improve our understanding of how groups of nutrients relate to CRC.
METHODS: Patients with primary breast and colorectal cancer undergoing elective surgery are recruited from two tertiary hospitals. Eligible patients are assigned to one of three intervention arms: (i) Group SS will receive ONS in addition to their normal diet for up to 14 days preoperatively and postoperatively until discharge; (ii) Group SS-E will receive ONS in addition to their normal diet for up to 14 days preoperatively, postoperatively until discharge, and for an extended 90 days after discharge; and (iii) Group DS will receive ONS in addition to their normal diet postoperatively until discharge from the hospital. The ONS is a standard formula fortified with Lactium to aid sleep during recovery. The primary endpoints are changes in weight, body mass index (BMI), and serum albumin and prealbumin levels, while secondary endpoints are body composition (muscle and fat mass), muscle strength (handgrip strength), energy and protein intake, sleep quality, haemoglobin, inflammatory markers (transferrin, high-sensitivity C-reactive protein, interleukin-6), a stress marker (saliva cortisol), length of hospital stay and postoperative complication rate.
DISCUSSION: This trial is expected to provide evidence on whether perioperative supplementation is beneficial, in terms of nutritional and clinical outcomes, for breast and colorectal cancer patients who present with high BMI and are not severely malnourished but nonetheless undergo the stress of surgery.
TRIAL REGISTRATION: ClinicalTrials.gov NCT04400552. Registered on 22 May 2020 (retrospectively registered).
METHODS: A qualitative study with purposive sampling was conducted using face-to-face semi-structured interviews. A total of 20 participants from a tertiary general hospital in Kuantan, Malaysia, were recruited. Data were analysed using framework analysis.
RESULTS: Two themes emerged from the analysis. The first theme explained the changes in the dietary practice of the participants postdiagnosis. The second theme revealed that the participants' dietary changes were greatly influenced by personal factors and external support from professionals, family and peers.
CONCLUSIONS: Urinary stone patients highlighted the fear of complications, self-determination and knowledge of nutrition as the main drivers of their dietary change postdiagnosis. Emphasising proper nutritional care by assessing and evaluating dietary self-management among patients can facilitate effective self-care in stone prevention management.