Displaying publications 1 - 20 of 24 in total

  1. Chang SS, Chen YY, Yip PS, Lee WJ, Hagihara A, Gunnell D
    PLoS Med, 2014 Apr;11(4):e1001622.
    PMID: 24691071 DOI: 10.1371/journal.pmed.1001622
    BACKGROUND: Suicides by carbon monoxide poisoning resulting from burning barbecue charcoal reached epidemic levels in Hong Kong and Taiwan within 5 y of the first reported cases in the early 2000s. The objectives of this analysis were to investigate (i) time trends and regional patterns of charcoal-burning suicide throughout East/Southeast Asia during the time period 1995-2011 and (ii) whether any rises in use of this method were associated with increases in overall suicide rates. Sex- and age-specific trends over time were also examined to identify the demographic groups showing the greatest increases in charcoal-burning suicide rates across different countries.

    METHODS AND FINDINGS: We used data on suicides by gases other than domestic gas for Hong Kong, Japan, the Republic of Korea, Taiwan, and Singapore in the years 1995/1996-2011. Similar data for Malaysia, the Philippines, and Thailand were also extracted but were incomplete. Graphical and joinpoint regression analyses were used to examine time trends in suicide, and negative binomial regression analysis was used to study sex- and age-specific patterns. In 1995/1996, charcoal-burning suicides accounted for <1% of all suicides in all study countries, except in Japan (5%), but they increased to account for 13%, 24%, 10%, 7%, and 5% of all suicides in Hong Kong, Taiwan, Japan, the Republic of Korea, and Singapore, respectively, in 2011. Rises were first seen in Hong Kong after 1998 (95% CI 1997-1999), followed by Singapore in 1999 (95% CI 1998-2001), Taiwan in 2000 (95% CI 1999-2001), Japan in 2002 (95% CI 1999-2003), and the Republic of Korea in 2007 (95% CI 2006-2008). No marked increases were seen in Malaysia, the Philippines, or Thailand. There was some evidence that charcoal-burning suicides were associated with an increase in overall suicide rates in Hong Kong, Taiwan, and Japan (for females), but not in Japan (for males), the Republic of Korea, and Singapore. Rates of change in charcoal-burning suicide rate did not differ by sex/age group in Taiwan and Hong Kong but appeared to be greatest in people aged 15-24 y in Japan and people aged 25-64 y in the Republic of Korea. The lack of specific codes for charcoal-burning suicide in the International Classification of Diseases and variations in coding practice in different countries are potential limitations of this study.

    CONCLUSIONS: Charcoal-burning suicides increased markedly in some East/Southeast Asian countries (Hong Kong, Taiwan, Japan, the Republic of Korea, and Singapore) in the first decade of the 21st century, but such rises were not experienced by all countries in the region. In countries with a rise in charcoal-burning suicide rates, the timing, scale, and sex/age pattern of increases varied by country. Factors underlying these variations require further investigation, but may include differences in culture or in media portrayals of the method. Please see later in the article for the Editors' Summary.

  2. Babar ZU, Ibrahim MI, Singh H, Bukahri NI, Creese A
    PLoS Med, 2007 Mar 27;4(3):e82.
    PMID: 17388660
    Malaysia's stable health care system is facing challenges with increasing medicine costs. To investigate these issues, a survey was carried out to evaluate medicine prices, availability, affordability, and the structure of price components.
  3. Cauchemez S, Epperson S, Biggerstaff M, Swerdlow D, Finelli L, Ferguson NM
    PLoS Med, 2013;10(3):e1001399.
    PMID: 23472057 DOI: 10.1371/journal.pmed.1001399
    BACKGROUND: Prior to emergence in human populations, zoonoses such as SARS cause occasional infections in people exposed to reservoir species. The risk of widespread epidemics in humans can be assessed by monitoring the reproduction number R (average number of persons infected by a human case). However, until now, estimating R required detailed outbreak investigations of human clusters, for which resources and expertise are not always available. Additionally, existing methods do not correct for important selection and under-ascertainment biases. Here, we present simple estimation methods that overcome many of these limitations.

    METHODS AND FINDINGS: Our approach is based on a parsimonious mathematical model of disease transmission and only requires data collected through routine surveillance and standard case investigations. We apply it to assess the transmissibility of swine-origin influenza A H3N2v-M virus in the US, Nipah virus in Malaysia and Bangladesh, and also present a non-zoonotic example (cholera in the Dominican Republic). Estimation is based on two simple summary statistics, the proportion infected by the natural reservoir among detected cases (G) and among the subset of the first detected cases in each cluster (F). If detection of a case does not affect detection of other cases from the same cluster, we find that R can be estimated by 1-G; otherwise R can be estimated by 1-F when the case detection rate is low. In more general cases, bounds on R can still be derived.

    CONCLUSIONS: We have developed a simple approach with limited data requirements that enables robust assessment of the risks posed by emerging zoonoses. We illustrate this by deriving transmissibility estimates for the H3N2v-M virus, an important step in evaluating the possible pandemic threat posed by this virus. Please see later in the article for the Editors' Summary.

  4. Murphy N, Cross AJ, Abubakar M, Jenab M, Aleksandrova K, Boutron-Ruault MC, et al.
    PLoS Med, 2016 Apr;13(4):e1001988.
    PMID: 27046222 DOI: 10.1371/journal.pmed.1001988
    BACKGROUND: Obesity is positively associated with colorectal cancer. Recently, body size subtypes categorised by the prevalence of hyperinsulinaemia have been defined, and metabolically healthy overweight/obese individuals (without hyperinsulinaemia) have been suggested to be at lower risk of cardiovascular disease than their metabolically unhealthy (hyperinsulinaemic) overweight/obese counterparts. Whether similarly variable relationships exist for metabolically defined body size phenotypes and colorectal cancer risk is unknown.

    METHODS AND FINDINGS: The association of metabolically defined body size phenotypes with colorectal cancer was investigated in a case-control study nested within the European Prospective Investigation into Cancer and Nutrition (EPIC) study. Metabolic health/body size phenotypes were defined according to hyperinsulinaemia status using serum concentrations of C-peptide, a marker of insulin secretion. A total of 737 incident colorectal cancer cases and 737 matched controls were divided into tertiles based on the distribution of C-peptide concentration amongst the control population, and participants were classified as metabolically healthy if below the first tertile of C-peptide and metabolically unhealthy if above the first tertile. These metabolic health definitions were then combined with body mass index (BMI) measurements to create four metabolic health/body size phenotype categories: (1) metabolically healthy/normal weight (BMI < 25 kg/m2), (2) metabolically healthy/overweight (BMI ≥ 25 kg/m2), (3) metabolically unhealthy/normal weight (BMI < 25 kg/m2), and (4) metabolically unhealthy/overweight (BMI ≥ 25 kg/m2). Additionally, in separate models, waist circumference measurements (using the International Diabetes Federation cut-points [≥80 cm for women and ≥94 cm for men]) were used (instead of BMI) to create the four metabolic health/body size phenotype categories. Statistical tests used in the analysis were all two-sided, and a p-value of <0.05 was considered statistically significant. In multivariable-adjusted conditional logistic regression models with BMI used to define adiposity, compared with metabolically healthy/normal weight individuals, we observed a higher colorectal cancer risk among metabolically unhealthy/normal weight (odds ratio [OR] = 1.59, 95% CI 1.10-2.28) and metabolically unhealthy/overweight (OR = 1.40, 95% CI 1.01-1.94) participants, but not among metabolically healthy/overweight individuals (OR = 0.96, 95% CI 0.65-1.42). 
Among the overweight individuals, lower colorectal cancer risk was observed for metabolically healthy/overweight individuals compared with metabolically unhealthy/overweight individuals (OR = 0.69, 95% CI 0.49-0.96). These associations were generally consistent when waist circumference was used as the measure of adiposity. To our knowledge, there is no universally accepted clinical definition for using C-peptide level as an indication of hyperinsulinaemia. Therefore, a possible limitation of our analysis was that the classification of individuals as being hyperinsulinaemic, based on their C-peptide level, was arbitrary. However, when we used quartiles or the median of C-peptide, instead of tertiles, as the cut-point of hyperinsulinaemia, a similar pattern of associations was observed.

    CONCLUSIONS: These results support the idea that individuals with the metabolically healthy/overweight phenotype (with normal insulin levels) are at lower colorectal cancer risk than those with hyperinsulinaemia. The combination of anthropometric measures with metabolic parameters, such as C-peptide, may be useful for defining strata of the population at greater risk of colorectal cancer.

  5. Rhee SY, Blanco JL, Jordan MR, Taylor J, Lemey P, Varghese V, et al.
    PLoS Med, 2015 Apr;12(4):e1001810.
    PMID: 25849352 DOI: 10.1371/journal.pmed.1001810
    BACKGROUND: Regional and subtype-specific mutational patterns of HIV-1 transmitted drug resistance (TDR) are essential for informing first-line antiretroviral (ARV) therapy guidelines and designing diagnostic assays for use in regions where standard genotypic resistance testing is not affordable. We sought to understand the molecular epidemiology of TDR and to identify the HIV-1 drug-resistance mutations responsible for TDR in different regions and virus subtypes.

    METHODS AND FINDINGS: We reviewed all GenBank submissions of HIV-1 reverse transcriptase sequences with or without protease and identified 287 studies published between March 1, 2000, and December 31, 2013, with more than 25 recently or chronically infected ARV-naïve individuals. These studies comprised 50,870 individuals from 111 countries. Each set of study sequences was analyzed for phylogenetic clustering and the presence of 93 surveillance drug-resistance mutations (SDRMs). The median overall TDR prevalence in sub-Saharan Africa (SSA), south/southeast Asia (SSEA), upper-income Asian countries, Latin America/Caribbean, Europe, and North America was 2.8%, 2.9%, 5.6%, 7.6%, 9.4%, and 11.5%, respectively. In SSA, there was a yearly 1.09-fold (95% CI: 1.05-1.14) increase in odds of TDR since national ARV scale-up attributable to an increase in non-nucleoside reverse transcriptase inhibitor (NNRTI) resistance. The odds of NNRTI-associated TDR also increased in Latin America/Caribbean (odds ratio [OR] = 1.16; 95% CI: 1.06-1.25), North America (OR = 1.19; 95% CI: 1.12-1.26), Europe (OR = 1.07; 95% CI: 1.01-1.13), and upper-income Asian countries (OR = 1.33; 95% CI: 1.12-1.55). In SSEA, there was no significant change in the odds of TDR since national ARV scale-up (OR = 0.97; 95% CI: 0.92-1.02). An analysis limited to sequences with mixtures at less than 0.5% of their nucleotide positions—a proxy for recent infection—yielded trends comparable to those obtained using the complete dataset. Four NNRTI SDRMs—K101E, K103N, Y181C, and G190A—accounted for >80% of NNRTI-associated TDR in all regions and subtypes. Sixteen nucleoside reverse transcriptase inhibitor (NRTI) SDRMs accounted for >69% of NRTI-associated TDR in all regions and subtypes. In SSA and SSEA, 89% of NNRTI SDRMs were associated with high-level resistance to nevirapine or efavirenz, whereas only 27% of NRTI SDRMs were associated with high-level resistance to zidovudine, lamivudine, tenofovir, or abacavir. 
Of 763 viruses with TDR in SSA and SSEA, 725 (95%) were genetically dissimilar; 38 (5%) formed 19 sequence pairs. Inherent limitations of this study are that some cohorts may not represent the broader regional population and that studies were heterogeneous with respect to duration of infection prior to sampling.

    CONCLUSIONS: Most TDR strains in SSA and SSEA arose independently, suggesting that ARV regimens with a high genetic barrier to resistance combined with improved patient adherence may mitigate TDR increases by reducing the generation of new ARV-resistant strains. A small number of NNRTI-resistance mutations were responsible for most cases of high-level resistance, suggesting that inexpensive point-mutation assays to detect these mutations may be useful for pre-therapy screening in regions with high levels of TDR. In the context of a public health approach to ARV therapy, a reliable point-of-care genotypic resistance test could identify which patients should receive standard first-line therapy and which should receive a protease-inhibitor-containing regimen.

  6. Deschasaux M, Huybrechts I, Murphy N, Julia C, Hercberg S, Srour B, et al.
    PLoS Med, 2018 Sep;15(9):e1002651.
    PMID: 30226842 DOI: 10.1371/journal.pmed.1002651
    BACKGROUND: Helping consumers make healthier food choices is a key issue for the prevention of cancer and other diseases. In many countries, political authorities are considering the implementation of a simplified labelling system to reflect the nutritional quality of food products. The Nutri-Score, a five-colour nutrition label, is derived from the Nutrient Profiling System of the British Food Standards Agency (modified version) (FSAm-NPS). How the consumption of foods with high/low FSAm-NPS relates to cancer risk has been studied in national/regional cohorts but has not been characterized in diverse European populations.

    METHODS AND FINDINGS: This prospective analysis included 471,495 adults from the European Prospective Investigation into Cancer and Nutrition (EPIC, 1992-2014, median follow-up: 15.3 y), among whom there were 49,794 incident cancer cases (main locations: breast, n = 12,063; prostate, n = 6,745; colon-rectum, n = 5,806). Usual food intakes were assessed with standardized country-specific diet assessment methods. The FSAm-NPS was calculated for each food/beverage using their 100-g content in energy, sugar, saturated fatty acid, sodium, fibres, proteins, and fruits/vegetables/legumes/nuts. The FSAm-NPS scores of all food items usually consumed by a participant were averaged to obtain the individual FSAm-NPS Dietary Index (DI) scores. Multi-adjusted Cox proportional hazards models were computed. A higher FSAm-NPS DI score, reflecting a lower nutritional quality of the food consumed, was associated with a higher risk of total cancer (HRQ5 versus Q1 = 1.07; 95% CI 1.03-1.10, P-trend < 0.001). Absolute cancer rates in those with high and low (quintiles 5 and 1) FSAm-NPS DI scores were 81.4 and 69.5 cases/10,000 person-years, respectively. Higher FSAm-NPS DI scores were specifically associated with higher risks of cancers of the colon-rectum, upper aerodigestive tract and stomach, lung for men, and liver and postmenopausal breast for women (all P < 0.05). The main study limitation is that it was based on an observational cohort using self-reported dietary data obtained through a single baseline food frequency questionnaire; thus, exposure misclassification and residual confounding cannot be ruled out.

    CONCLUSIONS: In this large multinational European cohort, the consumption of food products with a higher FSAm-NPS score (lower nutritional quality) was associated with a higher risk of cancer. This supports the relevance of the FSAm-NPS as the underlying nutrient profiling system for front-of-pack nutrition labels, as well as for other public health nutritional measures.

  7. Lim LL, Lau ESH, Ozaki R, Chung H, Fu AWC, Chan W, et al.
    PLoS Med, 2020 Oct;17(10):e1003367.
    PMID: 33007052 DOI: 10.1371/journal.pmed.1003367
    BACKGROUND: Diabetes outcomes are influenced by host factors, settings, and care processes. We examined the association of data-driven integrated care assisted by information and communications technology (ICT) with clinical outcomes in type 2 diabetes in public and private healthcare settings.

    METHODS AND FINDINGS: The web-based Joint Asia Diabetes Evaluation (JADE) platform provides a protocol to guide data collection for issuing a personalized JADE report including risk categories (1-4, low-high), 5-year probabilities of cardiovascular-renal events, and trends and targets of 4 risk factors with tailored decision support. The JADE program is a prospective cohort study implemented in a naturalistic environment where patients underwent nurse-led structured evaluation (blood/urine/eye/feet) in public and private outpatient clinics and diabetes centers in Hong Kong. We retrospectively analyzed the data of 16,624 Han Chinese patients with type 2 diabetes who were enrolled in 2007-2015. In the public setting, the non-JADE group (n = 3,587) underwent structured evaluation for risk factors and complications only, while the JADE (n = 9,601) group received a JADE report with group empowerment by nurses. In a community-based, nurse-led, university-affiliated diabetes center (UDC), the JADE-Personalized (JADE-P) group (n = 3,436) received a JADE report, personalized empowerment, and annual telephone reminder for reevaluation and engagement. The primary composite outcome was time to the first occurrence of cardiovascular-renal diseases, all-site cancer, and/or death, based on hospitalization data censored on 30 June 2017. During 94,311 person-years of follow-up in 2007-2017, 7,779 primary events occurred. Compared with the JADE group (136.22 cases per 1,000 patient-years [95% CI 132.35-140.18]), the non-JADE group had a higher event rate (145.32 [95% CI 138.68-152.20]; P = 0.020), while the JADE-P group had a lower event rate (70.94 [95% CI 67.12-74.91]; P < 0.001). The adjusted hazard ratios (aHRs) for the primary composite outcome were 1.22 (95% CI 1.15-1.30) and 0.70 (95% CI 0.66-0.75), respectively, independent of risk profiles, education levels, drug usage, self-care, and comorbidities at baseline. 
We reported consistent results in propensity-score-matched analyses and after accounting for loss to follow-up. Potential limitations include its nonrandomized design that precludes causal inference, residual confounding, and participation bias.

    CONCLUSIONS: ICT-assisted integrated care was associated with a reduction in clinical events, including death, in type 2 diabetes in public and private healthcare settings.

  8. Chen CH, Shin SD, Sun JT, Jamaluddin SF, Tanaka H, Song KJ, et al.
    PLoS Med, 2020 Oct;17(10):e1003360.
    PMID: 33022018 DOI: 10.1371/journal.pmed.1003360
    BACKGROUND: Whether rapid transportation can benefit patients with trauma remains controversial. We determined the association between prehospital time and outcome to explore the concept of the "golden hour" for injured patients.

    METHODS AND FINDINGS: We conducted a retrospective cohort study of trauma patients transported from the scene to hospitals by emergency medical service (EMS) from January 1, 2016, to November 30, 2018, using data from the Pan-Asia Trauma Outcomes Study (PATOS) database. Prehospital time intervals were categorized into response time (RT), scene to hospital time (SH), and total prehospital time (TPT). The outcomes were 30-day mortality and functional status at hospital discharge. Multivariable logistic regression was used to investigate the association of prehospital time and outcomes to adjust for factors including age, sex, mechanism and type of injury, Injury Severity Score (ISS), Revised Trauma Score (RTS), and prehospital interventions. Overall, 24,365 patients from 4 countries (645 patients from Japan, 16,476 patients from Korea, 5,358 patients from Malaysia, and 1,886 patients from Taiwan) were included in the analysis. Among included patients, the median age was 45 years (lower quartile [Q1]-upper quartile [Q3]: 25-62), and 15,498 (63.6%) patients were male. Median (Q1-Q3) RT, SH, and TPT were 20 (Q1-Q3: 12-39), 21 (Q1-Q3: 16-29), and 47 (Q1-Q3: 32-60) minutes, respectively. In all, 280 patients (1.1%) died within 30 days after injury. Prehospital time intervals were not associated with 30-day mortality. The adjusted odds ratios (aORs) per 10 minutes of RT, SH, and TPT were 0.99 (95% CI 0.92-1.06, p = 0.740), 1.08 (95% CI 1.00-1.17, p = 0.065), and 1.03 (95% CI 0.98-1.09, p = 0.236), respectively. However, long prehospital time was detrimental to functional survival. The aORs of RT, SH, and TPT per 10-minute delay were 1.06 (95% CI 1.04-1.08, p < 0.001), 1.05 (95% CI 1.01-1.08, p = 0.007), and 1.06 (95% CI 1.04-1.08, p < 0.001), respectively. The key limitation of our study is the missing data inherent to the retrospective design. 
Another major limitation is the aggregate nature of the data from different countries and unaccounted confounders such as in-hospital management.

    CONCLUSIONS: Longer prehospital time was not associated with an increased risk of 30-day mortality, but it may be associated with increased risk of poor functional outcomes in injured patients. This finding supports the concept of the "golden hour" for trauma patients during prehospital care in the countries studied.

  9. Muglu J, Rather H, Arroyo-Manzano D, Bhattacharya S, Balchin I, Khalil A, et al.
    PLoS Med, 2019 Jul;16(7):e1002838.
    PMID: 31265456 DOI: 10.1371/journal.pmed.1002838
    BACKGROUND: Despite advances in healthcare, stillbirth rates remain relatively unchanged. We conducted a systematic review to quantify the risks of stillbirth and neonatal death at term (from 37 weeks gestation) according to gestational age.

    METHODS AND FINDINGS: We searched the major electronic databases Medline, Embase, and Google Scholar (January 1990-October 2018) without language restrictions. We included cohort studies on term pregnancies that provided estimates of stillbirths or neonatal deaths by gestation week. We estimated the additional weekly risk of stillbirth in term pregnancies that continued versus delivered at various gestational ages. We compared week-specific neonatal mortality rates by gestational age at delivery. We used mixed-effects logistic regression models with random intercepts, and computed risk ratios (RRs), odds ratios (ORs), and 95% confidence intervals (CIs). Thirteen studies (15 million pregnancies, 17,830 stillbirths) were included. All studies were from high-income countries. Four studies provided the risks of stillbirth in mothers of White and Black race, 2 in mothers of White and Asian race, 5 in mothers of White race only, and 2 in mothers of Black race only. The prospective risk of stillbirth increased with gestational age from 0.11 per 1,000 pregnancies at 37 weeks (95% CI 0.07 to 0.15) to 3.18 per 1,000 at 42 weeks (95% CI 1.84 to 4.35). Neonatal mortality increased when pregnancies continued beyond 41 weeks; the risk increased significantly for deliveries at 42 versus 41 weeks gestation (RR 1.87, 95% CI 1.07 to 2.86, p = 0.012). One additional stillbirth occurred for every 1,449 (95% CI 1,237 to 1,747) pregnancies that advanced from 40 to 41 weeks. Limitations include variations in the definition of low-risk pregnancy, the wide time span of the studies, the use of registry-based data, and potential confounders affecting the outcome.

    CONCLUSIONS: Our findings suggest there is a significant additional risk of stillbirth, with no corresponding reduction in neonatal mortality, when term pregnancies continue to 41 weeks compared to delivery at 40 weeks.

    SYSTEMATIC REVIEW REGISTRATION: PROSPERO CRD42015013785.

  10. Chandramouli C, Tay WT, Bamadhaj NS, Tromp J, Teng TK, Yap JJL, et al.
    PLoS Med, 2019 Sep;16(9):e1002916.
    PMID: 31550265 DOI: 10.1371/journal.pmed.1002916
    BACKGROUND: Asians are predisposed to a lean heart failure (HF) phenotype. Data on the 'obesity paradox', reported in Western populations, are scarce in Asia and have only utilised the traditional classification of body mass index (BMI). We aimed to investigate the association between obesity (defined by BMI and abdominal measures) and HF outcomes in Asia.

    METHODS AND FINDINGS: Utilising the Asian Sudden Cardiac Death in Heart Failure (ASIAN-HF) registry (11 Asian regions including Taiwan, Hong Kong, China, India, Malaysia, Thailand, Singapore, Indonesia, Philippines, Japan, and Korea; 46 centres with enrolment between 1 October 2012 and 6 October 2016), we prospectively examined 5,964 patients with symptomatic HF (mean age 61.3 ± 13.3 years, 26% women, mean BMI 25.3 ± 5.3 kg/m2, 16% with HF with preserved ejection fraction [HFpEF; ejection fraction ≥ 50%]), among whom 2,051 also had waist-to-height ratio (WHtR) measurements (mean age 60.8 ± 12.9 years, 24% women, mean BMI 25.0 ± 5.2 kg/m2, 7% HFpEF). Patients were categorised by BMI quartiles or WHtR quartiles or 4 combined groups of BMI (low, <24.5 kg/m2 [lean], or high, ≥24.5 kg/m2 [obese]) and WHtR (low, <0.55 [thin], or high, ≥0.55 [fat]). Cox proportional hazards models were used to examine a 1-year composite outcome (HF hospitalisation or mortality). Across BMI quartiles, higher BMI was associated with lower risk of the composite outcome (ptrend < 0.001). Contrastingly, higher WHtR was associated with higher risk of the composite outcome. Individuals in the lean-fat group, with low BMI and high WHtR (13.9%), were more likely to be women (35.4%) and to be from low-income countries (47.7%) (predominantly in South/Southeast Asia), and had higher prevalence of diabetes (46%), worse quality of life scores (63.3 ± 24.2), and a higher rate of the composite outcome (51/232; 22%), compared to the other groups (p < 0.05 for all). Following multivariable adjustment, the lean-fat group had higher adjusted risk of the composite outcome (hazard ratio 1.93, 95% CI 1.17-3.18, p = 0.01), compared to the obese-thin group, with high BMI and low WHtR. Results were consistent across both HF subtypes (HFpEF and HF with reduced ejection fraction [HFrEF]; pinteraction = 0.355). 
Selection bias and residual confounding are potential limitations of such multinational observational registries.

    CONCLUSIONS: In this cohort of Asian patients with HF, the 'obesity paradox' is observed only when defined using BMI, with WHtR showing the opposite association with the composite outcome. Lean-fat patients, with high WHtR and low BMI, have the worst outcomes. A direct correlation between high WHtR and the composite outcome is apparent in both HFpEF and HFrEF.

    TRIAL REGISTRATION: Asian Sudden Cardiac Death in HF (ASIAN-HF) Registry ClinicalTrials.gov Identifier: NCT01633398.

  11. Burton A, Maskarinec G, Perez-Gomez B, Vachon C, Miao H, Lajous M, et al.
    PLoS Med, 2017 Jun;14(6):e1002335.
    PMID: 28666001 DOI: 10.1371/journal.pmed.1002335
    BACKGROUND: Mammographic density (MD) is one of the strongest breast cancer risk factors. Its age-related characteristics have been studied in women in western countries, but whether these associations apply to women worldwide is not known.

    METHODS AND FINDINGS: We examined cross-sectional differences in MD by age and menopausal status in over 11,000 breast-cancer-free women aged 35-85 years, from 40 ethnicity- and location-specific population groups across 22 countries in the International Consortium on Mammographic Density (ICMD). MD was read centrally using a quantitative method (Cumulus) and its square-root metrics were analysed using meta-analysis of group-level estimates and linear regression models of pooled data, adjusted for body mass index, reproductive factors, mammogram view, image type, and reader. In all, 4,534 women were premenopausal, and 6,481 postmenopausal, at the time of mammography. A large age-adjusted difference in percent MD (PD) between post- and premenopausal women was apparent (-0.46 cm [95% CI: -0.53, -0.39]) and appeared greater in women with lower breast cancer risk profiles; variation across population groups due to heterogeneity (I2) was 16.5%. Among premenopausal women, the √PD difference per 10-year increase in age was -0.24 cm (95% CI: -0.34, -0.14; I2 = 30%), reflecting a compositional change (lower dense area and higher non-dense area, with no difference in breast area). In postmenopausal women, the corresponding difference in √PD (-0.38 cm [95% CI: -0.44, -0.33]; I2 = 30%) was additionally driven by increasing breast area. The study is limited by different mammography systems and its cross-sectional rather than longitudinal nature.

    CONCLUSIONS: Declines in MD with increasing age are present premenopausally, continue postmenopausally, and are most pronounced over the menopausal transition. These effects were highly consistent across diverse groups of women worldwide, suggesting that they result from an intrinsic biological, likely hormonal, mechanism common to women. If cumulative breast density is a key determinant of breast cancer risk, younger ages may be the more critical periods for lifestyle modifications aimed at breast density and breast cancer risk reduction.

  12. Rashid A, Iguchi Y, Afiqah SN
    PLoS Med, 2020 Oct;17(10):e1003303.
    PMID: 33108371 DOI: 10.1371/journal.pmed.1003303
    BACKGROUND: Despite the clear stand taken by the United Nations (UN) and other international bodies in ensuring that female genital cutting (FGC) is not performed by health professionals, the rate of medicalization has not reduced. The current study aimed to determine the extent of medicalization of FGC among doctors in Malaysia, who the doctors were who practiced it, how and what was practiced, and the motivations for the practice.

    METHODS AND FINDINGS: This mixed method (qualitative and quantitative) study was conducted from 2018 to 2019 using a self-administered questionnaire among Muslim medical doctors from 2 main medical associations with a large number of Muslim members from all over Malaysia who attended their annual conference. For those doctors who did not attend the conference, the questionnaire was posted to them. Association A had 510 members, 64 male Muslim doctors and 333 female Muslim doctors. Association B only had Muslim doctors; 3,088 were female, and 1,323 were male. In total, 894 questionnaires were distributed either by hand or by post, and 366 completed questionnaires were received back. For the qualitative part of the study, a snowball sampling method was used, and 24 in-depth interviews were conducted using a semi-structured questionnaire, until data reached saturation. Quantitative data were analysed using SPSS version 18 (IBM, Armonk, NY). A chi-squared test and binary logistic regression were performed. The qualitative data were transcribed manually, organized, coded, and recoded using NVivo version 12. The clustered codes were elicited as common themes. Most of the respondents were women, had medical degrees from Malaysia, and had a postgraduate degree in Family Medicine. The median age was 42. Most were working with the Ministry of Health (MoH) Malaysia, and in a clinic located in an urban area. The prevalence of Muslim doctors practising FGC was 20.5% (95% CI 16.6-24.9). The main reason cited for practising FGC was religious obligation. Qualitative findings also showed that religion was a strong motivating factor for the practice and its continuation, besides culture and harm reduction. Although most Muslim doctors performed type IV FGC, there were a substantial number performing type I. Respondents who were women (adjusted odds ratio [aOR] 4.4, 95% CI 1.9-10.0; P ≤ 0.001), who owned a clinic (aOR 30.7, 95% CI 12.0-78.4; P ≤ 0.001) or jointly owned a clinic (aOR 7.61, 95% CI 3.2-18.1; P ≤ 0.001), who thought that FGC was legal in Malaysia (aOR 2.09, 95% CI 1.02-4.3; P = 0.04), who believed FGC was encouraged by religion (aOR 2.25, 95% CI 3.2-18.1; P = 0.036), and who thought that FGC should continue (aOR 3.54, 95% CI 1.25-10.04; P = 0.017) were more likely to practise FGC. The main limitations of the study were the small sample size and low response rate.

    CONCLUSIONS: In this study, we found that many of the Muslim doctors were unaware of the legal and international stand against FGC, and many wanted the practice to continue. It is a concern that type IV FGC carried out by traditional midwives may be supplanted and exacerbated by type I FGC performed by doctors, calling for strong and urgent action by the Malaysian medical authorities.

  13. Legido-Quigley H, Leh Hoon Chuah F, Howard N
    PLoS Med, 2020 11;17(11):e1003143.
    PMID: 33170834 DOI: 10.1371/journal.pmed.1003143
    BACKGROUND: Southeast Asian countries host significant numbers of forcibly displaced people. This study examined how health systems in Southeast Asia have responded to the challenges of forced migration and refugee-related health, including the health needs of populations affected by forced displacement; the health system-level barriers and facilitators in addressing these needs; and the implications of existing health policies relating to forcibly displaced and refugee populations. It aims to fill the gap in knowledge by analysing how health systems in Southeast Asia are organised to address the health needs of forcibly displaced people.

    METHODS AND FINDINGS: We conducted 30 semistructured interviews with health policy-makers, health service providers, and other experts working in the United Nations (n = 6), ministries and public health (n = 5), international (n = 9) and national civil society (n = 7), and academia (n = 3) based in Indonesia (n = 6), Malaysia (n = 10), Myanmar (n = 6), and Thailand (n = 8). Data were analysed thematically using deductive and inductive coding. Interviewees described the cumulative nature of health risks at each migratory phase. Perceived barriers to addressing migrants' cumulative health needs were primarily financial, juridico-political, and sociocultural, whereas key facilitators were many health workers' humanitarian stance and positive national commitment to pursuing universal health coverage (UHC). Across all countries, financial constraints were identified as the main challenges in addressing the comprehensive health needs of refugees and asylum seekers. Participants recommended regional and multisectoral approaches led by national governments, recognising refugee and asylum-seeker contributions, and promoting inclusion and livelihoods. Main study limitations included that we were unable to include migrant voices or professionals not already interested in migrant health.

    CONCLUSIONS: To our knowledge, this is one of the first qualitative studies to investigate the health concerns and barriers to access among migrants experiencing forced displacement, particularly refugees and asylum seekers, in Southeast Asia. Findings provide practical new insights with implications for informing policy and practice. Overall, sociopolitical inclusion of forcibly displaced populations remains difficult in these four countries despite their significant contributions to host-country economies.

  14. Mousa A, Al-Taiar A, Anstey NM, Badaut C, Barber BE, Bassat Q, et al.
    PLoS Med, 2020 10;17(10):e1003359.
    PMID: 33075101 DOI: 10.1371/journal.pmed.1003359
    BACKGROUND: Delay in receiving treatment for uncomplicated malaria (UM) is often reported to increase the risk of developing severe malaria (SM), but access to treatment remains low in most high-burden areas. Understanding the contribution of treatment delay on progression to severe disease is critical to determine how quickly patients need to receive treatment and to quantify the impact of widely implemented treatment interventions, such as 'test-and-treat' policies administered by community health workers (CHWs). We conducted a pooled individual-participant meta-analysis to estimate the association between treatment delay and presenting with SM.

    METHODS AND FINDINGS: A search using Ovid MEDLINE and Embase was initially conducted to identify studies on severe Plasmodium falciparum malaria that included information on treatment delay, such as fever duration (from inception to 22 September 2017). Studies identified included 5 case-control and 8 other observational clinical studies of SM and UM cases. Risk of bias was assessed using the Newcastle-Ottawa scale, and all studies were ranked as 'Good', scoring ≥7/10. Individual-patient data (IPD) were pooled from 13 studies of 3,989 (94.1% aged <15 years) SM patients and 5,780 (79.6% aged <15 years) UM cases in Benin, Malaysia, Mozambique, Tanzania, The Gambia, Uganda, Yemen, and Zambia. Definitions of SM were standardised across studies to compare treatment delay in patients with UM and different SM phenotypes using age-adjusted mixed-effects regression. The odds of any SM phenotype were significantly higher in children with longer delays between initial symptoms and arrival at the health facility (odds ratio [OR] = 1.33, 95% CI: 1.07-1.64 for a delay of >24 hours versus ≤24 hours; p = 0.009). Reported illness duration was a strong predictor of presenting with severe malarial anaemia (SMA) in children, with an OR of 2.79 (95% CI: 1.92-4.06; p < 0.001) for a delay of 2-3 days and 5.46 (95% CI: 3.49-8.53; p < 0.001) for a delay of >7 days, compared with receiving treatment within 24 hours from symptom onset. We estimate that 42.8% of childhood SMA cases and 48.5% of adult SMA cases in the study areas would have been averted if all individuals had been able to access treatment within the first day of symptom onset, if the association is fully causal. In studies specifically recording onset of nonsevere symptoms, long treatment delay was moderately associated with other SM phenotypes (OR [95% CI] >3 to ≤4 days versus ≤24 hours: cerebral malaria [CM] = 2.42 [1.24-4.72], p = 0.01; respiratory distress syndrome [RDS] = 4.09 [1.70-9.82], p = 0.002).
In addition to unmeasured confounding, which is commonly present in observational studies, a key limitation is that many severe cases and deaths occur outside healthcare facilities in endemic countries, where the effect of delayed or no treatment is difficult to quantify.

    CONCLUSIONS: Our results quantify the relationship between rapid access to treatment and reduced risk of severe disease, which was particularly strong for SMA. There was some evidence to suggest that progression to other severe phenotypes may also be prevented by prompt treatment, though the association was not as strong, which may be explained by potential selection bias, sample size issues, or a difference in underlying pathology. These findings may help assess the impact of interventions that improve access to treatment.

  15. Baker BK
    PLoS Med, 2016 Mar;13(3):e1001970.
    PMID: 26954325 DOI: 10.1371/journal.pmed.1001970
    Brook Baker describes the potential harms to global health from the Trans-Pacific Partnership Agreement and its failure to balance the interests of patients and the public with those of industry.
  16. Tromp J, Tay WT, Ouwerkerk W, Teng TK, Yap J, MacDonald MR, et al.
    PLoS Med, 2018 03;15(3):e1002541.
    PMID: 29584721 DOI: 10.1371/journal.pmed.1002541
    BACKGROUND: Comorbidities are common in patients with heart failure (HF) and complicate treatment and outcomes. We identified patterns of multimorbidity in Asian patients with HF and their association with patients' quality of life (QoL) and health outcomes.

    METHODS AND FINDINGS: We used data on 6,480 patients with chronic HF (1,204 with preserved ejection fraction) enrolled between 1 October 2012 and 6 October 2016 in the Asian Sudden Cardiac Death in Heart Failure (ASIAN-HF) registry. The ASIAN-HF registry is a prospective cohort study, with patients prospectively enrolled from in- and outpatient clinics from 11 Asian regions (Hong Kong, Taiwan, China, Japan, Korea, India, Malaysia, Thailand, Singapore, Indonesia, and the Philippines). Latent class analysis was used to identify patterns of multimorbidity. The primary outcome was defined as a composite of all-cause mortality or HF hospitalization within 1 year. To assess differences in QoL, we used the Kansas City Cardiomyopathy Questionnaire. We identified 5 distinct multimorbidity groups: elderly/atrial fibrillation (AF) (N = 1,048; oldest, more AF), metabolic (N = 1,129; obesity, diabetes, hypertension), young (N = 1,759; youngest, low comorbidity rates, non-ischemic etiology), ischemic (N = 1,261; ischemic etiology), and lean diabetic (N = 1,283; diabetic, hypertensive, low prevalence of obesity, high prevalence of chronic kidney disease). Patients in the lean diabetic group had the worst QoL, more severe signs and symptoms of HF, and the highest rate of the primary combined outcome within 1 year (29% versus 11% in the young group) (p for all <0.001). Adjusting for confounders (demographics, New York Heart Association class, and medication), the lean diabetic (hazard ratio [HR] 1.79, 95% CI 1.46-2.22), elderly/AF (HR 1.57, 95% CI 1.26-1.96), ischemic (HR 1.51, 95% CI 1.22-1.88), and metabolic (HR 1.28, 95% CI 1.02-1.60) groups had higher rates of the primary combined outcome compared to the young group. Potential limitations include site selection and participation bias.

    CONCLUSIONS: Among Asian patients with HF, comorbidities naturally clustered in 5 distinct patterns, each differentially impacting patients' QoL and health outcomes. These data underscore the importance of studying multimorbidity in HF and the need for more comprehensive approaches in phenotyping patients with HF and multimorbidity.

    TRIAL REGISTRATION: ClinicalTrials.gov NCT01633398.
  17. Loeliger KB, Meyer JP, Desai MM, Ciarleglio MM, Gallagher C, Altice FL
    PLoS Med, 2018 10;15(10):e1002667.
    PMID: 30300351 DOI: 10.1371/journal.pmed.1002667
    BACKGROUND: Sustained retention in HIV care (RIC) and viral suppression (VS) are central to US national HIV prevention strategies, but have not been comprehensively assessed in criminal justice (CJ) populations with known health disparities. The purpose of this study is to identify predictors of RIC and VS following release from prison or jail.

    METHODS AND FINDINGS: This is a retrospective cohort study of all adult people living with HIV (PLWH) incarcerated in Connecticut, US, during the period January 1, 2007, to December 31, 2011, and observed through December 31, 2014 (n = 1,094). Most cohort participants were unmarried (83.7%) men (77.0%) who were black or Hispanic (78.1%) and acquired HIV from injection drug use (72.6%). Prison-based pharmacy and custody databases were linked with community HIV surveillance monitoring and case management databases. Post-release RIC declined steadily over 3 years of follow-up (67.2% retained for year 1, 51.3% retained for years 1-2, and 42.5% retained for years 1-3). Compared with individuals who were not re-incarcerated, individuals who were re-incarcerated were more likely to meet RIC criteria (48% versus 34%; p < 0.001) but less likely to have VS (72% versus 81%; p = 0.048). Using multivariable logistic regression models (individual-level analysis for 1,001 individuals after excluding 93 deaths), both sustained RIC and VS at 3 years post-release were independently associated with older age (RIC: adjusted odds ratio [AOR] = 1.61, 95% CI = 1.22-2.12; VS: AOR = 1.37, 95% CI = 1.06-1.78), having health insurance (RIC: AOR = 2.15, 95% CI = 1.60-2.89; VS: AOR = 2.01, 95% CI = 1.53-2.64), and receiving an increased number of transitional case management visits. The same factors were significant when we assessed RIC and VS outcomes in each 6-month period using generalized estimating equations (for 1,094 individuals contributing 6,227 6-month periods prior to death or censoring). Additionally, receipt of antiretroviral therapy during incarceration (RIC: AOR = 1.33, 95% CI = 1.07-1.65; VS: AOR = 1.91, 95% CI = 1.56-2.34), early linkage to care post-release (RIC: AOR = 2.64, 95% CI = 2.03-3.43; VS: AOR = 1.79, 95% CI = 1.45-2.21), and absolute time and proportion of follow-up time spent re-incarcerated were highly correlated with better treatment outcomes.
Limited data were available on changes over time in injection drug use or other substance use disorders, psychiatric disorders, or housing status.

    CONCLUSIONS: In a large cohort of CJ-involved PLWH with a 3-year post-release evaluation, RIC diminished significantly over time, but was associated with HIV care during incarceration, health insurance, case management services, and early linkage to care post-release. While re-incarceration and conditional release provide opportunities to engage in care, reducing recidivism and supporting community-based RIC efforts are key to improving longitudinal treatment outcomes among CJ-involved PLWH.

  18. Commons RJ, Simpson JA, Thriemer K, Abreha T, Adam I, Anstey NM, et al.
    PLoS Med, 2019 Oct;16(10):e1002928.
    PMID: 31584960 DOI: 10.1371/journal.pmed.1002928
    BACKGROUND: Artemisinin-based combination therapy (ACT) is recommended for uncomplicated Plasmodium vivax malaria in areas of emerging chloroquine resistance. We undertook a systematic review and individual patient data meta-analysis to compare the efficacies of dihydroartemisinin-piperaquine (DP) and artemether-lumefantrine (AL) with or without primaquine (PQ) on the risk of recurrent P. vivax.

    METHODS AND FINDINGS: Clinical efficacy studies of uncomplicated P. vivax treated with DP or AL and published between January 1, 2000, and January 31, 2018, were identified by conducting a systematic review registered with the International Prospective Register of Systematic Reviews (PROSPERO): CRD42016053310. Investigators of eligible studies were invited to contribute individual patient data that were pooled using standardised methodology. The effects of the mg/kg dose of piperaquine/lumefantrine, the ACT administered, and PQ on the rate of P. vivax recurrence between days 7 and 42 after starting treatment were investigated by Cox regression analyses according to an a priori analysis plan. Secondary outcomes were the risk of recurrence assessed on days 28 and 63. Nineteen studies enrolling 2,017 patients were included in the analysis. The risk of recurrent P. vivax at day 42 was significantly higher in the 384 patients treated with AL alone (44.0%, 95% confidence interval [CI] 38.7-49.8) compared with the 812 patients treated with DP alone (9.3%, 95% CI 7.1-12.2): adjusted hazard ratio (AHR) 12.63 (95% CI 6.40-24.92), p < 0.001. The rates of recurrence assessed at days 42 and 63 were associated inversely with the dose of piperaquine: AHRs (95% CI) for every 5-mg/kg increase 0.63 (0.48-0.84), p = 0.0013 and 0.83 (0.73-0.94), p = 0.0033, respectively. The dose of lumefantrine was not significantly associated with the rate of recurrence (1.07 for every 5-mg/kg increase, 95% CI 0.99-1.16, p = 0.0869). In a post hoc analysis, in patients with symptomatic recurrence after AL, the mean haemoglobin increased 0.13 g/dL (95% CI 0.01-0.26) for every 5 days that recurrence was delayed, p = 0.0407. Coadministration of PQ substantially reduced the rate of recurrence assessed at day 42 after AL (AHR = 0.20, 95% CI 0.10-0.41, p < 0.001) and at day 63 after DP (AHR = 0.08, 95% CI 0.01-0.70, p = 0.0233).
Results were limited by follow-up of patients to 63 days or less and nonrandomised treatment groups.

    CONCLUSIONS: In this study, we observed the risk of P. vivax recurrence at day 42 to be significantly lower following treatment with DP compared with AL, reflecting the longer period of post-treatment prophylaxis; this risk was reduced substantially by coadministration with PQ. We found that delaying P. vivax recurrence was associated with a small but significant improvement in haemoglobin. These results highlight the benefits of PQ radical cure and also the provision of blood-stage antimalarial agents with prolonged post-treatment prophylaxis.
