METHODS: HIV-positive patients enrolled in the TREAT Asia HIV Observational Database who had used second-line ART for ≥6 months were included. Patterns of ART use, along with rates and predictors of second-line treatment failure, were evaluated.
RESULTS: There were 302 eligible patients. Most were male (76.5%) and exposed to HIV via heterosexual contact (71.5%). Median age at second-line initiation was 39.2 years, median CD4 cell count was 146 cells per cubic millimeter, and median HIV viral load was 16,224 copies per milliliter. Patients started second-line ART before 2007 (n = 105), in 2007-2010 (n = 147), and after 2010 (n = 50). Ritonavir-boosted lopinavir and atazanavir accounted for the majority of protease inhibitor use after 2006. Median follow-up time on second-line therapy was 2.3 years. The rates of treatment failure and mortality per 100 patient-years were 8.8 (95% confidence interval: 7.1 to 10.9) and 1.1 (95% confidence interval: 0.6 to 1.9), respectively. Older age, high baseline viral load, and use of a protease inhibitor other than lopinavir or atazanavir were associated with a significantly shorter time to second-line failure.
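The failure and mortality rates above are event counts divided by accumulated person-time. A minimal sketch of how such a rate and its exact Poisson 95% confidence interval can be computed in Python; the event count and person-years below are hypothetical illustrations, not the study's data:

```python
from scipy.stats import chi2

def rate_per_100_py(events, person_years, alpha=0.05):
    """Incidence rate per 100 person-years with an exact Poisson CI."""
    rate = 100 * events / person_years
    # Exact (Garwood) limits for a Poisson count, scaled to person-time
    lower = chi2.ppf(alpha / 2, 2 * events) / 2 if events > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / 2
    return rate, 100 * lower / person_years, 100 * upper / person_years

# Hypothetical illustration: 62 failures over 705 person-years of follow-up
print(rate_per_100_py(62, 705))  # rate of ~8.8 per 100 patient-years
```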
CONCLUSIONS: Increased access to viral load monitoring to facilitate early detection of first-line ART failure and subsequent treatment switch is important for maximizing the durability of second-line therapy in Asia. Although second-line ART is highly effective in the region, the reported rate of failure emphasizes the need for third-line ART in a small proportion of patients.
METHODS: Patients with oral epithelial dysplasia at one hospital were selected as the 'training set' (n = 56), whilst those at another hospital formed the 'test set' (n = 66). RNA was extracted from formalin-fixed paraffin-embedded (FFPE) diagnostic biopsies and analysed using the NanoString nCounter platform. A targeted panel of 42 genes selected for their association with oral carcinogenesis was used to develop a prognostic gene signature. Following data normalisation, uni- and multivariable analyses and prognostic modelling were employed to develop and validate the gene signature.
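The abstract does not name the modelling machinery. One common approach consistent with "uni- and multivariable analyses" is univariable Cox screening followed by a multivariable Cox model whose linear predictor serves as the risk score; a minimal sketch with the lifelines package, where the train/test DataFrames, the column names, and the p < 0.05 screening threshold are all assumptions:

```python
from lifelines import CoxPHFitter

# train/test: DataFrames with normalised expression of the 42 candidate
# genes, follow-up time ('time'), and transformation indicator ('event').
gene_cols = [c for c in train.columns if c.startswith("gene_")]

# Univariable screening: retain genes associated with transformation.
selected = []
for g in gene_cols:
    cph = CoxPHFitter()
    cph.fit(train[[g, "time", "event"]], duration_col="time", event_col="event")
    if cph.summary.loc[g, "p"] < 0.05:  # assumed threshold
        selected.append(g)

# Multivariable model on the retained genes; its linear predictor is the
# per-patient risk score, dichotomised at the training-set median.
cph = CoxPHFitter(penalizer=0.1)  # light ridge penalty for stability
cph.fit(train[selected + ["time", "event"]], duration_col="time", event_col="event")
cutoff = cph.predict_log_partial_hazard(train).median()
test["risk_score"] = cph.predict_log_partial_hazard(test)
test["high_risk"] = (test["risk_score"] > cutoff).astype(int)
```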
RESULTS: A prognostic classifier composed of 11 genes was developed using the training set. The multivariable prognostic model was used to predict patient risk scores in the test set. The prognostic gene signature was an independent predictor of malignant transformation when assessed in the test set, with the high-risk group showing worse prognosis (hazard ratio = 12.65, p = 0.0003).
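Continuing the hypothetical variables from the sketch above, the test-set validation step (a hazard ratio for the high-risk group) might look like:

```python
from lifelines import CoxPHFitter

# Hazard ratio for the high-risk group in the held-out test set
cph_val = CoxPHFitter()
cph_val.fit(test[["high_risk", "time", "event"]],
            duration_col="time", event_col="event")
print(cph_val.summary.loc["high_risk", ["exp(coef)", "p"]])  # HR and p-value
```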
CONCLUSIONS: This study demonstrates proof of principle that RNA extracted from FFPE diagnostic biopsies of oral potentially malignant disorders (OPMD), when analysed on the NanoString nCounter platform, can be used to generate a molecular classifier that stratifies the risk of malignant transformation with promising clinical utility.
METHODS: The Prospective Urban Rural Epidemiology (PURE) study is a large, epidemiological cohort study of individuals aged 35-70 years (enrolled between Jan 1, 2003, and March 31, 2013) in 18 countries with a median follow-up of 7·4 years (IQR 5·3-9·3). Dietary intake of 135 335 individuals was recorded using validated food frequency questionnaires. The primary outcomes were total mortality and major cardiovascular events (fatal cardiovascular disease, non-fatal myocardial infarction, stroke, and heart failure). Secondary outcomes were all myocardial infarctions, stroke, cardiovascular disease mortality, and non-cardiovascular disease mortality. Participants were categorised into quintiles of nutrient intake (carbohydrate, fats, and protein) based on percentage of energy provided by nutrients. We assessed the associations of carbohydrate, total fat, and each type of fat consumption with cardiovascular disease and total mortality. We calculated hazard ratios (HRs) using a multivariable Cox frailty model with random intercepts to account for centre clustering.
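A minimal sketch of the analytic skeleton described here: quintiles of percent energy from a nutrient entered as indicator variables in a Cox model. lifelines offers no shared-frailty term, so centre clustering is approximated below with cluster-robust standard errors rather than a true random intercept; all column names are assumptions:

```python
import pandas as pd
from lifelines import CoxPHFitter

# df: one row per participant with follow-up time in years ('time_years'),
# death indicator ('died'), percent energy from carbohydrate, covariates,
# and the enrolment centre.
df["carb_q"] = pd.qcut(df["carb_pct_energy"], 5, labels=False) + 1  # quintiles 1-5
model_df = pd.get_dummies(df, columns=["carb_q"], drop_first=True)  # quintile 1 = ref
covars = [c for c in model_df.columns if c.startswith("carb_q_")] + ["age", "sex_male"]

cph = CoxPHFitter()
cph.fit(
    model_df[covars + ["time_years", "died", "centre"]],
    duration_col="time_years",
    event_col="died",
    cluster_col="centre",  # sandwich SEs by centre, not a true frailty term
)
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```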
FINDINGS: During follow-up, we documented 5796 deaths and 4784 major cardiovascular disease events. Higher carbohydrate intake was associated with an increased risk of total mortality (highest [quintile 5] vs lowest quintile [quintile 1] category, HR 1·28 [95% CI 1·12-1·46], ptrend=0·0001) but not with the risk of cardiovascular disease or cardiovascular disease mortality. Intake of total fat and each type of fat was associated with lower risk of total mortality (quintile 5 vs quintile 1, total fat: HR 0·77 [95% CI 0·67-0·87], ptrend<0·0001; saturated fat: HR 0·86 [0·76-0·99], ptrend=0·0088; monounsaturated fat: HR 0·81 [0·71-0·92], ptrend<0·0001; and polyunsaturated fat: HR 0·80 [0·71-0·89], ptrend<0·0001). Higher saturated fat intake was associated with lower risk of stroke (quintile 5 vs quintile 1, HR 0·79 [95% CI 0·64-0·98], ptrend=0·0498). Total fat and saturated and unsaturated fats were not significantly associated with risk of myocardial infarction or cardiovascular disease mortality.
INTERPRETATION: High carbohydrate intake was associated with higher risk of total mortality, whereas total fat and individual types of fat were related to lower total mortality. Total fat and types of fat were not associated with cardiovascular disease, myocardial infarction, or cardiovascular disease mortality, whereas saturated fat had an inverse association with stroke. Global dietary guidelines should be reconsidered in light of these findings.
FUNDING: Full funding sources listed at the end of the paper (see Acknowledgments).
METHODS: We did a prospective cohort study (Prospective Urban Rural Epidemiology [PURE]) in 135 335 individuals aged 35 to 70 years without cardiovascular disease from 613 communities in 18 low-income, middle-income, and high-income countries in seven geographical regions: North America and Europe, South America, the Middle East, south Asia, China, southeast Asia, and Africa. We documented their diet using country-specific food frequency questionnaires at baseline. Standardised questionnaires were used to collect information about demographic factors, socioeconomic status (education, income, and employment), lifestyle (smoking, physical activity, and alcohol intake), health history and medication use, and family history of cardiovascular disease. The follow-up period varied based on the date when recruitment began at each site or country. The main clinical outcomes were major cardiovascular disease (defined as death from cardiovascular causes and non-fatal myocardial infarction, stroke, and heart failure), fatal and non-fatal myocardial infarction, fatal and non-fatal strokes, cardiovascular mortality, non-cardiovascular mortality, and total mortality. Cox frailty models with random effects were used to assess associations between fruit, vegetable, and legume consumption with risk of cardiovascular disease events and mortality.
FINDINGS: Participants were enrolled into the study between Jan 1, 2003, and March 31, 2013. For the current analysis, we included all unrefuted outcome events in the PURE study database through March 31, 2017. Overall, combined mean fruit, vegetable, and legume intake was 3·91 (SD 2·77) servings per day. During a median 7·4 years (IQR 5·5-9·3) of follow-up, 4784 major cardiovascular disease events, 1649 cardiovascular deaths, and 5796 total deaths were documented. Higher total fruit, vegetable, and legume intake was inversely associated with major cardiovascular disease, myocardial infarction, cardiovascular mortality, non-cardiovascular mortality, and total mortality in the models adjusted for age, sex, and centre (random effect). The estimates were substantially attenuated in the multivariable adjusted models for major cardiovascular disease (hazard ratio [HR] 0·90, 95% CI 0·74-1·10, ptrend=0·1301), myocardial infarction (0·99, 0·74-1·31; ptrend=0·2033), stroke (0·92, 0·67-1·25; ptrend=0·7092), cardiovascular mortality (0·73, 0·53-1·02; ptrend=0·0568), non-cardiovascular mortality (0·84, 0·68-1·04; ptrend=0·0038), and total mortality (0·81, 0·68-0·96; ptrend<0·0001). The HR for total mortality was lowest for three to four servings per day (0·78, 95% CI 0·69-0·88) compared with the reference group, with no further apparent decrease in HR with higher consumption. When examined separately, fruit intake was associated with lower risk of cardiovascular, non-cardiovascular, and total mortality, while legume intake was inversely associated with non-cardiovascular death and total mortality (in fully adjusted models). For vegetables, raw vegetable intake was strongly associated with a lower risk of total mortality, whereas cooked vegetable intake showed a modest benefit against mortality.
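The dose-response finding (lowest hazard ratio at three to four servings per day, with no further decrease above that) comes from modelling intake as servings-per-day categories against a low-intake reference rather than as a single linear term. A short sketch of that coding, with hypothetical bin edges and column names:

```python
import pandas as pd

# Hypothetical serving-per-day categories; the lowest-intake group is the
# reference, so each HR is relative to participants eating <1 serving/day.
bins = [0, 1, 2, 3, 4, 5, 8, float("inf")]
labels = ["<1", "1-2", "2-3", "3-4", "4-5", "5-8", ">8"]
df["servings_cat"] = pd.cut(df["fvl_servings_per_day"], bins=bins, labels=labels)
dummies = pd.get_dummies(df["servings_cat"], prefix="serv", drop_first=True)
# The dummies then enter the Cox model exactly like the quintile indicators
# in the earlier sketch; the study reports the lowest HR for the 3-4
# servings/day category, with no further decrease at higher intakes.
```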
INTERPRETATION: Higher fruit, vegetable, and legume consumption was associated with a lower risk of non-cardiovascular and total mortality. Benefits appear to be maximal for both non-cardiovascular mortality and total mortality at three to four servings per day (equivalent to 375-500 g/day).
FUNDING: Full funding sources listed at the end of the paper (see Acknowledgments).
METHODOLOGY: This was a retrospective analysis of all OHCA cases collected from the Pan-Asian Resuscitation Outcomes Study (PAROS) registry in 7 countries in Asia between 2009 and 2012. We included OHCA cases of presumed cardiac etiology, aged 18 years and above, in which resuscitation was attempted by EMS. We performed multivariate logistic regression analyses to assess the relationship between initial and subsequent shockable rhythm and survival and neurological outcomes. Two-stage seemingly unrelated bivariate probit models were developed to jointly model the survival and neurological outcomes. We adjusted for the clustering effect of country in all models.
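A minimal sketch of the logistic-regression arm of this analysis using statsmodels, with country-level clustering approximated by cluster-robust standard errors; the two-stage seemingly unrelated bivariate probit step is beyond this sketch, and all variable names are assumptions:

```python
import numpy as np
import statsmodels.formula.api as smf

# df: one row per OHCA case with a binary discharge-survival outcome.
model = smf.logit(
    "survived_to_discharge ~ initial_shockable + converted_shockable"
    " + age + C(sex) + witnessed + bystander_cpr",
    data=df,
)
res = model.fit(cov_type="cluster", cov_kwds={"groups": df["country"]})

# Odds ratios with 95% confidence intervals from the fitted coefficients
or_table = np.exp(res.conf_int())
or_table.columns = ["CI 2.5%", "CI 97.5%"]
or_table["OR"] = np.exp(res.params)
print(or_table)
```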
RESULTS: 40,160 OHCA cases met the inclusion criteria. There were 5,356 OHCA cases (13.3%) with an initial shockable rhythm and 33,974 (84.7%) with an initial non-shockable rhythm. After adjustment for baseline and prehospital characteristics, OHCA with an initial shockable rhythm (odds ratio [OR] 6.10, 95% confidence interval [CI] 5.06-7.34) and subsequent conversion to a shockable rhythm (OR 2.00, 95% CI 1.10-3.65) independently predicted better survival-to-hospital-discharge outcomes. Subsequent conversion to a shockable rhythm significantly improved survival to admission, survival to discharge, and post-arrest overall and cerebral performance outcomes in both the multivariate logistic regression and the two-stage analyses.
CONCLUSION: Initial shockable rhythm was the strongest predictor of survival. However, conversion to a subsequent shockable rhythm significantly improved post-arrest survival and neurological outcomes. These findings underscore the importance of early resuscitation efforts even for initially non-shockable rhythms, with implications for prognostication and for the selection of post-resuscitation therapy.
METHODS: This was a prospective, international, multicenter cohort study of out-of-hospital cardiac arrest in the Asia-Pacific. Arrests caused by trauma, patients who were not transported by emergency medical services (EMS), and pediatric out-of-hospital cardiac arrest cases (<18 years) were excluded from the analysis. Modifiable out-of-hospital factors (bystander cardiopulmonary resuscitation [CPR] and defibrillation, out-of-hospital defibrillation, advanced airway, and drug administration) were compared for all out-of-hospital cardiac arrest patients presenting to EMS and participating hospitals. The primary outcome measure was survival to hospital discharge or 30 days of hospitalization (if not discharged). We used multilevel mixed-effects logistic regression models to identify factors independently associated with out-of-hospital cardiac arrest survival, accounting for clustering within each community.
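A minimal sketch of a multilevel logistic model with a community-level random intercept, using statsmodels' variational Bayes mixed GLM (one of the few random-intercept logistic options in Python); the variable names are assumptions rather than actual PAROS fields:

```python
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# df: one row per case; 'community' identifies the clustering unit.
model = BinomialBayesMixedGLM.from_formula(
    "survived ~ bystander_cpr + response_le_8min + ohca_defib"
    " + adv_airway + drug_admin",
    vc_formulas={"community": "0 + C(community)"},
    data=df,
)
result = model.fit_vb()   # variational Bayes fit of the mixed model
print(result.summary())   # posterior means of fixed effects (log-odds scale)
```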
RESULTS: Of 66,780 out-of-hospital cardiac arrest cases reported between January 2009 and December 2012, we included 56,765 in the analysis. In the adjusted model, modifiable factors associated with improved out-of-hospital cardiac arrest outcomes included bystander CPR (odds ratio [OR] 1.43; 95% confidence interval [CI] 1.31 to 1.55), response time less than or equal to 8 minutes (OR 1.52; 95% CI 1.35 to 1.71), and out-of-hospital defibrillation (OR 2.31; 95% CI 1.96 to 2.72). Out-of-hospital advanced airway (OR 0.73; 95% CI 0.67 to 0.80) was negatively associated with out-of-hospital cardiac arrest survival.
CONCLUSION: In the PAROS cohort, bystander CPR, out-of-hospital defibrillation, and response time less than or equal to 8 minutes were positively associated with increased out-of-hospital cardiac arrest survival, whereas out-of-hospital advanced airway was associated with decreased out-of-hospital cardiac arrest survival. Developing EMS systems should focus on basic life support interventions in out-of-hospital cardiac arrest resuscitation.