METHODS: Patients initiating combination antiretroviral therapy (cART) between 2006 and 2013 were included. Treatment interruption (TI) was defined as stopping cART for >1 day. Treatment failure was defined as confirmed virological, immunological or clinical failure. Time to treatment failure during cART was analysed using Cox regression, excluding periods off treatment. Covariables with P < 0.10 in univariable analyses were included in multivariable analyses, where P < 0.05 was considered statistically significant.
RESULTS: Of 4549 patients from 13 countries in Asia, 3176 (69.8%) were male and the median age was 34 years. A total of 111 (2.4%) had TIs due to adverse events (AEs) and 135 (3.0%) had TIs for other reasons. Median interruption times were 22 days for AE and 148 days for non-AE TIs. In multivariable analyses, interruptions >30 days were associated with failure (31-180 days HR = 2.66, 95% CI 1.70-4.16; 181-365 days HR = 6.22, 95% CI 3.26-11.86; and >365 days HR = 9.10, 95% CI 4.27-19.38; all P < 0.001, compared to 0-14 days). Reasons for previous TI were not statistically significant (P = 0.158).
CONCLUSIONS: Duration of interruptions of more than 30 days was the key factor associated with large increases in subsequent risk of treatment failure. If TI is unavoidable, its duration should be minimised to reduce the risk of failure after treatment resumption.
METHODS: Data were retrieved for patients with major salivary gland cancer (SGC) diagnosed between 1988 and 2011 from the Surveillance, Epidemiology, and End Results (SEER) program.
RESULTS: We included 5446 patients with major SGC. Most patients had parotid gland cancer (84.61%). Having >18 examined lymph nodes (ELNs), >4 positive lymph nodes (PLNs), and a lymph node ratio (LNR) >33.33% was associated with worse survival. Moreover, older age, male sex, grade IV disease, distant stage, unmarried status, submandibular gland cancer, and receipt of chemotherapy without surgery were significantly associated with worse survival.
CONCLUSIONS: We demonstrated that patients with >18 ELNs, >4 PLNs, and/or an LNR >33.33% constitute a high-risk group. We strongly suggest adding ELN and PLN counts and/or the LNR to the current staging system.
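For readers reproducing these cut-offs: the lymph node ratio (LNR) is conventionally the number of positive lymph nodes divided by the number of examined lymph nodes, expressed as a percentage. A minimal sketch (the function name and the example counts are illustrative, not taken from the study data):

```python
def lymph_node_ratio(positive: int, examined: int) -> float:
    """Lymph node ratio (LNR): positive nodes / examined nodes, as a percentage."""
    if examined <= 0:
        raise ValueError("examined node count must be positive")
    if positive > examined:
        raise ValueError("positive nodes cannot exceed examined nodes")
    return 100.0 * positive / examined

# Illustrative case: 6 of 18 examined nodes positive lands at the
# reported ~33.33% threshold; more positives would exceed it.
print(lymph_node_ratio(6, 18))
```

With the abstract's cut-offs, a patient would fall into the high-risk group on any of the three criteria: >18 ELNs, >4 PLNs, or an LNR above 33.33%.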
METHODS: A total of 715 incident Parkinson's disease (PD) cases were ascertained in a cohort of 220 494 individuals from NeuroEPIC4PD, a prospective European population-based cohort study including 13 centres in eight countries. Smoking habits were recorded at recruitment. We analysed smoking status, duration, intensity, and exposure to passive smoking in relation to PD onset.
RESULTS: Former smokers had a 20% decreased risk, and current smokers a halved risk, of developing PD compared with never smokers. Strong dose-response relationships with smoking intensity and duration were found. Compared with never smokers, hazard ratios (HRs) were 0.84 [95% confidence interval (CI) 0.67-1.07] for smoking <20 years, 0.73 (95% CI 0.56-0.96) for 20-29 years, and 0.54 (95% CI 0.43-0.36) for >30 years. The proportional hazards assumption was verified, showing no change of risk over time and arguing against a delaying effect. The consistency of dose-response relationships among former and current smokers argues against reverse causality, and the inverse association between passive smoking and PD (HR 0.70, 95% CI 0.49-0.99) makes unmeasured confounding unlikely.
CONCLUSIONS: These results are highly suggestive of a true causal link between smoking and PD, although it remains unclear which chemical compound in cigarette smoke is responsible for the biological effect.
METHODS: The Norfolk (UK)-based European Prospective Investigation into Cancer (EPIC-Norfolk) recruited 25,639 participants between 1993 and 1997. FEV1, measured by portable spirometry, was categorized into sex-specific quintiles. Mortality and morbidity from all causes, cardiovascular disease (CVD) and respiratory disease were collected from 1997 up to 2015. Cox proportional hazards regression was used with adjustment for socio-economic factors, physical activity and co-morbidities.
RESULTS: Mean age of the population was 58.7 ± 9.3 years; mean FEV1 was 294 ± 74 cL/s for men and 214 ± 52 cL/s for women. The adjusted hazard ratios for all-cause mortality for participants in the highest FEV1 quintile were 0.63 (0.52, 0.76) for men and 0.62 (0.51, 0.76) for women compared with the lowest quintile. Adjusted HRs for every 70 cL/s increase in FEV1 among men and women were 0.77 (p < 0.001) and 0.68 (p < 0.001) for total mortality, 0.85 (p < 0.001) and 0.77 (p < 0.001) for CVD, and 0.52 (p < 0.001) and 0.42 (p < 0.001) for respiratory disease.
CONCLUSIONS: Participants with higher FEV1 levels had a lower risk of CVD and all-cause mortality. FEV1 measured with a portable handheld spirometer may serve as a surrogate marker for cardiovascular risk. Every effort should be made to identify those with poorer lung function, even in the absence of cardiovascular disease, as they are at greater risk of total and CV mortality.
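Because a Cox model is log-linear in the covariate, a per-unit hazard ratio scales multiplicatively to larger increments: the HR for a k-unit change is the per-unit HR raised to the power k. A quick illustrative check using the men's total-mortality estimate from the abstract (0.77 per 70 cL/s of FEV1; the helper name is ours, not the paper's):

```python
import math

def scaled_hr(hr_per_unit: float, k: float) -> float:
    # Cox models assume a log-linear hazard, so a k-unit change
    # multiplies the log-hazard ratio by k: HR_k = exp(k * ln(HR_1)) = HR_1 ** k.
    return math.exp(k * math.log(hr_per_unit))

# HR = 0.77 per 70 cL/s increase in FEV1 (men, total mortality), so the
# implied HR for a 140 cL/s increase is 0.77 squared.
print(round(scaled_hr(0.77, 2), 3))  # → 0.593
```

The same arithmetic explains why the reported highest-versus-lowest-quintile contrasts are stronger than the per-70 cL/s estimates: they span several units of exposure.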
METHODS: We did a prospective cohort study (Prospective Urban Rural Epidemiology [PURE]) in 135 335 individuals aged 35 to 70 years without cardiovascular disease from 613 communities in 18 low-income, middle-income, and high-income countries in seven geographical regions: North America and Europe, South America, the Middle East, south Asia, China, southeast Asia, and Africa. We documented their diet using country-specific food frequency questionnaires at baseline. Standardised questionnaires were used to collect information about demographic factors, socioeconomic status (education, income, and employment), lifestyle (smoking, physical activity, and alcohol intake), health history and medication use, and family history of cardiovascular disease. The follow-up period varied based on the date when recruitment began at each site or country. The main clinical outcomes were major cardiovascular disease (defined as death from cardiovascular causes and non-fatal myocardial infarction, stroke, and heart failure), fatal and non-fatal myocardial infarction, fatal and non-fatal strokes, cardiovascular mortality, non-cardiovascular mortality, and total mortality. Cox frailty models with random effects were used to assess associations of fruit, vegetable, and legume consumption with risk of cardiovascular disease events and mortality.
FINDINGS: Participants were enrolled into the study between Jan 1, 2003, and March 31, 2013. For the current analysis, we included all unrefuted outcome events in the PURE study database through March 31, 2017. Overall, combined mean fruit, vegetable, and legume intake was 3·91 (SD 2·77) servings per day. During a median 7·4 years (5·5-9·3) of follow-up, 4784 major cardiovascular disease events, 1649 cardiovascular deaths, and 5796 total deaths were documented. Higher total fruit, vegetable, and legume intake was inversely associated with major cardiovascular disease, myocardial infarction, cardiovascular mortality, non-cardiovascular mortality, and total mortality in the models adjusted for age, sex, and centre (random effect). The estimates were substantially attenuated in the multivariable adjusted models for major cardiovascular disease (hazard ratio [HR] 0·90, 95% CI 0·74-1·10; ptrend=0·1301), myocardial infarction (0·99, 0·74-1·31; ptrend=0·2033), stroke (0·92, 0·67-1·25; ptrend=0·7092), cardiovascular mortality (0·73, 0·53-1·02; ptrend=0·0568), non-cardiovascular mortality (0·84, 0·68-1·04; ptrend=0·0038), and total mortality (0·81, 0·68-0·96; ptrend<0·0001). The HR for total mortality was lowest for three to four servings per day (0·78, 95% CI 0·69-0·88) compared with the reference group, with no further apparent decrease in HR with higher consumption. When examined separately, fruit intake was associated with lower risk of cardiovascular, non-cardiovascular, and total mortality, while legume intake was inversely associated with non-cardiovascular death and total mortality (in fully adjusted models). For vegetables, raw vegetable intake was strongly associated with a lower risk of total mortality, whereas cooked vegetable intake showed a modest benefit against mortality.
INTERPRETATION: Higher fruit, vegetable, and legume consumption was associated with a lower risk of non-cardiovascular and total mortality. Benefits appear to be maximal for both non-cardiovascular mortality and total mortality at three to four servings per day (equivalent to 375-500 g/day).
FUNDING: Full funding sources listed at the end of the paper (see Acknowledgments).
METHODS: We studied 522 patients who underwent mastectomy between 1998 and 2002 and followed them up until 2008. We defined post-mastectomy locoregional recurrence (PMLRR) as recurrence in the axilla, supraclavicular nodes and/or chest wall. Isolated locoregional recurrence (ILR) was defined as PMLRR occurring as an isolated event. Prognostic factors for locoregional recurrence were determined using the Cox proportional hazards regression model.
RESULTS: The overall PMLRR rate was 16.4%. ILR developed in 42 of 522 patients (8.0%). Within this subgroup, 25 (59.5%) remained disease free after treatment while 17 (40.5%) suffered disease progression. Univariate analyses identified race, age, size, stage, margin involvement, lymph node involvement, grade, lymphovascular invasion and ER status as probable prognostic factors for ILR. Cox regression identified only stage III disease and margin involvement as independent prognostic factors. The hazard of ILR was 2.5 times higher when the margins were involved than when they were clear (aHRR 2.5; 95% CI 1.3 to 5.0). Similarly, compared with stage I, those with stage II (aHRR 2.1; 95% CI 0.6 to 6.8) and stage III (aHRR 4.6; 95% CI 1.4 to 15.9) disease had a worse prognosis for ILR.
CONCLUSION: Margin involvement and stage III disease were identified as independent prognostic factors for ILR. Close follow-up of high-risk patients and prompt treatment of locoregional recurrence are recommended.
METHOD: A historical cohort of 986 premenopausal and 1123 postmenopausal parous breast cancer patients diagnosed from 2001 to 2012 at the University Malaya Medical Centre was included in the analyses. Time since last childbirth (LCB) was categorized into quintiles. Multivariable Cox regression was used to determine whether time since LCB was associated with survival following breast cancer, adjusting for demographic, tumor, and treatment characteristics.
RESULTS: Premenopausal breast cancer patients with the most recent childbirth (LCB quintile 1) were younger, more likely to present with unfavorable prognostic profiles, and had the lowest 5-year overall survival (OS) (66.9%; 95% CI 60.2-73.6%) compared with women with a longer duration since LCB (quintiles 2 through 5). In univariable analysis, time since LCB was inversely associated with risk of mortality; hazard ratios for LCB quintiles 2, 3, 4, and 5 versus quintile 1 were 0.53 (95% CI 0.36-0.77), 0.49 (95% CI 0.33-0.75), 0.61 (95% CI 0.43-0.85), and 0.64 (95% CI 0.44-0.93), respectively; P trend = 0.016. However, this association was attenuated substantially following adjustment for age at diagnosis and other prognostic factors. Similarly, postmenopausal breast cancer patients with the most recent childbirth were also more likely to present with unfavorable disease profiles. Compared with postmenopausal breast cancer patients in LCB quintile 1, patients in quintile 5 had a higher risk of mortality. This association was not significant following multivariable adjustment.
CONCLUSION: Time since LCB is not independently associated with survival in premenopausal or postmenopausal breast cancer. The apparent increases in risk of mortality among premenopausal patients with a recent childbirth, and among postmenopausal patients with a longer duration since LCB, appear to be largely explained by their age at diagnosis.
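For readers implementing a similar analysis, categorizing a continuous exposure such as time since LCB into quintiles amounts to cutting at the 20th, 40th, 60th and 80th percentiles of the observed values. A minimal sketch using only the Python standard library (the data values below are made up for illustration, not the study's):

```python
import bisect
import statistics

def quintile_of(value: float, cuts: list[float]) -> int:
    # Map a value to quintile 1-5 using the four interior cut points.
    return bisect.bisect_right(cuts, value) + 1

# Hypothetical years-since-last-childbirth values, for illustration only.
years_since_lcb = [1, 2, 3, 4, 6, 8, 10, 12, 15, 20]

# statistics.quantiles with n=5 returns the four interior cut points
# (20th/40th/60th/80th percentiles, exclusive method by default).
cuts = statistics.quantiles(years_since_lcb, n=5)
groups = [quintile_of(v, cuts) for v in years_since_lcb]
print(cuts)    # four cut points
print(groups)  # quintile label per patient
```

The quintile labels would then enter the Cox model as a categorical covariate, with quintile 1 (most recent childbirth) as the reference group, mirroring the comparison reported above.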