  1. Loeliger KB, Altice FL, Ciarleglio MM, Rich KM, Chandra DK, Gallagher C, et al.
    Lancet HIV, 2018 11;5(11):e617-e628.
    PMID: 30197101 DOI: 10.1016/S2352-3018(18)30175-9
    BACKGROUND: People transitioning from prisons or jails have high mortality, but data are scarce for people with HIV and no studies have integrated data from both criminal justice and community settings. We aimed to assess all-cause mortality in people with HIV released from an integrated system of prisons and jails in Connecticut, USA.

    METHODS: We linked pharmacy, custodial, death, case management, and HIV surveillance data from Connecticut Departments of Correction and Public Health to create a retrospective cohort of all adults with HIV released from jails and prisons in Connecticut between 2007 and 2014. We compared the mortality rate of adults with HIV released from incarceration with the general US and Connecticut populations, and modelled time-to-death from any cause after prison release with Cox proportional hazard models.

    FINDINGS: We identified 1350 people with HIV who were released after 24 h or more of incarceration between 2007 and 2014, of whom 184 (14%) died after index release; median age was 45 years (IQR 39-50) and median follow-up was 5·2 years (IQR 3·0-6·7) after index release. The crude mortality rate for people with HIV released from incarceration was 2868 deaths per 100 000 person-years, and the standardised mortality ratio showed that mortality was higher for this cohort than the general US population (6·97, 95% CI 5·96-7·97) and population of Connecticut (8·47, 7·25-9·69). Primary cause of death was reported for 170 individuals; the most common causes were HIV/AIDS (78 [46%]), drug overdose (26 [15%]), liver disease (17 [10%]), cardiovascular disease (16 [9%]), and accidental injury or suicide (13 [8%]). Black race (adjusted hazard ratio [HR] 0·52, 95% CI 0·34-0·80), having health insurance (0·09, 0·05-0·17), being re-incarcerated at least once for 365 days or longer (0·41, 0·22-0·76), and having a high percentage of re-incarcerations in which antiretroviral therapy was prescribed (0·08, 0·03-0·21) were protective against mortality. Positive predictors of time-to-death were age (≥50 years; adjusted HR 3·65, 95% CI 1·21-11·08), lower CD4 count (200-499 cells per μL, 2·54, 1·50-4·31; <200 cells per μL, 3·44, 1·90-6·20), a high number of comorbidities (1·86, 95% CI 1·23-2·82), virological failure (2·76, 1·94-3·92), and unmonitored viral load (2·13, 1·09-4·18).

    INTERPRETATION: To reduce mortality after release from incarceration in people with HIV, resources are needed to identify and treat HIV, in addition to medical comorbidities, psychiatric disorders, and substance use disorders, during and following incarceration. Policies that reduce incarceration and support integrated systems of care between prisons and communities could have a substantial effect on the survival of people with HIV.

    FUNDING: US National Institutes of Health.

    Matched MeSH terms: Proportional Hazards Models
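
An aside on the method above: the standardised mortality ratio (SMR) is simply observed deaths divided by the deaths expected under reference-population rates. A minimal Python sketch of that arithmetic, reusing the abstract's observed counts; the expected-death figure is a hypothetical placeholder, since the age- and sex-specific reference rates are not given here.

```python
# Crude rate and SMR arithmetic as reported in the abstract above.
# Observed deaths and the crude rate come from the abstract; person-years are
# back-computed from them, and expected_deaths is a hypothetical placeholder.
observed_deaths = 184
crude_rate_per_100k = 2868
person_years = observed_deaths / crude_rate_per_100k * 100_000  # ~6,415 p-y

expected_deaths = 26.4  # hypothetical: sum of reference-population rates x p-y
smr = observed_deaths / expected_deaths  # ~6.97, matching the US comparison

print(f"{person_years:.0f} person-years; SMR = {smr:.2f}")
```
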
  2. Zaini ZM, McParland H, Møller H, Husband K, Odell EW
    Sci Rep, 2018 10 26;8(1):15874.
    PMID: 30367100 DOI: 10.1038/s41598-018-34165-5
    The value of image cytometry DNA ploidy analysis and dysplasia grading for predicting malignant transformation has been determined in oral lesions considered to be at 'high' risk on the basis of clinical information and biopsy results. Ten-year follow-up data for 259 sequential patients with oral lesions clinically at 'high' risk of malignant transformation were matched to cancer registry and local pathology database records of malignant outcomes, ploidy results and histological dysplasia grades. In multivariate analysis (n = 228 patients), 24 patients developed carcinoma; of these, 14 had aneuploid prior biopsy samples. Aneuploidy was a significant predictor (hazard ratio 7.92; 95% CI 3.45, 18.17) compared with diploidy (p
    Matched MeSH terms: Proportional Hazards Models
  3. Mat Bah MN, Sapian MH, Jamil MT, Alias A, Zahari N
    Pediatr Cardiol, 2018 Oct;39(7):1389-1396.
    PMID: 29756159 DOI: 10.1007/s00246-018-1908-6
    Critical congenital heart disease (CCHD) is associated with significant morbidity and mortality. However, data on survival of CCHD and the risk factors associated with its mortality are limited. This study examined CCHD survival and the risk factors for CCHD mortality. Using a retrospective cohort study of infants born with CCHD from 2006 to 2015, survival over 10 years was estimated using Kaplan-Meier analysis, and the risk factors for mortality were analyzed using multivariate Cox proportional hazards regression. A total of 491 CCHD cases were included in the study, with an overall mortality rate of 34.8% (95% confidence interval [CI] 30.6-39.2). The intervention/surgical mortality rate was 9.8% ≤ 30 days and 11.5% > 30 days after surgery, and 17% died before surgery or intervention. The median age at death was 2.7 months [first quartile: 1 month, third quartile: 7.3 months]. The CCHD survival rate was 90.4% (95% CI 89-91.8%) at 1 month, 69.3% (95% CI 67.2-71.4%) at 1 year, 63.4% (95% CI 61.1-65.7%) at 5 years, and 61.4% (95% CI 58.9-63.9%) at 10 years. Weight of
    Matched MeSH terms: Proportional Hazards Models
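
The survival-at-timepoint figures above are Kaplan-Meier estimates. A minimal sketch with the lifelines library, using synthetic stand-in data (the study's individual follow-up records are not available here):

```python
# Kaplan-Meier survival estimates at fixed timepoints, as in the entry above.
# Synthetic data; a real analysis would use each infant's follow-up record.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(42)
months = rng.exponential(scale=80, size=491)  # follow-up time in months
died = rng.random(491) < 0.35                 # True = death observed

kmf = KaplanMeierFitter()
kmf.fit(months, event_observed=died)

# Survival at 1 month and at 1, 5, and 10 years
print(kmf.survival_function_at_times([1, 12, 60, 120]))
```
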
  4. Gan CS, Wong JJ, Samransamruajkit R, Chuah SL, Chor YK, Qian S, et al.
    Pediatr Crit Care Med, 2018 10;19(10):e504-e513.
    PMID: 30036234 DOI: 10.1097/PCC.0000000000001667
    OBJECTIVES: Extrapulmonary pediatric acute respiratory distress syndrome and pulmonary pediatric acute respiratory distress syndrome are poorly described in the literature. We aimed to describe and compare the epidemiology, risk factors for mortality, and outcomes in extrapulmonary pediatric acute respiratory distress syndrome and pulmonary pediatric acute respiratory distress syndrome.

    DESIGN: This is a secondary analysis of a multicenter, retrospective, cohort study. Data on epidemiology, ventilation, therapies, and outcomes were collected and analyzed. Patients were classified into two mutually exclusive groups (extrapulmonary pediatric acute respiratory distress syndrome and pulmonary pediatric acute respiratory distress syndrome) based on etiologies. Primary outcome was PICU mortality. Cox proportional hazard regression was used to identify risk factors for mortality.

    SETTING: Ten multidisciplinary PICUs in Asia.

    PATIENTS: Mechanically ventilated children meeting the Pediatric Acute Lung Injury Consensus Conference criteria for pediatric acute respiratory distress syndrome between 2009 and 2015.

    INTERVENTIONS: None.

    MEASUREMENTS AND MAIN RESULTS: Forty-one of 307 patients (13.4%) and 266 of 307 patients (86.6%) were classified into the extrapulmonary and pulmonary pediatric acute respiratory distress syndrome groups, respectively. The most common causes of extrapulmonary and pulmonary pediatric acute respiratory distress syndrome were sepsis (82.9%) and pneumonia (91.7%), respectively. Children with extrapulmonary pediatric acute respiratory distress syndrome were older, had higher admission severity scores, and had a greater proportion of organ dysfunction compared with the pulmonary group. Patients in the extrapulmonary group had higher mortality (48.8% vs 24.8%; p = 0.002) and fewer ventilator-free days (median 2.0 d [interquartile range 0.0-18.0 d] vs 19.0 d [0.5-24.0 d]; p = 0.001) compared with the pulmonary group. After adjusting for site, severity of illness, comorbidities, multiple organ dysfunction, and severity of acute respiratory distress syndrome, extrapulmonary etiology was not associated with mortality (adjusted hazard ratio, 1.56 [95% CI, 0.90-2.71]).

    CONCLUSIONS: Patients with extrapulmonary pediatric acute respiratory distress syndrome were sicker and had poorer clinical outcomes. However, after adjusting for confounders, extrapulmonary etiology was not an independent risk factor for mortality.

    Matched MeSH terms: Proportional Hazards Models
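
The adjusted hazard ratio above comes from a Cox proportional hazards model with aetiology and covariates entered together. A hedged lifelines sketch of that structure; the toy data and column names are invented, not the study's:

```python
# Adjusted Cox proportional hazards model, as in the entry above: hazard of
# death by ARDS aetiology, adjusting for illness severity. Data are invented.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "days":           [5, 20, 14, 7, 28, 16, 9, 24],
    "died":           [1, 0, 1, 1, 0, 1, 0, 0],
    "extrapulmonary": [1, 0, 1, 0, 0, 1, 1, 0],  # 1 = extrapulmonary ARDS
    "severity":       [24, 9, 18, 11, 19, 16, 12, 10],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days", event_col="died")
print(cph.hazard_ratios_)  # adjusted HR for each covariate
```
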
  5. Naudin S, Li K, Jaouen T, Assi N, Kyrø C, Tjønneland A, et al.
    Int J Cancer, 2018 Aug 15;143(4):801-812.
    PMID: 29524225 DOI: 10.1002/ijc.31367
    Recent evidence suggested a weak relationship between alcohol consumption and pancreatic cancer (PC) risk. In our study, the association between lifetime and baseline alcohol intakes and the risk of PC was evaluated, including the type of alcoholic beverage and potential interaction with smoking. Within the European Prospective Investigation into Cancer and Nutrition (EPIC) study, 1,283 incident PC cases (57% women) were diagnosed among 476,106 cancer-free participants, followed up for 14 years. Amounts of lifetime and baseline alcohol were estimated through lifestyle and dietary questionnaires, respectively. Cox proportional hazard models with age as the primary time variable were used to estimate PC hazard ratios (HR) and their 95% confidence intervals (CI). Alcohol intake was positively associated with PC risk in men. Associations were mainly driven by extreme alcohol levels, with HRs comparing heavy drinkers (>60 g/day) to the reference category (0.1-4.9 g/day) equal to 1.77 (95% CI: 1.06, 2.95) and 1.63 (95% CI: 1.16, 2.29) for lifetime and baseline alcohol, respectively. Baseline alcohol intakes from beer (>40 g/day) and spirits/liquors (>10 g/day) showed HRs equal to 1.58 (95% CI: 1.07, 2.34) and 1.41 (95% CI: 1.03, 1.94), respectively, compared to the reference category (0.1-2.9 g/day). In women, HR estimates did not reach statistical significance. The alcohol-PC risk association was not modified by smoking status. Findings from this large prospective study suggest that baseline and lifetime alcohol intakes were positively associated with PC risk, with more apparent risk estimates for beer and spirits/liquors than for wine intake.
    Matched MeSH terms: Proportional Hazards Models
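
A distinctive modelling choice above is using age as the primary time variable, so each subject enters the risk set at their age at recruitment (left truncation). A sketch of that setup, assuming lifelines' entry_col handles the delayed entry; all data and column names are invented:

```python
# Cox model with age as the time scale: entry_col marks the age at which each
# subject comes under observation (left truncation). Invented data.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "age_entry":    [52, 60, 45, 58, 63, 49, 55, 61],
    "age_exit":     [66, 71, 60, 64, 75, 61, 70, 68],
    "event":        [1, 0, 0, 1, 1, 0, 1, 0],
    "heavy_intake": [1, 0, 1, 1, 0, 0, 1, 0],  # >60 g/day vs reference
})

cph = CoxPHFitter()
cph.fit(df, duration_col="age_exit", event_col="event", entry_col="age_entry")
print(cph.summary["exp(coef)"])  # HR for heavy intake
```
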
  6. Biccard BM, Scott DJA, Chan MTV, Archbold A, Wang CY, Sigamani A, et al.
    Ann Surg, 2018 08;268(2):357-363.
    PMID: 28486392 DOI: 10.1097/SLA.0000000000002290
    OBJECTIVE: To determine the prognostic relevance, clinical characteristics, and 30-day outcomes associated with myocardial injury after noncardiac surgery (MINS) in vascular surgical patients.

    BACKGROUND: MINS has been independently associated with 30-day mortality after noncardiac surgery. The characteristics and prognostic importance of MINS in vascular surgery patients are poorly described.

    METHODS: This was an international prospective cohort study of 15,102 noncardiac surgery patients 45 years or older, of whom 502 patients underwent vascular surgery. All patients had fourth-generation plasma troponin T (TnT) concentrations measured during the first 3 postoperative days. MINS was defined as a TnT concentration of 0.03 ng/mL or higher secondary to ischemia. The objectives of the present study were to determine (i) whether MINS is prognostically important in vascular surgical patients, (ii) the clinical characteristics of vascular surgery patients with and without MINS, (iii) the 30-day outcomes for vascular surgery patients with and without MINS, and (iv) the proportion of MINS that probably would have gone undetected without routine troponin monitoring.

    RESULTS: The incidence of MINS in the vascular surgery patients was 19.1% (95% confidence interval [CI], 15.7%-22.6%). Thirty-day all-cause mortality in the vascular cohort was 12.5% (95% CI, 7.3%-20.6%) in patients with MINS compared with 1.5% (95% CI, 0.7%-3.2%) in patients without MINS (P < 0.001). MINS was independently associated with 30-day mortality in vascular patients (odds ratio, 9.48; 95% CI, 3.46-25.96). The 30-day mortality was similar in MINS patients with an ischemic feature (15.0%; 95% CI, 7.1-29.1) and without one (12.2%; 95% CI, 5.3-25.5; P = 0.76). The proportion of vascular surgery patients who suffered MINS without overt evidence of myocardial ischemia was 74.1% (95% CI, 63.6-82.4).

    CONCLUSIONS: Approximately 1 in 5 patients experienced MINS after vascular surgery. MINS was independently associated with 30-day mortality. The majority of patients with MINS were asymptomatic and would have gone undetected without routine postoperative troponin measurement.

    Matched MeSH terms: Proportional Hazards Models
  7. Raman P, Suliman NB, Zahari M, Kook M, Ramli N
    Eye (Lond), 2018 07;32(7):1183-1189.
    PMID: 29491486 DOI: 10.1038/s41433-018-0057-8
    OBJECTIVE: To assess the relationship between baseline intraocular pressure (IOP), blood pressure (BP) and ocular perfusion pressure (OPP), and the 5-year visual field (VF) progression in normal-tension glaucoma (NTG) patients.

    DESIGN: Prospective, longitudinal study.

    METHODS: Sixty-five NTG patients who were followed up for 5 years were included in this study. All the enrolled patients underwent baseline 24-h IOP and BP monitoring via 2-hourly measurements in their habitual position and were followed up over 5 years with reliable VF tests. Modified Anderson criteria were used to assess VF progression. Univariable and multivariable analyses using Cox proportional hazards models were used to identify the systemic and clinical risk factors that predict progression. Kaplan-Meier survival analyses were used to compare the time elapsed to confirmed VF progression in the presence or absence of each potential risk factor.

    RESULTS: At the 5-year follow-up, 35.4% of the enrolled patients demonstrated visual field progression. Mean diastolic blood pressure and diastolic ocular perfusion pressure (DOPP) were significantly lower in patients who progressed, and Kaplan-Meier analysis showed that progression was significantly more likely with nocturnal DOPP below 43.7 mmHg (log-rank p = 0.018).

    CONCLUSION: Diastolic parameters of BP and OPP were significantly lower in the NTG patients who progressed after 5 years. Low nocturnal DOPP is an independent predictor of glaucomatous visual field progression in NTG patients.

    Matched MeSH terms: Proportional Hazards Models
  8. Shehabi Y, Bellomo R, Kadiman S, Ti LK, Howe B, Reade MC, et al.
    Crit Care Med, 2018 06;46(6):850-859.
    PMID: 29498938 DOI: 10.1097/CCM.0000000000003071
    OBJECTIVES: In the absence of a universal definition of light or deep sedation, the level of sedation that conveys favorable outcomes is unknown. We quantified the relationship between escalating intensity of sedation in the first 48 hours of mechanical ventilation and 180-day survival, time to extubation, and delirium.

    DESIGN: Harmonized data from prospective multicenter international longitudinal cohort studies.

    SETTING: Diverse mix of ICUs.

    PATIENTS: Critically ill patients expected to be ventilated for longer than 24 hours.

    INTERVENTIONS: Richmond Agitation Sedation Scale and pain were assessed every 4 hours. Delirium and mobilization were assessed daily using the Confusion Assessment Method of ICU and a standardized mobility assessment, respectively.

    MEASUREMENTS AND MAIN RESULTS: Sedation intensity was assessed using a Sedation Index, calculated as the sum of negative Richmond Agitation Sedation Scale measurements divided by the total number of assessments. We used multivariable Cox proportional hazard models to adjust for relevant covariates. We performed subgroup and sensitivity analyses accounting for immortal time bias using the same variables within 120 and 168 hours. The main outcome was 180-day survival. We assessed 703 patients in 42 ICUs with a mean (SD) Acute Physiology and Chronic Health Evaluation II score of 22.2 (8.5); 180-day mortality was 32.3% (227 patients). The median (interquartile range) ventilation time was 4.54 days (2.47-8.43 d). Delirium occurred in 273 (38.8%) patients. In an escalating dose-dependent relationship, sedation intensity independently predicted an increased risk of death (hazard ratio [HR] 1.29; 95% CI, 1.15-1.46; p < 0.001), delirium (HR 1.25; 95% CI, 1.10-1.43; p = 0.001), and a reduced chance of early extubation (HR 0.80; 95% CI, 0.73-0.87; p < 0.001). Agitation level independently predicted subsequent delirium (HR 1.25; 95% CI, 1.04-1.49; p = 0.02). Delirium or mobilization episodes within 168 hours, adjusted for sedation intensity, were not associated with survival.

    CONCLUSIONS: Sedation intensity independently, in an ascending relationship, predicted increased risk of death, delirium, and delayed time to extubation. These observations suggest that keeping sedation level equivalent to a Richmond Agitation Sedation Scale 0 is a clinically desirable goal.

    Matched MeSH terms: Proportional Hazards Models
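
The Sedation Index above has an explicit definition: the sum of negative Richmond Agitation Sedation Scale (RASS) measurements divided by the total number of assessments. A direct transcription in Python, with invented example scores; per the abstract's wording, more negative values mean deeper sedation:

```python
# Sedation Index per the entry above: sum of negative RASS measurements divided
# by the total number of assessments (4-hourly over the first 48 h).
def sedation_index(rass_scores):
    negative_sum = sum(s for s in rass_scores if s < 0)
    return negative_sum / len(rass_scores)

# Twelve 4-hourly assessments over 48 hours (invented scores)
scores = [-3, -4, -2, -1, 0, -2, -3, -1, 0, 1, -2, -1]
print(sedation_index(scores))  # more negative = greater sedation intensity
```
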
  9. Yeoh AE, Li Z, Dong D, Lu Y, Jiang N, Trka J, et al.
    Br J Haematol, 2018 Jun;181(5):653-663.
    PMID: 29808917 DOI: 10.1111/bjh.15252
    Accurate risk assignment in childhood acute lymphoblastic leukaemia is essential to avoid under- or over-treatment. We hypothesized that time-series gene expression profiles (GEPs) of bone marrow samples during remission-induction therapy can measure the response and be used for relapse prediction. We computed the time-series changes from diagnosis to Day 8 of remission-induction, termed the Effective Response Metric (ERM-D8), and tested its ability to predict relapse against contemporary risk assignment methods, including National Cancer Institute (NCI) criteria, genetics and minimal residual disease (MRD). ERM-D8 was trained on a set of 131 patients and validated on an independent set of 79 patients. In the independent blinded test set, unfavourable ERM-D8 patients had a >3-fold increased risk of relapse compared to favourable ERM-D8 patients (5-year cumulative incidence of relapse 38·1% vs. 10·6%; P = 2·5 × 10⁻³). ERM-D8 remained predictive of relapse (P = 0·05; hazard ratio 4·09, 95% confidence interval [CI] 1·03-16·23) after adjusting for NCI criteria, genetics, Day 8 peripheral response and Day 33 MRD. ERM-D8 improved risk stratification in favourable genetics subgroups (P = 0·01) and Day 33 MRD-positive patients (P = 1·7 × 10⁻³). We conclude that our novel metric, ERM-D8, based on time-series GEP after 8 days of remission-induction therapy, can independently predict relapse even after adjusting for NCI risk, genetics, Day 8 peripheral blood response and MRD.
    Matched MeSH terms: Proportional Hazards Models
  10. Sathasivam HP, Davies GR, Boyd NM
    Head Neck, 2018 Jan;40(1):46-54.
    PMID: 29149496 DOI: 10.1002/hed.24907
    BACKGROUND: Osteoradionecrosis of the jaw (ORNJ) is a well-recognized complication of radiotherapy. The purpose of this study was to assess predictive factors for the development of ORNJ.

    METHODS: A retrospective study of 325 patients with head and neck squamous cell carcinoma (HNSCC) treated at one institution between January 1, 1999, and December 31, 2008, was conducted. Outcome measure was the presence/absence of ORNJ. Time to event was recorded and Cox proportional hazard regression analysis was used to determine statistically significant predictive factors.

    RESULTS: Fifty-nine patients had ORNJ. Cox regression analysis identified several statistically significant predictive factors: dentoalveolar surgery; peri-resective surgery of the jaw; continued tobacco usage after radiotherapy; diabetes mellitus type 2 (DM2); and total radiation dose.

    CONCLUSION: Patients at greater risk of developing ORNJ can be identified and measures can be instituted to reduce its incidence and expedite management when it does occur.

    Matched MeSH terms: Proportional Hazards Models
  11. Law KB, Chang KM, Hamzah NA, Ng KH, Ong TC
    Indian J Hematol Blood Transfus, 2017 Dec;33(4):483-491.
    PMID: 29075058 DOI: 10.1007/s12288-017-0790-3
    The study aimed to investigate the effect of consolidation treatment with fludarabine, high-dose cytarabine and granulocyte colony-stimulating factor (FLAG) in older AML patients. The study included 41 eligible patients above 54 years old who received both induction and consolidation chemotherapy for AML from 2008 to 2013. The study cohort had a minimum 24-month follow-up period. Survival analysis was carried out to assess patients' overall survival and disease-free survival based on the type of consolidation regimen. Consolidation treatment with FLAG had a protective effect on both overall survival and disease-free survival in older patients. Patients consolidated with the FLAG regimen had significantly longer overall survival (log-rank, p = 0.0025) and disease-free survival (log-rank, p = 0.0026). Median overall survival was longer with FLAG (18.70 months) than without (8.09 months), as was median disease-free survival (13.84 vs. 4.44 months). Cox regression yielded hazard ratios of 0.245 (p = 0.0094) for overall survival and 0.217 (p = 0.0068) for disease-free survival; that is, the use of FLAG as consolidation treatment was associated with an approximately 60-80% reduction in hazard rates, adjusted for age, race and gender. Older AML patients had longer remission and survival when consolidated with the FLAG regimen after induction chemotherapy.
    Matched MeSH terms: Proportional Hazards Models
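
The FLAG versus non-FLAG survival comparisons above rest on log-rank tests. A minimal sketch with lifelines; the two small samples are invented:

```python
# Log-rank comparison of two consolidation groups, as in the entry above.
from lifelines.statistics import logrank_test

flag_months    = [30, 18, 25, 40, 12, 22]   # invented survival times (months)
flag_died      = [0, 1, 0, 0, 1, 1]
nonflag_months = [8, 5, 14, 9, 4, 11]
nonflag_died   = [1, 1, 1, 0, 1, 1]

result = logrank_test(flag_months, nonflag_months,
                      event_observed_A=flag_died, event_observed_B=nonflag_died)
print(result.p_value)
```
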
  12. Miller V, Mente A, Dehghan M, Rangarajan S, Zhang X, Swaminathan S, et al.
    Lancet, 2017 Nov 04;390(10107):2037-2049.
    PMID: 28864331 DOI: 10.1016/S0140-6736(17)32253-5
    BACKGROUND: The association between intake of fruits, vegetables, and legumes with cardiovascular disease and deaths has been investigated extensively in Europe, the USA, Japan, and China, but little or no data are available from the Middle East, South America, Africa, or south Asia.

    METHODS: We did a prospective cohort study (Prospective Urban Rural Epidemiology [PURE]) in 135 335 individuals aged 35 to 70 years without cardiovascular disease from 613 communities in 18 low-income, middle-income, and high-income countries in seven geographical regions: North America and Europe, South America, the Middle East, south Asia, China, southeast Asia, and Africa. We documented their diet using country-specific food frequency questionnaires at baseline. Standardised questionnaires were used to collect information about demographic factors, socioeconomic status (education, income, and employment), lifestyle (smoking, physical activity, and alcohol intake), health history and medication use, and family history of cardiovascular disease. The follow-up period varied based on the date when recruitment began at each site or country. The main clinical outcomes were major cardiovascular disease (defined as death from cardiovascular causes and non-fatal myocardial infarction, stroke, and heart failure), fatal and non-fatal myocardial infarction, fatal and non-fatal strokes, cardiovascular mortality, non-cardiovascular mortality, and total mortality. Cox frailty models with random effects were used to assess associations between fruit, vegetable, and legume consumption with risk of cardiovascular disease events and mortality.

    FINDINGS: Participants were enrolled into the study between Jan 1, 2003, and March 31, 2013. For the current analysis, we included all unrefuted outcome events in the PURE study database through March 31, 2017. Overall, combined mean fruit, vegetable, and legume intake was 3·91 (SD 2·77) servings per day. During a median 7·4 years (5·5-9·3) of follow-up, 4784 major cardiovascular disease events, 1649 cardiovascular deaths, and 5796 total deaths were documented. Higher total fruit, vegetable, and legume intake was inversely associated with major cardiovascular disease, myocardial infarction, cardiovascular mortality, non-cardiovascular mortality, and total mortality in the models adjusted for age, sex, and centre (random effect). The estimates were substantially attenuated in the multivariable adjusted models for major cardiovascular disease (hazard ratio [HR] 0·90, 95% CI 0·74-1·10, p-trend=0·1301), myocardial infarction (0·99, 0·74-1·31; p-trend=0·2033), stroke (0·92, 0·67-1·25; p-trend=0·7092), cardiovascular mortality (0·73, 0·53-1·02; p-trend=0·0568), non-cardiovascular mortality (0·84, 0·68-1·04; p-trend=0·0038), and total mortality (0·81, 0·68-0·96; p-trend<0·0001). The HR for total mortality was lowest for three to four servings per day (0·78, 95% CI 0·69-0·88) compared with the reference group, with no further apparent decrease in HR with higher consumption. When examined separately, fruit intake was associated with lower risk of cardiovascular, non-cardiovascular, and total mortality, while legume intake was inversely associated with non-cardiovascular death and total mortality (in fully adjusted models). For vegetables, raw vegetable intake was strongly associated with a lower risk of total mortality, whereas cooked vegetable intake showed a modest benefit against mortality.

    INTERPRETATION: Higher fruit, vegetable, and legume consumption was associated with a lower risk of non-cardiovascular and total mortality. Benefits appear to be maximal for both non-cardiovascular mortality and total mortality at three to four servings per day (equivalent to 375-500 g/day).

    FUNDING: Full funding sources listed at the end of the paper (see Acknowledgments).

    Matched MeSH terms: Proportional Hazards Models
  13. Balakrishnan N, Teo SH, Sinnadurai S, Bhoo Pathy NT, See MH, Taib NA, et al.
    World J Surg, 2017 11;41(11):2735-2745.
    PMID: 28653143 DOI: 10.1007/s00268-017-4081-9
    BACKGROUND: Reproductive factors are associated with risk of breast cancer, but the association with breast cancer survival is less well known. Previous studies have reported conflicting results on the association between time since last childbirth and breast cancer survival. We determined the association between time since last childbirth (LCB) and survival of women with premenopausal and postmenopausal breast cancers in Malaysia.

    METHOD: A historical cohort of 986 premenopausal, and 1123 postmenopausal, parous breast cancer patients diagnosed from 2001 to 2012 in University Malaya Medical Centre were included in the analyses. Time since LCB was categorized into quintiles. Multivariable Cox regression was used to determine whether time since LCB was associated with survival following breast cancer, adjusting for demographic, tumor, and treatment characteristics.

    RESULTS: Premenopausal breast cancer patients with the most recent childbirth (LCB quintile 1) were younger, more likely to present with unfavorable prognostic profiles, and had the lowest 5-year overall survival (OS) (66.9%; 95% CI 60.2-73.6%), compared to women with a longer duration since LCB (quintiles 2 through 5). In univariable analysis, time since LCB was inversely associated with risk of mortality, and the hazard ratios for LCB quintiles 2, 3, 4, and 5 versus quintile 1 were 0.53 (95% CI 0.36-0.77), 0.49 (95% CI 0.33-0.75), 0.61 (95% CI 0.43-0.85), and 0.64 (95% CI 0.44-0.93), respectively; p-trend = 0.016. However, this association was attenuated substantially following adjustment for age at diagnosis and other prognostic factors. Similarly, postmenopausal breast cancer patients with the most recent childbirth were also more likely to present with unfavorable disease profiles. Compared to postmenopausal breast cancer patients in LCB quintile 1, patients in quintile 5 had a higher risk of mortality. This association was not significant following multivariable adjustment.

    CONCLUSION: Time since LCB is not independently associated with survival in premenopausal or postmenopausal breast cancers. The apparent increase in risks of mortality in premenopausal breast cancer patients with a recent childbirth, and postmenopausal patients with longer duration since LCB, appear to be largely explained by their age at diagnosis.

    Matched MeSH terms: Proportional Hazards Models
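
The entry above enters time since last childbirth into the Cox model as quintiles. A sketch of that exposure coding with pandas; the values are invented:

```python
# Quintile coding of a continuous exposure before Cox modelling.
import pandas as pd

years_since_lcb = pd.Series([1.5, 3.0, 7.2, 12.0, 20.5, 4.4, 9.9, 15.1, 2.2, 25.0])
quintile = pd.qcut(years_since_lcb, q=5, labels=[1, 2, 3, 4, 5])
print(quintile.value_counts().sort_index())  # two subjects per quintile here
```
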
  14. Bamia C, Orfanos P, Juerges H, Schöttker B, Brenner H, Lorbeer R, et al.
    Maturitas, 2017 Sep;103:37-44.
    PMID: 28778331 DOI: 10.1016/j.maturitas.2017.06.023
    OBJECTIVES: To evaluate, among the elderly, the association of self-rated health (SRH) with mortality, and to identify determinants of self-rating health as "at-least-good".

    STUDY DESIGN: Individual data on SRH and important covariates were obtained for 424,791 European and United States residents, ≥60 years at recruitment (1982-2008), in eight prospective studies in the Consortium on Health and Ageing: Network of Cohorts in Europe and the United States (CHANCES). In each study, adjusted mortality ratios (hazard ratios, HRs) in relation to SRH were calculated and subsequently combined with random-effect meta-analyses.

    MAIN OUTCOME MEASURES: All-cause, cardiovascular and cancer mortality.

    RESULTS: Within the median 12.5 years of follow-up, 93,014 (22%) deaths occurred. SRH of "fair" or "poor" vs. "at-least-good" was associated with increased mortality: HRs 1.46 (95% CI 1.23-1.74) and 2.31 (95% CI 1.79-2.99), respectively. These associations were evident for cardiovascular and, to a lesser extent, cancer mortality, and persisted in within-study and within-subgroup analyses. Accounting for lifestyle, sociodemographic and somatometric factors and, subsequently, for medical history explained only a modest amount of the unadjusted associations. Factors favourably associated with SRH were: sex (males), age (younger-old), education (high), marital status (married/cohabiting), physical activity (active), body mass index (non-obese), alcohol consumption (low to moderate) and previous morbidity (absence).

    CONCLUSION: SRH provides a quick and simple tool for assessing health and identifying groups of elders at risk of early mortality, and may also be useful in clinical settings. Modifying determinants of favourably rating health, e.g. by increasing physical activity and/or by eliminating obesity, may be important for older adults to "feel healthy" and "be healthy".

    Matched MeSH terms: Proportional Hazards Models
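
The study-specific hazard ratios above were combined with random-effects meta-analyses. The core of any such pooling is inverse-variance weighting of log-HRs; the sketch below shows only that fixed-effect core (a random-effects version would add a between-study tau-squared, as in DerSimonian-Laird), with invented inputs:

```python
# Inverse-variance pooling of log hazard ratios: the core step behind the
# meta-analysis in the entry above. Fixed-effect only; inputs are invented.
import numpy as np

hr = np.array([1.52, 1.40, 1.38, 1.61])  # per-study HRs
se = np.array([0.10, 0.12, 0.15, 0.09])  # standard errors of log(HR)

w = 1 / se**2
pooled_log_hr = np.sum(w * np.log(hr)) / np.sum(w)
pooled_se = np.sqrt(1 / np.sum(w))

print(f"pooled HR = {np.exp(pooled_log_hr):.2f} "
      f"(95% CI {np.exp(pooled_log_hr - 1.96*pooled_se):.2f}-"
      f"{np.exp(pooled_log_hr + 1.96*pooled_se):.2f})")
```
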
  15. Perez-Cornago A, Appleby PN, Pischon T, Tsilidis KK, Tjønneland A, Olsen A, et al.
    BMC Med, 2017 07 13;15(1):115.
    PMID: 28701188 DOI: 10.1186/s12916-017-0876-7
    BACKGROUND: The relationship between body size and prostate cancer risk, and in particular risk by tumour characteristics, is not clear because most studies have not differentiated between high-grade or advanced stage tumours, but rather have assessed risk with a combined category of aggressive disease. We investigated the association of height and adiposity with incidence of and death from prostate cancer in 141,896 men in the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort.

    METHODS: Multivariable-adjusted Cox proportional hazards models were used to calculate hazard ratios (HRs) and 95% confidence intervals (CIs). After an average of 13.9 years of follow-up, there were 7024 incident prostate cancers and 934 prostate cancer deaths.

    RESULTS: Height was not associated with total prostate cancer risk. Subgroup analyses showed heterogeneity in the association with height by tumour grade (P heterogeneity = 0.002), with a positive association with risk for high-grade but not low-intermediate-grade disease (HR for high-grade disease, tallest versus shortest fifth of height, 1.54; 95% CI, 1.18-2.03). Greater height was also associated with a higher risk for prostate cancer death (HR = 1.43, 1.14-1.80). Body mass index (BMI) was significantly inversely associated with total prostate cancer, but there was evidence of heterogeneity by tumour grade (P heterogeneity = 0.01; HR = 0.89, 0.79-0.99 for low-intermediate grade and HR = 1.32, 1.01-1.72 for high-grade prostate cancer) and stage (P heterogeneity = 0.01; HR = 0.86, 0.75-0.99 for localised stage and HR = 1.11, 0.92-1.33 for advanced stage). BMI was positively associated with prostate cancer death (HR = 1.35, 1.09-1.68). The results for waist circumference were generally similar to those for BMI, but the associations were slightly stronger for high-grade (HR = 1.43, 1.07-1.92) and fatal prostate cancer (HR = 1.55, 1.23-1.96).

    CONCLUSIONS: The findings from this large prospective study show that men who are taller and who have greater adiposity have an elevated risk of high-grade prostate cancer and prostate cancer death.

    Matched MeSH terms: Proportional Hazards Models
  16. Kosalaraksa P, Boettiger DC, Bunupuradah T, Hansudewechakul R, Saramony S, Do VC, et al.
    J Pediatric Infect Dis Soc, 2017 Jun 01;6(2):173-177.
    PMID: 27295973 DOI: 10.1093/jpids/piw031
    Background: Regular CD4 count testing is often used to monitor antiretroviral therapy efficacy. However, this practice may be redundant in children with a suppressed human immunodeficiency virus (HIV) viral load.

    Methods: Study end points were as follows: (1) a CD4 count <200 cells/mm3 followed by a CD4 count ≥200 cells/mm3 (transient CD4 <200); (2) CD4 count <200 cells/mm3 confirmed within 6 months (confirmed CD4 <200); and (3) a new or recurrent World Health Organization (WHO) stage 3 or 4 illness (clinical failure). Kaplan-Meier curves and Cox regression were used to evaluate rates and predictors of transient CD4 <200, confirmed CD4 <200, and clinical failure among virally suppressed children aged 5-15 years who were enrolled in the TREAT Asia Pediatric HIV Observational Database.

    Results: Data from 967 children were included in the analysis. At the time of confirmed viral suppression, median age was 10.2 years, 50.4% of children were female, and 95.4% were perinatally infected with HIV. Median CD4 cell count was 837 cells/mm3, and 54.8% of children were classified as having WHO stage 3 or 4 disease. In total, 18 transient CD4 <200 events, 2 confirmed CD4 <200 events, and 10 clinical failures occurred at rates of 0.73 (95% confidence interval [95% CI], 0.46-1.16), 0.08 (95% CI, 0.02-0.32), and 0.40 (95% CI, 0.22-0.75) events per 100 patient-years, respectively. CD4 <500 cells/mm3 at the time of viral suppression confirmation was associated with higher rates of both CD4 outcomes.

    Conclusions: Regular CD4 testing may be unnecessary for virally suppressed children aged 5-15 years with CD4 ≥500 cells/mm3.

    Matched MeSH terms: Proportional Hazards Models
  17. Magaji BA, Moy FM, Roslani AC, Law CW
    BMC Cancer, 2017 05 18;17(1):339.
    PMID: 28521746 DOI: 10.1186/s12885-017-3336-z
    BACKGROUND: Colorectal cancer is the third most commonly diagnosed malignancy and the fourth leading cause of cancer-related death globally. It is the second most common cancer among both males and females in Malaysia. The economic burden of colorectal cancer is likely to increase over time owing to its current trend and the aging population. Cancer survival analysis is an essential indicator for early detection and improvement in cancer treatment. However, there is a scarcity of studies on the survival of colorectal cancer patients and its predictors. Therefore, we aimed to determine the 1-, 3- and 5-year survival rates, compare survival rates among ethnic groups and determine the predictors of survival among colorectal cancer patients.

    METHODS: This was an ambidirectional cohort study conducted at the University Malaya Medical Centre (UMMC) in Kuala Lumpur, Malaysia. All Malaysian citizens or permanent residents with a histologically confirmed diagnosis of colorectal cancer seen at UMMC from 1 January 2001 to 31 December 2010 were included in the study. Demographic and clinical characteristics were extracted from the medical records. Patients were followed up until death or censored at the end of the study (31 December 2010). Censored patients' vital status (alive or dead) was cross-checked with the National Registration Department. Survival analyses at 1-, 3- and 5-year intervals were performed using the Kaplan-Meier method. The log-rank test was used to compare survival rates, while Cox proportional hazard regression analysis was carried out to determine the predictors of 5-year colorectal cancer survival.

    RESULTS: Among 1212 patients, the median survival times for colorectal, colon and rectal cancers were 42.0, 42.0 and 41.0 months, respectively, while the 1-, 3- and 5-year relative survival rates ranged from 73.8 to 76.0%, 52.1 to 53.7% and 40.4 to 45.4%, respectively. Chinese patients had the lowest 5-year survival compared to Malay and Indian patients. Based on the 814 patients with data on Dukes' staging, independent predictors of poor 5-year colorectal cancer survival were male sex (hazard ratio [HR]: 1.41; 95% CI: 1.12, 1.76), Chinese ethnicity (HR: 1.41; 95% CI: 1.07, 1.85), elevated (≥5.1 ng/ml) pre-operative carcinoembryonic antigen (CEA) level (HR: 2.13; 95% CI: 1.60, 2.83), Dukes' stage C (HR: 1.68; 95% CI: 1.28, 2.21), Dukes' stage D (HR: 4.61; 95% CI: 3.39, 6.28) and emergency surgery (HR: 1.52; 95% CI: 1.07, 2.15).

    CONCLUSIONS: The survival rates of colorectal cancer among our patients were comparable with those of some Asian countries but lower than those found in more developed countries. Males and Chinese patients had lower survival rates than their counterparts. More advanced staging and late presentation were important predictors of colorectal cancer survival. Health education programs targeting high-risk groups and emphasizing the importance of screening and early diagnosis, as well as the recognition of symptoms and risk factors, should be implemented. A nationwide colorectal cancer screening program should be designed and implemented to increase early detection and improve survival outcomes.
    Matched MeSH terms: Proportional Hazards Models
  18. Md Ralib A, Mat Nor MB, Pickering JW
    Nephrology (Carlton), 2017 May;22(5):412-419.
    PMID: 27062515 DOI: 10.1111/nep.12796
    AIM: Sepsis is the leading cause of intensive care unit (ICU) admission. Plasma neutrophil gelatinase-associated lipocalin (NGAL) is a promising biomarker for acute kidney injury (AKI) detection; however, it is also increased by inflammation, and few studies have been conducted in non-Caucasian populations and/or in developing economies. We therefore evaluated plasma NGAL's diagnostic performance in the presence of sepsis and systemic inflammatory response syndrome (SIRS) in a Malaysian ICU cohort.

    METHODS: This is a prospective observational study on patients with SIRS. Plasma creatinine (pCr) and NGAL were measured on ICU admission. Patients were classified according to the occurrence of AKI and sepsis.

    RESULTS: Of 225 patients recruited, 129 (57%) had sepsis, of whom 67 (52%) also had AKI. Ninety-six patients (43%) had non-infectious SIRS, of whom 20 (21%) also had AKI. NGAL concentrations were higher in AKI patients within both the sepsis and non-infectious SIRS cohorts (both P

    Matched MeSH terms: Proportional Hazards Models
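
Evaluating a biomarker's "diagnostic performance", as above, typically reduces to discrimination metrics such as the area under the ROC curve. A generic scikit-learn sketch with synthetic values; this is not the study's analysis code:

```python
# Discrimination of an admission biomarker for AKI, in the spirit of the entry
# above. roc_auc_score takes true labels and a continuous score.
from sklearn.metrics import roc_auc_score

aki  = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0]                     # 1 = AKI (invented)
ngal = [410, 520, 150, 90, 180, 120, 200, 610, 80, 140]   # ng/mL (invented)

print(f"AUC = {roc_auc_score(aki, ngal):.2f}")
```
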
  19. Kong YC, Bhoo-Pathy N, Subramaniam S, Bhoo-Pathy N, Taib NA, Jamaris S, et al.
    Int J Environ Res Public Health, 2017;14(4):427.
    PMID: 28420149 DOI: 10.3390/ijerph14040427
    Background: Survival disparities in cancer are known to occur between public and private hospitals. We compared breast cancer presentation, treatment and survival between a public academic hospital and a private hospital in a middle-income country.

    Methods: The demographics, clinical characteristics, treatment and overall survival (OS) of 2767 patients with invasive breast carcinoma diagnosed between 2001 and 2011 in the public hospital were compared with 1199 patients from the private hospital.

    Results: Compared to patients in the private hospital, patients from the public hospital were older at presentation, and had more advanced cancer stages. They were also more likely to receive mastectomy and chemotherapy but less radiotherapy. The five-year OS in public patients was significantly lower than in private patients (71.6% vs. 86.8%). This difference was largely attributed to discrepancies in stage at diagnosis and, although to a much smaller extent, to demographic differences and treatment disparities. Even following adjustment for these factors, patients in the public hospital remained at increased risk of mortality compared to their counterparts in the private hospital (Hazard Ratio: 1.59; 95% Confidence Interval: 1.36-1.85).

    Conclusion: Late stage at diagnosis appears to be a major contributing factor explaining the breast cancer survival disparity between public and private patients in this middle-income setting.
    Matched MeSH terms: Proportional Hazards Models
  20. Bonsu KO, Owusu IK, Buabeng KO, Reidpath DD, Kadirvelu A
    J Am Heart Assoc, 2017 Apr 01;6(4).
    PMID: 28365564 DOI: 10.1161/JAHA.116.004706
    BACKGROUND: Randomized controlled trials of statins have not demonstrated significant benefits in outcomes of heart failure (HF). However, randomized controlled trials may not always be generalizable. The aim was to determine whether statin treatment, and statin type (lipophilic or hydrophilic), improves long-term outcomes in Africans with HF.

    METHODS AND RESULTS: This was a retrospective longitudinal study of HF patients aged ≥18 years hospitalized at a tertiary healthcare center between January 1, 2009 and December 31, 2013 in Ghana. Patients were eligible if they were discharged from a first admission for HF (index admission) and followed up to the time of all-cause, cardiovascular, or HF mortality or the end of the study. A multivariable time-dependent Cox model and inverse-probability-of-treatment weighting in a marginal structural model were used to estimate associations between statin treatment and outcomes. Adjusted hazard ratios were also estimated for lipophilic and hydrophilic statin use compared with no statin use. The study included 1488 patients (mean age 60.3±14.2 years) with 9306 person-years of observation. Using the time-dependent Cox model, the 5-year adjusted hazard ratios with 95% CIs for statin treatment on all-cause, cardiovascular, and HF mortality were 0.68 (0.55-0.83), 0.67 (0.54-0.82), and 0.63 (0.51-0.79), respectively. Use of inverse-probability-of-treatment weighting resulted in estimates of 0.79 (0.65-0.96), 0.77 (0.63-0.96), and 0.77 (0.61-0.95), respectively, compared with no statin use.

    CONCLUSIONS: Among Africans with HF, statin treatment was associated with significant reduction in mortality.

    Matched MeSH terms: Proportional Hazards Models
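
The marginal structural model above relies on inverse-probability-of-treatment weighting (IPTW): each patient is weighted by the inverse of the estimated probability of the treatment actually received. A minimal sketch of the weighting step, with an invented propensity model and data; the weighted outcome model itself is omitted:

```python
# IPTW weighting step for a marginal structural model, as in the entry above.
# A logistic propensity model gives each patient's probability of statin
# treatment; stabilised weights put the marginal treated fraction on top.
# Data and column names are invented.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.DataFrame({
    "statin": [1, 0, 1, 1, 0, 0, 1, 0],
    "age":    [64, 58, 71, 66, 52, 60, 69, 55],
    "ef":     [30, 45, 25, 35, 50, 40, 28, 48],  # ejection fraction, %
})

ps = (LogisticRegression()
      .fit(df[["age", "ef"]], df["statin"])
      .predict_proba(df[["age", "ef"]])[:, 1])

p_treated = df["statin"].mean()
df["sw"] = np.where(df["statin"] == 1, p_treated / ps, (1 - p_treated) / (1 - ps))
print(df["sw"].round(2).tolist())  # stabilised weights for the outcome model
```
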