METHODS: Blips were defined as a detectable VL (≥50 copies/mL) preceded and followed by an undetectable VL (<50 copies/mL). Virological failure (VF) was defined as two consecutive VL measurements ≥50 copies/mL. Cox proportional hazards models of time to first VF after entry were developed.
RESULTS: 5040 patients (AHOD n = 2597 and TAHOD n = 2521) were included; 910 (18%) experienced blips: 744 (21%) of high-income and 166 (11%) of middle/low-income participants. 711 (14%) experienced blips prior to virological failure: 559 (16%) of high-income and 152 (10%) of middle/low-income participants. VL testing occurred at a median interval of 175 days in middle/low-income sites and 91 days in high-income sites. Time to VF was longer in middle/low-income sites than in high-income sites (adjusted hazard ratio (AHR) 0.41; p < 0.001), adjusted for year of first cART, hepatitis C co-infection, cART regimen, and prior blips. Prior blips were not a significant predictor of VF in univariate analysis (AHR 0.97, p = 0.82), and blips of differing magnitudes were not significant univariate predictors of VF (p = 0.360 for blips 50-≤1000, p = 0.309 for blips 50-≤400 and p = 0.300 for blips 50-≤200 copies/mL). 209 of 866 (24%) patients were switched to an alternate regimen in the setting of a blip.
CONCLUSION: Although blips occurred in a lower proportion of participants in low/middle-income settings, blips were not a significant predictor of virological failure in either setting. Nonetheless, a substantial number of participants were switched to alternative regimens in the setting of a blip.
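The blip and VF definitions above are simple to operationalize over a chronological series of VL measurements. A minimal sketch of that classification logic (function and variable names are illustrative, not from the study):

```python
# Classify a chronological series of viral load (VL) values using the
# abstract's definitions: a blip is a detectable VL (>=50 copies/mL)
# preceded and followed by undetectable VLs (<50 copies/mL); virological
# failure (VF) is two consecutive detectable VLs.
DETECTION_LIMIT = 50  # copies/mL

def classify_vl_series(vls):
    """Return (blip_count, index_of_first_VF) for a list of VL values."""
    blips = 0
    for i in range(1, len(vls) - 1):
        if (vls[i] >= DETECTION_LIMIT
                and vls[i - 1] < DETECTION_LIMIT
                and vls[i + 1] < DETECTION_LIMIT):
            blips += 1
    first_vf = None
    for i in range(len(vls) - 1):
        if vls[i] >= DETECTION_LIMIT and vls[i + 1] >= DETECTION_LIMIT:
            first_vf = i
            break
    return blips, first_vf

# Example: one blip at index 2, then VF starting at index 4
print(classify_vl_series([20, 40, 150, 30, 600, 800]))  # (1, 4)
```

Note that a detectable VL at the end of the series is neither a blip nor VF until a subsequent measurement resolves it, which is why testing frequency (175 vs. 91 days above) matters for classification.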
METHODS: Data linkages with the national death registry or national HIV database were conducted in 2020 on all PLHIV who met LTFU criteria while enrolled in care at participating HIV clinical sites. LTFU was defined as having no documented clinical contact in the previous year, excluding transfers and deaths. Survival time was analyzed using Cox regression stratified by site.
RESULTS: Data linkages were performed for 489 PLHIV who had been LTFU at sites in Malaysia (n = 2) and Thailand (n = 4). There were 151 (31%) deaths after being LTFU; the mortality rate was 4.89 per 100 person-years. Risk factors for mortality after being LTFU were older age [41-50 years: hazard ratio (HR) = 1.99, 95% confidence interval (CI): 1.08 to 3.68; and older than 50 years: HR = 4.93, 95% CI: 2.63 to 9.22; vs. age 30 years or younger]; receiving NRTI + PI (HR = 1.87, 95% CI: 1.22 to 2.85 vs. NRTI + NNRTI); positive hepatitis C antibody (HR = 2.25, 95% CI: 1.40 to 3.62); and having previous AIDS illness (HR = 1.45, 95% CI: 1.03 to 2.05). Improved survival was seen with a higher CD4 count (CD4 351-500 cells/µL: HR = 0.40, 95% CI: 0.21-0.76; and CD4 >500 cells/µL: HR = 0.43, 95% CI: 0.25-0.75; vs. CD4 ≤200 cells/µL).
CONCLUSIONS: Almost one-third of PLHIV who were LTFU in this cohort had died while out of care, emphasizing the importance of efforts to reengage PLHIV after they have been LTFU and ensure they have access to ongoing ART.
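The crude mortality rate quoted above is simply deaths divided by person-years at risk, scaled to 100 person-years; the reported 4.89 per 100 PY together with 151 deaths implies roughly 3,090 person-years of follow-up (back-calculated here, not reported in the abstract). A minimal sketch of the calculation:

```python
def rate_per_100py(events, person_years):
    """Crude incidence or mortality rate per 100 person-years at risk."""
    return 100 * events / person_years

# 151 deaths over ~3088 person-years (back-calculated, illustrative)
# reproduces the reported rate of 4.89 per 100 PY
print(round(rate_per_100py(151, 3088), 2))  # 4.89
```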
METHODS: HIV+ patients from the Australian HIV Observational Database (AHOD) and the TREAT Asia HIV Observational Database (TAHOD) meeting specific criteria were included. In these analyses, Asian and Caucasian status was defined by cohort. Factors associated with a low CD4:CD8 ratio (cutoff <0.2) prior to ART commencement, and with achieving a normal CD4:CD8 ratio (>1) at 12 and 24 months post ART commencement, were assessed using logistic regression.
RESULTS: There were 591 patients from AHOD and 2,620 patients from TAHOD who met the inclusion criteria. TAHOD patients had a significantly (P<0.001) lower odds of having a baseline (prior to ART initiation) CD4:CD8 ratio greater than 0.2. After 12 months of ART, AHOD patients were more than twice as likely to achieve a normal CD4:CD8 ratio compared to TAHOD patients (15% versus 6%). However, after adjustment for confounding factors there was no significant difference between cohorts in the odds of achieving a CD4:CD8 ratio >1 (P=0.475).
CONCLUSIONS: We found a significantly lower CD4:CD8 ratio prior to commencing ART in TAHOD compared to AHOD, even after adjusting for confounders. However, after adjustment, there was no significant difference between the cohorts in the odds of achieving a normal ratio. Baseline CD4+ and CD8+ counts seem to be the main driver of this difference between the two populations.
METHODS: HIV-positive patients enrolled in the TREAT Asia HIV Observational Database who had used second-line ART for ≥6 months were included. ART use and rates and predictors of second-line treatment failure were evaluated.
RESULTS: There were 302 eligible patients. Most were male (76.5%) and exposed to HIV via heterosexual contact (71.5%). Median age at second-line initiation was 39.2 years, median CD4 cell count was 146 cells per cubic millimeter, and median HIV viral load was 16,224 copies per milliliter. Patients started second-line ART before 2007 (n = 105), in 2007-2010 (n = 147) and after 2010 (n = 50). Ritonavir-boosted lopinavir and atazanavir accounted for the majority of protease inhibitor use after 2006. Median follow-up time on second-line therapy was 2.3 years. The rates of treatment failure and mortality per 100 patient-years were 8.8 (95% confidence interval: 7.1 to 10.9) and 1.1 (95% confidence interval: 0.6 to 1.9), respectively. Older age, high baseline viral load, and use of a protease inhibitor other than lopinavir or atazanavir were associated with a significantly shorter time to second-line failure.
CONCLUSIONS: Increased access to viral load monitoring to facilitate early detection of first-line ART failure and subsequent treatment switch is important for maximizing the durability of second-line therapy in Asia. Although second-line ART is highly effective in the region, the reported rate of failure emphasizes the need for third-line ART in a small proportion of patients.
METHODS: Adults > 18 years of age on second-line ART for ≥ 6 months were eligible. Cross-sectional data on HIV viral load (VL) and genotypic resistance testing were collected, or testing was conducted, between July 2015 and May 2017 at 12 Asia-Pacific sites. Virological failure (VF) was defined as VL > 1000 copies/mL with a second VL > 1000 copies/mL within 3-6 months. FASTA files were submitted to the Stanford University HIV Drug Resistance Database and resistance-associated mutations (RAMs) were compared against the IAS-USA 2019 mutations list. VF risk factors were analysed using logistic regression.
RESULTS: Of 1378 patients, 74% were male and 70% acquired HIV through heterosexual exposure. At second-line switch, median [interquartile range (IQR)] age was 37 (32-42) years and median (IQR) CD4 count was 103 (43.5-229.5) cells/µL; 93% received regimens with boosted protease inhibitors (PIs). Median duration on second line was 3 years. Among the 101 patients (7%) with VF, a CD4 count > 200 cells/µL at switch [odds ratio (OR) = 0.36, 95% confidence interval (CI): 0.17-0.77 vs. CD4 ≤ 50 cells/µL] and HIV exposure through male-male sex (OR = 0.32, 95% CI: 0.17-0.64 vs. heterosexual) or injecting drug use (OR = 0.24, 95% CI: 0.12-0.49) were associated with reduced VF. Of the 41 patients (41%) with resistance data, 80% had at least one RAM to nonnucleoside reverse transcriptase inhibitors (NNRTIs), 63% to NRTIs, and 35% to PIs. Of those with PI RAMs, 71% had two or more.
CONCLUSIONS: There were low proportions with VF and significant RAMs in our cohort, reflecting the durability of current second-line regimens.
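The VF definition used in this study (a VL > 1000 copies/mL confirmed by a second VL > 1000 copies/mL 3-6 months later) can be sketched as a check over dated measurements. The ~90-183-day window below is an assumed operationalization of "3-6 months", and all names are illustrative:

```python
from datetime import date

VF_THRESHOLD = 1000  # copies/mL

def has_virological_failure(measurements):
    """measurements: list of (date, vl) tuples in chronological order.
    VF = VL > 1000 copies/mL with a confirmatory VL > 1000 copies/mL
    taken 3-6 months later (assumed here as 90-183 days)."""
    highs = [d for d, vl in measurements if vl > VF_THRESHOLD]
    for i, d1 in enumerate(highs):
        for d2 in highs[i + 1:]:
            if 90 <= (d2 - d1).days <= 183:
                return True
    return False

# 121 days between the two high measurements -> confirmed VF
print(has_virological_failure(
    [(date(2016, 1, 1), 1500), (date(2016, 5, 1), 2000)]))  # True
```

A second high VL taken too soon (for example 19 days later) would not confirm VF under this sketch, mirroring the requirement for a confirmatory measurement in the stated window.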
OBJECTIVE: To assess the socio-economic determinants of TB in HIV-infected patients in Asia.
DESIGN: This was a matched case-control study. HIV-positive, TB-positive cases were matched to HIV-positive, TB-negative controls according to age, sex and CD4 cell count. A socio-economic questionnaire comprising 23 questions, including education level, employment, housing and substance use, was distributed. Socio-economic risk factors for TB were analysed using conditional logistic regression analysis.
RESULTS: A total of 340 patients (170 matched pairs) were recruited, with 262 (77.1%) matched for all three criteria. Pulmonary TB was the predominant type (n = 115, 67.6%). The main risk factor for TB was not having a university level education (OR 4.45, 95%CI 1.50-13.17, P = 0.007). Burning wood or coal regularly inside the house and living in the same place of origin were weakly associated with TB diagnosis.
CONCLUSIONS: These data suggest that lower socio-economic status is associated with an increased risk of TB in Asia. Integrating clinical and socio-economic factors into HIV treatment may help in the prevention of opportunistic infections and disease progression.
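For 1:1 matched case-control pairs such as those above, the conditional (matched-pair) odds ratio reduces to a ratio of discordant pairs: pairs in which only the case was exposed divided by pairs in which only the control was exposed. A minimal sketch of this calculation (the counts are illustrative, not the study's data):

```python
def matched_pair_odds_ratio(case_only_exposed, control_only_exposed):
    """Conditional odds ratio for 1:1 matched case-control pairs.
    Only discordant pairs (exposure status differs within a pair)
    contribute; concordant pairs carry no information about the OR."""
    if control_only_exposed == 0:
        raise ValueError("OR undefined with no control-only-exposed pairs")
    return case_only_exposed / control_only_exposed

# Illustrative: 45 pairs where only the case had the risk factor vs. 15
# pairs where only the control did -> conditional OR of 3.0
print(matched_pair_odds_ratio(45, 15))  # 3.0
```

Conditional logistic regression, as used in the study, generalizes this idea to multiple covariates while still conditioning on the matched sets.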
METHODS: We did a cohort analysis of TB cases in SECOND-LINE. TB cases included any clinical or laboratory-confirmed diagnoses and/or commencement of treatment for TB after randomization. Baseline factors associated with TB were analyzed using Cox regression stratified by site.
RESULTS: TB cases occurred at sites in Argentina, India, Malaysia, Nigeria, South Africa, and Thailand, in a cohort of 355 of the 541 SECOND-LINE participants. Overall, 20 cases of TB occurred, an incidence rate of 3.4 per 100 person-years (95% CI: 2.1 to 5.1). Increased TB risk was associated with a low CD4+ cell count (≤200 cells/μL), high viral load (>200 copies/mL), low platelet count (<150 × 10⁹/L), and low total serum cholesterol (≤4.5 mmol/L) at baseline. An increased risk of death was associated with TB, adjusted for CD4 cell count, platelets, and cholesterol. A low CD4+ cell count was significantly associated with incident TB, mortality, other AIDS diagnoses, and virologic failure.
DISCUSSION: The risk of TB remains elevated among PLHIV receiving second-line HIV therapy in TB-endemic regions. TB was associated with a greater risk of death. The finding that a low CD4+ T-cell count was significantly associated with poor outcomes in this population supports the value of CD4+ monitoring in HIV clinical management.
METHODS: Prospectively collected longitudinal data from patients in Thailand, Hong Kong, Malaysia, Japan, Taiwan, and South Korea were provided for analysis. Covariates included demographics, hepatitis B and C coinfections, baseline CD4 T lymphocyte count, and plasma HIV-1 RNA levels. Clinical deterioration (a new diagnosis of Centers for Disease Control and Prevention category B/AIDS-defining illness or death) was assessed by proportional hazards models. Surrogate endpoints were 12-month change in CD4 cell count and virologic suppression post therapy, evaluated by linear and logistic regression, respectively.
RESULTS: Of 1105 patients, 1036 (93.8%) infected with CRF01_AE or subtype B were eligible for inclusion in clinical deterioration analyses and contributed 1546.7 person-years of follow-up (median: 413 days, interquartile range: 169-672 days). Patients >40 years demonstrated smaller immunological increases (P = 0.002) and higher risk of clinical deterioration (hazard ratio = 2.17; P = 0.008). Patients with baseline CD4 cell counts >200 cells per microliter had lower risk of clinical deterioration (hazard ratio = 0.373; P = 0.003). A total of 532 patients (48.1% of eligible) had CD4 counts available at baseline and 12 months post therapy for inclusion in immunologic analyses. Patients infected with subtype B had larger increases in CD4 counts at 12 months (P = 0.024). A total of 530 patients (48.0% of eligible) were included in virological analyses, with no differences in response found between genotypes.
CONCLUSIONS: Results suggest that patients infected with CRF01_AE have reduced immunologic response to therapy at 12 months, compared with subtype B-infected counterparts. Clinical deterioration was associated with low baseline CD4 counts and older age. The lack of differences in virologic outcomes suggests that all patients have opportunities for virological suppression.
METHODS: The study population consisted of HIV-infected patients enrolled in the TREAT Asia HIV Observational Database (TAHOD). Individuals were included in this analysis if they started combination antiretroviral treatment (cART) after 2002, were being treated at a centre that documented a median rate of viral load monitoring ≥0.8 tests/patient/year among TAHOD enrolees, and experienced a minor or major treatment substitution while on virally suppressive cART. The primary endpoint to evaluate outcomes was clinical or virological failure (VF), followed by an ART class change. Clinical failure was defined as death or an AIDS diagnosis. VF was defined as confirmed viral load measurements ≥400 copies/mL followed by an ART class change within six months. Minor regimen substitutions were defined as within-class changes, and major regimen substitutions were defined as changes to a different drug class. The patterns of substitutions and rates of clinical or VF after substitution were analyzed.
RESULTS: Of 3994 adults who started ART after 2002, 3119 (78.1%) had at least one period of virological suppression. Among these, 1170 (37.5%) underwent a minor regimen substitution, and 296 (9.5%) underwent a major regimen substitution during suppression. The rates of clinical or VF were 1.48/100 person-years (95% CI 1.14 to 1.91) in the minor substitution group, 2.85/100 person-years (95% CI 1.88 to 4.33) in the major substitution group and 2.53/100 person-years (95% CI 2.20 to 2.92) among patients who did not undergo a treatment substitution.
CONCLUSIONS: The rate of clinical or VF was low in both major and minor substitution groups, showing that regimen substitution is generally effective in non-clinical trial settings in Asian countries.
METHODS: Long-term LTFU was defined as LTFU occurring after 5 years on ART. LTFU was defined as (1) patients not seen in the previous 12 months; and (2) patients not seen in the previous 6 months. Factors associated with LTFU were analysed using competing risk regression.
RESULTS: Under the 12-month definition, the LTFU rate was 2.0 per 100 person-years (PY) [95% confidence interval (CI) 1.8-2.2] among 4889 patients included in the study. LTFU was associated with age > 50 years [sub-hazard ratio (SHR) 1.64; 95% CI 1.17-2.31] compared with 31-40 years, viral load ≥ 1000 copies/mL (SHR 1.86; 95% CI 1.16-2.97) compared with viral load < 1000 copies/mL, and hepatitis C coinfection (SHR 1.48; 95% CI 1.06-2.05). LTFU was less likely to occur in females, in individuals with higher CD4 counts, in those with self-reported adherence ≥ 95%, and in those living in high-income countries. The 6-month LTFU definition produced an incidence rate of 3.2 per 100 PY (95% CI 2.9-3.4) and had similar associations but with greater risks of LTFU for ART initiation in later years (2006-2009: SHR 2.38; 95% CI 1.93-2.94; and 2010-2011: SHR 4.26; 95% CI 3.17-5.73) compared with 2003-2005.
CONCLUSIONS: The long-term LTFU rate in our cohort was low, with older age being associated with LTFU. The increased risk of LTFU with later years of ART initiation in the 6-month analysis, but not the 12-month analysis, suggests a possible move towards longer intervals between HIV clinic visits in Asia.
DESIGN: Death-related data were retrospectively and prospectively assessed in a longitudinal regional cohort study.
METHODS: Children under routine HIV care at sites in Cambodia, India, Indonesia, Malaysia, Thailand, and Vietnam between 2008 and 2017 were followed. Causes of death were reported and then independently and centrally reviewed. Predictors were compared using competing risks survival regression analyses.
RESULTS: Among 5918 children, 5523 (93%; 52% male) had ever been on combination antiretroviral therapy. Of 371 (6.3%) deaths, 312 (84%) occurred in those with a history of combination antiretroviral therapy (crude all-cause mortality 9.6 per 1000 person-years; total follow-up time 32 361 person-years). In this group, median age at death was 7.0 (2.9-13) years; median CD4 cell count was 73 (16-325) cells/μl. The most common underlying causes of death were pneumonia due to unspecified pathogens (17%), tuberculosis (16%), sepsis (8.0%), and AIDS (6.7%); 12% of causes were unknown. These clinical diagnoses were further grouped into AIDS-related infections (22%) and noninfections (5.8%), and non-AIDS-related infections (47%) and noninfections (11%); with 12% unknown, 2.2% not reviewed. Higher CD4 cell count and better weight-for-age z-score were protective against death.
CONCLUSION: Our standardized cause-of-death assessment provides robust data to inform regional resource allocation for pediatric diagnostic evaluations and prioritization of clinical interventions, and highlights the continued importance of opportunistic and nonopportunistic infections as causes of death in our cohort.