METHODS: We investigated serum creatinine (S-Cr) monitoring rates before and during ART and the incidence and prevalence of renal dysfunction after starting TDF by using data from a regional cohort of HIV-infected individuals in the Asia-Pacific. Time to renal dysfunction was defined as time from TDF initiation to the decline in estimated glomerular filtration rate (eGFR) to <60 ml/min/1.73m2 with >30% reduction from baseline using the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation or the decision to stop TDF for reported TDF-nephrotoxicity. Predictors of S-Cr monitoring rates were assessed by Poisson regression and risk factors for developing renal dysfunction were assessed by Cox regression.
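The renal dysfunction criterion above (eGFR falling below 60 ml/min/1.73m2 with a >30% reduction from baseline, using CKD-EPI) can be sketched in code. The function below implements the published 2009 CKD-EPI creatinine equation; it is a minimal illustration of the endpoint definition, not the study's analysis code.

```python
def ckd_epi_egfr(scr_mg_dl: float, age: float, female: bool, black: bool = False) -> float:
    """Estimated GFR (ml/min/1.73m2) via the 2009 CKD-EPI creatinine equation."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = 141 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    if black:  # race coefficient in the original 2009 equation
        egfr *= 1.159
    return egfr

def renal_dysfunction(baseline_egfr: float, current_egfr: float) -> bool:
    """Study endpoint: eGFR < 60 ml/min/1.73m2 with a > 30% reduction from baseline."""
    return current_egfr < 60 and current_egfr < 0.7 * baseline_egfr
```

For example, a 40-year-old man with a serum creatinine of 1.0 mg/dL has an eGFR of roughly 94 ml/min/1.73m2, so a decline to 55 from a baseline of 94 would meet the endpoint, while a decline to 55 from a baseline of 70 would not (the relative drop is under 30%).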
RESULTS: Among 2,425 patients who received TDF, S-Cr monitoring rates increased from 1.01 to 1.84 per person per year after starting TDF (incidence rate ratio 1.68, 95% CI 1.62-1.74, p <0.001). Renal dysfunction on TDF occurred in 103 patients over 5,368 person-years of TDF use (4.2%; incidence 1.75 per 100 person-years). Risk factors for developing renal dysfunction included older age (>50 vs. ≤30 years: hazard ratio [HR] 5.39, 95% CI 2.52-11.50, p <0.001) and use of a PI-based regimen (HR 1.93, 95% CI 1.22-3.07, p = 0.005). Having an eGFR prior to TDF (pre-TDF eGFR) of ≥60 ml/min/1.73m2 showed a protective effect (HR 0.38, 95% CI 0.17-0.85, p = 0.018).
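The incidence figures quoted throughout these abstracts are crude rates of events per person-time at risk. As a minimal illustration (with made-up counts, not the cohort's data):

```python
def incidence_per_100py(events: int, person_years: float) -> float:
    """Crude incidence rate per 100 person-years of follow-up."""
    return 100.0 * events / person_years

# Hypothetical example: 50 events observed over 4,000 person-years
# corresponds to a rate of 1.25 events per 100 person-years.
```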
CONCLUSIONS: Renal dysfunction on commencing TDF was uncommon; however, older age, lower baseline eGFR and a PI-based ART regimen were associated with a higher risk of renal dysfunction during TDF use in adult HIV-infected individuals in the Asia-Pacific region.
METHODS: PLHIV enrolled in the Therapeutics, Research, Education and AIDS Training in Asia (TREAT Asia) HIV Observational Database (TAHOD) who initiated ART at low CD4 counts were included; patients followed for more than 1 year were censored at 12 months. Competing risk regression was used to analyse risk factors, with loss to follow-up as a competing risk.
RESULTS: A total of 1813 PLHIV were included in the study, of whom 74% were male. With 73 (4%) deaths, the overall first-year mortality rate was 4.27 per 100 person-years (PY). Thirty-eight deaths (52%) were AIDS-related, 10 (14%) were immune reconstitution inflammatory syndrome (IRIS)-related, 13 (18%) were non-AIDS-related and 12 (16%) had an unknown cause. Risk factors for mortality included a low body mass index (BMI), while a higher CD4 count at ART initiation (>100 cells/μL: SHR 0.12; 95% CI 0.05-0.26) was associated with a reduced hazard of mortality compared to a CD4 count ≤25 cells/μL.
CONCLUSIONS: Fifty-two per cent of early deaths were AIDS-related. Efforts to initiate ART at CD4 counts >50 cells/μL are associated with improved short-term survival, even in those with late-stage HIV disease.
METHODS: Adults >18 years of age on second-line ART for ≥6 months were eligible. Cross-sectional data on HIV viral load (VL) and genotypic resistance were collected, or testing was conducted, between July 2015 and May 2017 at 12 Asia-Pacific sites. Virological failure (VF) was defined as VL >1000 copies/mL with a second VL >1000 copies/mL within 3-6 months. FASTA files were submitted to the Stanford University HIV Drug Resistance Database, and resistance-associated mutations (RAMs) were compared against the IAS-USA 2019 mutations list. VF risk factors were analysed using logistic regression.
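The confirmed-VF definition above (a VL >1000 copies/mL with a second VL >1000 copies/mL 3-6 months later) amounts to a scan over dated measurements. A sketch under that reading, using hypothetical data structures rather than the study's code, with 3-6 months approximated as 90-183 days:

```python
from datetime import date, timedelta

def confirmed_vf(measurements: list[tuple[date, float]], threshold: float = 1000.0) -> bool:
    """True if any VL above threshold is confirmed by a second elevated VL
    taken 3-6 months (approximated here as 90-183 days) later."""
    elevated = [(d, vl) for d, vl in sorted(measurements) if vl > threshold]
    for i, (d1, _) in enumerate(elevated):
        for d2, _ in elevated[i + 1:]:
            if timedelta(days=90) <= (d2 - d1) <= timedelta(days=183):
                return True
    return False
```

A single elevated VL, or two elevated VLs only weeks apart, would not meet the definition; only a confirmatory elevated measurement within the 3-6 month window does.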
RESULTS: Of 1378 patients, 74% were male and 70% acquired HIV through heterosexual exposure. At second-line switch, median [interquartile range (IQR)] age was 37 (32-42) years and median (IQR) CD4 count was 103 (43.5-229.5) cells/µL; 93% received regimens with boosted protease inhibitors (PIs). Median duration on second line was 3 years. Among 101 patients (7%) with VF, a CD4 count >200 cells/µL at switch [odds ratio (OR) = 0.36, 95% confidence interval (CI): 0.17-0.77 vs. CD4 ≤50 cells/µL] and HIV exposure through male-male sex (OR = 0.32, 95% CI: 0.17-0.64 vs. heterosexual exposure) or injecting drug use (OR = 0.24, 95% CI: 0.12-0.49) were associated with reduced VF. Resistance data were available for 41 patients (41% of those with VF): 80% had at least one RAM to non-nucleoside reverse transcriptase inhibitors (NNRTIs), 63% to NRTIs, and 35% to PIs. Of those with PI RAMs, 71% had two or more.
CONCLUSIONS: There were low proportions with VF and significant RAMs in our cohort, reflecting the durability of current second-line regimens.
METHODS: This cross-sectional study recruited adult PWH during routine follow-up at five HIV clinical sites in the Asia-Pacific region. Participants were screened for depression using the Patient Health Questionnaire-9 (PHQ-9) and for SU using the Alcohol, Smoking, and Substance Involvement Screening Test (ASSIST). Quality of life (QoL) was assessed with the WHOQOL-HIV BREF and functional ability with the World Health Organization Disability Assessment Schedule 2.0 (WHODAS 2.0). Factors associated with mean QoL and disability scores were analysed using linear regression.
RESULTS: Of 864 PWH enrolled, 753 screened positive for depression or SU. The median (interquartile range, IQR) age was 38 (31-47) years and 97% were on ART. Overall mean WHOQOL-HIV BREF and WHODAS scores indicated greater impairment with increasing depressive symptom severity and SU risk. In multivariate analysis, PWH reporting previous trauma/stress showed greater impairment (difference = 2.7, 95% confidence interval [CI] 1.5-3.9).
METHODS: Factors associated with survival and failure were analyzed using Cox proportional hazards and discrete time conditional logistic models.
RESULTS: TDR, found in 60 (4.1%) of 1471 Asian treatment-naive patients, was a significant predictor of failure. Patients with TDR to more than one drug in their regimen were more than three times as likely to fail as those with no TDR.
CONCLUSIONS: TDR was associated with failure in the context of non-fully sensitive regimens. Efforts are needed to incorporate resistance testing into national treatment programs.
METHODS: Adults living with HIV enrolled in a regional observational cohort in Asia who had initiated combination antiretroviral therapy (cART) were included in the analysis. Factors associated with new TB diagnoses after cohort entry and survival after cART initiation were analysed using Cox regression, stratified by site.
RESULTS: A total of 7355 patients from 12 countries enrolled into the cohort between 2003 and 2016 were included in the study. There were 368 reported cases of TB after cohort entry, with an incidence rate of 0.99 per 100 person-years (/100 pys). Multivariate analyses adjusted for viral load (VL), CD4 count, body mass index (BMI) and cART duration showed that co-trimoxazole (CTX) reduced the hazard for new TB infection by 28% (HR 0.72, 95% CI 0.56-0.93). Mortality after cART initiation was 0.85/100 pys, with a median follow-up time of 4.63 years. Predictors of survival included age, female sex, hepatitis C co-infection, TB diagnosis, HIV VL, CD4 count and BMI.
CONCLUSIONS: CTX was associated with a reduction in the hazard for new TB infection but did not impact survival in our Asian cohort. The potential preventive effect of CTX against TB during periods of severe immunosuppression should be further explored.
METHODS: HIV-infected adults enrolled in the TREAT Asia HIV Observational Database were eligible if they had an HIV RNA measurement documented at the time of ART initiation. The dataset was randomly split into a derivation data set (75% of patients) and a validation data set (25%). Factors associated with pre-treatment HIV RNA <100,000 copies/mL were evaluated by logistic regression adjusted for study site. A prediction model and prediction scores were created.
RESULTS: A total of 2592 patients were enrolled for the analysis. Median [interquartile range (IQR)] age was 35.8 (29.9-42.5) years; CD4 count was 147 (50-248) cells/mm3; and pre-treatment HIV RNA was 100,000 (34,045-301,075) copies/mL. Factors associated with pre-treatment HIV RNA <100,000 copies/mL were age <30 years [OR 1.40 vs. 41-50 years; 95% confidence interval (CI) 1.10-1.80, p = 0.01], body mass index >30 kg/m2 (OR 2.4 vs. <18.5 kg/m2; 95% CI 1.1-5.1, p = 0.02), anemia (OR 1.70; 95% CI 1.40-2.10), CD4 count >350 cells/mm3 (OR 3.9 vs. <100 cells/mm3; 95% CI 2.0-4.1) and total lymphocyte count >2000 cells/mm3 (OR 1.7 vs. <1000 cells/mm3; 95% CI 1.3-2.3). A prediction score >25 yielded a sensitivity of 46.7%, specificity of 79.1%, positive predictive value of 67.7% and negative predictive value of 61.2% for prediction of pre-treatment HIV RNA <100,000 copies/mL among derivation patients.
CONCLUSION: A model prediction for pre-treatment HIV RNA <100,000 copies/mL produced an area under the ROC curve of 0.70. A larger sample size for prediction model development as well as for model validation is warranted.
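The diagnostic metrics reported for the score cut-off follow directly from a 2x2 confusion matrix. A minimal sketch with illustrative counts (not the study's data):

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, PPV and NPV from 2x2 confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # true-positive rate
        "specificity": tn / (tn + fp),  # true-negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }
```

Note that sensitivity and specificity are properties of the test at a given cut-off, while PPV and NPV also depend on how common the condition is in the population screened.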
METHODS: Data from two regional cohort observational databases were analyzed for trends in median CD4 cell counts at ART initiation and the proportion of late ART initiation (CD4 cell counts <200 cells/mm(3) or prior AIDS diagnosis). Predictors for late ART initiation and mortality were determined.
RESULTS: A total of 2737 HIV-positive ART-naïve patients from 22 sites in 13 Asian countries and territories were eligible. The overall median (IQR) CD4 cell count at ART initiation was 150 (46-241) cells/mm(3). Median CD4 cell counts at ART initiation increased over time, from a low point of 115 cells/mm(3) in 2008 to a peak of 302 cells/mm(3) after 2011 (p for trend 0.002). The proportion of patients with late ART initiation significantly decreased over time from 79.1% before 2007 to 36.3% after 2011 (p for trend <0.001). Factors associated with late ART initiation were year of ART initiation (e.g. 2010 vs. before 2007; OR 0.40, 95% CI 0.27-0.59; p<0.001), sex (male vs. female; OR 1.51, 95% CI 1.18-1.93; p=0.001) and HIV exposure risk (heterosexual vs. homosexual; OR 1.66, 95% CI 1.24-2.23; p=0.001 and intravenous drug use vs. homosexual; OR 3.03, 95% CI 1.77-5.21; p<0.001). Factors associated with mortality after ART initiation were late ART initiation (HR 2.13, 95% CI 1.19-3.79; p=0.010), sex (male vs. female; HR 2.12, 95% CI 1.31-3.43; p=0.002), age (≥51 vs. ≤30 years; HR 3.91, 95% CI 2.18-7.04; p<0.001) and hepatitis C serostatus (positive vs. negative; HR 2.48, 95% CI 1.-4.36; p=0.035).
CONCLUSIONS: Median CD4 cell count at ART initiation among Asian patients increased significantly over time, but the proportion of patients initiating ART late remains substantial. ART initiation at higher CD4 cell counts remains a challenge. Strategic interventions to increase earlier diagnosis of HIV infection and prompt more rapid linkage to ART must be implemented.
METHODS: The study population consisted of HIV-infected patients enrolled in the TREAT Asia HIV Observational Database (TAHOD). Individuals were included in this analysis if they started combination antiretroviral treatment (cART) after 2002, were being treated at a centre that documented a median rate of viral load monitoring ≥0.8 tests/patient/year among TAHOD enrolees, and experienced a minor or major treatment substitution while on virally suppressive cART. The primary endpoint to evaluate outcomes was clinical or virological failure (VF), followed by an ART class change. Clinical failure was defined as death or an AIDS diagnosis. VF was defined as confirmed viral load measurements ≥400 copies/mL followed by an ART class change within six months. Minor regimen substitutions were defined as within-class changes and major regimen substitutions were defined as changes to a drug class. The patterns of substitutions and rate of clinical or VF after substitutions were analyzed.
RESULTS: Of 3994 adults who started ART after 2002, 3119 (78.1%) had at least one period of virological suppression. Among these, 1170 (37.5%) underwent a minor regimen substitution and 296 (9.5%) underwent a major regimen substitution during suppression. The rates of clinical or VF were 1.48/100 person-years (95% CI 1.14 to 1.91) in the minor substitution group, 2.85/100 person-years (95% CI 1.88 to 4.33) in the major substitution group and 2.53/100 person-years (95% CI 2.20 to 2.92) among patients who did not undergo a treatment substitution.
CONCLUSIONS: The rate of clinical or VF was low in both major and minor substitution groups, showing that regimen substitution is generally effective in non-clinical trial settings in Asian countries.
METHODS: We used data from the TREAT Asia HIV Observational Database. Patients were included if they started antiretroviral therapy during or after 2003, had a serum creatinine measurement at antiretroviral therapy initiation (baseline), and had at least 2 follow-up creatinine measurements taken ≥3 months apart. Patients with a baseline estimated glomerular filtration rate (eGFR) ≤60 mL/min/1.73 m2 were excluded. Chronic kidney disease was defined as 2 consecutive eGFR values ≤60 mL/min/1.73 m2 taken ≥3 months apart. Generalized estimating equations were used to identify factors associated with eGFR change. Competing risk regression adjusted for study site, age and sex, and cumulative incidence plots were used to evaluate factors associated with chronic kidney disease (CKD).
RESULTS: Of 2547 patients eligible for this analysis, 703 (27.6%) were using tenofovir at baseline. Tenofovir use, high baseline eGFR, advanced HIV disease stage and low nadir CD4 count were associated with a decrease in eGFR during follow-up. Chronic kidney disease occurred at a rate of 3.4 per 1000 patient-years. Factors associated with CKD were tenofovir use, older age, low baseline eGFR, low nadir CD4 count and protease inhibitor use.
CONCLUSIONS: There is an urgent need to enhance renal monitoring and management capacity among at-risk groups in Asia and improve access to less nephrotoxic antiretrovirals.
METHODS: Long-term LTFU was defined as LTFU occurring after 5 years on ART. LTFU was defined as (1) patients not seen in the previous 12 months; and (2) patients not seen in the previous 6 months. Factors associated with LTFU were analysed using competing risk regression.
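Both LTFU definitions above amount to checking whether a patient's last visit falls outside a fixed window before the database census date. A sketch of that check, with hypothetical field names and months approximated as 30.44 days:

```python
from datetime import date, timedelta

def lost_to_follow_up(last_visit: date, census_date: date, window_months: int = 12) -> bool:
    """Flag a patient as LTFU if not seen within `window_months` before the census date.
    Months are approximated as 30.44 days in this sketch."""
    window = timedelta(days=round(30.44 * window_months))
    return (census_date - last_visit) > window
```

Passing `window_months=6` gives the stricter 6-month definition, which explains why it produces a higher LTFU incidence rate than the 12-month definition on the same data.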
RESULTS: Under the 12-month definition, the LTFU rate was 2.0 per 100 person-years (PY) [95% confidence interval (CI) 1.8-2.2] among 4889 patients included in the study. LTFU was associated with age >50 years [sub-hazard ratio (SHR) 1.64; 95% CI 1.17-2.31] compared with 31-40 years, viral load ≥1000 copies/mL (SHR 1.86; 95% CI 1.16-2.97) compared with viral load <1000 copies/mL, and hepatitis C coinfection (SHR 1.48; 95% CI 1.06-2.05). LTFU was less likely to occur in females, in individuals with higher CD4 counts, in those with self-reported adherence ≥95%, and in those living in high-income countries. The 6-month LTFU definition produced an incidence rate of 3.2 per 100 PY (95% CI 2.9-3.4) and had similar associations, but with greater risks of LTFU for ART initiation in later years (2006-2009: SHR 2.38; 95% CI 1.93-2.94; and 2010-2011: SHR 4.26; 95% CI 3.17-5.73) compared with 2003-2005.
CONCLUSIONS: The long-term LTFU rate in our cohort was low, with older age being associated with LTFU. The increased risk of LTFU with later years of ART initiation in the 6-month analysis, but not the 12-month analysis, suggests a possible move towards longer intervals between HIV clinic visits in Asia.
METHODS: Treatment modification was defined as a change of two antiretrovirals, a drug class change or treatment interruption (TI), all for >14 days. We assessed factors associated with CD4 changes and undetectable viral load (UVL <1,000 copies/ml) at 1 year after second-line failure using linear and logistic regression, respectively. Survival time was analysed using competing risk regression.
RESULTS: Of the 328 patients who failed second-line ART in our cohorts, 208 (63%) had a subsequent treatment modification. Compared with those who continued the failing regimen, the average CD4 cell count increase was higher in patients who had a modification without TI (difference = 77.5, 95% CI 35.3, 119.7), while no difference was observed among those with TI (difference = -5.3, 95% CI -67.3, 56.8). Compared with those who continued the failing regimen, the odds of achieving UVL were lower in patients with TI (OR = 0.18, 95% CI 0.06, 0.60) and similar among those who had a modification without TI (OR = 1.97, 95% CI 0.95, 4.10); the proportions achieving UVL were 60% (continued failing regimen), 22% (TI) and 75% (modification without TI). Survival time was not affected by treatment modifications.
CONCLUSIONS: CD4 cell improvements were observed in those who had treatment modification without TI compared with those on the failing regimen. When no other options are available, maintaining the same failing ART combination provided better VL control than interrupting treatment.
METHODS: Patients initiating cART between 2006 and 2013 were included. TI was defined as stopping cART for >1 day. Treatment failure was defined as confirmed virological, immunological or clinical failure. Time to treatment failure during cART was analysed using Cox regression, not including periods off treatment. Covariables with P < 0.10 in univariable analyses were included in multivariable analyses, where P < 0.05 was considered statistically significant.
RESULTS: Of 4549 patients from 13 countries in Asia, 3176 (69.8%) were male and the median age was 34 years. A total of 111 (2.4%) had TIs due to adverse events (AEs) and 135 (3.0%) had TIs for other reasons. Median interruption times were 22 days for AE-related TIs and 148 days for non-AE TIs. In multivariable analyses, interruptions >30 days were associated with failure (31-180 days: HR = 2.66, 95% CI 1.70-4.16; 181-365 days: HR = 6.22, 95% CI 3.26-11.86; and >365 days: HR = 9.10, 95% CI 4.27-19.38; all P < 0.001, compared to 0-14 days). The reason for a previous TI was not significantly associated with failure (P = 0.158).
CONCLUSIONS: Duration of interruptions of more than 30 days was the key factor associated with large increases in subsequent risk of treatment failure. If TI is unavoidable, its duration should be minimised to reduce the risk of failure after treatment resumption.
OBJECTIVE: To assess the socio-economic determinants of TB in HIV-infected patients in Asia.
DESIGN: This was a matched case-control study. HIV-positive, TB-positive cases were matched to HIV-positive, TB-negative controls according to age, sex and CD4 cell count. A socio-economic questionnaire comprising 23 questions, including education level, employment, housing and substance use, was distributed. Socio-economic risk factors for TB were analysed using conditional logistic regression analysis.
RESULTS: A total of 340 patients (170 matched pairs) were recruited, with 262 (77.1%) matched for all three criteria. Pulmonary TB was the predominant type (n = 115, 67.6%). The main risk factor for TB was not having a university level education (OR 4.45, 95%CI 1.50-13.17, P = 0.007). Burning wood or coal regularly inside the house and living in the same place of origin were weakly associated with TB diagnosis.
CONCLUSIONS: These data suggest that lower socio-economic status is associated with an increased risk of TB in Asia. Integrating clinical and socio-economic factors into HIV treatment may help in the prevention of opportunistic infections and disease progression.