METHODS: Perinatally HIV-infected Asian adolescents (10-19 years) with documented virologic suppression (two consecutive viral loads [VLs] <400 copies/mL ≥6 months apart) were included. Baseline was the date of the first VL <400 copies/mL at age ≥10 years, or the 10th birthday for those with prior suppression. Cox proportional hazards models were used to identify predictors of postsuppression virologic rebound (VR; VL >1,000 copies/mL).
RESULTS: Of 1,379 eligible adolescents, 47% were males. At baseline, 22% were receiving protease inhibitor-containing regimens; median CD4 cell count (interquartile range [IQR]) was 685 (448-937) cells/mm3; 2% had preadolescent virologic failure (VF) before subsequent suppression. During adolescence, 180 individuals (13%) experienced postsuppression VR at a rate of 3.4 (95% confidence interval: 2.9-3.9) per 100 person-years, which was consistent over time. Median time to VR during adolescence (IQR) was 3.3 (2.1-4.8) years. Wasting (weight-for-age z-score
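The rebound rate quoted above is a crude events-per-person-time calculation. As an illustration only, the sketch below reproduces it with a log-normal (Poisson) approximation for the confidence interval; the total follow-up of roughly 5,294 person-years is back-calculated from the reported rate, not taken from the study data.

```python
import math

def incidence_rate(events: int, person_years: float):
    """Crude incidence rate per 100 person-years with an approximate
    95% CI using the log-normal (Poisson) approximation."""
    rate = events / person_years * 100
    # standard error of log(rate) for a Poisson count is 1/sqrt(events)
    half_width = 1.96 / math.sqrt(events)
    lower = rate * math.exp(-half_width)
    upper = rate * math.exp(half_width)
    return rate, lower, upper

# 180 rebound events; person-years assumed for illustration only
rate, lo, hi = incidence_rate(180, 5294)
print(f"{rate:.1f} (95% CI {lo:.1f}-{hi:.1f}) per 100 person-years")
# agrees with the reported 3.4 (95% CI: 2.9-3.9) per 100 person-years
```

The same arithmetic underlies the other per-100-person-year rates reported across these cohorts.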
DESIGN: Ongoing observational database collating clinical data on HIV-infected children and adolescents in Asia.
METHODS: Data from 2001 to 2016 relating to adolescents (10-19 years) with perinatal HIV infection were analysed to describe characteristics at adolescent entry and transition and combination antiretroviral therapy (cART) regimens across adolescence. A competing risk regression analysis was used to determine characteristics at adolescent entry associated with mortality. Outcomes at transition were compared on the basis of age at cART initiation.
RESULTS: Of 3448 PHIVA, 644 had reached transition. Median age at HIV diagnosis was 5.5 years, at cART initiation 7.2 years, and at transition 17.9 years. At adolescent entry, 35.0% had a CD4+ cell count less than 500 cells/μl and 51.1% had experienced a WHO stage III/IV clinical event. At transition, 38.9% had a CD4+ cell count less than 500 cells/μl, and 53.4% had experienced a WHO stage III/IV clinical event. Mortality rate was 0.71 per 100 person-years, with HIV RNA ≥1000 copies/ml, CD4+ cell count less than 500 cells/μl, height-for-age or weight-for-age z-score less than -2, history of a WHO stage III/IV clinical event or hospitalization, and second-line or later cART associated with mortality. For transitioning PHIVA, those who commenced cART at age less than 5 years had better virologic and immunologic outcomes, though were more likely to be on second-line or later cART.
CONCLUSION: Delayed HIV diagnosis and cART initiation resulted in considerable morbidity and poor immune status by adolescent entry. Durable first-line cART regimens to optimize disease control are key to minimizing mortality. Early cART initiation provides the best virologic and immunologic outcomes at transition.
METHODS: Long-term LTFU was defined as LTFU occurring after 5 years on ART. Two definitions of LTFU were used: (1) patients not seen in the previous 12 months; and (2) patients not seen in the previous 6 months. Factors associated with LTFU were analysed using competing risk regression.
RESULTS: Under the 12-month definition, the LTFU rate was 2.0 per 100 person-years (PY) [95% confidence interval (CI) 1.8-2.2] among 4889 patients included in the study. LTFU was associated with age > 50 years [sub-hazard ratio (SHR) 1.64; 95% CI 1.17-2.31] compared with 31-40 years, viral load ≥ 1000 copies/mL (SHR 1.86; 95% CI 1.16-2.97) compared with viral load < 1000 copies/mL, and hepatitis C coinfection (SHR 1.48; 95% CI 1.06-2.05). LTFU was less likely to occur in females, in individuals with higher CD4 counts, in those with self-reported adherence ≥ 95%, and in those living in high-income countries. The 6-month LTFU definition produced an incidence rate of 3.2 per 100 PY (95% CI 2.9-3.4) and had similar associations but with greater risks of LTFU for ART initiation in later years (2006-2009: SHR 2.38; 95% CI 1.93-2.94; and 2010-2011: SHR 4.26; 95% CI 3.17-5.73) compared with 2003-2005.
CONCLUSIONS: The long-term LTFU rate in our cohort was low, with older age being associated with LTFU. The increased risk of LTFU with later years of ART initiation in the 6-month analysis, but not the 12-month analysis, suggests a possible shift towards longer intervals between HIV clinic visits in Asia.
METHODS: Data on children with perinatally acquired HIV aged <18 years on first-line, non-nucleoside reverse transcriptase inhibitor-based cART with viral suppression (two consecutive pVL <400 copies/mL over a six-month period) were included from a regional cohort study; those exposed to prior mono- or dual antiretroviral treatment were excluded. Frequency of pVL monitoring was determined at the site level based on the median rate of pVL measurement: annual (0.75 to 1.5 tests/patient/year) or semi-annual (>1.5 tests/patient/year). Treatment failure was defined as virologic failure (two consecutive pVL >1000 copies/mL), change of antiretroviral drug class, or death. Baseline was the date of the second consecutive pVL <400 copies/mL. Competing risk regression models were used to identify predictors of treatment failure.
RESULTS: During January 2008 to March 2015, there were 1220 eligible children from 10 sites that performed at least annual pVL monitoring; 1042 (85%) and 178 (15%) were from sites performing annual (n = 6) and semi-annual (n = 4) pVL monitoring, respectively. Pre-cART, 675 children (55%) had World Health Organization clinical stage 3 or 4, the median nadir CD4 percentage was 9%, and the median pVL was 5.2 log10 copies/mL. At baseline, the median age was 9.2 years, 64% were on nevirapine-based regimens, the median cART duration was 1.6 years, and the median CD4 percentage was 26%. Over the follow-up period, 258 (25%) CLWH with annual and 40 (23%) with semi-annual pVL monitoring developed treatment failure, corresponding to incidence rates of 5.4 (95% CI: 4.8 to 6.1) and 4.3 (95% CI: 3.1 to 5.8) per 100 patient-years of follow-up respectively (p = 0.27). In multivariable analyses, the frequency of pVL monitoring was not associated with treatment failure (adjusted hazard ratio: 1.12; 95% CI: 0.80 to 1.59).
CONCLUSIONS: Annual compared to semi-annual pVL monitoring was not associated with an increased risk of treatment failure in our cohort of virally suppressed children with perinatally acquired HIV on first-line NNRTI-based cART.
METHODS: Individuals enrolled in the Therapeutics Research, Education, and AIDS Training in Asia Pediatric HIV Observational Database were included if they started ART at ages 1 month-14 years and had both height and weight measurements available at ART initiation (baseline). Generalized estimating equations were used to identify factors associated with change in height-for-age z-score (HAZ), follow-up HAZ ≥ -2, change in weight-for-age z-score (WAZ), and follow-up WAZ ≥ -2.
RESULTS: A total of 3217 children were eligible for analysis. The adjusted mean change in HAZ among cotrimoxazole and non-cotrimoxazole users did not differ significantly over the first 24 months of ART. In children who were stunted (HAZ < -2) at baseline, cotrimoxazole use was not associated with a follow-up HAZ ≥ -2. The adjusted mean change in WAZ among children with a baseline CD4 percentage (CD4%) >25% became significantly different between cotrimoxazole and non-cotrimoxazole users after 6 months of ART and remained significant after 24 months (overall P < .01). Similar changes in WAZ were observed in those with a baseline CD4% between 10% and 24% (overall P < .01). Cotrimoxazole use was not associated with a significant difference in follow-up WAZ in children with a baseline CD4% <10%. In those underweight (WAZ < -2) at baseline, cotrimoxazole use was associated with a follow-up WAZ ≥ -2 (adjusted odds ratio, 1.70 vs not using cotrimoxazole [95% confidence interval, 1.28-2.25], P < .01). This association was driven by children with a baseline CD4% ≥10%.
CONCLUSIONS: Cotrimoxazole use is associated with benefits to WAZ but not HAZ during early ART in Asian children.
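The WAZ and HAZ outcomes in the analysis above are conventionally computed from age- and sex-specific WHO growth-standard parameters via the Box-Cox (LMS) transformation. A minimal sketch of that transformation, with made-up parameter values (the real L/M/S values come from the WHO reference tables):

```python
import math

def lms_zscore(value: float, L: float, M: float, S: float) -> float:
    """Box-Cox (LMS) z-score: z = ((value/M)**L - 1) / (L*S) for L != 0,
    or ln(value/M)/S in the limiting case L == 0."""
    if L == 0:
        return math.log(value / M) / S
    return ((value / M) ** L - 1) / (L * S)

# Hypothetical parameters for illustration: median weight M = 15 kg, L = 1, S = 0.1
z = lms_zscore(12.0, L=1.0, M=15.0, S=0.1)
print(round(z, 2))  # -2.0, i.e. underweight (WAZ < -2) by the study's cutoff
```

With L = 1 the transformation reduces to a simple (value - median)/SD-style score, which is why z = -2 here corresponds to a weight 20% below the assumed median.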
METHODS: We used data from the TREAT Asia HIV Observational Database. Patients were included if they started antiretroviral therapy during or after 2003, had a serum creatinine measurement at antiretroviral therapy initiation (baseline), and had at least 2 follow-up creatinine measurements taken ≥3 months apart. Patients with a baseline estimated glomerular filtration rate (eGFR) ≤60 mL/min/1.73 m2 were excluded. Chronic kidney disease was defined as 2 consecutive eGFR values ≤60 mL/min/1.73 m2 taken ≥3 months apart. Generalized estimating equations were used to identify factors associated with eGFR change. Competing risk regression adjusted for study site, age and sex, and cumulative incidence plots were used to evaluate factors associated with chronic kidney disease (CKD).
RESULTS: Of 2547 patients eligible for this analysis, tenofovir was being used by 703 (27.6%) at baseline. Tenofovir use, high baseline eGFR, advanced HIV disease stage, and low nadir CD4 were associated with a decrease in eGFR during follow-up. Chronic kidney disease occurred at a rate of 3.4 per 1000 patient-years. Factors associated with CKD were tenofovir use, older age, low baseline eGFR, low nadir CD4, and protease inhibitor use.
CONCLUSIONS: There is an urgent need to enhance renal monitoring and management capacity among at-risk groups in Asia and improve access to less nephrotoxic antiretrovirals.
METHODS: HIV-infected adults enrolled in the TREAT Asia HIV Observational Database were eligible if they had an HIV RNA measurement documented at the time of ART initiation. The dataset was randomly split into a derivation data set (75% of patients) and a validation data set (25%). Factors associated with pre-treatment HIV RNA <100,000 copies/mL were evaluated by logistic regression adjusted for study site. A prediction model and prediction scores were created.
RESULTS: A total of 2592 patients were enrolled for the analysis. Median [interquartile range (IQR)] age was 35.8 (29.9-42.5) years; CD4 count was 147 (50-248) cells/mm3; and pre-treatment HIV RNA was 100,000 (34,045-301,075) copies/mL. Factors associated with pre-treatment HIV RNA <100,000 copies/mL were age <30 years [OR 1.40 vs. 41-50 years; 95% confidence interval (CI) 1.10-1.80, p = 0.01], body mass index >30 kg/m2 (OR 2.4 vs. <18.5 kg/m2; 95% CI 1.1-5.1, p = 0.02), anemia (OR 1.70; 95% CI 1.40-2.10), CD4 count ≥350 cells/mm3 (OR 3.9 vs. <100 cells/mm3; 95% CI 2.0-4.1), and total lymphocyte count ≥2000 cells/mm3 (OR 1.7 vs. <1000 cells/mm3; 95% CI 1.3-2.3). A prediction score >25 yielded a sensitivity of 46.7%, specificity of 79.1%, positive predictive value of 67.7%, and negative predictive value of 61.2% for prediction of pre-treatment HIV RNA <100,000 copies/mL among derivation patients.
CONCLUSION: A model prediction for pre-treatment HIV RNA <100,000 copies/mL produced an area under the ROC curve of 0.70. A larger sample size for prediction model development as well as for model validation is warranted.
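The sensitivity, specificity, PPV and NPV quoted for the prediction score all derive from a single 2x2 table of predicted versus observed status. A generic sketch with hypothetical counts (not the study's data):

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among all true-positive cases
        "specificity": tn / (tn + fp),  # true negatives among all true-negative cases
        "ppv": tp / (tp + fp),          # probability a positive prediction is correct
        "npv": tn / (tn + fn),          # probability a negative prediction is correct
    }

# Hypothetical counts for illustration only
m = diagnostic_metrics(tp=45, fp=20, fn=55, tn=80)
print({k: round(v, 3) for k, v in m.items()})
# {'sensitivity': 0.45, 'specificity': 0.8, 'ppv': 0.692, 'npv': 0.593}
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on the prevalence of the outcome in the sample, which is why they would differ between derivation and validation sets.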
METHODS: Incidence of malignancy after cohort enrollment was evaluated. Factors associated with development of hematological and nonhematological malignancy were analyzed using competing risk regression and survival time using Kaplan-Meier.
RESULTS: Of 7455 patients, 107 (1%) developed a malignancy: 34 (0.5%) hematological [0.08 per 100 person-years (/100PY)] and 73 (1%) nonhematological (0.17/100PY). Of the hematological malignancies, non-Hodgkin lymphoma was predominant (n = 26, 76%): immunoblastic (n = 6, 18%), Burkitt (n = 5, 15%), diffuse large B-cell (n = 5, 15%), and unspecified (n = 10, 30%). The others were central nervous system lymphoma (n = 7, 21%) and myelodysplastic syndrome (n = 1, 3%). Nonhematological malignancies were mostly Kaposi sarcoma (n = 12, 16%) and cervical cancer (n = 10, 14%). Risk factors for hematological malignancy included age >50 vs. ≤30 years [subhazard ratio (SHR) = 6.48, 95% confidence interval (CI): 1.79 to 23.43] and being from a high-income vs. a lower-middle-income country (SHR = 3.97, 95% CI: 1.45 to 10.84). Risk was reduced with CD4 351-500 cells/µL (SHR = 0.20, 95% CI: 0.05 to 0.74) and CD4 >500 cells/µL (SHR = 0.14, 95% CI: 0.04 to 0.78), compared to CD4 ≤200 cells/µL. Similar risk factors were seen for nonhematological malignancy, with prior AIDS diagnosis showing a weak association. Patients diagnosed with a hematological malignancy had shorter survival time compared to patients diagnosed with a nonhematological malignancy.
CONCLUSIONS: Nonhematological malignancies were more common overall, but non-Hodgkin lymphoma was the predominant individual malignancy in our cohort. PLHIV from high-income countries were more likely to be diagnosed, indicating a potential underdiagnosis of cancer in low-income settings.
SETTING: An Asian cohort in 16 pediatric HIV services across 6 countries.
METHODS: From 2005 to 2014, patients younger than 20 years who achieved virologic suppression and had subsequent viral load testing were included. Early virologic failure was defined as an HIV RNA ≥1000 copies per milliliter within 12 months of virologic suppression, and late virologic failure as an HIV RNA ≥1000 copies per milliliter more than 12 months after virologic suppression. Characteristics at combination antiretroviral therapy initiation and virologic suppression were described, and a competing risk time-to-event analysis was used to determine cumulative incidence of virologic failure and factors at virologic suppression associated with early and late virologic failure.
RESULTS: Of the 1105 patients included in the analysis, 182 (17.9%) experienced virologic failure. The median age at virologic suppression was 6.9 years, and the median time to virologic failure was 24.6 months after virologic suppression. The incidence rate for a first virologic failure event was 3.3 per 100 person-years. Factors at virologic suppression associated with late virologic failure included older age, mostly rural clinic setting, tuberculosis, protease inhibitor-based regimens, and early virologic failure. No risk factors were identified for early virologic failure.
CONCLUSIONS: Around 1 in 5 patients in our cohort experienced virologic failure after achieving virologic suppression. Targeted interventions to manage complex treatment scenarios, including adolescents, those with tuberculosis coinfection, and those with poor virologic control, are required.
METHODS: Adults living with HIV enrolled in a regional observational cohort in Asia who had initiated combination antiretroviral therapy (cART) were included in the analysis. Factors associated with new TB diagnoses after cohort entry and survival after cART initiation were analysed using Cox regression, stratified by site.
RESULTS: A total of 7355 patients from 12 countries enrolled into the cohort between 2003 and 2016 were included in the study. There were 368 reported cases of TB after cohort entry, with an incidence rate of 0.99 per 100 person-years (/100 pys). Multivariate analyses adjusted for viral load (VL), CD4 count, body mass index (BMI) and cART duration showed that cotrimoxazole (CTX) reduced the hazard for new TB infection by 28% (HR 0.72; 95% CI 0.56, 0.93). Mortality after cART initiation was 0.85/100 pys, with a median follow-up time of 4.63 years. Predictors of survival included age, female sex, hepatitis C co-infection, TB diagnosis, HIV VL, CD4 count and BMI.
CONCLUSIONS: CTX was associated with a reduction in the hazard for new TB infection but did not impact survival in our Asian cohort. The potential preventive effect of CTX against TB during periods of severe immunosuppression should be further explored.
METHODS: Regional Asian data (2001-2016) were analyzed to describe PHIVA who experienced ≥2 weeks of lamivudine or emtricitabine monotherapy or treatment interruption and trends in CD4 count and HIV viral load during and after episodes. Survival analyses were used for World Health Organization (WHO) stage III/IV clinical and immunologic event-free survival during monotherapy or treatment interruption, and a Poisson regression to determine factors associated with monotherapy or treatment interruption.
RESULTS: Of 3,448 PHIVA, 84 (2.4%) experienced 94 monotherapy episodes, and 147 (4.3%) experienced 174 treatment interruptions. Monotherapy was associated with older age, HIV RNA >400 copies/mL, younger age at ART initiation, and exposure to ≥2 combination ART regimens. Treatment interruption was associated with CD4 count <350 cells/μL, HIV RNA ≥1,000 copies/mL, ART adverse event, and commencing ART at age ≥10 years compared with age <3 years. WHO clinical stage III/IV 1-year event-free survival was 96% and 85% for the monotherapy and treatment interruption cohorts, respectively. WHO immunologic stage III/IV 1-year event-free survival was 52% for both cohorts. Those who experienced monotherapy or treatment interruption for more than 6 months had worse immunologic and virologic outcomes.
CONCLUSIONS: Until challenges of treatment adherence, engagement in care, and combination ART durability/tolerability are met, monotherapy and treatment interruption will lead to poor long-term outcomes.