METHODS: Prospectively collected longitudinal data from patients in Thailand, Hong Kong, Malaysia, Japan, Taiwan, and South Korea were provided for analysis. Covariates included demographics, hepatitis B and C coinfections, baseline CD4 T lymphocyte count, and plasma HIV-1 RNA levels. Clinical deterioration (a new diagnosis of Centers for Disease Control and Prevention category B/AIDS-defining illness or death) was assessed by proportional hazards models. Surrogate endpoints were 12-month change in CD4 cell count and virologic suppression post therapy, evaluated by linear and logistic regression, respectively.
RESULTS: Of 1105 patients, 1036 (93.8%) infected with CRF01_AE or subtype B were eligible for inclusion in clinical deterioration analyses and contributed 1546.7 person-years of follow-up (median: 413 days, interquartile range: 169-672 days). Patients aged >40 years demonstrated smaller immunological increases (P = 0.002) and a higher risk of clinical deterioration (hazard ratio = 2.17; P = 0.008). Patients with baseline CD4 cell counts >200 cells per microliter had a lower risk of clinical deterioration (hazard ratio = 0.373; P = 0.003). A total of 532 patients (48.1% of eligible) had CD4 counts available at baseline and 12 months post therapy for inclusion in immunologic analyses. Patients infected with subtype B had larger increases in CD4 counts at 12 months (P = 0.024). A total of 530 patients (48.0% of eligible) were included in virologic analyses, with no differences in response found between genotypes.
CONCLUSIONS: Results suggest that patients infected with CRF01_AE have a reduced immunologic response to therapy at 12 months compared with subtype B-infected counterparts. Clinical deterioration was associated with low baseline CD4 counts and older age. The absence of genotype differences in virologic outcomes suggests that patients infected with either genotype have similar opportunities to achieve virologic suppression.
METHODS: In a regional HIV observational cohort in the Asia-Pacific region, patients with viral suppression (2 consecutive viral loads <400 copies/mL) and a CD4 count ≥200 cells per microliter who had 6-monthly CD4 testing were analyzed. Main study end points were occurrence of 1 CD4 count <200 cells per microliter (single CD4 <200) and 2 CD4 counts <200 cells per microliter within a 6-month period (confirmed CD4 <200). Time to single and confirmed CD4 <200 was compared between biannual (6-monthly) and annual CD4 assessment by generating a hypothetical group comprising the same patients under annual CD4 testing, created by removing every second CD4 count.
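The hypothetical annual-testing comparison described above can be sketched as follows. This is an illustrative reconstruction of the design, not study code; the function names and CD4 series are assumptions.

```python
def annual_series(cd4_6monthly):
    # Simulate annual testing by removing every second CD4 count,
    # as in the study's hypothetical comparison group.
    return cd4_6monthly[::2]

def months_to_first_below_200(series, interval_months):
    # Time from baseline to the first observed CD4 <200 cells/uL
    # under a given testing interval; None if never observed.
    for i, cd4 in enumerate(series):
        if cd4 < 200:
            return i * interval_months
    return None

# Illustrative 6-monthly CD4 counts for one patient over 3 years
biannual = [320, 280, 240, 210, 190, 180, 170]
annual = annual_series(biannual)  # [320, 240, 190, 170]

print(months_to_first_below_200(biannual, 6))   # detection time, 6-monthly testing
print(months_to_first_below_200(annual, 12))    # detection time, annual testing
```

In this illustrative series, both schedules detect the first CD4 <200 at 24 months, mirroring the study's finding of no significant difference in time to detection between schedules.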
RESULTS: Among 1538 patients, the rate of single CD4 <200 was 3.45/100 patient-years and of confirmed CD4 <200 was 0.77/100 patient-years. During 5 years of viral suppression, patients with baseline CD4 200-249 cells per microliter were significantly more likely to experience confirmed CD4 <200 compared with patients with higher baseline CD4 [hazard ratio, 55.47 (95% confidence interval: 7.36 to 418.20), P < 0.001 versus baseline CD4 ≥500 cells/μL]. Cumulative probabilities of confirmed CD4 <200 were also higher in patients with baseline CD4 200-249 cells per microliter compared with patients with higher baseline CD4. There was no significant difference in time to confirmed CD4 <200 between biannual and annual CD4 measurement (P = 0.336).
CONCLUSIONS: Annual CD4 monitoring in virally suppressed HIV patients with a baseline CD4 ≥250 cells per microliter may be sufficient for clinical management.
OBJECTIVES: To study the initial ART regimens and the rate of switch of ART regimens used during the first 36 months in HIV-infected children with severe anemia and to evaluate their clinical and laboratory outcomes.
METHODS: We analyzed regional cohort data of 130 Asian children aged <18 years with baseline severe anemia (hemoglobin <7.5 g/dL) who started antiretroviral therapy (ART) between January 2003 and September 2013.
RESULTS: At ART initiation, median age was 3.5 years (interquartile range [IQR] 1.7-6.3) and median hemoglobin was 6.7 g/dL (IQR 5.9-7.1, range 3.0-7.4). Initial ART regimens included stavudine (d4T; 85.4%), zidovudine (AZT; 13.8%), and abacavir (0.8%). In 81 children with available hemoglobin data after 6 months of ART, 90% recovered from severe anemia with a median hemoglobin of 10.7 g/dL (IQR 9.6-11.7, range 4.4-13.5). Those starting AZT-based ART had a mortality rate of 10.8 (95% confidence interval [CI] 4.8-23.9) per 100 patient-years compared to 2.7 (95% CI 1.6-4.6) per 100 patient-years among those who started d4T-based ART.
CONCLUSIONS: With the phase-out of stavudine, age-appropriate non-zidovudine options are needed for younger Asian children with severe anemia.
METHODS: Individuals enrolled in the Therapeutics Research, Education, and AIDS Training in Asia Pediatric HIV Observational Database were included if they started ART at ages 1 month-14 years and had both height and weight measurements available at ART initiation (baseline). Generalized estimating equations were used to identify factors associated with change in height-for-age z-score (HAZ), follow-up HAZ ≥ -2, change in weight-for-age z-score (WAZ), and follow-up WAZ ≥ -2.
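Height- and weight-for-age z-scores of the kind analyzed above are conventionally derived from reference growth charts using the LMS method. A minimal sketch follows; the L, M, and S values used are purely illustrative placeholders, not actual WHO reference parameters.

```python
import math

def lms_zscore(x, L, M, S):
    # LMS z-score: z = ((x/M)**L - 1) / (L*S) for L != 0,
    # and z = ln(x/M) / S when L == 0.
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)

# Illustrative parameters only (not real WHO reference values):
waz = lms_zscore(8.0, L=0.2, M=10.0, S=0.12)
underweight = waz < -2  # WAZ < -2 defines underweight, as in the study
print(round(waz, 2))
```

A measurement equal to the reference median (x = M) yields a z-score of 0, and the study's outcome thresholds (HAZ ≥ -2, WAZ ≥ -2) correspond to being within 2 standard deviations of that median.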
RESULTS: A total of 3217 children were eligible for analysis. The adjusted mean change in HAZ among cotrimoxazole and non-cotrimoxazole users did not differ significantly over the first 24 months of ART. In children who were stunted (HAZ < -2) at baseline, cotrimoxazole use was not associated with a follow-up HAZ ≥ -2. The adjusted mean change in WAZ among children with a baseline CD4 percentage (CD4%) >25% became significantly different between cotrimoxazole and non-cotrimoxazole users after 6 months of ART and remained significant after 24 months (overall P < .01). Similar changes in WAZ were observed in those with a baseline CD4% between 10% and 24% (overall P < .01). Cotrimoxazole use was not associated with a significant difference in follow-up WAZ in children with a baseline CD4% <10%. In those underweight (WAZ < -2) at baseline, cotrimoxazole use was associated with a follow-up WAZ ≥ -2 (adjusted odds ratio, 1.70 vs not using cotrimoxazole [95% confidence interval, 1.28-2.25], P < .01). This association was driven by children with a baseline CD4% ≥10%.
CONCLUSIONS: Cotrimoxazole use is associated with benefits to WAZ but not HAZ during early ART in Asian children.
METHODS: We investigated serum creatinine (S-Cr) monitoring rates before and during ART and the incidence and prevalence of renal dysfunction after starting TDF by using data from a regional cohort of HIV-infected individuals in the Asia-Pacific. Time to renal dysfunction was defined as time from TDF initiation to the decline in estimated glomerular filtration rate (eGFR) to <60 ml/min/1.73m2 with >30% reduction from baseline using the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation or the decision to stop TDF for reported TDF-nephrotoxicity. Predictors of S-Cr monitoring rates were assessed by Poisson regression and risk factors for developing renal dysfunction were assessed by Cox regression.
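The eGFR calculation and renal dysfunction criterion described above can be sketched as follows. This is a simplified illustration of the 2009 CKD-EPI creatinine equation with the race coefficient omitted; the function names are assumptions, and the study's exact implementation may differ.

```python
def ckd_epi_egfr(scr_mg_dl, age_years, female):
    # 2009 CKD-EPI creatinine equation (mL/min/1.73 m^2), race term omitted.
    # kappa and alpha are the published sex-specific constants.
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = (141
            * min(ratio, 1.0) ** alpha
            * max(ratio, 1.0) ** -1.209
            * 0.993 ** age_years)
    if female:
        egfr *= 1.018
    return egfr

def renal_dysfunction(baseline_egfr, current_egfr):
    # Study definition: eGFR <60 mL/min/1.73 m^2 with >30% reduction from baseline.
    return current_egfr < 60 and current_egfr < 0.7 * baseline_egfr

print(round(ckd_epi_egfr(1.0, 40, female=False), 1))  # eGFR for a 40-year-old man, S-Cr 1.0 mg/dL
```

Note that both conditions must hold: an eGFR below 60 alone does not meet the study's renal dysfunction end point unless it also represents a >30% decline from the pre-TDF baseline.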
RESULTS: Among 2,425 patients who received TDF, S-Cr monitoring rates increased from 1.01 to 1.84 per person per year after starting TDF (incidence rate ratio 1.68, 95%CI 1.62-1.74, p <0.001). Renal dysfunction on TDF occurred in 103 patients over 5,368 person-years of TDF use (4.2%; incidence 1.75 per 100 person-years). Risk factors for developing renal dysfunction included older age (>50 vs. ≤30 years; hazard ratio [HR] 5.39, 95%CI 2.52-11.50, p <0.001) and use of a PI-based regimen (HR 1.93, 95%CI 1.22-3.07, p = 0.005). Having an eGFR prior to TDF (pre-TDF eGFR) of ≥60 ml/min/1.73m2 showed a protective effect (HR 0.38, 95%CI 0.17-0.85, p = 0.018).
CONCLUSIONS: Renal dysfunction after commencing TDF was uncommon; however, older age, lower baseline eGFR, and PI-based ART were associated with a higher risk of renal dysfunction during TDF use in adult HIV-infected individuals in the Asia-Pacific region.
Methods: Study end points were as follows: (1) a CD4 count <200 cells/mm3 followed by a CD4 count ≥200 cells/mm3 (transient CD4 <200); (2) CD4 count <200 cells/mm3 confirmed within 6 months (confirmed CD4 <200); and (3) a new or recurrent World Health Organization (WHO) stage 3 or 4 illness (clinical failure). Kaplan-Meier curves and Cox regression were used to evaluate rates and predictors of transient CD4 <200, confirmed CD4 <200, and clinical failure among virally suppressed children aged 5-15 years who were enrolled in the TREAT Asia Pediatric HIV Observational Database.
Results: Data from 967 children were included in the analysis. At the time of confirmed viral suppression, median age was 10.2 years, 50.4% of children were female, and 95.4% were perinatally infected with HIV. Median CD4 cell count was 837 cells/mm3, and 54.8% of children were classified as having WHO stage 3 or 4 disease. In total, 18 transient CD4 <200 events, 2 confirmed CD4 <200 events, and 10 clinical failures occurred at rates of 0.73 (95% confidence interval [95% CI], 0.46-1.16), 0.08 (95% CI, 0.02-0.32), and 0.40 (95% CI, 0.22-0.75) events per 100 patient-years, respectively. CD4 <500 cells/mm3 at the time of viral suppression confirmation was associated with higher rates of both CD4 outcomes.
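Event rates per 100 patient-years of the kind reported above are crude ratios of event counts to follow-up time. A minimal sketch, using the 18 transient events above with a back-calculated follow-up of roughly 2466 patient-years (an illustrative approximation, not a figure stated in the study):

```python
def rate_per_100_py(events, person_years):
    # Crude incidence rate per 100 patient-years of follow-up.
    return 100 * events / person_years

# ~18 events over ~2466 patient-years gives roughly 0.73 per 100 patient-years,
# consistent with the transient CD4 <200 rate reported above.
print(round(rate_per_100_py(18, 2466), 2))
```

The confidence intervals quoted in the abstract would additionally require an exact or approximate Poisson interval around each crude rate.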
Conclusions: Regular CD4 testing may be unnecessary for virally suppressed children aged 5-15 years with CD4 ≥500 cells/mm3.
METHODS: Perinatally HIV-infected Asian adolescents (10-19 years) with documented virologic suppression (two consecutive viral loads [VLs] <400 copies/mL ≥6 months apart) were included. Baseline was the date of the first VL <400 copies/mL at age ≥10 years or the 10th birthday for those with prior suppression. Cox proportional hazards models were used to identify predictors of postsuppression VR (VL >1,000 copies/mL).
RESULTS: Of 1,379 eligible adolescents, 47% were males. At baseline, 22% were receiving protease inhibitor-containing regimens; median CD4 cell count (interquartile range [IQR]) was 685 (448-937) cells/mm3; 2% had preadolescent virologic failure (VF) before subsequent suppression. During adolescence, 180 individuals (13%) experienced postsuppression VR at a rate of 3.4 (95% confidence interval: 2.9-3.9) per 100 person-years, which was consistent over time. Median time to VR during adolescence (IQR) was 3.3 (2.1-4.8) years. Wasting (weight-for-age z-score
DESIGN: Ongoing observational database collating clinical data on HIV-infected children and adolescents in Asia.
METHODS: Data from 2001 to 2016 relating to adolescents (10-19 years) with perinatal HIV infection (PHIVA) were analysed to describe characteristics at adolescent entry and transition and combination antiretroviral therapy (cART) regimens across adolescence. A competing risk regression analysis was used to determine characteristics at adolescent entry associated with mortality. Outcomes at transition were compared on the basis of age at cART initiation.
RESULTS: Of 3448 PHIVA, 644 had reached transition. Median age at HIV diagnosis was 5.5 years, at cART initiation 7.2 years, and at transition 17.9 years. At adolescent entry, 35.0% had a CD4+ cell count less than 500 cells/μl and 51.1% had experienced a WHO stage III/IV clinical event. At transition, 38.9% had a CD4+ cell count less than 500 cells/μl, and 53.4% had experienced a WHO stage III/IV clinical event. Mortality rate was 0.71 per 100 person-years, with HIV RNA ≥1000 copies/ml, CD4+ cell count less than 500 cells/μl, height-for-age or weight-for-age z-score less than -2, history of a WHO stage III/IV clinical event or hospitalization, and being on a second-line or later cART regimen associated with mortality. For transitioning PHIVA, those who commenced cART at age less than 5 years had better virologic and immunologic outcomes, though were more likely to be on a second-line or later cART regimen.
CONCLUSION: Delayed HIV diagnosis and cART initiation resulted in considerable morbidity and poor immune status by adolescent entry. Durable first-line cART regimens to optimize disease control are key to minimizing mortality. Early cART initiation provides the best virologic and immunologic outcomes at transition.
METHODS: A multisite cross-sectional study was conducted in HIV-infected patients currently <25 years old receiving antiretroviral treatment (ART) who had HBV surface antigen (HBsAg), HBV surface antibody (anti-HBs), or HBV core antibody (anti-HBc) tested during 2012-2013. HBV coinfection was defined as having either a positive HBsAg test or being anti-HBc positive and anti-HBs negative, reflective of past HBV infection. HBV seroprotection was defined as having a positive anti-HBs test.
RESULTS: A total of 3380 patients from 6 countries (Vietnam, Thailand, Cambodia, Malaysia, Indonesia and India) were included. The current median (interquartile range) age was 11.2 (7.8-15.1) years. Of the 2755 patients (81.5%) with HBsAg testing, 130 (4.7%) were positive. Of 1558 (46%) with anti-HBc testing, 77 (4.9%) were positive. Thirteen of 1037 patients with all 3 tests were anti-HBc positive and HBsAg and anti-HBs negative. One child was positive for anti-HBc and negative for anti-HBs but did not have HBsAg tested. The prevalence of HBV coinfection was 144/2759 (5.2%) (95% confidence interval: 4.4-6.1). Of 1093 patients (32%) with anti-HBs testing, 257 (23.5%; confidence interval: 21.0-26.0) had positive tests representing HBV seroprotection.
CONCLUSIONS: The estimated prevalence of HBV coinfection in this cohort of Asian HIV-infected children and adolescents on ART was 5.2%. The majority of children and adolescents tested in this cohort (76.5%) did not have protective HBV antibody. The finding supports HBV screening of HIV-infected children and adolescents to guide revaccination, the use of ART with anti-HBV activity and future monitoring.
SETTING: An Asian cohort in 16 pediatric HIV services across 6 countries.
METHODS: From 2005 to 2014, patients younger than 20 years who achieved virologic suppression and had subsequent viral load testing were included. Early virologic failure was defined as an HIV RNA ≥1000 copies per milliliter within 12 months of virologic suppression, and late virologic failure as an HIV RNA ≥1000 copies per milliliter more than 12 months after virologic suppression. Characteristics at combination antiretroviral therapy initiation and virologic suppression were described, and a competing risk time-to-event analysis was used to determine the cumulative incidence of virologic failure and factors at virologic suppression associated with early and late virologic failure.
RESULTS: Of the 1105 patients included in the analysis, 182 (17.9%) experienced virologic failure. The median age at virologic suppression was 6.9 years, and the median time to virologic failure was 24.6 months after virologic suppression. The incidence rate for a first virologic failure event was 3.3 per 100 person-years. Factors at virologic suppression associated with late virologic failure included older age, mostly rural clinic setting, tuberculosis, protease inhibitor-based regimens, and early virologic failure. No risk factors were identified for early virologic failure.
CONCLUSIONS: Around 1 in 5 patients in our cohort experienced virologic failure after achieving virologic suppression. Targeted interventions are required to manage complex treatment scenarios, including adolescents, tuberculosis coinfection, and patients with poor virologic control.