METHODS: Perinatally HIV-infected Asian adolescents (10-19 years) with documented virologic suppression (two consecutive viral loads [VLs] <400 copies/mL ≥6 months apart) were included. Baseline was the date of the first VL <400 copies/mL at age ≥10 years or the 10th birthday for those with prior suppression. Cox proportional hazards models were used to identify predictors of postsuppression viral rebound (VR; VL >1,000 copies/mL).
RESULTS: Of 1,379 eligible adolescents, 47% were male. At baseline, 22% were receiving protease inhibitor-containing regimens; median CD4 cell count (interquartile range [IQR]) was 685 (448-937) cells/mm3; 2% had preadolescent virologic failure (VF) before subsequent suppression. During adolescence, 180 individuals (13%) experienced postsuppression VR at a rate of 3.4 (95% confidence interval: 2.9-3.9) per 100 person-years, a rate that was consistent over time. Median time to VR during adolescence (IQR) was 3.3 (2.1-4.8) years. Wasting (weight-for-age z-score
METHODS: Data on children with perinatally acquired HIV aged <18 years on first-line, non-nucleoside reverse transcriptase inhibitor-based cART with viral suppression (two consecutive pVL <400 copies/mL over a six-month period) were included from a regional cohort study; those exposed to prior mono- or dual antiretroviral treatment were excluded. Frequency of pVL monitoring was determined at the site level based on the median rate of pVL measurement: annual (0.75 to 1.5 tests/patient/year) or semi-annual (>1.5 tests/patient/year). Treatment failure was defined as virologic failure (two consecutive pVL >1000 copies/mL), change of antiretroviral drug class, or death. Baseline was the date of the second consecutive pVL <400 copies/mL. Competing risk regression models were used to identify predictors of treatment failure.
RESULTS: During January 2008 to March 2015, there were 1220 eligible children from 10 sites that performed at least annual pVL monitoring; 1042 (85%) and 178 (15%) were from sites performing annual (n = 6) and semi-annual (n = 4) pVL monitoring, respectively. Pre-cART, 675 children (55%) had World Health Organization clinical stage 3 or 4, the median nadir CD4 percentage was 9%, and the median pVL was 5.2 log10 copies/mL. At baseline, the median age was 9.2 years, 64% were on nevirapine-based regimens, the median cART duration was 1.6 years, and the median CD4 percentage was 26%. Over the follow-up period, 258 (25%) CLWH with annual and 40 (23%) with semi-annual pVL monitoring developed treatment failure, corresponding to incidence rates of 5.4 (95% CI: 4.8 to 6.1) and 4.3 (95% CI: 3.1 to 5.8) per 100 patient-years of follow-up, respectively (p = 0.27). In multivariable analyses, the frequency of pVL monitoring was not associated with treatment failure (adjusted hazard ratio: 1.12; 95% CI: 0.80 to 1.59).
CONCLUSIONS: Annual compared to semi-annual pVL monitoring was not associated with an increased risk of treatment failure in our cohort of virally suppressed children with perinatally acquired HIV on first-line NNRTI-based cART.
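The virologic failure component of the treatment failure definition above (two consecutive pVL >1000 copies/mL) can be sketched as a scan over ordered measurements. This is an illustrative Python helper, not the cohort's analysis code; the function name and list-based input are assumptions:

```python
from typing import Optional, Sequence

def first_virologic_failure(pvls: Sequence[float]) -> Optional[int]:
    """Index of the pVL measurement that confirms virologic failure,
    defined as two consecutive pVL >1000 copies/mL; None if failure
    never occurs in the series."""
    for i in range(1, len(pvls)):
        if pvls[i - 1] > 1000 and pvls[i] > 1000:
            return i  # failure is confirmed at the second high measurement
    return None
```

For example, `first_virologic_failure([50, 1500, 2000, 300])` returns `2`, the index of the confirmatory measurement.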
METHODS: CLHIV aged <18 years who were on first-line cART for ≥12 months and had virological suppression (two consecutive plasma viral load [pVL] <50 copies/mL) were included. Those who started treatment with mono/dual antiretroviral therapy, had a history of treatment interruption >14 days, or received treatment and care at sites with a pVL lower limit of detection >50 copies/mL were excluded. LLV was defined as a pVL of 50 to 1000 copies/mL, and VF as a single pVL >1000 copies/mL. Baseline was the time of the second pVL <50 copies/mL.
DESIGN: Death-related data were retrospectively and prospectively assessed in a longitudinal regional cohort study.
METHODS: Children under routine HIV care at sites in Cambodia, India, Indonesia, Malaysia, Thailand, and Vietnam between 2008 and 2017 were followed. Causes of death were reported and then independently and centrally reviewed. Predictors were compared using competing risks survival regression analyses.
RESULTS: Among 5918 children, 5523 (93%; 52% male) had ever been on combination antiretroviral therapy. Of 371 (6.3%) deaths, 312 (84%) occurred in those with a history of combination antiretroviral therapy (crude all-cause mortality 9.6 per 1000 person-years; total follow-up time 32 361 person-years). In this group, median age at death was 7.0 (2.9-13) years; median CD4 cell count was 73 (16-325) cells/μl. The most common underlying causes of death were pneumonia due to unspecified pathogens (17%), tuberculosis (16%), sepsis (8.0%), and AIDS (6.7%); 12% of causes were unknown. These clinical diagnoses were further grouped into AIDS-related infections (22%) and noninfections (5.8%), and non-AIDS-related infections (47%) and noninfections (11%); 12% were of unknown cause and 2.2% were not reviewed. Higher CD4 cell count and better weight-for-age z-score were protective against death.
CONCLUSION: Our standardized cause of death assessment provides robust data to inform regional resource allocation for pediatric diagnostic evaluations and prioritization of clinical interventions, and highlights the continued importance of opportunistic and nonopportunistic infections as causes of death in our cohort.
METHODS: PLHIV enrolled in the Therapeutics, Research, Education and AIDS Training in Asia (TREAT Asia) HIV Observational Database (TAHOD) who initiated ART with a low CD4 count were included; those followed for more than 1 year were censored at 12 months. Competing risk regression was used to analyse risk factors, with loss to follow-up as a competing risk.
RESULTS: A total of 1813 PLHIV were included in the study, of whom 74% were male. With 73 (4%) deaths, the overall first-year mortality rate was 4.27 per 100 person-years (PY). Thirty-eight deaths (52%) were AIDS-related, 10 (14%) were immune reconstitution inflammatory syndrome (IRIS)-related, 13 (18%) were non-AIDS-related and 12 (16%) had an unknown cause. Risk factors for mortality included a low body mass index (BMI); a CD4 count >100 cells/μL (SHR 0.12; 95% CI 0.05-0.26) was associated with a reduced hazard of mortality compared with a CD4 count ≤25 cells/μL.
CONCLUSIONS: Fifty-two per cent of early deaths were AIDS-related. Efforts to initiate ART at CD4 counts >50 cells/μL are associated with improved short-term survival rates, even in those with late stages of HIV disease.
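The crude first-year mortality rate quoted above (73 deaths, 4.27 per 100 person-years, implying roughly 1709 PY of follow-up) follows from a simple rate calculation. The log-scale Wald confidence interval below is a common choice but an assumption; the study's exact CI method is not stated in the abstract:

```python
import math

def rate_per_100py(events: int, person_years: float):
    """Crude incidence rate per 100 person-years with a log-scale Wald
    95% CI (an assumed CI method, for illustration only)."""
    rate = events / person_years * 100
    se_log = 1 / math.sqrt(events)  # SE of log(rate) for a Poisson count
    return rate, rate * math.exp(-1.96 * se_log), rate * math.exp(1.96 * se_log)
```

With the abstract's figures, `rate_per_100py(73, 1709)` yields a rate of about 4.27 per 100 PY.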
METHODS: Adults > 18 years of age on second-line ART for ≥ 6 months were eligible. Cross-sectional data on HIV viral load (VL) and genotypic resistance testing were collected or testing was conducted between July 2015 and May 2017 at 12 Asia-Pacific sites. Virological failure (VF) was defined as VL > 1000 copies/mL with a second VL > 1000 copies/mL within 3-6 months. FASTA files were submitted to Stanford University HIV Drug Resistance Database and RAMs were compared against the IAS-USA 2019 mutations list. VF risk factors were analysed using logistic regression.
RESULTS: Of 1378 patients, 74% were male and 70% acquired HIV through heterosexual exposure. At second-line switch, median [interquartile range (IQR)] age was 37 (32-42) years and median (IQR) CD4 count was 103 (43.5-229.5) cells/µL; 93% received regimens with boosted protease inhibitors (PIs). Median duration on second line was 3 years. Among 101 patients (7%) with VF, CD4 count > 200 cells/µL at switch [odds ratio (OR) = 0.36, 95% confidence interval (CI): 0.17-0.77 vs. CD4 ≤ 50] and HIV exposure through male-male sex (OR = 0.32, 95% CI: 0.17-0.64 vs. heterosexual) or injecting drug use (OR = 0.24, 95% CI: 0.12-0.49) were associated with reduced VF. Of 41 (41%) patients with resistance data, 80% had at least one RAM to nonnucleoside reverse transcriptase inhibitors (NNRTIs), 63% to NRTIs, and 35% to PIs. Of those with PI RAMs, 71% had two or more.
CONCLUSIONS: There were low proportions with VF and significant RAMs in our cohort, reflecting the durability of current second-line regimens.
SETTING: An Asian cohort in 16 pediatric HIV services across 6 countries.
METHODS: From 2005 to 2014, patients younger than 20 years who achieved virologic suppression and had subsequent viral load testing were included. Early virologic failure was defined as an HIV RNA ≥1000 copies per milliliter within 12 months of virologic suppression, and late virologic failure as an HIV RNA ≥1000 copies per milliliter after 12 months following virologic suppression. Characteristics at combination antiretroviral therapy initiation and virologic suppression were described, and a competing risk time-to-event analysis was used to determine cumulative incidence of virologic failure and factors at virologic suppression associated with early and late virologic failure.
RESULTS: Of the 1105 patients included in the analysis, 182 (17.9%) experienced virologic failure. The median age at virologic suppression was 6.9 years, and the median time to virologic failure was 24.6 months after virologic suppression. The incidence rate for a first virologic failure event was 3.3 per 100 person-years. Factors at virologic suppression associated with late virologic failure included older age, mostly rural clinic setting, tuberculosis, protease inhibitor-based regimens, and early virologic failure. No risk factors were identified for early virologic failure.
CONCLUSIONS: Around 1 in 5 children in our cohort experienced virologic failure after achieving virologic suppression. Targeted interventions are required to manage complex treatment scenarios, including adolescents, tuberculosis coinfection, and poor virologic control.
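The early/late split used above (failure within vs. after 12 months of virologic suppression) can be sketched as a small classifier. This is illustrative only; whole-calendar-month arithmetic and the handling of the boundary at exactly 12 months are assumptions:

```python
from datetime import date

def classify_failure(suppression: date, failure: date) -> str:
    """Label a confirmed virologic failure 'early' (within 12 months of
    virologic suppression) or 'late' (after 12 months), per the study
    definitions. Month counting here is a simplification."""
    months = (failure.year - suppression.year) * 12 + (failure.month - suppression.month)
    return "early" if months <= 12 else "late"
```

For example, a failure 6 months after suppression is labeled "early", and one 26 months after is labeled "late".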
METHODS: Adults living with HIV enrolled in a regional observational cohort in Asia who had initiated combination antiretroviral therapy (cART) were included in the analysis. Factors associated with new TB diagnoses after cohort entry and survival after cART initiation were analysed using Cox regression, stratified by site.
RESULTS: A total of 7355 patients from 12 countries enrolled into the cohort between 2003 and 2016 were included in the study. There were 368 reported cases of TB after cohort entry, with an incidence rate of 0.99 per 100 person-years (/100 pys). Multivariate analyses adjusted for viral load (VL), CD4 count, body mass index (BMI) and cART duration showed that cotrimoxazole (CTX) reduced the hazard for new TB infection by 28% (HR 0.72, 95% CI 0.56, 0.93). Mortality after cART initiation was 0.85/100 pys, with a median follow-up time of 4.63 years. Predictors of survival included age, female sex, hepatitis C co-infection, TB diagnosis, HIV VL, CD4 count and BMI.
CONCLUSIONS: CTX was associated with a reduction in the hazard for new TB infection but did not impact survival in our Asian cohort. The potential preventive effect of CTX against TB during periods of severe immunosuppression should be further explored.
METHODS: HIV-infected adults enrolled in the TREAT Asia HIV Observational Database were eligible if they had an HIV RNA measurement documented at the time of ART initiation. The dataset was randomly split into a derivation data set (75% of patients) and a validation data set (25%). Factors associated with pre-treatment HIV RNA <100,000 copies/mL were evaluated by logistic regression adjusted for study site. A prediction model and prediction scores were created.
RESULTS: A total of 2592 patients were included in the analysis. Median [interquartile range (IQR)] age was 35.8 (29.9-42.5) years; CD4 count was 147 (50-248) cells/mm3; and pre-treatment HIV RNA was 100,000 (34,045-301,075) copies/mL. Factors associated with pre-treatment HIV RNA <100,000 copies/mL were age <30 years [OR 1.40 vs. 41-50 years; 95% confidence interval (CI) 1.10-1.80, p = 0.01], body mass index >30 kg/m2 (OR 2.4 vs. <18.5 kg/m2; 95% CI 1.1-5.1, p = 0.02), anemia (OR 1.70; 95% CI 1.40-2.10), CD4 count >350 cells/mm3 (OR 3.9 vs. <100 cells/mm3; 95% CI 2.0-4.1), and total lymphocyte count >2000 cells/mm3 (OR 1.7 vs. <1000 cells/mm3; 95% CI 1.3-2.3). A prediction score >25 yielded a sensitivity of 46.7%, specificity of 79.1%, positive predictive value of 67.7%, and negative predictive value of 61.2% for predicting pre-treatment HIV RNA <100,000 copies/mL among derivation patients.
CONCLUSION: A prediction model for pre-treatment HIV RNA <100,000 copies/mL produced an area under the ROC curve of 0.70. A larger sample size for prediction model development, as well as for model validation, is warranted.
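The threshold-based performance figures above (sensitivity, specificity, PPV and NPV at a score cutoff of 25) follow from a standard 2x2 confusion matrix, sketched below with hypothetical data; the actual score components are in the full paper:

```python
def diagnostic_metrics(scores, truth, threshold):
    """Sensitivity, specificity, PPV and NPV of the rule 'score > threshold
    predicts the outcome' (here, pre-treatment HIV RNA <100,000 copies/mL).
    Illustrative only; inputs are hypothetical."""
    tp = sum(1 for s, t in zip(scores, truth) if s > threshold and t)
    fn = sum(1 for s, t in zip(scores, truth) if s <= threshold and t)
    fp = sum(1 for s, t in zip(scores, truth) if s > threshold and not t)
    tn = sum(1 for s, t in zip(scores, truth) if s <= threshold and not t)
    return tp / (tp + fn), tn / (tn + fp), tp / (tp + fp), tn / (tn + fn)
```

With perfectly balanced toy data, e.g. scores `[30, 30, 10, 10]` against truth `[True, False, True, False]` at threshold 25, all four metrics come out at 0.5.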
METHODS: This study included people living with HIV enrolled in a longitudinal cohort study from 2003 to 2019, receiving antiretroviral therapy (ART), and without prior tuberculosis. BMI at ART initiation was categorized using Asian BMI classifications: underweight (<18.5 kg/m2), normal (18.5-22.9 kg/m2), overweight (23-24.9 kg/m2), and obese (≥25 kg/m2). High FBG was defined as a single post-ART FBG measurement ≥126 mg/dL. Factors associated with high FBG were analyzed using Cox regression models stratified by site.
RESULTS: A total of 3939 people living with HIV (63% male) were included. In total, 50% had a BMI in the normal weight range, 23% were underweight, 13% were overweight, and 14% were obese. Median age at ART initiation was 34 years (interquartile range 29-41). Overall, 8% had a high FBG, with an incidence rate of 1.14 per 100 person-years. Factors associated with an increased hazard of high FBG included being obese (≥25 kg/m2) compared with normal weight (hazard ratio [HR] = 1.79; 95% confidence interval [CI] 1.31-2.44).
CONCLUSIONS: People living with HIV with a BMI ≥25 kg/m2 were at increased risk of high FBG. This indicates that regular assessments should be performed in those with high BMI, irrespective of the classification used.
METHODS: We used data from the TREAT Asia HIV Observational Database. Patients were included if they started antiretroviral therapy during or after 2003, had a serum creatinine measurement at antiretroviral therapy initiation (baseline), and had at least 2 follow-up creatinine measurements taken ≥3 months apart. Patients with a baseline estimated glomerular filtration rate (eGFR) ≤60 mL/min/1.73 m2 were excluded. Chronic kidney disease was defined as 2 consecutive eGFR values ≤60 mL/min/1.73 m2 taken ≥3 months apart. Generalized estimating equations were used to identify factors associated with eGFR change. Competing risk regression adjusted for study site, age and sex, and cumulative incidence plots were used to evaluate factors associated with chronic kidney disease (CKD).
RESULTS: Of 2547 patients eligible for this analysis, tenofovir was being used by 703 (27.6%) at baseline. Tenofovir use, high baseline eGFR, advanced HIV disease stage, and low nadir CD4 were associated with a decrease in eGFR during follow-up. Chronic kidney disease occurred at a rate of 3.4 per 1000 patient-years. Factors associated with CKD were tenofovir use, older age, low baseline eGFR, low nadir CD4, and protease inhibitor use.
CONCLUSIONS: There is an urgent need to enhance renal monitoring and management capacity among at-risk groups in Asia and improve access to less nephrotoxic antiretrovirals.
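The CKD case definition used in this analysis (two consecutive eGFR values ≤60 mL/min/1.73 m2 taken ≥3 months apart) can be expressed as a scan over date-ordered measurements. Approximating "3 months" as 90 days is an assumption; the study's exact window handling is not stated in the abstract:

```python
from datetime import date

def has_ckd(measurements):
    """Apply the study's CKD case definition to a date-ordered list of
    (date, eGFR) pairs: two consecutive eGFR values <=60 mL/min/1.73 m2
    taken >=3 months (here, 90 days) apart. Illustrative only."""
    for (d1, e1), (d2, e2) in zip(measurements, measurements[1:]):
        if e1 <= 60 and e2 <= 60 and (d2 - d1).days >= 90:
            return True
    return False
```

Two low values four months apart meet the definition; the same values one month apart do not.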
METHODS: Incidence of malignancy after cohort enrollment was evaluated. Factors associated with development of hematological and nonhematological malignancy were analyzed using competing risk regression, and survival time was analyzed using the Kaplan-Meier method.
RESULTS: Of 7455 patients, 107 patients (1%) developed a malignancy: 34 (0.5%) hematological [0.08 per 100 person-years (/100PY)] and 73 (1%) nonhematological (0.17/100PY). Of the hematological malignancies, non-Hodgkin lymphoma was predominant (n = 26, 76%): immunoblastic (n = 6, 18%), Burkitt (n = 5, 15%), diffuse large B-cell (n = 5, 15%), and unspecified (n = 10, 30%). Other hematological malignancies included central nervous system lymphoma (n = 7, 21%) and myelodysplastic syndrome (n = 1, 3%). Nonhematological malignancies were mostly Kaposi sarcoma (n = 12, 16%) and cervical cancer (n = 10, 14%). Risk factors for hematological malignancy included age >50 vs. ≤30 years [subhazard ratio (SHR) = 6.48, 95% confidence interval (CI): 1.79 to 23.43] and being from a high-income vs. a lower-middle-income country (SHR = 3.97, 95% CI: 1.45 to 10.84). Risk was reduced with CD4 351-500 cells/µL (SHR = 0.20, 95% CI: 0.05 to 0.74) and CD4 >500 cells/µL (SHR = 0.14, 95% CI: 0.04 to 0.78), compared to CD4 ≤200 cells/µL. Similar risk factors were seen for nonhematological malignancy, with prior AIDS diagnosis showing a weak association. Patients diagnosed with a hematological malignancy had shorter survival time compared to patients diagnosed with a nonhematological malignancy.
CONCLUSIONS: Nonhematological malignancies were more common overall, while non-Hodgkin lymphoma predominated among hematological malignancies in our cohort. PLHIV from high-income countries were more likely to be diagnosed with malignancy, indicating a potential underdiagnosis of cancer in low-income settings.
METHODS: Long-term LTFU was defined as LTFU occurring after 5 years on ART, with LTFU defined in two ways: (1) patients not seen in the previous 12 months; and (2) patients not seen in the previous 6 months. Factors associated with LTFU were analysed using competing risk regression.
RESULTS: Under the 12-month definition, the LTFU rate was 2.0 per 100 person-years (PY) [95% confidence interval (CI) 1.8-2.2] among 4889 patients included in the study. LTFU was associated with age > 50 years [sub-hazard ratio (SHR) 1.64; 95% CI 1.17-2.31] compared with 31-40 years, viral load ≥ 1000 copies/mL (SHR 1.86; 95% CI 1.16-2.97) compared with viral load < 1000 copies/mL, and hepatitis C coinfection (SHR 1.48; 95% CI 1.06-2.05). LTFU was less likely to occur in females, in individuals with higher CD4 counts, in those with self-reported adherence ≥ 95%, and in those living in high-income countries. The 6-month LTFU definition produced an incidence rate of 3.2 per 100 PY (95% CI 2.9-3.4) and had similar associations but with greater risks of LTFU for ART initiation in later years (2006-2009: SHR 2.38; 95% CI 1.93-2.94; and 2010-2011: SHR 4.26; 95% CI 3.17-5.73) compared with 2003-2005.
CONCLUSIONS: The long-term LTFU rate in our cohort was low, with older age being associated with LTFU. The increased risk of LTFU with later years of ART initiation in the 6-month analysis, but not the 12-month analysis, implies that there was a possible move towards longer HIV clinic scheduling in Asia.
METHODS: Treatment modification was defined as a change of two antiretrovirals, a drug class change or treatment interruption (TI), all for >14 days. We assessed factors associated with CD4 changes and undetectable viral load (UVL <1,000 copies/ml) at 1 year after second-line failure using linear and logistic regression, respectively. Survival time was analysed using competing risk regression.
RESULTS: Of the 328 patients who failed second-line ART in our cohorts, 208 (63%) had a subsequent treatment modification. Compared with those who continued the failing regimen, the average CD4 cell increase was higher in patients who had a modification without TI (difference = 77.5, 95% CI 35.3, 119.7), while no difference was observed among those with TI (difference = -5.3, 95% CI -67.3, 56.8). Compared with those who continued the failing regimen, the odds of achieving UVL were lower in patients with TI (OR = 0.18, 95% CI 0.06, 0.60) and similar among those who had a modification without TI (OR = 1.97, 95% CI 0.95, 4.10), with proportions of UVL of 60%, 22% and 75%, respectively. Survival time was not affected by treatment modifications.
CONCLUSIONS: CD4 cell improvements were observed in those who had treatment modification without TI compared with those on the failing regimen. When no other options are available, maintaining the same failing ART combination provided better VL control than interrupting treatment.
SETTINGS: A validation study among people living with HIV (PLHIV) aged ≥18 years in cohorts in the Asia-Pacific region.
METHODS: PLHIV with a baseline eGFR >60 mL/min/1.73 m2 were included for validation of the D:A:D CKD full version and the short version without cardiovascular risk factors. Those with <3 eGFR measurements from baseline or previous exposure to potentially nephrotoxic antiretrovirals were excluded. Kaplan-Meier methods were used to estimate the probability of CKD development. The area under the receiver operating characteristic curve (AUROC) was also used to validate the risk scores.
RESULTS: We included 5,701 participants in the full model validation (median 8.1 [IQR 4.8-10.9] years of follow-up) and 9,791 in the short model validation (median 4.9 [IQR 2.5-7.3] years of follow-up). The crude incidence rate of CKD was 8.1 (95% CI 7.3-8.9) per 1,000 person-years (PYs) in the full model cohort and 10.5 (95% CI 9.6-11.4) per 1,000 PYs in the short model cohort. The progression rates for CKD at 10 years in the full model cohort were 2.7%, 8.9% and 26.1% for the low-, medium- and high-risk groups, and 3.5%, 11.7% and 32.4% in the short model cohort. The AUROC for the full and short risk scores was 0.81 (95% CI 0.79-0.83) and 0.83 (95% CI 0.81-0.85), respectively.
CONCLUSION: The D:A:D CKD full- and short-risk score performed well in predicting CKD events among Asian PLHIV. These risk prediction models may be useful to assist clinicians in identifying individuals at high risk of developing CKD.
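The AUROC reported above equals the Mann-Whitney probability that a randomly chosen case (here, a person who developed CKD) outranks a randomly chosen non-case on the risk score, with ties counting half. A minimal sketch of that computation (illustrative, not the study's code):

```python
def auroc(scores_pos, scores_neg):
    """AUROC via the Mann-Whitney statistic: the probability that a case's
    risk score exceeds a non-case's, counting ties as 0.5. O(n*m) pairwise
    form, fine for small illustrations."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

Perfectly separated scores give an AUROC of 1.0; indistinguishable scores give 0.5, the chance level.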