SETTING: An Asian cohort in 16 pediatric HIV services across 6 countries.
METHODS: From 2005 to 2014, patients younger than 20 years who achieved virologic suppression and had subsequent viral load testing were included. Early virologic failure was defined as an HIV RNA ≥1000 copies per milliliter within 12 months of virologic suppression, and late virologic failure as an HIV RNA ≥1000 copies per milliliter more than 12 months after virologic suppression. Characteristics at combination antiretroviral therapy initiation and at virologic suppression were described, and a competing risk time-to-event analysis was used to determine the cumulative incidence of virologic failure and the factors at virologic suppression associated with early and late virologic failure.
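Cumulative incidence under competing risks is typically estimated with the Aalen-Johansen approach, which weights cause-specific event counts by the all-cause Kaplan-Meier survival. As a minimal sketch of that estimator (our illustration, not the study's actual code; event codes are an assumption):

```python
# Minimal Aalen-Johansen cumulative incidence estimator (illustrative sketch).
# Event codes (assumed): 0 = censored, 1 = event of interest, 2 = competing event.
def cumulative_incidence(times, events, cause=1):
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0   # overall (all-cause) Kaplan-Meier survival just before t
    cif = 0.0    # cumulative incidence for the cause of interest
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = d_any = 0
        j = i
        while j < len(data) and data[j][0] == t:   # handle tied event times
            if data[j][1] != 0:
                d_any += 1
                if data[j][1] == cause:
                    d_cause += 1
            j += 1
        cif += surv * d_cause / at_risk
        surv *= 1 - d_any / at_risk
        at_risk -= j - i
        curve.append((t, cif))
        i = j
    return curve
```

With virologic failure coded as the event of interest and, for example, death or loss to follow-up as competing events, the returned curve gives the cumulative incidence at each observed time.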
RESULTS: Of the 1105 patients included in the analysis, 182 (17.9%) experienced virologic failure. The median age at virologic suppression was 6.9 years, and the median time to virologic failure was 24.6 months after virologic suppression. The incidence rate for a first virologic failure event was 3.3 per 100 person-years. Factors at virologic suppression associated with late virologic failure included older age, mostly rural clinic setting, tuberculosis, protease inhibitor-based regimens, and early virologic failure. No risk factors were identified for early virologic failure.
CONCLUSIONS: Around 1 in 5 patients in our cohort experienced virologic failure after achieving virologic suppression. Targeted interventions to manage complex treatment scenarios, including adolescents, patients with tuberculosis coinfection, and those with poor virologic control, are required.
METHODS: Prospectively collected longitudinal data from patients in Thailand, Hong Kong, Malaysia, Japan, Taiwan, and South Korea were provided for analysis. Covariates included demographics, hepatitis B and C coinfections, baseline CD4 T lymphocyte count, and plasma HIV-1 RNA levels. Clinical deterioration (a new diagnosis of Centers for Disease Control and Prevention category B/AIDS-defining illness or death) was assessed by proportional hazards models. Surrogate endpoints were 12-month change in CD4 cell count and virologic suppression post therapy, evaluated by linear and logistic regression, respectively.
RESULTS: Of 1105 patients, 1036 (93.8%) infected with CRF01_AE or subtype B were eligible for inclusion in clinical deterioration analyses and contributed 1546.7 person-years of follow-up (median: 413 days, interquartile range: 169-672 days). Patients aged >40 years demonstrated smaller immunological increases (P = 0.002) and higher risk of clinical deterioration (hazard ratio = 2.17; P = 0.008). Patients with baseline CD4 cell counts >200 cells per microliter had lower risk of clinical deterioration (hazard ratio = 0.373; P = 0.003). A total of 532 patients (48.1% of eligible) had CD4 counts available at baseline and 12 months post therapy for inclusion in immunologic analyses. Patients infected with subtype B had larger increases in CD4 counts at 12 months (P = 0.024). A total of 530 patients (48.0% of eligible) were included in virological analyses, with no differences in response found between genotypes.
CONCLUSIONS: Results suggest that patients infected with CRF01_AE have reduced immunologic response to therapy at 12 months, compared with subtype B-infected counterparts. Clinical deterioration was associated with low baseline CD4 counts and older age. The lack of differences in virologic outcomes suggests that all patients have opportunities for virological suppression.
METHODS: HIV+ patients from the Australian HIV Observational Database (AHOD) and the TREAT Asia HIV Observational Database (TAHOD) meeting specific criteria were included. In these analyses, Asian and Caucasian status were defined by cohort. Factors associated with a low CD4:CD8 ratio (cutoff <0.2) prior to ART commencement, and with achieving a normal CD4:CD8 ratio (>1) at 12 and 24 months post ART commencement, were assessed using logistic regression.
RESULTS: There were 591 patients from AHOD and 2,620 patients from TAHOD who met the inclusion criteria. TAHOD patients had significantly (P<0.001) lower odds of having a baseline (prior to ART initiation) CD4:CD8 ratio greater than 0.2. After 12 months of ART, AHOD patients were more than twice as likely as TAHOD patients to achieve a normal CD4:CD8 ratio (15% versus 6%). However, after adjustment for confounding factors, there was no significant difference between cohorts in the odds of achieving a CD4:CD8 ratio >1 (P=0.475).
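The "more than twice as likely" figure corresponds to an unadjusted odds ratio computed directly from the two reported proportions; as a minimal sketch (illustrative arithmetic only, not the adjusted logistic model the study fitted):

```python
# Unadjusted odds ratio from two reported proportions (illustrative only;
# the study's comparison was adjusted for confounding factors).
def odds_ratio(p1, p0):
    return (p1 / (1 - p1)) / (p0 / (1 - p0))

# 15% (AHOD) versus 6% (TAHOD) achieving a normal CD4:CD8 ratio at 12 months
unadjusted_or = odds_ratio(0.15, 0.06)  # roughly 2.8
```

Note that after adjustment this apparent difference was no longer significant, which is why the unadjusted ratio alone can mislead.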
CONCLUSIONS: We found a significantly lower CD4:CD8 ratio prior to commencing ART in TAHOD compared with AHOD, even after adjusting for confounders. However, after adjustment, there was no significant difference between the cohorts in the odds of achieving a normal ratio. Baseline CD4+ and CD8+ counts appear to be the main drivers of this difference between the two populations.
METHODS: Factors associated with survival and failure were analyzed using Cox proportional hazards and discrete time conditional logistic models.
RESULTS: TDR, found in 60 (4.1%) of 1471 Asian treatment-naive patients, was one of the significant predictors of failure. Patients with TDR to >1 drug in their regimen were >3 times as likely to experience failure as those without TDR.
CONCLUSIONS: TDR was associated with failure in the context of non-fully sensitive regimens. Efforts are needed to incorporate resistance testing into national treatment programs.
METHODS AND FINDINGS: We reviewed all GenBank submissions of HIV-1 reverse transcriptase sequences with or without protease and identified 287 studies published between March 1, 2000, and December 31, 2013, with more than 25 recently or chronically infected ARV-naïve individuals. These studies comprised 50,870 individuals from 111 countries. Each set of study sequences was analyzed for phylogenetic clustering and the presence of 93 surveillance drug-resistance mutations (SDRMs). The median overall TDR prevalence in sub-Saharan Africa (SSA), south/southeast Asia (SSEA), upper-income Asian countries, Latin America/Caribbean, Europe, and North America was 2.8%, 2.9%, 5.6%, 7.6%, 9.4%, and 11.5%, respectively. In SSA, there was a yearly 1.09-fold (95% CI: 1.05-1.14) increase in odds of TDR since national ARV scale-up attributable to an increase in non-nucleoside reverse transcriptase inhibitor (NNRTI) resistance. The odds of NNRTI-associated TDR also increased in Latin America/Caribbean (odds ratio [OR] = 1.16; 95% CI: 1.06-1.25), North America (OR = 1.19; 95% CI: 1.12-1.26), Europe (OR = 1.07; 95% CI: 1.01-1.13), and upper-income Asian countries (OR = 1.33; 95% CI: 1.12-1.55). In SSEA, there was no significant change in the odds of TDR since national ARV scale-up (OR = 0.97; 95% CI: 0.92-1.02). An analysis limited to sequences with mixtures at less than 0.5% of their nucleotide positions—a proxy for recent infection—yielded trends comparable to those obtained using the complete dataset. Four NNRTI SDRMs—K101E, K103N, Y181C, and G190A—accounted for >80% of NNRTI-associated TDR in all regions and subtypes. Sixteen nucleoside reverse transcriptase inhibitor (NRTI) SDRMs accounted for >69% of NRTI-associated TDR in all regions and subtypes. In SSA and SSEA, 89% of NNRTI SDRMs were associated with high-level resistance to nevirapine or efavirenz, whereas only 27% of NRTI SDRMs were associated with high-level resistance to zidovudine, lamivudine, tenofovir, or abacavir. 
Of 763 viruses with TDR in SSA and SSEA, 725 (95%) were genetically dissimilar; 38 (5%) formed 19 sequence pairs. Inherent limitations of this study are that some cohorts may not represent the broader regional population and that studies were heterogeneous with respect to duration of infection prior to sampling.
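The yearly odds ratios reported above compound multiplicatively on the odds scale, so a constant yearly OR implies a projected prevalence via an odds-probability conversion. As a hypothetical illustration (the 2.8% SSA starting prevalence and the 1.09 yearly OR come from the figures above; the 5-year projection itself is ours):

```python
# Project a prevalence forward assuming a constant yearly odds ratio
# (illustrative sketch; not a model fitted in the study).
def prob_to_odds(p):
    return p / (1 - p)

def odds_to_prob(o):
    return o / (1 + o)

def project_prevalence(p0, yearly_or, years):
    return odds_to_prob(prob_to_odds(p0) * yearly_or ** years)

# e.g., SSA: 2.8% median TDR prevalence compounding at OR = 1.09 per year
# for 5 years gives a projected prevalence of about 4.2%
projected = project_prevalence(0.028, 1.09, 5)
```

This also makes clear why odds ratios near 1.0, as in SSEA (OR = 0.97), imply an essentially flat trend.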
CONCLUSIONS: Most TDR strains in SSA and SSEA arose independently, suggesting that ARV regimens with a high genetic barrier to resistance combined with improved patient adherence may mitigate TDR increases by reducing the generation of new ARV-resistant strains. A small number of NNRTI-resistance mutations were responsible for most cases of high-level resistance, suggesting that inexpensive point-mutation assays to detect these mutations may be useful for pre-therapy screening in regions with high levels of TDR. In the context of a public health approach to ARV therapy, a reliable point-of-care genotypic resistance test could identify which patients should receive standard first-line therapy and which should receive a protease-inhibitor-containing regimen.
METHODS: We analysed incident HIV diagnoses from 2015-2018 and mortality trends from 2016-2018 for three age groups: 1) 15-24 years; 2) 25-49 years; and 3) ≥50 years (older people with HIV, OPWH). AIDS was defined as a CD4 count <200 cells/μL. Mortality was defined as deaths per 1000 patients newly diagnosed with HIV within the same calendar year. Mortality rates were calculated for 2016, 2017, and 2018 and compared with age-matched general population rates, and all-cause standardized mortality ratios (SMRs) were calculated.
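An SMR divides observed deaths by the deaths expected if age-specific general-population rates applied to the cohort's person-time. A minimal sketch of that calculation (all age strata, rates, and counts below are made-up illustrative figures, not the study's data):

```python
# Standardized mortality ratio: observed deaths / expected deaths, where
# expected deaths apply age-specific reference rates to cohort person-years.
def smr(observed_deaths, person_years_by_age, reference_rates_by_age):
    expected = sum(py * reference_rates_by_age[age]
                   for age, py in person_years_by_age.items())
    return observed_deaths / expected

# Illustrative figures only (not from the study)
py = {"15-24": 1000, "25-49": 3000, "50+": 800}          # person-years
ref = {"15-24": 0.001, "25-49": 0.003, "50+": 0.02}      # deaths per person-year
# expected deaths = 1 + 9 + 16 = 26, so smr(52, py, ref) = 2.0
```

An SMR of 2.0 would mean the cohort experienced twice the deaths expected under age-matched general-population mortality.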
RESULTS: From 2015-2018, the proportion of OPWH among those annually diagnosed with HIV increased from 11.2% to 14.9% (p<0.01). At the time of diagnosis, OPWH were also significantly (p<0.01) more likely to have AIDS (43.8%) than those aged 25-49 years (29.5%) and 15-24 years (13.3%). Newly diagnosed OPWH had same-year mortality rates 3 to 8 times higher than those of age-matched groups in the Ukrainian general population.
CONCLUSIONS: These findings suggest a reassessment of HIV testing, prevention and treatment strategies in Ukraine is needed to bring OPWH into focus. OPWH are more likely to present with late-stage HIV and have higher mortality rates. Re-designing testing practices is especially crucial since OPWH are absent from targeted testing programs and are increasingly diagnosed as they present with AIDS-defining symptoms. New strategies for linkage and treatment programs should reflect the distinct needs of this target population.