KEY FINDINGS: Among ARVs, the most common drug employed from the class of entry inhibitors is maraviroc (MVC), a CCR5 receptor antagonist. Emtricitabine (FTC) and tenofovir (TFV) are also used. Rilpivirine (RPV) and dapivirine (DPV) are the most common drugs employed from the non-nucleoside reverse transcriptase inhibitor (NNRTI) class, whereas tenofovir disoproxil fumarate (TDF) is primarily used from the nucleoside reverse transcriptase inhibitor (NRTI) class. Cabotegravir (CAB), an analog of dolutegravir, is an integrase inhibitor. Some of these drugs are also used in combination with other drugs from the same class.
SUMMARY: Some of the most common pre-exposure prophylactic strategies currently employed are the use of inhibitors, namely entry inhibitors, non-nucleoside reverse transcriptase inhibitors, nucleoside reverse transcriptase inhibitors, and integrase and protease inhibitors. In addition, we also discuss the adverse effects caused by ART in PrEP, pharmacoeconomic factors, and the use of antiretroviral prophylaxis in serodiscordant couples.
METHODS: To create a retrospective cohort of all adults with HIV released from jails and prisons in Connecticut, USA (2007-14), we linked administrative custody and pharmacy databases with mandatory HIV/AIDS surveillance monitoring and case management data. We examined time to linkage to care (LTC; defined as the first viral load measurement after release) and viral suppression at LTC. We used generalised estimating equations to identify predictors of LTC within 14 days and 30 days of release.
FINDINGS: Among 3302 incarceration periods for 1350 individuals between 2007 and 2014, 672 (21%) of 3181 periods had LTC within 14 days of release, 1042 (34%) of 3064 had LTC within 30 days of release, and 301 (29%) of 1042 had detectable viral loads at LTC. Factors positively associated with LTC within 14 days of release were intermediate (31-364 days) incarceration duration (adjusted odds ratio 1·52; 95% CI 1·19-1·95), transitional case management (1·65; 1·36-1·99), receipt of antiretroviral therapy during incarceration (1·39; 1·11-1·74), and two or more medical comorbidities (1·86; 1·48-2·36). Reincarceration (0·70; 0·56-0·88) and conditional release (0·62; 0·50-0·78) were negatively associated with LTC within 14 days. Hispanic ethnicity, bonded release, and psychiatric comorbidity were also associated with LTC within 30 days, but reincarceration was not.
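For readers reproducing adjusted odds ratios such as those above, an OR and its Wald 95% confidence interval are recovered by exponentiating a fitted model's log-odds coefficient and its error bounds. A minimal sketch; the coefficient and standard error below are illustrative placeholders, not the study's actual model output:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a log-odds coefficient and its standard error into
    an odds ratio with a Wald 95% confidence interval (OR, lower, upper)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical values chosen for illustration: beta = 0.42, SE = 0.126
or_, lo, hi = odds_ratio_ci(0.42, 0.126)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 1.52 1.19 1.95
```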
INTERPRETATION: LTC after release is suboptimal but improves when inmates' medical, psychiatric, and case management needs are identified and addressed before release. People who are rapidly cycling through jail facilities are particularly vulnerable to missed linkage opportunities. The use of integrated programmes to align justice and health-care goals has great potential to improve long-term HIV treatment outcomes.
FUNDING: US National Institutes of Health.
DESIGN: Sarcopenia (age-related muscle loss) causes significant morbidity in the elderly, leading to frequent hospitalizations, disability and death. Few studies have characterized sarcopenia in HIV-infected individuals, who experience accelerated aging.
METHODS: Sarcopenia was defined as low muscle mass with weak grip strength and/or slow gait speed, using the lowest 20th percentiles of controls as cutoffs. Multivariate logistic and linear regression analyses were used to explore risk factors and health-related outcomes associated with sarcopenia among HIV-infected individuals.
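The case definition above (low muscle mass plus weak grip and/or slow gait, each judged against control-derived 20th-percentile cutoffs) can be expressed as a simple rule. A sketch for illustration; the cutoff values used in the example are placeholders, not the study's actual thresholds:

```python
def is_sarcopenic(muscle_mass, grip_strength, gait_speed,
                  mass_cutoff, grip_cutoff, gait_cutoff):
    """Sarcopenia case definition: low muscle mass AND
    (weak grip OR slow gait), each against its cutoff."""
    low_mass = muscle_mass < mass_cutoff
    weak_grip = grip_strength < grip_cutoff
    slow_gait = gait_speed < gait_cutoff
    return low_mass and (weak_grip or slow_gait)

# Placeholder cutoffs for illustration only
print(is_sarcopenic(5.0, 20.0, 1.1,
                    mass_cutoff=5.5, grip_cutoff=26.0, gait_cutoff=0.8))  # True
```

Note that low muscle mass alone is insufficient: at least one functional criterion (grip or gait) must also fall below its cutoff.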
RESULTS: We recruited 315 HIV-infected individuals aged at least 25 years with at least a 1-year history of undetectable viral load on treatment (HIV RNA <50 copies/mL). The prevalence of sarcopenia among these 315 HIV-infected individuals was 8%. Subsequently, 153 of the 315 were matched by age, sex and ethnicity to HIV-uninfected controls. The prevalence of sarcopenia in the HIV-infected (n = 153) compared with the uninfected (n = 153) was 10% vs. 6% (P = 0.193), whereas among those at least 50 years of age it was 17% vs. 4% (P = 0.049). Associated risk factors among the HIV-infected included education level, employment status, BMI, baseline CD4 cell count, duration on NRTIs and GGT levels. Identified negative outcomes included mortality risk scores (5.42; 95% CI 1.46-9.37; P = 0.007) and functional disability (3.95; 95% CI 1.57-9.97; P = 0.004).
CONCLUSION: Sarcopenia is more prevalent in HIV-infected individuals at least 50 years old than in matched controls. Our findings highlight associations of sarcopenia with loss of independence and greater healthcare burden among treated HIV-infected individuals, necessitating early recognition and intervention.
METHODS: Data from two regional cohort observational databases were analyzed for trends in median CD4 cell counts at ART initiation and the proportion of late ART initiation (CD4 cell count <200 cells/mm³ or prior AIDS diagnosis). Predictors of late ART initiation and mortality were determined.
RESULTS: A total of 2737 HIV-positive ART-naïve patients from 22 sites in 13 Asian countries and territories were eligible. The overall median (IQR) CD4 cell count at ART initiation was 150 (46-241) cells/mm³. Median CD4 cell counts at ART initiation increased over time, from a low point of 115 cells/mm³ in 2008 to a peak of 302 cells/mm³ after 2011 (p for trend 0.002). The proportion of patients with late ART initiation significantly decreased over time from 79.1% before 2007 to 36.3% after 2011 (p for trend <0.001). Factors associated with late ART initiation were year of ART initiation (e.g. 2010 vs. before 2007; OR 0.40, 95% CI 0.27-0.59; p<0.001), sex (male vs. female; OR 1.51, 95% CI 1.18-1.93; p=0.001) and HIV exposure risk (heterosexual vs. homosexual; OR 1.66, 95% CI 1.24-2.23; p=0.001 and intravenous drug use vs. homosexual; OR 3.03, 95% CI 1.77-5.21; p<0.001). Factors associated with mortality after ART initiation were late ART initiation (HR 2.13, 95% CI 1.19-3.79; p=0.010), sex (male vs. female; HR 2.12, 95% CI 1.31-3.43; p=0.002), age (≥51 vs. ≤30 years; HR 3.91, 95% CI 2.18-7.04; p<0.001) and hepatitis C serostatus (positive vs. negative; HR 2.48, 95% CI 1.-4.36; p=0.035).
CONCLUSIONS: The median CD4 cell count at ART initiation among Asian patients has increased significantly over time, but the proportion of patients with late ART initiation remains substantial. ART initiation at higher CD4 cell counts remains a challenge. Strategic interventions to increase earlier diagnosis of HIV infection and prompt more rapid linkage to ART must be implemented.
METHODS: CLHIV aged <18 years who were on first-line cART for ≥12 months and had virological suppression (two consecutive plasma viral loads [pVL] <50 copies/mL) were included. Those who started treatment with mono/dual antiretroviral therapy, had a history of treatment interruption >14 days, or received treatment and care at sites with a pVL lower limit of detection >50 copies/mL were excluded. LLV was defined as a pVL of 50 to 1000 copies/mL, and VF as a single pVL >1000 copies/mL. Baseline was the time of the second consecutive pVL <50 copies/mL.
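Under these definitions, each post-baseline pVL measurement falls into one of three bands. A minimal sketch, with thresholds taken directly from the definitions above:

```python
def classify_pvl(pvl):
    """Band a plasma viral load (copies/mL) per the definitions above:
    suppressed (<50), low-level viraemia (50-1000), or virological failure (>1000)."""
    if pvl < 50:
        return "suppressed"
    if pvl <= 1000:
        return "LLV"
    return "VF"

print(classify_pvl(20), classify_pvl(500), classify_pvl(1500))  # suppressed LLV VF
```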
METHODS: Data on children with perinatally acquired HIV aged <18 years on first-line, non-nucleoside reverse transcriptase inhibitor-based cART with viral suppression (two consecutive pVL <400 copies/mL over a six-month period) were included from a regional cohort study; those exposed to prior mono- or dual antiretroviral treatment were excluded. Frequency of pVL monitoring was determined at the site level based on the median rate of pVL measurement: annual, 0.75 to 1.5 tests/patient/year; semi-annual, >1.5 tests/patient/year. Treatment failure was defined as virologic failure (two consecutive pVL >1000 copies/mL), change of antiretroviral drug class, or death. Baseline was the date of the second consecutive pVL <400 copies/mL. Competing risk regression models were used to identify predictors of treatment failure.
RESULTS: During January 2008 to March 2015, there were 1220 eligible children from 10 sites that performed at least annual pVL monitoring; 1042 (85%) and 178 (15%) were from sites performing annual (n = 6) and semi-annual (n = 4) pVL monitoring, respectively. Pre-cART, 675 children (55%) had World Health Organization clinical stage 3 or 4 disease, the median nadir CD4 percentage was 9%, and the median pVL was 5.2 log10 copies/mL. At baseline, the median age was 9.2 years, 64% were on nevirapine-based regimens, the median cART duration was 1.6 years, and the median CD4 percentage was 26%. Over the follow-up period, 258 (25%) children with annual and 40 (23%) with semi-annual pVL monitoring developed treatment failure, corresponding to incidence rates of 5.4 (95% CI: 4.8 to 6.1) and 4.3 (95% CI: 3.1 to 5.8) per 100 patient-years of follow-up, respectively (p = 0.27). In multivariable analyses, the frequency of pVL monitoring was not associated with treatment failure (adjusted hazard ratio: 1.12; 95% CI: 0.80 to 1.59).
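The incidence rates quoted above are simply events divided by accumulated follow-up, scaled to 100 patient-years. A one-line helper makes the arithmetic explicit; the numbers in the example are illustrative, not taken from the study:

```python
def incidence_per_100py(events, person_years):
    """Incidence rate expressed per 100 person-years of follow-up."""
    return 100.0 * events / person_years

# e.g. 12 failures over 400 person-years of follow-up
print(incidence_per_100py(12, 400))  # 3.0 per 100 person-years
```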
CONCLUSIONS: Annual compared to semi-annual pVL monitoring was not associated with an increased risk of treatment failure in our cohort of virally suppressed children with perinatally acquired HIV on first-line NNRTI-based cART.
METHODS: Patients enrolled in the TREAT Asia HIV Observational Database cohort who had been on cART for more than six months were analysed. Comorbidities included hypertension, diabetes, dyslipidaemia and impaired renal function. Treatment outcomes of patients ≥50 years of age with comorbidities were compared with those of patients <50 years and those ≥50 years without comorbidities. We analysed 5411 patients for virological failure and 5621 for immunological failure. Our failure outcomes were defined in line with the World Health Organization 2016 guidelines. Cox regression analysis was used to analyse time to first virological and immunological failure.
RESULTS: The incidence of virological failure was 7.72/100 person-years. Virological failure was less likely in patients with better adherence and a higher CD4 count at cART initiation. Those who acquired HIV through intravenous drug use were more likely to have virological failure than those infected through heterosexual contact. On univariate analysis, patients aged <50 years without comorbidities were more likely to experience virological failure than those aged ≥50 years with comorbidities (hazard ratio 1.75, 95% confidence interval (CI) 1.31 to 2.33, p
METHODS: We compared these regimens with respect to clinical, immunologic, and virologic outcomes using data from prospective studies of human immunodeficiency virus (HIV)-infected individuals in Europe and the United States in the HIV-CAUSAL Collaboration, 2004-2013. Antiretroviral therapy-naive and AIDS-free individuals were followed from the time they started a lopinavir or an atazanavir regimen. We estimated the 'intention-to-treat' effect for atazanavir vs lopinavir regimens on each of the outcomes.
RESULTS: A total of 6668 individuals started a lopinavir regimen (213 deaths, 457 AIDS-defining illnesses or deaths), and 4301 individuals started an atazanavir regimen (83 deaths, 157 AIDS-defining illnesses or deaths). The adjusted intention-to-treat hazard ratios for atazanavir vs lopinavir regimens were 0.70 (95% confidence interval [CI], .53-.91) for death, 0.67 (95% CI, .55-.82) for AIDS-defining illness or death, and 0.91 (95% CI, .84-.99) for virologic failure at 12 months. The mean 12-month increase in CD4 count was 8.15 (95% CI, -.13 to 16.43) cells/µL higher in the atazanavir group. Estimates differed by NRTI backbone.
CONCLUSIONS: Our estimates are consistent with a lower mortality, a lower incidence of AIDS-defining illness, a greater 12-month increase in CD4 cell count, and a smaller risk of virologic failure at 12 months for atazanavir compared with lopinavir regimens.
METHODS: We used Cox regression to analyze data of a cohort of Asian children.
RESULTS: A total of 2608 children were included; the median age at cART initiation was 5.7 years. A time-updated weight-for-age z score < -3 was associated with mortality (P < 0.001) independent of CD4%, and a score < -2 was associated with immunological failure (P ≤ 0.03) independent of age at cART initiation.
CONCLUSIONS: Weight monitoring provides useful data to inform clinical management of children on cART in resource-limited settings.
METHODS: Blips were defined as a detectable VL (≥50 copies/mL) preceded and followed by undetectable VLs (<50 copies/mL). Virological failure (VF) was defined as two consecutive VLs ≥50 copies/mL. Cox proportional hazards models of time to first VF after entry were developed.
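The blip definition (a detectable VL bracketed by undetectable ones) and the VF definition (two consecutive detectable VLs) can be checked over an ordered series of measurements. A sketch assuming the series is a simple list of VLs in copies/mL and that "preceded and followed" means the immediately adjacent measurements:

```python
DETECT = 50  # detection threshold, copies/mL

def find_blips(vls):
    """Indices of blips: a detectable VL (>=50 copies/mL) whose
    immediate neighbours are both undetectable (<50 copies/mL)."""
    return [i for i in range(1, len(vls) - 1)
            if vls[i] >= DETECT and vls[i - 1] < DETECT and vls[i + 1] < DETECT]

def has_virological_failure(vls):
    """VF: any two consecutive VLs >=50 copies/mL."""
    return any(a >= DETECT and b >= DETECT for a, b in zip(vls, vls[1:]))

series = [20, 20, 150, 20, 20, 600, 700, 30]
print(find_blips(series))               # [2] -- the isolated 150 is a blip
print(has_virological_failure(series))  # True -- 600 then 700 are consecutive
```

Note that the 600 in the example is not a blip, because the following measurement (700) is also detectable; together they instead meet the VF definition.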
RESULTS: 5040 patients (AHOD n = 2597 and TAHOD n = 2521) were included; 910 (18%) experienced blips. 744 (21%) of high-income and 166 (11%) of middle/low-income participants ever experienced blips. 711 (14%) experienced blips prior to virological failure; 559 (16%) of high-income and 152 (10%) of middle/low-income participants experienced blips prior to virological failure. VL testing occurred at a median frequency of every 175 and 91 days in middle/low- and high-income sites, respectively. Time to VF was longer in middle/low-income sites than in high-income sites (adjusted hazard ratio (AHR) 0.41; p<0.001), adjusted for year of first cART, hepatitis C co-infection, cART regimen, and prior blips. Prior blips were not a significant predictor of VF in univariate analysis (AHR 0.97, p = 0.82). Blips of differing magnitudes were also not significant predictors of virological failure in univariate analyses (p = 0.360 for blips 50-≤1000, p = 0.309 for blips 50-≤400 and p = 0.300 for blips 50-≤200). 209 of 866 (24%) patients were switched to an alternate regimen in the setting of a blip.
CONCLUSION: Although a lower proportion of blips occurred in low/middle-income settings, the difference between settings was not statistically significant. Nonetheless, a substantial number of participants were switched to alternative regimens in the setting of blips.
METHODS: Surveys were conducted in April 2009. Data for analysis from the TREAT Asia cohort were collected in March 2009 from 12 centres in Cambodia, India, Indonesia, Malaysia and Thailand. Data from the IeDEA Southern Africa cohort were finalized in February 2008 from 10 centres in Malawi, Mozambique, South Africa and Zimbabwe.
RESULTS: Survey responses reflected inter-regional variations in drug access and national guidelines. A total of 1301 children in the TREAT Asia and 4561 children in the IeDEA Southern Africa cohorts met inclusion criteria for the cross-sectional analysis. Ten percent of Asian and 3.3% of African children were on second-line ART at the time of data transfer. Median age (interquartile range) at second-line initiation was 120 (78-145) months in the Asian cohort and 66 (29-112) months in the southern African cohort. Regimens varied, and the then-current World Health Organization-recommended nucleoside reverse transcriptase inhibitor combination of abacavir and didanosine was used in less than 5% of children in each region.
CONCLUSIONS: In order to provide life-long ART for children, better use of current first-line regimens and broader access to heat-stable, paediatric second-line and salvage formulations are needed. There will be limited benefit to earlier diagnosis of treatment failure unless providers and patients have access to appropriate drugs for children to switch to.