  1. Han WM, Avihingsanon A, Rajasuriar R, Tanuma J, Mundhe S, Lee MP, et al.
    J Acquir Immune Defic Syndr, 2023 Feb 01;92(2):180-188.
    PMID: 36625858 DOI: 10.1097/QAI.0000000000003121
    BACKGROUND: We evaluated trends in CD4/CD8 ratio among people living with HIV (PLWH) starting antiretroviral therapy (ART) with first-line integrase strand transfer inhibitors (INSTI) compared with non-INSTI-based ART, and the incidence of CD4/CD8 ratio normalization.

    METHODS: All PLWH enrolled in adult HIV cohorts of IeDEA Asia-Pacific who started triple-drug ART and had at least 1 CD4, 1 CD8 (within a 3-month window), and 1 HIV-1 RNA measurement post-ART were included. CD4/CD8 ratio normalization was defined as a ratio ≥1. Longitudinal changes in CD4/CD8 ratio were analyzed by linear mixed model, the incidence of normalization by Cox regression, and differences in ratio recovery by group-based trajectory modeling.

    RESULTS: A total of 5529 PLWH were included; 80% were male, with a median age of 35 years (interquartile range [IQR], 29-43). First-line regimens comprised 65% NNRTI-, 19% PI-, and 16% INSTI-based ART. The median baseline CD4/CD8 ratio was 0.19 (IQR, 0.09-0.33). PLWH starting NNRTI- (P = 0.005) or PI-based ART (P = 0.030) had lower CD4/CD8 ratio recovery over 5 years compared with INSTI-based ART. During 24,304 person-years of follow-up, 32% achieved CD4/CD8 ratio normalization. After adjusting for age, sex, baseline CD4, HIV-1 RNA, HCV coinfection, and year of ART initiation, PLWH who started INSTI-based ART had higher odds of achieving CD4/CD8 ratio normalization than those on NNRTI- (P < 0.001) or PI-based ART (P = 0.015). In group-based trajectory modeling analysis, INSTI use was associated with greater odds of being in the higher ratio-recovery trajectory.

    CONCLUSIONS: INSTI use was associated with higher rates of CD4/CD8 ratio recovery and normalization in our cohort. These results emphasize the relative benefits of INSTI-based ART for immune restoration.
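The two quantities at the heart of this abstract — the CD4/CD8 normalization endpoint (ratio ≥1) and a crude incidence per 100 person-years — can be sketched directly. This is an illustrative sketch with hypothetical counts, not the study's analysis code (the actual incidence analysis used Cox regression).

```python
def cd4_cd8_ratio(cd4: float, cd8: float) -> float:
    """CD4/CD8 ratio for a single pair of measurements (cells/uL)."""
    return cd4 / cd8

def is_normalized(cd4: float, cd8: float) -> bool:
    """Normalization was defined in the study as a CD4/CD8 ratio >= 1."""
    return cd4_cd8_ratio(cd4, cd8) >= 1.0

def incidence_per_100py(events: int, person_years: float) -> float:
    """Crude incidence rate per 100 person-years of follow-up."""
    return 100.0 * events / person_years

# Hypothetical example values, for illustration only.
print(is_normalized(cd4=850, cd8=700))            # True (ratio ~1.21)
print(round(incidence_per_100py(320, 4400), 2))   # 7.27
```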

  2. Ku NS, Jiamsakul A, Ng OT, Yunihastuti E, Cuong DD, Lee MP, et al.
    Medicine (Baltimore), 2016 Aug;95(32):e4570.
    PMID: 27512885 DOI: 10.1097/MD.0000000000004570
    Elevated CD8 counts with combination antiretroviral therapy (cART) initiation may be an early warning indicator for future treatment failure. Thus, we investigated whether elevated CD8 counts were associated with virological failure (VF) in the first 4 years of cART in Asian HIV-infected patients in a multicenter regional cohort.

    We included patients from the TREAT Asia HIV Observational Database (TAHOD). Patients were included in the analysis if they started cART between 1996 and 2013 with at least one CD8 measurement within 6 months prior to cART initiation and at least one CD8 and viral load (VL) measurement beyond 6 months after starting cART. We defined VF as VL ≥400 copies/mL after 6 months on cART. Elevated CD8 was defined as CD8 ≥1200 cells/μL. Time to VF was modeled using Cox regression analysis, stratified by site.

    In total, 2475 patients from 19 sites were included in this analysis, of whom 665 (27%) experienced VF in the first 4 years of cART. The overall rate of VF was 12.95 per 100 person-years. In the multivariate model, the most recent elevated CD8 was significantly associated with a greater hazard of VF (HR = 1.35, 95% CI 1.14-1.61; P = 0.001). However, the sensitivity analysis showed that time-lagged CD8 measured at least 6 months prior to our virological endpoint was not statistically significant (P = 0.420).

    This study indicates that the relationship between the most recent CD8 count and VF was possibly due to the CD8 cells reacting to the increase in VL rather than causing the VL increase itself. However, CD8 levels may be a useful indicator for VF in HIV-infected patients after starting cART.
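The methodological point here is the time-lagged sensitivity analysis: using the most recent CD8 taken at least ~6 months before the virological endpoint, rather than the most recent value overall. A minimal sketch of that selection step, with hypothetical data and thresholds taken from the abstract:

```python
from datetime import date, timedelta

ELEVATED_CD8 = 1200        # cells/uL, study definition of elevated CD8
VF_THRESHOLD = 400         # copies/mL, VF definition after 6 months on cART
LAG = timedelta(days=182)  # ~6-month lag used in the sensitivity analysis

def lagged_cd8(measurements, endpoint: date):
    """Most recent CD8 taken at least ~6 months before the endpoint date.

    `measurements` is a list of (date, cd8_count) pairs (hypothetical
    format); returns None when no measurement satisfies the lag.
    """
    eligible = [(d, c) for d, c in measurements if d <= endpoint - LAG]
    return max(eligible, key=lambda pair: pair[0])[1] if eligible else None

# Hypothetical patient: the 2011-02-01 value is too close to the endpoint,
# so the lagged covariate falls back to the 2010-09-01 value.
cd8_series = [(date(2010, 1, 10), 950),
              (date(2010, 9, 1), 1300),
              (date(2011, 2, 1), 1500)]
vf_date = date(2011, 3, 15)
c = lagged_cd8(cd8_series, vf_date)
print(c, c is not None and c >= ELEVATED_CD8)  # 1300 True
```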
  3. Tanuma J, Jiamsakul A, Makane A, Avihingsanon A, Ng OT, Kiertiburanakul S, et al.
    PLoS One, 2016;11(8):e0161562.
    PMID: 27560968 DOI: 10.1371/journal.pone.0161562
    BACKGROUND: In resource-limited settings, routine monitoring of renal function during antiretroviral therapy (ART) has not been recommended. However, concerns for tenofovir disoproxil fumarate (TDF)-related nephrotoxicity persist with increased use.

    METHODS: We investigated serum creatinine (S-Cr) monitoring rates before and during ART and the incidence and prevalence of renal dysfunction after starting TDF by using data from a regional cohort of HIV-infected individuals in the Asia-Pacific. Time to renal dysfunction was defined as time from TDF initiation to the decline in estimated glomerular filtration rate (eGFR) to <60 ml/min/1.73m2 with >30% reduction from baseline using the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation or the decision to stop TDF for reported TDF-nephrotoxicity. Predictors of S-Cr monitoring rates were assessed by Poisson regression and risk factors for developing renal dysfunction were assessed by Cox regression.

    RESULTS: Among 2,425 patients who received TDF, S-Cr monitoring rates increased from 1.01 to 1.84 per person per year after starting TDF (incidence rate ratio 1.68, 95%CI 1.62-1.74, p <0.001). Renal dysfunction on TDF occurred in 103 patients over 5,368 person-years of TDF use (4.2%; incidence 1.75 per 100 person-years). Risk factors for developing renal dysfunction included older age (>50 vs. ≤30 years: hazard ratio [HR] 5.39, 95%CI 2.52-11.50, p <0.001) and use of a PI-based regimen (HR 1.93, 95%CI 1.22-3.07, p = 0.005). Having a pre-TDF eGFR of ≥60 ml/min/1.73m2 showed a protective effect (HR 0.38, 95%CI 0.17-0.85, p = 0.018).

    CONCLUSIONS: Renal dysfunction after commencing TDF was uncommon; however, older age, lower baseline eGFR, and PI-based ART were associated with a higher risk of renal dysfunction during TDF use in adult HIV-infected individuals in the Asia-Pacific region.
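The renal endpoint above combines the 2009 CKD-EPI creatinine equation with a relative-decline criterion (eGFR <60 ml/min/1.73m2 with >30% reduction from baseline). A minimal sketch of both pieces follows; the race coefficient of the 2009 equation is omitted for brevity, and all input values are hypothetical.

```python
def ckd_epi_egfr(scr_mg_dl: float, age: float, female: bool) -> float:
    """2009 CKD-EPI creatinine equation (race coefficient omitted here).

    scr_mg_dl: serum creatinine in mg/dL; returns eGFR in ml/min/1.73m2.
    """
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = 141.0 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
    return egfr * 1.018 if female else egfr

def renal_dysfunction(egfr_now: float, egfr_baseline: float) -> bool:
    """Study endpoint: eGFR < 60 with > 30% reduction from baseline."""
    return egfr_now < 60.0 and egfr_now < 0.7 * egfr_baseline

# Hypothetical 40-year-old man: S-Cr rises from 0.9 to 1.6 mg/dL on TDF.
baseline = ckd_epi_egfr(0.9, 40, female=False)   # ~106 ml/min/1.73m2
on_tdf = ckd_epi_egfr(1.6, 40, female=False)     # ~53 ml/min/1.73m2
print(renal_dysfunction(on_tdf, baseline))       # True
```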

  4. Rhee SY, Blanco JL, Jordan MR, Taylor J, Lemey P, Varghese V, et al.
    PLoS Med, 2015 Apr;12(4):e1001810.
    PMID: 25849352 DOI: 10.1371/journal.pmed.1001810
    BACKGROUND: Regional and subtype-specific mutational patterns of HIV-1 transmitted drug resistance (TDR) are essential for informing first-line antiretroviral (ARV) therapy guidelines and designing diagnostic assays for use in regions where standard genotypic resistance testing is not affordable. We sought to understand the molecular epidemiology of TDR and to identify the HIV-1 drug-resistance mutations responsible for TDR in different regions and virus subtypes.

    METHODS AND FINDINGS: We reviewed all GenBank submissions of HIV-1 reverse transcriptase sequences with or without protease and identified 287 studies published between March 1, 2000, and December 31, 2013, with more than 25 recently or chronically infected ARV-naïve individuals. These studies comprised 50,870 individuals from 111 countries. Each set of study sequences was analyzed for phylogenetic clustering and the presence of 93 surveillance drug-resistance mutations (SDRMs). The median overall TDR prevalence in sub-Saharan Africa (SSA), south/southeast Asia (SSEA), upper-income Asian countries, Latin America/Caribbean, Europe, and North America was 2.8%, 2.9%, 5.6%, 7.6%, 9.4%, and 11.5%, respectively. In SSA, there was a yearly 1.09-fold (95% CI: 1.05-1.14) increase in odds of TDR since national ARV scale-up attributable to an increase in non-nucleoside reverse transcriptase inhibitor (NNRTI) resistance. The odds of NNRTI-associated TDR also increased in Latin America/Caribbean (odds ratio [OR] = 1.16; 95% CI: 1.06-1.25), North America (OR = 1.19; 95% CI: 1.12-1.26), Europe (OR = 1.07; 95% CI: 1.01-1.13), and upper-income Asian countries (OR = 1.33; 95% CI: 1.12-1.55). In SSEA, there was no significant change in the odds of TDR since national ARV scale-up (OR = 0.97; 95% CI: 0.92-1.02). An analysis limited to sequences with mixtures at less than 0.5% of their nucleotide positions—a proxy for recent infection—yielded trends comparable to those obtained using the complete dataset. Four NNRTI SDRMs—K101E, K103N, Y181C, and G190A—accounted for >80% of NNRTI-associated TDR in all regions and subtypes. Sixteen nucleoside reverse transcriptase inhibitor (NRTI) SDRMs accounted for >69% of NRTI-associated TDR in all regions and subtypes. In SSA and SSEA, 89% of NNRTI SDRMs were associated with high-level resistance to nevirapine or efavirenz, whereas only 27% of NRTI SDRMs were associated with high-level resistance to zidovudine, lamivudine, tenofovir, or abacavir. Of 763 viruses with TDR in SSA and SSEA, 725 (95%) were genetically dissimilar; 38 (5%) formed 19 sequence pairs. Inherent limitations of this study are that some cohorts may not represent the broader regional population and that studies were heterogeneous with respect to duration of infection prior to sampling.

    CONCLUSIONS: Most TDR strains in SSA and SSEA arose independently, suggesting that ARV regimens with a high genetic barrier to resistance combined with improved patient adherence may mitigate TDR increases by reducing the generation of new ARV-resistant strains. A small number of NNRTI-resistance mutations were responsible for most cases of high-level resistance, suggesting that inexpensive point-mutation assays to detect these mutations may be useful for pre-therapy screening in regions with high levels of TDR. In the context of a public health approach to ARV therapy, a reliable point-of-care genotypic resistance test could identify which patients should receive standard first-line therapy and which should receive a protease-inhibitor-containing regimen.
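The point-mutation-assay idea in the conclusions reduces to checking a sequence's mutation list against the four key NNRTI SDRMs the study identified (K101E, K103N, Y181C, G190A). A minimal sketch, assuming a hypothetical input format of mutation strings like "K103N":

```python
# The four NNRTI SDRMs reported to account for >80% of NNRTI-associated
# TDR in all regions and subtypes.
KEY_NNRTI_SDRMS = {"K101E", "K103N", "Y181C", "G190A"}

def screen_mutations(observed: set) -> set:
    """Return which of the four key NNRTI SDRMs appear in a sequence's
    mutation list (hypothetical input format: strings like 'K103N')."""
    return observed & KEY_NNRTI_SDRMS

# Hypothetical genotype result: M184V is an NRTI mutation and is ignored.
print(sorted(screen_mutations({"M184V", "K103N", "Y181C"})))  # ['K103N', 'Y181C']
```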
