METHODOLOGY/PRINCIPAL FINDINGS: The D. siamensis monovalent antivenom displayed extensive recognition of and binding to proteins in D. siamensis venom, irrespective of the geographical origin of the venom. Similar immunological characteristics were observed with the Hemato Polyvalent antivenom, which also uses D. siamensis venom as an immunogen, whereas binding levels were dramatically reduced with comparator monovalent antivenoms raised against different snake species. A similar pattern was observed for neutralization of coagulopathy: the procoagulant action of all four geographical venom variants was neutralized by both the D. siamensis monovalent and the Hemato Polyvalent antivenoms, while the comparator monovalent antivenoms were ineffective. These in vitro findings translated into therapeutic efficacy in vivo, as the D. siamensis monovalent antivenom effectively protected against the lethal effects of all four geographical venom variants preclinically. Assessments of in vivo nephrotoxicity revealed that D. siamensis venom (700 μg/kg) significantly increased plasma creatinine and blood urea nitrogen levels in anaesthetised rats. Intravenous administration of D. siamensis monovalent antivenom at three times the recommended scaled therapeutic dose, prior to and 1 h after the injection of venom, reduced markers of nephrotoxicity and prevented renal morphological changes, although lower doses had no therapeutic effect.
CONCLUSIONS/SIGNIFICANCE: This study highlights the potential broad geographical utility of the Thai D. siamensis monovalent antivenom for treating envenomings by the Eastern Russell's viper. However, only the early delivery of high antivenom doses appears to be capable of preventing venom-induced nephrotoxicity.
METHODS: We conducted a case-control study comparing 25 patients with biopsy-proven LACR against 25 stable controls matched for age group, primary diagnosis and time post-transplant. IPV was calculated as the coefficient of variation (CV) and mean absolute deviation (MAD) of tacrolimus levels in the preceding 12 months. We also assessed the percentage time for tacrolimus levels
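The two IPV metrics named above can be sketched as follows. This is a minimal illustration, not the study's actual pipeline: the use of the sample standard deviation for CV, and the normalisation of MAD to the mean (expressed as a percentage), are assumptions, since the abstract does not specify them.

```python
import statistics

def tacrolimus_ipv(levels):
    """Intrapatient variability (IPV) of tacrolimus trough levels.

    CV  = sample SD / mean * 100          (coefficient of variation, %)
    MAD = mean |x - mean| / mean * 100    (mean absolute deviation, % of mean)
    """
    mean = statistics.mean(levels)
    cv = statistics.stdev(levels) / mean * 100
    mad = sum(abs(x - mean) for x in levels) / len(levels) / mean * 100
    return cv, mad
```

Both metrics are scale-free, which is why they are usable across patients with different target trough ranges.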
METHODS: We used data from the TREAT Asia HIV Observational Database. Patients were included if they started antiretroviral therapy during or after 2003, had a serum creatinine measurement at antiretroviral therapy initiation (baseline), and had at least 2 follow-up creatinine measurements taken ≥3 months apart. Patients with a baseline estimated glomerular filtration rate (eGFR) ≤60 mL/min/1.73 m2 were excluded. Chronic kidney disease was defined as 2 consecutive eGFR values ≤60 mL/min/1.73 m2 taken ≥3 months apart. Generalized estimating equations were used to identify factors associated with eGFR change. Competing risk regression adjusted for study site, age and sex, and cumulative incidence plots were used to evaluate factors associated with chronic kidney disease (CKD).
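The CKD definition above (two consecutive eGFR values ≤60 mL/min/1.73 m2 taken ≥3 months apart) can be expressed as a simple predicate. This is a sketch under stated assumptions: "≥3 months" is approximated as ≥90 days, and only chronologically adjacent measurement pairs are checked.

```python
def has_ckd(measurements):
    """measurements: list of (days_since_baseline, egfr), sorted by time.

    CKD per the study definition: 2 consecutive eGFR values <= 60
    mL/min/1.73 m2 taken >= 3 months (approximated here as 90 days) apart.
    """
    for (t1, e1), (t2, e2) in zip(measurements, measurements[1:]):
        if e1 <= 60 and e2 <= 60 and (t2 - t1) >= 90:
            return True
    return False
```

Requiring two low values separated in time filters out transient eGFR dips, which is why a single measurement ≤60 does not qualify.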
RESULTS: Of 2547 patients eligible for this analysis, tenofovir was being used by 703 (27.6%) at baseline. Tenofovir use, high baseline eGFR, advanced HIV disease stage, and low nadir CD4 were associated with a decrease in eGFR during follow-up. Chronic kidney disease occurred at a rate of 3.4 per 1000 patient-years. Factors associated with CKD were tenofovir use, older age, low baseline eGFR, low nadir CD4, and protease inhibitor use.
CONCLUSIONS: There is an urgent need to enhance renal monitoring and management capacity among at-risk groups in Asia and improve access to less nephrotoxic antiretrovirals.
MATERIALS AND METHODS: This was an investigator-initiated, single-center, randomized, controlled clinical trial in patients with T2DM and DKD, comparing 12 weeks of a very low carbohydrate diet (the VLCBD group; <20 g daily carbohydrate intake) versus a standard low-protein (0.8 g/kg/day), low-salt diet. Patients in the VLCBD group underwent monitoring every 2 weeks, including review of their 3-day food diaries. In addition, dual-energy X-ray absorptiometry (DEXA) was performed to estimate body fat percentage.
RESULTS: The study population (n = 30) had a median age of 57 years and a median BMI of 30.68 kg/m2. Both groups showed similar total calorie intake by the end of the study, i.e. 739.33 (IQR 288.48) vs 789.92 (IQR 522.4) kcal. The VLCBD group showed significantly lower daily carbohydrate intake (27 (IQR 25) g vs 89.33 (IQR 77.4) g, p<0.001), significantly higher daily protein intake (44.08 (IQR 21.98) g vs 29.63 (IQR 16.35) g, p<0.05) and no difference in daily fat intake. Neither group showed worsening of serum creatinine at study end, and both showed consistent declines in HbA1c (1.3 (1.1) vs 0.7 (1.25) %) and fasting blood glucose (1.5 (3.37) vs 1.3 (5.7) mmol/L). The VLCBD group showed a significant reduction in total daily insulin dose (39 (22) vs 0 IU, p<0.001), increased LDL-C and HDL-C, and a decline in IL-6 levels, with contrasting results in the control group. This was associated with significant weight reduction (-4.0 (3.9) vs 0.2 (4.2) kg, p<0.001) and improvements in body fat percentage. Waist circumference (WC) was significantly reduced in the VLCBD group, even after adjustment for age, HbA1c, weight and creatinine changes. Both dietary interventions were well received, with no reported adverse events.
CONCLUSION: This study demonstrated that a very low carbohydrate diet in patients with underlying diabetic kidney disease was safe and associated with significant improvements in glycemic control, anthropometric measurements (including weight and abdominal adiposity) and IL-6 levels. Renal outcomes remained unchanged. These findings strengthen the case for this dietary intervention as part of the management of patients with diabetic kidney disease.
METHODS: We investigated serum creatinine (S-Cr) monitoring rates before and during ART, and the incidence and prevalence of renal dysfunction after starting TDF, using data from a regional cohort of HIV-infected individuals in the Asia-Pacific. Time to renal dysfunction was defined as the time from TDF initiation to either a decline in estimated glomerular filtration rate (eGFR) to <60 ml/min/1.73 m2 with a >30% reduction from baseline, using the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation, or the decision to stop TDF for reported TDF nephrotoxicity. Predictors of S-Cr monitoring rates were assessed by Poisson regression, and risk factors for developing renal dysfunction were assessed by Cox regression.
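For reference, the 2009 CKD-EPI creatinine equation named above, together with the study's renal-dysfunction criterion, can be sketched as follows. This is a minimal illustration under stated assumptions: serum creatinine is taken in mg/dL, and the race coefficient present in the original equation is omitted for brevity.

```python
def ckd_epi_egfr(scr_mg_dl, age, female):
    """2009 CKD-EPI creatinine eGFR (mL/min/1.73 m2), race coefficient omitted."""
    kappa = 0.7 if female else 0.9          # sex-specific creatinine threshold
    alpha = -0.329 if female else -0.411    # sex-specific low-creatinine exponent
    ratio = scr_mg_dl / kappa
    egfr = 141 * min(ratio, 1) ** alpha * max(ratio, 1) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    return egfr

def tdf_renal_dysfunction(baseline_egfr, current_egfr):
    """Study criterion: eGFR < 60 with a > 30% reduction from baseline."""
    return current_egfr < 60 and current_egfr < 0.7 * baseline_egfr
```

Note that the criterion is conjunctive: a patient starting from a high baseline can fall more than 30% without crossing 60, and a patient with a low baseline can cross 60 without a 30% decline; neither alone qualifies.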
RESULTS: Among 2,425 patients who received TDF, S-Cr monitoring rates increased from 1.01 to 1.84 per person per year after starting TDF (incidence rate ratio 1.68, 95%CI 1.62-1.74, p <0.001). Renal dysfunction on TDF occurred in 103 patients over 5,368 person-years of TDF use (4.2%; incidence 1.75 per 100 person-years). Risk factors for developing renal dysfunction included older age (>50 vs. ≤30 years, hazard ratio [HR] 5.39, 95%CI 2.52-11.50, p <0.001) and use of a PI-based regimen (HR 1.93, 95%CI 1.22-3.07, p = 0.005). An eGFR prior to TDF (pre-TDF eGFR) of ≥60 ml/min/1.73m2 showed a protective effect (HR 0.38, 95%CI 0.17-0.85, p = 0.018).
CONCLUSIONS: Renal dysfunction on commencing TDF use was not common; however, older age, lower baseline eGFR and PI-based ART were associated with a higher risk of renal dysfunction during TDF use in adult HIV-infected individuals in the Asia-Pacific region.
METHODS: This study included all biopsy-proven IgAN patients with ≥ 1 year of follow-up. Patients with diabetes mellitus at diagnosis and those with secondary IgAN were excluded. Medical records were reviewed for demographics, clinical presentation, blood pressure, 24-hour urine protein, serum creatinine, renal biopsy findings and treatment received. The primary outcome was defined as a combined event of 50% estimated glomerular filtration rate (eGFR) reduction or end-stage renal disease (ESRD).
RESULTS: We included 130 patients (74 females; 56 males) of mean age 38.0 ± 14.0 years and median eGFR of 75.2 (interquartile range (IQR) 49.3-101.4) ml/min/1.73m2. Eighty-four (64.6%) were hypertensive at presentation, 35 (26.9%) had nephrotic syndrome and 57 (43.8%) had nephrotic-range proteinuria (NRP). Median follow-up duration was 7.5 (IQR 4.0-13.0) years. Eighteen patients (13.8%) developed ESRD and 34 (26.2%) reached the primary outcome. Annual eGFR decline was -2.1 (IQR -5.3 to -0.1) ml/min/1.73m2/year, with a median renal survival of 20 years. Survival rates free from the combined event (50% decrease in eGFR or ESRD) at 10, 20 and 30 years were 80%, 53% and 25%, while survival free from ESRD was 87%, 73% and 65%, respectively. In the univariate analysis, time-averaged proteinuria (TA-proteinuria; hazard ratio (HR) = 2.41, 95% CI 1.77-3.30), eGFR <45 ml/min/1.73m2 at biopsy (HR = 2.35, 95% CI 1.03-5.32), hypertension (HR = 2.81, 95% CI 1.16-6.80), mean arterial pressure (HR = 1.02, 95% CI 1.01-1.04), tubular atrophy/interstitial fibrosis score (HR = 3.77, 95% CI 1.84-7.73), and cellular/fibrocellular crescent score (HR = 2.44, 95% CI 1.19-5.00) were significant predictors. Only TA-proteinuria remained significant in the multivariate analysis (HR = 2.23, 95% CI 1.57-3.16).
CONCLUSION: In our cohort, TA-proteinuria was the most important predictor of progression of IgAN, irrespective of the degree of proteinuria at presentation.