Other Research
John W Larkin, Maggie Han, Hao Han, Murilo H Guedes, Priscila Bezerra Gonçalves, Carlos Eduardo Poli-de-Figueiredo, Américo Lourenço Cuvello-Neto, Ana Beatriz L Barra, Thyago Proença de Moraes, Len A Usvyat, Peter Kotanko, Maria Eugenia F Canziani, Jochen G Raimann, Roberto Pecoits-Filho
No abstract available
Alhaji Cherif, Nadja Grobe, Xiaoling Wang, Peter Kotanko
No abstract available
Marijke J E Dekker, Len A Usvyat, Constantijn J A M Konings, Jeroen P Kooman, Bernard Canaud, Paola Carioni, Daniele Marcelli, Frank M van der Sande, Vaibhav Maheshwari, Yuedong Wang, Peter Kotanko, Jochen G Raimann
No abstract available
Jochen G Raimann, Joseph Marfo Boaheng, Philipp Narh, Harrison Matti, Seth Johnson, Linda Donald, Hongbin Zhang, Friedrich Port, Nathan W Levin
No abstract available
Vaibhav Maheshwari, Robert S Hoffman, Stephan Thijssen, Xia Tao, Doris H Fuertinger, Peter Kotanko
No abstract available
Juliana Leme, Murilo Guedes, John Larkin, Maggie Han, Ana Beatriz Lesqueves Barra, Maria Eugenia F Canziani, Américo Lourenço Cuvello Neto, Carlos Eduardo Poli-de-Figueiredo, Thyago Proenca de Moraes, Roberto Pecoits-Filho
No abstract available
Bernard Canaud, Jeroen P Kooman, Nicholas M Selby, Maarten W Taal, Susan Francis, Andreas Maierhofer, Pascal Kopperschmidt, Allan Collins, Peter Kotanko
Hemodialysis has saved many lives, albeit with significant residual mortality. Although poor outcomes may reflect advanced age and comorbid conditions, hemodialysis per se may harm patients, contributing to morbidity and perhaps mortality. Systemic circulatory "stress" resulting from the hemodialysis treatment schedule may act as a disease modifier, resulting in multiorgan injury superimposed on preexistent comorbidities. New functional intradialytic imaging (e.g., echocardiography, cardiac magnetic resonance imaging [MRI]) and the kinetics of specific cardiac biomarkers (e.g., troponin I) have clearly documented this additional source of end-organ damage. In this context, several factors resulting from the patient-hemodialysis interaction and/or patient management have been identified. Intradialytic hypovolemia, hypotensive episodes, hypoxemia, solute and electrolyte fluxes, and cardiac arrhythmias are among the hemodialysis-induced factors contributing to systemic circulatory stress. Additionally, these factors contribute to patients' symptom burden, impair cognitive function, and ultimately have a negative impact on patients' perception and quality of life. In this review, we summarize the adverse systemic effects of current intermittent hemodialysis therapy and their pathophysiologic consequences, review the evidence for cardioprotective interventions, and explore new approaches that may further reduce the systemic burden of hemodialysis. These include improved biocompatible materials, smart dialysis machines that may automatically control the fluxes of solutes and electrolytes as well as volume and hemodynamics, health trackers, and potentially disruptive technologies facilitating a more personalized medicine approach.
Sheetal Chaudhuri, Andrew Long, Hanjie Zhang, Caitlin Monaghan, John W Larkin, Peter Kotanko, Shashi Kalaskar, Jeroen P Kooman, Frank M van der Sande, Franklin W Maddux, Len A Usvyat
No abstract available
N Pilia, S Severi, J G Raimann, S Genovesi, O Dössel, P Kotanko, C Corsi, A Loewe
No abstract available
Murilo Guedes, Ana Claudia Dambiski, Sinaia Canhada, Ana Beatriz L Barra, Carlos Eduardo Poli-de-Figueiredo, Américo Lourenço Cuvello Neto, Maria Eugênia F Canziani, Jorge Paulo Strogoff-de-Matos, Jochen G Raimann, John Larkin, Bernard Canaud, Roberto Pecoits-Filho
No abstract available
Xiaoling Wang, Amrish Patel, Lela Tisdale, Zahin Haq, Xiaoling Ye, Rachel Lasky, Priscila Preciado, Xia Tao, Gabriela Ferreira Dias, Joshua E Chao, Mohamad Hakim, Maggie Han, Ohnmar Thwin, Jochen Raimann, Dinesh Chatoth, Peter Kotanko, Nadja Grobe
BACKGROUND: To date, it is unclear whether SARS-CoV-2 is present in spent dialysate from patients with COVID-19 on peritoneal dialysis (PD). Our aim was to assess the presence or absence of SARS-CoV-2 in spent dialysate from patients on chronic PD who had a confirmed diagnosis of COVID-19.
METHODS: Spent PD dialysate samples from patients on PD who were positive for COVID-19 were collected between March and August 2020. The multiplexed, real-time RT-PCR assay contained primer/probe sets specific to different SARS-CoV-2 genomic regions and to bacteriophage MS2 as an internal process control for nucleic acid extraction. Demographic and clinical data were obtained from patients' electronic health records.
RESULTS: A total of 26 spent PD dialysate samples were collected from 11 patients from ten dialysis centers. Spent PD dialysate samples were collected, on average, 25±13 days (median, 20; range, 10-45) after the onset of symptoms. The temporal distance of PD effluent collection relative to the closest positive nasal-swab RT-PCR result was 15±11 days (median, 14; range, 1-41). All 26 PD effluent samples tested negative at three SARS-CoV-2 genomic regions.
CONCLUSIONS: Our findings indicate the absence of SARS-CoV-2 in spent PD dialysate collected at ≥10 days after the onset of COVID-19 symptoms. We cannot rule out the presence of SARS-CoV-2 in spent PD dialysate in the early stage of COVID-19.
Murilo Guedes, Roberto Pecoits-Filho, Juliana El Ghoz Leme, Yue Jiao, Jochen G Raimann, Yuedong Wang, Peter Kotanko, Thyago Proença de Moraes, Ravi Thadhani, Franklin W Maddux, Len A Usvyat, John W Larkin
BACKGROUND: Dialysis recovery time (DRT) surveys capture the perceived time after hemodialysis (HD) to return to performing regular activities. Prior studies suggest the majority of HD patients report a DRT > 2 h. However, the profiles of and modifiable dialysis practices associated with changes in DRT relative to the start of dialysis are unknown. We hypothesized that HD dose and rates of intradialytic hypotension (IDH) would be associated with changes in DRT in the first years after initiating dialysis.
METHODS: We analyzed data from adult HD patients who responded to a DRT survey ≤180 days from their first date of dialysis (FDD) during 2014 to 2017. The DRT survey was administered with the annual KDQOL survey and asks: "How long does it take you to be able to return to your normal activities after your dialysis treatment?" Answer choices are: <0.5, 0.5-to-1, 1-to-2, 2-to-4, or >4 h. An adjusted logistic regression model computed the odds ratio for a change to a longer DRT (increase above DRT > 2 h) in reference to a change to a shorter DRT (decrease below DRT < 2 h, or from DRT > 4 h). Changes in DRT were calculated from the incident (≤180 days FDD) to the first-prevalent (>365-to-≤545 days FDD) and second-prevalent (>730-to-≤910 days FDD) years.
RESULTS: Among 98,616 incident HD patients (age 62.6 ± 14.4 years, 57.8% male) who responded to the DRT survey, a higher spKt/V in the incident period was associated with a 13.5% (OR = 0.865; 95% CI 0.801-to-0.935) lower risk of a change to a longer DRT in the first-prevalent year. A higher number of HD treatments with IDH episodes per month in the incident period was associated with a 0.8% (OR = 1.008; 95% CI 1.001-to-1.015) and 1.6% (OR = 1.016; 95% CI 1.006-to-1.027) higher probability of a change to a longer DRT in the first- and second-prevalent years, respectively. Consistently, an increased incidence of IDH episodes/month was associated with a change to a longer DRT over time.
CONCLUSIONS: Incident patients who had a higher spKt/V and fewer sessions with IDH episodes had a lower likelihood of changing to a longer DRT in the first year of HD. Dose optimization strategies with cardiac stability in fluid removal should be tested.
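The percentage changes quoted in the results are the standard transformation of an odds ratio into a relative change in odds (e.g., OR = 0.865 corresponds to (0.865 − 1) × 100 ≈ 13.5% lower odds). A minimal sketch of that arithmetic (illustrative only, not the study's code):

```python
def odds_ratio_to_pct_change(odds_ratio: float) -> float:
    """Express an odds ratio as a percent change in odds relative to 1.0.

    Negative values mean lower odds; positive values mean higher odds.
    """
    return (odds_ratio - 1.0) * 100.0

print(odds_ratio_to_pct_change(0.865))  # ~ -13.5, i.e., 13.5% lower odds
print(odds_ratio_to_pct_change(1.016))  # ~ +1.6, i.e., 1.6% higher odds
```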
Gabriela Ferreira Dias, Nadja Grobe, Sabrina Rogg, David J Jörg, Roberto Pecoits-Filho, Andréa Novais Moreno-Amaral, Peter Kotanko
Red blood cells (RBCs) are the most abundant cells in the blood. Despite powerful defense systems against chemical and mechanical stressors, their life span is limited to about 120 days in healthy humans and is further shortened in patients with kidney failure. Changes in the cell membrane potential and cation permeability trigger a cascade of events that leads to the exposure of phosphatidylserine on the outer leaflet of the RBC membrane. The translocation of phosphatidylserine is an important step in a process that eventually results in eryptosis, the programmed death of an RBC. The regulation of eryptosis is complex and involves several cellular pathways, such as the regulation of non-selective cation channels. Increased cytosolic calcium concentration results in scramblase and floppase activation, exposing phosphatidylserine on the cell surface and leading to early clearance of RBCs from the circulation by phagocytic cells. While eryptosis is physiologically meaningful to recycle iron and other RBC constituents in healthy subjects, it is augmented under pathological conditions such as kidney failure. In chronic kidney disease (CKD) patients, the number of eryptotic RBCs is significantly increased, resulting in a shortened RBC life span that further compounds renal anemia. In CKD patients, uremic toxins, oxidative stress, hypoxemia, and inflammation contribute to the increased eryptosis rate. Eryptosis may thus have an impact on renal anemia and, depending on the degree of RBC life span shortening, the administration of erythropoiesis-stimulating agents is often insufficient to attain desired hemoglobin target levels. The goal of this review is to highlight the importance of eryptosis as a process closely related to RBC life span reduction that aggravates renal anemia.
Jeroen P Kooman, Peter Stenvinkel, Paul G Shiels, Martin Feelisch, Bernard Canaud, Peter Kotanko
Patients treated with hemodialysis (HD) repeatedly undergo intradialytic low arterial oxygen saturation and low central venous oxygen saturation, reflecting an imbalance between upper body systemic oxygen supply and demand, which are associated with increased mortality. Abnormalities along the entire oxygen cascade, with impaired diffusive and convective oxygen transport, contribute to the reduced tissue oxygen supply. HD treatment impairs pulmonary gas exchange and reduces ventilatory drive, whereas ultrafiltration can reduce tissue perfusion due to a decline in cardiac output. In addition to these factors, capillary rarefaction and reduced mitochondrial efficacy can further affect the balance between cellular oxygen supply and demand. Whereas it has been convincingly demonstrated that a reduced perfusion of heart and brain during HD contributes to organ damage, the significance of systemic hypoxia remains uncertain, although it may contribute to oxidative stress, systemic inflammation, and accelerated senescence. These abnormalities along the oxygen cascade of patients treated with HD appear to be diametrically opposite to the situation in Tibetan highlanders and Sherpa, whose physiology adapted to the inescapable hypobaric hypoxia of their living environment over many generations. Their adaptation includes pulmonary, vascular, and metabolic alterations with enhanced capillary density, nitric oxide production, and mitochondrial efficacy without oxidative stress. Improving the tissue oxygen supply in patients treated with HD depends primarily on preventing hemodynamic instability by increasing dialysis time/frequency or prescribing cool dialysis. Whether dietary or pharmacological interventions, such as the administration of L-arginine, fermented food, nitrate, nuclear factor erythroid 2-related factor 2 agonists, or prolyl hydroxylase 2 inhibitors, improve clinical outcome in patients treated with HD warrants future research.
Caitlin K Monaghan, John W Larkin, Sheetal Chaudhuri, Hao Han, Yue Jiao, Kristine M Bermudez, Eric D Weinhandl, Ines A Dahne-Steuber, Kathleen Belmonte, Luca Neri, Peter Kotanko, Jeroen P Kooman, Jeffrey L Hymes, Robert J Kossmann, Len A Usvyat, Franklin W Maddux
BACKGROUND: We developed a machine learning (ML) model that predicts the risk of a patient on hemodialysis (HD) having an undetected SARS-CoV-2 infection that is identified in the subsequent ≥3 days.
METHODS: As part of a healthcare operations effort, we used patient data from a national network of dialysis clinics (February-September 2020) to develop an ML model (XGBoost) that uses 81 variables to predict the likelihood of an adult patient on HD having an undetected SARS-CoV-2 infection that is identified in the subsequent ≥3 days. We used a 60%:20%:20% randomized split of COVID-19-positive samples for the training, validation, and testing datasets.
RESULTS: We used a select cohort of 40,490 patients on HD to build the ML model (11,166 patients who were COVID-19 positive and 29,324 unaffected controls). The prevalence of COVID-19 in this cohort (28% positive) was by design higher than in the HD population; the prevalence in the testing dataset was set to 10% to approximate the prevalence observed in the national HD population. The threshold for classifying observations as positive or negative was set at 0.80 to minimize false positives. In the testing dataset, precision was 0.52, recall was 0.07, and lift was 5.3; the area under the receiver operating characteristic curve (AUROC) was 0.68 and the area under the precision-recall curve (AUPRC) was 0.24.
CONCLUSIONS: The developed ML model appears suitable for predicting which patients on HD are at risk of having COVID-19 at least 3 days before there would be a clinical suspicion of the disease.
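The reported precision, recall, and lift follow directly from the confusion matrix at the fixed 0.80 decision threshold; lift is precision divided by the prevalence of positives. A generic sketch of these metrics (illustrative function and data, not the study's implementation):

```python
def classification_metrics(y_true, y_score, threshold=0.80):
    """Precision, recall, and lift at a fixed decision threshold.

    Lift is precision divided by the base prevalence of positives,
    i.e., how much better the classifier is than labeling at random.
    """
    y_pred = [1 if s >= threshold else 0 for s in y_score]
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    prevalence = sum(y_true) / len(y_true)
    lift = precision / prevalence if prevalence else 0.0
    return precision, recall, lift
```

With the paper's figures, a precision of 0.52 at 10% test-set prevalence gives a lift of 0.52 / 0.10 ≈ 5.2, consistent with the reported 5.3.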
Maggie Han, Xiaoling Ye, Sharon Rao, Schantel Williams, Stephan Thijssen, Jeffrey Hymes, Franklin W Maddux, Peter Kotanko
BACKGROUND/AIMS: Hepatitis B (HB) vaccination in hemodialysis patients is important as they are at a higher risk of contracting HB. However, hemodialysis patients have a lower HB seroconversion rate than their healthy counterparts. As better sleep has been associated with better seroconversion in healthy populations, and early hemodialysis start has been linked to significant sleep-wake disturbances in hemodialysis patients, we examined whether hemodialysis treatment start time is associated with HB vaccination response.
METHODS: Demographics, standard-of-care clinical, laboratory, and treatment parameters, dialysis shift data, HB antigen status, HB vaccination status, and HB titers were collected from hemodialysis patients in Fresenius clinics from January 2010 to December 2015. Patients in our analysis received 90% of dialysis treatments either before or after 8:30 a.m., were negative for HB antigen, and received a complete series of HB vaccination (Engerix B® or Recombivax HB™). Univariate and multivariate regression models examined whether dialysis start time is a predictor of HB vaccination response.
RESULTS: Patients were 65 years old on average, 57% were male, and they had an HD vintage of 10 months. Patients whose dialysis treatments started before 8:30 a.m. were more likely to be younger, male, and have a greater dialysis vintage. Patients receiving Engerix B® who started dialysis before 8:30 a.m. had a significantly higher seroconversion rate than patients who started dialysis after 8:30 a.m. Early dialysis start was a significant predictor of seroconversion in univariate and multivariate regression including male gender, but not in multivariate regression including age, neutrophil-to-lymphocyte ratio, and vintage.
CONCLUSION: While better sleep following vaccination is associated with seroconversion in the general population, this is not the case in hemodialysis patients after multivariate adjustment. In the context of end-stage kidney disease, early dialysis start is not a significant predictor of HB vaccination response. The association between objectively measured postvaccination sleep duration and seroconversion rate should be investigated.
Priscila Preciado, Leticia M Tapia Silva, Xiaoling Ye, Hanjie Zhang, Yuedong Wang, Peter Waguespack, Jeroen P Kooman, Peter Kotanko
BACKGROUND: Maintenance hemodialysis (MHD) patients are particularly vulnerable to coronavirus disease 2019 (COVID-19), a viral disease that may cause interstitial pneumonia, impaired alveolar gas exchange and hypoxemia. We ascertained the time course of intradialytic arterial oxygen saturation (SaO2) in MHD patients between 4 weeks pre-diagnosis and the week post-diagnosis of COVID-19.
METHODS: We conducted a quality improvement project in confirmed COVID-19 in-center MHD patients from 11 dialysis facilities. In patients with an arterio-venous access, SaO2 was measured once per minute during dialysis using the Crit-Line monitor (Fresenius Medical Care, Waltham, MA, USA). We extracted demographic, clinical, treatment and laboratory data, and COVID-19-related symptoms from the patients' electronic health records.
RESULTS: Intradialytic SaO2 was available in 52 patients (29 males; mean ± standard deviation age 66.5 ± 15.7 years) contributing 338 HD treatments. The mean time between onset of symptoms indicative of COVID-19 and diagnosis was 1.1 days (median 0; range 0-9). Prior to COVID-19 diagnosis, the rate of HD treatments with hypoxemia, defined as a treatment-level average SaO2 <90%, increased from 2.8% (2-4 weeks pre-diagnosis) to 12.2% (1 week pre-diagnosis) and 20.7% (3 days pre-diagnosis). Intradialytic O2 supplementation increased sharply post-diagnosis. Eleven patients died from COVID-19 within 5 weeks. Compared with patients who recovered from COVID-19, patients who died showed a more pronounced decline in SaO2 prior to COVID-19 diagnosis.
CONCLUSIONS: In HD patients, hypoxemia may precede the onset of clinical symptoms and the diagnosis of COVID-19. A steep decline of SaO2 is associated with poor patient outcomes. Measurements of SaO2 may aid the pre-symptomatic identification of patients with COVID-19.
Gemma M Pamplona, Terry Sullivan, Peter Kotanko
No abstract available
David F Keane, Jochen G Raimann, Hanjie Zhang, Joanna Willetts, Stephan Thijssen, Peter Kotanko
Intradialytic hypotension (IDH) is a common complication of hemodialysis, but there are no data on its time of onset during treatment. Here we describe the incidence of IDH throughout hemodialysis and the associations of the time of hypotension with clinical parameters and survival, analyzing data from 21 dialysis clinics in the United States comprising 785,682 treatments from 4,348 patients. IDH was defined as a systolic blood pressure of 90 mmHg or below, and IDH incidence was calculated in 30-minute intervals throughout the hemodialysis session. Associations of time of IDH with clinical and treatment parameters were explored using logistic regression, and with survival using Cox regression. Sensitivity analyses considered further IDH definitions. IDH occurred in 12% of sessions, at a median time interval of 120-149 minutes. There was no notable change in IDH incidence across hemodialysis intervals (range: 2.6-3.2 episodes per 100 session-intervals). Relative blood volume and ultrafiltration volume did not notably associate with IDH in the first 90 minutes but did thereafter. Associations between central venous, but not arterial, oxygen saturation and IDH were present throughout hemodialysis. Patients prone to IDH early in a session, as compared to late, had worse survival. Sensitivity analyses suggested that the IDH definition affects the time of onset, but other analyses were comparable. Thus, our study highlights the incidence of IDH during the early part of hemodialysis, which, compared with later episodes, is associated with clinical parameters and mortality.
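The interval-level incidence described above amounts to binning intradialytic blood pressure readings into 30-minute windows and flagging each window in which systolic pressure falls to 90 mmHg or below. A minimal sketch with illustrative data (field layout and values are assumptions, not the study's database):

```python
def idh_intervals(readings, interval_min=30, sbp_threshold=90):
    """Bin (minute, systolic BP) readings into fixed-length intervals and
    flag each interval in which systolic BP is at or below the threshold.

    Returns a dict: interval index -> True if IDH occurred in that window.
    """
    flags = {}
    for minute, sbp in readings:
        idx = minute // interval_min
        flags[idx] = flags.get(idx, False) or sbp <= sbp_threshold
    return flags

# Illustrative 4-hour session, one reading per 30-minute window;
# SBP dips to 88 mmHg in the 120-149 min window (interval index 4).
session = [(0, 132), (30, 128), (60, 118), (90, 104),
           (120, 88), (150, 96), (180, 110), (210, 121)]
```

Session-level incidence per interval then follows by counting flagged windows across sessions.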
Ravi Thadhani, Joanna Willetts, Catherine Wang, John Larkin, Hanjie Zhang, Lemuel Rivera Fuentes, Len Usvyat, Kathleen Belmonte, Yuedong Wang, Robert Kossmann, Jeffrey Hymes, Peter Kotanko, Franklin Maddux
BACKGROUND: SARS-CoV-2 is primarily transmitted through aerosolized droplets; however, the virus can remain transiently viable on surfaces.
OBJECTIVE: We examined transmission within hemodialysis facilities, with a specific focus on the possibility of indirect patient-to-patient transmission through shared dialysis chairs.
DESIGN: We used real-world data from hemodialysis patients treated between February 1st and June 8th, 2020 to perform a case-control study matching each SARS-CoV-2 positive patient (case) to a non-SARS-CoV-2 patient (control) in the same dialysis shift, and traced back 14 days to capture possible exposure from chairs sat in by SARS-CoV-2 patients. Cases and controls were matched on age, sex, race, facility, shift date, and treatment count.
SETTING: 2,600 hemodialysis facilities in the United States.
PATIENTS: Adult (age ≥18 years) hemodialysis patients.
MEASUREMENTS: Conditional logistic regression models tested whether chair exposure after a positive patient conferred a higher risk of SARS-CoV-2 infection to the immediate subsequent patient.
RESULTS: Among 170,234 hemodialysis patients, 4,782 (2.8%) tested positive for SARS-CoV-2 (mean age 64 years, 44% female). Most facilities (68.5%) had 0 to 1 positive SARS-CoV-2 patient. We matched 2,379 SARS-CoV-2 positive cases to 2,379 non-SARS-CoV-2 controls; 1.30% (95% CI 0.90%, 1.87%) of cases and 1.39% (95% CI 0.97%, 1.97%) of controls were exposed to a chair previously sat in by a shedding SARS-CoV-2 patient. Transmission risk among cases was not significantly different from controls (OR = 0.94; 95% CI 0.57 to 1.54; p = 0.80). Results remained consistent in adjusted and sensitivity analyses.
LIMITATION: The analysis used real-world data that could contain errors and only considered transmission associated with shared use of dialysis chairs by symptomatic patients.
CONCLUSIONS: The risk of indirect patient-to-patient transmission of SARS-CoV-2 infection from dialysis chairs appears to be low.
PRIMARY FUNDING SOURCE: Fresenius Medical Care North America; National Institute of Diabetes and Digestive and Kidney Diseases (R01DK130067).
Rakesh Malhotra, Ujjala Kumar, Patricia Virgen, Bryan Magallon, Pranav S Garimella, Tushar Chopra, Peter Kotanko, T Alp Ikizler, Danuta Trzebinska, Lisa Cadmus-Bertram, Joachim H Ix
INTRODUCTION: The physical decline in patients with end-stage kidney disease (ESKD) is associated with morbidity and mortality. Prior studies have attempted to promote physical activity at the time of dialysis; however, physical activity patterns on nondialysis days are unknown. This study aimed to quantify physical activity on dialysis and nondialysis days in hemodialysis patients using a wearable actigraph.
METHODS: In this prospective study, subjects receiving hemodialysis were recruited from two outpatient dialysis units in urban San Diego and rural Imperial County, CA, between March 2018 and April 2019. Key inclusion criteria were: (1) receiving thrice-weekly hemodialysis for ≥3 months, (2) age ≥18 years, and (3) able to walk with or without assistive devices. All participants wore a Fitbit Charge 2 tracker for a minimum of 4 weeks. The primary outcome was the number of steps per day. Each participant completed the Physical Activity Questionnaire, the Patient Health Questionnaire (PHQ)-9, and the PROMIS Short Form Fatigue Questionnaire at baseline, and the Participant Technology Experience Questionnaire at day 7 after study enrollment.
FINDINGS: Of the 52 recruited, 45 participants (urban = 25; rural = 20) completed the study. The mean age was 61 ± 15 years, 42% were women, 64% were Hispanic, and the mean dialysis vintage was 4.4 ± 3.0 years. For those with valid Fitbit data (defined as ≥10 hours of wear per day) for 28 days (n = 45), participants walked an average of 3,688 steps per day, and 73% of participants were sedentary (<5,000 steps/day). Participants aged >80 years were less active than younger (age <65 years) participants (1,232 vs. 4,529 steps, P = 0.01). There were no statistical differences between groups when stratified by gender (women vs. men: 2,817 vs. 4,324 steps), urbanicity (rural vs. urban dialysis unit: 3,141 vs. 4,123 steps), or dialysis/nondialysis day (3,177 vs. 4,133 steps). Because of the small sample size, we also calculated effect sizes: the effect size was medium for the gender difference (Cohen's d = 0.57) and small to medium for urbanicity and dialysis/nondialysis day (d = 0.37 and d = 0.33, respectively). We found no association between physical activity and self-reported depression and fatigue scales. The majority of participants (62%, 28/45) found the Fitbit tracker easy to wear and comfortable.
DISCUSSION: ESKD participants receiving hemodialysis are frequently sedentary, and differences appear more pronounced in older patients. These findings may assist in designing patient-centered interventions to increase physical activity among hemodialysis patients.
Alhaji Cherif, Joanna L Willetts, Len Usvyat, Yuedong Wang, Peter Kotanko
No abstract available
Richard B Weller, Martin Feelisch, Peter Kotanko
No abstract available
Bernard Canaud, Xiaoling Ye, Len Usvyat, Jeroen Kooman, Frank van der Sande, Jochen Raimann, Yuedong Wang, Peter Kotanko
No abstract available
Maggie Han, Priscila Preciado, Ohnmar Thwin, Xia Tao, Leticia M Tapia-Silva, Lemuel Rivera Fuentes, Mohamad Hakim, Amrish Patel, Lela Tisdale, Hanjie Zhang, Peter Kotanko
BACKGROUND/OBJECTIVES: On March 22, 2020, a statewide stay-at-home order for nonessential tasks was implemented in New York State. We aimed to determine the impact of the lockdown on physical activity levels (PAL) in hemodialysis patients.
METHODS: Starting in May 2018, we have been conducting an observational study with a 1-year follow-up on PAL in patients from 4 hemodialysis clinics in New York City. Patients active in the study as of March 22, 2020, were included. PAL was defined by steps taken per day, measured by a wrist-based monitoring device (Fitbit Charge 2). Average steps/day were calculated for January 1 to February 13, 2020, and then weekly from February 14 to June 30.
RESULTS: 42 patients were included. Their mean age was 55 years, 79% were male, and 69% were African American. Between January 1 and February 13, 2020, patients took on average 5,963 (95% CI 4,909-7,017) steps/day. In the week prior to the mandated lockdown, when a national emergency was declared, and in the week of the shutdown, the average number of daily steps had decreased by 868 steps/day (95% CI 213-1,722) and 1,222 steps/day (95% CI 668-2,300), respectively. Six patients were diagnosed with COVID-19 during the study period; five of them exhibited significantly higher PAL in the 2 weeks prior to showing COVID-19 symptoms compared with COVID-19-negative patients.
CONCLUSION: Lockdown measures were associated with a significant decrease in PAL in hemodialysis patients. Patients who contracted COVID-19 had higher PAL during the incubation period. Methods to increase PAL while allowing for social distancing should be explored and implemented.
Warren Krackov, Murat Sor, Rishi Razdan, Hanjie Zheng, Peter Kotanko
BACKGROUND: Innovations in artificial intelligence (AI) have proven to be effective contributors to high-quality health care. We examined the beneficial role AI can play in noninvasively grading vascular access aneurysms to reduce high-morbidity events, such as rupture, in ESRD patients on hemodialysis.
METHODS: Our AI instrument noninvasively examines and grades aneurysms in both arteriovenous fistulas and arteriovenous grafts. Aneurysm stages were adjudicated by 3 vascular specialists, based on a grading system that focuses on actions that need to be taken. Our automatic classification of aneurysms builds on 2 components: (a) the use of smartphone technology to capture aneurysm appearance and (b) the analysis of these images using a cloud-based convolutional neural network (CNN).
RESULTS: There was a high degree of correlation between our noninvasive AI instrument and the adjudication by the vascular experts. Our results indicate that the CNN can automatically classify aneurysms; we achieved >90% classification accuracy on the validation images.
CONCLUSION: This is the first quality improvement project to show that an AI instrument can reliably grade vascular access aneurysms noninvasively, allowing rapid assessments of patients who would otherwise be at risk for highly morbid events. Moreover, these AI-assisted assessments can be made without having to schedule separate appointments, and potentially even via telehealth.
Nadja Grobe, Alhaji Cherif, Xiaoling Wang, Zijun Dong, Peter Kotanko
BACKGROUND: Pool-testing strategies combine samples from multiple people and test them as a group. A pool-testing approach may shorten the screening time and increase the test rate during times of limited test availability and inadequate reporting speed. Pool testing has been effectively used for a wide variety of infectious disease screening settings; historically, it originated from serological testing in syphilis. During the current coronavirus disease 2019 (COVID-19) pandemic, pool testing is considered across the globe to inform opening strategies and to monitor infection rates after the implementation of interventions.
AIMS: This narrative review aims to provide a comprehensive overview of the global efforts to implement pool testing, specifically for COVID-19 screening.
SOURCES: Data were retrieved from a detailed search for peer-reviewed articles and preprint reports using Medline/PubMed, medRxiv, Web of Science, and Google up to 21st March 2021, using the search terms "pool testing", "viral", "serum", "SARS-CoV-2" and "COVID-19".
CONTENT: This review summarizes the history and theory of pool testing. We identified numerous peer-reviewed articles that describe specific details and practical implementations of pool testing. Successful examples as well as limitations of pool testing, in general and specifically related to the detection of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) RNA and antibodies, are reviewed. While promising, significant operational, pre-analytical, logistical, and economic challenges need to be overcome to advance pool testing.
IMPLICATIONS: The theory of pool testing is well understood, and numerous successful examples from the past are available. Operationalization of pool testing requires sophisticated processes that can be adapted to the local medical circumstances. Special attention needs to be paid to sample collection, sample pooling, and strategies to avoid re-sampling.
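The efficiency argument behind pool testing can be made concrete with the classic two-stage (Dorfman) scheme that the serological syphilis screening mentioned above introduced: pools of k samples are tested once, and only members of positive pools are retested individually. A minimal sketch of the expected test count (pool size and prevalence are illustrative):

```python
def expected_tests_per_sample(p: float, k: int) -> float:
    """Expected number of tests per sample under two-stage (Dorfman) pooling.

    Stage 1 costs 1/k tests per sample; a pool of k samples triggers k
    individual retests unless all members are negative, which happens
    with probability (1 - p)**k, assuming independent infection status.
    """
    return 1.0 / k + 1.0 - (1.0 - p) ** k

# At 1% prevalence, pools of 10 need about 0.2 tests per sample
# versus 1.0 for individual testing -- roughly a five-fold saving.
print(expected_tests_per_sample(0.01, 10))
```

The saving evaporates at high prevalence: as p grows, almost every pool tests positive and the scheme approaches 1 + 1/k tests per sample, worse than individual testing.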
Ohnmar Thwin, Nadja Grobe, Leticia M Tapia Silva, Xiaoling Ye, Hanjie Zhang, Yuedong Wang, Peter Kotanko
No abstract available
Paul A Rootjes, Erik Lars Penne, Georges Ouellet, Yanna Dou, Stephan Thijssen, Peter Kotanko, Jochen G Raimann
INTRODUCTION: Excess sodium intake and consequent volume overload are major clinical problems in hemodialysis (HD) contributing to adverse outcomes. Saline used for priming and rinsing of the extracorporeal circuit is a potentially underappreciated source of intradialytic sodium gain. We aimed to examine the feasibility and clinical effects of replacing saline as the priming and rinsing fluid with a 5% dextrose solution.
MATERIALS AND METHODS: We enrolled non-diabetic and anuric stable HD patients. First, the extracorporeal circuit was primed and rinsed with approximately 200-250 mL of isotonic saline for 4 weeks (Phase 1); subsequently, a similar volume of a 5% dextrose solution replaced the saline for another 4 weeks (Phase 2), followed by another 4 weeks of saline (Phase 3). We collected data on interdialytic weight gain (IDWG), pre- and post-dialysis blood pressure, intradialytic symptoms, and thirst.
RESULTS: Seventeen chronic HD patients (11 males, age 54.1 ± 18.7 years) completed the study. The average priming and rinsing volumes were 236.7 ± 77.5 and 245.0 ± 91.8 mL, respectively. The mean IDWG did not change significantly (2.52 ± 0.88 kg in Phase 1; 2.28 ± 0.70 kg in Phase 2; and 2.51 ± 1.2 kg in Phase 3). No differences in blood pressure, intradialytic symptoms or thirst were observed.
CONCLUSIONS: Replacing saline with 5% dextrose for priming and rinsing is feasible in stable HD patients and may reduce intradialytic sodium loading. A non-significant trend toward a lower IDWG was observed when 5% dextrose was used. Prospective studies with a larger sample size and longer follow-up are needed to gain further insight into the possible effects of alternate priming and rinsing solutions on intradialytic sodium loading.
TRIAL REGISTRATION: Identifier NCT01168947 (ClinicalTrials.gov).
Pablo Maggiani-Aguilera, Jonathan S Chávez-Iñiguez, Joana G Navarro-Gallardo, Guillermo Navarro-Blackaller, Alondra M Flores-Llamas, Tania Pelayo-Retano, Erendira A Arellano-Delgado, Violeta E González-Montes, Ekatherina Yanowsky-Ortega, Jochen G Raimann, Guillermo Garcia-Garcia
AIM: Tunnelled haemodialysis (HD) catheters can be used immediately, but several anatomical variables could affect their survival. This study aimed to examine the impact of different novel anatomic variables on catheter replacement.
METHODS: A prospective single-centre cohort study in chronic kidney disease G5 patients was conducted. The primary outcome was to determine the factors associated with catheter replacement during the first 6 months of follow-up. All procedures were performed without fluoroscopy. Three anatomic regions for catheter tip position were established: superior vena cava (SVC), cavo-atrial junction (CAJ), and mid-to-deep atrium (MDA). Many other anatomical variables were measured. Catheter-related bloodstream infection was also included.
RESULTS: Between January 2019 and January 2020, a total of 75 patients with tunnelled catheter insertion were analysed. Catheter replacement at 6 months occurred in 10 (13.3%) patients. By multivariate analysis, incorrect catheter tip position (SVC) (OR 1.23, 95% CI 1.07-1.42, p < .004), the presence of extrasystoles during the procedure (OR 0.88, 95% CI 0.78-0.98, p = .03), incorrect catheter tug (OR 1.31, 95% CI 1.10-1.55, p = .003), incorrect catheter top position (kinking; OR 1.40, 95% CI 1.04-1.88, p = .02), and catheter-related bloodstream infection (OR 2.60, 95% CI 2.09-3.25, p < .001) were the only variables associated with catheter replacement at 6-month follow-up.
CONCLUSION: The risk of catheter replacement at 6-month follow-up could be attenuated by avoiding incorrect catheter tug and top position, and by placing the vascular catheter tip in the CAJ or MDA.
Alhaji Cherif, Vaibhav Maheshwari, Doris Fuertinger, Gudrun Schappacher-Tilp, Priscila Preciado, David Bushinsky, Stephan Thijssen, Peter Kotanko
Precise maintenance of acid-base homeostasis is fundamental for optimal functioning of physiological and cellular processes. The presence of an acid-base disturbance can affect clinical outcomes and is usually caused by an underlying disease. It is, therefore, important to assess the acid-base status of patients and the extent to which various therapeutic treatments are effective in controlling these acid-base alterations. In this paper, we develop a dynamic model of the physiological regulation of the HCO3-/CO2 buffering system, an abundant and powerful buffering system, using Henderson-Hasselbalch kinetics. We simulate the normal physiological state and the four cardinal acid-base disorders: metabolic acidosis and alkalosis, and respiratory acidosis and alkalosis. We show that the model accurately predicts serum pH over a range of clinical conditions. In addition to qualitative validation, we compare the in silico results with clinical data on acid-base homeostasis and alterations, finding clear relationships between primary acid-base disturbances and the secondary adaptive compensatory responses. We also show that the predicted compensatory responses accurately resemble those observed clinically. Furthermore, via sensitivity analysis, key parameters were identified that could be the most effective in regulating systemic pH in healthy individuals and in those with chronic kidney disease and distal or proximal renal tubular acidosis. The model presented here may provide pathophysiologic insights and can serve as a tool to assess the safety and efficacy of different therapeutic interventions to control or correct acid-base disorders.
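The paper's full dynamic model is not reproduced in the abstract, but the Henderson-Hasselbalch relationship at its core is easy to evaluate. A minimal sketch (the function name and the normal-range example values are illustrative, not taken from the paper):

```python
import math

def serum_ph(hco3_mmol_l, pco2_mmhg, pka=6.1, co2_solubility=0.03):
    """Henderson-Hasselbalch: pH = pKa + log10([HCO3-] / (s * pCO2))."""
    return pka + math.log10(hco3_mmol_l / (co2_solubility * pco2_mmhg))

# Normal physiology: HCO3- ~24 mmol/L, pCO2 ~40 mmHg
print(round(serum_ph(24, 40), 2))  # 7.4
# Uncompensated metabolic acidosis: HCO3- falls, pCO2 unchanged
print(round(serum_ph(12, 40), 2))  # 7.1
```

A dynamic model such as the one described would additionally evolve HCO3- and pCO2 over time under renal and respiratory regulation; this static relationship is only the buffering step.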
Xiaoling Wang, Nadja Grobe, Amrish Patel, Shuchita Sharma, Jaime Uribarri, Peter Kotanko
No abstract available
Xiaoling Wang, Nadja Grobe, Zahin Haq, Ohnmar Thwin, Lemuel Rivera Fuentes, Dugan Maddux, Peter Kotanko
No abstract available
Sheetal Chaudhuri, Hao Han, Len Usvyat, Yue Jiao, David Sweet, Allison Vinson, Stephanie Johnstone Steinberg, Dugan Maddux, Kathleen Belmonte, Jane Brzozowski, Brad Bucci, Peter Kotanko, Yuedong Wang, Jeroen P Kooman, Franklin W Maddux, John Larkin
BACKGROUND: An integrated kidney disease company uses machine learning (ML) models that predict the 12-month risk of an outpatient hemodialysis (HD) patient having multiple hospitalizations to assist with directing personalized interdisciplinary interventions in a Dialysis Hospitalization Reduction Program (DHRP). We investigated the impact of risk-directed interventions in the DHRP on clinic-wide hospitalization rates.
METHODS: We compared the hospital admission and day rates per patient-year (ppy) from all hemodialysis patients in 54 DHRP and 54 control clinics identified by propensity score matching at baseline in 2015 and at the end of the pilot in 2018. We also used a paired t-test to compare the between-group difference in annual hospital admission and hospital day rates at baseline and at the end of the pilot.
RESULTS: The between-group difference in annual hospital admission and day rates was similar at baseline (2015), with a mean difference between DHRP and control clinics of -0.008 ± 0.09 ppy and -0.05 ± 0.96 ppy, respectively. The between-group difference in hospital admission and day rates became more distinct at the end of follow-up (2018), favoring DHRP clinics, with mean differences of -0.155 ± 0.38 ppy and -0.97 ± 2.78 ppy, respectively. A paired t-test showed the change in the between-group difference in hospital admission and day rates from baseline to the end of follow-up was statistically significant (t-value = 2.73, p-value < 0.01 and t-value = 2.29, p-value = 0.02, respectively).
CONCLUSIONS: These findings suggest that ML model-based, risk-directed interdisciplinary team interventions associate with lower hospital admission and day rates in HD patients compared with controls.
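The paired t-test used above compares matched clinic pairs on the same outcome. A self-contained sketch of the statistic (the three data pairs below are made up for illustration, not study data):

```python
import math

def paired_t_statistic(x, y):
    """t statistic for paired observations (x[i], y[i]), e.g. the same
    matched clinic pair measured at baseline and at end of follow-up."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((di - mean_d) ** 2 for di in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# Hypothetical between-group admission-rate differences (ppy), 3 pairs
baseline = [1.0, 2.0, 3.0]
followup = [0.5, 1.8, 2.1]
print(round(paired_t_statistic(baseline, followup), 2))  # 2.63
```

In practice the p-value is obtained from the t distribution with n - 1 degrees of freedom (e.g. via `scipy.stats.ttest_rel`).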
Jonathan S Chávez-Íñiguez, Pablo Maggiani-Aguilera, Christian Pérez-Flores, Rolando Claure-Del Granado, Andrés E De la Torre-Quiroga, Alejandro Martínez-Gallardo González, Guillermo Navarro-Blackaller, Ramón Medina-González, Jochen G Raimann, Francisco G Yanowsky-Escatell, Guillermo García-García
RESULTSFrom 2017 to 2020, we analyzed 288 AKI patients. The mean age was 55.3 years, 60.7% were male, AKI KDIGO stage 3 was present in 50.5% of them, sepsis was the main etiology 50.3%, and 72 (25%) patients started KRT. The overall survival was 84.4%. Fluid adjustment was the only intervention associated with a decreased risk for starting KRT (odds ratio [OR]: 0.58, 95% confidence interval [CI]: 0.48-0.70, and p ≤ 0.001) and AKI progression to stage 3 (OR: 0.59, 95% CI: 0.49-0.71, and p ≤ 0.001). Receiving vasopressors and KRT were associated with mortality. None of the interventions studied was associated with reducing the risk of death.CONCLUSIONSIn this prospective cohort study of AKI patients, we found for the first time that early nephrologist intervention and fluid prescription adjustment were associated with lower risk of starting KRT and progression to AKI stage 3.BACKGROUNDBased on the pathophysiology of acute kidney injury (AKI), it is plausible that certain early interventions by the nephrologist could influence its trajectory. In this study, we investigated the impact of 5 early nephrology interventions on starting kidney replacement therapy (KRT), AKI progression, and death.METHODSIn a prospective cohort at the Hospital Civil of Guadalajara, we followed up for 10 days AKI patients in whom a nephrology consultation was requested. We analyzed 5 early interventions of the nephrology team (fluid adjustment, nephrotoxic withdrawal, antibiotic dose adjustment, nutritional adjustment, and removal of hyperchloremic solutions) after the propensity score and multivariate analysis for the risk of starting KRT (primary objective), AKI progression to stage 3, and death (secondary objectives).
Sheetal Chaudhuri, Hao Han, Caitlin Monaghan, John Larkin, Peter Waguespack, Brian Shulman, Zuwen Kuang, Srikanth Bellamkonda, Jane Brzozowski, Jeffrey Hymes, Mike Black, Peter Kotanko, Jeroen P Kooman, Franklin W Maddux, Len Usvyat
BACKGROUND: Inadequate refilling from extravascular compartments during hemodialysis can lead to intradialytic symptoms, such as hypotension, nausea, vomiting, and cramping/myalgia. Relative blood volume (RBV) plays an important role in adapting the ultrafiltration rate, which in turn has a positive effect on intradialytic symptoms. It has been clinically challenging to identify changes in RBV in real time to proactively intervene and reduce potential negative consequences of volume depletion. Leveraging advanced technologies to process large volumes of dialysis and machine data in real time and developing prediction models using machine learning (ML) is critical in identifying these signals.
METHODS: We conducted a proof-of-concept analysis to retrospectively assess near real-time dialysis treatment data from in-center patients in six clinics using an Optical Sensing Device (OSD) from December 2018 to August 2019. The goal of this analysis was to use real-time OSD data to predict whether a patient's relative blood volume (RBV) would decrease at a rate of at least -6.5% per hour within the next 15 minutes during a dialysis treatment, based on 10-second windows of data from the previous 15 minutes. A dashboard application was constructed to demonstrate how reporting structures may be developed to alert clinicians in real time of at-risk cases. Data were derived from three sources: (1) OSDs, (2) hemodialysis machines, and (3) patient electronic health records.
RESULTS: Treatment data from 616 in-center dialysis patients in the six clinics were curated into a big data store and fed into a machine learning (ML) model developed and deployed within the cloud. The threshold for classifying observations as positive or negative was set at 0.08. At this threshold, precision was 0.33 and recall was 0.94. The area under the receiver operating characteristic curve (AUROC) for the ML model was 0.89 using test data.
CONCLUSIONS: The findings from our proof-of-concept analysis demonstrate the design of a cloud-based framework that can be used for making real-time predictions of events during dialysis treatments. Making real-time predictions has the potential to assist clinicians at the point of care during hemodialysis.
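The abstract reports precision 0.33 and recall 0.94 at a classification threshold of 0.08. The metrics themselves are simple to compute from model scores and labels; a minimal sketch (the scores and labels below are invented for illustration):

```python
def precision_recall(scores, labels, threshold):
    """Classify score >= threshold as positive; return (precision, recall)."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Hypothetical model scores for five 10-second windows
scores = [0.9, 0.1, 0.05, 0.2, 0.02]
labels = [1, 1, 0, 0, 1]  # 1 = rapid RBV decline actually occurred
p, r = precision_recall(scores, labels, threshold=0.08)
```

Lowering the threshold, as done in the study, trades precision for recall — appropriate when missing an at-risk patient is costlier than a false alert.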
Sheetal Chaudhuri, Rachel Lasky, Yue Jiao, John Larkin, Caitlin Monaghan, Anke Winter, Luca Neri, Peter Kotanko, Jeffrey Hymes, Sangho Lee, Yuedong Wang, Jeroen P Kooman, Franklin Maddux, Len Usvyat
INTRODUCTION: The clinical impact of COVID-19 has not been established in the dialysis population. We evaluated the trajectories of clinical and laboratory parameters in hemodialysis (HD) patients.
METHODS: We used data from adult HD patients treated at an integrated kidney disease company who received a reverse transcription polymerase chain reaction (RT-PCR) test to investigate suspicion of a severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection between May 1 and September 1, 2020. Nonparametric smoothing splines were used to fit data for individual trajectories and to estimate the mean change over time in patients testing positive or negative for SARS-CoV-2 and in those who survived or died within 30 days of first suspicion or positive test date. For each clinical parameter of interest, the difference in average daily changes between the COVID-19 positive and negative groups, and between the COVID-19 survivor and nonsurvivor groups, was estimated by fitting a linear mixed effects model based on measurements in the 14 days before Day 0 (i.e., Day -14 to Day 0).
RESULTS: There were 12,836 HD patients with a suspicion of COVID-19 who received RT-PCR testing (8,895 SARS-CoV-2 positive). We observed significantly different trends (p < 0.05) in pre-HD systolic blood pressure (SBP), pre-HD pulse rate, body temperature, ferritin, neutrophils, lymphocytes, albumin, and interdialytic weight gain (IDWG) between COVID-19 positive and negative patients. In the COVID-19 positive group, we observed significantly different clinical trends (p < 0.05) in pre-HD pulse rate, lymphocytes, neutrophils, and albumin between survivors and nonsurvivors. We also observed that, among survivors, most clinical parameters returned to pre-COVID-19 levels within 60-90 days.
CONCLUSION: We observed unique temporal trends in various clinical and laboratory parameters among HD patients who tested positive versus negative for SARS-CoV-2 infection and among those who survived the infection versus those who died. These trends can help to define the physiological disturbances that characterize the onset and course of COVID-19 in HD patients.
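The study estimated average daily changes with smoothing splines and linear mixed effects models. As a bare-bones stand-in, the average daily change of one parameter for one patient over the pre-diagnosis window can be estimated with an ordinary least-squares slope (the readings below are hypothetical):

```python
def daily_change_slope(days, values):
    """Ordinary least-squares slope: average change per day in a clinical
    parameter over a window of daily measurements."""
    n = len(days)
    mean_x = sum(days) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, values))
    den = sum((x - mean_x) ** 2 for x in days)
    return num / den

# Hypothetical pre-HD SBP readings over Day -3 .. Day 0
print(daily_change_slope([-3, -2, -1, 0], [100, 98, 96, 94]))  # -2.0
```

A mixed effects model generalizes this by pooling such per-patient slopes across the cohort while accounting for repeated measurements within each patient.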
Lin-Chun Wang, Leticia M Tapia, Xia Tao, Joshua E Chao, Ohnmar Thwin, Hanjie Zhang, Stephan Thijssen, Peter Kotanko, Nadja Grobe
INTRODUCTION: Constipation is prevalent in patients with kidney failure, partly due to the use of medication such as phosphate binders. We hypothesized that serum levels of gut microbiome-derived uremic toxins (UTOX) may be affected by the choice of phosphate binder, putatively through its impact on colonic transit time. We investigated two commonly prescribed phosphate binders, sevelamer carbonate (SEV) and sucroferric oxyhydroxide (SFO), and their association with gut microbiome-derived UTOX levels in hemodialysis (HD) patients.
METHODS: Weekly blood samples were collected from 16 anuric HD participants during the 5-week observational period. All participants were on active phosphate binder monotherapy with either SFO or SEV for at least 4 weeks prior to enrollment. Eight UTOX (7 gut microbiome-derived) and tryptophan were quantified using liquid chromatography-mass spectrometry. Serum phosphorus, nutritional, and liver function markers were also measured. For each substance, weekly individual levels, the median concentration per participant, and differences between the SFO and SEV groups were reported. Patient-reported bowel movements, assessed by the Bristol Stool Scale (BSS), and pill usage were recorded weekly.
RESULTS: The SEV group reported a 3.3-fold higher frequency of BSS stool types 1 and 2 (more likely constipated, p < 0.05), whereas the SFO group reported a 1.5-fold higher frequency of BSS stool types 5-7 (more likely loose stool and diarrhea, not significant). Participants in the SFO group showed a trend toward better adherence to phosphate binder therapy (SFO: 87.6% vs. SEV: 66.6%, not significant). UTOX, serum phosphorus, nutritional and liver function markers, and tryptophan did not differ between the two groups.
CONCLUSION: There was no difference in gut microbiome-derived UTOX levels between phosphate binders (SFO vs. SEV), despite SFO therapy resulting in fewer constipated participants. This pilot study may inform the design of future clinical trials and highlights the importance of including factors beyond bowel habits and their association with UTOX levels.
Vaibhav Maheshwari, Xia Tao, Stephan Thijssen, Peter Kotanko
Removal of protein-bound uremic toxins (PBUTs) during conventional dialysis is insufficient. PBUTs are associated with comorbidities and mortality in dialysis patients. Albumin is the primary carrier for PBUTs and only a small free fraction of PBUTs are dialyzable. In the past, we proposed a novel method where a binding competitor is infused upstream of a dialyzer into an extracorporeal circuit. The competitor competes with PBUTs for their binding sites on albumin and increases the free PBUT fraction. Essentially, binding competitor-augmented hemodialysis is a reactive membrane separation technique and is a paradigm shift from conventional dialysis therapies. The proposed method has been tested in silico, ex vivo, and in vivo, and has proven to be very effective in all scenarios. In an ex vivo study and a proof-of-concept clinical study with 18 patients, ibuprofen was used as a binding competitor; however, chronic ibuprofen infusion may affect residual kidney function. Binding competition with free fatty acids significantly improved PBUT removal in pre-clinical rat models. Based on in silico analysis, tryptophan can also be used as a binding competitor; importantly, fatty acids or tryptophan may have salutary effects in HD patients. More chemoinformatics research, pre-clinical, and clinical studies are required to identify ideal binding competitors before routine clinical use.
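The binding-competition principle described above can be sketched with a simple single-site equilibrium: the competitor reduces the albumin sites available to the toxin, which raises the toxin's free (dialyzable) fraction. This is a hypothetical illustration, not the model from the cited in silico work; the competitor term and all parameter values are assumptions:

```python
import math

def free_fraction(l_total, alb_total, kd_ligand, competitor=0.0, kd_comp=1.0):
    """Free fraction of a protein-bound toxin under single-site binding.
    A competitor at concentration `competitor` is assumed to reduce the
    available albumin sites by the factor 1 / (1 + competitor / kd_comp).
    """
    a_eff = alb_total / (1.0 + competitor / kd_comp)
    # Mass balance L_total = L_free + A_eff * L_free / (Kd + L_free)
    # gives a quadratic in L_free:
    #   L_free^2 + (Kd + A_eff - L_total) * L_free - Kd * L_total = 0
    b = kd_ligand + a_eff - l_total
    l_free = (-b + math.sqrt(b * b + 4.0 * kd_ligand * l_total)) / 2.0
    return l_free / l_total

# Hypothetical units (uM): toxin 0.1, albumin 600, toxin Kd 10
without = free_fraction(0.1, 600.0, 10.0)
with_comp = free_fraction(0.1, 600.0, 10.0, competitor=1000.0, kd_comp=100.0)
```

In this toy model the competitor raises the free fraction severalfold, which is the mechanism the proposed therapy exploits upstream of the dialyzer.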
Markus Pirklbauer, David A Bushinsky, Peter Kotanko, Gudrun Schappacher-Tilp
Background: Personalized management of secondary hyperparathyroidism is a critical part of hemodialysis patient care. We used a mathematical model of parathyroid gland (PTG) biology to predict (1) short-term peridialytic intact PTH (iPTH) changes in response to diffusive calcium (Ca) fluxes and (2) to predict long-term iPTH levels. Methods: We dialyzed 26 maintenance hemodialysis patients on a single occasion with a dialysate Ca concentration of 1.75 mmol/l to attain a positive dialysate-to-blood ionized Ca (iCa) gradient and thus diffusive Ca loading. Intradialytic iCa kinetics, peridialytic iPTH change, and dialysate-sided iCa mass balance (iCaMB) were assessed. Patient-specific PTG model parameters were estimated using clinical, medication, and laboratory data. We then used the personalized PTG model to predict peridialytic and long-term (6-months) iPTH levels. Results: At dialysis start, the median dialysate-to-blood iCa gradient was 0.3 mmol/l (IQR 0.11). The intradialytic iCa gain was 488 mg (IQR 268). Median iPTH decrease was 75% (IQR 15) from pre-dialysis 277 to post-dialysis 51 pg/ml. Neither iCa gradient nor iCaMB were significantly associated with peridialytic iPTH changes. The personalized PTG model accurately predicted both short-term, treatment-level peridialytic iPTH changes (r = 0.984, p < 0.001, n = 26) and patient-level 6-months iPTH levels (r = 0.848, p < 0.001, n = 13). Conclusions: This is the first report showing that both short-term and long-term iPTH dynamics can be predicted using a personalized mathematical model of PTG biology. Prospective studies are warranted to explore further model applications, such as patient-level prediction of iPTH response to PTH-lowering treatment.
Ravi Thadhani, Joanna Willetts, Catherine Wang, John Larkin, Hanjie Zhang, Lemuel Rivera Fuentes, Len Usvyat, Kathleen Belmonte, Yuedong Wang, Robert Kossmann, Jeffrey Hymes, Peter Kotanko, Franklin Maddux
BACKGROUND: SARS-CoV-2 can remain transiently viable on surfaces. We examined if use of shared chairs in outpatient hemodialysis associates with a risk for indirect patient-to-patient transmission of SARS-CoV-2.
METHODS: We used data from adults treated at 2,600 hemodialysis facilities in the United States between February 1st and June 8th, 2020. We performed a retrospective case-control study matching each SARS-CoV-2 positive patient (case) to a non-SARS-CoV-2 patient (control) treated in the same dialysis shift. Cases and controls were matched on age, sex, race, facility, shift date, and treatment count. For each case-control pair, we traced backward 14 days to assess possible prior exposure from a 'shedding' SARS-CoV-2 positive patient who sat in the same chair immediately before the case or control. Conditional logistic regression models tested whether chair exposure after a shedding SARS-CoV-2 positive patient conferred a higher risk of SARS-CoV-2 infection to the immediate subsequent patient.
RESULTS: Among 170,234 hemodialysis patients, 4,782 (2.8%) tested positive for SARS-CoV-2 (mean age 64 years, 44% female). Most facilities (68.5%) had 0 to 1 positive SARS-CoV-2 patient. We matched 2,379 SARS-CoV-2 positive cases to 2,379 non-SARS-CoV-2 controls; 1.30% (95% CI 0.90%, 1.87%) of cases and 1.39% (95% CI 0.97%, 1.97%) of controls were exposed to a chair previously sat in by a shedding SARS-CoV-2 patient. Transmission risk among cases was not significantly different from controls (OR = 0.94; 95% CI 0.57 to 1.54; p = 0.80). Results remained consistent in adjusted and sensitivity analyses.
CONCLUSIONS: The risk of indirect patient-to-patient transmission of SARS-CoV-2 infection from dialysis chairs appears to be low.
Richard V Remigio, Rodman Turpin, Jochen G Raimann, Peter Kotanko, Frank W Maddux, Amy Rebecca Sapkota, Xin-Zhong Liang, Robin Puett, Xin He, Amir Sapkota
BACKGROUND: Typical thermoregulatory responses to elevated temperatures among healthy individuals include reduced blood pressure and perspiration. Individuals with end-stage kidney disease (ESKD) are susceptible to systemic fluctuations caused by ambient temperature changes that may increase morbidity and mortality. We investigated whether pre-dialysis systolic blood pressure (preSBP) and interdialytic weight gain (IDWG) can independently mediate the association between ambient temperature, all-cause hospital admissions (ACHA), and all-cause mortality (ACM).
METHODS: The study population consisted of ESKD patients receiving hemodialysis treatments at Fresenius Medical Care facilities in Philadelphia County, PA, from 2011 to 2019 (n = 1,981). Within a time-to-event framework, we estimated the association between daily maximum dry-bulb temperature (TMAX) and, in separate models, ACHA and ACM during warmer calendar months. Clinically measured preSBP and IDWG responses to temperature increases were estimated using linear mixed effects models. We employed the difference (c-c') method to decompose total effect models for ACHA and ACM using preSBP and IDWG as time-dependent mediators. Covariate adjustments for exposure-mediator and total and direct effect models included age, race, ethnicity, blood pressure medication use, treatment location, preSBP, and IDWG. We considered lags of up to two days for exposure and a 1-day lag for mediator variables (Lag 2-Lag 1) to ensure temporality between exposure and outcome. Sensitivity analyses for 2-day (Lag 2-only) and 1-day (Lag 1-only) lag structures were also conducted.
RESULTS: Based on Lag 2-Lag 1 temporal ordering, a 1 °C increase in daily TMAX was associated with an increased hazard of ACHA of 1.4% (adjusted hazard ratio [HR], 1.014; 95% confidence interval, 1.007-1.021) and of ACM of 7.5% (adjusted HR, 1.075, 1.050-1.100). Short-term lag exposures to a 1 °C increase in temperature predicted mean reductions in IDWG and preSBP of 0.013-0.015% and 0.168-0.229 mmHg, respectively. Mediation analysis for ACHA identified significant indirect effects for all three studied pathways (preSBP, IDWG, and preSBP + IDWG), and significant indirect effects for the IDWG and conjoined preSBP + IDWG pathways for ACM. Of note, only 1.03% of the association between temperature and ACM was mediated through preSBP. The mechanistic path for IDWG, independent of preSBP, demonstrated inconsistent mediation and, consequently, potential suppression effects in ACHA (-15.5%) and ACM (-6.3%) based on combined pathway models. Proportion-mediated estimates from preSBP + IDWG pathways reached 2.2% and 0.3% in combined pathway analysis for ACHA and ACM outcomes, respectively. Lag 2 discrete-time ACM mediation models exhibited consistent mediation for all three pathways, suggesting that 2-day lagged IDWG and preSBP responses can explain 2.11% and 4.41%, respectively, of the total effect association between temperature and mortality.
CONCLUSION: We corroborated the previously reported association between ambient temperature, ACHA, and ACM. Our results foster the understanding of potential physiological linkages that may explain or suppress temperature-driven hospital admission and mortality risks. Of note, concomitant changes in preSBP and IDWG may have little intermediary effect when analyzed in combined pathway models. These findings advance our assessment of candidate interventions to reduce the impact of outdoor temperature change on ESKD patients.
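The difference (c-c') decomposition used above compares the total effect of the exposure (c) with the direct effect after adjusting for the mediator (c'); the proportion mediated is their difference relative to the total effect, typically on the log scale for hazard ratios. A minimal sketch (the two hazard ratios below are hypothetical, not the study's estimates):

```python
import math

def proportion_mediated(total_hr, direct_hr):
    """Difference (c - c') method on the log-hazard scale:
    c = log(total-effect HR), c' = log(direct-effect HR)."""
    c = math.log(total_hr)
    c_prime = math.log(direct_hr)
    return (c - c_prime) / c

# Hypothetical: total HR 1.075 per degree C, direct HR 1.074 after
# adjusting for the mediator -> only ~1.3% of the effect is mediated
pm = proportion_mediated(1.075, 1.074)
```

A negative value of `pm` corresponds to the "inconsistent mediation" (suppression) the authors describe, where the indirect path works against the total effect.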
Gabriela Ferreira Dias, Sara Soares Tozoni, Gabriela Bohnen, Nadja Grobe, Silvia D Rodrigues, Tassiana Meireles, Lia S Nakao, Roberto Pecoits-Filho, Peter Kotanko, Andréa Novais Moreno-Amaral
BACKGROUND/AIMS: Chronic kidney disease is frequently accompanied by anemia, hypoxemia, and hypoxia. It has become clear that impaired erythropoietin production and altered iron homeostasis are not the sole causes of renal anemia. Eryptosis is a process of red blood cell (RBC) death, akin to apoptosis of nucleated cells, characterized by Ca2+ influx and phosphatidylserine (PS) exposure on the outer RBC membrane leaflet. Eryptosis can be induced by uremic toxins and occurs before senescence, thus shortening RBC lifespan and aggravating renal anemia. We aimed to assess eryptosis and intracellular oxygen levels of RBC from hemodialysis patients (HD-RBC) and their response to hypoxia, uremia, and uremic toxin uptake inhibition.
METHODS: Using flow cytometry, RBC from healthy individuals (CON-RBC) and HD-RBC were subjected to PS (Annexin-V), intracellular Ca2+ (Fluo-3/AM), and intracellular oxygen (Hypoxia Green) measurements, at baseline and after incubation with uremic serum and/or hypoxia (5% O2), with or without ketoprofen. Baseline levels of uremic toxins were quantified in serum and cytosol by high-performance liquid chromatography.
RESULTS: Here, we show that HD-RBC have less intracellular oxygen and that it is further decreased post-HD. Incubation in 5% O2 and uremia also triggered eryptosis in vitro through PS exposure. Hypoxia itself increased PS exposure in HD-RBC and CON-RBC, and the addition of uremic serum aggravated it. Furthermore, inhibition of organic anion transporter 2 with ketoprofen reverted eryptosis and restored intracellular oxygen levels. Cytosolic levels of the uremic toxins pCS and IAA were decreased after dialysis.
CONCLUSION: These findings suggest the participation of uremic toxins and hypoxia in the processes of eryptosis and intracellular oxygenation.
Jochen G Raimann, Christopher T Chan, John T Daugirdas, Thomas Depner, Tom Greene, George A Kaysen, Alan S Kliger, Peter Kotanko, Brett Larive, Gerald Beck, Robert McGregor Lindsay, Michael V Rocco, Glenn M Chertow, Nathan W Levin
INTRODUCTION: The Frequent Hemodialysis Network (FHN) Daily and Nocturnal trials aimed to compare the effects of hemodialysis (HD) given 6 versus 3 times per week. More frequent in-center HD significantly reduced left-ventricular mass (LVM), with more pronounced effects in patients with low urine volumes. In this study, we aimed to explore another potential effect modifier: predialysis serum sodium (SNa) and related proxies of plasma tonicity.
METHODS: Using data from the FHN Daily and Nocturnal Trials, we compared the effects of frequent HD on LVM among patients stratified by SNa, dialysate-to-predialysis serum sodium gradient (GNa), systolic and diastolic blood pressure, time-integrated sodium-adjusted fluid load (TIFL), and extracellular fluid volume estimated by bioelectrical impedance analysis.
RESULTS: In 197 enrolled subjects in the FHN Daily Trial, the treatment effect of frequent HD on ∆LVM was modified by SNa. When the FHN Daily Trial participants were divided into lower and higher predialysis SNa groups (less and greater than 138 mEq/L), the LVM reduction in the lower group was substantially larger (-28.0 [95% CI -40.5 to -15.4] g) than in the higher predialysis SNa group (-2.0 [95% CI -15.5 to 11.5] g). Analyses accounting for GNa and TIFL also showed more pronounced effects among patients with higher GNa or higher TIFL. Results in the Nocturnal Trial were similar in direction and magnitude but did not reach statistical significance.
DISCUSSION/CONCLUSION: In the FHN Daily Trial, the favorable effects of frequent HD on left-ventricular hypertrophy were more pronounced among patients with lower predialysis SNa and higher GNa and TIFL. Whether these metrics can be used to identify patients most likely to benefit from frequent HD or other dialytic or nondialytic interventions remains to be determined. Prospective, adequately powered studies of the effect of GNa reduction on mortality and hospitalization are needed.
Roberto Pecoits-Filho, John Larkin, Carlos Eduardo Poli-de-Figueiredo, Américo Lourenço Cuvello-Neto, Ana Beatriz Lesqueves Barra, Priscila Bezerra Gonçalves, Shimul Sheth, Murilo Guedes, Maggie Han, Viviane Calice-Silva, Manuel Carlos Martins de Castro, Peter Kotanko, Thyago Proenca de Moraes, Jochen G Raimann, Maria Eugenia F Canziani
BACKGROUND: Dialysis patients are typically inactive, and their physical activity (PA) decreases over time. Uremic toxicity has been suggested as a potential causal factor of low PA in dialysis patients. Post-dilution high-volume online hemodiafiltration (HDF) provides greater removal of higher-molecular-weight solutes, and studies suggest better clinical and patient-reported outcomes compared with hemodialysis (HD).
METHODS: HDFIT was a randomized controlled trial at 13 clinics in Brazil that aimed to investigate the effects of HDF on measured PA (step counts) as a primary outcome. Stable HD patients (vintage 3-24 months) were randomized to receive HDF or high-flux HD. The treatment effect of HDF on the primary outcome from baseline to 3 and 6 months was estimated using a linear mixed effects model.
RESULTS: We randomized 195 patients (HDF 97; HD 98) between August 2016 and October 2017. Despite the achievement of a high convective volume in the majority of sessions and a positive impact on solute removal, the treatment effect of HDF on the primary outcome was +538 (95% confidence interval [CI] -330 to 1407) steps/24 h after dialysis compared with HD, and was not statistically significant. The observed treatment effect was modest and driven by steps taken between 1.5 and 24.0 h after dialysis, in particular between 20 and 24 h (+197 steps; 95% CI -95 to 488).
CONCLUSIONS: HDF did not have a statistically significant treatment effect on PA in the 24 h following dialysis, albeit effect sizes may be clinically meaningful and deserve further investigation.
Pablo Maggiani-Aguilera, Jochen G Raimann, Jonathan S Chávez-Iñiguez, Guillermo Navarro-Blackaller, Peter Kotanko, Guillermo Garcia-Garcia
INTRODUCTION: Central venous catheter (CVC) use as vascular access in hemodialysis (HD) is associated with adverse outcomes. Early conversion from CVC to fistula or graft improves these outcomes. While socioeconomic disparities between the USA and Mexico exist, little is known about CVC prevalence and conversion rates in uninsured Mexican HD patients. We examined vascular access practice patterns and their effects on survival and hospitalization rates among uninsured Mexican HD patients, in comparison with HD patients who initiated treatment in the USA.
METHODS: In this retrospective study of incident HD patients at Hospital Civil (HC; Guadalajara, Mexico) and the Renal Research Institute (RRI; USA), we categorized patients by vascular access at the first month of HD and after the following 6 months. Factors associated with continued CVC use were identified by a logistic regression model. We developed multivariate Cox proportional hazards models to investigate the effects of access type and conversion on mortality and hospitalization over an 18-month follow-up period.
RESULTS: In 1,632 patients from RRI, the CVC prevalence at month 1 was 64%, compared with 97% among 174 HC patients. The conversion rate was 31.7% at RRI and 10.6% at HC. Conversion from CVC to a non-central venous catheter access (NON-CVC) reduced the risk of hospitalization at both HC (aHR 0.38 [95% CI: 0.21-0.68], p = 0.001) and RRI (aHR 0.84 [95% CI: 0.73-0.93], p = 0.001). NON-CVC patients had a lower mortality risk in both populations.
DISCUSSION/CONCLUSION: CVC prevalence and CVC-to-NON-CVC conversion rates differed between the US and Mexican patients. An association exists between vascular access type and hospitalization and mortality risk. Prospective studies are needed to evaluate whether accelerated and systematic reduction of catheter use would improve outcomes in these populations.
Ulrich Moissl, Lemuel Rivera Fuentes, Mohamad I Hakim, Manuel Hassler, Dewangi A Kothari, Laura Rosales, Fansan Zhu, Jochen G Raimann, Stephan Thijssen, Peter Kotanko
INTRODUCTION: Inadequate fluid status remains a key driver of cardiovascular morbidity and mortality in chronic hemodialysis (HD) patients. Quantification of fluid overload (FO) using bioimpedance spectroscopy (BIS) has become standard in many countries. To date, no BIS device has been approved in the United States for fluid status assessment in kidney patients; therefore, no quantification of fluid status in US kidney patients using BIS has previously been reported. Our aim was to conduct a cross-sectional BIS-based assessment of fluid status in an urban US HD population.
METHODS: We determined fluid status in chronic HD patients using whole-body BIS (Body Composition Monitor, BCM). The BCM reports FO in liters; negative FO denotes fluid depletion. Measurements were performed before dialysis. Post-HD FO was estimated by subtracting the intradialytic weight loss from the pre-HD FO.
FINDINGS: We studied 170 urban HD patients (age 61 ± 14 years, 60% male). Pre- and post-HD FO (mean ± SD) were 2.2 ± 2.4 and -0.2 ± 2.7 L, respectively. Pre-HD, 43% of patients were fluid overloaded, 53% normally hydrated, and 4% fluid depleted. Post-HD, 12% were fluid overloaded, 55% normohydrated, and 32% fluid depleted. Only 48% of fluid-overloaded patients were hypertensive, while 38% were normotensive and 14% hypotensive. Fluid status did not differ significantly between African Americans (N = 90) and Caucasians (N = 61).
DISCUSSION: While about half of the patients had normal fluid status pre-HD, a considerable proportion was either fluid overloaded or depleted, indicating the need for tools that objectively quantify fluid status.
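The post-HD estimate described in the methods is a simple subtraction, and the three hydration categories follow directly from the sign and magnitude of FO. A minimal sketch, where the ±1.1 L classification cutoffs are illustrative assumptions (the abstract does not state the cutoffs it used):

```python
def post_hd_fluid_overload(pre_hd_fo_l: float, weight_loss_l: float) -> float:
    """Estimate post-HD fluid overload (L): pre-HD FO minus intradialytic weight loss."""
    return pre_hd_fo_l - weight_loss_l

def classify_fluid_status(fo_l: float,
                          overload_cutoff_l: float = 1.1,
                          depletion_cutoff_l: float = -1.1) -> str:
    """Classify fluid status from FO in liters.

    Cutoff values are hypothetical placeholders, not the study's definitions.
    Negative FO denotes fluid depletion, as reported by the BCM.
    """
    if fo_l > overload_cutoff_l:
        return "overloaded"
    if fo_l < depletion_cutoff_l:
        return "depleted"
    return "normohydrated"
```

Using the cohort means, a patient entering at 2.2 L FO who loses 2.4 kg intradialytically ends the session at about -0.2 L, i.e., normohydrated under these assumed cutoffs.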
Hanjie Zhang, Dean Preddie, Warren Krackov, Murat Sor, Peter Waguespack, Zuwen Kuang, Xiaoling Ye, Peter Kotanko
No abstract available
Richard V Remigio, Hao He, Jochen G Raimann, Peter Kotanko, Frank W Maddux, Amy Rebecca Sapkota, Xin-Zhong Liang, Robin Puett, Xin He, Amir Sapkota
BACKGROUND: An increasing number of studies have linked air pollution exposure with renal function decline and disease. However, data are lacking on its impact among end-stage kidney disease (ESKD) patients and on the potential modifying effect of extreme heat events (EHE).
METHODS: Fresenius Kidney Care records from 28 selected northeastern US counties were used to pool daily all-cause mortality (ACM) and all-cause hospital admission (ACHA) counts. County-level daily ambient PM2.5 and ozone (O3) were estimated using a high-resolution spatiotemporally coupled climate-air quality model and matched to ESKD patients based on ZIP codes of treatment sites. We used time-stratified case-crossover analyses to characterize acute exposures using individual and cumulative lag exposures for up to 3 days (Lag 0-3) within a distributed lag nonlinear model framework. We used a nested model comparison hypothesis test to evaluate interaction effects between air pollutants and EHE, and stratification analyses to estimate effect measures modified by EHE days.
RESULTS: From 2001 to 2016, the sample population consisted of 43,338 ESKD patients. We recorded 5,217 deaths and 78,433 hospital admissions. A 10-unit increase in PM2.5 concentration was associated with a 5% increase in ACM (rate ratio [RR Lag0-3]: 1.05, 95% CI: 1.00-1.10), as was same-day O3 (RR Lag0: 1.02, 95% CI: 1.01-1.03), after adjusting for extreme heat exposures. Mortality models suggest evidence of interaction and effect measure modification, though not always simultaneously. ACM risk increased up to 8% when daily ozone concentrations exceeded the US National Ambient Air Quality Standards, and the increases in risk were considerably higher on EHE days across lag periods.
CONCLUSION: Our findings suggest interdependent effects of EHE and air pollution on all-cause mortality risk among ESKD patients. National-level assessments are needed to recognize the ESKD population as a sensitive population and to inform treatment protocols during extreme heat and degraded air quality episodes.
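Rate ratios reported per 10-unit exposure increase, as above, can be rescaled to other increments under the usual log-linear exposure-response assumption of such models: RR(Δ) = RR(10)^(Δ/10). A small sketch of that conversion:

```python
import math

def scale_rate_ratio(rr_per_10: float, delta: float) -> float:
    """Rescale a rate ratio reported per 10-unit exposure increase to an
    arbitrary increment `delta`, assuming a log-linear exposure-response."""
    return math.exp(math.log(rr_per_10) * delta / 10.0)
```

For example, the PM2.5 rate ratio of 1.05 per 10 units corresponds to about 1.10 per 20 units under this assumption.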
Sudhir K Bowry, Peter Kotanko, Rainer Himmele, Xia Tao, Michael Anger
Informed decision-making is paramount to the improvement of dialysis therapies and patient outcomes. A cornerstone of delivery of optimal dialysis therapy is to delineate which substances (uraemic retention solutes or 'uraemic toxins') contribute to the condition of uraemia in terms of deleterious biochemical effects they may exert. Thereafter, decisions can be made as to which of the accumulated compounds need to be targeted for removal and by which strategies. For haemodialysis (HD), the non-selectivity of membranes is sometimes considered a limitation. Yet, considering that dozens of substances with potential toxicity need to be eliminated, and targeting removal of individual toxins explicitly is not recommended, current dialysis membranes enable elimination of several molecules of a broad size range within a single therapy session. However, because HD solute removal is based on size-exclusion principles, i.e. the size of the substances to be removed relative to the mean size of the 'pores' of the membrane, only a limited degree of selectivity of removal is possible. Removal of unwanted substances during HD needs to be weighed against the unavoidable loss of substances that are recognized to be necessary for bodily functions and physiology. In striving to improve the efficiency of HD by increasing the porosity of membranes, there is a greater potential for the loss of substances that are of benefit. Based on this elementary trade-off and availability of recent guidance on the relative toxicity of substances retained in uraemia, we propose a new evidence-linked uraemic toxin elimination (ELUTE) approach whereby only those clusters of substances for which there is a sufficient body of evidence linking them to deleterious biological effects need to be targeted for removal. Our approach involves correlating the physical properties of retention solutes (deemed to express toxicity) with key determinants of membranes and separation processes. 
Our analysis revealed that in attempting to remove the relatively small number of 'larger' substances graded as having only moderate toxicity, uncontrolled (and efficient) removal of several useful compounds would take place simultaneously and may compromise the well-being or outcomes of patients. The bulk of the uraemic toxin load comprises substances below 30 000 Da, which are adequately removed by standard membranes. Further, removal of the few difficult-to-remove-by-dialysis (protein-bound) compounds that express toxicity cannot be achieved by manipulation of pore size alone. The trade-off between the benefits of effective removal of the bulk of the uraemic toxin load and the risks (increased loss of useful substances) associated with targeting the removal of a few larger substances in 'high-efficiency' HD treatment strategies needs to be recognized and better understood. The removability during HD of substances, be they toxic, inert or beneficial, needs to be revised to establish the pros and cons of current dialytic elimination strategies.
Sabrina Casper, Doris H Fuertinger, Leticia M Tapia Silva, Lemuel Rivera Fuentes, Stephan Thijssen, Peter Kotanko
BACKGROUND: Most hemodialysis patients without residual kidney function accumulate fluid between dialysis sessions that must be removed by ultrafiltration. Ultrafiltration usually results in a decline in relative blood volume (RBV). Recent epidemiological research has identified RBV ranges that were associated with significantly better survival. The objective of this work was to develop an ultrafiltration controller that steers a patient's RBV trajectory into these favorable RBV ranges.
METHODS: We designed a proportional-integral (PI) feedback ultrafiltration controller that utilizes signals from a device that reports RBV. The control goal is to attain the RBV trajectory associated with improved patient survival. Additional constraints, such as upper and lower bounds on ultrafiltration volume and rate, were implemented. The controller was evaluated in in silico and ex vivo bench experiments and in a clinical proof-of-concept study in two maintenance dialysis patients.
RESULTS: In all tests, the ultrafiltration controller performed as expected. In the in silico and ex vivo bench experiments, the controller responded robustly to deliberate disruptive interventions (e.g., signal noise; extreme plasma refill rates). No adverse events were observed in the clinical study.
CONCLUSIONS: The ultrafiltration controller can steer RBV trajectories toward desired RBV ranges while obeying a set of constraints. Prospective studies in hemodialysis patients with diverse clinical characteristics are warranted to further explore the controller's impact on intradialytic hemodynamic stability, quality of life, and long-term outcomes.
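The core idea of a proportional-integral controller with hard bounds on the ultrafiltration rate can be sketched in a few lines. This is a generic textbook PI loop with clamping, not the published design; the gains, time step, and limits are hypothetical:

```python
def make_pi_uf_controller(kp: float, ki: float, dt_h: float,
                          uf_min_ml_h: float, uf_max_ml_h: float):
    """Minimal discrete PI controller sketch (hypothetical gains and limits).

    Returns a step function mapping (target RBV %, measured RBV %) to a
    commanded ultrafiltration rate in ml/h, clamped to the prescribed bounds.
    """
    integral = 0.0

    def step(rbv_target_pct: float, rbv_measured_pct: float) -> float:
        nonlocal integral
        # Measured RBV above target -> remove fluid faster.
        error = rbv_measured_pct - rbv_target_pct
        integral += error * dt_h
        uf_rate = kp * error + ki * integral
        # Enforce the prescription's hard bounds on the UF rate.
        return max(uf_min_ml_h, min(uf_max_ml_h, uf_rate))

    return step
```

With gains kp=200, ki=50, a 15-minute step, and bounds of 0-1000 ml/h, a 5-point RBV error saturates the command at the upper bound; once the error vanishes, only the integral term keeps a residual rate.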
Bernard Canaud, Andrew Davenport, Thomas A Golper, Jochen G Raimann
No abstract available
Rasha Hussein, Murilo Guedes, Nada Ibraheim, Mazin M Ali, Amal El-Tahir, Nahla Allam, Hussain Abuakar, Roberto Pecoits-Filho, Peter Kotanko
OBJECTIVES: Despite the possibility of concurrent infection with COVID-19 and malaria, little is known about the clinical course of coinfected patients. We analysed the clinical outcomes of patients with concurrent COVID-19 and malaria infection.
METHODS: We conducted a retrospective cohort study assessing prospectively collected data on all patients admitted between May and December 2020 to the Universal COVID-19 Treatment Center (UCTC), Khartoum, Sudan. UCTC compiled demographic, clinical, laboratory (including testing for malaria), and outcome data for all patients with confirmed COVID-19 hospitalized at that clinic. The primary outcome was all-cause mortality during the hospital stay. We built proportional hazards Cox models with malaria status as the main exposure and stepwise adjustment for age, sex, cardiovascular comorbidities, diabetes, and hypertension.
RESULTS: We included 591 patients with a confirmed COVID-19 diagnosis who were also tested for malaria. Mean (SD) age was 58 (16.2) years, and 446/591 (75.5%) were male. Malaria was diagnosed in 270/591 (45.7%) patients. Most malaria patients were infected with Plasmodium falciparum (140/270; 51.9%), while 121/270 (44.8%) were coinfected with Plasmodium falciparum and Plasmodium vivax. Median follow-up was 29 days. Crude mortality rates were 10.71 and 5.87 per 1000 person-days for patients with and without concurrent malaria, respectively. In the fully adjusted Cox model, patients with concurrent malaria and COVID-19 had a greater mortality risk (hazard ratio 1.43, 95% confidence interval 1.21-1.69).
DISCUSSION: Coinfection with COVID-19 and malaria is associated with increased all-cause in-hospital mortality compared to monoinfection with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2).
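The crude mortality rates quoted above are simple incidence rates: deaths divided by total person-time at risk, expressed per 1000 person-days. A minimal sketch of that calculation:

```python
def crude_mortality_rate(deaths: int, person_days: float, per: float = 1000.0) -> float:
    """Crude mortality rate per `per` person-days of follow-up."""
    if person_days <= 0:
        raise ValueError("person-days at risk must be positive")
    return deaths / person_days * per
```

For instance, 3 deaths over 1500 person-days of follow-up give a crude rate of 2.0 per 1000 person-days.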
Orly F Kohn, Susie Q Lew, Steve Siu-Man Wong, Ramin Sam, Hung-Chun Chen, Jochen G Raimann, David J Leehey, Antonios H Tzamaloukas, Todd S Ing
Herbal medicine, a form of complementary and alternative medicine (CAM), is used throughout the world, in both developing and developed countries. The ingredients in herbal medicines are not standardized by any regulatory agency. Variability exists in the ingredients as well as in their concentrations. Plant products may become contaminated with bacteria and fungi during storage. Therefore, harm can occur to the kidney, liver, and blood components after ingestion. We encourage scientific studies to identify the active ingredients in herbs and to standardize their concentrations in all herbal preparations. Rigorous studies need to be performed in order to understand the effect of herbal ingredients on different organ systems as well as these substances' interaction with other medications.
Jochen G Raimann, Yuedong Wang, Ariella Mermelstein, Peter Kotanko, John T Daugirdas
INTRODUCTION: One proposed threshold of concern for the ultrafiltration rate (UFR) in hemodialysis patients is 13 ml/h per kg. We evaluated associations among UFR, postdialysis weight, and mortality to determine whether exceeding such a threshold would result in similar levels of risk for patients of different body weights.
METHODS: Data were analyzed in this retrospective cohort study for 1 year following dialysis initiation (baseline) and over 2 years of follow-up in incident patients receiving thrice-weekly in-center hemodialysis. Patient-level UFR was averaged over the baseline period. To investigate the joint effect of UFR and postdialysis weight on survival, we fit Cox proportional hazards models using bivariate tensor product spline functions, adjusting for sex, race, age, diabetes, and predialysis serum albumin, phosphorus, and systolic blood pressure (BP). We constructed contour plots of mortality hazard ratios (MHRs) over the entire range of UFR values and postdialysis weights.
RESULTS: In the studied 2542 patients, UFR not scaled to body weight was strongly associated with MHR, whereas postdialysis weight was inversely associated with MHR. The MHR crossed 1.5 when unscaled UFR exceeded 1000 ml/h, and this relationship was largely independent of postdialysis weight in the range of 80 to 140 kg. A UFR warning level associated with a lower MHR of 1.3 would be 900 ml/h, whereas the UFR associated with an MHR of 1.0 was patient-size dependent. The MHR when exceeding a UFR threshold of 13 ml/h per kg depended on patient weight (MHR = 1.20, 1.45, and >2.0 for a 60, 80, and 100 kg patient, respectively).
CONCLUSION: UFR thresholds based on unscaled UFR give more uniform risk levels for patients of different sizes than thresholds based on UFR/kg.
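The contrast between the unscaled and weight-scaled thresholds is easy to make concrete. A small sketch that flags a session against both thresholds discussed in the abstract (the 1000 ml/h and 13 ml/h/kg values come from the text; the helper names are ours):

```python
def ufr_ml_per_h(fluid_removed_l: float, treatment_h: float) -> float:
    """Ultrafiltration rate in ml/h from fluid removed (L) and session length (h)."""
    return fluid_removed_l * 1000.0 / treatment_h

def exceeds_thresholds(ufr_ml_h: float, post_weight_kg: float,
                       unscaled_ml_h: float = 1000.0,
                       scaled_ml_h_kg: float = 13.0) -> dict:
    """Flag a session against the unscaled (ml/h) and weight-scaled (ml/h/kg)
    warning thresholds discussed in the abstract."""
    return {
        "unscaled": ufr_ml_h > unscaled_ml_h,
        "scaled": ufr_ml_h / post_weight_kg > scaled_ml_h_kg,
    }
```

A 60-kg patient at 900 ml/h exceeds the per-kg threshold (15 ml/h/kg) but not the unscaled one, whereas a 100-kg patient at 1100 ml/h exceeds only the unscaled threshold; this divergence is exactly the size dependence the study quantifies.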
Gudrun Schappacher-Tilp, Peter Kotanko, Markus Pirklbauer
Altered parathyroid gland biology is a major driver of chronic kidney disease-mineral bone disorder (CKD-MBD) in patients with chronic kidney disease. CKD-MBD is associated with a high risk of vascular calcification and cardiovascular events. A hallmark of CKD-MBD is secondary hyperparathyroidism with increased parathyroid hormone (PTH) synthesis and release and reduced expression of calcium-sensing receptors on the surface of parathyroid cells and eventually hyperplasia of parathyroid gland cells. The KDIGO guidelines strongly recommend the control of PTH in hemodialysis patients. Due to the complexity of parathyroid gland biology, mathematical models have been employed to study the interaction of PTH regulators and PTH plasma concentrations. Here, we present an overview of various model approaches and discuss the impact of different model structures and complexities on the clinical use of these models.
Peter Kotanko, David J Jörg, Nadja Grobe, Christoph Zaba
Erythropoietin deficiency is an extensively researched cause of renal anemia. The etiology and consequences of shortened red blood cell (RBC) life span in chronic kidney disease (CKD) are less well understood. Traversing capillaries requires RBC geometry changes, a process enabled by adaptations of the cytoskeleton. These changes are mediated by transient activation of the mechanosensory Piezo1 channel, resulting in calcium influx. Importantly, prolonged Piezo1 activation shortens RBC life span, presumably through activation of calcium-dependent intracellular pathways triggering RBC death. Two Piezo1-activating small molecules, Jedi1 and Jedi2, share remarkable structural similarities with 3-carboxy-4-methyl-5-propyl-2-furanpropanoic acid (CMPF), a uremic retention solute cleared by the healthy kidney. We hypothesize that in CKD the accumulation of CMPF leads to prolonged activation of Piezo1 (similar in effect to Jedi1 and Jedi2), thus reducing RBC life span. This hypothesis can be tested through bench experiments and, ultimately, by studying the effect of CMPF removal on renal anemia.
Usama Hussein, Monica Cimini, Garry J Handelman, Jochen G Raimann, Li Liu, Samer R Abbas, Peter Kotanko, Nathan W Levin, Fredric O Finkelstein, Fansan Zhu
Diagnosis of fluid overload (FO) at an early stage is essential to manage fluid balance in patients with chronic kidney disease (CKD) and to prevent cardiovascular disease (CVD). However, the identification of fluid status in patients with CKD largely depends on the physician's clinical acumen. The ratio of fluid overload to extracellular volume (FO/ECV) has been used as a reference to assess fluid status. The primary aim of this study was to compare FO/ECV with other bioimpedance methods and with clinical assessment in patients with CKD. Whole-body ECV, intracellular volume (ICV), total body water (TBW), and calf normalized resistivity (CNR) were measured (Hydra 4200). Thresholds of FO utilizing CNR and ECV/TBW were derived by receiver operating characteristic (ROC) analysis based on data from pooled patients with CKD and healthy subjects (HSs). Clinical assessments of FO in patients with CKD were performed by nephrologists. Patients with CKD (stages 3 and 4) (n = 50) and HSs (n = 189) were studied. The thresholds of FO were ≤14.3 × 10⁻² Ωm³/kg for females and ≤13.1 × 10⁻² Ωm³/kg for males using CNR, and ≥0.445 in females and ≥0.434 in males using ECV/TBW. FO was diagnosed in 78%, 62%, and 52% of patients with CKD by CNR, FO/ECV, and ECV/TBW, respectively, whereas only 24% of patients with CKD were diagnosed as FO by clinical assessment. The proportion of FO in patients with nondialysis CKD was largely underestimated by clinical assessment compared with FO/ECV, CNR, and ECV/TBW. The CNR and FO/ECV methods were more sensitive than ECV/TBW in identifying fluid overload in these patients with CKD.
NEW & NOTEWORTHY We found that fluid overload (FO) in patients with nondialysis CKD was largely underestimated by clinical assessment compared with bioimpedance methods, largely owing to the lack of appropriate techniques to assess FO. In addition, although the degree of FO by bioimpedance markers correlated positively with age in healthy subjects (HSs), no difference in the three hydration markers was observed between the 50 ≤ age < 70 yr and age ≥ 70 yr groups in the patients with CKD.
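The sex-specific cutoffs reported above translate directly into classification rules: low CNR or high ECV/TBW flags fluid overload. A minimal sketch using the thresholds from the abstract (CNR in units of 10⁻² Ωm³/kg):

```python
def fluid_overload_by_cnr(cnr: float, sex: str) -> bool:
    """Flag fluid overload from calf normalized resistivity (10^-2 Ohm*m^3/kg),
    using the sex-specific thresholds reported in the abstract."""
    threshold = 14.3 if sex == "F" else 13.1
    return cnr <= threshold

def fluid_overload_by_ecv_tbw(ecv_tbw: float, sex: str) -> bool:
    """Flag fluid overload from the ECV/TBW ratio, using the sex-specific
    thresholds reported in the abstract."""
    threshold = 0.445 if sex == "F" else 0.434
    return ecv_tbw >= threshold
```

Note the opposite directions: lower resistivity indicates more fluid, whereas a higher extracellular fraction of total body water does.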
Lin-Chun Wang, Jochen G Raimann, Xia Tao, Priscila Preciado, Ohnmar Thwin, Laura Rosales, Stephan Thijssen, Peter Kotanko, Fansan Zhu
INTRODUCTION: Segmental eight-point bioimpedance has been increasingly used in practice. However, whether changes in bioimpedance analysis components before and after hemodialysis (HD) measured with this technique in a standing position are comparable to the traditional whole-body wrist-to-ankle method remains unclear. We aimed to investigate the differences between two eight-point devices (InBody 770 and Seca mBCA 514) and one wrist-to-ankle device (Hydra 4200) in HD patients and healthy subjects in a standing position.
METHODS: Thirteen HD patients were studied pre- and post-HD, and 12 healthy subjects once. Four measurements were performed in the following order: InBody, Seca, Hydra, and InBody again. The electrical equivalent models used by each bioimpedance method and the fluid volume estimates from each device were also compared.
FINDINGS: Overall, total body water (TBW) did not differ between the three devices, but InBody showed lower extracellular water (ECW) and higher intracellular water (ICW) than the other two devices. When intradialytic weight loss was used as a surrogate for changes in ECW (∆ECW) and changes in TBW (∆TBW), ∆ECW was underestimated by Hydra (-0.79 ± 0.89 L, p < 0.01), InBody (-1.44 ± 0.65 L, p < 0.0001), and Seca (-0.32 ± 1.34 L, n.s.). ∆TBW was underestimated by Hydra (-1.14 ± 2.81 L, n.s.) and InBody (-0.52 ± 0.85 L, p < 0.05) but overestimated by Seca (+0.93 ± 3.55 L, n.s.).
DISCUSSION: Although segmental eight-point bioimpedance techniques provided comparable TBW measurements unaffected by standing for 10-15 min, the ECW/TBW ratio was significantly lower with InBody than with Seca and Hydra. Our results showed a lack of agreement between the different bioimpedance devices; direct comparison of ECW, ICW, and ECW/TBW between devices should therefore be avoided, and clinicians should use the same device to track fluid status longitudinally in their HD population.
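The under- and overestimation figures above are mean paired differences between each device's reported volume change and the intradialytic weight loss used as the surrogate truth. A minimal sketch of that bias computation:

```python
def mean_bias(device_deltas, weight_losses):
    """Mean paired difference (L) between device-reported fluid volume change
    and intradialytic weight loss used as the surrogate reference.

    Negative values indicate the device underestimates the change.
    """
    if len(device_deltas) != len(weight_losses):
        raise ValueError("paired samples required")
    diffs = [d - w for d, w in zip(device_deltas, weight_losses)]
    return sum(diffs) / len(diffs)
```

For example, device-reported changes of 1.0 and 2.0 L against weight losses of 2.0 and 3.0 L give a mean bias of -1.0 L, i.e., a systematic underestimate.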
Murilo Guedes, Liz Wallim, Camila R Guetter, Yue Jiao, Vladimir Rigodon, Chance Mysayphonh, Len A Usvyat, Pasqual Barretti, Peter Kotanko, John W Larkin, Franklin W Maddux, Roberto Pecoits-Filho, Thyago Proenca de Moraes
BACKGROUND: We tested whether fatigue in incident peritoneal dialysis (PD) patients was associated with an increased risk of mortality, independently of the main confounders.
METHODS: We conducted a side-by-side study of two cohorts of incident PD patients, in Brazil and the United States. We used the same code to independently analyze data in both countries covering 2004 to 2011. We included data from adults who completed the KDQOL-SF vitality subscale within 90 days after starting PD. Vitality scores were categorized into four groups: >50 (high vitality), ≥40 to ≤50 (moderate vitality), >35 to <40 (moderate fatigue), and ≤35 (high fatigue; reference group). In each country's cohort, we built four distinct models to estimate the association between vitality (exposure) and all-cause mortality (outcome): (i) a Cox regression model; (ii) a competing-risk model accounting for technique failure events; (iii) a multilevel survival model with clinic-level clusters; and (iv) a multivariate regression model with smoothing splines treating vitality as a continuous measure. Analyses were adjusted for age, comorbidities, PD modality, hemoglobin, and albumin. A mixed-effects meta-analysis was used to pool hazard ratios (HRs) from both cohorts to model mortality risk per 10-unit increase in vitality.
RESULTS: We used data from 4,285 PD patients (Brazil n = 1,388; United States n = 2,897). Model estimates showed that lower vitality levels within 90 days of starting PD were associated with a higher risk of mortality, consistently in the Brazil and United States cohorts. In the multivariate survival model, each 10-unit increase in vitality score was associated with a lower risk of all-cause mortality in both cohorts (Brazil HR = 0.79 [95% CI 0.70 to 0.90]; United States HR = 0.90 [95% CI 0.88 to 0.93]; pooled HR = 0.86 [95% CI 0.75 to 0.98]). All models provided consistent effect estimates.
CONCLUSIONS: Among patients in Brazil and the United States, a lower vitality score in the initial months of PD was independently associated with all-cause mortality.
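The four vitality categories defined in the methods map cleanly onto a small classification function; the boundary conditions (≥40 to ≤50, >35 to <40) are worth encoding exactly as stated:

```python
def vitality_group(score: float) -> str:
    """Map a KDQOL-SF vitality score to the study's four categories."""
    if score > 50:
        return "high vitality"
    if score >= 40:            # >=40 and <=50
        return "moderate vitality"
    if score > 35:             # >35 and <40
        return "moderate fatigue"
    return "high fatigue"      # <=35 (reference group)
```

Note that a score of exactly 50 falls into "moderate vitality" and a score of exactly 40 does too, while 35 belongs to the "high fatigue" reference group.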
Bernard Canaud, Jeroen Kooman, Andreas Maierhofer, Jochen Raimann, Jens Titze, Peter Kotanko
New physiologic findings related to sodium homeostasis and their pathophysiologic associations require a new vision for sodium, fluid, and blood pressure management in dialysis-dependent chronic kidney disease patients. The traditional dry weight probing approach that has prevailed for many years must be reviewed in light of these findings and enriched by the availability of new tools for monitoring and handling sodium and water imbalances. A comprehensive and integrated approach is needed to further improve cardiac health in hemodialysis (HD) patients. Adequate management of sodium, water, volume, and hemodynamics in HD patients relies on a stepwise approach: the first step entails assessment and monitoring of fluid status, relying on clinical judgement supported by specific tools embedded online in the HD machine or used offline; the second consists of correcting fluid imbalance, mainly through the dialysis prescription (treatment time, active tools embedded in the HD machine) but also through guidance on diet and thirst management; the third consists of fine-tuning the treatment prescription to patient responses and tolerance with the support of innovative tools such as artificial intelligence and remote pervasive health trackers. It is time to return to sodium and water imbalance as the root cause of the problem and not to act primarily on its consequences (fluid overload, hypertension) or organ damage (heart, atherosclerosis, brain). We know the problem and have the tools to assess and manage sodium and fluid in HD patients more precisely. We strongly call for a sodium-first approach to reduce disease burden and improve cardiac health in dialysis-dependent chronic kidney disease patients.
Jeroen Peter Kooman, Paola Carioni, Vratislava Kovarova, Otto Arkossy, Anke Winter, Yan Zhang, Francesco Bellocchio, Peter Kotanko, Hanjie Zhang, Len Usvyat, John Larkin, Stefano Stuard, Luca Neri
INTRODUCTION: Patients with end-stage kidney disease face a higher risk of severe outcomes from SARS-CoV-2 infection. Moreover, it is not well known to what extent potentially modifiable risk factors contribute to mortality risk. In this historical cohort study, we investigated the incidence of and risk factors for 30-day mortality among hemodialysis patients with SARS-CoV-2 infection treated in the European Fresenius Medical Care NephroCare network, using conventional and machine learning techniques.
METHODS: We included adult hemodialysis patients with a first documented SARS-CoV-2 infection between February 1, 2020, and March 31, 2021, registered in the clinical database. The index date for the analysis was the first SARS-CoV-2 suspicion date. Patients were followed for up to 30 days, until April 30, 2021. Demographics, comorbidities, and various modifiable risk factors, expressed as continuous parameters and as key performance indicators (KPIs), were considered, covering multiple dimensions including hemodynamic control, nutritional state, and mineral metabolism in the 6 months before the index date. We used logistic regression (LR) and XGBoost models to assess risk factors for 30-day mortality.
RESULTS: We included 9,211 eligible patients (age 65.4 ± 13.7 years, dialysis vintage 4.2 ± 3.7 years). The 30-day mortality rate was 20.8%. In LR models, several potentially modifiable factors were associated with higher mortality: body mass index (BMI) 30-40 kg/m2 (OR: 1.28, CI: 1.10-1.50), single-pool Kt/V (OR off-target vs. on-target: 1.19, CI: 1.02-1.38), overhydration (OR: 1.15, CI: 1.01-1.32), and both low (<2.5 mg/dl) and high (≥5.5 mg/dl) serum phosphate levels (OR: 1.52, CI: 1.07-2.16 and OR: 1.17, CI: 1.01-1.35, respectively). Online hemodiafiltration was protective in the model using KPIs (OR: 0.86, CI: 0.76-0.97). SHapley Additive exPlanations analysis in the XGBoost models likewise showed a strong influence on prediction for several modifiable factors, including inflammatory parameters, high BMI, and fluid overload. In both LR and XGBoost models, age, gender, and comorbidities were strongly associated with mortality.
CONCLUSION: Both conventional and machine learning techniques showed that KPIs and modifiable risk factors in different dimensions, ascertained 6 months before the COVID-19 suspicion date, were associated with 30-day COVID-19-related mortality. Our results suggest that adequate dialysis and achieving KPI targets remain of major importance during the COVID-19 pandemic.
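The odds ratios reported from the LR models are exponentiated regression coefficients. A small sketch of that standard conversion (and its inverse), useful when comparing coefficients across models:

```python
import math

def odds_ratio(coef: float) -> float:
    """Convert a logistic-regression coefficient (log-odds scale) to an odds ratio."""
    return math.exp(coef)

def coef_from_or(or_value: float) -> float:
    """Inverse: recover the log-odds coefficient from a reported odds ratio."""
    if or_value <= 0:
        raise ValueError("odds ratio must be positive")
    return math.log(or_value)
```

A coefficient of 0 corresponds to OR = 1 (no association); the protective hemodiafiltration result (OR 0.86) corresponds to a negative coefficient of about -0.15.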
Xiaoling Wang, Maggie Han, Lemuel Rivera Fuentes, Ohnmar Thwin, Nadja Grobe, Kevin Wang, Yuedong Wang, Peter Kotanko
BACKGROUND: In hemodialysis patients, a third vaccination is frequently administered to augment protection against coronavirus disease 2019 (COVID-19). However, the newly emerged B.1.1.529 (Omicron) variant may evade vaccinal protection more easily than previous strains. It is of clinical interest to better understand the neutralizing activity against severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) variants after booster vaccination or COVID-19 infection in these mostly immunocompromised patients.
METHODS: Hemodialysis patients from four dialysis centers were recruited between June 2021 and February 2022. Each patient provided a median of six serum samples. SARS-CoV-2 neutralizing antibodies (nAbs) against wild type (WT) or Omicron were measured using the GenScript SARS-CoV-2 Surrogate Virus Neutralization Test Kit.
RESULTS: Forty-two patients received three doses of mRNA-1273. Compared to levels prior to the third dose, nAb-WT increased 18-fold (peak at day 23) and nAb-Omicron increased 23-fold (peak at day 24) after the third dose. Peak nAb-WT exceeded peak nAb-Omicron 27-fold. Twenty-one patients had COVID-19 between December 24, 2021, and February 2, 2022. Following COVID-19, nAb-WT and nAb-Omicron increased 12- and 40-fold, respectively. While vaccinal and post-COVID nAb-WT levels were comparable, post-COVID nAb-Omicron levels were 3.2-fold higher than the respective peak vaccinal nAb-Omicron levels. Four patients who were immunocompromised for reasons other than end-stage kidney disease had very low to no nAb after the third dose or COVID-19.
CONCLUSIONS: Our results suggest that most hemodialysis patients mount a strong humoral response to the third vaccine dose and an even stronger post-COVID-19 humoral response. Nevertheless, nAb levels clearly decay over time. These findings may inform ongoing discussions regarding a fourth vaccination in hemodialysis patients.
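The fold changes and peak days quoted above come from comparing serial titers against the pre-dose baseline. A minimal sketch of that summary over a hypothetical {day: titer} series:

```python
def peak_fold_change(series: dict, baseline: float):
    """Return (peak day, fold change vs baseline) from {day: titer} measurements.

    `series` and `baseline` here are illustrative; real titers come from the
    surrogate virus neutralization assay.
    """
    if baseline <= 0:
        raise ValueError("baseline titer must be positive")
    day, peak = max(series.items(), key=lambda kv: kv[1])
    return day, peak / baseline
```

For a baseline of 10 and samples on days 10, 23, and 40, the function picks the day-23 peak and reports an 18-fold rise, mirroring the nAb-WT summary in the abstract.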
Xin Wang, Leticia M Tapia Silva, Milind Nikam, Sandip Mitra, Syed Shaukat Abbas Zaidi, Nadja Grobe
The aim of this paper is to summarize the current understanding of the molecular biology of the arteriovenous fistula (AVF). It intends to encourage vascular access teams, care providers, and scientists to explore new molecular tools for assessing the suitability of patients for AVF as vascular access for maintenance hemodialysis (HD). This review also highlights the most recent discoveries and may serve as a guide to exploring biomarkers and technologies for the assessment of kidney disease patients choosing to start kidney replacement therapy. Objective criteria for AVF eligibility are lacking, partly because the underlying physiology of AVF maturation is poorly understood. Several molecular processes during the life cycle of an AVF, even before creation, can be characterized by measuring molecular fingerprints using the newest "omics" technologies. In addition to hypothesis-driven strategies, untargeted approaches have the potential to reveal the interplay of hundreds of metabolites, transcripts, proteins, and genes underlying cardiovascular adaptation and vascular access-related adjustments at any given timepoint in a patient with kidney disease. As a result, regular monitoring of modifiable, molecular risk factors together with clinical assessment could help reduce AVF failure rates, increase patency, and improve long-term outcomes. In the future, identification of vulnerable patients based on the assessment of biological markers of AVF maturation at different stages of the life cycle may aid in individualizing vascular access recommendations.
David J Jörg, Doris H Fuertinger, Alhaji Cherif, David A Bushinsky, Ariella Mermelstein, Jochen G Raimann, Peter Kotanko
Our bones are constantly being renewed in a fine-tuned cycle of destruction and formation that helps keep them healthy and strong. However, this process can become imbalanced and lead to osteoporosis, where the bones are weakened and have a high risk of fracturing. This is particularly common post-menopause, with one in three women over the age of 50 experiencing a broken bone due to osteoporosis. There are several drug types available for treating osteoporosis, which work in different ways to strengthen bones. These drugs can be taken individually or combined, meaning that a huge number of drug combinations and treatment strategies are theoretically possible. However, it is not practical to test the effectiveness of all of these options in human trials. This could mean that patients are not getting the maximum potential benefit from the drugs available. Jörg et al. developed a mathematical model to predict how different osteoporosis drugs affect the process of bone renewal in the human body. The model could then simulate the effect of changing the order in which the therapies were taken, which showed that the sequence had a considerable impact on the efficacy of the treatment. This occurs because different drugs can interact with each other, leading to an improved outcome when they work in the right order. These results suggest that people with osteoporosis may benefit from altered treatment schemes without changing the type or amount of medication taken. The model could suggest new treatment combinations that reduce the risk of bone fracture, potentially even developing personalised plans for individual patients based on routine clinical measurements in response to different drugs.
Gabriela F Dias, Sara S Tozoni, Gabriela Bohnen, Beatriz A K van Spitzenbergen, Nadja Grobe, Lia S Nakao, Roberto Pecoits-Filho, Peter Kotanko, Andréa N Moreno-Amaral
Oxidative stress (OS) is essential in uremia-associated comorbidities, including renal anemia. Complications experienced by hemodialysis (HD) patients, such as hypoxemia and the accumulation of uremic toxins, induce OS and premature death of red blood cells (RBC). We aimed to characterize reactive oxygen species (ROS) production and antioxidant pathways in HD-RBC and RBC from healthy controls (CON-RBC) and to evaluate the role of uremia and hypoxia in these pathways. ROS production, xanthine oxidase (XO) and superoxide dismutase (SOD) activities, glutathione (GSH), and heme oxygenase-1 (HO-1) levels were measured using flow cytometry or spectrophotometry in CON-RBC and HD-RBC (pre- and post-HD), at baseline and after 24 h of incubation with uremic serum (S-HD) and/or under hypoxic conditions (5% O2). Ketoprofen was used to inhibit RBC uptake of uremic toxins. HD-RBC showed higher ROS levels and lower XO activity than CON-RBC, particularly post-HD. GSH levels were lower, while SOD activity and HO-1 levels were higher, in HD-RBC than in controls. Hypoxia per se triggered ROS production in CON-RBC and HD-RBC; S-HD, on top of hypoxia, increased ROS levels further. Inhibition of uremic toxin uptake attenuated ROS in CON-RBC and HD-RBC under hypoxia and uremia. CON-RBC exposed to uremia and hypoxia showed lower GSH levels than cells under normoxic and non-uremic conditions. The redox mechanisms of HD-RBC are thus altered and prone to oxidation, and uremic toxins and hypoxia play a role in unbalancing these systems. Hypoxia and uremia participate in the pathogenesis of OS in HD-RBC and might induce RBC death, thereby compounding anemia.
Paulo P Galuzio, Alhaji Cherif, Xia Tao, Ohnmar Thwin, Hanjie Zhang, Stephan Thijssen, Peter Kotanko
In patients with kidney failure treated by hemodialysis, intradialytic arterial oxygen saturation (SaO2) time series present intermittent high-frequency, high-amplitude oximetry patterns (IHHOP), which correlate with observed sleep-associated breathing disturbances. A new method for identifying such intermittent patterns is proposed, based on the analysis of recurrence in the time series through the quantification of an optimal recurrence threshold. New time series of the optimal recurrence threshold were constructed using a rolling-window scheme, which allowed real-time identification of the occurrence of IHHOPs. The results for the optimal recurrence threshold were compared with standard metrics used in studies of obstructive sleep apnea, namely the oxygen desaturation index (ODI) and the oxygen desaturation density (ODD). A high correlation between the optimal threshold and the ODD was observed. Using the ODI as a surrogate for the apnea-hypopnea index (AHI), it was shown that the optimal threshold distinguishes occurrences of sleep apnea with great accuracy. When used in binary classifiers, this newly proposed metric has great power for predicting the occurrence of sleep apnea-related events, as shown by the area under the receiver operating characteristic (ROC) curve exceeding 0.90. Therefore, the optimal threshold from recurrence analysis can be used as a metric to quantify the occurrence of abnormal behavior in arterial oxygen saturation time series.
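As a rough illustration of the recurrence idea (not the authors' implementation; the window size, target recurrence rate, and all names below are invented for this sketch), an optimal threshold can be found by bisecting on the recurrence rate inside rolling windows:

```python
def recurrence_rate(window, eps):
    """Fraction of point pairs (i, j) whose distance is within eps."""
    n = len(window)
    hits = sum(1 for i in range(n) for j in range(i + 1, n)
               if abs(window[i] - window[j]) <= eps)
    return hits / (n * (n - 1) / 2)

def optimal_threshold(window, target=0.1, tol=1e-4):
    """Bisect for the smallest eps giving roughly the target recurrence rate."""
    lo, hi = 0.0, max(window) - min(window)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if recurrence_rate(window, mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def rolling_thresholds(series, width=60, step=15, target=0.1):
    """Rolling-window time series of the optimal recurrence threshold."""
    return [optimal_threshold(series[s:s + width], target)
            for s in range(0, len(series) - width + 1, step)]
```

Large intradialytic SaO2 swings inside a window force a larger threshold, so spikes in the rolling threshold series would flag intermittent high-amplitude episodes.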
Adrián M Guinsburg, Yue Jiao, María Inés Díaz Bessone, Caitlin K Monaghan, Beatriz Magalhães, Michael A Kraus, Peter Kotanko, Jeffrey L Hymes, Robert J Kossmann, Juan Carlos Berbessi, Franklin W Maddux, Len A Usvyat, John W Larkin
BACKGROUND: We developed machine learning models to understand the predictors of shorter-, intermediate-, and longer-term mortality among hemodialysis (HD) patients affected by COVID-19 in four countries in the Americas.
METHODS: We used data from adult HD patients treated at regional institutions of a global provider in Latin America (LatAm) and North America who contracted COVID-19 in 2020 before SARS-CoV-2 vaccines were available. Using 93 commonly captured variables, we developed machine learning models that predicted the likelihood of death overall, as well as during 0-14, 15-30, and >30 days after COVID-19 presentation, and identified the importance of predictors. XGBoost models were built in parallel using the same programming with a 60%:20%:20% random split for training, validation, and testing data for the datasets from LatAm (Argentina, Colombia, Ecuador) and North America (United States).
RESULTS: Among HD patients with COVID-19, 28.8% (1,001/3,473) died in LatAm and 20.5% (4,426/21,624) died in North America. Mortality occurred earlier in LatAm versus North America: 15.0% and 7.3% of patients died within 0-14 days, 7.9% and 4.6% within 15-30 days, and 5.9% and 8.6% more than 30 days after COVID-19 presentation, respectively. Area under the curve ranged from 0.73 to 0.83 across prediction models in both regions. Top predictors of death after COVID-19 consistently included older age, longer vintage, and markers of poor nutrition and greater inflammation in both regions at all timepoints. Unique patient attributes (higher BMI, male sex) were top predictors of mortality during 0-14 and 15-30 days after COVID-19, yet not of mortality >30 days after presentation.
CONCLUSIONS: Findings showed distinct profiles of mortality in COVID-19 in LatAm and North America throughout 2020. The mortality rate was higher within 0-14 and 15-30 days after COVID-19 in LatAm, while it was higher in North America >30 days after presentation. Nonetheless, a remarkable proportion of HD patients died >30 days after COVID-19 presentation in both regions. We were able to develop a series of suitable prognostic prediction models and establish the top predictors of death in COVID-19 during shorter-, intermediate-, and longer-term follow-up periods.
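The 60%:20%:20% random split described above can be sketched in plain Python (illustrative only; the study's actual pipeline trained XGBoost on 93 variables, which is not reproduced here):

```python
import random

def train_val_test_split(records, seed=42, frac=(0.6, 0.2, 0.2)):
    """Randomly partition records into train/validation/test sets."""
    shuffled = records[:]                 # copy so the input stays intact
    random.Random(seed).shuffle(shuffled)
    n = len(shuffled)
    n_train = int(frac[0] * n)
    n_val = int(frac[1] * n)
    train = shuffled[:n_train]
    val = shuffled[n_train:n_train + n_val]
    test = shuffled[n_train + n_val:]
    return train, val, test
```

Running the same seeded split "in parallel with the same programming" on two regional datasets keeps the model-building procedure identical while the data differ.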
Ana Paula Bernardo, Paola Carioni, Stefano Stuard, Peter Kotanko, Len A Usvyat, Vratislava Kovarova, Otto Arkossy, Francesco Bellocchio, Antonio Tupputi, Federica Gervasoni, Anke Winter, Yan Zhang, Hanjie Zhang, Pedro Ponce, Luca Neri
BACKGROUND: Hemodialysis patients have a high risk of severe SARS-CoV-2 infection but were underrepresented in randomized controlled trials evaluating the safety and efficacy of COVID-19 vaccines. We estimated the real-world effectiveness of COVID-19 vaccines in a large international cohort of hemodialysis patients.
METHODS: In this historical, 1:1 matched cohort study, we included adult hemodialysis patients receiving treatment from December 1, 2020, to May 31, 2021. For each vaccinated patient, an unvaccinated control was selected among patients registered in the same country and attending a dialysis session around the first vaccination date. Matching was based on demographics, clinical characteristics, past COVID-19 infections, and a risk score representing the local background risk of infection at the vaccination date. We estimated the effectiveness of mRNA and viral-carrier COVID-19 vaccines in preventing infection and mortality from a time-dependent Cox regression stratified by country.
RESULTS: In the effectiveness analysis of mRNA vaccines, we observed 850 SARS-CoV-2 infections and 201 COVID-19-related deaths among the 28,110 patients during a mean follow-up of 44 ± 40 days. In the analysis of viral-carrier vaccines, we observed 297 SARS-CoV-2 infections and 64 COVID-19-related deaths among 12,888 patients during a mean follow-up of 48 ± 32 days. We observed 18.5 and 8.5 fewer infections per 100 patient-years, and 5.4 and 5.2 fewer COVID-19-related deaths per 100 patient-years, among patients vaccinated with mRNA and viral-carrier vaccines, respectively, compared with matched unvaccinated controls. Estimated vaccine effectiveness at days 15, 30, 60, and 90 after the first dose of an mRNA vaccine was 41.3%, 54.5%, 72.6%, and 83.5% for infection and 33.1%, 55.4%, 80.1%, and 91.2% for death. Estimated effectiveness after the first dose of a viral-carrier vaccine was 38.3% for infection, without increasing over time, and 56.6%, 75.3%, 92.0%, and 97.4% for death.
CONCLUSION: In this large, real-world cohort of hemodialyzed patients, mRNA and viral-carrier COVID-19 vaccines were associated with reduced COVID-19-related mortality. Additionally, we observed a strong reduction of SARS-CoV-2 infection in hemodialysis patients receiving mRNA vaccines.
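Greedy 1:1 matching on a scalar risk score might look like the sketch below (the study also matched on demographics, clinical characteristics, and past infections; this scalar simplification and all names are assumptions, not the authors' algorithm):

```python
def match_controls(vaccinated, unvaccinated, key=lambda p: p["risk"]):
    """Greedy 1:1 nearest-neighbour matching, without replacement,
    on a scalar risk score."""
    pool = sorted(unvaccinated, key=key)
    matches = {}
    for patient in sorted(vaccinated, key=key):
        if not pool:
            break  # ran out of candidate controls
        # pick the remaining control with the closest risk score
        best = min(pool, key=lambda c: abs(key(c) - key(patient)))
        matches[patient["id"]] = best["id"]
        pool.remove(best)
    return matches
```

Matching without replacement means each control serves one vaccinated patient only, as required for a 1:1 matched cohort.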
Dalia E Yousif, Xiaoling Ye, Stefano Stuard, Juan Berbessi, Adrian M Guinsburg, Len A Usvyat, Jochen G Raimann, Jeroen P Kooman, Frank M van der Sande, Neill Duncan, Kevin J Woollard, Rupert Bright, Charles Pusey, Vineet Gupta, Joachim H Ix, Peter Kotanko, Rakesh Malhotra
INTRODUCTION: Inflammation is highly prevalent among patients with end-stage kidney disease and is associated with adverse outcomes. We aimed to investigate longitudinal changes in inflammatory markers in a diverse international incident hemodialysis patient population.
METHODS: The MONitoring Dialysis Outcomes (MONDO) Consortium encompasses hemodialysis databases from 31 countries in Europe, North America, South America, and Asia. The MONDO database was queried for inflammatory markers (total white blood cell count [WBC], neutrophil count, lymphocyte count, serum albumin, and C-reactive protein [CRP]) and hemoglobin levels in incident hemodialysis patients. Laboratory parameters were measured every month. Patients were stratified by survival time (≤6 months, >6 to 12 months, >12 to 18 months, >18 to 24 months, >24 to 30 months, >30 to 36 months, and >36 months) following dialysis initiation. We used a cubic B-spline basis to evaluate temporal changes in inflammatory parameters in relation to patient survival.
RESULTS: We studied 18,726 incident hemodialysis patients. Their age at dialysis initiation was 71.3 ± 11.9 years; 10,802 (58%) were male. Within the first 6 months, 2,068 (11%) patients died, and 12,295 (67%) survived >36 months (survivor cohort). Hemodialysis patients who died showed a distinct biphasic pattern of change in inflammatory markers, in which an initial decline of inflammation was followed by a rapid rise that was consistently evident approximately 6 months before death. This pattern was similar in all patients who died and was consistent across the survival time intervals. In contrast, in the survivor cohort, we observed an initial decline of inflammation followed by sustained low levels of inflammatory biomarkers.
CONCLUSION: Our international study of incident hemodialysis patients highlights a temporal relationship between serial measurements of inflammatory markers and patient survival. This finding may inform the development of prognostic models, such as the integration of dynamic changes in inflammatory markers for individual risk profiling and for guiding preventive and therapeutic interventions.
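The cubic B-spline basis underlying such temporal analyses can be evaluated with the standard Cox-de Boor recursion; the knot placement below is illustrative, not taken from the study:

```python
def bspline_basis(i, k, x, knots):
    """Cox-de Boor recursion for the i-th B-spline basis function of degree k."""
    if k == 0:
        return 1.0 if knots[i] <= x < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + k] != knots[i]:
        left = ((x - knots[i]) / (knots[i + k] - knots[i])
                * bspline_basis(i, k - 1, x, knots))
    if knots[i + k + 1] != knots[i + 1]:
        right = ((knots[i + k + 1] - x) / (knots[i + k + 1] - knots[i + 1])
                 * bspline_basis(i + 1, k - 1, x, knots))
    return left + right

# cubic basis over months 0..36 of follow-up, clamped at both ends
knots = [0, 0, 0, 0, 6, 12, 18, 24, 30, 36, 36, 36, 36]
n_basis = len(knots) - 3 - 1   # 9 cubic basis functions
```

Fitting a smooth trajectory then reduces to linear regression of each marker on these basis functions evaluated at the monthly measurement times.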
Sheetal Chaudhuri, John Larkin, Murilo Guedes, Yue Jiao, Peter Kotanko, Yuedong Wang, Len Usvyat, Jeroen P Kooman
INTRODUCTION: Several factors affect the survival of end-stage kidney disease (ESKD) patients on dialysis. Machine learning (ML) models may help tackle multivariable, complex, and often nonlinear predictors of adverse clinical events in ESKD patients. In this study, we used an advanced ML method as well as a traditional statistical method to develop and compare mortality prediction models in hemodialysis (HD) patients.
MATERIALS AND METHODS: We included data from HD patients who had data across a baseline period of at least 1 year and 1 day in the internationally representative Monitoring Dialysis Outcomes (MONDO) Initiative dataset. Twenty-three input parameters considered in the model were chosen in an a priori manner. The prediction model used 1 year of baseline data to predict death in the following 3 years. The dataset was randomly split into 80% training data and 20% testing data for model development. Two different modeling techniques were used to build the mortality prediction model.
FINDINGS: A total of 95,142 patients were included in the analysis sample. The area under the receiver operating curve (AUROC) of the XGBoost ML model was 0.84 on the training data and 0.80 on the test data. The AUROC of the logistic regression model was 0.73 on training data and 0.75 on test data. Four of the top five predictors were common to both modeling strategies.
DISCUSSION: In the internationally representative MONDO data for HD patients, we describe the development of an ML model and a traditional statistical model suitable for classifying a prevalent HD patient's 3-year risk of death. While both models had a reasonably high AUROC, the ML model was able to identify hematocrit (HCT) levels as an important risk factor for mortality. If implemented in clinical practice, such proof-of-concept models could be used to provide preemptive care for HD patients.
Hanjie Zhang, Max Botler, Jeroen P Kooman
Analysis of medical images, such as radiological images or tissue specimens, is an indispensable part of medical diagnostics. Conventionally done manually, the process can be time-consuming and prone to interobserver variability. Image classification and segmentation by deep learning strategies, predominantly convolutional neural networks, may provide a significant advance in the diagnostic process. In renal medicine, most evidence has been generated around the radiological assessment of renal abnormalities and the segmentation of renal biopsy specimens for histological analysis. In this article, the basic principles of image analysis by convolutional neural networks and their system architecture for image analysis are discussed, together with examples of their use in nephrology.
Murilo Guedes, Brian Bieber, Indranil Dasgupta, Almudena Vega, Kosaku Nitta, Steven Brunelli, John Hartman, Jochen G Raimann, Bruce M Robinson, Ronald L Pisoni
Mineral bone disorder (MBD) is a frequent consequence of chronic kidney disease, more so in patients with kidney failure treated by kidney replacement therapy. Despite the wide availability of interventions to control serum phosphate and parathyroid hormone levels, unmet gaps remain on optimal targets and best practices, leading to international practice pattern variations over time. In this Special Report, we describe international trends from the Dialysis Outcomes and Practice Patterns Study (DOPPS) for MBD biomarkers and treatments from 2002-2021, including data from a group of 7 European countries (Belgium, France, Germany, Italy, Spain, Sweden, United Kingdom), Japan, and the United States. From 2002-2012, mean phosphate levels declined in Japan (5.6 to 5.2 mg/dL), Europe (5.5 to 4.9 mg/dL), and the United States (5.7 to 5.0 mg/dL). Since then, levels rose in the United States (to mean 5.6 mg/dL, 2021), were stable in Japan (5.3 mg/dL), and declined in Europe (4.8 mg/dL). In 2021, 52% (United States), 27% (Europe), and 39% (Japan) had phosphate >5.5 mg/dL. In the United States, overall phosphate binder use was stable (80%-84% over 2015-2021), and parathyroid hormone levels rose only modestly. Although these results potentially stem from pervasive knowledge gaps in clinical practice, the noteworthy steady increase in serum phosphate in the United States over the past decades may be consequential to patient outcomes, an uncertainty that hopefully will soon be addressed by ongoing clinical trials. The DOPPS will continue to monitor international trends as new interventions and strategies ensue for MBD management in chronic kidney disease.
Thomas Lang, Adam M Zawada, Lukas Theis, Jennifer Braun, Bertram Ottillinger, Pascal Kopperschmidt, Alfred Gagel, Peter Kotanko, Manuela Stauss-Grabo, James P Kennedy, Bernard Canaud
Despite significant medical and technical improvements in dialytic renal replacement modalities, morbidity and mortality are excessively high among patients with end-stage kidney disease, and most interventional studies have yielded disappointing results. Hemodiafiltration, a dialysis method implemented in clinics many years ago that combines the two main principles of hemodialysis and hemofiltration, diffusion and convection, has had a positive impact on mortality rates, especially when delivered in a high-volume mode as a surrogate for a high convective dose. The achievement of high substitution volumes during dialysis treatments depends not only on patient characteristics but also on the dialyzer (membrane) and an adequately equipped hemodiafiltration machine. The present review article summarizes the technical aspects of online hemodiafiltration and discusses present and ongoing clinical studies with regard to hard clinical and patient-reported outcomes.
Paulo Paneque Galuzio, Alhaji Cherif
We reviewed some of the latest advancements in the use of mathematical models in nephrology. We examined two distinct categories of mathematical models that are widely used in biological research and pointed out some of their strengths and weaknesses when applied to health care, especially in the context of nephrology. A mechanistic dynamical system allows the representation of causal relations among the system variables, but at the cost of a more complex and longer development and implementation phase. Artificial intelligence/machine learning provides predictive tools that can identify correlative patterns in large data sets, but these are usually harder-to-interpret black boxes. Chronic kidney disease (CKD), a major worldwide health problem, generates copious quantities of data that can be leveraged by choosing the appropriate model; in addition, a large number of dialysis parameters must be determined at every treatment session, and these can benefit from predictive mechanistic models. The next important steps in the use of mathematical methods in medical science may lie at the intersection of these seemingly antagonistic frameworks, leveraging the strengths of each to provide better care.
Girish N Nadkarni, Peter Kotanko
No abstract available
David J Jörg, Doris H Fuertinger, Peter Kotanko
Patients with renal anemia are frequently treated with erythropoiesis-stimulating agents (ESAs), which are dynamically dosed in order to stabilize blood hemoglobin levels within a specified target range. During typical ESA treatments, a fraction of patients experience hemoglobin 'cycling' periods during which hemoglobin levels periodically over- and undershoot the target range. Here we report a specific mechanism of hemoglobin cycling, whereby cycles emerge from the patient's delayed physiological response to ESAs and concurrent ESA dose adjustments. We introduce a minimal theoretical model that can explain dynamic hallmarks of observed hemoglobin cycling events in clinical time series and elucidates how physiological factors (such as red blood cell lifespan and ESA responsiveness) and treatment-related factors (such as dosing schemes) affect cycling. These results show that in general, hemoglobin cycling cannot be attributed to patient physiology or ESA treatment alone but emerges through an interplay of both, with consequences for the design of ESA treatment strategies.
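A deliberately toy version of this interplay (invented parameters and thresholds, not the authors' model) shows how a delayed dose response plus threshold-based dose adjustment alone can generate sustained hemoglobin cycling:

```python
def simulate(weeks=200, delay=4, lo=10.0, hi=12.0):
    """Toy hemoglobin dynamics: delayed ESA effect + threshold dose control.

    All parameters (baseline 8 g/dl, clearance 0.1/week, dose effect
    0.05 g/dl per unit, 4-week delay) are illustrative inventions.
    """
    hb, dose, doses = 10.0, 10.0, [10.0] * delay
    trace = []
    for _ in range(weeks):
        # the dose controller reacts to the *current* hemoglobin level
        if hb > hi:
            dose = 0.0
        elif hb < lo:
            dose = 10.0
        doses.append(dose)
        # hemoglobin responds to the dose given `delay` weeks earlier
        hb = hb + 0.05 * doses[-delay - 1] - 0.1 * (hb - 8.0)
        trace.append(hb)
    return trace
```

Because dose changes take effect only after the delay, the controller keeps over-correcting, and hemoglobin settles into a limit cycle that repeatedly over- and undershoots the 10-12 g/dl target range.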
Peter Kotanko, Hanjie Zhang, Yuedong Wang
No abstract available
Christina H Wang, Dan Negoianu, Hanjie Zhang, Sabrina Casper, Jesse Y Hsu, Peter Kotanko, Jochen Raimann, Laura M Dember
KEY POINTS: Directly studying the plasma refill rate (PRR) during hemodialysis (HD) can offer insight into physiologic mechanisms that change throughout HD. The PRR at the start of and during HD is associated with intradialytic hypotension, independent of ultrafiltration rate. A rising PRR during HD may be an early indicator of compensatory mechanisms for impending circulatory instability.
BACKGROUND: Attaining the optimal balance between achieving adequate volume removal and preserving organ perfusion is a challenge for patients receiving maintenance hemodialysis (HD). Current strategies to guide ultrafiltration are inadequate.
METHODS: We developed an approach to calculate the plasma refill rate (PRR) throughout HD using hematocrit and ultrafiltration data in a retrospective cohort of patients receiving maintenance HD at 17 dialysis units from January 2017 to October 2019. We studied whether (1) the PRR is associated with traditional risk factors for hemodynamic instability, using logistic regression; (2) a low starting PRR is associated with intradialytic hypotension (IDH), using Cox proportional hazards regression; and (3) time-varying PRR throughout HD is associated with hypotension, using marginal structural modeling.
RESULTS: During 180,319 HD sessions among 2,554 patients, the PRR had high within-patient and between-patient variability. Female sex and hypoalbuminemia were associated with a low PRR at multiple time points during the first hour of HD. A low starting PRR was associated with a higher hazard of IDH, whereas a high starting PRR was protective (hazard ratio [HR], 1.26; 95% confidence interval [CI], 1.18 to 1.35 versus HR, 0.79; 95% CI, 0.73 to 0.85, respectively). However, when accounting for time-varying PRR and time-varying confounders, and compared with a moderate PRR, a consistently low PRR was associated with an increased risk of hypotension (odds ratio [OR], 1.09; 95% CI, 1.02 to 1.16), while a consistently high PRR had a stronger association with hypotension within the next 15 minutes (OR, 1.38; 95% CI, 1.30 to 1.45).
CONCLUSIONS: We present a straightforward technique to quantify plasma refill that could easily integrate with devices that monitor hematocrit during HD. Our study highlights how examining patterns of plasma refill may enhance our understanding of circulatory changes during HD, an important step toward understanding how current technology might be used to address hemodynamic instability.
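One plausible formulation of such a refill calculation (an assumption for illustration, not necessarily the authors' exact method) derives blood volume from hematocrit dilution, BV(t) = BV0 * Hct(0)/Hct(t), and applies a mass balance, PRR = UFR + dBV/dt:

```python
def plasma_refill_rates(hct, ufr, bv0=5.0, dt=0.25):
    """Estimate plasma refill rate (L/h) between consecutive hematocrit readings.

    hct: hematocrit readings (fractions) taken every `dt` hours
    ufr: ultrafiltration rates (L/h) over each interval
    bv0: assumed starting blood volume in litres (illustrative value)

    With red cell volume constant, BV(t) = bv0 * hct[0] / hct(t); fluid
    leaves the blood at the UF rate and refill replaces it, so
    PRR = UFR + dBV/dt.
    """
    bv = [bv0 * hct[0] / h for h in hct]
    return [ufr[i] + (bv[i + 1] - bv[i]) / dt for i in range(len(hct) - 1)]
```

When hematocrit is flat, blood volume is constant and refill exactly offsets ultrafiltration; a rising hematocrit (hemoconcentration) signals refill lagging behind fluid removal.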
Rakesh Malhotra, Sina Rahimi, Ushma Agarwal, Ronit Katz, Ujjala Kumar, Pranav S Garimella, Vineet Gupta, Tushar Chopra, Peter Kotanko, T Alp Ikizler, Britta Larsen, Lisa Cadmus-Bertram, Joachim H Ix
RATIONALE & OBJECTIVE: People with end-stage kidney disease (ESKD) have very low physical activity, and the degree of inactivity is strongly associated with morbidity and mortality. We assessed the feasibility and effectiveness of a 12-week intervention coupling a wearable activity tracker (Fitbit) with structured feedback coaching, versus a wearable activity tracker alone, on changes in physical activity in hemodialysis patients.
STUDY DESIGN: Randomized controlled trial.
SETTING & PARTICIPANTS: 55 participants with ESKD receiving hemodialysis who were able to walk with or without assistive devices, recruited from a single academic hemodialysis unit between January 2019 and April 2020.
INTERVENTIONS: All participants wore a Fitbit Charge 2 tracker for a minimum of 12 weeks. Participants were randomly assigned 1:1 to the wearable activity tracker plus a structured feedback intervention or to the wearable activity tracker alone. The structured feedback group was counseled weekly on steps achieved after randomization.
OUTCOME: The outcome was step count, and the main parameter of interest was the absolute change in daily step count, averaged per week, from baseline to completion of the 12-week intervention. In the intention-to-treat analysis, mixed-effects linear regression was used to evaluate the change in daily step count from baseline to 12 weeks in both arms.
RESULTS: Of 55 participants, 46 completed the 12-week intervention (23 per arm). The mean age was 62 (±14 SD) years; 44% were Black, and 36% were Hispanic. At baseline, step count (structured feedback intervention: 3,704 [1,594] vs. wearable activity tracker alone: 3,808 [1,890]) and other participant characteristics were balanced between the arms. We observed a larger change in daily step count in the structured feedback arm at 12 weeks relative to the wearable activity tracker alone arm (Δ 920 [±580 SD] versus Δ 281 [±186 SD] steps; between-group difference Δ 639 [±538 SD] steps; P < 0.05).
LIMITATIONS: Single-center study and small sample size.
CONCLUSION: This pilot randomized controlled trial demonstrated that structured feedback coupled with a wearable activity tracker led to a greater daily step count that was sustained over 12 weeks relative to a wearable activity tracker alone. Future studies are required to determine the longer-term sustainability of the intervention and its potential health benefits in hemodialysis patients.
FUNDING: Grants from industry (Satellite Healthcare) and government (National Institute of Diabetes and Digestive and Kidney Diseases [NIDDK]).
TRIAL REGISTRATION: Registered at ClinicalTrials.gov with study number NCT05241171.
Amir Sapkota, Peter Kotanko
No abstract available
Karlien J Ter Meulen, Xiaoling Ye, Yuedong Wang, Len A Usvyat, Frank M van der Sande, Constantijn J Konings, Peter Kotanko, Jeroen P Kooman, Franklin W Maddux
KEY POINTS: An increase in serum phosphate variability is an independent risk factor for mortality. The effect of a positive directional range (DR) is most pronounced in patients with high serum phosphate levels, whereas the effect of a negative DR is most pronounced in patients with low serum phosphate and/or low serum albumin.
BACKGROUND: In maintenance hemodialysis (HD) patients, previous studies have shown that serum phosphate levels have a bidirectional relation to outcome. Less is known about the temporal dynamics of serum phosphate in relation to outcome. We aimed to further explore the relation between serum phosphate variability and all-cause mortality.
METHODS: All adult incident HD patients treated in US Fresenius Kidney Care clinics between January 2010 and October 2018 were included. The baseline period was defined as the 6 months after initiation of HD, and months 7-18 as the follow-up period. All-cause mortality was recorded during the follow-up period. The primary metric of variability was the directional range (DR), the difference between the largest and smallest values within a time period; the DR was positive when the smallest value preceded the largest and negative otherwise. Cox proportional hazards models with spline terms were applied to explore the association between phosphate, DR, and all-cause mortality. In addition, tensor product smoothing splines were computed to further elucidate the interactions of phosphate, DR, and all-cause mortality.
RESULTS: We included 302,613 patients. Baseline phosphate was 5.1 ± 1.2 mg/dl, and the mean DR was +0.6 ± 3.3 mg/dl. Across different levels of phosphate, a higher DR of phosphate was associated with a higher risk of all-cause mortality. In patients with lower levels of phosphate and serum albumin, the effect of a negative DR was most pronounced, whereas in patients with higher phosphate levels, a positive DR was related to increased mortality.
CONCLUSIONS: Higher variability of serum phosphate is related to mortality at all levels of phosphate, especially at lower levels with a negative DR and at low serum albumin levels. This could possibly reflect dietary intake in patients who are already inflamed or malnourished, in whom a further reduction in serum phosphate should prompt nutritional evaluation.
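The directional range itself is a one-liner; the sketch below (function name invented) captures the sign convention described above:

```python
def directional_range(values):
    """Signed range of a series of phosphate values.

    Positive if the minimum precedes the maximum (values rising),
    negative if the maximum precedes the minimum (values falling).
    """
    hi, lo = max(values), min(values)
    dr = hi - lo
    return dr if values.index(lo) <= values.index(hi) else -dr
```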
Jonathan S Chávez-Iñiguez, Jochen G Raimann
No abstract available
Nadja Grobe, Josef Scheiber, Hanjie Zhang, Christian Garbe, Xiaoling Wang
Omics applications in nephrology may have relevance in the future to improve the clinical care of kidney disease patients. In the short term, patients will benefit from specific measurement and computational analyses around biomarkers identified at the various omics levels. In the mid and long term, these approaches will need to be integrated into a holistic representation of the kidney and all its influencing factors for individualized patient care. Research has produced robust data justifying the application of omics for better understanding, risk stratification, and individualized treatment of kidney disease patients. Despite these advances in the research setting, there is still a lack of evidence showing the combination of omics technologies with artificial intelligence and its application in clinical diagnostics and the care of patients with kidney disease.
Richard V Remigio, Hyeonjin Song, Jochen G Raimann, Peter Kotanko, Frank W Maddux, Rachel A Lasky, Xin He, Amir Sapkota
BACKGROUND: Nonadherence to hemodialysis appointments can result in health complications that influence morbidity and mortality. We examined the association between different types of inclement weather and hemodialysis appointment adherence.
METHODS: We analyzed health records of 60,135 patients with kidney failure who received in-center hemodialysis treatment at Fresenius Kidney Care clinics across Northeastern US counties during 2001-2019. County-level daily meteorological data on rainfall, hurricane and tropical storm events, snowfall, snow depth, and wind speed were extracted from National Oceanic and Atmospheric Administration data sources. A time-stratified case-crossover study design with conditional Poisson regression was used to estimate the effect of inclement weather exposures within the Northeastern US region. We applied a distributed lag nonlinear model framework to evaluate the delayed effect of inclement weather for up to 1 week.
RESULTS: We observed positive associations between inclement weather and missed appointments (rainfall, hurricane and tropical storm, snowfall, snow depth, and wind advisories) compared with non-inclement weather days. The risk of missed appointments was most pronounced on the day of inclement weather (lag 0) for rainfall (incidence rate ratio [RR], 1.03 per 10 mm of rainfall; 95% confidence interval [CI], 1.02 to 1.03) and snowfall (RR, 1.02; 95% CI, 1.01 to 1.02). Over 7 days (lags 0-6), hurricane and tropical storm exposures were associated with a 55% higher risk of missed appointments (RR, 1.55; 95% CI, 1.22 to 1.98). Similarly, 7-day cumulative exposure to sustained wind advisories was associated with a 29% higher risk (RR, 1.29; 95% CI, 1.25 to 1.31), and wind gust advisories with a 34% higher risk (RR, 1.34; 95% CI, 1.29 to 1.39), of missed appointments.
CONCLUSIONS: Inclement weather was associated with a higher risk of missed hemodialysis appointments in the Northeastern United States. Furthermore, the association between inclement weather and missed appointments persisted for several days, depending on the type of inclement weather.
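The lag structure behind a distributed-lag analysis can be sketched as a simple lag matrix (real DLNM software, such as the R dlnm package, additionally applies basis constraints across lags; this unconstrained simplification is ours):

```python
def lag_matrix(exposure, max_lag=6):
    """One row per day; columns hold exposure on the same day (lag 0)
    back through `max_lag` days, padding with 0.0 before the series start."""
    rows = []
    for t in range(len(exposure)):
        rows.append([exposure[t - l] if t - l >= 0 else 0.0
                     for l in range(max_lag + 1)])
    return rows
```

Regressing missed-appointment counts on these columns then yields one coefficient per lag, whose sum approximates the 7-day cumulative effect reported above.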
Ariella Mermelstein, Jochen G Raimann, Yuedong Wang, Peter Kotanko, John T Daugirdas
BACKGROUND: We hypothesized that the association of ultrafiltration rate with mortality in hemodialysis patients was differentially affected by weight and sex, and we sought to derive a sex- and weight-indexed ultrafiltration rate measure that captures the differential effects of these parameters on the association of ultrafiltration rate with mortality.
METHODS: Data were analyzed from the US Fresenius Kidney Care (FKC) database for 1 year after patient entry into an FKC dialysis unit (baseline) and over 2 years of follow-up for patients receiving thrice-weekly in-center hemodialysis. To investigate the joint effect of baseline-year ultrafiltration rate and postdialysis weight on survival, we fit Cox proportional hazards models using bivariate tensor product spline functions and constructed contour plots of weight-specific mortality hazard ratios over the entire range of ultrafiltration rate values and postdialysis weights (W).
RESULTS: In the 396,358 patients studied, the average ultrafiltration rate in ml/h was related to postdialysis weight (W, in kg) as 3W + 330. Ultrafiltration rates associated with 20% or 40% higher weight-specific mortality risk were 3W + 500 and 3W + 630 ml/h, respectively, and were 70 ml/h higher in men than in women. Nineteen percent and 7.5% of patients exceeded the ultrafiltration rates associated with a 20% or 40% higher mortality risk, respectively. Low ultrafiltration rates were associated with subsequent weight loss. Ultrafiltration rates associated with a given mortality risk were lower in older, high-body-weight patients and higher in patients on dialysis for more than 3 years.
CONCLUSIONS: Ultrafiltration rates associated with various levels of higher mortality risk depend on body weight, but not in a 1:1 ratio, and differ between men and women, in older high-body-weight patients, and in high-vintage patients.
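The reported fits can be restated as a small helper (a direct arithmetic restatement of the abstract's numbers; the function name and interface are invented):

```python
def ufr_thresholds(weight_kg, male=False):
    """Ultrafiltration rates (ml/h) from the reported fits: the cohort
    average, and the levels associated with 20% and 40% higher
    weight-specific mortality risk (risk thresholds ran ~70 ml/h
    higher in men than in women)."""
    sex_offset = 70 if male else 0
    return {
        "average": 3 * weight_kg + 330,
        "risk+20%": 3 * weight_kg + 500 + sex_offset,
        "risk+40%": 3 * weight_kg + 630 + sex_offset,
    }
```

For a 70-kg woman, for example, this gives an average of 540 ml/h and risk thresholds of 710 and 840 ml/h.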
Juntao Duan, Hanmo Li, Xiaoran Ma, Hanjie Zhang, Rachel Lasky, Caitlin K Monaghan, Sheetal Chaudhuri, Len A Usvyat, Mengyang Gu, Wensheng Guo, Peter Kotanko, Yuedong Wang
BACKGROUND: The coronavirus disease 2019 (COVID-19) pandemic has created more devastation among dialysis patients than among the general population. Patient-level prediction models for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection are crucial for the early identification of patients to prevent and mitigate outbreaks within dialysis clinics. As the COVID-19 pandemic evolves, it is unclear whether previously built prediction models are still sufficiently effective.
METHODS: We developed a machine learning (XGBoost) model to predict, during the incubation period, a SARS-CoV-2 infection that is subsequently diagnosed after 3 or more days. We used data from multiple sources, including demographic, clinical, treatment, laboratory, and vaccination information from a national network of hemodialysis clinics; socioeconomic information from the Census Bureau; and county-level COVID-19 infection and mortality information from state and local health agencies. We created prediction models and evaluated their performance on a rolling basis to investigate the evolution of prediction power and risk factors.
RESULTS: From April 2020 to August 2020, our machine learning model achieved an area under the receiver operating characteristic curve (AUROC) of 0.75, an improvement of over 0.07 from a previously developed machine learning model published in Kidney360 in 2021. As the pandemic evolved, the prediction performance deteriorated and fluctuated more, with the lowest AUROC of 0.6 in December 2021 and January 2022. Over the whole study period (April 2020 to February 2022), fixing the false-positive rate at 20%, our model was able to detect 40% of the positive patients. We found that features derived from local infection information reported by the Centers for Disease Control and Prevention (CDC) were the most important predictors, and vaccination status was a useful predictor as well. Whether or not a patient lives in a nursing home was an effective predictor before vaccination but became less predictive after vaccination.
CONCLUSION: The dynamics of the prediction model change frequently as the pandemic evolves. County-level infection information and vaccination information are crucial for the success of early COVID-19 prediction models. Our results show that the proposed model can effectively identify SARS-CoV-2 infections during the incubation period. Prospective studies are warranted to explore the application of such prediction models in daily clinical practice.
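A rolling evaluation scheme of the kind described here, train on one time window, test on the next, and track AUROC over time, can be sketched as follows. This is an illustration on synthetic data, using scikit-learn's GradientBoostingClassifier as a stand-in for XGBoost; none of the variables correspond to the study's actual features:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic stand-in for patient-level features (e.g., labs, county infection rates).
n, p = 2000, 10
X = rng.normal(size=(n, p))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.5, size=n) > 0).astype(int)

# Rolling evaluation: train on one window, test on the next.
window = 500
aurocs = []
for start in range(0, n - 2 * window + 1, window):
    train = slice(start, start + window)
    test = slice(start + window, start + 2 * window)
    model = GradientBoostingClassifier(random_state=0).fit(X[train], y[train])
    scores = model.predict_proba(X[test])[:, 1]
    aurocs.append(roc_auc_score(y[test], scores))

print([round(a, 2) for a in aurocs])
```

In the study, a drop in the rolling AUROC is exactly what signaled that the model needed retraining as the pandemic evolved.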
Armando Armenta-Alvarez, Salvador Lopez-Gil, Iván Osuna, Nadja Grobe, Xia Tao, Gabriela Ferreira Dias, Xiaoling Wang, Joshua Chao, Jochen G Raimann, Stephan Thijssen, Hector Perez-Grovas, Bernard Canaud, Peter Kotanko, Magdalena Madero
KEY POINTS: HDF and MCO have shown greater clearance of middle-size uremic solutes in comparison with high-flux dialyzers; MCO has never been studied in HDF. MCO in HDF does not increase the clearance of B2M and results in a higher loss of albumin.
BACKGROUND: Middle molecule removal and albumin loss have been studied in medium cutoff (MCO) membranes on hemodialysis (HD). It is unknown whether hemodiafiltration (HDF) with MCO membranes provides additional benefit. We aimed to compare the removal of small solutes, β2-microglobulin (B2M), albumin, and total proteins between MCO and high-flux (HFX) membranes with both HD and HDF.
METHODS: The cross-over study comprised 4 weeks, one each with postdilutional HDF using HFX (HFX-HDF), MCO (MCO-HDF), HD with HFX (HFX-HD), and MCO (MCO-HD). MCO and HFX differ with respect to several characteristics, including membrane composition, pore size distribution, and surface area (HFX, 2.5 m2; MCO, 1.7 m2). There were two study treatments per week, one after the long interdialytic interval and another midweek. Reduction ratios of vitamin B12, B2M, phosphate, uric acid, and urea corrected for hemoconcentration were computed. Albumin and total protein losses during the treatment were quantified from dialysate samples.
RESULTS: Twelve anuric patients were studied (six female; 44 ± 19 years; dialysis vintage 35.2 ± 28 months). Blood flow was 369 ± 23 ml/min, dialysate flow was 495 ± 61 ml/min, and ultrafiltration volume was 2.8 ± 0.74 L. No significant differences were found in the removal of B2M, vitamin B12, and water-soluble solutes between dialytic modalities and dialyzers. Albumin and total protein losses were significantly higher in MCO groups than in HFX groups within the same modality. HDF groups had significantly higher albumin and total protein losses than HD groups with the same dialyzer. MCO-HDF showed the highest protein loss among all groups.
CONCLUSIONS: MCO-HD is not superior to HFX-HD and HFX-HDF for either middle molecule or water-soluble solute removal. Protein loss was more pronounced with MCO than with HFX in both HD and HDF modalities. MCO-HDF has no additional benefit regarding removal of B2M but resulted in greater protein loss than MCO-HD.
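A reduction ratio of the kind computed in this study is simply 1 − Cpost/Cpre. The sketch below is illustrative; the study's exact hemoconcentration correction is not reproduced, so the optional correction factor is a placeholder:

```python
def reduction_ratio(c_pre, c_post, post_correction=1.0):
    """Fractional reduction of a solute concentration across one treatment.

    post_correction is an optional multiplicative factor applied to the
    postdialysis concentration (e.g., to account for hemoconcentration);
    the study's exact correction formula is not reproduced here.
    """
    if c_pre <= 0:
        raise ValueError("predialysis concentration must be positive")
    return 1.0 - (c_post * post_correction) / c_pre

# Example: urea falls from 120 to 36 mg/dl over a session
print(round(reduction_ratio(120, 36), 2))  # 0.7
```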
Priscila Preciado, Laura Rosales Merlo, Hanjie Zhang, Jeroen P Kooman, Frank M van der Sande, Peter Kotanko
INTRODUCTION: In maintenance hemodialysis (HD) patients, low central venous oxygen saturation (ScvO2) and a small decline in relative blood volume (RBV) have been associated with adverse outcomes. Here we explore the joint association between ScvO2 and RBV change in relation to all-cause mortality.
METHODS: We conducted a retrospective study in maintenance HD patients with central venous catheters as vascular access. During a 6-month baseline period, Crit-Line (Fresenius Medical Care, Waltham, MA) was used to continuously measure intradialytic ScvO2 and hematocrit-based RBV. We defined four groups based on the median RBV change and median ScvO2. Patients with ScvO2 above median and RBV change below median were defined as the reference. The follow-up period was 3 years. We constructed a Cox proportional hazards model with adjustment for age, diabetes, and dialysis vintage to assess the association between ScvO2, RBV change, and all-cause mortality during follow-up.
FINDINGS: Baseline comprised 5231 dialysis sessions in 216 patients. The median RBV change was -5.5% and the median ScvO2 was 58.8%. During follow-up, 44 patients (20.4%) died. In the adjusted model, all-cause mortality was highest in patients with ScvO2 below median and RBV change above median (HR 6.32; 95% confidence interval [CI] 1.37-29.06), followed by patients with ScvO2 below median and RBV change below median (HR 5.04; 95% CI 1.14-22.35), and ScvO2 above median and RBV change above median (HR 4.52; 95% CI 0.95-21.36).
DISCUSSION: Concurrent combined monitoring of intradialytic ScvO2 and RBV change may provide additional insight into a patient's circulatory status. Patients with low ScvO2 and small changes in RBV may represent a specifically vulnerable group at particularly high risk for adverse outcomes, possibly related to poor cardiac reserve and fluid overload.
Peter Kotanko, Girish N Nadkarni
No abstract available
Zahin Haq, Xin Wang, Qiuqiong Cheng, Gabriela F Dias, Christoph Moore, Dorothea Piecha, Peter Kotanko, Chih-Hu Ho, Nadja Grobe
Bisphenol A (BPA)-based materials are used in the manufacturing of hemodialyzers, including their polycarbonate (PC) housings and polysulfone (PS) membranes. As concerns for BPA's adverse health effects rise, the regulation on BPA exposure is becoming more rigorous. Therefore, BPA alternatives, such as Bisphenol S (BPS), are increasingly used. It is important to understand the patient risk of BPA and BPS exposure through dialyzer use during hemodialysis. Here, we report the bisphenol levels in extractables and leachables obtained from eight dialyzers currently on the market, including high-flux and medium cut-off membranes. A targeted liquid chromatography-mass spectrometry strategy utilizing stable isotope-labeled internal standards provided reliable data for quantitation with the standard addition method. BPA ranging from 0.43 to 32.82 µg/device and BPS ranging from 0.02 to 2.51 µg/device were detected in dialyzers made with BPA- and BPS-containing materials, except for the novel FX CorAL 120 dialyzer. BPA and BPS were also not detected in bloodline controls and cellulose-based membranes. Based on the currently established tolerable intake (6 µg/kg/day), the resulting margin of safety indicates that adverse effects are unlikely to occur in hemodialysis patients exposed to BPA and BPS quantified herein. With increasing availability of new data and information about the toxicity of BPA and BPS, the patient safety limits of BPA and BPS in those dialyzers may need a re-evaluation in the future.
Ercan Ok, Cenk Demirci, Gulay Asci, Kivanc Yuksel, Fatih Kircelli, Serkan Kubilay Koc, Sinan Erten, Erkan Mahsereci, Ali Rıza Odabas, Stefano Stuard, Franklin W Maddux, Jochen G Raimann, Peter Kotanko, Peter G Kerr, Christopher T Chan
INTRODUCTION: More frequent and/or longer hemodialysis (HD) has been associated with improvements in numerous clinical outcomes in patients on dialysis. Home HD (HHD), which allows more frequent and/or longer dialysis with lower cost and flexibility in treatment planning, is not widely used worldwide. Although retrospective studies have indicated better survival with HHD, this issue remains controversial. In this multicenter study, we compared thrice-weekly extended HHD with in-center conventional HD (ICHD) in a large patient population with long-term follow-up.
METHODS: We matched 349 patients starting HHD between 2010 and 2014 with 1047 concurrent patients on ICHD by using propensity scores. Patients were followed up from their respective baseline until September 30, 2018. The primary outcome was overall survival. Secondary outcomes were technique survival; hospitalization; and changes in clinical, laboratory, and medication parameters.
RESULTS: The mean duration of dialysis sessions was 418 ± 54 minutes in HHD and 242 ± 10 minutes in patients on ICHD. The all-cause mortality rate was 3.76 and 6.27 per 100 patient-years in the HHD and ICHD groups, respectively. In the intention-to-treat analysis, HHD was associated with a 40% lower risk of all-cause mortality than ICHD (hazard ratio [HR] = 0.60; 95% confidence interval [CI] 0.45 to 0.80; P < 0.001). In HHD, the 5-year technique survival was 86.5%. HHD provided better phosphate and blood pressure (BP) control, improvements in nutrition and inflammation, and reductions in hospitalization days and medication requirements.
CONCLUSION: These results indicate that extended HHD is associated with higher survival and better outcomes compared with ICHD.
Xiaoling Wang, Ohnmar Thwin, Zahin Haq, Zijun Dong, Lela Tisdale, Lemuel Rivera Fuentes, Nadja Grobe, Peter Kotanko
BACKGROUND: Exhaled SARS-CoV-2 can be detected on face masks. We compared tests for SARS-CoV-2 RNA on worn face masks and matched saliva samples.
METHODS: We conducted this prospective, observational, case-control study between December 2021 and March 2022. Cases comprised 30 in-center hemodialysis patients with a recent COVID-19 diagnosis. Controls comprised 13 hemodialysis patients and 25 clinic staff without COVID-19 during the study period and the preceding 2 months. Disposable 3-layer masks were collected after being worn for 4 hours, together with concurrent saliva samples. The ThermoFisher COVID-19 Combo Kit (A47814) was used for RT-PCR testing.
RESULTS: Mask and saliva testing specificities were 99% and 100%, respectively. Test sensitivity was 62% for masks and 81% for saliva (p = 0.16). Median viral RNA shedding duration was 11 days and was longer in immunocompromised versus non-immunocompromised patients (22 vs. 11 days, p = 0.06, log-rank test).
CONCLUSION: While SARS-CoV-2 testing on worn masks appears to be less sensitive than saliva testing, it may be a preferred screening method for individuals who are mandated to wear masks yet averse to more invasive sampling. However, optimized RNA extraction methods and automated procedures are warranted to increase test sensitivity and scalability. We corroborated longer viral RNA shedding in immunocompromised patients.
Ana Catalina Alvarez-Elias, Barry M Brenner, Valerie A Luyckx
PURPOSE OF REVIEW: The consequences of climate change, including heat and extreme weather events, impact kidney function in adults and children. The impacts of climate change on kidney development during gestation, and thereby on kidney function later in life, have been poorly described. Clinical evidence is summarized to highlight possible associations between climate change and nephron mass.
RECENT FINDINGS: Pregnant women are vulnerable to the effects of climate change, being less able to thermoregulate, more sensitive to the effects of dehydration, and more susceptible to infections. Exposure to heat, wildfire smoke, drought, floods, and climate-related infections is associated with low birth weight, preterm birth, and preeclampsia. These factors are associated with reduced nephron numbers, kidney dysfunction, and higher blood pressure in offspring in later life. Exposure to air pollution is associated with higher blood pressure in children and has variable effects on estimated glomerular filtration rate.
SUMMARY: Climate change has important impacts on pregnant women and their unborn children. Being born too small or too soon is associated with a lifetime risk of kidney disease. Climate change may therefore have a dual effect, impacting fetal kidney development and contributing to cumulative postnatal kidney injury. The impact on the population kidney health of future generations may be significant.
Hanjie Zhang, Lin-Chun Wang, Sheetal Chaudhuri, Aaron Pickering, Len Usvyat, John Larkin, Pete Waguespack, Zuwen Kuang, Jeroen P Kooman, Franklin W Maddux, Peter Kotanko
BACKGROUND: In maintenance hemodialysis patients, intradialytic hypotension (IDH) is a frequent complication that has been associated with poor clinical outcomes. Prediction of IDH may facilitate timely interventions and eventually reduce IDH rates.
METHODS: We developed a machine learning model to predict IDH in in-center hemodialysis patients 15-75 min in advance. IDH was defined as systolic blood pressure (SBP) <90 mmHg. Demographic, clinical, treatment-related, and laboratory data were retrieved from electronic health records and merged with intradialytic machine data that were sent in real time to the cloud. For model development, dialysis sessions were randomly split into training (80%) and testing (20%) sets. The area under the receiver operating characteristic curve (AUROC) was used as a measure of the model's predictive performance.
RESULTS: We utilized data from 693 patients who contributed 42 656 hemodialysis sessions and 355 693 intradialytic SBP measurements. IDH occurred in 16.2% of hemodialysis treatments. Our model predicted IDH 15-75 min in advance with an AUROC of 0.89. Top IDH predictors were the most recent intradialytic SBP and IDH rate, as well as the mean nadir SBP of the previous 10 dialysis sessions.
CONCLUSIONS: Real-time prediction of IDH during an ongoing hemodialysis session is feasible and has clinically actionable predictive performance. Whether and to what degree this predictive information facilitates the timely deployment of preventive interventions and translates into lower IDH rates and improved patient outcomes warrants prospective studies.
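The IDH definition used in this study (any intradialytic SBP below 90 mmHg) and a treatment-level IDH rate can be illustrated with a minimal sketch (the data and helper names are invented):

```python
def session_has_idh(sbp_readings, threshold=90):
    """Flag a dialysis session as IDH if any intradialytic SBP (mmHg) drops below threshold."""
    return any(sbp < threshold for sbp in sbp_readings)

# Illustrative intradialytic SBP series (mmHg) for three sessions.
sessions = [
    [142, 128, 110, 88, 95],    # dips below 90 -> IDH
    [135, 130, 124, 118, 112],  # stays above 90
    [120, 104, 96, 92, 91],     # stays above 90
]
idh_rate = sum(session_has_idh(s) for s in sessions) / len(sessions)
print(f"IDH rate: {idh_rate:.1%}")  # IDH rate: 33.3%
```

The predictive model itself operated on far richer inputs (recent SBP trajectories, prior-session nadirs, labs); this sketch only shows the outcome labeling.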
Susie Q Lew, Gulay Asci, Paul A Rootjes, Ercan Ok, Erik L Penne, Ramin Sam, Antonios H Tzamaloukas, Todd S Ing, Jochen G Raimann
The relationship between sodium, blood pressure, and extracellular volume is nowhere more pronounced or complex than in the dialysis patient. We review patients' sources of sodium exposure in the form of dietary salt intake, medication administration, and the dialysis treatment itself. In addition, we discuss how dialysis modality, hemodialysis type, and dialysis fluid sodium concentration affect blood pressure, intradialytic symptoms, and interdialytic weight gain, and thereby patient outcomes. We review whether sodium restriction (reduced salt intake), alteration of the dialysis fluid sodium concentration, and the different dialysis types have any impact on blood pressure, intradialytic symptoms, and interdialytic weight gain.
Sunpeng Duan, Yuedong Wang, Peter Kotanko, Hanjie Zhang
BACKGROUND: In-center hemodialysis entails repeated interactions between patients and clinic staff, potentially facilitating the spread of COVID-19. We examined whether in-center hemodialysis is associated with the spread of SARS-CoV-2 between patients.
METHODS: Our retrospective analysis comprised all patients receiving hemodialysis in four New York City clinics between March 12th, 2020, and August 31st, 2022. Treatment-level clinic ID, dialysis shift, dialysis machine station, and date of COVID-19 diagnosis by RT-PCR were documented. To estimate the donor-to-potential-recipient exposure ("donor" being the COVID-19-positive patient, denoted "COV-Pos"; "potential recipient" being other susceptible patients in the same shift), we obtained the spatial coordinates of each dialysis station, calculated the Euclidean distances between stations, and weighted the exposure by proximity between them. For each donor, we estimated the donor-to-potential-recipient exposure of all potential recipients dialyzed in the same shift and accumulated the exposure over time within the 'COV-Pos infectious period' as cumulative exposures. The 'COV-Pos infectious period' started 5 days before the COVID-19 diagnosis date. We deployed network analysis to assess these interactions and summarized the donor-to-potential-recipient exposure in 193 network diagrams. We fitted mixed effects logistic regression models to test whether more donor-to-potential-recipient exposure conferred a higher risk of SARS-CoV-2 infection.
RESULTS: Of 978 patients, 193 (19.7%) tested positive for COVID-19 and had contact with other patients during the COV-Pos infectious period. Network diagrams showed no evidence that more exposed patients had a higher chance of infection. This finding was corroborated by logistic mixed effect regression (donor-to-potential-recipient exposure OR: 0.63; 95% CI 0.32 to 1.17, p = 0.163). Separate analyses according to vaccination status led to materially identical results.
CONCLUSIONS: Transmission of SARS-CoV-2 between in-center hemodialysis patients is unlikely. This finding supports the effectiveness of non-pharmaceutical interventions, such as universal masking and other procedures, to control the spread of COVID-19.
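The distance-based exposure construction can be sketched as follows. The pairwise Euclidean distances follow directly from station coordinates; the 1/distance weighting is an illustrative assumption, since the abstract states only that exposure was weighted by proximity between stations:

```python
import numpy as np

# Spatial coordinates (x, y) of dialysis stations on a clinic floor plan (illustrative).
stations = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 4.0], [6.0, 8.0]])

# Pairwise Euclidean distances between all stations via broadcasting.
diff = stations[:, None, :] - stations[None, :, :]
dist = np.sqrt((diff ** 2).sum(axis=-1))

def exposure(donor, recipients, dist, eps=1e-9):
    """Proximity-weighted donor-to-potential-recipient exposure for one shift.

    The 1/distance weighting is an illustrative choice, not the study's
    published weighting function.
    """
    return {r: 1.0 / (dist[donor, r] + eps) for r in recipients}

w = exposure(0, [1, 2, 3], dist)
print({k: round(v, 3) for k, v in w.items()})
```

In the study, these per-shift exposures were then accumulated over the infectious period and fed into the mixed effects logistic regression.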
Marcus Dariva, Murilo Guedes, Vladimir Rigodon, Peter Kotanko, John W Larkin, Bruno Ferlin, Roberto Pecoits-Filho, Pasqual Barretti, Thyago Proença de Moraes
BACKGROUND: Hypertension is a leading cause of kidney failure, affects most dialysis patients, and is associated with adverse outcomes. Hypertension can be difficult to control, with dialysis modalities having differential effects on sodium and water removal. There are two main types of peritoneal dialysis (PD): automated peritoneal dialysis (APD) and continuous ambulatory peritoneal dialysis (CAPD). It is unknown whether one is superior to the other in controlling blood pressure (BP). Therefore, the aim of our study was to analyse the impact of switching between these two PD modalities on BP levels in a nationally representative cohort.
METHODS: This was a cohort study of patients on PD from 122 dialysis centres in Brazil (BRAZPD II study). Clinical and laboratory data were collected monthly throughout the study duration. We selected all patients who remained on PD for at least 6 months, with a minimum of 3 months on each modality. We compared the changes in mean systolic/diastolic blood pressure (SBP/DBP) before and after modality transition using a multilevel mixed model with patients at the first level and their clinics at the second level.
RESULTS: We analysed data from 848 patients (814 starting on CAPD and 34 starting on APD). The SBP decreased by 4 (SD 22) mmHg when transitioning from CAPD to APD (p < 0.001) and increased by 4 (SD 21) mmHg when transitioning from APD to CAPD (p = 0.38); consistent findings were seen for DBP. There was no significant change in the number of antihypertensive drugs prescribed before and after transition.
CONCLUSIONS: Transition between PD modalities seems to directly impact BP levels. Further studies are needed to confirm whether switching to APD could be an effective treatment for uncontrolled hypertension among CAPD patients.
Bernard Canaud, Andrew Davenport, Hélène Leray-Moragues, Marion Morena-Carrere, Jean Paul Cristol, Jeroen Kooman, Peter Kotanko
Chronic kidney disease poses a growing global health concern, as an increasing number of patients progress to end-stage kidney disease (ESKD) requiring kidney replacement therapy, presenting various challenges including caregiver shortages and cost-related issues. In this narrative essay, we explore innovative strategies, based on in-depth literature analysis, that may help healthcare systems face these challenges, with a focus on digital health technologies (DHTs), to enhance removal and ensure better control of a broader spectrum of uremic toxins, to optimize resources, improve care and outcomes, and empower patients. Therefore, alternative strategies, such as self-care dialysis and home-based dialysis with the support of teledialysis, need to be developed. Managing ESKD requires an improvement in patient management, emphasizing patient education, caregiver knowledge, and robust digital support systems. The solution involves leveraging DHTs to automate hemodialysis (HD), implement automated algorithm-driven controlled HD, remotely monitor patients, provide health education, and enable caregivers with data-driven decision-making. These technologies, including artificial intelligence, aim to enhance care quality, reduce practice variations, and improve treatment outcomes whilst supporting personalized kidney replacement therapy. This narrative essay offers an update on currently available DHTs used in the management of HD patients and envisions future technologies that, through digital solutions, may empower patients and more effectively support their HD treatments.
Amun G Hofmann, Suman Lama, Hanjie Zhang, Afshin Assadian, Murat Sor, Jeffrey Hymes, Peter Kotanko, Jochen Raimann
OBJECTIVE: The decision to convert from catheter to arteriovenous access is difficult yet very important. The ability to accurately predict fistula survival prior to surgery would significantly improve the decision-making process. Many previously investigated demographic and clinical features have been associated with fistula failure. However, it is not conclusively understood how reliable predictions based on these parameters are at an individual level. The aim of this study was to investigate the probability of arteriovenous fistula maturation and survival after conversion using machine learning workflows.
METHODS: A retrospective cohort study on multicentre data from a large North American dialysis organisation was conducted. The study population comprised 73 031 chronic in-centre haemodialysis patients. The dataset included 49 variables covering demographic and clinical features. Two distinct feature selection and prediction pipelines were used: LASSO regression, and Boruta followed by a random forest classifier. Predictions were made for re-conversion to catheter within one year. Additionally, all-cause mortality predictions were conducted to serve as a comparator.
RESULTS: In total, 38 151 patients (52.2%) had complete data and made up the main cohort. Sensitivity analyses were conducted in 67 421 patients (92.3%) after eliminating variables with a high proportion of missing data points. Selected features diverged between datasets and workflows. A previously failed arteriovenous access appeared to be the most stable predictor of subsequent failure. Prediction of re-conversion based on demographic and clinical information resulted in an area under the receiver operating characteristic curve (ROCAUC) between 0.541 and 0.571, whereas models predicting all-cause mortality performed considerably better (ROCAUC 0.662-0.683).
CONCLUSION: While group-level depiction of major adverse outcomes after catheter to arteriovenous fistula or graft conversion is possible using the included variables, patient-level predictions show limited performance. Factors during and after fistula creation as well as biomolecular and genetic biomarkers might be more relevant predictors of fistula survival than baseline clinical conditions.
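One of the two pipelines described, LASSO-based feature selection followed by a classifier, can be sketched on synthetic data (the study's other pipeline used Boruta; here an L1-penalized logistic regression feeds a random forest, and all features and parameters are invented):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic stand-in for demographic/clinical features; column 0 mimics a strong
# predictor such as a previously failed arteriovenous access.
n, p = 3000, 20
X = rng.normal(size=(n, p))
y = (1.2 * X[:, 0] + 0.4 * X[:, 1] + rng.normal(size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Step 1: L1-penalized logistic regression (LASSO-style) selects a sparse feature set.
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X_tr, y_tr)
selected = np.flatnonzero(lasso.coef_[0])

# Step 2: random forest trained on the selected features only.
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X_tr[:, selected], y_tr)
auc = roc_auc_score(y_te, rf.predict_proba(X_te[:, selected])[:, 1])
print(f"selected {len(selected)} features, test AUROC {auc:.2f}")
```

On this easy synthetic problem the AUROC is high; the study's point was precisely that on real baseline clinical data the analogous pipeline reached only ~0.55 for re-conversion.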
Doris H Fuertinger, Lin-Chun Wang, David J Jörg, Lemuel Rivera Fuentes, Xiaoling Ye, Sabrina Casper, Hanjie Zhang, Ariella Mermelstein, Alhaji Cherif, Kevin Ho, Jochen G Raimann, Lela Tisdale, Peter Kotanko, Stephan Thijssen
KEY POINTS: We conducted a randomized controlled pilot trial in patients on hemodialysis using physiology-based individualized anemia therapy assistance software. Patients in the group receiving erythropoiesis-stimulating agent dose recommendations from the novel software showed improvements in hemoglobin stability and erythropoiesis-stimulating agent utilization.
BACKGROUND: Anemia is common among patients on hemodialysis. Maintaining stable hemoglobin levels within predefined target levels can be challenging, particularly in patients with frequent hemoglobin fluctuations both above and below the desired targets. We conducted a multicenter, randomized controlled trial comparing our anemia therapy assistance software against a standard population-based anemia treatment protocol. We hypothesized that personalized dosing of erythropoiesis-stimulating agents (ESAs) improves hemoglobin target attainment.
METHODS: Ninety-six patients undergoing hemodialysis and receiving methoxy polyethylene glycol-epoetin beta were randomized 1:1 to the intervention group (personalized ESA dose recommendations computed by the software) or the standard-of-care group for 26 weeks. The therapy assistance software combined a physiology-based mathematical model and a model predictive controller designed to stabilize hemoglobin levels within a tight target range (10–11 g/dl). The primary outcome measure was the percentage of hemoglobin measurements within the target. Secondary outcome measures included measures of hemoglobin variability and ESA utilization.
RESULTS: The intervention group showed an improved median percentage of hemoglobin measurements within target at 47% (interquartile range, 39–58), with a 10 percentage point median difference between the two groups (95% confidence interval, 3 to 16; P = 0.008). The odds ratio of being within the hemoglobin target in the standard-of-care group compared with the group receiving personalized ESA recommendations was 0.68 (95% confidence interval, 0.51 to 0.92). The variability of hemoglobin levels decreased in the intervention group, with 45% of patients experiencing fluctuating hemoglobin levels versus 82% in the standard-of-care group. ESA usage was reduced by approximately 25% in the intervention group.
CONCLUSIONS: Our results demonstrate improved hemoglobin target attainment and reduced variability with personalized ESA recommendations from the physiology-based anemia therapy assistance software.
CLINICAL TRIAL REGISTRATION NUMBER: NCT04360902.
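The primary outcome, the percentage of hemoglobin measurements within the 10-11 g/dl target, is straightforward to compute; a minimal sketch with invented values:

```python
def pct_in_target(hgb_values, low=10.0, high=11.0):
    """Percent of hemoglobin measurements (g/dl) falling within [low, high]."""
    in_target = sum(low <= h <= high for h in hgb_values)
    return 100.0 * in_target / len(hgb_values)

# Illustrative monthly hemoglobin series for one patient (g/dl).
hgb = [9.8, 10.2, 10.6, 11.3, 10.9, 10.4, 9.9, 10.1]
print(f"{pct_in_target(hgb):.1f}% of measurements in the 10-11 g/dl target")
```

The trial's controller optimized ESA doses precisely to push this percentage up while damping the cycling visible in a series like the one above.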
Vaibhav Maheshwari, Nadja Grobe, Xin Wang, Amrish Patel, Alhaji Cherif, Xia Tao, Joshua Chao, Alexander Heide, Dejan Nikolic, Jiaming Dong, Peter Kotanko
It has been estimated that in 2010, over two million patients with end-stage kidney disease may have faced premature death due to a lack of access to affordable renal replacement therapy, mostly dialysis. To address this shortfall in dialytic kidney replacement therapy, we propose a novel, cost-effective, and low-complexity hemodialysis method called allo-hemodialysis (alloHD). With alloHD, instead of conventional hemodialysis, the blood of a patient with kidney failure flows through the dialyzer's dialysate compartment counter-currently to the blood of a healthy subject (referred to as a "buddy") flowing through the blood compartment. Along the concentration and hydrostatic pressure gradients, uremic solutes and excess fluid are transferred from the patient to the buddy and subsequently excreted by the healthy kidneys of the buddy. We developed a mathematical model of alloHD to systematically explore dialysis adequacy in terms of weekly standard urea Kt/V. We showed that in the case of an anuric child (20 kg), four 4 h alloHD sessions are sufficient to attain a weekly standard Kt/V of >2.0. In the case of an anuric adult patient (70 kg), six 4 h alloHD sessions are necessary. As a next step, we designed and built an alloHD machine prototype that comprises off-the-shelf components. We then used this prototype to perform ex vivo experiments to investigate the transport of solutes, including urea, creatinine, and protein-bound uremic retention products, and to quantitate the accuracy and precision of the machine's ultrafiltration control. These experiments showed that alloHD performed as expected, encouraging future in vivo studies in animals with and without kidney failure.
Mariana Murea, Jochen G Raimann, Jasmin Divers, Harvey Maute, Cassandra Kovach, Emaad M Abdel-Rahman, Alaa S Awad, Jennifer E Flythe, Samir C Gautam, Vandana D Niyyar, Glenda V Roberts, Nichole M Jefferson, Islam Shahidul, Ucheoma Nwaozuru, Kristie L Foley, Erica J Trembath, Merlo L Rosales, Alison J Fletcher, Sheikh I Hiba, Anne Huml, Daphne H Knicely, Irtiza Hasan, Bhaktidevi Makadia, Raman Gaurav, Janice Lea, Paul T Conway, John T Daugirdas, Peter Kotanko
BACKGROUND: Most patients starting chronic in-center hemodialysis (HD) receive conventional hemodialysis (CHD) with three sessions per week targeting specific biochemical clearance. Observational studies suggest that patients with residual kidney function can safely be treated with incremental prescriptions of HD, starting with less frequent sessions and later adjusting to thrice-weekly HD. This trial aims to show objectively that clinically matched incremental HD (CMIHD) is non-inferior to CHD in eligible patients.
METHODS: An unblinded, parallel-group, randomized controlled trial will be conducted across diverse healthcare systems and dialysis organizations in the USA. Adult patients initiating chronic HD at participating centers will be screened. Eligibility criteria include receipt of fewer than 18 HD treatments and residual kidney function, defined as kidney urea clearance ≥3.5 mL/min/1.73 m2 and urine output ≥500 mL/24 h. The 1:1 randomization, stratified by site and dialysis vascular access type, assigns patients to either CMIHD (intervention group) or CHD (control group). The CMIHD group will be treated with twice-weekly HD and adjuvant pharmacologic therapy (i.e., oral loop diuretics, sodium bicarbonate, and potassium binders). The CHD group will receive thrice-weekly HD according to usual care. Throughout the study, patients undergo timed urine collection and fill out questionnaires. CMIHD will progress to thrice-weekly HD based on clinical manifestations or changes in residual kidney function. Caregivers of enrolled patients are invited to complete semi-annual questionnaires. The primary outcome is a composite of patients' all-cause death, hospitalizations, or emergency department visits at 2 years. Secondary outcomes include patient- and caregiver-reported outcomes. We aim to enroll 350 patients, which provides ≥85% power to detect an incidence rate ratio (IRR) of 0.9 between CMIHD and CHD with an IRR non-inferiority margin of 1.20 (α = 0.025, one-tailed test, 20% dropout rate, average of 2.06 years of HD per patient participant), and 150 caregiver participants (of enrolled patients).
DISCUSSION: Our proposal challenges the status quo of HD care delivery. Our overarching hypothesis posits that CMIHD is non-inferior to CHD. If successful, the results will positively impact one of the highest-burdened patient populations and their caregivers.
TRIAL REGISTRATION: Clinicaltrials.gov NCT05828823. Registered on 25 April 2023.
Yoshitsugu Obi, Jochen G Raimann, Kamyar Kalantar-Zadeh, Mariana Murea
Individuals afflicted with advanced kidney dysfunction who require dialysis for medical management exhibit different degrees of native kidney function, called residual kidney function (RKF), ranging from nil to appreciable levels. The primary focus of this manuscript is to delve into the concept of RKF, a pivotal yet under-represented topic in nephrology. To begin, we unpack the definition and intrinsic nature of RKF. We then juxtapose the efficiency of RKF against that of hemodialysis in preserving homeostatic equilibrium and facilitating physiological functions. Given the complex interplay of RKF and overall patient health, we shed light on the extent of its influence on patient outcomes, particularly in those living with advanced kidney dysfunction and on dialysis. This manuscript subsequently presents methodologies and measures to assess RKF, concluding with the potential benefits of targeted interventions aimed at preserving RKF.
Xiaoran Ma, Wensheng Guo, Mengyang Gu, Len Usvyat, Peter Kotanko, Yuedong Wang
Some patients with COVID-19 show changes in signs and symptoms such as temperature and oxygen saturation days before being positively tested for SARS-CoV-2, while others remain asymptomatic. It is important to identify these subgroups and to understand what biological and clinical predictors are related to these subgroups. This information will provide insights into how the immune system may respond differently to infection and can further be used to identify infected individuals. We propose a flexible nonparametric mixed-effects mixture model that identifies risk factors and classifies patients with biological changes. We model the latent probability of biological changes using a logistic regression model and trajectories in the latent groups using smoothing splines. We developed an EM algorithm to maximize the penalized likelihood for estimating all parameters and mean functions. We evaluate our methods by simulations and apply the proposed model to investigate changes in temperature in a cohort of COVID-19-infected hemodialysis patients.
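The core EM machinery can be shown in miniature. The sketch below fits a two-component Gaussian mixture to simulated scalar temperatures — a deliberately reduced stand-in for the paper's model, which couples a logistic latent-class probability with smoothing-spline trajectories; all data here are synthetic:

```python
import math, random

def em_two_gaussians(data, iters=50):
    """EM for a two-component Gaussian mixture: a toy version of the
    latent-class idea (patients with vs. without a temperature shift)."""
    mu = [min(data), max(data)]   # crude initialization at the extremes
    sd = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior membership probability of each observation
        resp = []
        for x in data:
            d = [pi[k] / (sd[k] * math.sqrt(2 * math.pi))
                 * math.exp(-0.5 * ((x - mu[k]) / sd[k]) ** 2) for k in range(2)]
            s = d[0] + d[1]
            resp.append([d[0] / s, d[1] / s])
        # M-step: re-estimate weights, means, and standard deviations
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            sd[k] = max(0.1, math.sqrt(var))  # floor to avoid degenerate components
    return pi, mu, sd

random.seed(0)
# Synthetic cohort: 200 afebrile patients, 100 with a pre-infection shift
data = [random.gauss(36.6, 0.2) for _ in range(200)] + \
       [random.gauss(37.8, 0.3) for _ in range(100)]
pi, mu, sd = em_two_gaussians(data)
```

In the full model, the M-step additionally refits the logistic regression and penalized spline coefficients, which this scalar sketch omits.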
Zijun Dong, Lemuel Rivera Fuentes, Sharon Rao, Peter Kotanko
While life-sustaining, hemodialysis is a non-physiological treatment modality that exerts stress on the patient, primarily due to fluid shifts during ultrafiltration. Automated feedback control systems, integrated with sensors that continuously monitor bio-signals such as blood volume, can adjust hemodialysis treatment parameters, e.g., the ultrafiltration rate, in real time. These systems hold promise to mitigate hemodynamic stress, prevent intradialytic hypotension, and improve the removal of water and electrolytes in chronic hemodialysis patients. However, robust evidence supporting their clinical application remains limited. Based on an extensive literature review, we assess feedback-controlled ultrafiltration systems that have emerged over the past three decades in comparison to conventional hemodialysis treatment. We identified 28 clinical studies. Closed-loop ultrafiltration control demonstrated effectiveness in 23 of them. No adverse effects of closed-loop ultrafiltration control were reported across all trials. Closed-loop ultrafiltration control represents an important advancement towards more physiological hemodialysis. Its development is driven by innovations in real-time bio-signal monitoring, advances in control theory, and artificial intelligence. We expect these innovations will lead to widespread adoption of ultrafiltration control in the future, provided its clinical value is substantiated in adequately powered randomized controlled trials.
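The feedback principle can be sketched as a proportional controller that tapers the ultrafiltration (UF) rate as relative blood volume (RBV) approaches a preset limit. This is a generic illustration, not the algorithm of any device reviewed; the RBV limit, gain, and base rate are arbitrary:

```python
def uf_feedback_step(rbv_pct, rbv_limit=-8.0, uf_base_ml_h=800.0, gain=60.0):
    """One step of a proportional controller: scale back the ultrafiltration
    rate as relative blood volume (RBV, % change from session start)
    approaches a safety limit; pause entirely at or below the limit."""
    margin = rbv_pct - rbv_limit          # distance above the RBV floor
    if margin <= 0:
        return 0.0                        # at/below limit: pause ultrafiltration
    return min(uf_base_ml_h, gain * margin)

# As RBV falls toward the -8% limit, the prescribed UF rate tapers off
profile = [uf_feedback_step(rbv) for rbv in (0.0, -2.0, -5.0, -7.5, -9.0)]
# -> [480.0, 360.0, 180.0, 30.0, 0.0]
```

Real systems layer more on top of this (integral/derivative terms, physiological models, safety interlocks), but the monotone taper is the essential closed-loop behavior.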
Bernard Canaud, Jeroen P Kooman, Nicholas M Selby, Maarten Taal, Andreas Maierhofer, Pascal Kopperschmidt, Susan Francis, Allan Collins, Peter Kotanko
The development of maintenance hemodialysis (HD) for patients with end-stage kidney disease is a success story that continues to save many lives. Nevertheless, intermittent renal replacement therapy is also a source of recurrent stress for patients. Conventional thrice-weekly short HD is an imperfect treatment that only partially corrects uremic abnormalities, increases cardiovascular risk, and exacerbates disease burden. Alternating cycles of fluid loading associated with cardiac stretching (interdialytic phase) and then fluid unloading (intradialytic phase) likely contribute to cardiac and vascular damage. This unphysiologic treatment profile, combined with cyclic disturbances including osmotic and electrolytic shifts, may contribute to morbidity in dialysis patients and augment the health burden of treatment. As such, HD patients are exposed to multiple stressors, including cardiocirculatory, inflammatory, biologic, hypoxemic, and nutritional ones. This cascade of events can be termed the dialysis stress storm and sickness syndrome. Mitigating the cardiovascular risk and morbidity associated with conventional intermittent HD appears to be a priority for improving patient experience and reducing disease burden. In this in-depth review, we summarize the hidden effects of intermittent HD therapy and call for action to improve delivered HD and develop treatment schedules that are better tolerated and associated with fewer adverse effects.
Lin-Chun Wang, Hanjie Zhang, Nancy Ginsberg, Andrea Nandorine Ban, Jeroen P Kooman, Peter Kotanko
OBJECTIVES: The rising diversity of food preferences and the desire to provide better personalized care pose challenges to renal dietitians working in dialysis clinics. To address this situation, we explored the use of a large language model, specifically ChatGPT using the GPT-4 model (openai.com), to support nutritional advice given to dialysis patients.
METHODS: We tasked ChatGPT-4 with generating a personalized daily meal plan, including nutritional information. Virtual "patients" were generated through Monte Carlo simulation; data from a randomly selected virtual patient were presented to ChatGPT. We provided to ChatGPT patient demographics, food preferences, laboratory data, clinical characteristics, and available budget to generate a one-day sample menu with recipes and nutritional analyses. The resulting daily recipe recommendations, cooking instructions, and nutritional analyses were reviewed and rated on a five-point Likert scale by an experienced renal dietitian. In addition, the generated content was compared with a U.S. Department of Agriculture-approved nutrient analysis software. ChatGPT also analyzed the nutrition information of two recipes published online. We also requested a translation of the output into Spanish, Mandarin, Hungarian, German, and Dutch.
RESULTS: ChatGPT generated a daily menu with five recipes. The renal dietitian rated the recipes at 3 (3, 3) [median (Q1, Q3)], the cooking instructions at 5 (5, 5), and the nutritional analysis at 2 (2, 2) on the five-point Likert scale. ChatGPT's nutritional analysis underestimated calories by 36% (95% CI: 44-88%), protein by 28% (25-167%), fat by 48% (29-81%), phosphorus by 54% (15-102%), potassium by 49% (40-68%), and sodium by 53% (14-139%). The nutritional analysis of recipes available online differed only by 0 to 35%. The translations were rated as reliable by native speakers (4 on the five-point Likert scale).
CONCLUSION: While ChatGPT-4 shows promise in providing personalized nutritional guidance for diverse dialysis patients, improvements are necessary. This study highlights the importance of thorough qualitative and quantitative evaluation of artificial intelligence-generated content, especially regarding medical use cases.
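The underestimation figures are simple relative errors of the model's output against reference software; as a sketch with hypothetical numbers (not the study's data):

```python
def pct_underestimate(estimated, reference):
    """Percent by which an estimate falls short of a reference value."""
    return 100.0 * (reference - estimated) / reference

# Hypothetical: a menu the model reports at 1280 kcal against a
# 2000 kcal reference analysis
calorie_gap = pct_underestimate(estimated=1280, reference=2000)  # -> 36.0
```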
Vaibhav Maheshwari, Maria Esther Díaz-González de Ferris, Guido Filler, Peter Kotanko
Severe Neonatal Jaundice (SNJ) causes long-term neurocognitive impairment, cerebral palsy, auditory neuropathy, deafness, or death. We developed a mathematical model for allo-hemodialysis as a potential blood purification method for the treatment of SNJ in term or near-term infants. With allo-hemodialysis (allo-HD), the neonate's blood flows through the hollow fibers of a miniature 0.075 m² hemodialyzer, while the blood of a healthy adult ("buddy") flows counter-currently through the dialysate compartment. We simulated the kinetics of unconjugated bilirubin in allo-hemodialysis with neonate blood flow rates of 12.5 and 15 mL/min (for a 2.5 kg and 3.5 kg neonate, respectively), and 30 mL/min for the buddy. Bilirubin production rates in neonate and buddy were set to 6 and 3 mg/kg/day, respectively. The buddy's bilirubin conjugation rate was calculated to obtain normal steady-state bilirubin levels. Albumin levels were set to 1.1, 2.1, and 3.1 g/dL for the neonate and 3.3 g/dL for the buddy. Model simulations suggest that a 6-h allo-hemodialysis session could reduce neonatal bilirubin levels by >35% and that this modality would be particularly effective at low neonatal serum albumin levels. Due to the high bilirubin conjugation capacity of an adult's healthy liver and the larger distribution volume, the buddy's bilirubin level increases only transiently during allo-hemodialysis. Our modelling suggests that a single allo-hemodialysis session may lower neonatal unconjugated bilirubin levels effectively. If corroborated in ex-vivo, animal, and clinical studies, this bilirubin reduction could lower the risks associated with SNJ, especially kernicterus, and possibly avoid the morbidity associated with exchange transfusions.
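The intuition behind the neonatal decline and the buddy's only-transient rise can be conveyed by a heavily simplified two-compartment sketch. All parameter values below are illustrative guesses, not those of the published model (which accounts for albumin binding); the effective clearance is set low to mimic the small free fraction of albumin-bound bilirubin:

```python
def simulate_allo_hd(hours=6.0, dt=0.01):
    """Toy two-compartment model of bilirubin exchange during allo-HD,
    integrated with explicit Euler steps. Illustrative parameters only."""
    Vn, Vb = 0.25, 10.0        # distribution volumes, L (neonate, buddy)
    Cn, Cb = 200.0, 10.0       # unconjugated bilirubin, mg/L (20 and 1 mg/dL)
    K = 0.025                  # effective dialyzer clearance, L/h (low free fraction)
    Pn = 0.625                 # neonatal production, mg/h (6 mg/kg/day x 2.5 kg)
    Pb = 8.75                  # buddy production, mg/h (3 mg/kg/day x 70 kg)
    k_conj = Pb / (Cb * Vb)    # conjugation rate balancing buddy production at baseline
    for _ in range(int(hours / dt)):
        flux = K * (Cn - Cb)   # diffusive transfer neonate -> buddy, mg/h
        Cn += dt * (Pn - flux) / Vn
        Cb += dt * (flux + Pb - k_conj * Cb * Vb) / Vb
    return Cn, Cb

Cn_end, Cb_end = simulate_allo_hd()
reduction = 1 - Cn_end / 200.0   # fractional drop in neonatal bilirubin over 6 h
```

Even this toy reproduces the qualitative result: a >35%-scale neonatal reduction while the buddy, with his large volume and intact conjugation, rises only slightly.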
Laura Rosales Merlo, Xiaoling Ye, Hanjie Zhang, Brenda Chan, Marilou Mateo, Seth Johnson, Frank M van der Sande, Jeroen P Kooman, Peter Kotanko
INTRODUCTION: Arteriovenous fistula (AVF) maturation assessment is essential to reduce venous catheter residence. We introduced central venous oxygen saturation (ScvO2) and estimated upper body blood flow (eUBBF) to monitor the maturation of newly created fistulas, and recorded catheter time in patients with and without ScvO2-based fistula maturation assessment.
METHODS: From 2017 to 2019, we conducted a multicenter quality improvement project (QIP) in hemodialysis patients with the explicit goal of shortening catheter residence time post-AVF creation through ScvO2-based maturation monitoring. In patients with a catheter as vascular access, we tracked ScvO2 and eUBBF pre- and post-AVF creation. The primary outcome was catheter residence time post-AVF creation. We compared catheter residence time post-AVF creation between QIP patients and controls. One control group comprised concurrent patients; a second control group comprised historic controls (2014-2016). We conducted Kaplan-Meier analysis and constructed a covariate-adjusted Cox proportional hazards model to assess time to catheter removal.
RESULTS: The QIP group comprised 44 patients (59 ± 17 years), the concurrent control group 48 patients (59 ± 16 years), and the historic control group 57 patients (58 ± 15 years). Six months post-AVF creation, the fraction of non-censored patients with a catheter in place was 21% in the QIP cohort, 67% in the concurrent control group, and 68% in the historic control group. In both unadjusted and adjusted analyses, catheter residence time post-fistula creation was shorter in QIP patients compared to either control group (p < 0.001).
CONCLUSIONS: ScvO2-based assessment of fistula maturation is associated with shorter catheter residence time post-AVF creation.
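Kaplan-Meier estimation, the method named above for time-to-catheter-removal, can be sketched in a few lines; the removal times below are hypothetical, not study data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate. events[i] = 1 if the catheter was
    removed at times[i], 0 if the observation was censored. Returns the
    (time, S(t)) step points at event times."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    s = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        d = n = 0
        while i < len(order) and times[order[i]] == t:   # group tied times
            n += 1
            d += events[order[i]]
            i += 1
        if d:                                            # step only at event times
            s *= 1 - d / at_risk
            curve.append((t, s))
        at_risk -= n                                     # drop events and censorings
    return curve

# Hypothetical months to catheter removal (1 = removed, 0 = censored)
times  = [1, 2, 2, 3, 4, 5, 6, 6]
events = [1, 1, 0, 1, 0, 1, 1, 0]
curve = kaplan_meier(times, events)
```

Here "survival" means the catheter is still in place, so a curve that drops faster (as in the QIP group) is the better outcome.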
Lihao Xiao, Hanjie Zhang, Juntao Duan, Xiaoran Ma, Len A Usvyat, Peter Kotanko, Yuedong Wang
COVID-19 has a higher rate of morbidity and mortality among dialysis patients than in the general population. Identifying infected patients early with the support of predictive models helps dialysis centers implement concerted procedures (e.g., temperature screenings, universal masking, isolation treatments) to control the spread of SARS-CoV-2 and mitigate outbreaks. We collected data from multiple sources, including demographics, clinical, treatment, laboratory, vaccination, socioeconomic status, and COVID-19 surveillance. Previous early prediction models, such as logistic regression, SVM, and XGBoost, required sophisticated feature engineering and left room for improved prediction performance. We created deep learning models, including Recurrent Neural Networks (RNN) and Convolutional Neural Networks (CNN), to predict SARS-CoV-2 infections during incubation. Our study shows that deep learning models with minimal feature engineering can identify infected patients more accurately than previously built models. Our Long Short-Term Memory (LSTM) model consistently performed well, with an AUC exceeding 0.80, peaking at 0.91 in August 2021. The CNN model also demonstrated strong results, with an AUC above 0.75. Both models outperformed the previous best XGBoost models by over 0.10 in AUC. Prediction accuracy declined as the pandemic evolved, dropping to approximately 0.75 between September 2021 and January 2022. At a fixed 20% false positive rate, our LSTM and CNN models identified 66% and 64% of positive cases, significantly outperforming XGBoost models at 42%. We also identified key features for dialysis patients by calculating the gradient of the output with respect to the input features. By closely monitoring these factors, dialysis patients can receive earlier diagnoses and care, leading to less severe outcomes.
Our research highlights the effectiveness of deep neural networks in analyzing longitudinal data, especially in predicting COVID-19 infections during the crucial incubation period. These deep network approaches surpass traditional methods relying on aggregated variable means, significantly improving the accurate identification of SARS-CoV-2 infections.
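The reported AUCs have a direct probabilistic reading: an AUC of 0.91 means a randomly chosen infected patient receives a higher risk score than a randomly chosen uninfected one 91% of the time. A minimal sketch of the computation in its Mann-Whitney form, on synthetic scores:

```python
def auc_score(labels, scores):
    """AUC via the Mann-Whitney U statistic: the probability that a random
    positive case is scored above a random negative case (ties count half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Synthetic example: 3 positives, 4 negatives
labels = [1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.7, 0.3, 0.2, 0.1]
auc = auc_score(labels, scores)   # 11 of 12 positive-negative pairs ordered correctly
```

This quadratic-time form is fine for illustration; production code sorts the scores once for an O(n log n) computation.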
Jaime Uribarri, Murilo Guedes, Maria Ines Diaz Bessone, Lili Chan, Andres De La Torre, Ariella Mermelstein, Guillermo Garcia-Garcia, Jochen Raimann, Thyago Moraes, Vincent Peters, Stijn Konings, Doug Farrell, Shuchita Sharma, Adrian Guinsburg, Peter Kotanko
BACKGROUND: Hyperphosphatemia is associated with poor outcomes and is still very common in peritoneal dialysis (PD) patients. Since peritoneal phosphate clearance is closer to peritoneal creatinine clearance than to urea clearance, we hypothesized that weekly creatinine clearance (CrCl) could be a better marker of serum phosphate in PD.
METHODS: In a retrospective observational study, data from adult PD patients were collected across five institutions in North and South America: LATAM, RRI, Mount Sinai Hospital, Hospital Civil de Guadalajara, and the BRAZPD cohort. All centers analyzed routinely available laboratory data, with exclusions for missing data on serum phosphate, CrCl, or urea Kt/V. A unified statistical protocol was employed across centers. Linear mixed-effects models examined associations between longitudinal serum phosphate levels, CrCl, and Kt/V. Adjustments were made for age, gender, and baseline phosphate binder usage. A mixed-effects meta-analysis determined the pooled effect size of CrCl and Kt/V on serum phosphate trajectories, adjusted for confounders.
RESULTS: There were 16,796 incident PD patients analyzed. Age, BMI, gender, PD modality, Kt/V, CrCl, and serum phosphate varied significantly across the different cohorts, but >70% of patients had residual renal function. For most cohorts, both CrCltotal and urea Kt/V were negatively associated with serum phosphorus levels, and log-likelihood ratio tests demonstrated that models including CrCltotal carry more predictive information than those including only urea Kt/V for the largest cohorts. Models including CrCltotal add information when predicting longitudinal serum phosphate levels irrespective of baseline urea Kt/V, age, use of phosphorus binders, and gender.
CONCLUSIONS: CrCl alone was not more accurate than urea Kt/V in predicting serum phosphate, but its inclusion in multivariable models predicting serum phosphate added accuracy. In conclusion, both creatinine clearance and Kt/V are associated with phosphate levels, and using both biomarkers, instead of just one, may better assist in the optimization of serum phosphate levels.
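The log-likelihood ratio comparison of nested models works as sketched below: twice the gain in maximized log-likelihood from adding a covariate is referred to a chi-square distribution with degrees of freedom equal to the number of added parameters. The sketch uses ordinary least squares on hypothetical data; the study used linear mixed-effects models, which this toy does not reproduce:

```python
import math

def gaussian_loglik(residuals):
    """Profile log-likelihood of a Gaussian model, sigma^2 at its MLE."""
    n = len(residuals)
    sigma2 = sum(r * r for r in residuals) / n
    return -0.5 * n * (math.log(2 * math.pi * sigma2) + 1)

def fit_line(x, y):
    """Least-squares line; returns the residuals of the fitted model."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return [yi - (a + b * xi) for xi, yi in zip(x, y)]

# Hypothetical data: does adding a clearance covariate improve the phosphate model?
x = [2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0, 5.5]     # clearance covariate
y = [6.1, 5.8, 5.6, 5.2, 5.0, 4.7, 4.4, 4.2]     # serum phosphate, mg/dL
res_reduced = [yi - sum(y) / len(y) for yi in y]  # intercept-only model
res_full = fit_line(x, y)                         # model with the covariate
lrt = 2 * (gaussian_loglik(res_full) - gaussian_loglik(res_reduced))
significant = lrt > 3.84   # chi-square critical value, 1 df, alpha = 0.05
```

For mixed models compared on fixed effects, the same statistic is computed from maximum-likelihood (not REML) fits, a detail this sketch sidesteps.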
Beatriz Akemi Kondo Van Spitzenbergen, Gabriela Bohnen Andrade, Erika Sousa Dias, Júlia Bacarin Monte Alegre, Gabriela Ferreira Dias, Nadja Grobe, Andrea Novais Moreno-Amaral, Peter Kotanko
BACKGROUND AND HYPOTHESIS: In patients with advanced chronic kidney disease (CKD), the lifespan of red blood cells (RBC) is often shortened, a condition attributed to the "uremic milieu." We recently reported that the uremic solute 3-carboxy-4-methyl-5-propyl-2-furanpropionate (CMPF) shares structural similarities with Jedi1, a chemical activator of the mechanosensitive cation channel Piezo1, whose activation increases calcium influx into cells. Against this backdrop, we hypothesized that CMPF may induce premature RBC death (eryptosis) through prolonged CMPF-induced activation of Piezo1 located on RBC. To test this hypothesis, we explored whether CMPF, at concentrations found in uremia, interacts with Piezo1 located on RBC, increases intracellular calcium (icCa2+), and induces eryptosis.
METHODS: RBC from healthy individuals were incubated with CMPF or Jedi1 (both at a concentration of 87 µM), in the presence or absence of the Piezo1 inhibitor GsMTx-4 (2 µM). We challenged RBC osmotically through incubation in NaCl solutions at concentrations between 3.0 and 9.0 g/L and determined their osmotic fragility. Using flow cytometry, we quantified icCa2+ levels and phosphatidylserine exposure, a cellular marker of eryptosis, in incubated RBC.
RESULTS: Incubation of RBC with CMPF and Jedi1 significantly increased RBC osmotic fragility, an effect prevented by GsMTx-4. At 6.0 g/L NaCl, incubation with CMPF and Jedi1 increased phosphatidylserine exposure and elevated icCa2+ levels of RBC, indicating increased eryptosis. Notably, at an isotonic NaCl concentration of 9.0 g/L, CMPF - but not Jedi1 - significantly increased RBC phosphatidylserine exposure and icCa2+ levels; both effects were diminished by GsMTx-4.
CONCLUSION: Our findings support the hypothesis that CMPF may function as an endogenous activator of Piezo1, increase icCa2+ levels, trigger eryptosis, and, through this pathway, possibly shorten the RBC lifespan. To what extent these in vitro findings are operative in advanced CKD warrants clinical studies.
© 2024 Renal Research Institute. All Rights Reserved. The Renal Research Institute and RRI Logos are trademarks of Fresenius Medical Care Holdings, Inc. or its affiliated companies. All other trademarks are the property of their respective owners.