  • Rakesh Malhotra, Sina Rahimi, Ushma Agarwal, Ronit Katz, Ujjala Kumar, Pranav S Garimella, Vineet Gupta, Tushar Chopra, Peter Kotanko, T Alp Ikizler, Britta Larsen, Lisa Cadmus-Bertram, Joachim H Ix

    RATIONALE & OBJECTIVE: People with end-stage kidney disease (ESKD) have very low physical activity, and the degree of inactivity is strongly associated with morbidity and mortality. We assessed the feasibility and effectiveness of a 12-week intervention coupling a wearable activity tracker (Fitbit) and structured feedback coaching versus a wearable activity tracker alone on changes in physical activity in hemodialysis patients.
    STUDY DESIGN: Randomized controlled trial.
    SETTING & PARTICIPANTS: 55 participants with ESKD receiving hemodialysis who were able to walk with or without assistive devices, recruited from a single academic hemodialysis unit between January 2019 and April 2020.
    INTERVENTIONS: All participants wore a Fitbit Charge 2 tracker for a minimum of 12 weeks. Participants were randomly assigned 1:1 to a wearable activity tracker plus a structured feedback intervention versus the wearable activity tracker alone. The structured feedback group was counseled weekly on steps achieved after randomization.
    OUTCOME: The outcome was step count, and the main parameter of interest was the absolute change in daily step count, averaged per week, from baseline to completion of the 12-week intervention. In the intention-to-treat analysis, mixed-effects linear regression was used to evaluate the change in daily step count from baseline to 12 weeks in both arms.
    RESULTS: Of 55 participants, 46 completed the 12-week intervention (23 per arm). The mean age was 62 (±14 SD) years; 44% were Black, and 36% were Hispanic. At baseline, step count (structured feedback intervention: 3,704 [1,594] vs wearable activity tracker alone: 3,808 [1,890]) and other participant characteristics were balanced between the arms. We observed a larger change in daily step count in the structured feedback arm at 12 weeks relative to the wearable activity tracker alone arm (Δ 920 [±580 SD] versus Δ 281 [±186 SD] steps; between-group difference Δ 639 [±538 SD] steps; P<0.05).
    LIMITATIONS: Single-center study and small sample size.
    CONCLUSION: This pilot randomized controlled trial demonstrated that structured feedback coupled with a wearable activity tracker led to a greater daily step count that was sustained over 12 weeks relative to a wearable activity tracker alone. Future studies are required to determine longer-term sustainability of the intervention and potential health benefits in hemodialysis patients.
    FUNDING: Grants from industry (Satellite Healthcare) and government (National Institute of Diabetes and Digestive and Kidney Diseases [NIDDK]).
    TRIAL REGISTRATION: Registered at ClinicalTrials.gov with study number NCT05241171.
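
    The intention-to-treat analysis above fits a mixed-effects linear regression to repeated step counts. Below is a minimal sketch of how such a model could be set up with statsmodels; the long-format table and its column names (patient_id, week, arm, steps) are assumptions for illustration, not the authors' code.

    ```python
    # Sketch (not the authors' code) of an intention-to-treat mixed-effects
    # analysis of weekly-averaged daily step counts. Assumes a long-format
    # table with hypothetical columns: patient_id, week (0-12), arm
    # (0 = tracker alone, 1 = structured feedback), steps.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("steps_long.csv")  # one row per patient-week

    # Random intercept per patient; the week-by-arm interaction captures the
    # between-group difference in step-count trajectories over 12 weeks.
    model = smf.mixedlm("steps ~ week * arm", data=df, groups=df["patient_id"])
    print(model.fit().summary())
    ```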

  • Sunpeng Duan, Yuedong Wang, Peter Kotanko, Hanjie Zhang

    BACKGROUND: In-center hemodialysis entails repeated interactions between patients and clinic staff, potentially facilitating the spread of COVID-19. We examined whether in-center hemodialysis is associated with the spread of SARS-CoV-2 between patients.
    METHODS: Our retrospective analysis comprised all patients receiving hemodialysis in four New York City clinics between March 12th, 2020, and August 31st, 2022. Treatment-level clinic ID, dialysis shift, dialysis machine station, and date of COVID-19 diagnosis by RT-PCR were documented. To estimate the donor-to-potential recipient exposure ("donor" being the COVID-19 positive patient, denoted "COV-Pos"; "potential recipient" being other susceptible patients in the same shift), we obtained the spatial coordinates of each dialysis station, calculated the Euclidean distances between stations, and weighted the exposure by proximity between them. For each donor, we estimated the donor-to-potential recipient exposure of all potential recipients dialyzed in the same shift and accumulated the exposure over time within the "COV-Pos infectious period" as cumulative exposures. The "COV-Pos infectious period" started 5 days before the COVID-19 diagnosis date. We deployed network analysis to assess these interactions and summarized the donor-to-potential recipient exposure in 193 network diagrams. We fitted mixed-effects logistic regression models to test whether more donor-to-potential recipient exposure conferred a higher risk of SARS-CoV-2 infection.
    RESULTS: Out of 978 patients, 193 (19.7%) tested positive for COVID-19 and had contact with other patients during the COV-Pos infectious period. Network diagrams showed no evidence that more exposed patients had a higher chance of infection. This finding was corroborated by mixed-effects logistic regression (donor-to-potential recipient exposure OR: 0.63; 95% CI 0.32 to 1.17, p = 0.163). Separate analyses according to vaccination status led to materially identical results.
    CONCLUSIONS: Transmission of SARS-CoV-2 between in-center hemodialysis patients is unlikely. This finding supports the effectiveness of non-pharmaceutical interventions, such as universal masking and other procedures to control the spread of COVID-19.
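
    The abstract states that exposure was weighted by proximity between stations but does not give the weighting function. A minimal sketch of one plausible choice (inverse Euclidean distance, accumulated over shared shifts) follows; the function names and the distance cap are assumptions.

    ```python
    # Sketch of a proximity-weighted exposure measure. The paper weights
    # exposure "by proximity" between stations; inverse Euclidean distance
    # is one plausible choice and is an assumption here.
    import numpy as np

    def session_exposure(donor_xy, recipient_xy):
        """Exposure contribution from one shared dialysis shift."""
        d = np.linalg.norm(np.asarray(donor_xy) - np.asarray(recipient_xy))
        return 1.0 / max(d, 1.0)  # cap so adjacent stations stay finite

    def cumulative_exposure(shared_sessions):
        """Accumulate exposure over all shifts shared during the
        COV-Pos infectious period (pairs of station coordinates)."""
        return sum(session_exposure(d, r) for d, r in shared_sessions)

    # Example: three shared shifts at varying distances (coordinates in meters)
    print(cumulative_exposure([((0, 0), (2, 0)), ((0, 0), (5, 3)), ((1, 1), (1, 4))]))
    ```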

  • Christina H Wang, Dan Negoianu, Hanjie Zhang, Sabrina Casper, Jesse Y Hsu, Peter Kotanko, Jochen Raimann, Laura M Dember

    KEY POINTS: Directly studying the plasma refill rate (PRR) during hemodialysis (HD) can offer insight into physiologic mechanisms that change throughout HD. PRR at the start of and during HD is associated with intradialytic hypotension, independent of ultrafiltration rate. A rising PRR during HD may be an early indicator of compensatory mechanisms for impending circulatory instability.
    BACKGROUND: Attaining the optimal balance between achieving adequate volume removal and preserving organ perfusion is a challenge for patients receiving maintenance hemodialysis (HD). Current strategies to guide ultrafiltration are inadequate.
    METHODS: We developed an approach to calculate the plasma refill rate (PRR) throughout HD using hematocrit and ultrafiltration data in a retrospective cohort of patients receiving maintenance HD at 17 dialysis units from January 2017 to October 2019. We studied whether (1) PRR is associated with traditional risk factors for hemodynamic instability, using logistic regression; (2) low starting PRR is associated with intradialytic hypotension (IDH), using Cox proportional hazards regression; and (3) time-varying PRR throughout HD is associated with hypotension, using marginal structural modeling.
    RESULTS: During 180,319 HD sessions among 2554 patients, PRR had high within-patient and between-patient variability. Female sex and hypoalbuminemia were associated with low PRR at multiple time points during the first hour of HD. A low starting PRR carried a higher hazard of IDH, whereas a high starting PRR was protective (hazard ratio [HR], 1.26, 95% confidence interval [CI], 1.18 to 1.35 versus HR, 0.79, 95% CI, 0.73 to 0.85, respectively). However, when accounting for time-varying PRR and time-varying confounders, and compared with a moderate PRR, a consistently low PRR was associated with an increased risk of hypotension (odds ratio [OR], 1.09, 95% CI, 1.02 to 1.16), while a consistently high PRR had a stronger association with hypotension within the next 15 minutes (OR, 1.38, 95% CI, 1.30 to 1.45).
    CONCLUSIONS: We present a straightforward technique to quantify plasma refill that could easily integrate with devices that monitor hematocrit during HD. Our study highlights how examining patterns of plasma refill may enhance our understanding of circulatory changes during HD, an important step toward understanding how current technology might be used to mitigate hemodynamic instability.
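
    The abstract does not spell out the PRR formula. The sketch below shows one way such a calculation could work under a volume-balance view (refill = ultrafiltration + blood volume change), assuming constant red cell mass so that blood volume scales as Hct(0)/Hct(t); the baseline blood volume estimate and the function signature are assumptions.

    ```python
    # Sketch of one way to compute PRR from hematocrit and ultrafiltration
    # data; the authors' exact formula is not given in the abstract.
    # Assumes constant red cell mass (blood volume scales as Hct0/Hct) and
    # an externally estimated baseline blood volume.
    import numpy as np

    def plasma_refill_rate(hct, uf_rate_ml_min, bv0_ml, dt_min=1.0):
        """PRR in ml/min at each time step.

        hct: hematocrit readings (fraction), sampled every dt_min minutes
        uf_rate_ml_min: ultrafiltration rate (scalar or array, ml/min)
        bv0_ml: estimated blood volume at treatment start (ml)
        """
        hct = np.asarray(hct, dtype=float)
        bv = bv0_ml * hct[0] / hct           # blood volume over time
        dbv_dt = np.gradient(bv, dt_min)     # negative while volume declines
        # Volume balance: dBV/dt = refill - ultrafiltration
        return uf_rate_ml_min + dbv_dt
    ```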

  • Xin Wang, Leticia M Tapia Silva, Milind Nikam, Sandip Mitra, Syed Shaukat Abbas Zaidi, Nadja Grobe

    The aim of this paper is to summarize the current understanding of the molecular biology of the arteriovenous fistula (AVF). It intends to encourage vascular access teams, care providers, and scientists to explore new molecular tools for assessing the suitability of patients for AVF as vascular access for maintenance hemodialysis (HD). This review also highlights the most recent discoveries and may serve as a guide to explore biomarkers and technologies for the assessment of kidney disease patients choosing to start kidney replacement therapy. Objective criteria for AVF eligibility are lacking, partly because the underlying physiology of AVF maturation is poorly understood. Several molecular processes during the life cycle of an AVF, even before its creation, can be characterized by measuring molecular fingerprints using the newest "omics" technologies. In addition to hypothesis-driven strategies, untargeted approaches have the potential to reveal the interplay of hundreds of metabolites, transcripts, proteins, and genes underlying cardiovascular adaptation and vascular access-related adjustments at any given time point in a patient with kidney disease. As a result, regular monitoring of modifiable molecular risk factors together with clinical assessment could help to reduce AVF failure rates, increase patency, and improve long-term outcomes. In the future, identification of vulnerable patients based on the assessment of biological markers of AVF maturation at different stages of the life cycle may aid in individualizing vascular access recommendations.

  • Frontiers in public health

    10 Feb 2024 Testing of worn face mask and saliva for SARS-CoV-2

    Xiaoling Wang, Ohnmar Thwin, Zahin Haq, Zijun Dong, Lela Tisdale, Lemuel Rivera Fuentes, Nadja Grobe, Peter Kotanko

    BACKGROUND: Exhaled SARS-CoV-2 can be detected on face masks. We compared tests for SARS-CoV-2 RNA on worn face masks and matched saliva samples.
    METHODS: We conducted this prospective, observational, case-control study between December 2021 and March 2022. Cases comprised 30 in-center hemodialysis patients with a recent COVID-19 diagnosis. Controls comprised 13 hemodialysis patients and 25 clinic staff without COVID-19 during the study period and the past 2 months. Disposable 3-layer masks were collected after being worn for 4 hours, together with concurrent saliva samples. The ThermoFisher COVID-19 Combo Kit (A47814) was used for RT-PCR testing.
    RESULTS: Mask and saliva testing specificities were 99% and 100%, respectively. Test sensitivity was 62% for masks and 81% for saliva (p = 0.16). Median viral RNA shedding duration was 11 days and was longer in immunocompromised versus non-immunocompromised patients (22 vs. 11 days, p = 0.06, log-rank test).
    CONCLUSION: While SARS-CoV-2 testing on worn masks appears to be less sensitive than saliva testing, it may be a preferred screening method for individuals who are mandated to wear masks yet averse to more invasive sampling. However, optimized RNA extraction methods and automated procedures are warranted to increase test sensitivity and scalability. We corroborated longer viral RNA shedding in immunocompromised patients.
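
    For the sensitivity comparison (62% vs 81%, p = 0.16), a natural paired analysis is an exact McNemar test on discordant pairs among confirmed cases. The sketch below shows that approach; the discordant counts are invented placeholders, and the abstract does not state which test the authors used.

    ```python
    # Placeholder sketch of an exact McNemar test comparing paired mask and
    # saliva sensitivities among confirmed cases; the counts are invented
    # and the abstract does not state which test yielded p = 0.16.
    from scipy.stats import binomtest

    def paired_sensitivity_pvalue(pos_mask_only, pos_saliva_only):
        """Exact test on discordant pairs (cases positive by one method only)."""
        n_discordant = pos_mask_only + pos_saliva_only
        return binomtest(pos_mask_only, n_discordant, p=0.5).pvalue

    print(paired_sensitivity_pvalue(pos_mask_only=2, pos_saliva_only=8))
    ```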

  • Susie Q Lew, Gulay Asci, Paul A Rootjes, Ercan Ok, Erik L Penne, Ramin Sam, Antonios H Tzamaloukas, Todd S Ing, Jochen G Raimann

    The relationship between sodium, blood pressure, and extracellular volume could not be more pronounced or complex than in a dialysis patient. We review patients' sources of sodium exposure in the form of dietary salt intake, medication administration, and the dialysis treatment itself. In addition, we discuss how dialysis modality, hemodialysis type, and dialysis fluid sodium concentration influence blood pressure, intradialytic symptoms, and interdialytic weight gain, and thereby affect patient outcomes. We also review whether sodium restriction (reduced salt intake), alteration of the dialysis fluid sodium concentration, and the different dialysis types have any impact on blood pressure, intradialytic symptoms, and interdialytic weight gain.

  • Ercan Ok, Cenk Demirci, Gulay Asci, Kivanc Yuksel, Fatih Kircelli, Serkan Kubilay Koc, Sinan Erten, Erkan Mahsereci, Ali Rıza Odabas, Stefano Stuard, Franklin W Maddux, Jochen G Raimann, Peter Kotanko, Peter G Kerr, Christopher T Chan

    INTRODUCTION: More frequent and/or longer hemodialysis (HD) has been associated with improvements in numerous clinical outcomes in patients on dialysis. Home HD (HHD), which allows more frequent and/or longer dialysis with lower cost and flexibility in treatment planning, is not widely used worldwide. Although retrospective studies have indicated better survival with HHD, this issue remains controversial. In this multicenter study, we compared thrice-weekly extended HHD with in-center conventional HD (ICHD) in a large patient population with long-term follow-up.
    METHODS: We matched 349 patients starting HHD between 2010 and 2014 with 1047 concurrent patients on ICHD by using propensity scores. Patients were followed from their respective baseline until September 30, 2018. The primary outcome was overall survival. Secondary outcomes were technique survival; hospitalization; and changes in clinical, laboratory, and medication parameters.
    RESULTS: The mean duration of a dialysis session was 418 ± 54 minutes in HHD and 242 ± 10 minutes in patients on ICHD. The all-cause mortality rate was 3.76 and 6.27 per 100 patient-years in the HHD and ICHD groups, respectively. In the intention-to-treat analysis, HHD was associated with a 40% lower risk for all-cause mortality than ICHD (hazard ratio [HR] = 0.60; 95% confidence interval [CI] 0.45 to 0.80; P < 0.001). In HHD, the 5-year technique survival was 86.5%. HHD treatment provided better phosphate and blood pressure (BP) control, improvements in nutrition and inflammation, and reductions in hospitalization days and medication requirements.
    CONCLUSION: These results indicate that extended HHD is associated with higher survival and better outcomes compared with ICHD.
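
    A rough sketch of the 1:1 propensity-score matching described in METHODS: fit a treatment model, then greedily pair each HHD patient with the nearest unmatched ICHD control. The covariate names and caliper are assumptions, and the authors' exact matching algorithm is not stated.

    ```python
    # Sketch of greedy 1:1 propensity-score matching without replacement.
    # Covariate names and the caliper are assumptions; the authors' exact
    # matching procedure is not described in the abstract.
    from sklearn.linear_model import LogisticRegression

    def match_1_to_1(df, treated_col, covariates, caliper=0.05):
        ps = LogisticRegression(max_iter=1000).fit(
            df[covariates], df[treated_col]
        ).predict_proba(df[covariates])[:, 1]
        df = df.assign(ps=ps)
        controls = df[df[treated_col] == 0].copy()
        pairs = []
        for i, row in df[df[treated_col] == 1].iterrows():
            if controls.empty:
                break
            dist = (controls["ps"] - row["ps"]).abs()
            j = dist.idxmin()
            if dist[j] <= caliper:
                pairs.append((i, j))
                controls = controls.drop(j)  # match without replacement
        return pairs
    ```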

  • Current opinion in nephrology and hypertension

    14 Dec 2023 Climate change and its influence in nephron mass

    Ana Catalina Alvarez-Elias, Barry M Brenner, Valerie A Luyckx

    PURPOSE OF REVIEW: The consequences of climate change, including heat and extreme weather events, impact kidney function in adults and children. The impacts of climate change on kidney development during gestation, and thereby on kidney function later in life, have been poorly described. Clinical evidence is summarized to highlight possible associations between climate change and nephron mass.
    RECENT FINDINGS: Pregnant women are vulnerable to the effects of climate change, being less able to thermoregulate, more sensitive to the effects of dehydration, and more susceptible to infections. Exposure to heat, wildfire smoke, drought, floods, and climate-related infections is associated with low birth weight, preterm birth, and preeclampsia. These factors are associated with reduced nephron numbers, kidney dysfunction, and higher blood pressures in offspring in later life. Exposure to air pollution is associated with higher blood pressures in children and has variable effects on estimated glomerular filtration rate.
    SUMMARY: Climate change has important impacts on pregnant women and their unborn children. Being born too small or too soon is associated with a lifetime risk of kidney disease. Climate change may therefore have a dual effect of impacting fetal kidney development and contributing to cumulative postnatal kidney injury. The impact on population kidney health in future generations may be significant.

  • Armando Armenta-Alvarez, Salvador Lopez-Gil, Iván Osuna, Nadja Grobe, Xia Tao, Gabriela Ferreira Dias, Xiaoling Wang, Joshua Chao, Jochen G Raimann, Stephan Thijssen, Hector Perez-Grovas, Bernard Canaud, Peter Kotanko, Magdalena Madero

    KEY POINTS: HDF and MCO have shown greater clearance of middle-size uremic solutes in comparison with high-flux dialyzers; MCO had not previously been studied in HDF. MCO in HDF does not increase the clearance of B2M and results in a higher loss of albumin.
    BACKGROUND: Middle molecule removal and albumin loss have been studied with medium cutoff (MCO) membranes in hemodialysis (HD). It is unknown whether hemodiafiltration (HDF) with MCO membranes provides additional benefit. We aimed to compare the removal of small solutes, β2-microglobulin (B2M), albumin, and total proteins between MCO and high-flux (HFX) membranes with both HD and HDF.
    METHODS: The cross-over study comprised 4 weeks, one each with postdilutional HDF using HFX (HFX-HDF), MCO (MCO-HDF), HD with HFX (HFX-HD), and MCO (MCO-HD). MCO and HFX differ with respect to several characteristics, including membrane composition, pore size distribution, and surface area (HFX, 2.5 m2; MCO, 1.7 m2). There were two study treatments per week, one after the long interdialytic interval and another midweek. Reduction ratios of vitamin B12, B2M, phosphate, uric acid, and urea corrected for hemoconcentration were computed. Albumin and total protein loss during the treatment were quantified from dialysate samples.
    RESULTS: Twelve anuric patients were studied (six female; 44±19 years; dialysis vintage 35.2±28 months). Blood flow was 369±23 ml/min, dialysate flow was 495±61 ml/min, and ultrafiltration volume was 2.8±0.74 L. No significant differences were found in the removal of B2M, vitamin B12, and water-soluble solutes between dialytic modalities and dialyzers. Albumin and total protein loss were significantly higher in MCO groups than in HFX groups within the same modality. HDF groups had significantly higher albumin and total protein loss than HD groups with the same dialyzer. MCO-HDF showed the highest protein loss among all groups.
    CONCLUSIONS: MCO-HD is not superior to HFX-HD and HFX-HDF for both middle molecule and water-soluble solute removal. Protein loss was more pronounced with MCO than with HFX in both HD and HDF modalities. MCO-HDF has no additional benefit regarding removal of B2M but resulted in greater protein loss than MCO-HD.
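
    The reduction ratios "corrected for hemoconcentration" suggest rescaling the post-dialysis concentration for intradialytic fluid loss. A sketch using one common correction (Bergstrom and Wehle, assuming an extracellular compartment of 20% of body weight) follows; whether the authors used exactly this formula is not stated.

    ```python
    # Sketch of a reduction ratio corrected for hemoconcentration, using a
    # common correction (Bergstrom and Wehle) that rescales the post-dialysis
    # concentration by intradialytic weight loss over an assumed extracellular
    # volume of 20% of body weight. Whether the authors used exactly this
    # correction is not stated in the abstract.
    def corrected_reduction_ratio(c_pre, c_post, bw_pre_kg, bw_post_kg):
        c_post_adj = c_post / (1 + (bw_pre_kg - bw_post_kg) / (0.2 * bw_post_kg))
        return 1 - c_post_adj / c_pre

    # Example: B2M falling from 30 to 10 mg/l with 2.8 kg removed
    print(corrected_reduction_ratio(30.0, 10.0, 72.8, 70.0))
    ```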

  • Hanjie Zhang, Lin-Chun Wang, Sheetal Chaudhuri, Aaron Pickering, Len Usvyat, John Larkin, Pete Waguespack, Zuwen Kuang, Jeroen P Kooman, Franklin W Maddux, Peter Kotanko

    BACKGROUND: In maintenance hemodialysis patients, intradialytic hypotension (IDH) is a frequent complication that has been associated with poor clinical outcomes. Prediction of IDH may facilitate timely interventions and eventually reduce IDH rates.
    METHODS: We developed a machine learning model to predict IDH in in-center hemodialysis patients 15-75 minutes in advance. IDH was defined as systolic blood pressure (SBP) <90 mmHg. Demographic, clinical, treatment-related, and laboratory data were retrieved from electronic health records and merged with intradialytic machine data that were sent in real time to the cloud. For model development, dialysis sessions were randomly split into training (80%) and testing (20%) sets. The area under the receiver operating characteristic curve (AUROC) was used as a measure of the model's predictive performance.
    RESULTS: We utilized data from 693 patients who contributed 42,656 hemodialysis sessions and 355,693 intradialytic SBP measurements. IDH occurred in 16.2% of hemodialysis treatments. Our model predicted IDH 15-75 minutes in advance with an AUROC of 0.89. Top IDH predictors were the most recent intradialytic SBP and IDH rate, as well as the mean nadir SBP of the previous 10 dialysis sessions.
    CONCLUSIONS: Real-time prediction of IDH during an ongoing hemodialysis session is feasible and has clinically actionable predictive performance. If and to what degree this predictive information facilitates the timely deployment of preventive interventions and translates into lower IDH rates and improved patient outcomes warrants prospective studies.

  • Ariella Mermelstein, Jochen G Raimann, Yuedong Wang, Peter Kotanko, John T Daugirdas

    BACKGROUND: We hypothesized that the association of ultrafiltration rate with mortality in hemodialysis patients is differentially affected by weight and sex, and we sought to derive a sex- and weight-indexed ultrafiltration rate measure that captures the differential effects of these parameters on the association of ultrafiltration rate with mortality.
    METHODS: Data were analyzed from the US Fresenius Kidney Care (FKC) database for 1 year after patient entry into an FKC dialysis unit (baseline) and over 2 years of follow-up for patients receiving thrice-weekly in-center hemodialysis. To investigate the joint effect of baseline-year ultrafiltration rate and postdialysis weight on survival, we fit Cox proportional hazards models using bivariate tensor product spline functions and constructed contour plots of weight-specific mortality hazard ratios over the entire range of ultrafiltration rate values and postdialysis weights (W).
    RESULTS: In the 396,358 patients studied, the average ultrafiltration rate in ml/h was related to postdialysis weight (W) in kg as 3W + 330. Ultrafiltration rates associated with 20% or 40% higher weight-specific mortality risk were 3W + 500 and 3W + 630 ml/h, respectively, and were 70 ml/h higher in men than in women. Nineteen percent and 7.5% of patients exceeded ultrafiltration rates associated with a 20% or 40% higher mortality risk, respectively. Low ultrafiltration rates were associated with subsequent weight loss. Ultrafiltration rates associated with a given mortality risk were lower in high-body-weight older patients and higher in patients on dialysis for more than 3 years.
    CONCLUSIONS: Ultrafiltration rates associated with various levels of higher mortality risk depend on body weight, but not in a 1:1 ratio, and differ between men and women, in high-body-weight older patients, and in high-vintage patients.
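
    The reported thresholds lend themselves to a small worked example: for a given postdialysis weight W, the average ultrafiltration rate and the rates associated with 20% and 40% higher weight-specific mortality risk follow directly from the formulas in RESULTS (the reported ±70 ml/h sex offset is omitted here for simplicity).

    ```python
    # Worked example of the weight-indexed thresholds from RESULTS
    # (ultrafiltration rate in ml/h as a function of postdialysis weight W
    # in kg; the reported +/-70 ml/h sex offset is omitted here).
    def ufr_average(w_kg):        # average observed rate: 3W + 330
        return 3 * w_kg + 330

    def ufr_risk_20pct(w_kg):     # ~20% higher weight-specific mortality risk
        return 3 * w_kg + 500

    def ufr_risk_40pct(w_kg):     # ~40% higher risk
        return 3 * w_kg + 630

    for w in (60, 80, 100):
        print(w, ufr_average(w), ufr_risk_20pct(w), ufr_risk_40pct(w))
    ```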

  • Sheetal Chaudhuri, Andrew Long, Hanjie Zhang, Caitlin Monaghan, John W Larkin, Peter Kotanko, Shashi Kalaskar, Jeroen P Kooman, Frank M van der Sande, Franklin W Maddux, Len A Usvyat

    Artificial intelligence (AI) is considered the natural next step beyond traditional statistical techniques. Advances in analytical methods and infrastructure enable AI to be applied in health care. While AI applications are relatively common in fields like ophthalmology and cardiology, their use is scarcely reported in nephrology. We present the current status of AI in research on kidney disease and discuss future pathways for AI. The clinical applications of AI in progression to end-stage kidney disease and dialysis can be broadly subdivided into three main topics: (a) predicting future events, such as mortality and hospitalization; (b) providing treatment and decision aids, such as automated drug prescription; and (c) identifying patterns, such as phenotypical clusters and arteriovenous fistula aneurysms. At present, the use of prediction models in treating patients with kidney disease is still in its infancy, and further evidence is needed to identify their relative value. Policies and regulations need to be addressed before AI solutions can be implemented at the point of care in clinics. AI is not anticipated to replace nephrologists' medical decision-making, but instead to assist them in providing optimal personalized care for their patients.

  • Sudhir K Bowry, Peter Kotanko, Rainer Himmele, Xia Tao, Michael Anger

    Informed decision-making is paramount to the improvement of dialysis therapies and patient outcomes. A cornerstone of the delivery of optimal dialysis therapy is to delineate which substances (uraemic retention solutes or 'uraemic toxins') contribute to the condition of uraemia in terms of the deleterious biochemical effects they may exert. Thereafter, decisions can be made as to which of the accumulated compounds need to be targeted for removal and by which strategies. For haemodialysis (HD), the non-selectivity of membranes is sometimes considered a limitation. Yet, considering that dozens of substances with potential toxicity need to be eliminated, and that targeting removal of individual toxins explicitly is not recommended, current dialysis membranes enable elimination of several molecules of a broad size range within a single therapy session. However, because HD solute removal is based on size-exclusion principles, i.e. the size of the substances to be removed relative to the mean size of the 'pores' of the membrane, only a limited degree of selectivity of removal is possible. Removal of unwanted substances during HD needs to be weighed against the unavoidable loss of substances that are recognized to be necessary for bodily functions and physiology. In striving to improve the efficiency of HD by increasing the porosity of membranes, there is a greater potential for the loss of substances that are of benefit. Based on this elementary trade-off and the availability of recent guidance on the relative toxicity of substances retained in uraemia, we propose a new evidence-linked uraemic toxin elimination (ELUTE) approach whereby only those clusters of substances for which there is a sufficient body of evidence linking them to deleterious biological effects need to be targeted for removal. Our approach involves correlating the physical properties of retention solutes (deemed to express toxicity) with key determinants of membranes and separation processes. Our analysis revealed that in attempting to remove the relatively small number of 'larger' substances graded as having only moderate toxicity, uncontrolled (and efficient) removal of several useful compounds would take place simultaneously and may compromise the well-being or outcomes of patients. The bulk of the uraemic toxin load comprises solutes below 30,000 Da that are adequately removed by standard membranes. Further, removal of the few difficult-to-remove-by-dialysis (protein-bound) compounds that express toxicity cannot be achieved by manipulation of pore size alone. The trade-off between the benefits of effective removal of the bulk of the uraemic toxin load and the risks (increased loss of useful substances) associated with targeting the removal of a few larger substances in 'high-efficiency' HD treatment strategies needs to be recognized and better understood. The removability during HD of substances, be they toxic, inert or beneficial, needs to be revised to establish the pros and cons of current dialytic elimination strategies.

  • Maggie Han, Priscila Preciado, Ohnmar Thwin, Xia Tao, Leticia M Tapia-Silva, Lemuel Rivera Fuentes, Mohamad Hakim, Amrish Patel, Lela Tisdale, Hanjie Zhang, Peter Kotanko

    BACKGROUND/OBJECTIVES: On March 22, 2020, a statewide stay-at-home order for nonessential tasks was implemented in New York State. We aimed to determine the impact of the lockdown on physical activity levels (PAL) in hemodialysis patients.
    METHODS: Since May 2018, we have been conducting an observational study with 1-year follow-up of PAL in patients from 4 hemodialysis clinics in New York City. Patients active in the study as of March 22, 2020, were included. PAL was defined by steps taken per day, measured by a wrist-based monitoring device (Fitbit Charge 2). Average steps/day were calculated for January 1 to February 13, 2020, and then weekly from February 14 to June 30.
    RESULTS: 42 patients were included. Their mean age was 55 years, 79% were male, and 69% were African American. Between January 1 and February 13, 2020, patients took on average 5,963 (95% CI 4,909-7,017) steps/day. In the week prior to the mandated lockdown, when a national emergency was declared, and in the week of the shutdown, the average number of daily steps had decreased by 868 steps/day (95% CI 213-1,722) and 1,222 steps/day (95% CI 668-2,300), respectively. Six patients were diagnosed with COVID-19 during the study period. Five of them exhibited significantly higher PAL in the 2 weeks prior to showing COVID-19 symptoms compared with COVID-19-negative patients.
    CONCLUSION: Lockdown measures were associated with a significant decrease in PAL in hemodialysis patients. Patients who contracted COVID-19 had higher PAL during the incubation period. Methods to increase PAL while allowing for social distancing should be explored and implemented.

  • Juntao Duan, Hanmo Li, Xiaoran Ma, Hanjie Zhang, Rachel Lasky, Caitlin K Monaghan, Sheetal Chaudhuri, Len A Usvyat, Mengyang Gu, Wensheng Guo, Peter Kotanko, Yuedong Wang

    BACKGROUND: The coronavirus disease 2019 (COVID-19) pandemic has created more devastation among dialysis patients than among the general population. Patient-level prediction models for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection are crucial for the early identification of patients to prevent and mitigate outbreaks within dialysis clinics. As the COVID-19 pandemic evolves, it is unclear whether previously built prediction models are still sufficiently effective.
    METHODS: We developed a machine learning (XGBoost) model to predict, during the incubation period, a SARS-CoV-2 infection that is subsequently diagnosed after 3 or more days. We used data from multiple sources, including demographic, clinical, treatment, laboratory, and vaccination information from a national network of hemodialysis clinics; socioeconomic information from the Census Bureau; and county-level COVID-19 infection and mortality information from state and local health agencies. We created prediction models and evaluated their performance on a rolling basis to investigate the evolution of prediction power and risk factors.
    RESULTS: From April 2020 to August 2020, our machine learning model achieved an area under the receiver operating characteristic curve (AUROC) of 0.75, an improvement of over 0.07 from a previously developed machine learning model published in Kidney360 in 2021. As the pandemic evolved, the prediction performance deteriorated and fluctuated more, with the lowest AUROC of 0.6 in December 2021 and January 2022. Over the whole study period, that is, from April 2020 to February 2022, fixing the false-positive rate at 20%, our model was able to detect 40% of the positive patients. We found that features derived from local infection information reported by the Centers for Disease Control and Prevention (CDC) were the most important predictors, and vaccination status was a useful predictor as well. Whether or not a patient lives in a nursing home was an effective predictor before vaccination but became less predictive afterward.
    CONCLUSION: As found in our study, the dynamics of the prediction model change frequently as the pandemic evolves. County-level infection information and vaccination information are crucial for the success of early COVID-19 prediction models. Our results show that the proposed model can effectively identify SARS-CoV-2 infections during the incubation period. Prospective studies are warranted to explore the application of such prediction models in daily clinical practice.
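
    A minimal sketch of the rolling evaluation described in METHODS: retrain on all months seen so far, test on the next month, and track AUROC over time. Column and feature names are hypothetical, and the hyperparameters are placeholders rather than the authors' settings.

    ```python
    # Sketch of a rolling-origin evaluation: retrain on all months seen so
    # far, test on the next month, track AUROC over time. Column and
    # feature names are hypothetical; hyperparameters are placeholders.
    import pandas as pd
    from sklearn.metrics import roc_auc_score
    from xgboost import XGBClassifier

    def rolling_auroc(df, months, features, label="infected"):
        scores = {}
        for i in range(1, len(months)):
            train = df[df["month"].isin(months[:i])]
            test = df[df["month"] == months[i]]
            model = XGBClassifier(n_estimators=200, eval_metric="logloss")
            model.fit(train[features], train[label])
            pred = model.predict_proba(test[features])[:, 1]
            scores[months[i]] = roc_auc_score(test[label], pred)
        return pd.Series(scores)  # AUROC per test month
    ```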

  • Ana Paula Bernardo, Paola Carioni, Stefano Stuard, Peter Kotanko, Len A Usvyat, Vratislava Kovarova, Otto Arkossy, Francesco Bellocchio, Antonio Tupputi, Federica Gervasoni, Anke Winter, Yan Zhang, Hanjie Zhang, Pedro Ponce, Luca Neri

    BACKGROUND: Hemodialysis patients are at high risk of severe SARS-CoV-2 infection but were underrepresented in the randomized controlled trials evaluating the safety and efficacy of COVID-19 vaccines. We estimated the real-world effectiveness of COVID-19 vaccines in a large international cohort of hemodialysis patients.
    METHODS: In this historical, 1:1 matched cohort study, we included adult hemodialysis patients receiving treatment from December 1, 2020, to May 31, 2021. For each vaccinated patient, an unvaccinated control was selected among patients registered in the same country and attending a dialysis session around the first vaccination date. Matching was based on demographics, clinical characteristics, past COVID-19 infections, and a risk score representing the local background risk of infection at vaccination dates. We estimated the effectiveness of mRNA and viral-carrier COVID-19 vaccines in preventing infection and mortality from a time-dependent Cox regression stratified by country.
    RESULTS: In the effectiveness analysis concerning mRNA vaccines, we observed 850 SARS-CoV-2 infections and 201 COVID-19-related deaths among the 28,110 patients during a mean follow-up of 44 ± 40 days. In the effectiveness analysis concerning viral-carrier vaccines, we observed 297 SARS-CoV-2 infections and 64 COVID-19-related deaths among 12,888 patients during a mean follow-up of 48 ± 32 days. We observed 18.5/100 patient-years and 8.5/100 patient-years fewer infections and 5.4/100 patient-years and 5.2/100 patient-years fewer COVID-19-related deaths among patients vaccinated with mRNA and viral-carrier vaccines, respectively, compared with matched unvaccinated controls. Estimated vaccine effectiveness at days 15, 30, 60, and 90 after the first dose of an mRNA vaccine was: for infection, 41.3%, 54.5%, 72.6%, and 83.5%; for death, 33.1%, 55.4%, 80.1%, and 91.2%. Estimated vaccine effectiveness after the first dose of a viral-carrier vaccine was: for infection, 38.3% without increasing over time; for death, 56.6%, 75.3%, 92.0%, and 97.4%.
    CONCLUSION: In this large, real-world cohort of hemodialyzed patients, mRNA and viral-carrier COVID-19 vaccines were associated with reduced COVID-19-related mortality. Additionally, we observed a strong reduction in SARS-CoV-2 infection in hemodialysis patients receiving mRNA vaccines.

  • Jeroen Peter Kooman, Paola Carioni, Vratislava Kovarova, Otto Arkossy, Anke Winter, Yan Zhang, Francesco Bellocchio, Peter Kotanko, Hanjie Zhang, Len Usvyat, John Larkin, Stefano Stuard, Luca Neri

    INTRODUCTION: Patients with end-stage kidney disease face a higher risk of severe outcomes from SARS-CoV-2 infection. Moreover, it is not well known to what extent potentially modifiable risk factors contribute to mortality risk. In this historical cohort study, we investigated the incidence of and risk factors for 30-day mortality among hemodialysis patients with SARS-CoV-2 infection treated in the European Fresenius Medical Care NephroCare network, using conventional and machine learning techniques.
    METHODS: We included adult hemodialysis patients with a first documented SARS-CoV-2 infection between February 1, 2020, and March 31, 2021, registered in the clinical database. The index date for the analysis was the first SARS-CoV-2 suspicion date. Patients were followed for up to 30 days until April 30, 2021. Demographics, comorbidities, and various modifiable risk factors, expressed as continuous parameters and as key performance indicators (KPIs), were considered to tap multiple dimensions, including hemodynamic control, nutritional state, and mineral metabolism, in the 6 months before the index date. We used logistic regression (LR) and XGBoost models to assess risk factors for 30-day mortality.
    RESULTS: We included 9,211 patients (age 65.4 ± 13.7 years, dialysis vintage 4.2 ± 3.7 years) eligible for the study. The 30-day mortality rate was 20.8%. In LR models, several potentially modifiable factors were associated with higher mortality: body mass index (BMI) 30-40 kg/m2 (OR: 1.28, CI: 1.10-1.50), single-pool Kt/V (OR off-target vs on-target: 1.19, CI: 1.02-1.38), overhydration (OR: 1.15, CI: 1.01-1.32), and both low (<2.5 mg/dl) and high (≥5.5 mg/dl) serum phosphate levels (OR: 1.52, CI: 1.07-2.16 and OR: 1.17, CI: 1.01-1.35). Online hemodiafiltration was protective in the model using KPIs (OR: 0.86, CI: 0.76-0.97). SHapley Additive exPlanations (SHAP) analysis of the XGBoost models likewise showed a strong influence on prediction for several modifiable factors, including inflammatory parameters, high BMI, and fluid overload. In both LR and XGBoost models, age, gender, and comorbidities were strongly associated with mortality.
    CONCLUSION: Both conventional and machine learning techniques showed that KPIs and modifiable risk factors in different dimensions, ascertained 6 months before the COVID-19 suspicion date, were associated with 30-day COVID-19-related mortality. Our results suggest that adequate dialysis and achieving KPI targets remain of major importance during the COVID-19 pandemic as well.
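
    A sketch of how the SHAP step mentioned in RESULTS could be reproduced for a fitted XGBoost model: rank features by mean absolute SHAP value. It assumes a trained tree model and a pandas feature matrix X; this is an illustration, not the authors' pipeline.

    ```python
    # Sketch of ranking risk factors in a fitted XGBoost model by mean
    # absolute SHAP value, assuming a trained tree model and a pandas
    # feature matrix X. Illustration only, not the authors' pipeline.
    import numpy as np
    import shap

    def top_risk_factors(model, X, k=10):
        shap_values = shap.TreeExplainer(model).shap_values(X)
        importance = np.abs(shap_values).mean(axis=0)
        order = np.argsort(importance)[::-1][:k]
        return [(X.columns[i], float(importance[i])) for i in order]
    ```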

  • Bernard Canaud, Jeroen Kooman, Andreas Maierhofer, Jochen Raimann, Jens Titze, Peter Kotanko

    New physiologic findings related to sodium homeostasis and their pathophysiologic associations require a new vision for sodium, fluid, and blood pressure management in dialysis-dependent chronic kidney disease patients. The traditional dry weight probing approach that has prevailed for many years must be reviewed in light of these findings and enriched by the availability of new tools for monitoring and handling sodium and water imbalances. A comprehensive and integrated approach is needed to further improve cardiac health in hemodialysis (HD) patients. Adequate management of sodium, water, volume, and hemodynamics in HD patients relies on a stepwise approach: the first step entails assessment and monitoring of fluid status and relies on clinical judgement supported by specific tools, either embedded online in the HD machine or used offline; the second consists of correcting fluid imbalance, mainly through the dialysis prescription (treatment time, active tools embedded in the HD machine) but also through guidance on diet and thirst management; the third consists of fine-tuning the treatment prescription to patient responses and tolerance with the support of innovative tools such as artificial intelligence and remote pervasive health trackers. It is time to return to sodium and water imbalance as the root cause of the problem and not to act primarily on its consequences (fluid overload, hypertension) or organ damage (heart, atherosclerosis, brain). We know the problem and have the tools to assess and manage sodium and fluid in HD patients more precisely. We strongly call for a sodium-first approach to reduce disease burden and improve cardiac health in dialysis-dependent chronic kidney disease patients.

  • Gudrun Schappacher-Tilp, Peter Kotanko, Markus Pirklbauer

    Altered parathyroid gland biology is a major driver of chronic kidney disease-mineral bone disorder (CKD-MBD) in patients with chronic kidney disease. CKD-MBD is associated with a high risk of vascular calcification and cardiovascular events. A hallmark of CKD-MBD is secondary hyperparathyroidism with increased parathyroid hormone (PTH) synthesis and release and reduced expression of calcium-sensing receptors on the surface of parathyroid cells and eventually hyperplasia of parathyroid gland cells. The KDIGO guidelines strongly recommend the control of PTH in hemodialysis patients. Due to the complexity of parathyroid gland biology, mathematical models have been employed to study the interaction of PTH regulators and PTH plasma concentrations. Here, we present an overview of various model approaches and discuss the impact of different model structures and complexities on the clinical use of these models.

  • Zahin Haq, Xin Wang, Qiuqiong Cheng, Gabriela F Dias, Christoph Moore, Dorothea Piecha, Peter Kotanko, Chih-Hu Ho, Nadja Grobe

    Bisphenol A (BPA)-based materials are used in the manufacturing of hemodialyzers, including their polycarbonate (PC) housings and polysulfone (PS) membranes. As concerns about BPA's adverse health effects rise, regulation of BPA exposure is becoming more rigorous. Therefore, BPA alternatives, such as bisphenol S (BPS), are increasingly used. It is important to understand the patient risk of BPA and BPS exposure through dialyzer use during hemodialysis. Here, we report the bisphenol levels in extractables and leachables obtained from eight dialyzers currently on the market, including high-flux and medium cut-off membranes. A targeted liquid chromatography-mass spectrometry strategy utilizing stable isotope-labeled internal standards provided reliable data for quantitation with the standard addition method. BPA ranging from 0.43 to 32.82 µg/device and BPS ranging from 0.02 to 2.51 µg/device were detected in dialyzers made with BPA- and BPS-containing materials, except for the novel FX CorAL 120 dialyzer. BPA and BPS were also not detected in bloodline controls and cellulose-based membranes. Based on the currently established tolerable intake (6 µg/kg/day), the resulting margin of safety indicates that adverse effects are unlikely to occur in hemodialysis patients exposed to the BPA and BPS levels quantified herein. As new data and information about the toxicity of BPA and BPS become available, the patient safety limits for BPA and BPS in these dialyzers may need re-evaluation in the future.
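
    The margin-of-safety logic can be illustrated with the numbers reported above: worst-case BPA release per device against the 6 µg/kg/day tolerable intake. The body weight and treatment frequency below are assumptions for illustration.

    ```python
    # Worked margin-of-safety example using the numbers reported above;
    # body weight and treatment frequency are assumptions for illustration.
    bpa_per_device_ug = 32.82         # highest measured level (abstract)
    tolerable_intake_ug_kg_day = 6.0  # tolerable intake cited in the abstract
    body_weight_kg = 70.0             # assumed
    treatments_per_week = 3           # typical hemodialysis schedule

    daily_exposure = bpa_per_device_ug * treatments_per_week / 7 / body_weight_kg
    margin_of_safety = tolerable_intake_ug_kg_day / daily_exposure
    print(f"exposure {daily_exposure:.3f} ug/kg/day, MOS ~ {margin_of_safety:.0f}")
    ```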

  • Karlien J Ter Meulen, Xiaoling Ye, Yuedong Wang, Len A Usvyat, Frank M van der Sande, Constantijn J Konings, Peter Kotanko, Jeroen P Kooman, Franklin W Maddux

    KEY POINTS: An increase in serum phosphate variability is an independent risk factor for mortality. The effect of a positive directional range (DR) is most pronounced in patients with high serum phosphate levels, whereas the effect of a negative DR is most pronounced in patients with low serum phosphate and/or serum albumin.
    BACKGROUND: In maintenance hemodialysis (HD) patients, previous studies have shown that serum phosphate levels have a bidirectional relation to outcome. Less is known about the temporal dynamics of serum phosphate in relation to outcome. We aimed to further explore the relation between serum phosphate variability and all-cause mortality.
    METHODS: All adult incident HD patients treated in US Fresenius Kidney Care clinics between January 2010 and October 2018 were included. The baseline period was defined as the 6 months after initiation of HD, and months 7-18 served as the follow-up period. All-cause mortality was recorded during the follow-up period. The primary metric of variability was the directional range (DR), that is, the difference between the largest and smallest values within a time period; DR was positive when the smallest value preceded the largest and negative otherwise. Cox proportional hazards models with spline terms were applied to explore the association between phosphate, DR, and all-cause mortality. In addition, tensor product smoothing splines were computed to further elucidate the interactions of phosphate, DR, and all-cause mortality.
    RESULTS: We included 302,613 patients. Baseline phosphate was 5.1±1.2 mg/dl, and mean DR was +0.6±3.3 mg/dl. Across different levels of phosphate, higher DR of phosphate was associated with a higher risk of all-cause mortality. In patients with lower levels of phosphate and serum albumin, the effect of a negative DR was most pronounced, whereas in patients with higher phosphate levels, a positive DR was related to increased mortality.
    CONCLUSIONS: Higher variability of serum phosphate is related to mortality at all levels of phosphate, especially at lower levels with a negative DR and at low serum albumin levels. This could possibly reflect dietary intake in patients who are already inflamed or malnourished, in whom a further reduction in serum phosphate should prompt nutritional evaluation.
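
    The directional range (DR) defined in METHODS is straightforward to compute; a minimal sketch, assuming one array of phosphate values per patient per period:

    ```python
    # Directional range (DR) as defined in METHODS: largest minus smallest
    # value in a period, signed by which occurred first.
    import numpy as np

    def directional_range(values):
        values = np.asarray(values, dtype=float)
        i_min, i_max = np.argmin(values), np.argmax(values)
        dr = values[i_max] - values[i_min]
        return dr if i_min <= i_max else -dr

    print(directional_range([4.8, 5.6, 6.2, 5.1]))  # +1.4, low preceded high
    print(directional_range([6.2, 5.6, 4.4, 5.1]))  # -1.8, high preceded low
    ```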

  • Richard V Remigio, Hyeonjin Song, Jochen G Raimann, Peter Kotanko, Frank W Maddux, Rachel A Lasky, Xin He, Amir Sapkota

    BACKGROUND: Nonadherence to hemodialysis appointments can result in health complications that influence morbidity and mortality. We examined the association between different types of inclement weather and hemodialysis appointment adherence.
    METHODS: We analyzed health records of 60,135 patients with kidney failure who received in-center hemodialysis treatment at Fresenius Kidney Care clinics across Northeastern US counties during 2001-2019. County-level daily meteorological data on rainfall, hurricane and tropical storm events, snowfall, snow depth, and wind speed were extracted from National Oceanic and Atmospheric Administration data sources. A time-stratified case-crossover study design with conditional Poisson regression was used to estimate the effect of inclement weather exposures within the Northeastern US region. We applied a distributed lag nonlinear model framework to evaluate the delayed effect of inclement weather for up to 1 week.
    RESULTS: We observed positive associations between inclement weather and missed appointments (rainfall, hurricane and tropical storm, snowfall, snow depth, and wind advisory) compared with non-inclement weather days. The risk of missed appointments was most pronounced on the day of inclement weather (lag 0) for rainfall (incidence rate ratio [RR], 1.03 per 10 mm of rainfall; 95% confidence interval [CI], 1.02 to 1.03) and snowfall (RR, 1.02; 95% CI, 1.01 to 1.02). Over 7 days (lags 0-6), hurricane and tropical storm exposures were associated with a 55% higher risk of missed appointments (RR, 1.55; 95% CI, 1.22 to 1.98). Similarly, 7-day cumulative exposure to sustained wind advisories was associated with a 29% higher risk (RR, 1.29; 95% CI, 1.25 to 1.31), while wind gust advisories showed a 34% higher risk (RR, 1.34; 95% CI, 1.29 to 1.39) of missed appointments.
    CONCLUSIONS: Inclement weather was associated with a higher risk of missed hemodialysis appointments in the Northeastern United States. Furthermore, the association between inclement weather and missed appointments persisted for several days, depending on the inclement weather type.
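
    The distributed-lag analysis in METHODS requires exposure values at lags 0-6 for each observation. A minimal sketch of building such lagged columns is shown below; the county/date column names are hypothetical, and the spline basis and conditional Poisson fit of a full DLNM are omitted.

    ```python
    # Sketch of building lag 0-6 exposure columns for a distributed-lag
    # analysis; county/date column names are hypothetical, and the spline
    # basis and conditional Poisson fit of a full DLNM are omitted.
    import pandas as pd

    def add_lags(df, exposure_col, max_lag=6):
        df = df.sort_values(["county", "date"]).copy()
        for lag in range(max_lag + 1):
            df[f"{exposure_col}_lag{lag}"] = (
                df.groupby("county")[exposure_col].shift(lag)
            )
        return df
    ```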

  • Lin-Chun Wang, Leticia M Tapia, Xia Tao, Joshua E Chao, Ohnmar Thwin, Hanjie Zhang, Stephan Thijssen, Peter Kotanko, Nadja Grobe

    INTRODUCTION: Constipation is prevalent in patients with kidney failure, partly due to the use of medication such as phosphate binders. We hypothesized that serum levels of gut microbiome-derived uremic toxins (UTOX) may be affected by the choice of phosphate binder, putatively through its impact on colonic transit time. We investigated two commonly prescribed phosphate binders, sevelamer carbonate (SEV) and sucroferric oxyhydroxide (SFO), and their association with gut microbiome-derived UTOX levels in hemodialysis (HD) patients.
    METHODS: Weekly blood samples were collected from 16 anuric HD participants during the 5-week observational period. All participants were on active phosphate binder monotherapy with either SFO or SEV for at least 4 weeks prior to enrollment. Eight UTOX (7 gut microbiome-derived) and tryptophan were quantified using liquid chromatography-mass spectrometry. Serum phosphorus, nutritional, and liver function markers were also measured. For each substance, weekly individual levels, the median concentration per participant, and differences between the SFO and SEV groups were reported. Patient-reported bowel movements, assessed by the Bristol Stool Scale (BSS), and pill usage were recorded weekly.
    RESULTS: The SEV group reported a 3.3-fold higher frequency of BSS stool types 1 and 2 (more likely constipated, p < 0.05), whereas the SFO group reported a 1.5-fold higher frequency of BSS stool types 5-7 (more likely loose stool and diarrhea, not significant). Participants in the SFO group showed a trend toward better adherence to phosphate binder therapy (SFO: 87.6% vs. SEV: 66.6%, not significant). UTOX, serum phosphorus, nutritional and liver function markers, and tryptophan did not differ between the two groups.
    CONCLUSION: There was no difference in gut microbiome-derived UTOX levels between the phosphate binders (SFO vs. SEV), despite SFO therapy resulting in fewer constipated participants. This pilot study may inform the design of future clinical trials and highlights the importance of including factors beyond bowel habits and their association with UTOX levels.

  • Priscila Preciado, Laura Rosales Merlo, Hanjie Zhang, Jeroen P Kooman, Frank M van der Sande, Peter Kotanko

    INTRODUCTION: In maintenance hemodialysis (HD) patients, low central venous oxygen saturation (ScvO2) and a small decline in relative blood volume (RBV) have been associated with adverse outcomes. Here we explore the joint association between ScvO2 and RBV change in relation to all-cause mortality.
    METHODS: We conducted a retrospective study in maintenance HD patients with central venous catheters as vascular access. During a 6-month baseline period, Crit-Line (Fresenius Medical Care, Waltham, MA) was used to continuously measure intradialytic ScvO2 and hematocrit-based RBV. We defined four groups by the median change of RBV and the median ScvO2. Patients with ScvO2 above the median and RBV change below the median served as the reference. The follow-up period was 3 years. We constructed Cox proportional hazards models with adjustment for age, diabetes, and dialysis vintage to assess the association between ScvO2, RBV, and all-cause mortality during follow-up.
    FINDINGS: Baseline comprised 5231 dialysis sessions in 216 patients. The median RBV change was -5.5% and the median ScvO2 was 58.8%. During follow-up, 44 patients (20.4%) died. In the adjusted model, all-cause mortality was highest in patients with ScvO2 below the median and RBV change above the median (HR 6.32; 95% confidence interval [CI] 1.37-29.06), followed by patients with ScvO2 below the median and RBV change below the median (HR 5.04; 95% CI 1.14-22.35), and ScvO2 above the median and RBV change above the median (HR 4.52; 95% CI 0.95-21.36).
    DISCUSSION: Concurrent combined monitoring of intradialytic ScvO2 and RBV change may provide additional insight into a patient's circulatory status. Patients with low ScvO2 and small changes in RBV may represent a specifically vulnerable group at particularly high risk for adverse outcomes, possibly related to poor cardiac reserve and fluid overload.

  • Advances in kidney disease and health

    23 Apr 2023 Deep Learning for Image Analysis in Kidney Care

    Hanjie Zhang, Max Botler, Jeroen P Kooman

    Analysis of medical images, such as radiological images or tissue specimens, is an indispensable part of medical diagnostics. Conventionally done manually, the process can be time-consuming and prone to interobserver variability. Image classification and segmentation by deep learning strategies, predominantly convolutional neural networks, may provide a significant advance in the diagnostic process. In renal medicine, most evidence has been generated around the radiological assessment of renal abnormalities and the segmentation of renal biopsy specimens for histological analysis. In this article, the basic principles of image analysis by convolutional neural networks and their system architecture are discussed, in combination with examples of their use in image analysis in nephrology.
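
    As an illustration of the kind of architecture the article discusses, a minimal convolutional network for classifying single-channel medical images is sketched below in PyTorch; the layer sizes and input resolution are illustrative choices, not taken from the article.

    ```python
    # Minimal convolutional network for single-channel medical images;
    # layer sizes and input resolution are illustrative, not from the article.
    import torch
    import torch.nn as nn

    class TinyCNN(nn.Module):
        def __init__(self, n_classes=2):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),  # 64x64 -> 32x32
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),  # 32x32 -> 16x16
            )
            self.classifier = nn.Linear(32 * 16 * 16, n_classes)

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    logits = TinyCNN()(torch.randn(4, 1, 64, 64))  # four 64x64 images
    ```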

  • Advances in kidney disease and health

    17 Apr 2023 Omics and Artificial Intelligence in Kidney Diseases

    Nadja Grobe, Josef Scheiber, Hanjie Zhang, Christian Garbe, Xiaoling Wang

    Omics applications in nephrology may become relevant to improving the clinical care of kidney disease patients. In the short term, patients will benefit from specific measurements and computational analyses around biomarkers identified at the various omics levels. In the mid and long term, these approaches will need to be integrated into a holistic representation of the kidney and all its influencing factors for individualized patient care. Research provides robust data justifying the application of omics for better understanding, risk stratification, and individualized treatment of kidney disease patients. Despite these advances in the research setting, there is still a lack of evidence showing that the combination of omics technologies with artificial intelligence can be applied in clinical diagnostics and the care of patients with kidney disease.

  • Sheetal Chaudhuri, John Larkin, Murilo Guedes, Yue Jiao, Peter Kotanko, Yuedong Wang, Len Usvyat, Jeroen P Kooman

    INTRODUCTION: Several factors affect the survival of end-stage kidney disease (ESKD) patients on dialysis. Machine learning (ML) models may help tackle multivariable and complex, often non-linear predictors of adverse clinical events in ESKD patients. In this study, we used an advanced ML method as well as a traditional statistical method to develop and compare risk factors for a mortality prediction model in hemodialysis (HD) patients.
    MATERIALS AND METHODS: We included data from HD patients who had data across a baseline period of at least 1 year and 1 day in the internationally representative Monitoring Dialysis Outcomes (MONDO) Initiative dataset. Twenty-three input parameters considered in the model were chosen in an a priori manner. The prediction model used 1 year of baseline data to predict death in the following 3 years. The dataset was randomly split into 80% training data and 20% testing data for model development. Two different modeling techniques were used to build the mortality prediction model.
    FINDINGS: A total of 95,142 patients were included in the analysis sample. The area under the receiver operating characteristic curve (AUROC) of the XGBoost ML model was 0.84 on the training data and 0.80 on the test data. The AUROC of the logistic regression model was 0.73 on the training data and 0.75 on the test data. Four of the top five predictors were common to both modeling strategies.
    DISCUSSION: In the internationally representative MONDO data for HD patients, we describe the development of an ML model and a traditional statistical model that were suitable for classifying a prevalent HD patient's 3-year risk of death. While both models had reasonably high AUROCs, the ML model identified levels of hematocrit (HCT) as an important risk factor for mortality. If implemented in clinical practice, such proof-of-concept models could be used to provide pre-emptive care for HD patients.

  • Richard V Remigio, Hao He, Jochen G Raimann, Peter Kotanko, Frank W Maddux, Amy Rebecca Sapkota, Xin-Zhong Liang, Robin Puett, Xin He, Amir Sapkota

    BACKGROUND: An increasing number of studies have linked air pollution exposure with renal function decline and disease. However, there is a lack of data on its impact among end-stage kidney disease (ESKD) patients and on the potential modifying effect of extreme heat events (EHE).
    METHODS: Fresenius Kidney Care records from 28 selected northeastern US counties were used to pool daily all-cause mortality (ACM) and all-cause hospital admission (ACHA) counts. County-level daily ambient PM2.5 and ozone (O3) were estimated using a high-resolution spatiotemporal coupled climate-air quality model and matched to ESKD patients based on ZIP codes of treatment sites. We used time-stratified case-crossover analyses to characterize acute exposures using individual and cumulative lag exposures for up to 3 days (lags 0-3) within a distributed lag nonlinear model framework. We used a nested model comparison hypothesis test to evaluate interaction effects between air pollutants and EHE, and stratification analyses to estimate effect measures modified by EHE days.
    RESULTS: From 2001 to 2016, the sample population consisted of 43,338 ESKD patients. We recorded 5217 deaths and 78,433 hospital admissions. A 10-unit increase in PM2.5 concentration was associated with a 5% increase in ACM (rate ratio [RR] at lags 0-3: 1.05, 95% CI: 1.00-1.10), and same-day O3 with a 2% increase (RR at lag 0: 1.02, 95% CI: 1.01-1.03), after adjusting for extreme heat exposures. Mortality models suggest evidence of interaction and effect measure modification, though not always simultaneously. ACM risk increased by up to 8% when daily ozone concentrations exceeded the National Ambient Air Quality Standards established by the United States, but the increases in risk were considerably higher on EHE days across lag periods.
    CONCLUSION: Our findings suggest interdependent effects of EHE and air pollution among ESKD patients on all-cause mortality risk. National-level assessments are needed to consider the ESKD population as a sensitive population and to inform treatment protocols during extreme heat and degraded pollution episodes.

  • Bioengineering (Basel, Switzerland)

    28 Feb 2023 Hemodiafiltration: Technical and Medical Insights

    Thomas Lang, Adam M Zawada, Lukas Theis, Jennifer Braun, Bertram Ottillinger, Pascal Kopperschmidt, Alfred Gagel, Peter Kotanko, Manuela Stauss-Grabo, James P Kennedy, Bernard Canaud

    Despite significant medical and technical improvements in the field of dialytic renal replacement modalities, morbidity and mortality are excessively high among patients with end-stage kidney disease, and most interventional studies have yielded disappointing results. Hemodiafiltration, a dialysis method that was implemented in clinics many years ago and that combines the two main principles of hemodialysis and hemofiltration (diffusion and convection), has had a positive impact on mortality rates, especially when delivered in a high-volume mode as a surrogate for a high convective dose. The achievement of high substitution volumes during dialysis treatments depends not only on patient characteristics but also on the dialyzer (membrane) and an adequately equipped hemodiafiltration machine. The present review article summarizes the technical aspects of online hemodiafiltration and discusses present and ongoing clinical studies with regard to hard clinical and patient-reported outcomes.

  • David J Jörg, Doris H Fuertinger, Peter Kotanko

    Patients with renal anemia are frequently treated with erythropoiesis-stimulating agents (ESAs), which are dynamically dosed in order to stabilize blood hemoglobin levels within a specified target range. During typical ESA treatments, a fraction of patients experience hemoglobin 'cycling' periods during which hemoglobin levels periodically over- and undershoot the target range. Here we report a specific mechanism of hemoglobin cycling, whereby cycles emerge from the patient's delayed physiological response to ESAs and concurrent ESA dose adjustments. We introduce a minimal theoretical model that can explain dynamic hallmarks of observed hemoglobin cycling events in clinical time series and elucidates how physiological factors (such as red blood cell lifespan and ESA responsiveness) and treatment-related factors (such as dosing schemes) affect cycling. These results show that in general, hemoglobin cycling cannot be attributed to patient physiology or ESA treatment alone but emerges through an interplay of both, with consequences for the design of ESA treatment strategies.
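
    A minimal simulation can illustrate the cycling mechanism described above: when hemoglobin responds to ESA dose changes only after a delay, protocol-driven dose titration can over- and undershoot the target band. The gains, delay, and titration rule below are illustrative assumptions, not the authors' published model.

    ```python
    import numpy as np

    T = 730                     # days simulated
    delay = 28                  # days until a dose change affects hemoglobin (assumed)
    lifespan = 70.0             # effective RBC lifespan proxy, days (assumed)
    target_lo, target_hi = 10.0, 12.0   # target band, g/dL

    hb = np.empty(T); hb[0] = 9.0
    dose = np.empty(T); dose[0] = 1.0   # arbitrary dose units

    for t in range(1, T):
        d = dose[t - 1]
        if t % 28 == 0:                 # protocolized 4-weekly titration
            if hb[t - 1] < target_lo:
                d *= 1.25
            elif hb[t - 1] > target_hi:
                d *= 0.75
        dose[t] = d
        production = 0.16 * dose[max(t - delay, 0)]   # delayed erythropoiesis
        hb[t] = hb[t - 1] + production - hb[t - 1] / lifespan

    # With a long enough delay, the trajectory typically over- and
    # undershoots the target band, i.e., hemoglobin "cycles".
    print(f"Hb range over final year: {hb[-365:].min():.1f}-{hb[-365:].max():.1f} g/dL")
    ```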

  • Caitlin K Monaghan, John W Larkin, Sheetal Chaudhuri, Hao Han, Yue Jiao, Kristine M Bermudez, Eric D Weinhandl, Ines A Dahne-Steuber, Kathleen Belmonte, Luca Neri, Peter Kotanko, Jeroen P Kooman, Jeffrey L Hymes, Robert J Kossmann, Len A Usvyat, Franklin W Maddux

    BACKGROUND: We developed a machine learning (ML) model that predicts the risk of a patient on hemodialysis (HD) having an undetected SARS-CoV-2 infection that is identified in the subsequent ≥3 days. METHODS: As part of a healthcare operations effort, we used patient data from a national network of dialysis clinics (February-September 2020) to develop an ML model (XGBoost) that uses 81 variables to predict the likelihood of an adult patient on HD having an undetected SARS-CoV-2 infection that is identified in the subsequent ≥3 days. We used a 60%:20%:20% randomized split of COVID-19-positive samples for the training, validation, and testing datasets. RESULTS: We used a select cohort of 40,490 patients on HD to build the ML model (11,166 patients who were COVID-19 positive and 29,324 unaffected controls). The prevalence of COVID-19 in the cohort (28% COVID-19 positive) was, by design, higher than in the general HD population. The prevalence of COVID-19 was set to 10% in the testing dataset to approximate the prevalence observed in the national HD population. The threshold for classifying observations as positive or negative was set at 0.80 to minimize false positives. Precision for the model was 0.52, recall was 0.07, and lift was 5.3 in the testing dataset. The area under the receiver operating characteristic curve (AUROC) and area under the precision-recall curve (AUPRC) for the model were 0.68 and 0.24 in the testing dataset, respectively. Top predictors of a patient on HD having a SARS-CoV-2 infection were the change in interdialytic weight gain from the previous month, mean pre-HD body temperature in the prior week, and the change in post-HD heart rate from the previous month. CONCLUSIONS: The developed ML model appears suitable for predicting patients on HD at risk of having COVID-19 at least 3 days before there would be a clinical suspicion of the disease.
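
    The evaluation choices reported above (a 0.80 classification threshold, a 10% test prevalence, and precision, recall, and lift) are easy to reproduce mechanically. Below is a hedged sketch on simulated scores; the score distribution is an assumption made only so the arithmetic has something to run on.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    y_true = rng.random(n) < 0.10                 # 10% simulated prevalence
    # toy scores: positives tend to score higher (assumption for the demo)
    y_score = np.clip(rng.normal(0.35 + 0.25 * y_true, 0.2), 0, 1)

    threshold = 0.80
    y_pred = y_score >= threshold

    tp = np.sum(y_pred & y_true)
    fp = np.sum(y_pred & ~y_true)
    fn = np.sum(~y_pred & y_true)

    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    lift = precision / y_true.mean()   # precision relative to base prevalence

    print(f"precision={precision:.2f} recall={recall:.2f} lift={lift:.1f}")
    ```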

  • Xiaoling Wang, Amrish Patel, Lela Tisdale, Zahin Haq, Xiaoling Ye, Rachel Lasky, Priscila Preciado, Xia Tao, Gabriela Ferreira Dias, Joshua E Chao, Mohamad Hakim, Maggie Han, Ohnmar Thwin, Jochen Raimann, Dinesh Chatoth, Peter Kotanko, Nadja Grobe

    BACKGROUND: To date, it is unclear whether SARS-CoV-2 is present in spent dialysate from patients with COVID-19 on peritoneal dialysis (PD). Our aim was to assess the presence or absence of SARS-CoV-2 in spent dialysate from patients on chronic PD who had a confirmed diagnosis of COVID-19. METHODS: Spent PD dialysate samples from patients on PD who were positive for COVID-19 were collected between March and August 2020. The multiplexed, real-time RT-PCR assay contained primer/probe sets specific to different SARS-CoV-2 genomic regions and to bacteriophage MS2 as an internal process control for nucleic acid extraction. Demographic and clinical data were obtained from patients' electronic health records. RESULTS: A total of 26 spent PD dialysate samples were collected from 11 patients at ten dialysis centers. Spent PD dialysate samples were collected, on average, 25±13 days (median, 20; range, 10-45) after the onset of symptoms. The temporal distance of PD effluent collection relative to the closest positive nasal-swab RT-PCR result was 15±11 days (median, 14; range, 1-41). All 26 PD effluent samples tested negative at three SARS-CoV-2 genomic regions. CONCLUSIONS: Our findings indicate the absence of SARS-CoV-2 in spent PD dialysate collected ≥10 days after the onset of COVID-19 symptoms. We cannot rule out the presence of SARS-CoV-2 in spent PD dialysate in the early stage of COVID-19.

  • Murilo Guedes, Brian Bieber, Indranil Dasgupta, Almudena Vega, Kosaku Nitta, Steven Brunelli, John Hartman, Jochen G Raimann, Bruce M Robinson, Ronald L Pisoni

    Mineral bone disorder (MBD) is a frequent consequence of chronic kidney disease, more so in patients with kidney failure treated by kidney replacement therapy. Despite the wide availability of interventions to control serum phosphate and parathyroid hormone levels, gaps remain concerning optimal targets and best practices, leading to international practice pattern variations over time. In this Special Report, we describe international trends from the Dialysis Outcomes and Practice Patterns Study (DOPPS) for MBD biomarkers and treatments from 2002-2021, including data from a group of 7 European countries (Belgium, France, Germany, Italy, Spain, Sweden, United Kingdom), Japan, and the United States. From 2002-2012, mean phosphate levels declined in Japan (5.6 to 5.2 mg/dL), Europe (5.5 to 4.9 mg/dL), and the United States (5.7 to 5.0 mg/dL). Since then, levels have risen in the United States (to a mean of 5.6 mg/dL in 2021), were stable in Japan (5.3 mg/dL), and declined in Europe (4.8 mg/dL). In 2021, 52% (United States), 27% (Europe), and 39% (Japan) of patients had phosphate >5.5 mg/dL. In the United States, overall phosphate binder use was stable (80%-84% over 2015-2021), and parathyroid hormone levels rose only modestly. Although these results potentially stem from pervasive knowledge gaps in clinical practice, the noteworthy steady increase in serum phosphate in the United States over the past decade may be consequential to patient outcomes, an uncertainty that hopefully will soon be addressed by ongoing clinical trials. The DOPPS will continue to monitor international trends as new interventions and strategies for MBD management in chronic kidney disease emerge.

  • Paulo Paneque Galuzio, Alhaji Cherif

    We reviewed some of the latest advancements in the use of mathematical models in nephrology. We examined 2 distinct categories of mathematical models that are widely used in biological research and pointed out some of their strengths and weaknesses when applied to health care, especially in the context of nephrology. A mechanistic dynamical system allows the representation of causal relations among the system variables, but with a more complex and longer development/implementation phase. Artificial intelligence/machine learning provides predictive tools that allow the identification of correlative patterns in large data sets, but they are usually harder-to-interpret black boxes. Chronic kidney disease (CKD), a major worldwide health problem, generates copious quantities of data that can be leveraged through the choice of an appropriate model; in addition, a large number of dialysis parameters that must be determined at every treatment session can benefit from predictive mechanistic models. The next important steps in the use of mathematical methods in medical science may lie at the intersection of these seemingly antagonistic frameworks, leveraging the strengths of each to provide better care.

  • Dalia E Yousif, Xiaoling Ye, Stefano Stuard, Juan Berbessi, Adrian M Guinsburg, Len A Usvyat, Jochen G Raimann, Jeroen P Kooman, Frank M van der Sande, Neill Duncan, Kevin J Woollard, Rupert Bright, Charles Pusey, Vineet Gupta, Joachim H Ix, Peter Kotanko, Rakesh Malhotra

    INTRODUCTION: Inflammation is highly prevalent among patients with end-stage kidney disease and is associated with adverse outcomes. We aimed to investigate longitudinal changes in inflammatory markers in a diverse international population of incident hemodialysis patients. METHODS: The MONitoring Dialysis Outcomes (MONDO) Consortium encompasses hemodialysis databases from 31 countries in Europe, North America, South America, and Asia. The MONDO database was queried for inflammatory markers (total white blood cell count [WBC], neutrophil count, lymphocyte count, serum albumin, and C-reactive protein [CRP]) and hemoglobin levels in incident hemodialysis patients. Laboratory parameters were measured every month. Patients were stratified by survival time (≤6 months, >6 to 12 months, >12 to 18 months, >18 to 24 months, >24 to 30 months, >30 to 36 months, and >36 months) following dialysis initiation. We used cubic B-spline basis functions to evaluate temporal changes in inflammatory parameters in relation to patient survival. RESULTS: We studied 18,726 incident hemodialysis patients. Their age at dialysis initiation was 71.3 ± 11.9 years; 10,802 (58%) were males. Within the first 6 months, 2068 (11%) patients died, and 12,295 patients (67%) survived >36 months (survivor cohort). Hemodialysis patients who died showed a distinct biphasic pattern of change in inflammatory markers, where an initial decline of inflammation was followed by a rapid rise that was consistently evident approximately 6 months before death. This pattern was similar in all patients who died and was consistent across the survival time intervals. In contrast, in the survivor cohort, we observed an initial decline of inflammation followed by sustained low levels of inflammatory biomarkers. CONCLUSION: Our international study of incident hemodialysis patients highlights a temporal relationship between serial measurements of inflammatory markers and patient survival. This finding may inform the development of prognostic models, such as the integration of dynamic changes in inflammatory markers for individual risk profiling and guiding preventive and therapeutic interventions.
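
    Spline smoothing of monthly laboratory trajectories, as used above, can be sketched in a few lines. The following hedged example fits a cubic B-spline to simulated monthly CRP values; the trajectory shape and smoothing factor are assumptions for illustration only.

    ```python
    import numpy as np
    from scipy.interpolate import splrep, splev

    rng = np.random.default_rng(1)
    months = np.arange(0, 36)                       # months since HD initiation
    # toy trajectory: early decline, late pre-terminal rise (assumption)
    trend = 20 * np.exp(-months / 6) + 0.02 * (months - 24).clip(0) ** 3
    crp = trend + rng.normal(0, 2, months.size)     # mg/L with noise

    tck = splrep(months, crp, k=3, s=months.size * 4)   # smoothed cubic B-spline
    grid = np.linspace(0, 35, 200)
    fitted = splev(grid, tck)

    print(f"fitted CRP at month 6:  {float(splev(6, tck)):.1f} mg/L")
    print(f"fitted CRP at month 34: {float(splev(34, tck)):.1f} mg/L")
    ```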

  • Gabriela F Dias, Sara S Tozoni, Gabriela Bohnen, Beatriz A K van Spitzenbergen, Nadja Grobe, Lia S Nakao, Roberto Pecoits-Filho, Peter Kotanko, Andréa N Moreno-Amaral

    Oxidative stress (OS) is essential in uremia-associated comorbidities, including renal anemia. Complications experienced by hemodialysis (HD) patients, such as hypoxemia and uremic toxin accumulation, induce OS and premature death of red blood cells (RBC). We aimed to characterize reactive oxygen species (ROS) production and antioxidant pathways in HD-RBC and RBC from healthy controls (CON-RBC) and to evaluate the role of uremia and hypoxia in these pathways. ROS production, xanthine oxidase (XO) and superoxide dismutase (SOD) activities, and glutathione (GSH) and heme oxygenase-1 (HO-1) levels were measured using flow cytometry or spectrophotometry in CON-RBC and HD-RBC (pre- and post-HD), at baseline and after 24 h incubation with uremic serum (S-HD) and/or under hypoxic conditions (5% O2). Ketoprofen was used to inhibit RBC uptake of uremic toxins. HD-RBC showed higher ROS levels and lower XO activity than CON-RBC, particularly post-HD. GSH levels were lower, while SOD activity and HO-1 levels of HD-RBC were higher than control. Hypoxia per se triggered ROS production in CON-RBC and HD-RBC. S-HD, on top of hypoxia, increased ROS levels. Inhibition of uremic toxin uptake attenuated ROS in CON-RBC and HD-RBC under hypoxia and uremia. CON-RBC in uremia and hypoxia showed lower GSH levels than cells in normoxic and non-uremic conditions. Redox mechanisms of HD-RBC are altered and prone to oxidation. Uremic toxins and hypoxia play a role in unbalancing these systems. Hypoxia and uremia participate in the pathogenesis of OS in HD-RBC and might induce RBC death and thus compound anemia.

  • Rasha Hussein, Murilo Guedes, Nada Ibraheim, Mazin M Ali, Amal El-Tahir, Nahla Allam, Hussain Abuakar, Roberto Pecoits-Filho, Peter Kotanko

    OBJECTIVES: Despite the possibility of concurrent infection with COVID-19 and malaria, little is known about the clinical course of coinfected patients. We analysed the clinical outcomes of patients with concurrent COVID-19 and malaria infection. METHODS: We conducted a retrospective cohort study that assessed prospectively collected data of all patients who were admitted between May and December 2020 to the Universal COVID-19 treatment center (UCTC), Khartoum, Sudan. UCTC compiled demographic, clinical, laboratory (including testing for malaria), and outcome data for all patients with confirmed COVID-19 hospitalized at that clinic. The primary outcome was all-cause mortality during the hospital stay. We built proportional hazards Cox models with malaria status as the main exposure and stepwise adjustment for age, sex, cardiovascular comorbidities, diabetes, and hypertension. RESULTS: We included 591 patients with a confirmed COVID-19 diagnosis who were also tested for malaria. Mean (SD) age was 58 (16.2) years; 446/591 (75.5%) were males. Malaria was diagnosed in 270/591 (45.7%) patients. Most malaria patients were infected by Plasmodium falciparum (140/270; 51.9%), while 121/270 (44.8%) were coinfected with Plasmodium falciparum and Plasmodium vivax. Median follow-up was 29 days. Crude mortality rates were 10.71 and 5.87 per 1000 person-days for patients with and without concurrent malaria, respectively. In the fully adjusted Cox model, patients with concurrent malaria and COVID-19 had a greater mortality risk (hazard ratio 1.43, 95% confidence interval 1.21-1.69). DISCUSSION: Coinfection with COVID-19 and malaria is associated with increased all-cause in-hospital mortality compared to monoinfection with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2).
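
    The adjusted Cox analysis above follows a standard pattern: a binary exposure plus stepwise covariates. A hedged sketch with the lifelines library is shown below on simulated data; the column names, effect sizes, and censoring scheme are assumptions, not the study dataset.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(2)
    n = 591
    df = pd.DataFrame({
        "malaria": (rng.random(n) < 0.457).astype(int),
        "age": rng.normal(58, 16.2, n),
        "male": (rng.random(n) < 0.755).astype(int),
    })
    # simulated in-hospital survival: higher hazard with coinfection (HR ~ e^0.36)
    hazard = 0.01 * np.exp(0.36 * df["malaria"] + 0.01 * (df["age"] - 58))
    df["time"] = rng.exponential(1.0 / hazard)
    df["died"] = (df["time"] <= 30).astype(int)      # administrative censoring
    df["time"] = df["time"].clip(upper=30)

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="died")
    print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
    ```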

  • Paulo P Galuzio, Alhaji Cherif, Xia Tao, Ohnmar Thwin, Hanjie Zhang, Stephan Thijssen, Peter Kotanko

    In patients with kidney failure treated by hemodialysis, intradialytic arterial oxygen saturation (SaO2) time series present intermittent high-frequency high-amplitude oximetry patterns (IHHOP), which correlate with observed sleep-associated breathing disturbances. A new method for identifying such intermittent patterns is proposed. The method is based on the analysis of recurrence in the time series through the quantification of an optimal recurrence threshold (ε_opt). New time series for the value of ε_opt were constructed using a rolling-window scheme, which allowed for real-time identification of the occurrence of IHHOPs. The results for the optimal recurrence threshold were compared with standard metrics used in studies of obstructive sleep apnea, namely the oxygen desaturation index (ODI) and oxygen desaturation density (ODD). A high correlation between ε_opt and the ODD was observed. Using the value of the ODI as a surrogate for the apnea-hypopnea index (AHI), it was shown that the value of ε_opt distinguishes occurrences of sleep apnea with great accuracy. When used as input to binary classifiers, this newly proposed metric has great power for predicting the occurrence of sleep apnea-related events, as evidenced by the larger-than-0.90 AUC observed for the ROC curve. Therefore, the optimal threshold ε_opt from recurrence analysis can be used as a metric to quantify the occurrence of abnormal behaviors in the arterial oxygen saturation time series.
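
    The central quantity, a recurrence threshold computed over rolling windows, can be approximated in a few lines. The sketch below defines the "optimal" threshold loosely as the distance achieving a fixed target recurrence rate, which is an assumption; the paper's actual optimality criterion may differ.

    ```python
    import numpy as np

    def recurrence_rate(x: np.ndarray, eps: float) -> float:
        d = np.abs(x[:, None] - x[None, :])      # pairwise distance matrix
        return float((d <= eps).mean())          # fraction of recurrent pairs

    def optimal_threshold(x: np.ndarray, target: float = 0.10) -> float:
        grid = np.linspace(0, np.ptp(x), 200)
        rates = np.array([recurrence_rate(x, e) for e in grid])
        return float(grid[np.argmin(np.abs(rates - target))])

    rng = np.random.default_rng(3)
    t = np.arange(600)                            # 10 min of 1-Hz samples
    stable = 97 + rng.normal(0, 0.3, t.size)      # steady oximetry
    # crude IHHOP-like signal: square-wave desaturation swings (assumption)
    ihhop = 94 + 3 * np.sign(np.sin(2 * np.pi * t / 60)) + rng.normal(0, 0.3, t.size)

    print(f"eps_opt stable window: {optimal_threshold(stable):.2f}")
    print(f"eps_opt IHHOP window:  {optimal_threshold(ihhop):.2f}")
    ```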

  • Adrián M Guinsburg, Yue Jiao, María Inés Díaz Bessone, Caitlin K Monaghan, Beatriz Magalhães, Michael A Kraus, Peter Kotanko, Jeffrey L Hymes, Robert J Kossmann, Juan Carlos Berbessi, Franklin W Maddux, Len A Usvyat, John W Larkin

    BACKGROUND: We developed machine learning models to understand the predictors of shorter-, intermediate-, and longer-term mortality among hemodialysis (HD) patients affected by COVID-19 in four countries in the Americas. METHODS: We used data from adult HD patients treated at regional institutions of a global provider in Latin America (LatAm) and North America who contracted COVID-19 in 2020, before SARS-CoV-2 vaccines were available. Using 93 commonly captured variables, we developed machine learning models that predicted the likelihood of death overall, as well as during 0-14, 15-30, and >30 days after COVID-19 presentation, and identified the importance of predictors. XGBoost models were built in parallel using the same programming, with a 60%:20%:20% random split for training, validation, and testing data, for the datasets from LatAm (Argentina, Colombia, Ecuador) and North America (United States). RESULTS: Among HD patients with COVID-19, 28.8% (1,001/3,473) died in LatAm and 20.5% (4,426/21,624) died in North America. Mortality occurred earlier in LatAm versus North America; 15.0% and 7.3% of patients died within 0-14 days, 7.9% and 4.6% within 15-30 days, and 5.9% and 8.6% more than 30 days after COVID-19 presentation, respectively. The area under the curve ranged from 0.73 to 0.83 across prediction models in both regions. Top predictors of death after COVID-19 consistently included older age, longer vintage, markers of poor nutrition, and more inflammation in both regions at all timepoints. Unique patient attributes (higher BMI, male sex) were top predictors of mortality during 0-14 and 15-30 days after COVID-19, yet not of mortality >30 days after presentation. CONCLUSIONS: Findings showed distinct profiles of mortality in COVID-19 in LatAm and North America throughout 2020. The mortality rate was higher within 0-14 and 15-30 days after COVID-19 in LatAm, while the mortality rate was higher in North America >30 days after presentation. Nonetheless, a remarkable proportion of HD patients died >30 days after COVID-19 presentation in both regions. We were able to develop a series of suitable prognostic prediction models and establish the top predictors of death in COVID-19 during shorter-, intermediate-, and longer-term follow-up periods.

  • Hemodialysis international. International Symposium on Home Hemodialysis

    29 Oct 2022 Estimation of fluid status using three multifrequency bioimpedance methods in hemodialysis patients

    Lin-Chun Wang, Jochen G Raimann, Xia Tao, Priscila Preciado, Ohnmar Thwin, Laura Rosales, Stephan Thijssen, Peter Kotanko, Fansan Zhu

    INTRODUCTION: Segmental eight-point bioimpedance has been increasingly used in practice. However, whether changes in bioimpedance analysis components before and after hemodialysis (HD) measured with this technique in a standing position are comparable to the traditional whole-body wrist-to-ankle method is still unclear. We aimed to investigate the differences between two eight-point devices (InBody 770 and Seca mBCA 514) and one wrist-to-ankle device (Hydra 4200) in HD patients and healthy subjects in a standing position. METHODS: Thirteen HD patients were studied pre- and post-HD, and 12 healthy subjects once. Four measurements were performed in the following order: InBody; Seca; Hydra; and InBody again. The electrical equivalent models underlying each bioimpedance method and the fluid volume estimates by each device were also compared. FINDINGS: Overall, total body water (TBW) was not different between the three devices, but InBody showed lower extracellular water (ECW) and higher intracellular water (ICW) compared to the other two devices. When intradialytic weight loss was used as a surrogate for changes in ECW (∆ECW) and changes in TBW (∆TBW), ∆ECW was underestimated by Hydra (-0.79 ± 0.89 L, p < 0.01), InBody (-1.44 ± 0.65 L, p < 0.0001), and Seca (-0.32 ± 1.34 L, n.s.). ∆TBW was underestimated by Hydra (-1.14 ± 2.81 L, n.s.) and InBody (-0.52 ± 0.85 L, p < 0.05) but overestimated by Seca (+0.93 ± 3.55 L, n.s.). DISCUSSION: Although segmental eight-point bioimpedance techniques provided comparable TBW measurements that were not affected by standing over a period of 10-15 min, the ECW/TBW ratio appeared to be significantly lower with InBody compared with Seca and Hydra. Our results showed a lack of agreement between different bioimpedance devices; direct comparison of ECW, ICW, and ECW/TBW between different devices should be avoided, and clinicians should use the same device to track fluid status in their HD population longitudinally.
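
    Device-agreement questions like this are commonly summarized with a Bland-Altman analysis: the bias (mean difference) and 95% limits of agreement between paired measurements. Below is a hedged sketch on simulated ECW/TBW pairs; the numbers are assumptions, not the study's measurements.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 25
    ecw_tbw_a = rng.normal(0.44, 0.02, n)                     # device A ratios
    ecw_tbw_b = ecw_tbw_a - 0.015 + rng.normal(0, 0.01, n)    # device B, biased low

    diff = ecw_tbw_b - ecw_tbw_a
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)                             # limits of agreement

    print(f"bias = {bias:+.3f}; 95% limits of agreement: "
          f"[{bias - loa:+.3f}, {bias + loa:+.3f}]")
    ```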

  • Jochen G Raimann, Christopher T Chan, John T Daugirdas, Thomas Depner, Tom Greene, George A Kaysen, Alan S Kliger, Peter Kotanko, Brett Larive, Gerald Beck, Robert McGregor Lindsay, Michael V Rocco, Glenn M Chertow, Nathan W Levin

    INTRODUCTION: The Frequent Hemodialysis Network (FHN) Daily and Nocturnal trials aimed to compare the effects of hemodialysis (HD) given 6 versus 3 times per week. More frequent in-center HD significantly reduced left-ventricular mass (LVM), with more pronounced effects in patients with low urine volumes. In this study, we aimed to explore another potential effect modifier: the predialysis serum sodium (SNa) and related proxies of plasma tonicity. METHODS: Using data from the FHN Daily and Nocturnal Trials, we compared the effects of frequent HD on LVM among patients stratified by SNa, dialysate-to-predialysis serum-sodium gradient (GNa), systolic and diastolic blood pressure, time-integrated sodium-adjusted fluid load (TIFL), and extracellular fluid volume estimated by bioelectrical impedance analysis. RESULTS: In the 197 enrolled subjects in the FHN Daily Trial, the treatment effect of frequent HD on ∆LVM was modified by SNa. When the FHN Daily Trial participants were divided into lower and higher predialysis SNa groups (less and greater than 138 mEq/L), the LVM reduction in the lower group was substantially larger (-28.0 [95% CI -40.5 to -15.4] g) than in the higher predialysis SNa group (-2.0 [95% CI -15.5 to 11.5] g). Analyses accounting for GNa and TIFL also showed more pronounced effects among patients with higher GNa or higher TIFL. Results in the Nocturnal Trial were similar in direction and magnitude but did not reach statistical significance. DISCUSSION/CONCLUSION: In the FHN Daily Trial, the favorable effects of frequent HD on left-ventricular hypertrophy were more pronounced among patients with lower predialysis SNa and higher GNa and TIFL. Whether these metrics can be used to identify patients most likely to benefit from frequent HD or other dialytic or nondialytic interventions remains to be determined. Prospective, adequately powered studies of the effect of GNa reduction on mortality and hospitalization are needed.

  • Xiaoling Wang, Maggie Han, Lemuel Rivera Fuentes, Ohnmar Thwin, Nadja Grobe, Kevin Wang, Yuedong Wang, Peter Kotanko

    BACKGROUND: In hemodialysis patients, a third vaccination is frequently administered to augment protection against coronavirus disease 2019 (COVID-19). However, the newly emerged B.1.1.529 (Omicron) variant may evade vaccinal protection more easily than previous strains. It is of clinical interest to better understand the neutralizing activity against severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) variants after booster vaccination or COVID-19 infection in these mostly immunocompromised patients. METHODS: Hemodialysis patients from four dialysis centers were recruited between June 2021 and February 2022. Each patient provided a median of six serum samples. SARS-CoV-2 neutralizing antibodies (nAbs) against wild type (WT) or Omicron were measured using the GenScript SARS-CoV-2 Surrogate Virus Neutralization Test Kit. RESULTS: Forty-two patients received three doses of mRNA1273. Compared to levels prior to the third dose, nAb-WT increased 18-fold (peak at day 23) and nAb-Omicron increased 23-fold (peak at day 24) after the third dose. Peak nAb-WT exceeded peak nAb-Omicron 27-fold. Twenty-one patients had COVID-19 between December 24, 2021, and February 2, 2022. Following COVID-19, nAb-WT and nAb-Omicron increased 12- and 40-fold, respectively. While levels of vaccinal and post-COVID nAb-WT were comparable, post-COVID nAb-Omicron levels were 3.2-fold higher than the respective peak vaccinal nAb-Omicron levels. Four patients who were immunocompromised for reasons other than end-stage kidney disease had very low to no nAb after the third dose or COVID-19. CONCLUSIONS: Our results suggest that most hemodialysis patients have a strong humoral response to the third vaccine dose and an even stronger post-COVID-19 humoral response. Nevertheless, nAb levels clearly decay over time. These findings may inform ongoing discussions regarding a fourth vaccination in hemodialysis patients.

  • Clinical microbiology and infection

    17 Sep 2022 Sample pooling: burden or solution

    Nadja Grobe, Alhaji Cherif, Xiaoling Wang, Zijun Dong, Peter Kotanko

    BACKGROUND: Pool-testing strategies combine samples from multiple people and test them as a group. A pool-testing approach may shorten the screening time and increase the test rate during times of limited test availability and inadequate reporting speed. Pool testing has been used effectively in a wide variety of infectious disease screening settings. Historically, it originated from serological testing for syphilis. During the coronavirus disease 2019 (COVID-19) pandemic, pool testing has been considered across the globe to inform opening strategies and to monitor infection rates after the implementation of interventions. AIMS: This narrative review aims to provide a comprehensive overview of the global efforts to implement pool testing, specifically for COVID-19 screening. SOURCES: Data were retrieved from a detailed search for peer-reviewed articles and preprint reports using Medline/PubMed, medRxiv, Web of Science, and Google up to 21st March 2021, using the search terms "pool testing", "viral", "serum", "SARS-CoV-2" and "COVID-19". CONTENT: This review summarizes the history and theory of pool testing. We identified numerous peer-reviewed articles that describe specific details and the practical implementation of pool testing. Successful examples as well as limitations of pool testing, in general and specifically related to the detection of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) RNA and antibodies, are reviewed. While promising, significant operational, pre-analytical, logistical, and economic challenges need to be overcome to advance pool testing. IMPLICATIONS: The theory of pool testing is well understood and numerous successful examples from the past are available. Operationalization of pool testing requires sophisticated processes that can be adapted to the local medical circumstances. Special attention needs to be paid to sample collection, sample pooling, and strategies to avoid re-sampling.
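
    The efficiency argument for pooling is compact enough to compute directly. In classic two-stage (Dorfman) pooling, a pool of size k is tested once and, if positive, all k members are retested individually, giving an expected 1/k + 1 - (1-p)^k tests per person at prevalence p. The sketch below scans pool sizes for an assumed prevalence; it illustrates the arithmetic, not any specific protocol from the review.

    ```python
    import numpy as np

    def expected_tests_per_person(p: float, k: int) -> float:
        # one pooled test shared by k people, plus k retests if the pool is positive
        return 1.0 / k + 1.0 - (1.0 - p) ** k

    p = 0.02                                  # assumed 2% prevalence
    pool_sizes = np.arange(2, 21)
    costs = [expected_tests_per_person(p, k) for k in pool_sizes]
    best = int(pool_sizes[int(np.argmin(costs))])

    print(f"optimal pool size at p={p:.0%}: {best} "
          f"({min(costs):.2f} tests/person vs 1.00 for individual testing)")
    ```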

  • Pablo Maggiani-Aguilera, Jochen G Raimann, Jonathan S Chávez-Iñiguez, Guillermo Navarro-Blackaller, Peter Kotanko, Guillermo Garcia-Garcia

    INTRODUCTION: Central venous catheter (CVC) use as vascular access in hemodialysis (HD) is associated with adverse outcomes. Early conversion from a CVC to a fistula or graft improves these outcomes. While socioeconomic disparities between the USA and Mexico exist, little is known about CVC prevalence and conversion rates in uninsured Mexican HD patients. We examined vascular access practice patterns and their effects on survival and hospitalization rates among uninsured Mexican HD patients, in comparison with HD patients who initiated treatment in the USA. METHODS: In this retrospective study of incident HD patients at Hospital Civil (HC; Guadalajara, MX) and the Renal Research Institute (RRI; USA), we categorized patients by their vascular access at the first month of HD and after the following 6 months. Factors associated with continued CVC use were identified by a logistic regression model. We developed multivariate Cox proportional hazards models to investigate the effects of access type and conversion on mortality and hospitalization over an 18-month follow-up period. RESULTS: Among 1,632 patients from RRI, the CVC prevalence at month 1 was 64%, compared with 97% among the 174 HC patients. The conversion rate was 31.7% at RRI and 10.6% at HC. Conversion from a CVC to a non-central venous catheter access (NON-CVC) reduced the risk of hospitalization in both HC (aHR 0.38 [95% CI: 0.21-0.68], p = 0.001) and RRI (aHR 0.84 [95% CI: 0.73-0.93], p = 0.001). NON-CVC patients had a lower mortality risk in both populations. DISCUSSION/CONCLUSION: CVC prevalence and rates of conversion from CVC to NON-CVC differed between the US and Mexican patients. An association exists between vascular access type and hospitalization and mortality risk. Prospective studies are needed to evaluate whether accelerated and systematic reduction of catheter use would improve outcomes in these populations.

  • Hemodialysis international. International Symposium on Home Hemodialysis

    5 Sep 2022 Trajectories of clinical and laboratory characteristics associated with COVID-19 in hemodialysis patients by survival

    Sheetal Chaudhuri, Rachel Lasky, Yue Jiao, John Larkin, Caitlin Monaghan, Anke Winter, Luca Neri, Peter Kotanko, Jeffrey Hymes, Sangho Lee, Yuedong Wang, Jeroen P Kooman, Franklin Maddux, Len Usvyat

    INTRODUCTION: The clinical impact of COVID-19 has not been established in the dialysis population. We evaluated the trajectories of clinical and laboratory parameters in hemodialysis (HD) patients. METHODS: We used data from adult HD patients treated at an integrated kidney disease company who received a reverse transcription polymerase chain reaction (RT-PCR) test to investigate suspicion of a severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection between May 1 and September 1, 2020. Nonparametric smoothing splines were used to fit data for individual trajectories and estimate the mean change over time in patients testing positive or negative for SARS-CoV-2 and in those who survived or died within 30 days of first suspicion or positive test date. For each clinical parameter of interest, the difference in average daily changes between the COVID-19 positive versus negative group and the COVID-19 survivor versus nonsurvivor group was estimated by fitting a linear mixed effects model based on measurements in the 14 days before Day 0 (i.e., Day -14 to Day 0). RESULTS: There were 12,836 HD patients with a suspicion of COVID-19 who received RT-PCR testing (8895 SARS-CoV-2 positive). We observed significantly different trends (p < 0.05) in pre-HD systolic blood pressure (SBP), pre-HD pulse rate, body temperature, ferritin, neutrophils, lymphocytes, albumin, and interdialytic weight gain (IDWG) between COVID-19 positive and negative patients. For the COVID-19 positive group, we observed significantly different clinical trends (p < 0.05) in pre-HD pulse rate, lymphocytes, neutrophils, and albumin between survivors and nonsurvivors. We also observed that, in the group of survivors, most clinical parameters returned to pre-COVID-19 levels within 60-90 days. CONCLUSION: We observed unique temporal trends in various clinical and laboratory parameters among HD patients who tested positive versus negative for SARS-CoV-2 infection and those who survived the infection versus those who died. These trends can help to define the physiological disturbances that characterize the onset and course of COVID-19 in HD patients.

  • David J Jörg, Doris H Fuertinger, Alhaji Cherif, David A Bushinsky, Ariella Mermelstein, Jochen G Raimann, Peter Kotanko

    Our bones are constantly being renewed in a fine-tuned cycle of destruction and formation that helps keep them healthy and strong. However, this process can become imbalanced and lead to osteoporosis, where the bones are weakened and have a high risk of fracturing. This is particularly common post-menopause, with one in three women over the age of 50 experiencing a broken bone due to osteoporosis. There are several drug types available for treating osteoporosis, which work in different ways to strengthen bones. These drugs can be taken individually or combined, meaning that a huge number of drug combinations and treatment strategies are theoretically possible. However, it is not practical to test the effectiveness of all of these options in human trials. This could mean that patients are not getting the maximum potential benefit from the drugs available. Jörg et al. developed a mathematical model to predict how different osteoporosis drugs affect the process of bone renewal in the human body. The model could then simulate the effect of changing the order in which the therapies were taken, which showed that the sequence had a considerable impact on the efficacy of the treatment. This occurs because different drugs can interact with each other, leading to an improved outcome when they work in the right order. These results suggest that people with osteoporosis may benefit from altered treatment schemes without changing the type or amount of medication taken. The model could suggest new treatment combinations that reduce the risk of bone fracture, potentially even enabling personalised plans for individual patients based on routine clinical measurements of their responses to different drugs.

  • Usama Hussein, Monica Cimini, Garry J Handelman, Jochen G Raimann, Li Liu, Samer R Abbas, Peter Kotanko, Nathan W Levin, Fredric O Finkelstein, Fansan Zhu

    Diagnosis of fluid overload (FO) at an early stage is essential to manage fluid balance in patients with chronic kidney disease (CKD) and to prevent cardiovascular disease (CVD). However, the identification of fluid status in patients with CKD is largely dependent on the physician's clinical acumen. The ratio of fluid overload to extracellular volume (FO/ECV) has been used as a reference to assess fluid status. The primary aim of this study was to compare FO/ECV with other bioimpedance methods and clinical assessments in patients with CKD. Whole-body ECV, intracellular volume (ICV), total body water (TBW), and calf normalized resistivity (CNR) were measured (Hydra 4200). Thresholds for FO using CNR and ECV/TBW were derived by receiver operating characteristic (ROC) analysis based on data from pooled patients with CKD and healthy subjects (HSs). Clinical assessments of FO in patients with CKD were performed by nephrologists. Patients with CKD (stage 3 and stage 4) (n = 50) and HSs (n = 189) were studied. The thresholds for FO were CNR ≤14.3 × 10⁻² Ωm³/kg for females and ≤13.1 × 10⁻² Ωm³/kg for males, and ECV/TBW ≥0.445 for females and ≥0.434 for males. FO was diagnosed in 78%, 62%, and 52% of patients with CKD by CNR, FO/ECV, and ECV/TBW, respectively, whereas only 24% of patients with CKD were diagnosed as FO by clinical assessment. The proportion of FO in patients with nondialysis CKD was largely underestimated by clinical assessment compared with FO/ECV, CNR, and ECV/TBW. The CNR and FO/ECV methods were more sensitive than ECV/TBW in identifying fluid overload in these patients with CKD. NEW & NOTEWORTHY: We found that fluid overload (FO) in patients with nondialysis CKD was largely underestimated by clinical assessment compared with bioimpedance methods, largely due to the lack of appropriate techniques to assess FO. In addition, although the degree of FO by bioimpedance markers correlated positively with age in healthy subjects (HSs), no difference in the three hydration markers was observed between the 50 ≤ age < 70 yr and age ≥ 70 yr groups in patients with CKD.
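
    Deriving a diagnostic cutoff from ROC analysis, as done above for CNR and ECV/TBW, is often operationalized with Youden's J statistic. The sketch below does this on simulated resistivity-like values; the distributions, units, and sample sizes are assumptions, not study data.

    ```python
    import numpy as np
    from sklearn.metrics import roc_curve

    rng = np.random.default_rng(5)
    overloaded = rng.normal(12.0, 1.5, 120)   # lower resistivity when overloaded
    normal = rng.normal(16.0, 1.8, 150)

    values = np.concatenate([overloaded, normal])
    is_overloaded = np.concatenate([np.ones(120), np.zeros(150)])

    # invert sign so that a *higher* score means *more likely overloaded*
    fpr, tpr, thresholds = roc_curve(is_overloaded, -values)
    youden = np.argmax(tpr - fpr)             # maximize sensitivity + specificity - 1
    cutoff = -thresholds[youden]              # undo the sign flip

    print(f"diagnose FO if value <= {cutoff:.1f} "
          f"(sens {tpr[youden]:.2f}, spec {1 - fpr[youden]:.2f})")
    ```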

  • Murilo Guedes, Liz Wallim, Camila R Guetter, Yue Jiao, Vladimir Rigodon, Chance Mysayphonh, Len A Usvyat, Pasqual Barretti, Peter Kotanko, John W Larkin, Franklin W Maddux, Roberto Pecoits-Filho, Thyago Proenca de Moraes

    BACKGROUND: We tested whether fatigue in incident peritoneal dialysis (PD) patients was associated with an increased risk of mortality, independently of main confounders. METHODS: We conducted a side-by-side study of two cohorts of incident PD patients, in Brazil and the United States. We used the same code to independently analyze data in both countries from 2004 to 2011. We included data from adults who completed the KDQOL-SF vitality subscale within 90 days after starting PD. Vitality score was categorized into four groups: >50 (high vitality), ≥40 to ≤50 (moderate vitality), >35 to <40 (moderate fatigue), and ≤35 (high fatigue; reference group). In each country's cohort, we built four distinct models to estimate the associations between vitality (exposure) and all-cause mortality (outcome): (i) a Cox regression model; (ii) a competing risk model accounting for technique failure events; (iii) a multilevel survival model of clinic-level clusters; and (iv) a multivariate regression model with smoothing splines treating vitality as a continuous measure. Analyses were adjusted for age, comorbidities, PD modality, hemoglobin, and albumin. A mixed-effects meta-analysis was used to pool hazard ratios (HRs) from both cohorts to model mortality risk for each 10-unit increase in vitality. RESULTS: We used data from 4,285 PD patients (Brazil n = 1,388 and United States n = 2,897). Model estimates showed that lower vitality levels within 90 days of starting PD were associated with a higher risk of mortality, consistently in the Brazil and United States cohorts. In the multivariate survival model, each 10-unit increase in vitality score was associated with a lower risk of all-cause mortality in both cohorts (Brazil HR = 0.79 [95% CI 0.70 to 0.90], United States HR = 0.90 [95% CI 0.88 to 0.93], pooled HR = 0.86 [95% CI 0.75 to 0.98]). All models provided consistent effect estimates. CONCLUSIONS: Among patients in Brazil and the United States, a lower vitality score in the initial months of PD was independently associated with all-cause mortality.
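
    Pooling the two cohort-specific hazard ratios is, at its simplest, an inverse-variance average on the log scale. The hedged sketch below applies a fixed-effect version to the HRs quoted above; the study used a mixed-effects meta-analysis, so its pooled estimate (0.86) differs from what this simpler calculation yields.

    ```python
    import numpy as np

    def pool_hrs(hrs, ci_los, ci_his):
        log_hr = np.log(hrs)
        # standard errors recovered from the 95% CI width on the log scale
        se = (np.log(ci_his) - np.log(ci_los)) / (2 * 1.96)
        w = 1 / se**2
        pooled = np.sum(w * log_hr) / np.sum(w)
        pooled_se = np.sqrt(1 / np.sum(w))
        return (np.exp(pooled),
                np.exp(pooled - 1.96 * pooled_se),
                np.exp(pooled + 1.96 * pooled_se))

    # per-10-unit vitality HRs reported above: Brazil, then United States
    hr, lo, hi = pool_hrs([0.79, 0.90], [0.70, 0.88], [0.90, 0.93])
    print(f"fixed-effect pooled HR ≈ {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```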

  • Jochen G Raimann, Yuedong Wang, Ariella Mermelstein, Peter Kotanko, John T Daugirdas

    INTRODUCTION: One proposed threshold ultrafiltration rate (UFR) of concern in hemodialysis patients is 13 ml/h per kg. We evaluated associations among UFR, postdialysis weight, and mortality to determine whether exceeding such a threshold would result in similar levels of risk for patients of different body weights. METHODS: Data were analyzed in this retrospective cohort study for 1 year following dialysis initiation (baseline) and over 2 years of follow-up in incident patients receiving thrice-weekly in-center hemodialysis. Patient-level UFR was averaged over the baseline period. To investigate the joint effect of UFR and postdialysis weight on survival, we fit Cox proportional hazards models using bivariate tensor product spline functions, adjusting for sex, race, age, diabetes, and predialysis serum albumin, phosphorus, and systolic blood pressure (BP). We constructed contour plots of mortality hazard ratios (MHRs) over the entire range of UFR values and postdialysis weights. RESULTS: In the studied 2542 patients, UFR not scaled to body weight was strongly associated with MHR, whereas postdialysis weight was inversely associated with MHR. MHR crossed 1.5 when unscaled UFR exceeded 1000 ml/h, and this relationship was largely independent of postdialysis weight in the range of 80 to 140 kg. A UFR warning level associated with a lower MHR of 1.3 would be 900 ml/h, whereas the UFR associated with an MHR of 1.0 was patient-size dependent. The MHR when exceeding a UFR threshold of 13 ml/h per kg was dependent on patient weight (MHR = 1.20, 1.45, and >2.0 for a 60, 80, and 100 kg patient, respectively). CONCLUSION: UFR thresholds based on unscaled UFR give more uniform risk levels for patients of different sizes than thresholds based on UFR/kg.
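
    The core of the argument is simple arithmetic: a per-kg threshold translates into very different absolute ultrafiltration rates across body sizes. The sketch below just tabulates that conversion against the fixed 1000 ml/h level discussed above.

    ```python
    # Hedged arithmetic sketch, no study data: what the 13 ml/h per kg
    # threshold means in absolute terms for patients of different weights.
    for weight_kg in (60, 80, 100, 120):
        scaled_limit_ml_h = 13 * weight_kg
        print(f"{weight_kg:>3} kg: 13 ml/h/kg -> {scaled_limit_ml_h:>4} ml/h "
              f"(vs fixed 1000 ml/h warning level)")
    ```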

  • FASEB bioAdvances

    16 Jul 2022 The Piezo1 hypothesis of renal anemia

    Peter Kotanko, David J Jörg, Nadja Grobe, Christoph Zaba

    Erythropoietin deficiency is an extensively researched cause of renal anemia. The etiology and consequences of shortened red blood cell (RBC) life span in chronic kidney disease (CKD) are less well understood. Traversing capillaries requires RBC geometry changes, a process enabled by adaptations of the cytoskeleton. These changes are mediated by transient activation of the mechanosensory Piezo1 channel, resulting in calcium influx. Importantly, prolonged Piezo1 activation shortens RBC life span, presumably through activation of calcium-dependent intracellular pathways triggering RBC death. Two Piezo1-activating small molecules, Jedi1 and Jedi2, share remarkable structural similarities with 3-carboxy-4-methyl-5-propyl-2-furanpropanoic acid (CMPF), a uremic retention solute cleared by the healthy kidney. We hypothesize that in CKD the accumulation of CMPF leads to prolonged activation of Piezo1 (similar in effect to Jedi1 and Jedi2), thus reducing RBC life span. This hypothesis can be tested through bench experiments and, ultimately, by studying the effect of CMPF removal on renal anemia.

  • Orly F Kohn, Susie Q Lew, Steve Siu-Man Wong, Ramin Sam, Hung-Chun Chen, Jochen G Raimann, David J Leehey, Antonios H Tzamaloukas, Todd S Ing

    Herbal medicine, a form of complementary and alternative medicine (CAM), is used throughout the world, in both developing and developed countries. The ingredients in herbal medicines are not standardized by any regulatory agency. Variability exists in the ingredients as well as in their concentrations. Plant products may become contaminated with bacteria and fungi during storage. Therefore, harm can occur to the kidney, liver, and blood components after ingestion. We encourage scientific studies to identify the active ingredients in herbs and to standardize their concentrations in all herbal preparations. Rigorous studies need to be performed in order to understand the effect of herbal ingredients on different organ systems as well as these substances' interaction with other medications.

  • Hemodialysis international. International Symposium on Home Hemodialysis

    17 Jun 2022 Prevalence of fluid overload in an urban US hemodialysis population: A cross-sectional study

    Ulrich Moissl, Lemuel Rivera Fuentes, Mohamad I Hakim, Manuel Hassler, Dewangi A Kothari, Laura Rosales, Fansan Zhu, Jochen G Raimann, Stephan Thijssen, Peter Kotanko

    INTRODUCTION: Inadequate fluid status remains a key driver of cardiovascular morbidity and mortality in chronic hemodialysis (HD) patients. Quantification of fluid overload (FO) using bioimpedance spectroscopy (BIS) has become standard in many countries. To date, no BIS device has been approved in the United States for fluid status assessment in kidney patients. Therefore, no previous quantification of fluid status in US kidney patients using BIS has been reported. Our aim was to conduct a cross-sectional BIS-based assessment of fluid status in an urban US HD population. METHODS: We determined fluid status in chronic HD patients using whole-body BIS (Body Composition Monitor, BCM). The BCM reports FO in liters; negative FO denotes fluid depletion. Measurements were performed before dialysis. Post-HD FO was estimated by subtracting the intradialytic weight loss from the pre-HD FO. FINDINGS: We studied 170 urban HD patients (age 61 ± 14 years, 60% male). Pre- and post-HD FO (mean ± SD) were 2.2 ± 2.4 and -0.2 ± 2.7 L, respectively. Pre-HD, 43% of patients were fluid overloaded, 53% normally hydrated, and 4% fluid depleted. Post-HD, 12% were fluid overloaded, 55% normohydrated, and 32% fluid depleted. Only 48% of fluid overloaded patients were hypertensive, while 38% were normotensive and 14% hypotensive. Fluid status did not differ significantly between African Americans (N = 90) and Caucasians (N = 61). DISCUSSION: While about half of the patients had normal fluid status pre-HD, a considerable proportion of patients were either fluid overloaded or fluid depleted, indicating the need for tools to objectively quantify fluid status.

  • The International journal of artificial organs

    1 May 2022 Proportional integral feedback control of ultrafiltration rate in hemodialysis

    Sabrina Casper, Doris H Fuertinger, Leticia M Tapia Silva, Lemuel Rivera Fuentes, Stephan Thijssen, Peter Kotanko

    BACKGROUND: Most hemodialysis patients without residual kidney function accumulate fluid between dialysis sessions that needs to be removed by ultrafiltration. Ultrafiltration usually results in a decline in relative blood volume (RBV). Recent epidemiological research has identified RBV ranges that were associated with significantly better survival. The objective of this work was to develop an ultrafiltration controller to steer a patient's RBV trajectory into these favorable RBV ranges. METHODS: We designed a proportional-integral feedback ultrafiltration controller that utilizes signals from a device that reports RBV. The control goal is to attain the RBV trajectory associated with improved patient survival. Additional constraints, such as upper and lower bounds on ultrafiltration volume and rate, were realized. The controller was evaluated in in silico and ex vivo bench experiments, and in a clinical proof-of-concept study in two maintenance dialysis patients. RESULTS: In all tests, the ultrafiltration controller performed as expected. In the in silico and ex vivo bench experiments, the controller showed a robust reaction to deliberate disruptive interventions (e.g., signal noise; extreme plasma refill rates). No adverse events were observed in the clinical study. CONCLUSIONS: The ultrafiltration controller can steer RBV trajectories toward desired RBV ranges while obeying a set of constraints. Prospective studies in hemodialysis patients with diverse clinical characteristics are warranted to further explore the controller's impact on intradialytic hemodynamic stability, quality of life, and long-term outcomes.
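
    The control law described above has a compact generic form: the ultrafiltration rate (UFR) is adjusted in proportion to the instantaneous RBV error and its integral, then clipped to the allowed range. The following is a minimal sketch with an intentionally crude plant model; all gains, bounds, targets, and refill dynamics are assumptions, not the published controller's parameters.

    ```python
    import numpy as np

    kp, ki = 40.0, 4.0              # PI gains (assumed)
    ufr_min, ufr_max = 0.0, 1200.0  # ml/h bounds (assumed constraint set)
    dt = 1.0 / 60.0                 # one-minute control steps, in hours

    rbv = 100.0                     # % of pre-dialysis blood volume
    integral = 0.0
    ufr = 800.0

    for minute in range(240):       # a 4-hour treatment
        t_hours = minute * dt
        target = 100.0 - 1.5 * t_hours        # desired gentle RBV decline (assumed)
        error = rbv - target                  # >0: RBV above target, remove faster
        integral += error * dt
        ufr = float(np.clip(800.0 + kp * error + ki * integral, ufr_min, ufr_max))
        # toy plant: RBV falls with ultrafiltration and refills toward 100%
        rbv += (-0.002 * ufr + 0.1 * (100.0 - rbv)) * dt

    print(f"end-of-treatment RBV: {rbv:.1f}% (target {target:.1f}%), UFR {ufr:.0f} ml/h")
    ```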

  • Paul A Rootjes, Erik Lars Penne, Georges Ouellet, Yanna Dou, Stephan Thijssen, Peter Kotanko, Jochen G Raimann

    INTRODUCTION: Excess sodium intake and consequent volume overload are major clinical problems in hemodialysis (HD), contributing to adverse outcomes. Saline used for priming and rinsing of the extracorporeal circuit is a potentially underappreciated source of intradialytic sodium gain. We aimed to examine the feasibility and clinical effects of replacing saline as the priming and rinsing fluid with a 5% dextrose solution. MATERIALS AND METHODS: We enrolled non-diabetic and anuric stable HD patients. First, the extracorporeal circuit was primed and rinsed with approximately 200-250 mL of isotonic saline for 4 weeks (Phase 1); subsequently, a similar volume of a 5% dextrose solution replaced the saline for another 4 weeks (Phase 2), followed by another 4 weeks of saline (Phase 3). We collected data on interdialytic weight gain (IDWG), pre- and post-dialysis blood pressure, intradialytic symptoms, and thirst. RESULTS: Seventeen chronic HD patients (11 males, age 54.1 ± 18.7 years) completed the study. The average priming and rinsing volumes were 236.7 ± 77.5 and 245.0 ± 91.8 mL, respectively. The mean IDWG did not change significantly (2.52 ± 0.88 kg in Phase 1; 2.28 ± 0.70 kg in Phase 2; and 2.51 ± 1.2 kg in Phase 3). No differences in blood pressure, intradialytic symptoms, or thirst were observed. CONCLUSIONS: Replacing saline with 5% dextrose for priming and rinsing is feasible in stable HD patients and may reduce intradialytic sodium loading. A non-significant trend toward lower IDWG was observed when 5% dextrose was used. Prospective studies with a larger sample size and longer follow-up are needed to gain further insight into the possible effects of alternative priming and rinsing solutions on intradialytic sodium loading. TRIAL REGISTRATION: Identifier NCT01168947 (ClinicalTrials.gov).

  • Ravi Thadhani, Joanna Willetts, Catherine Wang, John Larkin, Hanjie Zhang, Lemuel Rivera Fuentes, Len Usvyat, Kathleen Belmonte, Yuedong Wang, Robert Kossmann, Jeffrey Hymes, Peter Kotanko, Franklin Maddux

    BACKGROUND: SARS-CoV-2 is primarily transmitted through aerosolized droplets; however, the virus can remain transiently viable on surfaces. OBJECTIVE: We examined transmission within hemodialysis facilities, with a specific focus on the possibility of indirect patient-to-patient transmission through shared dialysis chairs. DESIGN: We used real-world data from hemodialysis patients treated between February 1st and June 8th, 2020 to perform a case-control study, matching each SARS-CoV-2 positive patient (case) to a non-SARS-CoV-2 patient (control) in the same dialysis shift, and traced back 14 days to capture possible exposure from chairs sat in by SARS-CoV-2 patients. Cases and controls were matched on age, sex, race, facility, shift date, and treatment count. SETTING: 2,600 hemodialysis facilities in the United States. PATIENTS: Adult (age ≥18 years) hemodialysis patients. MEASUREMENTS: Conditional logistic regression models tested whether chair exposure after a positive patient conferred a higher risk of SARS-CoV-2 infection to the immediately subsequent patient. RESULTS: Among 170,234 hemodialysis patients, 4,782 (2.8%) tested positive for SARS-CoV-2 (mean age 64 years, 44% female). Most facilities (68.5%) had 0 to 1 positive SARS-CoV-2 patient. We matched 2,379 SARS-CoV-2 positive cases to 2,379 non-SARS-CoV-2 controls; 1.30% (95% CI 0.90%, 1.87%) of cases and 1.39% (95% CI 0.97%, 1.97%) of controls were exposed to a chair previously sat in by a shedding SARS-CoV-2 patient. Transmission risk among cases was not significantly different from controls (OR = 0.94; 95% CI 0.57 to 1.54; p = 0.80). Results remained consistent in adjusted and sensitivity analyses. LIMITATION: The analysis used real-world data that could contain errors and only considered transmission associated with shared use of dialysis chairs by symptomatic patients. CONCLUSIONS: The risk of indirect patient-to-patient transmission of SARS-CoV-2 infection from dialysis chairs appears to be low. PRIMARY FUNDING SOURCE: Fresenius Medical Care North America; National Institute of Diabetes and Digestive and Kidney Diseases (R01DK130067).

  • Priscila Preciado, Leticia M Tapia Silva, Xiaoling Ye, Hanjie Zhang, Yuedong Wang, Peter Waguespack, Jeroen P Kooman, Peter Kotanko

    BACKGROUND: Maintenance hemodialysis (MHD) patients are particularly vulnerable to coronavirus disease 2019 (COVID-19), a viral disease that may cause interstitial pneumonia, impaired alveolar gas exchange, and hypoxemia. We ascertained the time course of intradialytic arterial oxygen saturation (SaO2) in MHD patients between 4 weeks pre-diagnosis and the week post-diagnosis of COVID-19. METHODS: We conducted a quality improvement project in confirmed COVID-19 in-center MHD patients from 11 dialysis facilities. In patients with an arterio-venous access, SaO2 was measured 1×/min during dialysis using the Crit-Line monitor (Fresenius Medical Care, Waltham, MA, USA). We extracted demographic, clinical, treatment, and laboratory data, and COVID-19-related symptoms from the patients' electronic health records. RESULTS: Intradialytic SaO2 was available in 52 patients (29 males; mean ± standard deviation age 66.5 ± 15.7 years) contributing 338 HD treatments. The mean time between onset of symptoms indicative of COVID-19 and diagnosis was 1.1 days (median 0; range 0-9). Prior to COVID-19 diagnosis, the rate of HD treatments with hypoxemia, defined as a treatment-level average SaO2 <90%, increased from 2.8% (2-4 weeks pre-diagnosis) to 12.2% (1 week pre-diagnosis) and 20.7% (3 days pre-diagnosis). Intradialytic O2 supplementation increased sharply post-diagnosis. Eleven patients died from COVID-19 within 5 weeks. Compared with patients who recovered from COVID-19, patients who died showed a more pronounced decline in SaO2 prior to COVID-19 diagnosis. CONCLUSIONS: In HD patients, hypoxemia may precede the onset of clinical symptoms and the diagnosis of COVID-19. A steep decline of SaO2 is associated with poor patient outcomes. Measurements of SaO2 may aid the pre-symptomatic identification of patients with COVID-19.

  • Bernard Canaud, Jeroen P Kooman, Nicholas M Selby, Maarten Taal, Andreas Maierhofer, Pascal Kopperschmidt, Susan Francis, Allan Collins, Peter Kotanko

    The development of maintenance hemodialysis (HD) for end-stage kidney disease patients is a success story that continues to save many lives. Nevertheless, intermittent renal replacement therapy is also a source of recurrent stress for patients. Conventional thrice-weekly short HD is an imperfect treatment that only partially corrects uremic abnormalities, increases cardiovascular risk, and exacerbates disease burden. Alternating cycles of fluid loading associated with cardiac stretching (interdialytic phase) and fluid unloading (intradialytic phase) likely contribute to cardiac and vascular damage. This unphysiologic treatment profile, combined with cyclic disturbances including osmotic and electrolytic shifts, may contribute to morbidity in dialysis patients and augment the health burden of treatment. As such, HD patients are exposed to multiple stressors, including cardiocirculatory, inflammatory, biologic, hypoxemic, and nutritional stressors. This cascade of events can be termed the dialysis stress storm and sickness syndrome. Mitigating the cardiovascular risk and morbidity associated with conventional intermittent HD appears to be a priority for improving patient experience and reducing disease burden. In this in-depth review, we summarize the hidden effects of intermittent HD therapy and call for action to improve delivered HD and to develop treatment schedules that are better tolerated and associated with fewer adverse effects.

  • Richard V Remigio, Rodman Turpin, Jochen G Raimann, Peter Kotanko, Frank W Maddux, Amy Rebecca Sapkota, Xin-Zhong Liang, Robin Puett, Xin He, Amir Sapkota

    BACKGROUND: Typical thermoregulatory responses to elevated temperatures among healthy individuals include reduced blood pressure and perspiration. Individuals with end-stage kidney disease (ESKD) are susceptible to systemic fluctuations caused by ambient temperature changes that may increase morbidity and mortality. We investigated whether pre-dialysis systolic blood pressure (preSBP) and interdialytic weight gain (IDWG) can independently mediate the association between ambient temperature, all-cause hospital admissions (ACHA), and all-cause mortality (ACM). METHODS: The study population consisted of ESKD patients receiving hemodialysis treatments at Fresenius Medical Care facilities in Philadelphia County, PA, from 2011 to 2019 (n = 1981). Within a time-to-event framework, we estimated the association between daily maximum dry-bulb temperature (TMAX) and, in separate models, ACHA and ACM during warmer calendar months. Clinically measured preSBP and IDWG responses to temperature increases were estimated using linear mixed effects models. We employed the difference (c-c') method to decompose total effect models for ACHA and ACM using preSBP and IDWG as time-dependent mediators. Covariate adjustments for exposure-mediator and total and direct effect models included age, race, ethnicity, blood pressure medication use, treatment location, preSBP, and IDWG. We considered lags of up to two days for exposure and a 1-day lag for mediator variables (Lag 2-Lag 1) to assure temporality between exposure and outcome. Sensitivity analyses for 2-day (Lag 2-only) and 1-day (Lag 1-only) lag structures were also conducted. RESULTS: Based on Lag 2-Lag 1 temporal ordering, a 1 °C increase in daily TMAX was associated with an increased hazard of ACHA of 1.4% (adjusted hazard ratio [HR], 1.014; 95% confidence interval, 1.007-1.021) and of ACM of 7.5% (adjusted HR, 1.075; 1.050-1.100). Short-term lag exposures to a 1 °C increase in temperature predicted mean reductions in IDWG and preSBP of 0.013-0.015% and 0.168-0.229 mmHg, respectively. Mediation analysis for ACHA identified significant indirect effects for all three studied pathways (preSBP, IDWG, and preSBP + IDWG), and significant indirect effects for the IDWG and conjoined preSBP + IDWG pathways for ACM. Of note, only 1.03% of the association between temperature and ACM was mediated through preSBP. The mechanistic path for IDWG, independent of preSBP, demonstrated inconsistent mediation and, consequently, potential suppression effects in ACHA (-15.5%) and ACM (-6.3%) based on combined pathway models. Proportion-mediated estimates from preSBP + IDWG pathways reached 2.2% and 0.3% in combined pathway analysis for the ACHA and ACM outcomes, respectively. Lag 2 discrete-time ACM mediation models exhibited consistent mediation for all three pathways, suggesting that 2-day lags in IDWG and preSBP responses can explain 2.11% and 4.41% of the total effect association between temperature and mortality, respectively. CONCLUSION: We corroborated the previously reported associations between ambient temperature, ACHA, and ACM. Our results foster understanding of the potential physiological linkages that may explain or suppress temperature-driven hospital admission and mortality risks. Of note, concomitant changes in preSBP and IDWG may have little intermediary effect when analyzed in combined pathway models. These findings advance our assessment of candidate interventions to reduce the impact of outdoor temperature change on ESKD patients.
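
    The difference (c-c') method named above compares the exposure coefficient from a total-effect model with the coefficient after adding the mediator; the drop is the mediated share. Below is a hedged linear toy version on simulated data; the study's actual models were survival models with time-dependent mediators, which this sketch does not attempt.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(6)
    n = 2000
    tmax = rng.normal(28, 4, n)                      # daily max temperature, °C
    presbp = 140 - 0.2 * tmax + rng.normal(0, 8, n)  # mediator falls with heat
    outcome = 0.05 * tmax - 0.01 * presbp + rng.normal(0, 1, n)

    X_total = sm.add_constant(tmax)
    c = sm.OLS(outcome, X_total).fit().params[1]            # total effect

    X_direct = sm.add_constant(np.column_stack([tmax, presbp]))
    c_prime = sm.OLS(outcome, X_direct).fit().params[1]     # direct effect

    print(f"total c={c:.3f}, direct c'={c_prime:.3f}, "
          f"proportion mediated ≈ {(c - c_prime) / c:.1%}")
    ```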

  • Sheetal Chaudhuri, Hao Han, Caitlin Monaghan, John Larkin, Peter Waguespack, Brian Shulman, Zuwen Kuang, Srikanth Bellamkonda, Jane Brzozowski, Jeffrey Hymes, Mike Black, Peter Kotanko, Jeroen P Kooman, Franklin W Maddux, Len Usvyat

    RESULTSTreatment data from 616 in-center dialysis patients in the six clinics were curated into a big data store and fed into a Machine Learning (ML) model developed and deployed within the cloud. The threshold for classifying observations as positive or negative was set at 0.08. Precision for the model at this threshold was 0.33 and recall was 0.94. The area under the receiver operating characteristic curve (AUROC) for the ML model was 0.89 using test data.CONCLUSIONSThe findings from our proof-of-concept analysis demonstrate the design of a cloud-based framework that can be used for making real-time predictions of events during dialysis treatments. Making real-time predictions has the potential to assist clinicians at the point of care during hemodialysis.METHODSWe conducted a proof-of-concept analysis to retrospectively assess near real-time dialysis treatment data from in-center patients in six clinics using an Optical Sensing Device (OSD) from December 2018 to August 2019. The goal of this analysis was to use real-time OSD data to predict whether a patient's relative blood volume (RBV) would decrease at a rate of at least -6.5% per hour within the next 15 min of a dialysis treatment, based on 10-second windows of data from the previous 15 min. A dashboard application was constructed to demonstrate how reporting structures may be developed to alert clinicians in real time of at-risk cases. Data were derived from three sources: (1) OSDs, (2) hemodialysis machines, and (3) patient electronic health records.BACKGROUNDInadequate refilling from extravascular compartments during hemodialysis can lead to intradialytic symptoms, such as hypotension, nausea, vomiting, and cramping/myalgia. Relative blood volume (RBV) plays an important role in adapting the ultrafiltration rate, which in turn has a positive effect on intradialytic symptoms. It has been clinically challenging to identify changes in RBV in real time to proactively intervene and reduce potential negative consequences of volume depletion. Leveraging advanced technologies to process large volumes of dialysis and machine data in real time and developing prediction models using machine learning (ML) is critical for identifying these signals.
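
    As a rough, self-contained illustration of the prediction task described above (classifying 15-minute histories of 10-second RBV readings, with alerts raised at a probability threshold of 0.08), the following Python sketch trains a toy classifier on synthetic traces. The feature set, model choice, and data are assumptions for illustration; the study's actual pipeline is not described at this level of detail.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    WINDOW_S, HISTORY_MIN, THRESHOLD = 10, 15, 0.08  # 0.08 threshold, as in the analysis

    def featurize(rbv_trace):
        # Summarize the last 15 min of 10-second RBV readings (90 samples):
        # level, variability, linear trend, and net change.
        w = np.asarray(rbv_trace[-(HISTORY_MIN * 60 // WINDOW_S):], dtype=float)
        slope = np.polyfit(np.arange(len(w)), w, 1)[0]
        return [w.mean(), w.std(), slope, w[-1] - w[0]]

    # Synthetic stand-in data: stable traces (label 0) vs steadily falling
    # traces (label 1; a 1-3% drop per 15 min exceeds the -6.5 %/h criterion).
    traces, labels = [], []
    for _ in range(50):
        traces.append(100 + rng.normal(0, 0.1, 90)); labels.append(0)
        traces.append(100 - np.linspace(0, rng.uniform(1, 3), 90)
                      + rng.normal(0, 0.1, 90)); labels.append(1)

    model = LogisticRegression().fit([featurize(t) for t in traces], labels)
    live_trace = 100 - np.linspace(0, 2, 90)  # a falling RBV trace
    p = model.predict_proba([featurize(live_trace)])[0, 1]
    print("alert" if p >= THRESHOLD else "ok", round(float(p), 3))
    ```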

  • Vaibhav Maheshwari, Xia Tao, Stephan Thijssen, Peter Kotanko

    Removal of protein-bound uremic toxins (PBUTs) during conventional dialysis is insufficient. PBUTs are associated with comorbidities and mortality in dialysis patients. Albumin is the primary carrier for PBUTs, and only a small free fraction of PBUTs is dialyzable. In the past, we proposed a novel method where a binding competitor is infused upstream of a dialyzer into an extracorporeal circuit. The competitor competes with PBUTs for their binding sites on albumin and increases the free PBUT fraction. Essentially, binding competitor-augmented hemodialysis is a reactive membrane separation technique and represents a paradigm shift from conventional dialysis therapies. The proposed method has been tested in silico, ex vivo, and in vivo, and has proven effective in all scenarios. In an ex vivo study and a proof-of-concept clinical study with 18 patients, ibuprofen was used as a binding competitor; however, chronic ibuprofen infusion may affect residual kidney function. Binding competition with free fatty acids significantly improved PBUT removal in pre-clinical rat models. Based on in silico analysis, tryptophan can also be used as a binding competitor; importantly, fatty acids and tryptophan may have salutary effects in hemodialysis patients. More chemoinformatics research and further pre-clinical and clinical studies are required to identify ideal binding competitors before routine clinical use.

  • Jonathan S Chávez-Íñiguez, Pablo Maggiani-Aguilera, Christian Pérez-Flores, Rolando Claure-Del Granado, Andrés E De la Torre-Quiroga, Alejandro Martínez-Gallardo González, Guillermo Navarro-Blackaller, Ramón Medina-González, Jochen G Raimann, Francisco G Yanowsky-Escatell, Guillermo García-García

    RESULTSFrom 2017 to 2020, we analyzed 288 AKI patients. The mean age was 55.3 years, 60.7% were male, AKI KDIGO stage 3 was present in 50.5%, sepsis was the main etiology (50.3%), and 72 (25%) patients started KRT. Overall survival was 84.4%. Fluid adjustment was the only intervention associated with a decreased risk of starting KRT (odds ratio [OR]: 0.58, 95% confidence interval [CI]: 0.48-0.70, p ≤ 0.001) and of AKI progression to stage 3 (OR: 0.59, 95% CI: 0.49-0.71, p ≤ 0.001). Receiving vasopressors and KRT were associated with mortality. None of the interventions studied was associated with a reduced risk of death.CONCLUSIONSIn this prospective cohort study of AKI patients, we found for the first time that early nephrologist intervention and fluid prescription adjustment were associated with a lower risk of starting KRT and of progression to AKI stage 3.BACKGROUNDBased on the pathophysiology of acute kidney injury (AKI), it is plausible that certain early interventions by the nephrologist could influence its trajectory. In this study, we investigated the impact of 5 early nephrology interventions on starting kidney replacement therapy (KRT), AKI progression, and death.METHODSIn a prospective cohort at the Hospital Civil of Guadalajara, we followed, for 10 days, AKI patients in whom a nephrology consultation had been requested. We analyzed 5 early interventions of the nephrology team (fluid adjustment, nephrotoxic withdrawal, antibiotic dose adjustment, nutritional adjustment, and removal of hyperchloremic solutions), applying propensity score adjustment and multivariate analysis for the risk of starting KRT (primary objective) and of AKI progression to stage 3 and death (secondary objectives).

  • Warren Krackov, Murat Sor, Rishi Razdan, Hanjie Zhang, Peter Kotanko

    RESULTSThere was a high degree of correlation between our noninvasive AI instrument and the results of the adjudication by the vascular experts. Our results indicate that CNN can automatically classify aneurysms. We achieved a >90% classification accuracy in the validation images.CONCLUSIONThis is the first quality improvement project to show that an AI instrument can reliably grade vascular access aneurysms in a noninvasive way, allowing rapid assessments to be made on patients who would otherwise be at risk for highly morbid events. Moreover, these AI-assisted assessments can be made without having to schedule separate appointments and potentially even via telehealth.BACKGROUNDInnovations in artificial intelligence (AI) have proven to be effective contributors to high-quality health care. We examined the beneficial role AI can play in noninvasively grading vascular access aneurysms to reduce high-morbidity events, such as rupture, in ESRD patients on hemodialysis.METHODSOur AI instrument noninvasively examines and grades aneurysms in both arteriovenous fistulas and arteriovenous grafts. Aneurysm stages were adjudicated by 3 vascular specialists, based on a grading system that focuses on actions that need to be taken. Our automatic classification of aneurysms builds on 2 components: (a) the use of smartphone technology to capture aneurysm appearance and (b) the analysis of these images using a cloud-based convolutional neural network (CNN).
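
    The classifier described above pairs smartphone image capture with a cloud-hosted convolutional neural network. A minimal Keras sketch of such an image classifier follows; the layer sizes, input resolution, and number of grades are assumptions for illustration, since the abstract does not specify the architecture.

    ```python
    import tensorflow as tf

    NUM_GRADES = 3  # hypothetical; the grading scale's class count is assumed

    # A deliberately small CNN; the study's actual architecture is not described.
    model = tf.keras.Sequential([
        tf.keras.layers.Rescaling(1.0 / 255, input_shape=(224, 224, 3)),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(NUM_GRADES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()
    # Training would pair smartphone photos (resized to 224x224) with
    # expert-adjudicated grades, e.g. model.fit(images, grades, ...).
    ```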

  • Maggie Han, Xiaoling Ye, Sharon Rao, Schantel Williams, Stephan Thijssen, Jeffrey Hymes, Franklin W Maddux, Peter Kotanko

    RESULTSPatients were 65 years old, 57% were male, and HD vintage was 10 months. Patients whose dialysis treatments started before 8:30 a.m. were more likely to be younger, male, and of greater dialysis vintage. Patients receiving Engerix B® and starting dialysis before 8:30 a.m. had a significantly higher seroconversion rate compared to patients who started dialysis after 8:30 a.m. Early dialysis start was a significant predictor of seroconversion in univariate regression and in multivariate regression including male gender, but not in multivariate regression including age, neutrophil-to-lymphocyte ratio, and vintage.CONCLUSIONWhile better sleep following vaccination is associated with seroconversion in the general population, this is not the case in hemodialysis patients after multivariate adjustment. In the context of end-stage kidney disease, early dialysis start is not a significant predictor of HB vaccination response. The association between objectively measured postvaccination sleep duration and seroconversion rate should be investigated.BACKGROUND/AIMSHepatitis B (HB) vaccination in hemodialysis patients is important as they are at a higher risk of contracting HB. However, hemodialysis patients have a lower HB seroconversion rate than their healthy counterparts. As better sleep has been associated with better seroconversion in healthy populations, and early hemodialysis start has been linked to significant sleep-wake disturbances in hemodialysis patients, we examined whether hemodialysis treatment start time is associated with HB vaccination response.METHODSDemographics, standard-of-care clinical, laboratory, and treatment parameters, dialysis shift data, HB antigen status, HB vaccination status, and HB titers were collected from hemodialysis patients in Fresenius clinics from January 2010 to December 2015. Patients in our analysis received 90% of dialysis treatments either before or after 8:30 a.m., were negative for HB antigen, and received a complete series of HB vaccination (Engerix B® or Recombivax HB™). Univariate and multivariate regression models examined whether dialysis start time is a predictor of HB vaccination response.

  • Pablo Maggiani-Aguilera, Jonathan S Chávez-Iñiguez, Joana G Navarro-Gallardo, Guillermo Navarro-Blackaller, Alondra M Flores-Llamas, Tania Pelayo-Retano, Erendira A Arellano-Delgado, Violeta E González-Montes, Ekatherina Yanowsky-Ortega, Jochen G Raimann, Guillermo Garcia-Garcia

    RESULTSBetween January 2019 and January 2020, a total of 75 patients with tunnelled catheter insertion were analysed. Catheter replacement at 6 months occurred in 10 (13.3%) patients. By multivariate analysis, incorrect catheter tip position (SVC) (OR 1.23, 95% CI 1.07-1.42, p < .004), the presence of extrasystoles during the procedure (OR 0.88, 95% CI 0.78-0.98, p = .03), incorrect catheter tug (OR 1.31, 95% CI 1.10-1.55, p = .003), incorrect catheter top position (kinking; OR 1.40, 95% CI 1.04-1.88, p = .02), and catheter-related bloodstream infection (OR 2.60, 95% CI 2.09-3.25, p < .001) were the only variables associated with catheter replacement at 6-month follow-up.AIMTunnelled haemodialysis (HD) catheters can be used instantly, but several anatomical variables could impact their survival. This study aimed to examine the association of several novel anatomic variables with catheter replacement.CONCLUSIONThe risk of catheter replacement at 6-month follow-up could be attenuated by avoiding incorrect catheter tug and top position, and by placing the vascular catheter tip in the CAJ or MDA.METHODSA single-centre prospective cohort study of chronic kidney disease G5 patients was conducted. The primary aim was to determine the factors associated with catheter replacement during the first 6 months of follow-up. All procedures were performed without fluoroscopy. Three anatomic regions for catheter tip position were defined: superior vena cava (SVC), cavo-atrial junction (CAJ), and mid-to-deep atrium (MDA). Many other anatomical variables were measured. Catheter-related bloodstream infection was also included.

  • Juliana Leme, Murilo Guedes, John Larkin, Maggie Han, Ana Beatriz Lesqueves Barra, Maria Eugenia F Canziani, Américo Lourenço Cuvello Neto, Carlos Eduardo Poli-de-Figueiredo, Thyago Proenca de Moraes, Roberto Pecoits-Filho

    RESULTSAmong 176 patients, the Vitality score was 63 ± 21, and the DRT was ≤30 minutes in 57% of patients. The mean number of steps was 5288 ± 3540 in the 24 hours after HD and 953 ± 617 in the 2-hour post-HD period. Multivariable analysis confirmed that Vitality scores were associated with physical activity in the 24-hour post-HD period. In contrast, DRT was not associated with physical activity captured by the accelerometer in the period immediately (2 hours) after the HD session.AIMFatigue in haemodialysis (HD) patients can be captured by quality of life questionnaires and by the dialysis recovery time (DRT) question. The association between fatigue and objectively measured physical activity has not previously been explored. We tested the hypothesis that patient perception of chronic and post-dialysis fatigue would be associated with lower physical activity.CONCLUSIONChronic fatigue was negatively associated with step counts, while patient perception of post-dialysis fatigue was not associated with physical activity. These patterns indicate limitations in the interpretation of DRT. Since physical activity is an important component of a healthy life, our results may partially explain the associations between fatigue and poor outcomes in HD patients.METHODSThis study was a cross-sectional evaluation of baseline data from HD patients recruited into the HDFIT trial. Vitality scores from the Kidney Disease Quality of Life (KDQOL-36) instrument and the dialysis recovery time (DRT) question were used as indicators of chronic and post-dialysis fatigue, respectively. Granular physical activity was measured by accelerometers as part of the study protocol.

  • Hemodialysis international. International Symposium on Home Hemodialysis

    15 Oct 2021 Physical activity in hemodialysis patients on nondialysis and dialysis days: Prospective observational study

    Rakesh Malhotra, Ujjala Kumar, Patricia Virgen, Bryan Magallon, Pranav S Garimella, Tushar Chopra, Peter Kotanko, T Alp Ikizler, Danuta Trzebinska, Lisa Cadmus-Bertram, Joachim H Ix

    DISCUSSIONESKD participants receiving hemodialysis are frequently sedentary, and inactivity appears more pronounced in older patients. These findings may assist in designing patient-centered interventions to increase physical activity among hemodialysis patients.INTRODUCTIONThe physical decline in patients with end-stage kidney disease (ESKD) is associated with morbidity and mortality. Prior studies have attempted to promote physical activity at the time of dialysis; however, physical activity patterns on nondialysis days are unknown. This study aimed to quantify physical activity on dialysis and nondialysis days in hemodialysis patients using a wearable actigraph.FINDINGSOf the 52 recruited, 45 participants (urban = 25; rural = 20) completed the study. The mean age was 61 ± 15 years, 42% were women, 64% were Hispanic, and the mean dialysis vintage was 4.4 ± 3.0 years. Among those with valid Fitbit data (defined as ≥10 hours of wear per day) for 28 days (n = 45), participants walked an average of 3688 steps per day, and 73% were sedentary (<5000 steps/day). Participants aged >80 years were less active than younger (age <65 years) participants (1232 vs. 4529 steps, P = 0.01). There were no statistical differences between groups when stratified by gender (women vs. men: 2817 vs. 4324 steps), urbanicity (rural vs. urban dialysis unit: 3141 vs. 4123 steps), or dialysis/nondialysis day (3177 vs. 4133 steps). Given the small sample size, we also calculated effect sizes. The effect size was medium for the gender difference (Cohen's d = 0.57) and small to medium for urbanicity and dialysis/nondialysis day (d = 0.37 and d = 0.33, respectively). We found no association between physical activity and self-reported depression and fatigue scales. The majority of participants (62%, 28/45) found the Fitbit tracker easy to wear and comfortable.METHODSIn this prospective study, subjects receiving hemodialysis were recruited from two outpatient dialysis units in urban San Diego and rural Imperial County, CA, between March 2018 and April 2019. Key inclusion criteria were: (1) receiving thrice-weekly hemodialysis for ≥3 months, (2) age ≥18 years, and (3) able to walk with or without assistive devices. All participants wore a Fitbit Charge 2 tracker for a minimum of 4 weeks. The primary outcome was the number of steps per day. Each participant completed the Physical Activity Questionnaire, the Patient Health Questionnaire (PHQ)-9, and the PROMIS Short Form Fatigue Questionnaire at baseline, and the Participant Technology Experience Questionnaire at day 7 after study enrolment.
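
    Cohen's d, used above to gauge effect sizes, is the mean difference divided by the pooled standard deviation. A small Python sketch on synthetic data shaped like the reported gender comparison; the common step-count SD of roughly 2600 is an assumption chosen so that d lands near the reported 0.57, not a published figure.

    ```python
    import numpy as np

    def cohens_d(x, y):
        """Cohen's d: mean difference divided by the pooled standard deviation."""
        nx, ny = len(x), len(y)
        pooled_var = ((nx - 1) * np.var(x, ddof=1)
                      + (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
        return (np.mean(x) - np.mean(y)) / np.sqrt(pooled_var)

    # Synthetic step counts: ~26 men vs. ~19 women, means as reported.
    rng = np.random.default_rng(1)
    men = rng.normal(4324, 2600, 26)
    women = rng.normal(2817, 2600, 19)
    print(round(cohens_d(men, women), 2))  # lands near the reported d = 0.57
    ```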

  • Roberto Pecoits-Filho, John Larkin, Carlos Eduardo Poli-de-Figueiredo, Américo Lourenço Cuvello-Neto, Ana Beatriz Lesqueves Barra, Priscila Bezerra Gonçalves, Shimul Sheth, Murilo Guedes, Maggie Han, Viviane Calice-Silva, Manuel Carlos Martins de Castro, Peter Kotanko, Thyago Proenca de Moraes, Jochen G Raimann, Maria Eugenia F Canziani

    RESULTSWe randomized 195 patients (HDF 97; HD 98) between August 2016 and October 2017. Despite the achievement of a high convective volume in the majority of sessions and a positive impact on solute removal, the treatment effect of HDF on the primary outcome was +538 steps/24 h after dialysis [95% confidence interval (CI) -330 to 1407] compared with HD and was not statistically significant. The observed treatment effect was modest and driven by steps taken between 1.5 and 24.0 h after dialysis, in particular between 20 and 24 h (+197 steps; 95% CI -95 to 488).CONCLUSIONSHDF did not have a statistically significant treatment effect on PA 24 h following dialysis, although the effect sizes may be clinically meaningful and deserve further investigation.BACKGROUNDDialysis patients are typically inactive, and their physical activity (PA) decreases over time. Uremic toxicity has been suggested as a potential causal factor of low PA in dialysis patients. Post-dilution high-volume online hemodiafiltration (HDF) provides greater removal of higher-molecular-weight solutes, and studies suggest better clinical/patient-reported outcomes compared with hemodialysis (HD).METHODSHDFIT was a randomized controlled trial at 13 clinics in Brazil that aimed to investigate the effects of HDF on measured PA (step counts) as a primary outcome. Stable HD patients (vintage 3-24 months) were randomized to receive HDF or high-flux HD. The treatment effect of HDF on the primary outcome from baseline to 3 and 6 months was estimated using a linear mixed-effects model.
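
    A linear mixed-effects model of the kind described, with a random intercept per patient and a treatment-by-time interaction as the effect of interest, can be sketched with statsmodels; all data, column names, and effect sizes below are synthetic placeholders, not the trial's data.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic long-format data: one row per patient per visit (0, 3, 6 months).
    rng = np.random.default_rng(2)
    n = 60
    df = pd.DataFrame({
        "patient": np.repeat(np.arange(n), 3),
        "month": np.tile([0, 3, 6], n),
        "hdf": np.repeat(rng.integers(0, 2, n), 3),   # 1 = randomized to HDF
    })
    df["steps"] = (5000
                   + 500 * df["hdf"] * (df["month"] > 0)   # assumed treatment effect
                   + np.repeat(rng.normal(0, 600, n), 3)   # patient random intercept
                   + rng.normal(0, 800, len(df)))          # residual noise

    # Random intercept per patient; the hdf-by-month terms carry the effect.
    fit = smf.mixedlm("steps ~ hdf * C(month)", df, groups=df["patient"]).fit()
    print(fit.params.filter(like="hdf"))
    ```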

  • Marijke J E Dekker, Len A Usvyat, Constantijn J A M Konings, Jeroen P Kooman, Bernard Canaud, Paola Carioni, Daniele Marcelli, Frank M van der Sande, Vaibhav Maheshwari, Yuedong Wang, Peter Kotanko, Jochen G Raimann

    Pre-hemodialysis systolic blood pressure variability (pre-HD SBPV) has been associated with outcomes. The association of a change in pre-HD SBPV over time with outcomes, and the predictors of this change, have not yet been studied. We therefore studied this in a cohort of 8,825 incident hemodialysis (HD) patients from the European Monitoring Dialysis Outcomes Initiative database. Patient-level pre-HD SBPV was calculated as the standard deviation of the residuals of a linear regression model of systolic blood pressure (SBP) over time, divided by the individual mean SBP in the respective time period. The pre-HD SBPV difference between months 1-6 and 7-12 was used as an indicator of pre-HD SBPV change. The association between pre-HD SBPV change and all-cause mortality in year 2 was analyzed by multivariate Cox models. Predictors of pre-HD SBPV change were determined by logistic regression models. We found that the highest pre-HD SBPV tertile in the first 6 months after initiation of HD had the highest mortality rate (adjusted HR 1.44 [95% confidence interval (CI): 1.15-1.79]). An increase in pre-HD SBPV between months 1-6 and 7-12 was associated with an increased risk of mortality in year 2 (adjusted HR 1.29 [95% CI: 1.05-1.58]) compared with stable pre-HD SBPV. A pre-HD SBPV increase was associated with female gender, higher mean pre-HD SBP and pulse pressure, and lower HD frequency.
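
    The abstract defines patient-level pre-HD SBPV precisely: the standard deviation of the residuals from a linear regression of SBP on time, divided by the patient's mean SBP over the same period. A direct Python translation of that definition; the example readings are made up.

    ```python
    import numpy as np

    def pre_hd_sbpv(days, sbp):
        """SD of residuals from a linear regression of SBP on time,
        divided by the patient's mean SBP over the same period."""
        t = np.asarray(days, dtype=float)
        sbp = np.asarray(sbp, dtype=float)
        slope, intercept = np.polyfit(t, sbp, 1)
        residuals = sbp - (slope * t + intercept)
        return residuals.std(ddof=1) / sbp.mean()

    # Made-up pre-HD readings (mmHg) across treatment days:
    days = [1, 3, 5, 8, 10, 12, 15, 17, 19]
    sbp = [142, 150, 138, 155, 147, 160, 141, 152, 149]
    print(round(pre_hd_sbpv(days, sbp), 3))
    ```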

  • Hemodialysis international. International Symposium on Home Hemodialysis

    6 Oct 2021 Achieving high convective volume in hemodiafiltration: Lessons learned after successful implementation in the HDFit trial

    Murilo Guedes, Ana Claudia Dambiski, Sinaia Canhada, Ana Beatriz L Barra, Carlos Eduardo Poli-de-Figueiredo, Américo Lourenço Cuvello Neto, Maria Eugênia F Canziani, Jorge Paulo Strogoff-de-Matos, Jochen G Raimann, John Larkin, Bernard Canaud, Roberto Pecoits-Filho

    DESIGN, SETTING, PARTICIPANTS, AND MEASUREMENTSWe analyzed the results of the implementation of postdilution OL-HDF in patients randomized to the HDF arm of a clinical trial (impact of hemoDiaFIlTration on physical activity and self-reported outcomes: a randomized controlled trial [HDFit]; ClinicalTrials.gov: NCT02787161). The day before randomization of the first patient to OL-HDF at each clinic, staff started a 3-day in-person training module on operation of the Fresenius 5008 CorDiax machine in HDF mode. Patients were converted from high-flux HD to OL-HDF under the oversight of trainers. OL-HDF was performed over a 6-month follow-up with a CV target of 22 L/treatment. We characterized achievement of a median CV >22 L/treatment and analyzed the impact of HDF on biochemical variables.RESULTSNinety-seven patients (mean age 53 ± 16 years, 29% with diabetes, and 11% with a catheter) from 13 clinics randomized to the OL-HDF arm of the trial were converted from HD to HDF. A median CV >22 L/treatment was achieved in 99% (94/95) of OL-HDF patients throughout follow-up. Monthly mean CV ranged from 27.1 L to 27.5 L. OL-HDF provided an increased single-pool Kt/V at 3 months (0.2 [95% CI: 0.1-0.3]) and 6 months (0.2 [95% CI: 0.1-0.4]) compared to baseline, and reduced phosphate at 3 months (-0.4 mg/dL [95% CI: -0.8 to -0.12]) of follow-up.CONCLUSIONSHigh-volume online hemodiafiltration was successfully implemented, with 99% of patients achieving the protocol-defined CV target. Monthly mean CV was consistently >22 L/treatment during follow-up. Kt/V increased and phosphate decreased with OL-HDF. Findings following a short training period across several dialysis facilities suggest HDF is an easily implementable technique.BACKGROUND AND OBJECTIVESHigh-volume online hemodiafiltration (OL-HDF) is associated with improved outcomes compared to hemodialysis (HD), provided adequate dosing is achieved as estimated from convective volume (CV). Achievement of high CV and its impact on biochemical indicators following a standardized protocol converting HD patients to OL-HDF has not been systematically reported. We assessed the success of implementation of OL-HDF in clinics naïve to the modality.

  • Markus Pirklbauer, David A Bushinsky, Peter Kotanko, Gudrun Schappacher-Tilp

    Background: Personalized management of secondary hyperparathyroidism is a critical part of hemodialysis patient care. We used a mathematical model of parathyroid gland (PTG) biology to predict (1) short-term peridialytic intact PTH (iPTH) changes in response to diffusive calcium (Ca) fluxes and (2) long-term iPTH levels. Methods: We dialyzed 26 maintenance hemodialysis patients on a single occasion with a dialysate Ca concentration of 1.75 mmol/l to attain a positive dialysate-to-blood ionized Ca (iCa) gradient and thus diffusive Ca loading. Intradialytic iCa kinetics, peridialytic iPTH change, and dialysate-sided iCa mass balance (iCaMB) were assessed. Patient-specific PTG model parameters were estimated using clinical, medication, and laboratory data. We then used the personalized PTG model to predict peridialytic and long-term (6-month) iPTH levels. Results: At dialysis start, the median dialysate-to-blood iCa gradient was 0.3 mmol/l (IQR 0.11). The intradialytic iCa gain was 488 mg (IQR 268). The median iPTH decrease was 75% (IQR 15), from 277 pg/ml pre-dialysis to 51 pg/ml post-dialysis. Neither the iCa gradient nor the iCaMB was significantly associated with peridialytic iPTH changes. The personalized PTG model accurately predicted both short-term, treatment-level peridialytic iPTH changes (r = 0.984, p < 0.001, n = 26) and patient-level 6-month iPTH levels (r = 0.848, p < 0.001, n = 13). Conclusions: This is the first report showing that both short-term and long-term iPTH dynamics can be predicted using a personalized mathematical model of PTG biology. Prospective studies are warranted to explore further model applications, such as patient-level prediction of the iPTH response to PTH-lowering treatment.

  • Gabriela Ferreira Dias, Sara Soares Tozoni, Gabriela Bohnen, Nadja Grobe, Silvia D Rodrigues, Tassiana Meireles, Lia S Nakao, Roberto Pecoits-Filho, Peter Kotanko, Andréa Novais Moreno-Amaral

    RESULTSHere, we show that HD-RBC have less intracellular oxygen and that it is further decreased post-HD. Incubation in 5% O2 and uremia triggered eryptosis in vitro, as shown by PS exposure. Hypoxia itself increased PS exposure in HD-RBC and CON-RBC, and the addition of uremic serum aggravated it. Furthermore, inhibition of the organic anion transporter 2 with ketoprofen reversed eryptosis and restored intracellular oxygen levels. Cytosolic levels of the uremic toxins p-cresyl sulfate (pCS) and indole-3-acetic acid (IAA) were decreased after dialysis.CONCLUSIONThese findings suggest the participation of uremic toxins and hypoxia in the processes of eryptosis and intracellular oxygenation.BACKGROUND/AIMSChronic kidney disease is frequently accompanied by anemia, hypoxemia, and hypoxia. It has become clear that impaired erythropoietin production and altered iron homeostasis are not the sole causes of renal anemia. Eryptosis is a process of red blood cell (RBC) death, analogous to apoptosis of nucleated cells, characterized by Ca2+ influx and phosphatidylserine (PS) exposure on the outer RBC membrane leaflet. Eryptosis can be induced by uremic toxins and occurs before senescence, thus shortening RBC life span and aggravating renal anemia. We aimed to assess eryptosis and intracellular oxygen levels of RBC from hemodialysis patients (HD-RBC) and their response to hypoxia, uremia, and uremic toxin uptake inhibition.METHODSUsing flow cytometry, RBC from healthy individuals (CON-RBC) and HD-RBC were subjected to PS (Annexin-V), intracellular Ca2+ (Fluo-3/AM), and intracellular oxygen (Hypoxia Green) measurements, at baseline and after incubation with uremic serum and/or hypoxia (5% O2), with or without ketoprofen. Baseline levels of uremic toxins were quantified in serum and cytosol by high-performance liquid chromatography.

  • Ravi Thadhani, Joanna Willetts, Catherine Wang, John Larkin, Hanjie Zhang, Lemuel Rivera Fuentes, Len Usvyat, Kathleen Belmonte, Yuedong Wang, Robert Kossmann, Jeffrey Hymes, Peter Kotanko, Franklin Maddux

    RESULTSAmong 170,234 hemodialysis patients, 4,782 (2.8%) tested positive for SARS-CoV-2 (mean age 64 years, 44% female). Most facilities (68.5%) had 0 to 1 positive SARS-CoV-2 patient. We matched 2,379 SARS-CoV-2 positive cases to 2,379 non-SARS-CoV-2 controls; 1.30% (95% CI 0.90%, 1.87%) of cases and 1.39% (95% CI 0.97%, 1.97%) of controls were exposed to a chair previously sat in by a shedding SARS-CoV-2 patient. Transmission risk among cases was not significantly different from controls (OR = 0.94; 95% CI 0.57 to 1.54; p = 0.80). Results remained consistent in adjusted and sensitivity analyses.CONCLUSIONSThe risk of indirect patient-to-patient transmission of SARS-CoV-2 infection from dialysis chairs appears to be low.BACKGROUNDSARS-CoV-2 can remain transiently viable on surfaces. We examined whether use of shared chairs in outpatient hemodialysis associates with a risk for indirect patient-to-patient transmission of SARS-CoV-2.METHODSWe used data from adults treated at 2,600 hemodialysis facilities in the United States between February 1st and June 8th, 2020. We performed a retrospective case-control study matching each SARS-CoV-2 positive patient (case) to a non-SARS-CoV-2 patient (control) treated in the same dialysis shift. Cases and controls were matched on age, sex, race, facility, shift date, and treatment count. For each case-control pair, we traced backward 14 days to assess possible prior exposure from a 'shedding' SARS-CoV-2 positive patient who sat in the same chair immediately before the case or control. Conditional logistic regression models tested whether chair exposure after a shedding SARS-CoV-2 positive patient conferred a higher risk of SARS-CoV-2 infection to the immediate subsequent patient.
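
    The backward tracing described above amounts to looking up, for each of a patient's treatments in the prior 14 days, whether the chair's immediately preceding occupant was a shedding patient. A minimal pandas sketch of that lookup; the treatment log and its columns are hypothetical stand-ins for the study's data.

    ```python
    import pandas as pd

    # Hypothetical treatment log: one row per treatment at a given chair.
    log = pd.DataFrame({
        "patient": ["A", "B", "A", "C"],
        "chair": [7, 7, 7, 7],
        "start": pd.to_datetime(["2020-04-01 06:00", "2020-04-01 11:00",
                                 "2020-04-03 06:00", "2020-04-03 11:00"]),
        "shedding": [True, False, False, False],  # infectious SARS-CoV-2 positive
    })

    def exposed_via_chair(patient, date, lookback_days=14):
        """Did any of this patient's treatments in the lookback window use a
        chair whose immediately preceding occupant was a shedding patient?"""
        window = log[(log["patient"] == patient)
                     & (log["start"] > date - pd.Timedelta(days=lookback_days))
                     & (log["start"] <= date)]
        for _, row in window.iterrows():
            prior = log[(log["chair"] == row["chair"]) & (log["start"] < row["start"])]
            if not prior.empty and prior.sort_values("start").iloc[-1]["shedding"]:
                return True
        return False

    print(exposed_via_chair("B", pd.Timestamp("2020-04-01 23:59")))  # True
    ```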

  • Murilo Guedes, Roberto Pecoits-Filho, Juliana El Ghoz Leme, Yue Jiao, Jochen G Raimann, Yuedong Wang, Peter Kotanko, Thyago Proença de Moraes, Ravi Thadhani, Franklin W Maddux, Len A Usvyat, John W Larkin

    RESULTSAmong 98,616 incident HD patients (age 62.6 ± 14.4 years, 57.8% male) who responded to the DRT survey, a higher spKt/V in the incident period was associated with a 13.5% lower risk (OR = 0.865; 95% CI 0.801-0.935) of a change to a longer DRT in the first-prevalent year. A higher number of HD treatments with IDH episodes per month in the incident period was associated with a 0.8% (OR = 1.008; 95% CI 1.001-1.015) and 1.6% (OR = 1.016; 95% CI 1.006-1.027) higher probability of a change to a longer DRT in the first- and second-prevalent years, respectively. Consistently, an increase in the incidence of IDH episodes/month was associated with a change to a longer DRT over time.CONCLUSIONSIncident patients who had a higher spKt/V and fewer sessions with IDH episodes had a lower likelihood of changing to a longer DRT in the first year of HD. Dose optimization strategies with cardiac stability in fluid removal should be tested.BACKGROUNDDialysis recovery time (DRT) surveys capture the perceived time after HD to return to performing regular activities. Prior studies suggest the majority of HD patients report a DRT > 2 h. However, the profiles of, and modifiable dialysis practices associated with, changes in DRT relative to the start of dialysis are unknown. We hypothesized that hemodialysis (HD) dose and rates of intradialytic hypotension (IDH) would associate with changes in DRT in the first years after initiating dialysis.METHODSWe analyzed data from adult HD patients who responded to a DRT survey ≤180 days from their first date of dialysis (FDD) during 2014 to 2017. The DRT survey was administered with the annual KDQOL survey. The DRT survey asks: "How long does it take you to be able to return to your normal activities after your dialysis treatment?" Answers are: <0.5, 0.5-1, 1-2, 2-4, or >4 h. An adjusted logistic regression model computed the odds ratio for a change to a longer DRT (increase above DRT > 2 h) in reference to a change to a shorter DRT (decrease below DRT < 2 h, or from DRT > 4 h). Changes in DRT were calculated from the incident (≤180 days from FDD) to the first-prevalent (>365 to ≤545 days from FDD) and second-prevalent (>730 to ≤910 days from FDD) years.

  • John W Larkin, Maggie Han, Hao Han, Murilo H Guedes, Priscila Bezerra Gonçalves, Carlos Eduardo Poli-de-Figueiredo, Américo Lourenço Cuvello-Neto, Ana Beatriz L Barra, Thyago Proença de Moraes, Len A Usvyat, Peter Kotanko, Maria Eugenia F Canziani, Jochen G Raimann, Roberto Pecoits-Filho

    RESULTSAmong 195 patients (mean age 53 ± 15 years, 71% male), step counts per 24 h were 3919 ± 2899 on HD days, 5308 ± 3131 on first non-HD days (p < 0.001), and 4926 ± 3413 on second non-HD days (p = 0.032). During times concurrent/parallel to HD on first and second non-HD days, patients took 1308 and 1128 more steps, respectively (both p < 0.001). Patients took 276 more steps and had the highest rates of steps/hour in the 2 h post-HD versus the same times on first non-HD days (all p < 0.05). Consistent findings were observed on second non-HD days.CONCLUSIONSPA was higher within 2 h after HD versus the same times on non-HD days. Lower PA on HD days was attributable to intradialytic inactivity. The established PA profiles are of importance to the design and development of exercise programs that aim to increase activity during and between HD treatments.TRIAL REGISTRATIONHDFIT was prospectively registered 20 April 2016 on ClinicalTrials.gov (NCT02787161).BACKGROUNDPhysical activity (PA) is typically lower on hemodialysis (HD) days. Although intradialytic inactivity is expected, it is unknown whether recovery after HD contributes to low PA. We investigated the impact of HD and the post-HD period on granular PA relative to HD timing.METHODSWe used baseline data from the HDFIT trial conducted from August 2016 to October 2017. Accelerometry measured PA over 1 week in patients who received thrice-weekly high-flux HD (vintage 3 to 24 months), were clinically stable, and had no ambulatory limitations. PA was assessed on HD days (0 to ≤24 h after start of HD), first non-HD days (>24 to ≤48 h after start of HD), and second non-HD days (>48 to ≤72 h after start of HD). PA was recorded in blocks/slices: 4 h during HD, 0 to ≤2 h post-HD (30-min slices), and >2 to ≤20 h post-HD (4.5-h slices). Blocks/slices of PA were captured at concurrent/parallel times on first/second non-HD days for comparison with HD days.

  • Sheetal Chaudhuri, Hao Han, Len Usvyat, Yue Jiao, David Sweet, Allison Vinson, Stephanie Johnstone Steinberg, Dugan Maddux, Kathleen Belmonte, Jane Brzozowski, Brad Bucci, Peter Kotanko, Yuedong Wang, Jeroen P Kooman, Franklin W Maddux, John Larkin

    RESULTSThe between-group difference in annual hospital admission and day rates was similar at baseline (2015), with mean differences between DHRP and control clinics of -0.008 ± 0.09 ppy and -0.05 ± 0.96 ppy, respectively. The between-group differences in hospital admission and day rates became more distinct by the end of follow-up (2018), favoring DHRP clinics, with mean differences of -0.155 ± 0.38 ppy and -0.97 ± 2.78 ppy, respectively. A paired t-test showed the changes in the between-group differences in hospital admission and day rates from baseline to the end of follow-up were statistically significant (t = 2.73, p < 0.01 and t = 2.29, p = 0.02, respectively).CONCLUSIONSThese findings suggest that ML model-based, risk-directed interdisciplinary team interventions associate with lower hospital admission and day rates in HD patients compared to controls.BACKGROUNDAn integrated kidney disease company uses machine learning (ML) models that predict the 12-month risk of an outpatient hemodialysis (HD) patient having multiple hospitalizations to assist with directing personalized interdisciplinary interventions in a Dialysis Hospitalization Reduction Program (DHRP). We investigated the impact of risk-directed interventions in the DHRP on clinic-wide hospitalization rates.METHODSWe compared hospital admission and day rates per patient-year (ppy) from all hemodialysis patients in 54 DHRP and 54 control clinics, identified by propensity score matching, at baseline in 2015 and at the end of the pilot in 2018. We also used a paired t-test to compare the between-group differences in annual hospital admission and day rates at baseline and at the end of the pilot.
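
    The paired comparison described above can be sketched as follows; the per-pair differences are synthetic draws centered on the reported means and standard deviations, so the resulting t statistic will only roughly resemble the published one.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    n_pairs = 54  # matched DHRP/control clinic pairs

    # Per-pair difference (DHRP minus control) in admissions per patient-year,
    # drawn around the reported means and SDs.
    diff_2015 = rng.normal(-0.008, 0.09, n_pairs)   # baseline
    diff_2018 = rng.normal(-0.155, 0.38, n_pairs)   # end of follow-up

    # Did the between-group difference change from baseline to follow-up?
    t, p = stats.ttest_rel(diff_2018, diff_2015)
    print(f"t = {t:.2f}, p = {p:.3f}")
    ```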

  • Vaibhav Maheshwari, Robert S Hoffman, Stephan Thijssen, Xia Tao, Doris H Fuertinger, Peter Kotanko

    Hemodialysis (HD) has limited efficacy in treating drug toxicity when drug-protein binding is strong. In this work, we propose infusing a competitor drug into the extracorporeal circuit to increase the free fraction of the toxic drug and thereby increase its dialytic removal. We used a mechanistic model to assess the removal of phenytoin and carbamazepine during HD with or without binding competition. We simulated dialytic removal of (1) phenytoin, initial concentration 70 mg/L, using 2000 mg aspirin, and (2) carbamazepine, initial concentration 35 mg/L, using 800 mg ibuprofen, in a 70 kg patient. The competitor drug was infused at a constant rate. For phenytoin (~13% free at t = 0), HD brings the patient to therapeutic concentration in 460 min, while aspirin infusion reduces that time to 330 min. For carbamazepine (~27% free at t = 0), the ibuprofen infusion reduces the HD time to reach therapeutic concentration from 265 to 220 min. Competitor drugs with longer half-lives further reduce the HD time. Binding competition during HD is a potential treatment for drug toxicities for which current recommendations exclude HD because of strong drug-protein binding. We show clinically meaningful reductions in the treatment time necessary to achieve non-toxic concentrations in patients poisoned with these two prescription drugs.
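
    A toy one-compartment calculation illustrates why raising the free fraction shortens treatment time: only the free drug is cleared by the dialyzer. The clearance, volume, and free-fraction values below are illustrative assumptions, not the parameters of the study's mechanistic model.

    ```python
    def time_to_target(c0, c_target, free_fraction,
                       clearance_l_min=0.15, volume_l=40.0, dt_min=1.0):
        """Minutes of dialysis until concentration falls to target, assuming
        only the free (unbound) fraction is cleared; toy one-compartment model."""
        c, t = c0, 0.0
        while c > c_target:
            c -= (clearance_l_min * free_fraction * c / volume_l) * dt_min
            t += dt_min
        return t

    # Phenytoin-like scenario: ~13% free at baseline vs. a higher free fraction
    # under binding competition. All parameter values are illustrative.
    print(time_to_target(70, 20, free_fraction=0.13))  # without competitor
    print(time_to_target(70, 20, free_fraction=0.25))  # with competitor infusion
    ```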

  • Jochen G Raimann, Joseph Marfo Boaheng, Philipp Narh, Harrison Matti, Seth Johnson, Linda Donald, Hongbin Zhang, Friedrich Port, Nathan W Levin

    In rural regions with limited resources, the provision of clean water remains challenging. The resulting high incidence of diarrhea can lead to acute kidney injury and death, particularly in the young and the old. Membrane filtration using recycled hemodialyzers allows water purification. This study quantifies the public health effects of this approach. Between 02/2018 and 12/2018, 4 villages in rural Ghana were provided with a high-volume membrane filtration device (NuFiltration). Household surveys were collected monthly with approval from Ghana Health Services. Incidence rates of diarrhea for 5-month periods before and after implementation of the device were collected and compared to corresponding rates in 4 neighboring villages not yet equipped. Data from 1,130 villagers in the studied communities were collected over 10 months. Incidence rates declined following the implementation of the device, from 0.18 to 0.05 cases per person-month (ppm), compared with the control villages (0.11 to 0.08 ppm). The rate ratio of 0.27 for the study villages is revised to 0.38 when accounting for the non-significant rate reduction in the control villages. Provision of a repurposed hemodialyzer membrane filtration device markedly improves health outcomes as measured by diarrhea incidence within rural communities.
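
    The reported ratio-of-ratios can be reproduced from the published incidence rates; the small discrepancy from the quoted 0.27 reflects rounding of the published rates.

    ```python
    # Diarrhea incidence, cases per person-month, before vs. after rollout.
    intervention_before, intervention_after = 0.18, 0.05
    control_before, control_after = 0.11, 0.08

    rate_ratio = intervention_after / intervention_before      # ~0.28 (reported: 0.27)
    adjusted = rate_ratio / (control_after / control_before)   # ~0.38, as reported
    print(round(rate_ratio, 2), round(adjusted, 2))
    ```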

  • David F Keane, Jochen G Raimann, Hanjie Zhang, Joanna Willetts, Stephan Thijssen, Peter Kotanko

    Intradialytic hypotension (IDH) is a common complication of hemodialysis, but there are no data on its time of onset during treatment. Here we describe the incidence of IDH throughout hemodialysis and the associations of the time of hypotension with clinical parameters and survival, analyzing data from 21 dialysis clinics in the United States comprising 785,682 treatments from 4,348 patients. IDH was defined as a systolic blood pressure of 90 mmHg or below, and IDH incidence was calculated in 30-minute intervals throughout the hemodialysis session. Associations of the time of IDH with clinical and treatment parameters were explored using logistic regression, and with survival using Cox regression. Sensitivity analyses considered further IDH definitions. IDH occurred in 12% of sessions, at a median time interval of 120-149 minutes. There was no notable change in IDH incidence across hemodialysis intervals (range: 2.6-3.2 episodes per 100 session-intervals). Relative blood volume and ultrafiltration volume did not notably associate with IDH in the first 90 minutes but did thereafter. Associations between central venous, but not arterial, oxygen saturation and IDH were present throughout hemodialysis. Patients prone to IDH early, as compared to late, in a session had worse survival. Sensitivity analyses suggested that the IDH definition affects the time of onset, but other analyses were comparable. Thus, our study highlights the incidence of IDH during the early part of hemodialysis, which, when compared to later episodes, associates with clinical parameters and mortality.
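
    Computing incidence in 30-minute intervals, as described above, is essentially a binning exercise. A simplified pandas sketch on synthetic onset times follows; it uses all sessions as the denominator in every interval, a simplification of a proper risk-set calculation.

    ```python
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(5)
    n_sessions = 10_000

    # Hypothetical onset times (minutes into the session) for the ~12% of
    # sessions with IDH (SBP <= 90 mmHg); NaN marks sessions without IDH.
    onset = np.where(rng.random(n_sessions) < 0.12,
                     rng.uniform(0, 240, n_sessions), np.nan)

    bins = np.arange(0, 270, 30)  # 30-minute intervals over a 4-hour session
    counts = pd.cut(pd.Series(onset).dropna(), bins=bins,
                    right=False).value_counts(sort=False)
    # Simplification: every session contributes to every interval's denominator.
    print((100 * counts / n_sessions).round(1))
    ```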

  • Rhys D R Evans, Ulla Hemmila, Henry Mzinganjira, Mwayi Mtekateka, Enos Banda, Naomi Sibale, Zuze Kawale, Chimota Phiri, Gavin Dreyer, Viviane Calice-Silva, Jochen G Raimann, Nathan Levin, Roberto Pecoits-Filho, Ravi Mehta, Etienne Macedo

    RESULTSOf 710 patients who presented at increased risk of kidney disease, 655 (92.3%) underwent SUN testing at enrolment and were included (aged 38 (29-52) years; 367 (56%) female; 333 (50.8%) with HIV). Kidney disease was present in 482 (73.6%) patients, and 1,479 SUN measurements were made overall. Estimated glomerular filtration rate (eGFR) correlated with SUN (r = -0.39; p<0.0001). The area under the receiver operating characteristic curve was 0.61 for presenting SUN to detect acute or chronic kidney disease, and 0.87 to detect severe (eGFR <15 mL/min/1.73 m2) kidney disease (p<0.0001; sensitivity 82.3%, specificity 81.8%, test accuracy 81.8%). In-hospital mortality was greater if enrolment SUN was elevated (>test pad #1) compared with patients with non-elevated SUN (p<0.0001; HR 3.3 (95% CI 1.7 to 6.1)).CONCLUSIONSSUN, measured by dipstick, is feasible and may be used to screen for kidney disease in low-resource settings where creatinine tests are unavailable.BACKGROUNDKidney disease is prevalent in low-resource settings worldwide, but tests for its diagnosis are often unavailable. The saliva urea nitrogen (SUN) dipstick is a laboratory- and electricity-independent tool, which may be used for the detection of kidney disease. We investigated the feasibility and performance of its use in diagnosing kidney disease in community settings in Africa.METHODSAdult patients at increased risk of kidney disease presenting to three community health centres, a rural district hospital, and a central hospital in Malawi were recruited between October 2016 and September 2017. Patients underwent concurrent SUN and creatinine testing at enrolment, and at 1 week, 1 month, 3 months, and 6 months thereafter.
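
    Sensitivity, specificity, and test accuracy derive from a 2x2 table of dipstick results against the reference diagnosis. The counts below are illustrative values chosen to land near the reported ~82% figures; the paper's underlying 2x2 table is not shown in the abstract.

    ```python
    def dipstick_performance(tp, fp, fn, tn):
        """Sensitivity, specificity, and overall accuracy from a 2x2 table."""
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        accuracy = (tp + tn) / (tp + fp + fn + tn)
        return sensitivity, specificity, accuracy

    # Illustrative counts only (true positives, false positives, etc.):
    sens, spec, acc = dipstick_performance(tp=42, fp=110, fn=9, tn=495)
    print(f"sensitivity {sens:.1%}, specificity {spec:.1%}, accuracy {acc:.1%}")
    ```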

  • Mathematical biosciences and engineering

    21 Jun 2021 A mathematical model of the four cardinal acid-base disorders

    Alhaji Cherif, Vaibhav Maheshwari, Doris Fuertinger, Gudrun Schappacher-Tilp, Priscila Preciado, David Bushinsky, Stephan Thijssen, Peter Kotanko

    Precise maintenance of acid-base homeostasis is fundamental for optimal functioning of physiological and cellular processes. The presence of an acid-base disturbance can affect clinical outcomes and is usually caused by an underlying disease. It is, therefore, important to assess the acid-base status of patients and the extent to which various therapeutic treatments are effective in controlling these acid-base alterations. In this paper, we develop a dynamic model of the physiological regulation of the HCO3-/CO2 buffering system, an abundant and powerful buffer, using Henderson-Hasselbalch kinetics. We simulate the normal physiological state and the four cardinal acid-base disorders: metabolic acidosis, metabolic alkalosis, respiratory acidosis, and respiratory alkalosis. We show that the model accurately predicts serum pH over a range of clinical conditions. In addition to qualitative validation, we compare the in silico results with clinical data on acid-base homeostasis and alterations, finding clear relationships between primary acid-base disturbances and the secondary adaptive compensatory responses. We also show that the model's predicted compensatory responses closely resemble those observed clinically. Furthermore, via sensitivity analysis, key parameters were identified that could be the most effective in regulating systemic pH in healthy individuals and in those with chronic kidney disease or distal and proximal renal tubular acidosis. The model presented here may provide pathophysiologic insights and can serve as a tool to assess the safety and efficacy of different therapeutic interventions to control or correct acid-base disorders.
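
    The core buffering relationship underlying such a model is the Henderson-Hasselbalch equation for the HCO3-/CO2 pair, pH = pKa + log10([HCO3-]/(0.03 x pCO2)). A direct evaluation in Python with textbook normal values and a metabolic acidosis example; the compensation target follows the usual clinical rule of thumb, not necessarily this model's output.

    ```python
    import math

    def serum_ph(hco3_mmol_l, pco2_mmhg, pka=6.1, co2_solubility=0.03):
        """Henderson-Hasselbalch: pH = pKa + log10([HCO3-] / (0.03 * pCO2))."""
        return pka + math.log10(hco3_mmol_l / (co2_solubility * pco2_mmhg))

    print(round(serum_ph(24, 40), 2))  # normal: ~7.40
    print(round(serum_ph(12, 40), 2))  # uncompensated metabolic acidosis: ~7.10
    print(round(serum_ph(12, 26), 2))  # with respiratory compensation: ~7.29
    ```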

  • Jeroen P Kooman, Peter Stenvinkel, Paul G Shiels, Martin Feelisch, Bernard Canaud, Peter Kotanko

    Patients treated with hemodialysis (HD) repeatedly undergo intradialytic low arterial oxygen saturation and low central venous oxygen saturation, reflecting an imbalance between upper body systemic oxygen supply and demand, which are associated with increased mortality. Abnormalities along the entire oxygen cascade, with impaired diffusive and convective oxygen transport, contribute to the reduced tissue oxygen supply. HD treatment impairs pulmonary gas exchange and reduces ventilatory drive, whereas ultrafiltration can reduce tissue perfusion due to a decline in cardiac output. In addition to these factors, capillary rarefaction and reduced mitochondrial efficacy can further affect the balance between cellular oxygen supply and demand. Whereas it has been convincingly demonstrated that a reduced perfusion of heart and brain during HD contributes to organ damage, the significance of systemic hypoxia remains uncertain, although it may contribute to oxidative stress, systemic inflammation, and accelerated senescence. These abnormalities along the oxygen cascade of patients treated with HD appear to be diametrically opposite to the situation in Tibetan highlanders and Sherpa, whose physiology adapted to the inescapable hypobaric hypoxia of their living environment over many generations. Their adaptation includes pulmonary, vascular, and metabolic alterations with enhanced capillary density, nitric oxide production, and mitochondrial efficacy without oxidative stress. Improving the tissue oxygen supply in patients treated with HD depends primarily on preventing hemodynamic instability by increasing dialysis time/frequency or prescribing cool dialysis. Whether dietary or pharmacological interventions, such as the administration of L-arginine, fermented food, nitrate, nuclear factor erythroid 2-related factor 2 agonists, or prolyl hydroxylase 2 inhibitors, improve clinical outcome in patients treated with HD warrants future research.

  • Bernard Canaud, Xiaoling Ye, Len Usvyat, Jeroen Kooman, Frank van der Sande, Jochen Raimann, Yuedong Wang, Peter Kotanko

    RESULTSWe included 23,495 HD patients; 3,662 were incident. Females and older patients had lower baseline SCI. A higher SCI was associated with a lower risk of mortality [hazard ratio 0.81 (95% confidence interval 0.79-0.82)]. SCI decline accelerated ∼5-7 months before death. Lean tissue index (LTI) estimated from SCI correlated with measured LTI in both sexes (males: R2 = 0.94; females: R2 = 0.92; both P < 0.001). Bland-Altman analysis showed that measured LTI was 4.71 kg/m2 lower than estimated LTI (limits of agreement: -12.54 to 3.12).CONCLUSIONSCI is a simple, easily obtainable, and clinically relevant surrogate marker of MM in HD patients.METHODSWe included all in-centre HD patients from 16 European countries with at least one SCI. The baseline period was defined as 30 days before and after the first multifrequency bioimpedance spectroscopy measurement; the subsequent 7 years constituted the follow-up. SCI was calculated by the Canaud equation. Multivariate Cox proportional hazards models were applied to assess the association of SCI with all-cause mortality. Using backward analysis, we explored trends in SCI before death. Bland-Altman analysis was performed to assess the agreement between estimated and measured MM.BACKGROUNDProtein-energy wasting, muscle mass (MM) loss, and sarcopenia are highly prevalent and associated with poor outcomes in haemodialysis (HD) patients. Monitoring of MM and/or muscle metabolism in HD patients is of paramount importance for the timely detection of muscle loss and to intervene adequately. In this study, we assessed the reliability and reproducibility of a simplified creatinine index (SCI) as a surrogate marker of MM and explored its predictive value on outcome.
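
    Bland-Altman analysis, used above to assess agreement, reports the mean bias between two methods and limits of agreement at roughly ±2 SD of the differences. A self-contained Python sketch on synthetic LTI data calibrated to the reported bias (-4.71 kg/m2) and limits (-12.54 to 3.12); the Canaud equation itself is not reproduced here.

    ```python
    import numpy as np

    def bland_altman(measured, estimated):
        """Mean bias and approximate 95% limits of agreement (bias +/- 2 SD)."""
        diff = np.asarray(measured) - np.asarray(estimated)
        bias, sd = diff.mean(), diff.std(ddof=1)
        return bias, (bias - 2 * sd, bias + 2 * sd)

    # Synthetic LTI pairs (kg/m^2) calibrated to the reported bias and limits.
    rng = np.random.default_rng(4)
    estimated = rng.normal(18, 3, 200)                      # LTI estimated from SCI
    measured = estimated - 4.71 + rng.normal(0, 3.9, 200)   # LTI from bioimpedance
    bias, limits = bland_altman(measured, estimated)
    print(round(bias, 2), tuple(round(x, 2) for x in limits))
    ```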

  • Gabriela Ferreira Dias, Nadja Grobe, Sabrina Rogg, David J Jörg, Roberto Pecoits-Filho, Andréa Novais Moreno-Amaral, Peter Kotanko

    Red blood cells (RBC) are the most abundant cells in the blood. Despite powerful defense systems against chemical and mechanical stressors, their life span is limited to about 120 days in healthy humans and further shortened in patients with kidney failure. Changes in the cell membrane potential and cation permeability trigger a cascade of events that lead to exposure of phosphatidylserine on the outer leaflet of the RBC membrane. The translocation of phosphatidylserine is an important step in a process that eventually results in eryptosis, the programmed death of an RBC. The regulation of eryptosis is complex and involves several cellular pathways, such as the regulation of non-selective cation channels. Increased cytosolic calcium concentration results in scramblase and floppase activation, exposing phosphatidylserine on the cell surface, leading to early clearance of RBCs from the circulation by phagocytic cells. While eryptosis is physiologically meaningful to recycle iron and other RBC constituents in healthy subjects, it is augmented under pathological conditions, such as kidney failure. In chronic kidney disease (CKD) patients, the number of eryptotic RBC is significantly increased, resulting in a shortened RBC life span that further compounds renal anemia. In CKD patients, uremic toxins, oxidative stress, hypoxemia, and inflammation contribute to the increased eryptosis rate. Eryptosis may have an impact on renal anemia, and depending on the degree of shortened RBC life span, the administration of erythropoiesis-stimulating agents is often insufficient to attain desired hemoglobin target levels. The goal of this review is to indicate the importance of eryptosis as a process closely related to life span reduction, aggravating renal anemia.

  • Kidney international reports

    12 Nov 2020 Dialysis-Induced Cardiovascular and Multiorgan Morbidity

    Bernard Canaud, Jeroen P Kooman, Nicholas M Selby, Maarten W Taal, Susan Francis, Andreas Maierhofer, Pascal Kopperschmidt, Allan Collins, Peter Kotanko

    Hemodialysis has saved many lives, albeit with significant residual mortality. Although poor outcomes may reflect advanced age and comorbid conditions, hemodialysis per se may harm patients, contributing to morbidity and perhaps mortality. Systemic circulatory "stress" resulting from the hemodialysis treatment schedule may act as a disease modifier, resulting in multiorgan injury superimposed on preexistent comorbidities. New functional intradialytic imaging (e.g., echocardiography, cardiac magnetic resonance imaging [MRI]) and kinetics of specific cardiac biomarkers (e.g., troponin I) have clearly documented this additional source of end-organ damage. In this context, several factors resulting from patient-hemodialysis interaction and/or patient management have been identified. Intradialytic hypovolemia, hypotensive episodes, hypoxemia, solute and electrolyte fluxes, and cardiac arrhythmias are among the contributors to the systemic circulatory stress induced by hemodialysis. Additionally, these factors contribute to patients' symptom burden, impair cognitive function, and ultimately have a negative impact on patients' perception and quality of life. In this review, we summarize the adverse systemic effects of current intermittent hemodialysis therapy and their pathophysiologic consequences, review the evidence for interventions that are cardioprotective, and explore new approaches that may further reduce the systemic burden of hemodialysis. These include improved biocompatible materials, smart dialysis machines that may automatically control the fluxes of solutes and electrolytes, volume and hemodynamic control, health trackers, and potentially disruptive technologies facilitating a more personalized medicine approach.

  • N Pilia, S Severi, J G Raimann, S Genovesi, O Dössel, P Kotanko, C Corsi, A Loewe

    Diseases caused by alterations of ionic concentrations are frequently observed and play an important role in clinical practice. The clinically established method for diagnosing electrolyte concentration imbalance is the blood test. A rapid and non-invasive point-of-care method is still needed. The electrocardiogram (ECG) could meet this need and become an established diagnostic tool allowing home monitoring of electrolyte concentrations, including by wearable devices. In this review, we present the current state of potassium and calcium concentration monitoring using the ECG and summarize results from previous work. Selected clinical studies are presented, supporting or questioning the use of the ECG for the monitoring of electrolyte concentration imbalances. Differences in the findings from automatic monitoring studies are discussed, and current studies utilizing machine learning are presented, demonstrating the potential of deep learning approaches. Furthermore, we demonstrate the potential of computational modeling approaches to gain insight into the mechanisms of relevant clinical findings and as a tool for obtaining synthetic data for methodical improvements in monitoring approaches.
