Elevated Oxygen Extraction During Heart Transplantation Is Associated With Increased Morbidity and Mortality: Implications for Goal-Directed Perfusion

This article examines whether the oxygen extraction ratio, or O2ER, during cardiopulmonary bypass can better predict poor outcomes after adult heart transplantation than the more traditional perfusion target of indexed oxygen delivery, or DO2i. The question matters because heart transplantation is physiologically complex, especially during reperfusion of the donor heart, and standard flow or oxygen delivery targets may not fully capture whether the tissues are actually receiving enough oxygen to meet metabolic demand. The authors propose that O2ER may be a more useful marker because it reflects the balance between oxygen supply and oxygen consumption rather than supply alone.
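To make the supply-demand idea concrete: O2ER is conventionally approximated from arterial and venous oxygen saturations, ignoring dissolved oxygen. This is the standard physiologic formula, not necessarily the exact console calculation used in the study; the function name and example values are illustrative.

```python
def o2er(sao2: float, svo2: float) -> float:
    """Oxygen extraction ratio from saturations (dissolved O2 ignored).

    Standard approximation: O2ER = (SaO2 - SvO2) / SaO2,
    with saturations expressed as fractions (0-1).
    """
    if not 0 < sao2 <= 1 or not 0 <= svo2 <= sao2:
        raise ValueError("expect 0 <= SvO2 <= SaO2 <= 1")
    return (sao2 - svo2) / sao2

# Illustrative values: SaO2 0.99, SvO2 0.78 gives O2ER ~0.212,
# just above the study's 0.20 threshold.
print(round(o2er(0.99, 0.78), 3))
```

Because both saturations stream continuously from the oximetry circuit on bypass, a ratio like this can be tracked minute by minute, which is what makes O2ER a plausible real-time target.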

The investigators performed a retrospective single-center cohort analysis at Vanderbilt University Medical Center and included 381 adult heart transplant recipients treated between November 2021 and June 2025. They excluded multiorgan transplants, adult congenital heart disease cases, and patients with missing perfusion data. Minute-by-minute cardiopulmonary bypass data were extracted from the institutional perfusion management system, allowing the researchers to analyze oxygen-related variables continuously through the operation. Their primary exposure was time spent with O2ER greater than 0.20, a threshold chosen because it falls within published physiologic ranges and offers a practical real-time target for perfusion teams. The primary outcome was a composite morbidity-mortality endpoint that included severe primary graft dysfunction, prolonged ventilation beyond 72 hours, intensive care unit stay longer than 15 days, need for renal replacement therapy, or death within 90 days. 

The central finding was that higher O2ER burden during bypass was associated with worse outcomes after transplantation. Among the 381 recipients, 40 patients, or 10.5%, experienced the composite morbidity-mortality endpoint. Patients with poor outcomes had higher median O2ER values, larger O2ER area under the curve, and longer time above the 0.20 threshold than those without major events. The paper’s trajectory analysis showed that O2ER differences between the poor-outcome and better-outcome groups became most apparent during the middle portion of bypass, roughly 35 to 100 minutes into the procedure. That time-dependent separation is important because it suggests the risk signal is not static across the operation and may be most clinically relevant during periods of rewarming, flow shifts, and reperfusion stress. The line graph on page 4 and the central message box on page 1 both reinforce this pattern. 

In the weighted regression analysis, every additional 10 minutes with O2ER above 0.20 was associated with a 7% increase in the odds of the composite morbidity-mortality endpoint and a 13% increase in the odds of 90-day mortality. The association with severe primary graft dysfunction alone was not significant in the main continuous exposure model, but patients in the high O2ER burden group still had worse adjusted odds of severe graft dysfunction, severe right ventricular dysfunction, and death at 90 days. The survival analysis also showed worse one-year survival in the high O2ER burden group compared with the low O2ER burden group. These findings suggest that cumulative oxygen supply-demand mismatch during bypass is not just a theoretical marker but a clinically meaningful predictor of outcomes after transplant. 
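A per-10-minute odds ratio compounds multiplicatively with cumulative exposure time. The sketch below shows only that arithmetic, using the reported 1.07 per 10 minutes; it is not a reimplementation of the authors' weighted regression model.

```python
def compounded_or(or_per_10min: float, minutes: float) -> float:
    """Compound a per-10-minute odds ratio over a total exposure time."""
    return or_per_10min ** (minutes / 10)

# 30 minutes above the O2ER threshold at OR 1.07 per 10 minutes:
# 1.07 ** 3, roughly 1.23x the odds of the composite endpoint.
print(round(compounded_or(1.07, 30), 2))
```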

One of the most valuable aspects of this paper is its comparison between O2ER and DO2i. Goal-directed perfusion in cardiac surgery often centers on maintaining DO2i above a fixed threshold, commonly 280 mL/min/m². This study found that adding O2ER burden to a model based only on time below DO2i improved model fit, while adding low DO2i time to an O2ER-only model did not. In other words, O2ER appeared to carry the stronger prognostic signal. The heat map on page 8 visually supports this conclusion, showing that predicted morbidity-mortality risk rises mainly with increasing O2ER burden, while DO2i burden alone contributes less once O2ER is considered. That is a clinically relevant insight because it argues that supply alone is not enough; perfusion teams may need to monitor whether oxygen delivery is adequate for the patient’s actual metabolic demand in real time. 
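For reference, DO2i is computed from cardiac index and arterial oxygen content. The sketch below uses the commonly cited constants (1.34 mL O2 per gram of hemoglobin, 0.003 mL/dL/mmHg for dissolved oxygen); institutions vary slightly in these values, and the example inputs are illustrative rather than taken from the study.

```python
def do2i(cardiac_index: float, hb_g_dl: float, sao2: float,
         pao2_mmhg: float = 100.0) -> float:
    """Indexed oxygen delivery in mL O2/min/m^2.

    DO2i = CI (L/min/m^2) * CaO2 (mL O2/dL) * 10, where
    CaO2 = 1.34 * Hb * SaO2 + 0.003 * PaO2 (standard formula).
    """
    cao2 = 1.34 * hb_g_dl * sao2 + 0.003 * pao2_mmhg
    return cardiac_index * cao2 * 10

# Illustrative: CI 2.4, Hb 9 g/dL, SaO2 0.99 gives DO2i ~293.7,
# just above the common 280 mL/min/m^2 goal-directed perfusion target.
value = do2i(2.4, 9.0, 0.99)
print(round(value, 1), value >= 280)
```

Note that DO2i contains no consumption term, which is exactly the gap the paper argues O2ER fills: a patient can sit above 280 mL/min/m² yet still extract more than 20% of delivered oxygen when metabolic demand is high.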

The post hoc phase-specific analysis adds another layer of insight. When the authors separated bypass into pre-reperfusion and post-reperfusion phases, they found that post-reperfusion O2ER burden was independently associated with the composite morbidity-mortality outcome and severe primary graft dysfunction. Pre-reperfusion O2ER burden, meanwhile, was more closely linked to 90-day mortality. This distinction is biologically plausible because reperfusion of the donor heart is a period of abrupt metabolic stress, altered microcirculation, and ischemia-reperfusion injury. The paper argues that heightened vigilance during reperfusion may therefore be especially important if O2ER is to be used as a real-time target. 

The study also explored factors associated with prolonged high O2ER burden. Diabetes, prior sternotomy, pretransplant left ventricular assist device support, and extracorporeal membrane oxygenation were associated with longer time above the O2ER threshold, suggesting that sicker or more surgically complex patients are especially vulnerable to intraoperative oxygen imbalance. Some graft storage strategies, including 10°C static cold storage and hypothermic oxygenated perfusion, appeared protective. These exploratory findings may help identify which patients are most likely to benefit from tighter perfusion management. 

This was a well-executed observational study with several strengths. It used continuous minute-level perfusion data, prespecified covariates, generalized propensity score weighting, multiple complementary statistical models, and clinically meaningful outcomes. Still, the authors appropriately note major limitations. It was retrospective, single-center, and observational, so causation cannot be proven. The event count was modest, residual confounding remains possible, and console-derived oxygen variables may contain measurement error, especially during weaning from bypass. Even so, the consistency of the findings across the dose-response, grouped, survival, and phase-specific analyses makes the signal persuasive enough to justify prospective validation. Overall, this article supports a shift toward O2ER-guided goal-directed perfusion in adult heart transplantation and suggests that minimizing time above O2ER thresholds may improve early transplant outcomes.

3
This is a meaningful and methodologically thoughtful observational cohort study with a solid sample size for a heart transplant population, continuous intraoperative data capture, and appropriate multivariable and weighted analyses. It does not reach the highest levels of evidence because it is retrospective, single-center, nonrandomized, and vulnerable to residual confounding, so it supports association rather than causation.