Objectives: The aim of this study was to estimate thresholds for production volume, durability, and cost of care for the cost-effective adoption of liver organ replacement technologies (ORTs).
Methods: We constructed a discrete-event simulation model of the liver allocation system in the United States. The model was calibrated against UNOS data (1994–2000). Into this model, we introduced ORTs with varying durability (time to failure), cost of care, and production volume. Primary outputs of interest were time to 5 percent reduction in the waiting list and time to 5 percent increase in expected transplant volume.
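The core of such a discrete-event simulation can be illustrated with a toy sketch. All rates, the arbitrary-patient allocation rule, and the relisting-on-device-failure behavior below are illustrative assumptions for exposition only, not the calibrated UNOS parameters or allocation policy used in the study:

```python
import heapq
import itertools
import random

def simulate_waitlist(years=5.0, arrivals_per_year=9000, organs_per_year=4500,
                      death_rate=0.10, ort_per_year=0, ort_durability_years=2.0,
                      seed=0):
    """Toy discrete-event simulation of a transplant waiting list.

    Events (patient arrival, organ offer, ORT implant, relisting after
    device failure, death while waiting) have exponential inter-event
    times and are processed in time order from a priority queue.
    Returns (waiting-list length at end, transplants performed).
    """
    rng = random.Random(seed)
    seq = itertools.count()      # tie-breaker so the heap never compares payloads
    events = []

    def schedule(t, kind, pid=None):
        heapq.heappush(events, (t, next(seq), kind, pid))

    waiting = set()              # ids of patients currently on the list
    new_pid = itertools.count()
    transplants = 0

    # Seed the recurring event streams.
    schedule(rng.expovariate(arrivals_per_year), "arrival")
    schedule(rng.expovariate(organs_per_year), "organ")
    if ort_per_year > 0:
        schedule(rng.expovariate(ort_per_year), "ort")

    while events:
        t, _, kind, pid = heapq.heappop(events)
        if t > years:
            break
        if kind == "arrival":
            pid = next(new_pid)
            waiting.add(pid)
            # Each listed patient faces a competing risk of death while waiting.
            schedule(t + rng.expovariate(death_rate), "death", pid)
            schedule(t + rng.expovariate(arrivals_per_year), "arrival")
        elif kind == "organ":
            if waiting:
                waiting.pop()    # arbitrary patient: stand-in for allocation rules
                transplants += 1
            schedule(t + rng.expovariate(organs_per_year), "organ")
        elif kind == "ort":
            if waiting:
                p = waiting.pop()
                # When the device fails, the patient rejoins the list.
                schedule(t + ort_durability_years, "relist", p)
            schedule(t + rng.expovariate(ort_per_year), "ort")
        elif kind == "relist":
            waiting.add(pid)
            schedule(t + rng.expovariate(death_rate), "death", pid)
        elif kind == "death":
            waiting.discard(pid)  # no effect if already transplanted or bridged

    return len(waiting), transplants
```

Running the sketch with and without an ORT event stream shows the qualitative effect the study quantifies: adding device implants drains the waiting list, with device failures feeding patients back onto it.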
Results: Model output in both the calibration and validation phases closely matched published data: waiting-list length (±2 percent), number of transplants (±2 percent), deaths while waiting (±5 percent), and time to transplant (±11 percent). Reducing the waiting list depended on both ORT durability and production volume: the longer the durability, the lower the production volume needed to reduce the waiting list, and vice versa. Below 250 ORT/year, however, durability had to exceed 2 years for any significant change in the waiting list. At base-case costs, every ORT production-volume and durability scenario resulted in more transplants per year at a lower total cost of care per patient than the current system. ORTs remained cost saving unless manufacturing costs exceeded 5 times base-case costs, production fell below 500 ORT/year, and durability was under 6 months.
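Reading the break-even result as requiring all three adverse conditions simultaneously (the natural reading of the sentence above), the cost-saving threshold can be written as a one-line screen. The function name and the joint-condition interpretation are assumptions for illustration:

```python
def ort_remains_cost_saving(cost_multiple, ort_per_year, durability_years):
    """Encode the reported break-even conditions: ORTs stop being cost
    saving only when manufacturing cost exceeds 5x the base case AND
    production falls below 500 ORT/year AND durability is under 6 months.
    """
    loses_savings = (cost_multiple > 5
                     and ort_per_year < 500
                     and durability_years < 0.5)
    return not loses_savings
```

Under this reading, a scenario that violates only one or two of the thresholds (say, expensive devices produced at high volume) would still be cost saving relative to the current system.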
Conclusions: Although many technical challenges remain to be overcome, ORTs that meet these threshold criteria have the potential to transform the treatment of end-stage liver disease.