Aims
Implant waste during total hip arthroplasty (THA) represents a significant cost to the US healthcare system. While studies have explored methods to improve THA cost-effectiveness, the literature comparing the proportion of implant waste by the intraoperative technology used during THA is limited. The aims of this study were to: 1) examine whether the use of enabling technologies during THA results in a smaller proportion of wasted implants compared to navigation-guided and conventional manual THA; 2) determine the proportion of wasted implants by implant type; and 3) examine the effect of surgeon experience on rates of implant waste by technology used.
Methods
We identified 104,420 implants either implanted or wasted during 18,329 primary THAs performed on 16,724 patients between January 2018 and June 2022 at our institution. THAs were separated by the technology used: robotic-assisted (n = 4,171), imageless navigation (n = 6,887), and manual (n = 7,721). The primary outcome of interest was the rate of implant waste during primary THA.
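As a rough illustration of the primary comparison, waste proportions across the three technology groups could be compared with a chi-squared test of independence; the sketch below uses placeholder counts, not the study data.

```python
# Sketch: compare implant waste proportions across three THA technology groups
# using a chi-squared test of independence. Counts are placeholders.
from scipy.stats import chi2_contingency

# rows: robotic-assisted, imageless navigation, manual
# columns: wasted, not wasted (placeholder values, not study data)
counts = [
    [120, 23000],
    [210, 38000],
    [260, 42000],
]

chi2, p, dof, expected = chi2_contingency(counts)
for name, (wasted, used) in zip(["robotic", "navigation", "manual"], counts):
    print(f"{name}: waste rate = {wasted / (wasted + used):.2%}")
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```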
Aim
Prosthetic joint infection (PJI) is a devastating and costly complication of total joint arthroplasty (TJA). Use of extended oral antibiotic prophylaxis (EOAP) has become increasingly popular in the United States following a highly publicized study from a single center (Inabathula et al) demonstrating a significant protective effect (81% reduction) against PJI in 'high-risk' patients. However, these results have not been reproduced elsewhere, and EOAP use directly conflicts with current antibiotic stewardship efforts. To study the role of EOAP in PJI prevention, consensus is needed on what defines 'high-risk' patients. The revision TJA (rTJA) population is an appropriate group to study because of its higher incidence of PJI. The purpose of the current study was to rigorously determine which of the preoperative conditions described by Inabathula et al (referred to as the Inabathula criteria (IBC)) confer a higher rate of PJI in patients undergoing aseptic rTJA.
Method
A total of 2,256 patients who underwent aseptic rTJA at a single high-volume institution between 2016 and 2022 were retrospectively reviewed. Patient demographics and comorbidities were recorded to determine whether they had one or more of the IBC, a long list of preoperative conditions including autoimmune disease, active smoking, body mass index (BMI) > 35, diabetes mellitus, and chronic kidney disease (CKD). Reoperation for PJI at 90 days and one year was recorded. Chi-squared or Fisher's exact tests were used to determine the association between the preoperative presence or absence of IBC and PJI. Multivariable logistic regressions were conducted to determine whether specific comorbidities within the IBC individually conferred an increased PJI risk.
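A minimal sketch of the unadjusted association test described above, assuming a 2×2 table of IBC status against 90-day reoperation for PJI; the cell counts are placeholders, not the study data.

```python
# Sketch: association between presence of >=1 Inabathula criterion (IBC) and
# 90-day reoperation for PJI, using Fisher's exact test on a 2x2 table.
# Cell counts are placeholders.
from scipy.stats import fisher_exact

#            PJI    no PJI
table = [
    [18, 1200],   # >=1 IBC present
    [ 6, 1032],   # no IBC
]

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```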
Aims
The purpose of this study was to assess mid-term survivorship following primary total knee arthroplasty (TKA) with Optetrak Logic components and to identify the most common indications for revision at a single institution.
Methods
We identified a retrospective cohort of 7,941 Optetrak primary TKAs performed from January 2010 to December 2018. We reviewed the intraoperative findings of the 369 TKAs that required revision TKA from January 2010 to December 2021 and the details of the revision implants used. Kaplan-Meier analysis was used to determine survivorship. Cox regression analysis was used to examine the impact of patient variables and year of implantation on survival time.
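A brief sketch of the survivorship analysis, assuming a registry extract with hypothetical columns for follow-up time and revision status (lifelines is used here as one common implementation of Kaplan-Meier estimation).

```python
# Sketch: Kaplan-Meier survivorship with revision as the endpoint, assuming a
# DataFrame with hypothetical columns 'years_followed' and 'revised' (0/1).
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("optetrak_tka.csv")  # hypothetical extract of the registry

kmf = KaplanMeierFitter()
kmf.fit(durations=df["years_followed"], event_observed=df["revised"])

# Estimated survival (freedom from revision) at 5 and 10 years
print(kmf.survival_function_at_times([5, 10]))
```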
Known risk factors for early periprosthetic femur fracture (PFF) following total hip arthroplasty (THA) include poor bone quality, surgical approach, and cementless implants. The association between femoral component size and alignment and the risk of early PFF is not well described. We evaluated radiographic parameters of femoral component sizing and alignment as risk factors for early PFF. From 16,065 primary cementless THAs, we identified 66 cases (0.41%) of early PFF (<90 days from index THA) at a single institution between 2016 and 2020. The stem was unstable and revised in all cases. We matched 60 cases of early PFF (2:1) to 120 controls based on femoral component model, offset, surgical approach, age, BMI, and sex. Mean age was 67 years; 60% of patients were female. Radiographic assessment included preoperative bone morphology and postoperative femoral component parameters, including stem alignment, metaphyseal fill, and medial congruence with the calcar. A multivariable logistic regression model was built to identify radiographic risk factors associated with early PFF. Markers of poor preoperative bone quality, including canal calcar ratio (p=0.003), canal flare index (p<0.001), anteroposterior canal bone ratio (CBR) (p<0.001), and lateral CBR (p<0.001), were statistically associated with PFF. Valgus alignment of the implant was more prevalent in the PFF group than in controls (23% versus 12%; p<0.001), as was varus alignment (57% versus 43%). The distance between the medial aspect of the implant and the calcar was greater in cases of PFF (2.5 mm versus 1.4 mm; p<0.001). Multivariable analysis demonstrated that valgus implant alignment (odds ratio (OR) 5) and medial implant-calcar incongruity (OR 2) increased the risk of early PFF. Medial implant-calcar incongruity and valgus alignment of the femoral component were risk factors for early PFF following cementless THA after controlling for age, sex, BMI, approach, proximal femoral morphology, and implant design.
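The multivariable model described above could be fitted along the following lines; the column names are hypothetical stand-ins for the radiographic parameters, and the sketch simply shows how odds ratios and 95% confidence intervals are derived from a logistic fit.

```python
# Sketch: multivariable logistic regression for early PFF, reporting odds
# ratios with 95% CIs. Column names are hypothetical stand-ins for the
# radiographic parameters described above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pff_case_control.csv")  # hypothetical matched dataset

model = smf.logit(
    "early_pff ~ valgus_alignment + varus_alignment + calcar_gap_mm + canal_flare_index",
    data=df,
).fit()

odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_lower": np.exp(model.conf_int()[0]),
    "CI_upper": np.exp(model.conf_int()[1]),
})
print(odds_ratios)
```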
Aims
Due to the opioid epidemic in the USA, our service progressively decreased the number of opioid tablets prescribed at discharge after primary hip (THA) and knee (TKA) arthroplasty. The goal of this study was to analyze the effect on total morphine milligram equivalents (MMEs) prescribed and post-discharge opioid repeat prescriptions.
Methods
We retrospectively reviewed 19,428 patients undergoing a primary THA or TKA between 1 February 2016 and 31 December 2019. Two reductions in the number of opioid tablets prescribed at discharge were implemented over this time; as such, we analyzed three periods (P1, P2, and P3) with different routine discharge MMEs (750, 520, and 320 MMEs, respectively). We investigated 90-day refill rates, refill MMEs, and whether discharge MMEs were associated with represcribing in a multivariate model.
Introduction
Due to the opioid epidemic, our service implemented a cultural change highlighted by decreasing discharge opioids after lower extremity arthroplasty. However, there is concern that this may increase requests for refills. As such, the goal of this study was to analyze whether decreased discharge opioids led to increased postoperative opioid refills.
Methods
We retrospectively reviewed 19,428 patients undergoing a primary hip or knee arthroplasty at a single institution from 2016 to 2019. Patients who underwent secondary procedures within that timeframe were excluded. A total of 2,241 patients (12%) were on narcotics preoperatively or had chronic pain syndrome. Two reductions in routine discharge narcotics were made over this timeframe. Initially, 8,898 patients routinely received 750 morphine milligram equivalents (MMEs). After the first reduction, 4,842 patients routinely received 520 MMEs. After the second reduction, 5,688 patients routinely received 320 MMEs. We analyzed refill rates, refill MMEs, and whether discharge MMEs were associated with refill MMEs in a multivariate model.
Introduction
Instability following total knee arthroplasty is a leading cause of failure and is often treated with component revision. The goal of this study was to determine whether isolated tibial polyethylene insert exchange (ITPIE) to a higher level of constraint affords outcomes similar to component revision in properly selected patients.
Methods
We retrospectively evaluated 176 consecutive patients who underwent revision for symptomatic instability at a single institution between 2016 and 2017. Demographic information and the level of constraint preoperatively and postoperatively were documented. Radiographic parameters were also recorded for patients undergoing ITPIE. Outcome measures included all-cause re-revision rates as well as patient-reported outcome measures (PROMs) obtained preoperatively and at a minimum of one-year follow-up. Descriptive analyses, including t-tests and chi-squared tests, were performed with statistical significance set at p < 0.05.
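A minimal sketch of the between-group comparisons described above, using placeholder values rather than the study data:

```python
# Sketch: comparing one-year PROMs between the ITPIE and component-revision
# groups with an independent-samples t-test, and re-revision rates with a
# chi-squared test. Values are placeholders, not the study data.
from scipy.stats import ttest_ind, chi2_contingency

itpie_proms = [68, 72, 59, 80, 66, 71]        # placeholder one-year PROM scores
component_proms = [70, 65, 74, 62, 69, 77]    # placeholder one-year PROM scores
t_stat, p_prom = ttest_ind(itpie_proms, component_proms)

#                   re-revised, not re-revised (placeholder counts)
rerevision_table = [[5, 45],     # ITPIE
                    [12, 114]]   # component revision
chi2, p_rerev, _, _ = chi2_contingency(rerevision_table)

print(f"PROM comparison: p = {p_prom:.3f}; re-revision comparison: p = {p_rerev:.3f}")
```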
Aims
This combined clinical and in vitro study aimed to determine the incidence of liner malseating in modular dual mobility (MDM) constructs in primary total hip arthroplasties (THAs) at a large-volume arthroplasty centre, and to determine whether malseating increases the potential for fretting and corrosion at the modular metal interface of malseated MDM constructs using a simulated corrosion chamber.
Methods
For the clinical arm of the study, observers independently reviewed postoperative radiographs of 551 primary THAs using MDM constructs from a single manufacturer over a three-year period to identify the incidence of MDM liner-shell malseating. Multivariable logistic regression analysis was performed to identify risk factors including age, sex, body mass index (BMI), cup design, cup size, and the MDM case volume of the surgeon. For the in vitro arm, six pristine MDM implants with cobalt-chrome liners were tested in a simulated corrosion chamber: three were well seated and three were malseated with 6° of canting. The liner-shell couples underwent cyclic loading of increasing magnitude. Fretting current was measured throughout testing, and the onset-of-fretting load was determined by analyzing the increase in average current.
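One plausible way to flag an onset-of-fretting load from stepwise loading data is sketched below; the threshold rule and the numbers are illustrative assumptions, not the study's definition.

```python
# Sketch: one plausible way to flag the onset-of-fretting load from stepwise
# cyclic loading, assuming an average fretting current measured at each load
# step. The threshold rule here is illustrative, not the study's definition.
import numpy as np

loads_n = np.array([100, 300, 600, 1200, 2400, 3600])            # placeholder loads (N)
avg_current_ua = np.array([0.02, 0.03, 0.03, 0.09, 0.31, 0.80])  # placeholder currents (uA)

baseline = avg_current_ua[:2]                  # treat low-load steps as baseline
threshold = baseline.mean() + 3 * baseline.std()
onset_idx = np.argmax(avg_current_ua > threshold)  # first step exceeding threshold

print(f"estimated onset of fretting load: {loads_n[onset_idx]} N")
```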
Aims
The purpose of this investigation was to determine the relationship between height, weight, and sex and implant size in total knee arthroplasty (TKA) using a multivariate linear regression model and a Bayesian model.
Methods
A retrospective review of an institutional registry was performed for primary TKAs performed between January 2005 and December 2016. Patient demographics, including age, sex, height, weight, and body mass index (BMI), were obtained from the registry and medical record review. In total, 8,100 primary TKAs were included. The mean age was 67.3 years (SD 9.5) and the mean BMI was 30.4 kg/m2 (SD 6.3). The TKAs were randomly split into a training cohort (n = 4,022) and a testing cohort (n = 4,078). A multivariate linear regression model was created on the training cohort and then applied to the testing cohort. A Bayesian model was created based on the frequencies of implant sizes in the training cohort. The model was then applied to the testing cohort to determine its accuracy at 1%, 5%, and 10% tolerance of inaccuracy.
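A simplified sketch of a frequency-based ("Bayesian") size prediction with a train/test split, assuming hypothetical registry columns; the tolerance scoring below is illustrative rather than the study's exact 1%/5%/10% scheme.

```python
# Sketch: frequency-based prediction of TKA implant size from sex and binned
# height (a simple stand-in for the Bayesian model described above), with
# accuracy scored within a size tolerance. Column names are hypothetical.
import pandas as pd

df = pd.read_csv("tka_registry.csv")  # hypothetical registry extract
train = df.sample(frac=0.5, random_state=0)
test = df.drop(train.index)

for frame in (train, test):
    frame["height_bin"] = (frame["height_cm"] // 5) * 5  # 5 cm height bins

# Most frequent femoral size observed for each (sex, height bin) in training data
lookup = (
    train.groupby(["sex", "height_bin"])["femoral_size"]
    .agg(lambda s: s.mode().iloc[0])
    .rename("pred_size")
    .reset_index()
)

test = test.merge(lookup, on=["sex", "height_bin"], how="left").dropna(subset=["pred_size"])

tolerance = 1  # accept predictions within one implant size
accuracy = (abs(test["pred_size"] - test["femoral_size"]) <= tolerance).mean()
print(f"accuracy within ±{tolerance} size: {accuracy:.1%}")
```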
Background
Obesity has been shown to be an independent risk factor for aseptic loosening of the tibial component, and smaller implant size has been correlated with an increased risk of tibial component failure in obese patients [1,2]. Many surgeons have noted that obese patients, especially women, not uncommonly have small implants. As such, we hypothesized that obesity is not directly correlated with total knee arthroplasty (TKA) implant size. The purpose of this study was to determine whether increasing body mass index (BMI), height, and/or weight is associated with implant size in primary TKA.
Methods
The institutional registry of a single academic center was reviewed to identify all primary TKAs performed between 2005 and 2016. Those without a minimum of two-year follow-up or with incomplete implant data were excluded. The different manufacturers' implant designs were categorized based on the anteroposterior and mediolateral dimensions of the femoral and tibial components, and cross-sectional area was determined. BMI was categorized by the World Health Organization (WHO) obesity scale (Class I: BMI 30 to <35; Class II: BMI 35 to <40; Class III: BMI 40 kg/m2 or greater). Patient demographics including sex, height, weight, and BMI were analyzed to evaluate correlations with implant size using Pearson correlation coefficients.
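A short sketch of the correlation analysis, assuming hypothetical column names for the patient measures and implant cross-sectional area:

```python
# Sketch: Pearson correlations between patient measures and tibial component
# cross-sectional area, assuming hypothetical column names in a registry extract.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("tka_implant_sizes.csv")  # hypothetical dataset

for var in ["bmi", "height_cm", "weight_kg"]:
    r, p = pearsonr(df[var], df["tibial_cross_sectional_area"])
    print(f"{var}: r = {r:.2f}, p = {p:.4f}")
```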
Introduction
Primary total knee arthroplasties (TKAs) performed in younger patients raise concerns regarding the potential for accelerated polyethylene wear, aseptic loosening, and thus revision TKA at a younger age. The purpose of this study was to determine the long-term implant survivorship, functional outcomes, and pain relief of primary TKA performed in patients under 35 years of age.
Methods
A retrospective review of our institutional registry identified 185 TKAs performed in 119 patients under the age of 35 between 1985 and 2010. Medical records and radiographs were reviewed. Patients were contacted for two serial questionnaires in 2011–2012 and again in 2018. Implant survivorship was calculated using Kaplan-Meier survivorship curves and a Cox proportional hazards model. The median age was 26.1 years (21.5–30.1), with a BMI of 23.5 kg/m2 (20.4–26.6). Median follow-up was 13.9 years (8.5–19.8).
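A brief sketch of a Cox proportional hazards fit for time to revision, assuming hypothetical column names; the covariates shown are illustrative.

```python
# Sketch: Cox proportional hazards model for time to revision in the under-35
# cohort, assuming hypothetical columns for follow-up, revision status, and
# baseline covariates.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("young_tka_cohort.csv")  # hypothetical registry extract

cph = CoxPHFitter()
cph.fit(
    df[["followup_years", "revised", "age_at_surgery", "bmi", "inflammatory_arthritis"]],
    duration_col="followup_years",
    event_col="revised",
)
cph.print_summary()  # hazard ratios with 95% CIs
```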
Aims
The outcomes of total knee arthroplasty (TKA) depend on many factors, and the impact of implant design on patient-reported outcomes is unknown. Our goal was to evaluate patient-reported outcomes and satisfaction after primary TKA in patients with osteoarthritis using five different brands of posterior-stabilized implant.
Patients and Methods
Using our institutional registry, we identified 4,135 patients who underwent TKA using one of the five most common brands of implant: Biomet Vanguard (Zimmer Biomet, Warsaw, Indiana) in 211 patients, DePuy/Johnson & Johnson Sigma (DePuy Synthes, Raynham, Massachusetts) in 222, Exactech Optetrak Logic (Exactech, Gainesville, Florida) in 1,508, Smith & Nephew Genesis II (Smith & Nephew, London, United Kingdom) in 1,415, and Zimmer NexGen (Zimmer Biomet) in 779. Patients were evaluated preoperatively using the Knee Injury and Osteoarthritis Outcome Score (KOOS), Lower Extremity Activity Scale (LEAS), and 12-Item Short-Form Health Survey (SF-12). Demographics including age, body mass index, Charlson Comorbidity Index, American Society of Anesthesiologists status, sex, and smoking status were collected. Postoperatively, two-year KOOS, LEAS, SF-12, and satisfaction scores were compared between groups.
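As one simple way to compare two-year scores across the five brands, an unadjusted one-way ANOVA could be run along these lines, assuming a hypothetical DataFrame with brand and two-year KOOS columns (the study's actual adjustment strategy may differ).

```python
# Sketch: one-way comparison of two-year KOOS across the five implant brands,
# assuming a hypothetical DataFrame with 'brand' and 'koos_2yr' columns.
import pandas as pd
from scipy.stats import f_oneway

df = pd.read_csv("tka_proms_by_brand.csv")  # hypothetical registry extract

groups = [g["koos_2yr"].dropna() for _, g in df.groupby("brand")]
f_stat, p_value = f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```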
Aims
Custom flange acetabular components (CFACs) are a patient-specific option for addressing large acetabular defects at revision total hip arthroplasty (THA), but the patient and implant characteristics that affect survivorship remain unknown. This study aimed to identify patient and design factors related to survivorship.
Patients and Methods
A retrospective review of 91 patients who underwent revision THA using 96 CFACs was undertaken, comparing features between radiologically failed and successful cases. Patient characteristics (demographic, clinical, and radiological) and implant features (design characteristics and intraoperative features) were collected. There were 74 women and 22 men; their mean age was 62 years (31 to 85). The mean follow-up was 24.9 months.
Introduction
Custom flanged acetabular components (CFACs) have been shown to be effective in treating complex acetabular reconstructions in revision total hip arthroplasty (THA). However, the specific patient factors and CFAC design characteristics that affect overall survivorship remain unclear. Once the surgeon opts to follow this treatment pathway, numerous decisions must be made during the pre-operative design phase and during implantation, each of which may influence the ultimate success of the CFAC. The goal of this study was to retrospectively review the entire cohort of CFAC cases performed at a high-volume institution and to identify any patient, surgeon, or design factors that may be related to the long-term survival of these prostheses.
Methods
We reviewed 96 CFAC cases performed in 91 patients between 2004 and 2017, from which 36 variables were collected spanning patient demographics, pre-operative clinical and radiographic features, intraoperative information, and implant design characteristics. Patient demographics and relevant clinical features were collected from individual medical records. Radiographic review included analysis of pre-operative radiographs, computed tomography (CT) scans, and serial post-operative radiographs. Radiographic failure was defined as loosening or gross migration as determined by a board-certified orthopedic surgeon. CFAC design characteristics and intra-operative features were collected from the design record, surgical record, and post-operative radiographs for each case, respectively. Two sets of statistical analyses were performed with this dataset. First, univariate analyses were performed for each variable, comprising a Pearson chi-squared test for categorical variables and an independent t-test for continuous variables. Second, a random forest supervised machine-learning method was applied to identify the most influential variables within the dataset, which were then used in a bivariable logistic regression to generate odds ratios. Statistical significance for this study was set at p < 0.05.
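A compact sketch of the two-step analysis described above (random forest feature ranking followed by a bivariable logistic regression), with hypothetical column names and assuming the variables are numerically coded:

```python
# Sketch: random forest feature ranking followed by logistic regression on the
# top-ranked variables, mirroring the two-step analysis described above.
# Column names are hypothetical; predictors are assumed to be numerically coded.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
import statsmodels.api as sm

df = pd.read_csv("cfac_cases.csv")           # hypothetical dataset
X = df.drop(columns=["radiographic_failure"])
y = df["radiographic_failure"]

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
ranking = pd.Series(rf.feature_importances_, index=X.columns).sort_values(ascending=False)
top_vars = ranking.head(2).index.tolist()     # two most influential variables

logit = sm.Logit(y, sm.add_constant(X[top_vars])).fit()
print(np.exp(logit.params))                   # odds ratios for the top variables
```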
Aims
This study reports the outcomes of a technique of soft-tissue coverage and Chopart amputation for severe crush injuries of the forefoot.
Patients and Methods
Between January 2012 and December 2016, 12 patients (nine male, three female; mean age 38.58 years (26 to 55)) with severe foot crush injury underwent treatment in our institute. All patients were followed up for at least one year. Their medical records, imaging, visual analogue scale scores, walking ability, complications, and functional outcomes at one year postoperatively, based on the American Orthopedic Foot and Ankle Society (AOFAS) and 36-Item Short-Form Health Survey (SF-36) scores, were reviewed.
Despite the increasing prevalence of sleep apnoea, little information is available regarding its impact on the peri-operative outcome of patients undergoing posterior lumbar fusion. Using a national database, patients who underwent lumbar fusion between 2006 and 2010 were identified, sub-grouped by diagnosis of sleep apnoea, and compared. The impact of sleep apnoea on various outcome measures was assessed by regression analysis. The records of 84,655 patients undergoing posterior lumbar fusion were identified, and 7.28% (n = 6,163) also had a diagnostic code for sleep apnoea. Compared with patients without sleep apnoea, these patients were older, more frequently female, had a higher comorbidity burden, and had higher rates of peri-operative complications, post-operative mechanical ventilation, blood product transfusion, and intensive care. Patients with sleep apnoea also had longer and more costly periods of hospitalisation. In the regression analysis, sleep apnoea emerged as an independent risk factor for the development of peri-operative complications (odds ratio (OR) 1.50, confidence interval (CI) 1.38 to 1.62), blood product transfusion (OR 1.12, CI 1.03 to 1.23), mechanical ventilation (OR 6.97, CI 5.90 to 8.23), critical care services (OR 1.86, CI 1.71 to 2.03), and prolonged hospitalisation and increased cost (OR 1.28, CI 1.19 to 1.37 and OR 1.10, CI 1.03 to 1.18, respectively). Patients with sleep apnoea who undergo posterior lumbar fusion pose significant challenges to clinicians.
Increasing numbers of posterior lumbar fusions are being performed. The purpose of this study was to identify trends in demographics, mortality, and major complications in patients undergoing primary posterior lumbar fusion. We accessed data collected for the Nationwide Inpatient Sample for each year between 1998 and 2008 and analysed trends in the number of lumbar fusions, mean patient age, comorbidity burden, length of hospital stay, discharge status, major peri-operative complications, and mortality. An estimated 1,288,496 primary posterior lumbar fusion operations were performed between 1998 and 2008 in the United States. The total number of procedures, mean patient age, and comorbidity burden increased over time. Hospital length of stay decreased, although in-hospital mortality (adjusted and unadjusted for changes in length of hospital stay) remained stable. However, a significant increase was observed in peri-operative septic, pulmonary, and cardiac complications. Although in-hospital mortality rates did not change over time in the setting of increases in mean patient age and comorbidity burden, some major peri-operative complications increased. These trends highlight the need for appropriate peri-operative services to optimise outcomes in an increasingly morbid and older population of patients undergoing lumbar fusion.
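A minimal sketch of how an annual complication rate could be tested for a linear trend over the study period, using placeholder year-level aggregates rather than the study data:

```python
# Sketch: testing for a linear trend in an annual complication rate over the
# study period, assuming hypothetical year-level aggregates.
import pandas as pd
import statsmodels.formula.api as smf

annual = pd.DataFrame({
    "year": range(1998, 2009),
    "complication_rate": [0.021, 0.022, 0.023, 0.025, 0.026, 0.027,
                          0.029, 0.030, 0.032, 0.033, 0.035],  # placeholder values
})

trend = smf.ols("complication_rate ~ year", data=annual).fit()
print(f"slope per year = {trend.params['year']:.4f}, p = {trend.pvalues['year']:.4f}")
```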