Recent registry data from around the world strongly suggest that cemented hip hemiarthroplasty has lower revision rates than cementless hip hemiarthroplasty for acute femoral neck fractures. Adoption of cemented hemiarthroplasty for hip fracture has nevertheless been slow, as many surgeons continue to use uncemented stems, in part because they feel more comfortable with the uncemented technique they use routinely. The purpose of this study was to compare the revision rates of cemented and cementless hemiarthroplasty and to stratify the risk by surgeon experience, using a surgeon's annual volume of total hip replacements as an indicator of experience. The Canadian Joint Replacement Registry database was used to collect and compare outcomes and to report revision rates by surgeon volume.
This is a large retrospective cohort study of the Canadian Joint Replacement Registry, in which 68,447 patients who underwent hip hemiarthroplasty from 2012-2020 were identified and classified as having received cemented or cementless fixation. The surgeons who performed the procedures were linked to their total hip replacement volume and categorized as high volume (>50 hip replacements/year) or low volume (<50 cases/year). Hazard ratios for risk of revision over this 8-year span were adjusted for age and sex. A p-value <0.05 was deemed significant.
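The registry comparison rests on hazard ratios. As a simplified illustration only (not the registry's age- and sex-adjusted Cox model), a crude hazard ratio with a Wald confidence interval can be computed from event counts and person-time; all numbers below are hypothetical.

```python
import math

def crude_hazard_ratio(events_a, persontime_a, events_b, persontime_b, z=1.96):
    """Crude (unadjusted) hazard ratio of group A vs. group B with a Wald CI.

    A simplified stand-in for the adjusted Cox model used in the study;
    assumes a constant hazard within each group.
    """
    hr = (events_a / persontime_a) / (events_b / persontime_b)
    se_log = math.sqrt(1.0 / events_a + 1.0 / events_b)  # SE of log(HR)
    lo = math.exp(math.log(hr) - z * se_log)
    hi = math.exp(math.log(hr) + z * se_log)
    return hr, lo, hi

# Hypothetical example: 30 revisions over 1000 stem-years (cementless)
# vs. 20 revisions over 1000 stem-years (cemented).
hr, lo, hi = crude_hazard_ratio(30, 1000, 20, 1000)
```

A confidence interval that excludes 1.0 would correspond to a significant difference in revision risk, as reported for the cemented versus cementless comparison.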
For high volume surgeons, cementless fixation had a higher revision risk than cemented fixation (HR 1.29 [1.05-1.56]; p=0.017). This pattern was similar for low volume surgeons, with cementless fixation again carrying a higher revision risk than cemented fixation (HR 1.37 [1.11-1.70]; p=0.004). We could not detect a difference in revision risk for cemented fixation between low volume and high volume surgeons: at 0-1.5 years the HR was 0.96 (0.72-1.28; p=0.786), and at 1.5+ years the HR was 1.61 (0.83-3.11; p=0.159). Similarly, we could not detect a difference in revision risk for cementless fixation between low volume and high volume surgeons (HR 1.11 [0.96-1.29]; p=0.161).
Using large registry data, cemented hip hemiarthroplasty showed a significantly lower revision rate than cementless stems even when surgeons were stratified into high and low volume groups. Low volume surgeons who use uncemented prostheses have the highest rate of revision, while low volume surgeons who cement have a lower revision rate than high volume surgeons using cementless fixation. These results should reassure surgeons that, regardless of experience, a cemented hip hemiarthroplasty is the safest option for acute femoral neck fracture; even high volume surgeons who perform cementless hemiarthroplasty are not immune to technique-related revisions. Increased training and education should be offered to surgeons to improve comfort with the cemented technique.
Fractures of the humeral diaphysis occur in a bimodal distribution and represent 3-5% of all fractures. Presently, the standard treatment of isolated humeral diaphyseal fractures is nonoperative care using splints, braces, and slings. Recent data has questioned the effectiveness of this strategy in ensuring fracture healing and optimal patient function. The primary objective of this randomized controlled trial (RCT) was to assess whether operative treatment of humeral shaft fractures with a plate and screw construct provides a better functional outcome than nonoperative treatment. Secondary objectives compared union rates and both clinical and patient-reported outcomes.
Eligible patients with an isolated, closed humeral diaphyseal fracture were randomized to either nonoperative care (initial sugar-tong splint, followed by functional coaptation brace) or open reduction and internal fixation (ORIF; plate and screw construct). The primary outcome measure was the Disabilities of the Arm, Shoulder and Hand (DASH) score assessed at 2, 6, 16, 24, and 52 weeks. Secondary outcomes included the Short Musculoskeletal Functional Assessment (SMFA), the Constant Shoulder Score, range of motion (ROM), and radiographic parameters. Independent samples t-tests and Chi-squared analyses were used to compare treatment groups. The DASH, SMFA, and Constant Score were modelled over time using a multiple variable mixed effects model.
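The between-group comparisons rely on independent samples t-tests. A minimal sketch of Welch's unequal-variance t statistic, computed from summary data only (means, SDs, and group sizes), is:

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's independent-samples t statistic and approximate df.

    Works from summary statistics and does not assume equal variances.
    """
    v1, v2 = sd1**2 / n1, sd2**2 / n2
    t = (mean1 - mean2) / math.sqrt(v1 + v2)
    # Welch-Satterthwaite approximation for degrees of freedom
    df = (v1 + v2)**2 / (v1**2 / (n1 - 1) + v2**2 / (n2 - 1))
    return t, df

# 6-week DASH summary data from the results: ORIF 33.8 (SD 21.2, n=84)
# vs. nonoperative 56.5 (SD 21.1, n=84).
t, df = welch_t(33.8, 21.2, 84, 56.5, 21.1, 84)
```

The strongly negative t statistic on these summary values is consistent with the reported p<0.0001 at 6 weeks (whether the trial used Welch's or the pooled-variance variant is an assumption here).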
A total of 180 patients were randomized, with 168 included in the final analysis: 84 treated nonoperatively and 84 treated with ORIF. There was no significant difference between the two treatment groups in age (mean 45.4 years, SD 16.5 for the nonoperative group and 41.7 years, SD 17.2 for the ORIF group; p=0.16), sex (38.1% female in the nonoperative group and 39.3% female in the ORIF group; p=0.87), body mass index (mean 27.8, SD 8.7 for the nonoperative group and 27.2, SD 6.2 for the ORIF group; p=0.64), or smoking status (p=0.74). There was a significant improvement in DASH scores at 6 weeks in the ORIF group compared to the nonoperative group (mean 33.8, SD 21.2 vs. mean 56.5, SD 21.1; p<0.0001). At 4 months, DASH scores were also significantly better in the ORIF group (mean 21.6, SD 19.7 vs. mean 31.6, SD 24.6; p=0.009). However, there was no difference in DASH scores between the groups at 12-month follow-up (mean 8.8, SD 10.9 vs. mean 11.0, SD 16.9 in the nonoperative group; p=0.39). Males had improved DASH scores at all timepoints compared with females. Time to union was significantly quicker (p=0.016) and fracture position was improved (p<0.001) in the ORIF group. There were 13 (15.5%) nonunions in the nonoperative group and four (4.7%) combined superficial and deep infections in the ORIF group. There were seven radial nerve palsies in the nonoperative group and five (one iatrogenic) in the ORIF group.
This large RCT comparing operative and nonoperative treatment of humeral diaphyseal fractures found significantly improved functional outcome scores in patients treated surgically at 6 weeks and 4 months. However, the early functional improvement did not persist at the 12-month follow-up. There was a 15.5% nonunion rate, which required surgical intervention, in the nonoperative group and a similar radial nerve palsy rate between groups.
Multiligament knee injuries (MLKI) are rare and life-altering injuries that remain difficult to treat clinically due to a paucity of evidence guiding surgical management and timing. The purpose of this study was to compare injury specific functional outcomes following early versus delayed surgical reconstruction in MLKI patients to help inform timing decisions in clinical practice.
A retrospective analysis of prospectively collected data from patients with MLKIs at a single academic level 1 trauma center was conducted. Patients were eligible for inclusion if they had an MLKI, underwent reconstructive surgery either within 6 weeks of injury or between 12 weeks and 2 years from injury, and had at least 12 months of post-surgical follow-up. Patients with a vascular injury, open injuries, or associated fractures were excluded. Study participants were stratified into early (<6 weeks from injury) and delayed (12 weeks to 2 years from injury) surgical intervention groups. The primary outcome was patient-reported, injury-specific quality of life measured with the Multiligament Quality of Life questionnaire (MLQOL) and its four domains (Physical Impairment, Emotional Impairment, Activity Limitations, and Societal Involvement). We secondarily analyzed differences in the need for manipulation under anesthesia and reoperation rates, as well as radiographic Kellgren-Lawrence (KL) arthritis grades, knee laxity grading, and range of motion at the most recent follow-up.
A total of 131 patients met our inclusion criteria, all having had surgery between 2006 and 2019. There were 75 patients in the early group and 56 in the delayed group. The mean time to surgery was 17.6 ± 8.0 days in the early group versus 279 ± 146.5 days in the delayed group. Mean postoperative follow-up was 58 months. There were no significant differences between the early and delayed groups with respect to age (34 vs. 32.8 years), sex (77% vs. 63% male), BMI (28.3 vs. 29.7 kg/m²), or injury mechanism (p>0.05). The early surgery group included more patients with lateral-sided injuries (n=49 [65%] vs. n=23 [41%]; p=0.012), higher Schenck classification severity (p=0.024), and more nerve injuries at initial presentation (n=35 [49%] vs. n=8 [18%]; p<0.001). Multivariable linear regression analyses of the four domains of the MLQOL did not demonstrate an independent association with early versus delayed surgery status (p>0.05) when controlling for age, sex, Schenck classification, medial versus lateral injury, and nerve injury status. In terms of our secondary outcomes, the early group underwent significantly more manipulations under anesthesia compared with the delayed group (n=24 [32%] vs. n=8 [14%]; p=0.024). We did not identify a significant difference in physical examination laxity grades, range of motion, KL grade, or reoperation rates between groups (p>0.05).
We found no difference in patient reported outcomes between those who underwent early versus delayed surgery following MLKI reconstruction. In our secondary outcomes, we found significantly more patients in the early surgery group required a manipulation under anesthesia following surgery, which may indicate a propensity for arthrofibrosis after early MLKI reconstruction.
Hallux valgus surgery can result in moderate to severe post-operative pain requiring the use of narcotic medication. The percutaneous distal metatarsal osteotomy is a minimally invasive approach which offers many advantages including minimal scarring, immediate weight bearing and decreased post-operative pain. The goal of this study is to determine whether the use of narcotics can be eliminated using an approach combining multimodal analgesia, ankle block anesthesia and a minimally invasive surgical approach.
Following ethics board approval, a total of 160 ambulatory patients between the ages of 18-70 with BMI ≤ 40 undergoing percutaneous hallux valgus surgery are to be recruited and randomized into
During the first post-operative week, the
For the VAS scores at 24, 48, 72 hours and seven days the
Adequate visual clarity is paramount to performing arthroscopic shoulder surgery safely, efficiently, and effectively. The addition of epinephrine to the irrigation fluid and the intravenous or local administration of tranexamic acid (TXA) have independently been reported to decrease bleeding, thereby improving the surgeon's visualization during arthroscopic shoulder procedures. No study has compared the effects on visual clarity of systemically administered TXA, epinephrine added to the irrigation fluid, or the combination of both against a placebo group during shoulder arthroscopy. The purpose of this study was to determine whether intravenous TXA is a safe alternative to epinephrine delivered by a pressure-controlled pump for improving visualization during arthroscopic shoulder procedures, and whether using TXA and epinephrine together has an additive effect on visualization.
The design of the study was a double-blinded, randomized controlled trial with four 1:1:1:1 parallel groups conducted at one center. Patients aged ≥18 years undergoing arthroscopic shoulder procedures including rotator cuff repair, arthroscopic biceps tenotomy/tenodesis, distal clavicle excision, subacromial decompression and labral repair by five fellowship-trained upper extremity surgeons were randomized into one of four arms: Pressure pump-controlled regular saline irrigation fluid (control), epinephrine (1ml of 1:1000) mixed in irrigation fluid (EPI), 1g intravenous TXA (TXA), and epinephrine and TXA (EPI/TXA). Visualization was rated on a 4-point Likert scale every 15 minutes with 0 indicating ‘poor’ quality and 3 indicating ‘excellent’ quality. The primary outcome measure was the unweighted mean of these ratings. Secondary outcomes included mean arterial blood pressure (MAP), surgery duration, surgery complexity, and adverse events within the first postoperative week.
One hundred and twenty-eight participants with a mean age (± SD) of 56 (± 11) years were randomized. Mean visualization quality for the control, TXA, EPI, and EPI/TXA groups was 2.1 (±0.40), 2.1 (±0.52), 2.6 (±0.37), and 2.6 (±0.35), respectively. In a regression model with visual quality as the dependent variable, the presence/absence of EPI was the most significant predictor of visualization quality (R=0.525; p<0.001). TXA presence/absence had no effect, and there was no interaction between TXA and EPI. The addition of MAP and surgery duration strengthened the model (R=0.529; p<0.001). Increased MAP and surgery duration were both associated with decreased visualization quality. When surgery duration was controlled, surgery complexity was not a significant predictor of visualization quality. No adverse events were recorded in any of the groups.
Intravenous administration of TXA is not an effective alternative to epinephrine in the irrigation fluid to improve visualization during routine arthroscopic shoulder surgeries although its application is safe. There is no additional improvement in visualization when TXA is used in combination with epinephrine beyond the effect of epinephrine alone.
Distal radius fractures (DRF) are common, and the indications for surgical treatment remain controversial in patients older than 60 years. The purpose of this study was to review and analyze the current evidence-based literature.
We performed a systematic review and meta-analysis according to PRISMA guidelines to evaluate the efficacy of volar locking plating (VLP) versus conservative treatment of DRF in patients over 60 years old. Electronic databases including MEDLINE, CENTRAL, Embase, Web of Science, and ClinicalTrials.gov were searched from inception to October 2020 for randomized controlled trials. Relevant article reference lists were also screened.
Two reviewers independently screened and extracted the data. Main outcomes included functional status: wrist range of motion, validated scores, and grip strength. Secondary outcomes included post-operative complications and radiologic assessment.
From 3009 screened citations, 5 trials (539 patients) met the inclusion criteria. All trials in this random-effects meta-analysis were at moderate risk of bias due to lack of blinding. Differences in the DASH score (MD −5.91; 95% CI −8.83 to −3.00), PRWE score (MD −9.07; 95% CI −14.57 to −3.57), and grip strength (MD 5.12; 95% CI 0.59 to 9.65) were statistically significant and favored VLP. No effect was observed in terms of range of motion. Adverse events were frequent in both treatment groups, and the reoperation rate was higher in the VLP group.
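As an illustration of the random-effects pooling used in such meta-analyses, a DerSimonian-Laird estimate of a pooled mean difference can be reconstructed from each trial's MD and 95% CI. The input values below are hypothetical, not the five included trials.

```python
import math

def dersimonian_laird(studies, z=1.96):
    """Pool mean differences with DerSimonian-Laird random effects.

    `studies` is a list of (md, ci_low, ci_high) tuples; each study's SE
    is recovered from its 95% confidence interval.
    """
    md = [m for m, _, _ in studies]
    se = [(hi - lo) / (2 * z) for _, lo, hi in studies]
    w = [1 / s**2 for s in se]                               # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, md)) / sum(w)
    q = sum(wi * (yi - fixed)**2 for wi, yi in zip(w, md))   # Cochran's Q
    k = len(studies)
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                       # between-study variance
    w_re = [1 / (s**2 + tau2) for s in se]                   # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, md)) / sum(w_re)
    return pooled, tau2

# Hypothetical trials: two mean differences with their 95% CIs.
pooled, tau2 = dersimonian_laird([(-5.0, -8.0, -2.0), (-5.0, -9.0, -1.0)])
```

With identical study estimates, heterogeneity (τ²) is zero and the pooled MD equals the common value; real trials with differing estimates would yield τ² > 0 and wider random-effects intervals.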
VLP may provide better functional outcomes in patients older than 60 years. More RCTs are needed to evaluate whether the risks and complications of VLP outweigh the benefits.
Diagnostic interpretation error of paediatric musculoskeletal (MSK) radiographs can lead to late presentation of injuries that subsequently require more invasive surgical interventions with increased risks of morbidity. We aimed to determine the radiograph factors that resulted in diagnostic interpretation challenges for emergency physicians reviewing pediatric MSK radiographs.
Emergency physicians provided diagnostic interpretations on 1,850 pediatric MSK radiographs via their participation in a web-based education platform. From this data, we derived interpretation difficulty scores for each radiograph using item response theory. We classified each radiograph by body region, diagnosis (fracture/dislocation absent or present), and, where applicable, the specific fracture location(s) and morphology(ies). We compared the interpretation difficulty scores by diagnosis, fracture location, and morphology. An expert panel reviewed the 65 most commonly misdiagnosed radiographs without a fracture/dislocation to identify normal imaging findings that were commonly mistaken for fractures.
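The study derived difficulty scores from an item response theory model. As a much cruder stand-in for illustration only (not the IRT model itself), a radiograph's difficulty can be proxied by the log-odds of misdiagnosis across all physicians who interpreted it:

```python
import math

def difficulty_logit(errors, attempts):
    """Crude difficulty proxy: log-odds of a diagnostic error on one radiograph.

    Illustrative simplification only; the study used item response theory,
    which also accounts for differences in physician ability.
    """
    p = errors / attempts
    return math.log(p / (1 - p))

# A radiograph misread half the time sits at 0 on the logit scale;
# harder radiographs score higher, easier ones lower.
d_half = difficulty_logit(50, 100)
d_hard = difficulty_logit(75, 100)
```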
We included data from 244 emergency physicians, which resulted in 185,653 unique radiograph interpretations, 42,689 (23.0%) of which were diagnostic errors. For humerus, elbow, forearm, wrist, femur, knee, and tibia-fibula radiographs, those without a fracture had higher interpretation difficulty scores relative to those with a fracture; the opposite was true for hand, pelvis, foot, and ankle radiographs (p<0.004 for all comparisons). The descriptive review demonstrated that specific normal anatomy, overlapping bones, and external artefact from muscle or skin folds were often mistaken for fractures. There was a significant difference in difficulty score by anatomic location of the fracture in the elbow, pelvis, and ankle (p<0.004 for all comparisons). Ankle and elbow growth plate fractures, fibular avulsion fractures, and humeral condylar fractures were more difficult to diagnose than other fracture patterns (p<0.004 for all comparisons).
We identified actionable learning opportunities in paediatric MSK radiograph interpretation for emergency physicians. We will use this information to design targeted education to referring emergency physicians and their trainees with an aim to decrease delayed and missed paediatric MSK injuries.
Acute spinal cord injury (SCI) is most often secondary to trauma and frequently presents with associated injuries. A neurological examination is routinely performed during trauma assessment, including through Advanced Trauma Life Support (ATLS). However, there is no standard neurological assessment tool specifically used for trauma patients to detect and characterize SCI during the initial evaluation. The International Standards for Neurological Classification of Spinal Cord Injury (ISNCSCI) is the most comprehensive and popular tool for assessing SCI, but it is not adapted to acute trauma patients and is therefore not routinely used in that setting. Our objective was to develop a new tool that can be used routinely in the initial evaluation of trauma patients to detect and characterize acute SCI while preserving the basic principles of the ISNCSCI.
The completion rate of the ISNCSCI during the initial evaluation after an acute traumatic SCI was first estimated. Using a modified Delphi technique, we designed the Montreal Acute Classification of Spinal Cord Injuries (MAC-SCI), a new tool to detect and characterize the completeness (grade) and level of SCI in the polytrauma patient. The ability of the MAC-SCI to detect and characterize SCI was validated in a cohort of 35 individuals who had sustained an acute traumatic SCI. The completeness and neurological level of injury (NLI) were assessed by two independent assessors using the MAC-SCI and compared to those obtained with the ISNCSCI.
Only 33% of patients admitted after an acute traumatic SCI had a complete ISNCSCI performed at initial presentation. The MAC-SCI includes 53 of the 134 original elements of the ISNCSCI, a 60% reduction. There was 100% concordance between the severity grade derived from the MAC-SCI and that derived from the ISNCSCI. The NLI obtained with the MAC-SCI was concordant within two levels of that obtained from the ISNCSCI in 100% of patients, and within one level in 91%. The ability of the MAC-SCI to discriminate between cervical (C0 to C7), thoracic (T1 to T9), thoraco-lumbar (T10 to L2), and lumbosacral (L3 to S5) injuries was 100% with respect to the ISNCSCI.
The rate of completion of the ISNCSCI at initial presentation after an acute traumatic SCI is low. The MAC-SCI is a streamlined tool, specifically adapted to the acute trauma setting, proposed to detect and characterize acute SCI in polytrauma patients. It is accurate for determining the completeness of the SCI and for localizing the NLI (cervical vs. thoracic vs. lumbar). It could be implemented in the initial trauma assessment protocol to guide the acute management of SCI patients.
Novel immersive virtual reality (IVR) technologies are revolutionizing medical education. Virtual anatomy education using head-mounted displays allows users to interact with virtual anatomical objects, move within the virtual rooms, and interact with other virtual users. While IVR has been shown to be more effective than textbook learning and 3D computer models presented in 2D screens, the effectiveness of IVR compared to cadaveric models in anatomy education is currently unknown. In this study, we aim to compare the effectiveness of IVR with direct cadaveric bone models in teaching upper and lower limb anatomy for first-year medical students.
A randomized, double-blind, crossover non-inferiority trial was conducted. Participants were first-year medical students from a single university. Exclusion criteria included students who had undertaken prior undergraduate or graduate degrees in anatomy. In the first stage of the study, students were randomized in a 1:1 ratio to IVR or cadaveric bone groups studying upper limb skeletal anatomy. All students were then crossed over and used cadaveric bone or IVR to study lower limb skeletal anatomy. All students in both groups completed a pre- and post-intervention knowledge test. The educational content was based on the University of Toronto Medical Anatomy Curriculum. Oculus Quest 2 headsets (Meta Technologies) and the PrecisionOS Anatomy application (PrecisionOS Technology) were utilized for the virtual reality component. The primary endpoint of the study was student performance on the pre- and post-intervention knowledge tests. We hypothesized that student performance in the IVR group would be comparable to the cadaveric bone group.
50 first-year medical students met inclusion criteria and were computer-randomized (1:1 ratio) to the IVR or cadaveric bone group for upper limb skeletal anatomy education. Forty-six students attended the study, 21 completed the upper limb modules, and 19 completed the lower limb modules. Among all students, the average score on the pre-intervention knowledge test was 14.6% (Standard Deviation (SD)=18.2%) for the upper limb and 25.0% (SD=17%) for the lower limb. The increase in students' scores between the pre- and post-intervention knowledge tests for the upper limb was 15% with IVR and 16.7% with cadaveric bones (p=0.2861); for the lower limb, the increase was 22.6% in the IVR group and 22.5% in the cadaveric bone group (p=0.9356).
In this non-inferiority crossover randomized controlled trial, we found no significant difference between student performance in knowledge tests after using IVR or cadaveric bones. Immersive virtual reality and cadaveric bones were equally effective in skeletal anatomy education. Going forward, with advances in VR technologies and anatomy applications, we can expect to see further improvements in the effectiveness of these technologies in anatomy and surgical education. These findings have implications for medical schools having challenges in acquiring cadavers and cadaveric parts.
The reconstruction of peri-acetabular defects after severe bone loss or pelvic resection for tumor is among the most challenging surgical interventions. The Lumic® prosthesis (Implantcast, Buxtehude, Germany) was first introduced in 2008 in an effort to reduce the mechanical complications encountered with classic peri-acetabular reconstruction techniques and to improve functional outcomes. Few studies have evaluated the results associated with this recent implant.
A retrospective study from five Canadian orthopedic oncology centers was conducted. Every patient in whom a Lumic® endoprosthesis was used for reconstruction after peri-acetabular resection or severe bone loss, with a minimum follow-up of three months, was included. Charts were reviewed, and data concerning patient demographics, peri-operative characteristics, and post-operative complications were collected. Surgical and functional outcomes were also assessed.
Sixteen patients (11 males, five females) were included and followed for a mean of 28 months [3-60]. Mean age was 55 years [17-86], and mean BMI was 28 [19.6-44]. Twelve patients (75%) received a Lumic® after resection of a primary sarcoma, two following pelvic metastasis, one for a benign tumor, and one after a comminuted acetabular fracture with bone loss. Twelve patients (75%) had their surgery performed in one stage, whereas four had a planned two-stage procedure. Mean surgical time was 555 minutes [173-1230] and blood loss averaged 2100 mL [500-5000]. The mean MSTS score was 60.3 preoperatively [37.1-97] and 54.3 postoperatively [17.1-88.6]. Five patients (31.3%) had a cemented Lumic® stem. All patients received the dual mobility bearing, and 10 patients (62.5%) had the largest acetabular cup implanted (60 mm); in seven of these 10 patients the silver-coated implant was used to minimize infection risk. Five patients (31.3%) underwent capsular reconstruction using a synthetic fabric aiming to reduce the dislocation risk. Five patients (31.3%) had intra-operative complications; four were minor and one was serious (a comminuted iliac bone fracture requiring internal fixation). Four patients dislocated within a month post-operatively, and one additional patient sustained a dislocation one year post-operatively. Eight patients (50%) had a post-operative surgical site infection, including all four patients who had a two-stage surgery. Ten patients (62.5%) needed a reoperation (two for fabric insertion, five for wash-outs, and three for implant exchange/removal). One patient (6.3%) had septic loosening three years after surgery. At the time of data collection, 13 patients (81.3%) were alive, nine of them free of disease. Silver coating was not found to reduce infection risk (p=0.2), and capsuloplasty did not prevent dislocation (p=1).
These results are comparable to the sparse published data. The Lumic® endoprosthesis therefore appears to provide good functional outcomes and low rates of loosening at short- to medium-term follow-up. Infection and dislocation are common complications, but we were unable to show a benefit of capsuloplasty or of silver-coated implants. Larger series and longer follow-up are needed.
Bone turnover and the accumulation of microdamage are impacted by the presence of skeletal metastases which can contribute to increased fracture risk. Treatments for metastatic disease may further impact bone quality. The present study aims to establish a preliminary understanding of microdamage accumulation and load to failure in osteolytic vertebrae following stereotactic body radiotherapy (SBRT), zoledronic acid (ZA), or docetaxel (DTX) treatment.
Twenty-two six-week-old athymic female rats (Hsd:RH-Foxn1rnu, Envigo, USA) were inoculated with HeLa cervical cancer cells through intracardiac injection (day 0). Institutional approval was obtained for this work and the ARRIVE guidelines were followed. Animals were randomly assigned to four groups: untreated (n=6), spine stereotactic body radiotherapy (SBRT) administered on day 14 (n=6), zoledronic acid (ZA) administered on day 7 (n=5), and docetaxel (DTX) administered on day 14 (n=5). Animals were euthanized on day 21. T13-L3 vertebral segments were collected immediately after sacrifice and stored at −20°C wrapped in saline-soaked gauze until testing. µCT scans (µCT100, Scanco, Switzerland) of the T13-L3 segment confirmed tumour burden in all T13 and L2 vertebrae prior to testing. T13 was stained with BaSO4 to label microdamage. High-resolution µCT scans were obtained (90 kVp, 44 µA, 4 W, 4.9 µm voxel size) to visualize stain location and volume. Segmentations of bone and BaSO4 were created using intensity thresholding at 3000 HU (~736 mgHA/cm³) and 10000 HU (~2420 mgHA/cm³), respectively. Non-specific BaSO4 was removed from the outer edge of the cortical shell by shrinking the segmentation by 105 µm in 3D. Stain volume fraction was calculated as the ratio of BaSO4 volume to the sum of BaSO4 and bone volume. The L1-L3 motion segments were loaded under axial compression to failure using a µCT-compatible loading device (Scanco), and force-displacement data were recorded. µCT scans were acquired unloaded, at 1500 µm displacement, and post-failure. Stereological analysis was performed on the L2 vertebrae in the unloaded µCT scans. Differences in mean stain volume fraction, mean load to failure, and mean bone volume/total volume (BV/TV) were compared between treatment groups using one-way ANOVAs. Pearson's correlation between stain volume fraction and load to failure by treatment was calculated using an adjusted load to failure (load to failure divided by BV/TV).
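The stain volume fraction follows directly from the definition above, and the group comparison is a standard one-way ANOVA. A minimal sketch of both computations (the sample volumes and group values below are invented, not study data):

```python
def stain_volume_fraction(v_baso4, v_bone):
    """Ratio of BaSO4 (stained damage) volume to total segmented volume."""
    return v_baso4 / (v_baso4 + v_bone)

def anova_f(groups):
    """One-way ANOVA F statistic for a list of groups of measurements."""
    n = sum(len(g) for g in groups)
    k = len(groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand)**2 for g in groups)
    ss_within = sum((x - sum(g) / len(g))**2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented example values (arbitrary volume units and group measurements).
svf = stain_volume_fraction(0.2, 0.8)
f = anova_f([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
```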
Stained damage fraction was significantly different between treatment groups (p=0.0029). Tukey post-hoc analysis showed untreated samples to have higher stain volume fraction (
Focal and systemic cancer treatments affect microdamage accumulation and load to failure in osteolytic vertebrae. Ongoing testing of healthy controls will help to further separate the effects of the tumour and of cancer treatments on bone quality.
Despite advances in treating acute spinal cord injury (SCI), measures to mitigate permanent neurological deficits in affected patients are limited. Augmentation of mean arterial blood pressure (MAP) to promote blood flow and oxygen delivery to the injured cord is one of the only currently available treatment options to potentially improve neurological outcomes after acute SCI. However, to optimize such hemodynamic management, clinicians require a method to measure and monitor the physiological effects of these MAP alterations within the injured cord in real time. To address this unmet clinical need, we developed a series of miniaturized optical sensors and a monitoring system based on multi-wavelength near-infrared spectroscopy (MW-NIRS) for direct transdural measurement and continuous real-time monitoring of spinal cord hemodynamics and oxygenation. We conducted a feasibility study in a porcine model of acute SCI, and completed two separate animal studies to examine sensor function and the validity of the collected data in an acute experiment and a seven-day post-injury survival experiment.
In our first animal experiment, nine Yorkshire pigs underwent a weight-drop T10 vertebral level contusion-compression injury and received episodes of ventilatory hypoxia and alterations in MAP. Spinal cord hemodynamics and oxygenation were monitored throughout by a transdural NIRS sensor prototype, as well as an invasive intraparenchymal (IP) sensor as a comparison. In a second experiment, we studied six Yucatan miniature pigs that underwent a T10 injury. Spinal cord oxygenation and hemodynamics parameters were continuously monitored by an improved NIRS sensor over a long period. Episodes of MAP alteration and hypoxia were performed acutely after injury and at two- and seven-days post-injury to simulate the types of hemodynamic changes patients experience after an acute SCI. All NIRS data were collected in real-time, recorded and analyzed in comparison with IP measures.
Noninvasive NIRS parameters of tissue oxygenation were highly correlated with invasive IP measures of tissue oxygenation in both studies. In particular, during periods of hypoxia and MAP alterations, changes of NIRS-derived spinal cord tissue oxygenation percentage were significant and corresponded well with the changes in spinal cord oxygen partial pressures measured by the IP sensors (p < 0.05).
Our studies indicate that a novel optical biosensor developed by our team can monitor real-time changes in spinal cord hemodynamics and oxygenation over the first seven days post-injury and can detect local tissue changes that are reflective of systemic hemodynamic changes. Our implantable spinal cord NIRS sensor is intended to help clinicians by providing real-time information about the effects of hemodynamic management on the injured spinal cord. Hence, our novel NIRS system has the near-term potential to impact clinical care and improve neurologic outcomes in acute SCI. To translate our studies from bench to bedside, we have developed an advanced clinical NIRS sensor that is ready to be implanted in the first cohort of acute SCI patients in 2022.
Shoulder arthroplasty humeral stem design has evolved to accommodate patient anatomy characteristics. As a result, stems are available in numerous shapes, coatings, lengths, sizes, and vary by fixation method. This abundance of stem options creates a surgical paradox of choice. Metrics describing stem stability, including a stem's resistance to subsidence and micromotion, are important factors that should influence stem selection, but have yet to be assessed in response to the diametral (i.e., thickness) sizing of short stem humeral implants.
Eight paired cadaveric humeri (age = 75±15 years) were reconstructed with surgeon selected ‘standard’ sized short-stemmed humeral implants, as well as 2mm ‘oversized’ implants. Stem sizing conditions were randomized to left and right humeral pairs. Following implantation, an anteroposterior radiograph was taken of each stem and the metaphyseal and diaphyseal fill ratios were quantified. Each humerus was then potted in polymethyl methacrylate bone cement and subjected to 2000 cycles of 90° forward flexion loading. At regular intervals during loading, stem subsidence and micromotion were assessed using a validated system of two optical markers attached to the stem and humeral pot (accuracy of <15µm).
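The abstract does not state how the fill ratios were computed; conventionally, a fill ratio is the implant's width divided by the endosteal canal width at the corresponding level of the radiograph. A minimal sketch of that conventional calculation (the function name and example dimensions are illustrative, not taken from the study):

```python
def fill_ratio(stem_width_mm: float, canal_width_mm: float) -> float:
    """Fill ratio at one radiographic level: stem width / canal width.

    This is the conventional definition, not a formula quoted from the
    study itself; a ratio near 1.0 means the stem fills the canal.
    """
    if canal_width_mm <= 0:
        raise ValueError("canal width must be positive")
    return stem_width_mm / canal_width_mm

# Illustrative numbers only: a 13 mm stem section in a 25 mm canal
print(round(fill_ratio(13.0, 25.0), 2))  # 0.52
```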
The metaphyseal fill ratio did not differ significantly between the oversized and standard stems (0.50±0.06 vs 0.50±0.10; P = 0.997, Power = 0.05); however, the diaphyseal fill ratio did (0.52±0.06 vs 0.45±0.07; P < 0.001, Power = 1.0). Neither fill ratio correlated significantly with stem subsidence or micromotion. Stem subsidence and micromotion were found to plateau following 400 cycles of loading. Oversizing stem thickness prevented implant head-back contact in all but one specimen with the least dense metaphyseal bone, while standard sizing only yielded incomplete head-back contact in the two subjects with the densest bone. Oversized stems subsided significantly less than their standard counterparts (standard: 1.4±0.6mm, oversized: 0.5±0.5mm; P = 0.018, Power = 0.748), and resulted in slightly more micromotion (standard: 169±59µm, oversized: 187±52µm; P = 0.506, Power = 0.094).
Short stem diametral sizing (i.e., thickness) has an impact on stem subsidence and micromotion following humeral arthroplasty. In both cases, the resulting three-dimensional stem micromotion exceeded the 150µm limit suggested for bone ingrowth, although that limit was derived from a uniaxial assessment. Though not statistically significant, the increased stem micromotion associated with stem oversizing may in part be attributed to over-compacting the cancellous bed during broaching, which creates a denser, potentially smoother, interface, though this influence requires further assessment. The findings of the present investigation highlight the importance of proper short stem diametral sizing, as even a relatively small (2mm) increase can negatively impact the subsidence and micromotion of the stem-bone construct. Future work should focus on developing tools and methods to support surgeons in what is currently a subjective process of stem selection.
Osteoarthritis (OA) is the most common form of arthritis and one of the ten most disabling diseases in developed countries. Total joint replacement (TJR) is considered by far as the most effective treatment for end-stage OA patients. The majority of patients achieve symptomatic improvement following TJR. However, about 22% of the TJR patients either do not improve or deteriorate after surgery. Several potential non-genetic predictors for the TJR outcome have been investigated. However, the results were either inconclusive or had very limited predictive power. The aim of this study was to identify genetic variants for the poor outcome of TJR in primary OA patients by a genome-wide association study (GWAS).
Study participants were total knee or hip replacement patients with primary OA who were recruited to the Newfoundland Osteoarthritis Study (NFOAS) before 2017. The Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) was used to assess pain and functional impairment pre-operatively and 3.99±1.38 years post-surgery. Two non-responder classification criteria were used. The first was defined by the absolute WOMAC change score: participants with a change score of less than 7/20 points for pain were considered pain non-responders, and those with less than 22/68 points for function were classified as function non-responders. The second was the Outcome Measures in Arthritis Clinical Trials and Osteoarthritis Research Society International (OMERACT-OARSI) criteria. Blood DNA samples were genotyped using the Illumina GWAS microarray genotyping platform. Quality control (QC) filtering was performed on the GWAS data before the association of genetic variants with non-responder status was tested using the GenABEL package in R, with adjustment for the relatedness of the study population and the commonly accepted GWAS significance threshold of p < 5×10⁻⁸ to control for multiple testing.
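The absolute WOMAC change-score criterion described above is a simple threshold rule; it can be expressed directly in code (a sketch only — the function and field names are ours, not from the study):

```python
def womac_non_responder(pain_change: float, function_change: float) -> dict:
    """Absolute WOMAC change-score criterion from the methods:
    improvement < 7/20 points in pain      -> pain non-responder,
    improvement < 22/68 points in function -> function non-responder."""
    return {
        "pain_non_responder": pain_change < 7,
        "function_non_responder": function_change < 22,
    }

# A hypothetical patient improving 5 pain points and 30 function points
# would be a pain non-responder but a function responder:
print(womac_non_responder(5, 30))
```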
In total, 316 knee and 122 hip OA patients (mean age 65.45±7.62 years, 58% female) passed the QC check. Based on the absolute WOMAC change-score classification, these participants comprised 368 responders and 56 non-responders for pain, and 364 responders and 68 non-responders for function. The OMERACT-OARSI criteria identified 377 responders and 56 non-responders for pain, and 366 responders and 71 non-responders for function. Interestingly, both classification methods yielded the same results: the G allele of rs4797006 was significantly associated with pain non-response, with an odds ratio (OR) of 5.12 (p = 7.27×10⁻¹⁰). This SNP is in intron one of the melanocortin receptor 5 (MC5R) gene.
Our data suggest that two chromosomal regions are associated with poor TJR outcomes and could be novel targets for developing strategies to improve the outcome of TJR.
Bone turnover and the accumulation of microdamage are impacted by the presence of skeletal metastases which can contribute to increased fracture risk. Treatments for metastatic disease may further impact bone quality. The present study aims to establish a preliminary understanding of microdamage accumulation and load to failure in osteolytic vertebrae following stereotactic body radiotherapy (SBRT), zoledronic acid (ZA), or docetaxel (DTX) treatment.
Twenty-two six-week-old athymic female rats (Hsd:RH-Foxn1rnu, Envigo, USA) were inoculated with HeLa cervical cancer cells through intracardiac injection (day 0). Institutional approval was obtained for this work and the ARRIVE guidelines were followed. Animals were randomly assigned to four groups: untreated (n=6), spine stereotactic body radiotherapy (SBRT) administered on day 14 (n=6), zoledronic acid (ZA) administered on day 7 (n=5), and docetaxel (DTX) administered on day 14 (n=5). Animals were euthanized on day 21. T13-L3 vertebral segments were collected immediately after sacrifice and stored at −20°C wrapped in saline-soaked gauze until testing. µCT scans (µCT100, Scanco, Switzerland) of the T13-L3 segment confirmed tumour burden in all T13 and L2 vertebrae prior to testing. T13 was stained with BaSO4 to label microdamage. High resolution µCT scans were obtained (90kVp, 44µA, 4W, 4.9µm voxel size) to visualize stain location and volume. Segmentations of bone and BaSO4 were created using intensity thresholding at 3000HU (~736mgHA/cm3) and 10000HU (~2420mgHA/cm3), respectively. Non-specific BaSO4 was removed from the outer edge of the cortical shell by shrinking the segmentation by 105µm in 3D. Stain volume fraction was calculated as the ratio of BaSO4 volume to the sum of BaSO4 and bone volume. The L1-L3 motion segments were loaded under axial compression to failure using a µCT-compatible loading device (Scanco) and force-displacement data were recorded. µCT scans were acquired unloaded, at 1500µm displacement, and post-failure. Stereological analysis was performed on the L2 vertebrae in the unloaded µCT scans. Differences in mean stain volume fraction, mean load to failure, and mean bone volume/total volume (BV/TV) were compared between treatment groups using one-way ANOVAs. Pearson's correlation between stain volume fraction and load to failure by treatment was calculated using an adjusted load to failure, defined as load to failure divided by BV/TV.
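The two derived quantities in the analysis above — stain volume fraction and the BV/TV-adjusted load to failure — reduce to simple ratios; a sketch of both, with variable names and example values of our own choosing:

```python
def stain_volume_fraction(baso4_vol_mm3: float, bone_vol_mm3: float) -> float:
    """Ratio of BaSO4 volume to the sum of BaSO4 and bone volume,
    as defined in the methods."""
    return baso4_vol_mm3 / (baso4_vol_mm3 + bone_vol_mm3)

def adjusted_load_to_failure(load_n: float, bv_tv: float) -> float:
    """Load to failure divided by bone volume fraction (BV/TV), used to
    decouple failure load from the amount of bone present."""
    return load_n / bv_tv

# Illustrative values only, not study data:
print(round(stain_volume_fraction(0.6, 11.4), 3))   # 0.05
print(round(adjusted_load_to_failure(120.0, 0.4), 1))  # 300.0
```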
Stained damage fraction was significantly different between treatment groups (p=0.0029). Tukey post-hoc analysis showed untreated samples to have higher stain volume fraction (
Focal and systemic cancer treatments affect microdamage accumulation and load to failure in osteolytic vertebrae. Ongoing testing of healthy controls will help to further separate the effects of the tumour and of cancer treatments on bone quality.
Gram-negative prosthetic joint infections (GN-PJI) present unique challenges in management due to their distinct pathogenesis of biofilm formation on implant surfaces. To date, there are no animal models that can fully recapitulate how a biofilm is challenged in vivo in the setting of GN-PJI. The purpose of this study is to establish a clinically representative GN-PJI in vivo model that can reliably depict biofilm formation on titanium implant surface. We hypothesized that the biofilm formation on the implant surface would affect the ability of the implant to be osseointegrated.
The model was developed using a 3D-printed, medical-grade titanium (Ti-6Al-4V), monoblock, cementless hemiarthroplasty hip implant. This implant was used to replace the femoral head of a Sprague-Dawley rat using a posterior surgical approach. To induce PJI, two bioluminescent Pseudomonas aeruginosa (PA) strains were utilized: a reference strain (PA14-lux) and a mutant strain that is defective in biofilm formation (DflgK-lux). PJI development and biofilm formation was quantitatively assessed in vivo using the in vivo imaging system (IVIS), and in vitro using the viable colony count of the bacterial load on implant surface. Magnetic Resonance Imaging (MRI) was acquired to assess the involvement of periprosthetic tissue in vivo, and the field emission scanning electron microscopy (FE-SEM) of the explanted implants was used to visualize the biofilm formation at the bone-implant interface. The implant stability, as an outcome, was directly assessed by quantifying the osseointegration using microCT scans of the extracted femurs with retained implants in vitro, and indirectly assessed by identifying the gait pattern changes using DigiGaitTM system in vivo.
A localized prosthetic infection was reliably established within the hip joint and was followed by IVIS in real-time. There were quantitative and qualitative differences in bacterial load and biofilm formation between PA14 and DflgK. The difference between the two strains in their ability to persist in the model was reflected in the gait pattern and implant osseointegration.
We developed a novel uncemented hip hemiarthroplasty GN-PJI rat model. This model is clinically representative since animals can bear weight on the implant. PJI was detected by various modalities. In addition, biofilm formation correlated with implant function and stability. In conclusion, the proposed in vivo GN-PJI model will allow for more reliable testing of novel biofilm-targeting therapeutics.
One in five patients remains unsatisfied due to ongoing pain and impaired mobility following total knee arthroplasty (TKA). It would be valuable if surgeons could pre-operatively identify which patients are at risk of poor outcomes after TKA. The purpose of this study was to determine whether there is an association between pre-operative measures and post-operative outcomes in patients who underwent TKA.
This study included 28 patients (12 female, 16 male; age = 63.6 ± 6.9 years; BMI = 29.9 ± 7.4 kg/m2) with knee osteoarthritis who were scheduled to undergo TKA. All surgeries were performed by the same surgeon (GD) using a subvastus approach. Patients visited the gait lab within one month before surgery and again 12 months after surgery. At the gait lab, patients completed the Knee injury and Osteoarthritis Outcome Score (KOOS), a timed up-and-go (TUG) test, and a walking task. Variables of interest included the five KOOS sub-scores (symptoms, pain, activities of daily living, sport & recreation, and quality of life), TUG completion time, walking speed, and peak knee biomechanics variables (flexion angle, abduction moment, power absorption). A Pearson's product-moment correlation was run to assess the relationship between pre-operative measures and post-operative outcomes in the TKA patients.
Preliminary analyses showed the relationship to be linear with all variables normally distributed, as assessed by the Shapiro-Wilk test (p > .05), and there were no outliers. There were no statistically significant correlations between any of the pre-operative KOOS sub-scores and any of the post-operative biomechanical outcomes. Pre-operative TUG time had a statistically significant, moderate positive correlation with post-operative peak knee abduction moments [r(14) = .597, p < .001] and peak knee power absorption [r(14) = .498, p = .007], with pre-operative TUG time explaining 36% of the variability in peak knee abduction moment and 25% of the variability in peak knee power absorption. Pre-operative walking speed had a statistically significant, moderate negative correlation with post-operative peak knee abduction moments [r(14) = -.558, p = .002] and peak knee power absorption [r(14) = -.548, p = .003], with pre-operative walking speed explaining 31% of the variability in peak knee abduction moment and 30% of the variability in peak knee power absorption.
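The percentages of explained variability quoted above are simply the squared correlation coefficients (the coefficient of determination, r²); recomputing them from the reported r values reproduces the stated figures. A sketch, not the study's analysis code:

```python
# Reported Pearson r values from the abstract
correlations = {
    "TUG time vs peak abduction moment": 0.597,
    "TUG time vs peak power absorption": 0.498,
    "walking speed vs peak abduction moment": -0.558,
    "walking speed vs peak power absorption": -0.548,
}

for pair, r in correlations.items():
    # r**2 is the share of variance in the outcome explained by the predictor
    print(f"{pair}: r^2 = {r ** 2:.0%}")
```

Running this prints 36%, 25%, 31%, and 30%, matching the variability figures in the text.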
Patient-reported outcome measures (PROMs), such as the KOOS, indicate that TKA is generally successful at relieving pain and show an overall improvement. However, their pre-operative values do not correlate with biomechanical indicators of post-operative success, such as peak knee abduction moment and knee power. Shorter pre-operative TUG times and faster pre-operative walking speeds were correlated with improved post-operative biomechanical outcomes. These are simple tasks surgeons can implement in their clinics to evaluate their patients. Future research should expand these findings to a larger sample size and determine whether other factors, such as surgical approach or implant design, improve patient outcomes.
Prosthetic joint infection (PJI) is a complex disease that causes significant damage to the peri-implant tissue. Developing an animal model that is clinically relevant in depicting this disease process is an important step towards developing novel successful therapies. In this study, we have performed a thorough histologic analysis of peri-implant tissue harvested post Staphylococcus aureus (S. aureus) infection of a cemented 3D-printed titanium hip implant in rats.
Sprague-Dawley rats underwent left hip cemented 3D-printed titanium hemiarthroplasty via a posterior approach under general anesthesia. Four surgeries were performed for the control group and another four for the infected group. The hip joint was inoculated with 5×10⁹ CFU/mL of S. aureus.
The histologic analysis revealed strong resemblance to the tissue changes seen in the clinical setting of chronic PJI. IHC demonstrated the extent of bacterial spread within the peri-implant tissue away from the site of infection. The H&E and MT stains showed five main features in infected bone: 1) increased PMNs, 2) fibrovascular inflammation, 3) bone necrosis, 4) increased osteoclasts, and 5) fibrosis of muscular tissue and cartilage. Micro-CT data showed significantly more osteolysis around the infected prosthesis compared to control (surgery with no infection).
This is the first clinically relevant PJI animal model with detailed histologic analysis that strongly resembles the clinical tissue pathology of chronic PJI. This model can provide a better understanding of how various PJI therapies can halt or reverse peri-implant tissue damage caused by infection.
One in nine Canadian males will develop prostate cancer (PC) during his lifetime. Life expectancy of males with PC has increased with modern therapy, and 90% live more than 10 years. However, 20% of PC-affected males will develop incurable metastatic disease. Bone metastases (BM) are present in ~80% of metastatic PC patients and are the most severe complication of PC, generating severe pain, fractures, spinal cord compression, and death. Interestingly, PC bone metastases (PCBM) are mostly osteoblastic. However, the structure of this newly formed bone and how it relates to pain and fracture are unknown. Due to androgen antagonist treatment, different PC phenotypes develop with differential dependency on androgen receptor (AR) signaling: androgen-dependent (AR+), double negative (AR-), and neuroendocrine. How these phenotypes relate to changes in bone structure has not been studied. Here we present a state-of-the-art structural characterization of PCBM and show how PC phenotypes are associated with abnormal bone formation in PCBM.
Cadaveric samples (n=14) obtained from metastases of PC in thoracic or lumbar vertebrae (mean age 74 years) were used to analyze bone structure. We used micro-computed tomography (mCT) to analyze the three-dimensional structure of the bone samples. After imaging, the samples were sectioned; one 3mm-thick section was embedded in epoxy resin, ground, and polished. Scanning electron microscopy (SEM)/energy-dispersive X-ray spectroscopy (EDS) and quantitative backscattered electron (qBSE) imaging were used to determine mineral morphology and composition. Another section was used for histological analysis of the PC-affected bone. Collagen structure, fibril orientation and extracellular matrix composition were characterized using histochemistry. Additionally, we obtained biopsies from three PCBM patients undergoing emergency decompression surgery following vertebral fracture and used them for immunohistological characterization.
Using mCT, we observed three dysmorphic bone patterns: an osteolytic pattern with thinned trabeculae in otherwise well-organized structures; an osteoblastic pattern defined as accumulation of disorganized matrix deposited on pre-existing trabeculae; and an osteoblastic pattern with minimal residual trabeculae, in which the bone space is dominated by accumulation of disorganized mineralized matrix. Comparing mCT data with pathological/clinical parameters revealed a trend toward higher bone density in males with a larger PSA increase. In histological sections, we observed that PC-affected bone lacks an aligned collagen structure, has a higher number of lacunae, and contains an increased amount of proteoglycans such as decorin.
Immunohistochemistry of biopsies revealed that PC cells inside bone organize in two manners: i) glandular-like structures in which cells maintain their polarization in the expression of prostate markers, and ii) a diffuse infiltrate that spreads along bone surfaces, with loss of cell polarity. These cells make direct contact with osteoblasts on the surface of trabeculae. We found that PCBM are composed mostly of AR+ cells with some double negative cells. We did not observe cells of neuroendocrine phenotype.
PCBMs generate predominantly osteoblastic lesions characterized by high lacunar density, lack of collagen organization, and elevated proteoglycan content. These structural changes are associated with the infiltration of PC cells that are mostly androgen-dependent but have lost their polarization and contact osteoblasts directly, perhaps altering their function. These changes could be associated with the lower mechanical properties that lead to weakness and fracture of PCBM-affected bone.
Femoroacetabular impingement (FAI) – enlarged, aspherical femoral head deformity (cam-type) or retroversion/overcoverage of the acetabulum (pincer-type) – is a leading cause for early hip osteoarthritis. Although anteverting/reverse periacetabular osteotomy (PAO) to address FAI aims to preserve the native hip and restore joint function, it is still unclear how it affects joint mobility and stability. This in vitro cadaveric study examined the effects of surgical anteverting PAO on range of motion and capsular mechanics in hips with acetabular retroversion.
Twelve cadaveric hips (n = 12, m:f = 9:3; age = 41 ± 9 years; BMI = 23 ± 4 kg/m2) were included in this study. Each hip was CT imaged; imaging indicated acetabular retroversion (i.e., crossover sign, posterior wall sign, ischial wall sign, retroversion index > 20%, axial plane acetabular version < 15°) and showed no other abnormalities. Each hip was denuded to the bone-and-capsule and mounted onto a 6-DOF robot tester (TX90, Stäubli), equipped with a universal force-torque sensor (Omega85, ATI). The robot positioned each hip in five sagittal angles: Extension, Neutral 0°, Flexion 30°, Flexion 60°, Flexion 90°; and performed hip internal-external rotations and abduction-adduction motions to 5 Nm in each position. After the intact stage was tested, each hip underwent an anteverting PAO, reorienting the acetabulum and securing the fragment with long bone screws. The capsular ligaments were preserved during the surgery and each hip was retested postoperatively in the robot. Postoperative CT imaging confirmed that the acetabular fragment was properly positioned with adequate version and head coverage. Paired sample t-tests compared the differences in range of motion before and after PAO (CI = 95%; SPSS v.24, IBM).
Preoperatively, the intact hips with acetabular retroversion demonstrated constrained internal-external rotations and abduction-adduction motions. The PAO reoriented the acetabular fragment and medialized the hip joint centre, which tightened the iliofemoral ligament and slackened the pubofemoral ligament. Postoperatively, internal rotation increased in the deep hip flexion positions of Flexion 60° (∆IR = +7°, p = 0.001) and Flexion 90° (∆IR = +8°, p = 0.001); while also demonstrating marginal decreases in external rotation in all positions. In addition, adduction increased in the deep flexion positions of Flexion 60° (∆ADD = +11°, p = 0.002) and Flexion 90° (∆ADD = +12°, p = 0.001); but also showed marginal increases in abduction in all positions.
The anteverting PAO restored anterosuperior acetabular clearance and increased internal rotation (28–33%) and adduction motions (29–31%) in deep hip flexion. Restricted movements and positive impingement tests typically experienced in these positions with acetabular retroversion are associated with clinical symptoms of FAI (i.e., FADIR). However, PAO altered capsular tensions by further tightening the anterolateral hip capsule which resulted in a limited external rotation and a stiffer and tighter hip. Capsular tightness may still be secondary to acetabular retroversion, thus capsular management may be warranted for larger corrections or rotational osteotomies. In efforts to optimize surgical management and clinical outcomes, anteverting PAO is a viable option to address FAI due to acetabular retroversion or overcoverage.
In 2020, the COVID-19 pandemic meant that elective surgery was restricted to minimise exposure on the wards. In order to maintain throughput of elective cases, our hospital was forced to convert as many cases as possible to same-day procedures rather than overnight admission. In this retrospective analysis we review the cases performed as same-day arthroplasty surgeries compared to the same period 12 months earlier.
We conducted a retrospective analysis of patients undergoing total hip and knee arthroplasties in a three month period between October and December in 2019 and again in 2020, in the middle of the SARS-CoV-2 pandemic. Patient demographics, number of out-patient primary arthroplasty cases, length of stay for admissions, 30-day readmission and complications were collated.
In total, 428 patient charts were reviewed for the months of October-December of 2019 (n=195) and 2020 (n=233). Of those, total hip arthroplasties comprised 60% and 58.8% for 2019 and 2020, respectively. Demographic data were comparable, with no statistical difference for age, gender, contralateral joint replacement, or BMI. ASA grade I was more prevalent in the 2020 cohort (5.1x increase, n=13 vs n=1). Degenerative disc disease and fibromyalgia were significantly less prevalent in the 2020 cohort. There was a significant increase in same-day discharges for non-DAA THAs (2x increase) and TKAs (10x increase), with a reciprocal decrease in next-day discharges. There were significantly fewer reported superficial wound infections in 2020 (5.6% vs 1.7%) and no significant differences in readmissions or emergency department visits (3.1% vs 3.0%).
The SARS-CoV-2 pandemic pushed hospitals and patients to minimise exposure on the wards and avoid straining already taxed in-patient beds. Among the few positives of the crisis, the pandemic was the catalyst that accelerated our outpatient arthroplasty program, leaving our institution more efficient with no increase in readmissions or early complications.
A concern with metal-on-metal hip resurfacing arthroplasty is long-term exposure to cobalt (Co) and chromium (Cr) wear debris from the bearing. This study compares whole blood metal ion levels from patients drawn at one year following Birmingham Hip Resurfacing (BHR) to levels taken at a minimum 10-year follow-up.
A retrospective chart review was conducted to identify all patients who underwent a BHR for osteoarthritis with a minimum 10-year follow-up. Whole blood metal ion levels were drawn at final follow-up in June 2019. These results were compared to values from patients with one-year metal ion levels.
Of the 211 patients who received a BHR, 71 patients (54 males and 17 females) had long-term metal ion levels assessed (mean follow-up 12.7 +/− 1.4 years). The mean Co and Cr levels for patients with unilateral BHRs (43 males and 13 females) were 3.12 ± 6.31 ug/L and 2.62 ± 2.69 ug/L, respectively, and 2.78 ± 1.02 ug/L and 1.83 ± 0.65 ug/L for patients with bilateral BHRs (11 males and 4 females). Thirty-five patients (27 male and 8 female) had metal ion levels tested at one year postoperatively. The mean changes in Co and Cr levels between one year and long-term follow-up were 2.29 ug/L (p = 0.0919) and 0.57 ug/L (p = 0.1612), respectively. These changes were not statistically significant.
This study reveals that whole blood metal ion levels do not change significantly when comparing one-year and ten-year Co and Cr levels. These ion levels appear to reach a steady state at one year.
Our results also suggest that regular metal-ion testing as per current Medicines and Healthcare products Regulatory Agency (MHRA) guidelines may be impractical for asymptomatic patients. Metal-ion levels, in and of themselves, may in fact possess little utility in determining the risk of failure and should be paired with radiographic and clinical findings to determine the need for revision.
The presence of hip osteoarthritis is associated with abnormal spinopelvic characteristics. This study aims to determine whether the pre-operative, pathological spinopelvic characteristics “normalize” at 1-year post-THA.
This is a prospective, longitudinal, case-control matched cohort study. Forty-seven patients underwent assessments pre-THA and at one year post-THA. This group was matched (age, sex, BMI) with 47 controls/volunteers with well-functioning hips. All participants underwent clinical and radiographic assessments including lateral radiographs in standing, upright-seated and deep-flexed-seated positions. Spinopelvic characteristics included change in lumbar lordosis (ΔLL), pelvic tilt (ΔPT) and hip flexion (pelvic-femoral angle, ΔPFA) when moving from the standing position to each of the seated positions. Spinopelvic hypermobility was defined as ΔPT>30° between standing and upright-seated positions.
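The hypermobility definition above is a single threshold comparison on the change in pelvic tilt; expressed as code (function and parameter names are ours, for illustration only):

```python
def spinopelvic_hypermobility(pt_standing_deg: float,
                              pt_upright_seated_deg: float) -> bool:
    """Hypermobility per the definition in the methods: change in pelvic
    tilt (ΔPT) greater than 30° from standing to upright-seated."""
    delta_pt = abs(pt_upright_seated_deg - pt_standing_deg)
    return delta_pt > 30

# Illustrative values: ΔPT = 35° qualifies, ΔPT = 22° does not
print(spinopelvic_hypermobility(10.0, 45.0))  # True
print(spinopelvic_hypermobility(10.0, 32.0))  # False
```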
Pre-THA, patients demonstrated less hip flexion (ΔPFA −54.8°±17.1° vs. −68.5°± 9.5°, p<0.001), greater pelvic tilt (ΔPT 22.0°±13.5° vs. 12.7°±8.1°, p<0.001) and greater lumbar movement (ΔLL −22.7°±15.5° vs. −15.4°±10.9°, p=0.015) when transitioning from standing to upright-seated. Post-THA, these differences were no longer present (ΔPFApost −65.8°±12.5°, p=0.256; ΔPTpost 14.3°±9.5°, p=0.429; ΔLLpost −15.3°±10.6°, p=0.966). The higher prevalence of pre-operative spinopelvic hypermobility in patients compared to controls (21.3% vs. 0.0%; p=0.009) was no longer present post-THA (6.4% vs. 0.0%; p=0.194). Similar results were found moving from standing to the deep-flexed-seated position post-THA.
Pre-operative, spinopelvic characteristics that contribute to abnormal mechanics can normalize post-THA following improvement in hip flexion. This leads to patients having the expected hip-, pelvic- and spinal flexion as per demographically-matched controls, thus potentially eliminating abnormal mechanics that contribute to the development/exacerbation of hip-spine syndrome.
Simultaneous bilateral total hip arthroplasty (THA) in patients with bilateral hip osteoarthritis is gradually becoming attractive, as it requires a single anesthesia and hospitalization. However, there are concerns about the potential complications following this surgical option. The purpose of this study is to compare the short-term major and minor complications and assess the readmission rate, between patients treated with same-day bilateral THA and those with staged procedures within a year.
We retrospectively reviewed the charts of all patients with bilateral hip osteoarthritis that underwent simultaneous or staged (within a year) bilateral total THA in our institution, between 2016-2020. Preoperative patient variables between the two groups were compared using the 2-sample t-test for continuous variables, the Fisher's exact test for binary variables, or the chi-square test for multiple categorical variables. Similarly, differences in the 30-day major and minor complications and readmission rates were assessed. A logistic regression model was also developed to identify potential risk factors.
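As a sketch of the first of those comparisons, a pooled-variance two-sample t statistic for a continuous preoperative variable can be computed with the standard library alone (the ages below are made-up toy numbers, not study data):

```python
import statistics as st

def two_sample_t(x: list, y: list) -> float:
    """Equal-variance (pooled) two-sample t statistic, the form used to
    compare continuous preoperative variables between the two groups."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * st.variance(x) + (ny - 1) * st.variance(y)) / (nx + ny - 2)
    return (st.mean(x) - st.mean(y)) / (pooled_var * (1 / nx + 1 / ny)) ** 0.5

# Toy ages for simultaneous vs staged groups (illustrative only):
print(round(two_sample_t([63, 66, 61, 70, 64], [65, 62, 68, 64, 66]), 2))  # -0.11
```

In practice the statistic would be referred to a t distribution with nx + ny − 2 degrees of freedom to obtain the p-value; Fisher's exact and chi-square tests handle the binary and multi-category variables, respectively.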
A total of 160 patients (mean age: 64.3 years, SD: ±11.7) who underwent bilateral THA were identified. Seventy-nine patients were treated with simultaneous and eighty-one with staged procedures. There were no differences between the two groups in preoperative laboratory values, gender, age, Body Mass Index (BMI), or American Society of Anesthesiologists (ASA) score (p>0.05). Patients in the simultaneous group were more likely to receive general anesthesia (43% vs 9.9%, p<0.05). After controlling for potential confounders, the multivariable logistic regression analysis showed similar odds of having a major (odds ratio 0.29, 95% confidence interval [0.30-2.88], p=0.29) or minor (odds ratio 1.714, 95% confidence interval [0.66-4.46], p=0.27) complication after simultaneous compared to staged bilateral THA. No differences in emergency department visits or readmissions for reasons related to the procedure were recorded (p>0.05).
This study shows that similar complication and readmission rates can be expected after simultaneous and staged bilateral THA. Simultaneous bilateral THA is a safe and effective option that surgeons should actively consider and counsel patients about when they present with radiographic and clinical bilateral hip disease.
Psoas tendinopathy is a potential cause of groin pain after primary total hip arthroplasty (THA). The direct anterior approach (DAA) is becoming increasingly popular as the standard approach for primary THA due to being a muscle preserving technique. It is unclear what the prevalence is for the development of psoas-related pain after DAA THA, how this can influence patient reported outcome, and which risk factors can be identified.
This retrospective case control study of prospectively recorded data evaluated 1784 patients who underwent 2087 primary DAA THA procedures between January 2017 and September 2019. Psoas tendinopathy was defined as (1) persistence of groin pain after DAA THA and was triggered by active hip flexion, (2) exclusion of other causes such as dislocation, infection, implant loosening or (occult) fractures, and (3) a positive response to an image-guided injection with xylocaine and steroid into the psoas tendon sheath. Complication-, re-operation rates, and patient-reported outcome measures (PROMs) were measured.
Forty-three patients (45 hips; 2.2%) were diagnosed with psoas tendinopathy according to the above-described criteria. The mean age of patients who developed psoas tendinopathy was 50.8±11.7 years, significantly lower than the mean age of patients without psoas pain (62.4±12.7y; p<0.001). Patients with primary hip osteoarthritis were significantly less likely to develop psoas tendinopathy (14/1207; 1.2%) than patients with hip osteoarthritis secondary to dysplasia (18/501; 3.6%) (p<0.001) or femoroacetabular impingement (FAI) (12/305; 3.9%) (p<0.001). Patients with psoas tendinopathy had significantly lower PROM scores at 6 weeks and 1 year follow-up.
Psoas tendinopathy was present in 2.2% after DAA THA. Younger age and secondary osteoarthritis due to dysplasia or FAI were risk factors for the development of psoas tendinopathy. Post-operatively, patients with psoas tendinopathy often also presented with low back pain and lateral trochanteric pain. Psoas tendinopathy had an important influence on the evolution of PROM scores.
Immigrated Canadians make up approximately 20% of the total population in Canada, and 30% of the population in Ontario. Despite universal health coverage and an equal prevalence of severe arthritis in immigrants relative to non-immigrants, the former may be underrepresented amongst arthroplasty recipients secondary to challenges navigating the healthcare system. The primary aim of this study was to determine if utilization of arthroplasty differs between immigrant populations and persons born in Canada. The secondary aim was to determine differences in outcomes following total hip and knee arthroplasty (THA and TKA, respectively).
This is a retrospective population-based cohort study using health administrative databases. All patients aged ≥18 in Ontario who underwent their first primary elective THA or TKA between 2002 and 2016 were identified. Immigration status for each patient was identified via linkage to the ‘Immigration, Refugee and Citizenship Canada’ database. Outcomes included all-cause and septic revision surgery within 12 months, dislocation (for THA) and total post-operative case cost, and were compared between groups. The Cochran-Armitage test for trend was used to determine if the uptake of arthroplasty by immigrants changed over time.
There was a total of 186,528 TKA recipients and 116,472 THA recipients identified over the study period. Of these, 10,193 (5.5%) and 3,165 (2.7%) were immigrants, respectively. The largest proportion of immigrants were from the Asia and Pacific region for those undergoing TKA (54.0%) and from Europe for THA recipients (53.4%). There was no difference in the rate of all-cause revision or septic revision at 12 months between groups undergoing TKA (p=0.864, p=0.585) or THA (p=0.527, p=0.397), respectively. There was also no difference in the rate of dislocations between immigrants and people born in Canada (p=0.765).
Despite having similar complication rates and costs, immigrants represent a significantly smaller proportion of joint replacement recipients than they represent in the general population in Ontario. These results suggest significant underutilization of surgical management for arthritis among Canada's immigrant populations. Initiatives to improve access to total joint arthroplasty are warranted.
Recent registry data from around the world strongly suggest that cemented hip hemiarthroplasty has lower revision rates than cementless hip hemiarthroplasty for acute femoral neck fractures. Adoption of cemented hemiarthroplasty for hip fracture has nonetheless been slow, as many surgeons continue to use uncemented stems; one reason is that surgeons feel more comfortable with uncemented hemiarthroplasty because they use it routinely. The purpose of this study was to compare revision rates of cemented and cementless hemiarthroplasty and to stratify risk by surgeon experience, using a surgeon's annual volume of total hip replacements as an indicator of experience. The Canadian Joint Replacement Registry database was used to collect and compare outcomes and report revision rates by surgeon volume.
This is a large retrospective cohort study of the Canadian Joint Replacement Registry, in which 68,447 patients who underwent cemented or cementless hip hemiarthroplasty from 2012 to 2020 were identified. Operating surgeons were linked to the number of total hip replacements they performed annually and categorized by experience as high volume (>50 cases/year) or low volume (<50 cases/year). Hazard ratios for risk of revision over this 8-year span, adjusted for age and sex, were calculated. A p-value <0.05 was deemed significant.
For high volume surgeons, cementless fixation had a higher revision risk than cemented fixation, HR 1.29 (1.05-1.56), p=0.017. This pattern was similar for low volume surgeons, with cementless fixation having a higher revision risk than cemented fixation, HR 1.37 (1.11-1.70), p=0.004. We could not detect a difference in revision risk for cemented fixation between low volume and high volume surgeons: at 0-1.5 years the HR was 0.96 (0.72-1.28), p=0.786, and at 1.5+ years the HR was 1.61 (0.83-3.11), p=0.159. Similarly, we could not detect a difference in revision risk for cementless fixation between low volume and high volume surgeons, HR 1.11 (0.96-1.29), p=0.161.
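As a quick arithmetic check, each reported hazard ratio is significant at the 5% level exactly when its 95% confidence interval excludes 1, which matches the p-values above. A minimal sketch using the reported figures:

```python
def ci_excludes_one(lo: float, hi: float) -> bool:
    """A hazard ratio is conventionally significant at the 5% level
    when its 95% confidence interval excludes 1."""
    return not (lo <= 1.0 <= hi)

# Reported comparisons from the registry analysis: (HR, CI low, CI high)
results = {
    "high_volume_cementless_vs_cemented": (1.29, 1.05, 1.56),  # p=0.017
    "low_volume_cementless_vs_cemented": (1.37, 1.11, 1.70),   # p=0.004
    "cemented_low_vs_high_0_1.5y": (0.96, 0.72, 1.28),         # p=0.786
    "cementless_low_vs_high": (1.11, 0.96, 1.29),              # p=0.161
}
sig = {name: ci_excludes_one(lo, hi) for name, (hr, lo, hi) in results.items()}
```

Only the two cementless-versus-cemented comparisons have intervals excluding 1, consistent with their p-values being below 0.05.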
In this large registry analysis, cemented hip hemiarthroplasty had a significantly lower revision rate than cementless stems, even when surgeons were stratified into high and low volume. Low volume surgeons using uncemented prostheses had the highest revision rate, and low volume surgeons who cemented had a lower revision rate than high volume surgeons using cementless stems. These results should guide surgeons that, regardless of experience, cemented hip hemiarthroplasty is the safest option for acute femoral neck fracture; high volume surgeons who perform cementless hemiarthroplasty are not immune to revisions related to their technique. Increased training and education should be offered to surgeons to improve comfort with cementing.
Total hip arthroplasty (THA) is performed under general anesthesia (GA) or spinal anesthesia (SA). The first objective of this study was to determine which patient factors are associated with receiving SA versus GA. The second objective was to discern the effect of anesthesia type on short-term postoperative complications and readmission. The third objective was to elucidate factors that impact the effect of anesthesia type on outcome following arthroplasty.
This retrospective cohort study included 108,905 patients (median age, 66 years; IQR 60-73 years; 56.0% females) who underwent primary THA for treatment of primary osteoarthritis in the American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) database during the period of 2013-2018. Multivariable logistic regression analysis was performed to evaluate variables associated with anesthesia type and outcomes following arthroplasty.
Anesthesia type administered during THA was significantly associated with race. Specifically, Black and Hispanic patients were less likely to receive SA compared to White patients (White: OR 1.00; Black: OR 0.73; 95% confidence interval [CI] 0.71-0.75; Hispanic: OR 0.81; CI, 0.75-0.88), while Asian patients were more likely to receive SA (OR 1.44, CI 1.31-1.59). Spinal anesthesia was associated with increased age (OR 1.01; CI 1.00-1.01). Patients with less frailty and lower comorbidity were more likely to receive SA based on the modified frailty index ([mFI-5]=0: OR 1.00; mFI-5=1: OR 0.90, CI 0.88-0.93; mFI-5=2 or greater: OR 0.86, CI 0.83-0.90) and American Society of Anesthesiologists (ASA) class (ASA=1: OR 1.00; ASA=2: OR 0.85, CI 0.79-0.91; ASA=3: OR 0.64, CI 0.59-0.69; ASA=4-5: OR 0.47; CI 0.41-0.53). With increased BMI, patients were less likely to be treated with SA (OR 0.99; CI 0.98-0.99).
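The odds ratios and intervals above are the usual exponentiated coefficients of a multivariable logistic regression. A minimal sketch of that back-transformation, using an illustrative coefficient and standard error chosen to reproduce the reported Black-versus-White OR (hypothetical values, not the study's actual fit):

```python
import math

def odds_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Convert a logistic regression coefficient and its standard error
    into an odds ratio with a 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative inputs only: beta = -0.315, SE = 0.014 give
# OR ~0.73 (95% CI ~0.71-0.75), matching the reported figures.
or_, lo, hi = odds_ratio_ci(-0.315, 0.014)
```

The reference category (here, White patients) always has OR 1.00 by construction, since its coefficient is fixed at zero.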
Patients treated with SA had fewer post-operative complications (OR 0.74; CI 0.67-0.81) and a lower risk of readmission (OR 0.88; CI 0.82-0.95) than those treated with GA following THA. Race, age, BMI, and ASA class were found to affect the impact of anesthesia type on post-operative complications. Stratified analysis demonstrated that the reduced risk of complications following arthroplasty noted in patients treated with SA compared to GA was more pronounced in Black, Asian, and Hispanic patients than in White patients. Furthermore, the positive effect of SA compared to GA was stronger in patients of younger age, elevated BMI, and lower ASA class.
Among patients undergoing THA for management of primary osteoarthritis, factors including race, BMI, and frailty appear to have impacted the type of anesthesia received. Patients treated with SA had a significantly lower risk of readmission to hospital and adverse events within 30 days of surgery compared to those treated with GA. Furthermore, the positive effect on outcome afforded by SA was different between patients depending on race, age, BMI, and ASA class. These findings may help to guide selection of anesthesia type in subpopulations of patients undergoing primary THA.
Hip instability is one of the most common causes for total hip arthroplasty (THA) revision surgery. Studies have indicated that lumbar fusion (LF) surgery is a risk factor for hip dislocation. Instrumented spine fusion surgery decreases pelvic tilt, which might lead to an increase in hip motion to accommodate this postural change. To the best of our knowledge, spine-pelvis-hip kinematics during a dynamic activity in patients that previously had both a THA and LF have not been investigated. Furthermore, patients with a combined THA and LF tend to have greater disability. The purpose was to examine spine-pelvis-hip kinematics during a sit to stand task in patients that have had both THA and LF surgeries and compare it to a group of patients that had a THA with no history of spine surgery. The secondary purpose was to compare pain, physical function, and disability between these patients.
This cross-sectional study recruited participants that had a combined THA and LF (n=10; 6 females, mean age 73 y) or had a THA only (n=11; 6 females, mean age 72 y). Spine, pelvis, and hip angles were measured using a TrakSTAR motion capture system sampled at 200 Hz. Sensors were mounted over the lateral thighs, base of the sacrum, and the spinous processes of the third lumbar, 12th thoracic, and ninth thoracic vertebrae. Participants completed 10 trials of a standardized sit-to-stand-to-sit task. Hip, pelvis, lower lumbar, upper lumbar, and lower thoracic sagittal joint angle range of motion (ROM) were calculated over the entire task. In addition, pain, physical function, and disability were measured with clinical outcomes: Hip Disability Osteoarthritis Outcome Score (pain and physical function), Oswestry Low Back Disability Questionnaire (disability), and Harris Hip Score (pain, physical function, motion). Physical function performance was measured using the 6-Minute Walk Test, Stair Climb Test, and 30s Chair Test. Angle ROMs during the sit-to-stand-to-sit task and clinical outcomes were compared between THA+LF and THA groups using independent t-tests and effect sizes (d).
The difference in hip ROM was approaching statistical significance (p=0.07). Specifically, the THA+LF group had less hip ROM during the sit-to-stand-to-sit task than the THA only group (mean difference=11.17, 95% confidence interval=-1.13 to 23.47), which represented a large effect size (d=0.83). There were no differences in ROM for pelvis (p=0.54, d=0.28) or spinal (p=0.14 to 0.97; d=0.02 to 0.65) angles between groups. The THA+LF group had worse clinical outcomes for all measures of pain, physical function, and disability (p=0.01 to 0.06), representing large effect sizes (d=0.89 to 2.70).
Hip ROM was not greater in the THA+LF group, and thus is unlikely to be a risk factor for hip dislocation during this specific sit-to-stand-to-sit task. Other functional tasks that demand greater joint excursions should be investigated. The lack of differences in spinal and pelvic ROM was likely due to the task and to the THA+LF group having spinal fusions at different levels. Combined THA+LF results in worse clinical outcomes, and additional rehabilitation is required for these patients.
Short cementless femoral stems are increasingly popular as they allow for less dissection during insertion. Use of such stems with the anterior approach (AA) may be associated with considerable peri-operative fracture risk. This study's primary aim was to evaluate whether patient-specific femoral and pelvic morphology and surgical technique influence peri-operative fracture risk. In doing so, we aimed to describe important anatomical thresholds to alert surgeons.
This is a single-center, multi-surgeon retrospective, case-control matched study. Of 1145 primary THAs with a short, cementless stem inserted via the AA, 39 periprosthetic fractures (3.4%) were identified. These were matched for factors known to increase fracture risk (age, gender, BMI, side, Dorr classification, stem offset and indication for surgery) with 78 THAs that did not sustain a fracture. Radiographic analysis was performed using validated software to measure femoral- (canal flare index [CFI], morphological cortical index [MCI], calcar-calcar ratio [CCR]) and pelvic- (Ilium-ischial ratio [IIR], ilium overhang, and ASIS to greater trochanter distance) morphologies and surgical technique (% canal fill). Multivariate and Receiver-Operator Curve (ROC) analysis was performed to identify predictors of fracture.
Femoral factors that differed between the fracture and control groups included CFI (3.7±0.6 vs 2.9±0.4). On multivariate and ROC analysis, the combination of CFI >3.17 and ilium-ischial ratio >3 was a strong predictor of fracture (OR: 29.2, 95% CI: 9.5–89.9, p<0.001).
Patient-specific anatomical parameters are important predictors of fracture-risk. When considering the use of short stems via the AA, careful radiographic analysis would help identify those at risk in order to consider alternative stem options.
To date, the literature has not yet revealed superiority of Minimally Invasive (MI) approaches over conventional techniques. We performed a systematic review to determine whether minimally invasive approaches are superior to conventional approaches in total hip arthroplasty for (1) clinical and (2) functional outcomes. We performed a meta-analysis of level 1 evidence to determine whether (3) minimally invasive approaches are superior to conventional approaches for clinical outcomes.
All studies comparing MI approaches to conventional approaches were eligible for analysis. The PRISMA guidelines were adhered to throughout this study. Registries were searched using the following MeSH terms: ‘minimally invasive’, ‘muscle-sparing’, ‘THA’, ‘THR’, ‘hip arthroplasty’ and ‘hip replacement’. Locations searched included PubMed, the Cochrane Library, ClinicalTrials.gov, the EU clinical trials register and the International Clinical Trials Registry Platform (World Health Organisation).
Twenty studies were identified. There were 1,282 MI THAs and 1,351 conventional THAs performed.
There was no difference between MI and conventional approaches for all clinical outcomes of relevance including all-cause revision (p=0.959), aseptic revision (p=0.894), instability (p=0.894), infection (p=0.669) and periprosthetic fracture (p=0.940).
There was also no difference in functional outcome at early or intermediate follow-up between the two groups (p=0.38).
In level I studies exclusively, random-effects meta-analysis demonstrated no difference in the rate of aseptic revision (p=0.461) between both groups.
Intermuscular MI approaches are equivalent to conventional THA approaches when considering all-cause revision, aseptic revision, infection, dislocation, fracture rates and functional outcomes. Meta-analysis of level 1 evidence supports this claim.
With the introduction of highly crosslinked polyethylene (HXLPE) in total hip arthroplasty (THA), orthopaedic surgeons have moved towards using larger femoral heads at the cost of thinner liners to decrease the risk of instability. Several short- and mid-term studies have shown minimal liner wear with the use of HXLPE liners, but the safety of using thinner HXLPE liners to maximize femoral head size remains uncertain, and concerns exist that this may lead to premature failure. Our objective was to analyze the outcomes of primary THA done with HXLPE liners in patients who have a 36-mm head or larger and a cup of 52 mm or smaller, with a minimum of 10-year follow-up. Additionally, linear and volumetric wear rates of the HXLPE were evaluated in those with a minimum of seven-year follow-up. We hypothesized that there would be minimal wear and good clinical outcomes.
Between 2000 and 2010, we retrospectively identified 55 patients that underwent a primary THA performed in a high-volume single tertiary referral center using HXLPE liners with 36-mm or larger heads in cups with an outer diameter of 52 mm or smaller. Patient characteristics, implant details including liner thickness, death, complications, and all-cause revisions were recorded. Patients that had a minimum radiographic follow-up of seven years were assessed radiographically for linear and volumetric wear. Wear was calculated using ROMAN, a validated open-source software, by two independent researchers on anteroposterior X-rays of the pelvis.
A total of 55 patients were identified and included, with a mean age of 74.8 (range 38.67-95.9) years and a mean BMI of 28.98 (range 18.87-63.68). Fifty-one patients (94.4%) were female. Twenty-six (47.7%) patients died during the follow-up period. Three patients were revised, none for liner wear, fracture or dissociation. Twenty-two patients had a radiographic follow-up of minimum seven years (mean 9.9 years, range 7.5-13.7) and were included in the long-term radiographic analysis. Liner thickness was 5.5 mm at 45 degrees in all cases but one, who had a liner thickness of 4.7 mm, and all patients had a cobalt-chrome head. Cup sizes were 52 mm (n=15, 68%) and 50 mm (n=7, 32%). Mean linear liner wear was 0.0470 mm/year (range 0 - 0.2628 mm) and mean volumetric wear was 127.69 mm3/year (range 0 - 721.23 mm3/year).
Using HXLPE liners with 36-mm or larger heads in 52-mm or smaller cups is safe, with low rates of linear and volumetric wear at mid- to long-term follow-up. No patient required revision surgery for liner complications, including liner fracture, dissociation, or wear. Our results suggest that the advantages of using larger heads should outweigh the potential risks of using thin HXLPE liners.
Total knee and hip arthroplasty (TKA and THA) are two of the highest volume and resource intensive surgical procedures. Key drivers of the cost of surgical care are duration of surgery (DOS) and postoperative inpatient length of stay (LOS). The ability to predict TKA and THA DOS and LOS has substantial implications for hospital finances, scheduling and resource allocation. The goal of this study was to predict DOS and LOS for elective unilateral TKAs and THAs using machine learning models (MLMs) constructed on preoperative patient factors using a large North American database.
The American College of Surgeons (ACS) National Surgical Quality Improvement Program (NSQIP) database was queried for elective unilateral TKA and THA procedures from 2014-2019. The dataset was split into training, validation and testing sets based on year. Multiple conventional and deep MLMs such as linear, tree-based and multilayer perceptron (MLP) models were constructed. The models with best performance on the validation set were evaluated on the testing set. Models were evaluated according to 1) mean squared error (MSE), 2) buffer accuracy (the number of times the predicted target was within a predesignated buffer of the actual target), and 3) classification accuracy (the number of times the correct class was predicted by the models). To ensure useful predictions, the results of the models were compared to a mean regressor.
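The buffer and classification accuracy metrics described above are straightforward to express in code. A minimal sketch with hypothetical predicted/actual duration-of-surgery values in minutes (illustrative only, not NSQIP data):

```python
def buffer_accuracy(pred, actual, buffer):
    """Share of predictions within +/- buffer of the actual target,
    e.g. a 30-minute buffer for DOS or a 1-day buffer for LOS."""
    hits = sum(abs(p - a) <= buffer for p, a in zip(pred, actual))
    return hits / len(actual)

def class_accuracy(pred, actual, threshold):
    """Share of predictions falling on the same side of a clinical
    cut-off (e.g. <=120 min vs >120 min) as the actual value."""
    hits = sum((p <= threshold) == (a <= threshold)
               for p, a in zip(pred, actual))
    return hits / len(actual)

# Hypothetical DOS predictions vs actuals, in minutes
pred = [95, 130, 110, 150]
actual = [100, 118, 145, 140]
ba = buffer_accuracy(pred, actual, 30)   # 3 of 4 within 30 min -> 0.75
ca = class_accuracy(pred, actual, 120)   # 2 of 4 on same side of 120 -> 0.5
```

Note that a prediction can be within the buffer yet still fall on the wrong side of the classification cut-off, so the two metrics capture different notions of usefulness.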
A total of 499,432 patients (TKA 302,490; THA 196,942) were included. The MLP models had the best MSEs and accuracy across both TKA and THA patients. During testing, the TKA MSEs for DOS and LOS were 0.893 and 0.688 while the THA MSEs for DOS and LOS were 0.895 and 0.691. The TKA DOS 30-minute buffer accuracy and ≤120 min, >120 min classification accuracy were 78.8% and 88.3%, while the TKA LOS 1-day buffer accuracy and ≤2 days, >2 days classification accuracy were 75.2% and 76.1%. The THA DOS 30-minute buffer accuracy and ≤120 min, >120 min classification accuracy were 81.6% and 91.4%, while the THA LOS 1-day buffer accuracy and ≤2 days, >2 days classification accuracy were 78.3% and 80.4%. All models across both TKA and THA patients were more accurate than the mean regressors for both DOS and LOS predictions across both buffer and classification accuracies.
Conventional and deep MLMs have been effectively implemented to predict the DOS and LOS of elective unilateral TKA and THA patients based on preoperative patient factors using a large North American database with a high level of accuracy. Future work should include using operational factors to further refine these models and improve predictive accuracy. Results of this work will allow institutions to optimize their resource allocation, reduce costs and improve surgical scheduling.
Acknowledgements:
The American College of Surgeons National Surgical Quality Improvement Program and the hospitals participating in the ACS NSQIP are the source of the data used herein; they have not verified and are not responsible for the statistical validity of the data analysis or the conclusions derived by the authors.
The benefit of using acetabular screws in primary total hip arthroplasty (THA) has been questioned in recent years. The disadvantages of using screws include increased operative time, risk of injury to surrounding neurovascular structures and metal ware breakage. Recent large registry studies have reported that screws do not confer a protective effect against acetabular loosening or the presence of osteolysis. Other studies have even described an increased risk of aseptic acetabular loosening with the selective use of screws. We report findings from a multicentre cohort study.
This large cohort study compared clinical outcomes between primary acetabular components that were inserted with and without screws. Independent variables included the presence (or absence) of screws, the total number of screws used and the cumulative screw length (CSL). Outcome measures included all-cause revision, acetabular component revision and acetabular component loosening. Statistical software (Stata/IC 13.1 for Mac [64-bit Intel]) was used to conduct all statistical analyses. A p-value <0.05 was taken to be significant.
There were 4,583 THAs performed in total. Screws were used in 15.9% (n=733). At a mean follow-up of 5.2 years, the all-cause revision rate in the screw cohort was 1.5% compared to 0.83% in the no screw cohort (p=0.085). There was no difference in acetabular component revision rates for screws (3/733, 0.41%) versus no screws (12/3,850, 0.31%) (p=0.439). The rate of acetabular loosening noted during the time of revision surgery was significantly higher when screws were used in the index procedure (2/733, 0.2%) compared to the no screw cohort (1/3,850, 0.02%) (p=0.017). There was no difference in outcomes when stratifying by the number of screws used or the cumulative screw length.
Primary acetabular components do not require screws for fixation. All cause revision rates and acetabular component revision rates are comparable for the screw and the no screw cohorts. The rate of acetabular component loosening, as observed during revision surgery, is significantly higher when screws are used in the index total hip replacement.
Adverse spinopelvic characteristics (ASC) have been associated with increased dislocation risk following primary total hip arthroplasty (THA). A stiff lumbar spine, a large posterior pelvic tilt when standing, and severe sagittal spinal deformity have been identified as key risk factors for instability. It has been reported that the rate of dislocation in patients with such ASC may be increased, and some authors have recommended the use of dual mobility bearings or robotics to reduce instability to within acceptable rates (<2%).
The aims of this prospective study were to: (1) describe the true incidence of ASC in patients presenting for THA; (2) assess whether such characteristics are associated with greater symptoms pre-THA due to concomitant hip and spine pathology; and (3) describe the early-term dislocation rate with the use of ≤36 mm bearings.
This is an IRB-approved, two-center, multi-surgeon, prospective, consecutive cohort study of 220 patients undergoing THA through anterolateral (n=103; 46.8%), direct anterior (n=104; 47.3%) or posterior approaches (n=13; 5.9%). The mean age was 63.8±12.0 years (range: 27.7-89.0 years) and the mean BMI 28.0±5.0 kg/m2 (range: 19.4-44.4 kg/m2). There were 44 males (47.8%) and 48 females (52.2%). The mean follow-up was 1.6±0.5 years. Overall, 54% of femoral heads were 32 mm and 46% were 36 mm.
All participants underwent lateral spinopelvic radiographs in the standing and deep-flexed seated positions to determine lumbar lordosis (LL), sacral slope (SS), pelvic tilt (PT), pelvic-femoral angle (PFA) and pelvic incidence (PI) in both positions. Spinal stiffness was defined as lumbar flexion <20° when transitioning between the standing and deep-seated positions; adverse standing PT was defined as >19°; and adverse sagittal lumbar balance was defined as a mismatch between standing PI and LL of >10°.
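As an illustration only (not the authors' software), the three thresholds can be collected into a simple screening helper. The sign conventions here — lumbar flexion as standing minus deep-flexed-seated lordosis, and mismatch as PI minus LL — are assumptions inferred from the definitions above:

```python
def spinopelvic_flags(ll_standing, ll_flexed_seated, pt_standing, pi):
    """Flag the study's three adverse spinopelvic characteristics.
    All angles in degrees; sign conventions are assumed."""
    lumbar_flexion = ll_standing - ll_flexed_seated
    return {
        "stiff_spine": lumbar_flexion < 20,         # LF < 20 deg
        "adverse_standing_pt": pt_standing > 19,    # standing PT > 19 deg
        "pi_ll_mismatch": (pi - ll_standing) > 10,  # PI - LL > 10 deg
    }

# Hypothetical patient: LL 40 -> 25 deg (LF 15 deg), PT 22 deg, PI 60 deg;
# all three flags fire for this example.
flags = spinopelvic_flags(40, 25, 22, 60)
```

A patient with all three flags raised corresponds to the small subgroup reported in the results below.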
Pre-operative patient-reported outcomes were measured using the Oxford Hip Score (OHS) and EuroQol Five-Dimension questionnaire (EQ-5D). Dislocation rates were prospectively recorded. Non-parametric tests were used; significance was set at p<0.05.
The prevalence of PI-LL mismatch was 22.1% (43/195) and 30.4% (59/194) had increased standing PT. The prevalence of lumbar stiffness was 3.5% (5/142), and these patients had all three adverse spinopelvic characteristics.
There was no significant difference between patients with and without adverse spinopelvic characteristics in pre-operative OHS (20.7±7.6 vs. 21.6±8.7; p=0.721) or pre-operative EQ-5D (0.651±0.081 vs. 0.563±0.190; p=0.295).
Two patients sustained a dislocation (0.9%): one after the lateral approach (no ASC) and one after the posterior approach, in a patient who also exhibited ASC pre-operatively.
Sagittal lumbar imbalance, increased standing pelvic tilt and spinal stiffness are not uncommon among patients undergoing THA. The presence of such characteristics is not associated with inferior pre-operative PROMs. However, when all characteristics are present, the risk of instability is increased. Patients with ASC treated with posterior approach THA may benefit from the use of advanced technology due to a high risk of dislocation. The use of such technology with the anterior or lateral approach to reduce instability is to date unjustified, as the rate of instability is low even amongst patients with ASC.
Same day home (SDH) discharge in total joint arthroplasty (TJA) has increased in popularity in recent years. The objective of this study was to evaluate the causes and predictors of failed discharges in planned SDH patients.
A consecutive cohort of patients who underwent total knee (TKA) or total hip arthroplasty (THA) and were scheduled for SDH discharge between April 1, 2019 and March 31, 2021 was retrospectively reviewed. Patient demographics, causes of failed discharge, perioperative variables, 30-day readmissions and 6-month reoperation rates were collected. Multivariate regression analysis was undertaken to identify independent predictors of failed discharge.
The cohort consisted of 527 consecutive patients, of whom 101 (19%) failed SDH discharge. The leading causes were postoperative hypotension (20%) and ineligibility for the SDH pathway (19%). Two individual surgeons, a later operative start time (OR 1.3; 95% CI 1.15-1.55; p=0.001), ASA class IV (OR 3.4; 95% CI 1.4-8.2; p=0.006) and undergoing a THA (OR 2.0; 95% CI 1.2-3.1; p=0.004) were independent predictors of failed SDH discharge. No differences in age, BMI, gender, surgical approach or type of anesthetic were found (p>0.05). Thirty-day readmission and 6-month reoperation rates were similar between groups (p>0.05).
Hypotension and inappropriate patient selection were the leading causes of failed SDH discharge. Significant variability existed between individual surgeons' failed-discharge rates. Patients who underwent a THA, were classified as ASA IV, or had a later operative start time were all more likely to fail SDH discharge.
Although day surgery (DS) has a good patient satisfaction and safety profile, accurate episode-of-care cost (EOCC) calculation for this procedure compared with standard same-day admission (SDA), while considering functional outcomes, has not been well established. This study assesses the EOCC of THA, comparing DS and SDA (with a 1-day hospitalization) pathways.
The EOCC of 50 consecutive DS and SDA patients who underwent a THA was evaluated, using a bottom-up, time-driven activity-based costing method. Functional outcomes were measured using preoperative and postoperative Harris Hip Scores (HHS).
Overall, the SDA THA cost 11% more than a DS THA. The mean total EOCC of DS THA was 9,672 CAD compared to 10,911 CAD in the SDA THA group. Both groups showed an improvement in HHS following the procedure, but patients in the DS group had a significantly higher postoperative HHS and a significantly greater postoperative improvement in HHS.
Day surgery THA is cost-effective, safe and associated with high patient satisfaction due to functional improvement. Providing policymakers with the information needed to develop optimal financing methods is paramount for clinicians wishing to develop modern protocols and increase productivity while providing optimal care for patients.
Hip resurfacing may be a useful surgical procedure when patient selection is correct and only implants with superior performance are used. In order to establish a body of evidence in relation to hip resurfacing, pseudotumour formation and its genetic predisposition, we performed a case-control study investigating the role of HLA genotype in the development of pseudotumour around MoM hip resurfacings.
All metal-on-metal (MoM) hip resurfacings performed in the history of the institution were assessed; a total of 392 were performed by 12 surgeons between February 1st 2005 and October 31st 2007. In all cases, pseudotumour was confirmed preoperatively on Metal Artefact Reduction Sequence (MARS) MRI. Controls were matched by implant (ASR or BHR), and absence of pseudotumour was confirmed on MRI. Blood samples from all cases and controls underwent genetic analysis using Next Generation Sequencing (NGS), assessing alleles of 11 HLA loci (A, B, C, DRB1, DRB3/4/5, DQA1, DQB1, DPB1, DPA1). Given the small sample size, statistical significance was determined using Fisher's exact or Chi-squared tests to quantify the clinical association between HLA genotype and the need for revision surgery due to pseudotumour.
Both groups were matched for implant type (55% ASR, 45% BHR in both the case and control groups). According to the ALVAL histological classification described by Kurmis et al., the majority of cases (63%, n=10) were found to have group 2 histological findings. Four cases (25%) had group 3 histological findings and 2 (12%) patients had group 4 findings. Of the 11 HLA loci analysed, 2 were significantly associated with a higher risk of pseudotumour formation (DQB1*05:03:01 and DRB1*14:54:01) and 4 were noted to be protective against pseudotumour formation (DQA1*03:01:01, DRB1*04:04:01, C*01:02:01, B*27:05:02).
These findings further develop the knowledge base around specific HLA genotypes and their role in the development of pseudotumour formation in MoM hip resurfacing. Specifically, the two alleles at higher risk of pseudotumour formation (DQB1*05:03:01 and DRB1*14:54:01) in MoM hip resurfacing should be noted, particularly as patient-specific genotype-dependent surgical treatments continue to develop in the future.
A stiff spine leads to increased demand on the hip, creating an increased risk of total hip arthroplasty (THA) dislocation. Several authors propose that a change in sacral slope of ≤10° between the standing and relaxed-seated positions (ΔSSstanding→relaxed-seated) identifies a patient with a stiff lumbar spine and have suggested use of dual-mobility bearings for such patients. However, such assessment may not adequately test the lumbar spine to draw such conclusions. The aim of this study was to assess how accurately ΔSSstanding→relaxed-seated can identify patients with a stiff spine.
This is a prospective, multi-centre, consecutive cohort series. Two hundred and twenty-four patients, pre-THA, had standing, relaxed-seated and flexed-seated lateral radiographs. Sacral slope and lumbar lordosis were measured on each functional X-ray. ΔSSstanding→relaxed-seated was determined by the change in sacral slope between the standing and relaxed-seated positions. Lumbar flexion (LF) was defined as the difference in lumbar lordotic angle between standing and flexed-seated. LF≤20° was considered a stiff spine. The predictive value of ΔSSstanding→relaxed-seated for characterising a stiff spine was assessed.
A weak correlation between ΔSSstanding→relaxed-seated and LF was identified (r2= 0.15). Fifty-four patients (24%) had ΔSSstanding→relaxed-seated ≤10° and 16 patients (7%) had a stiff spine. Of the 54 patients with ΔSSstanding→relaxed-seated ≤10°, 9 had a stiff spine. The positive predictive value of ΔSSstanding→relaxed-seated ≤10° for identifying a stiff spine was 17%.
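The predictive-value figure quoted above follows directly from the reported counts; a minimal arithmetic sketch (the sensitivity figure is derived here from the same counts and is not reported in the abstract):

```python
# Counts reported in the cohort (n=224)
flagged = 54       # patients with ΔSS standing→relaxed-seated ≤10°
true_pos = 9       # flagged patients who truly had a stiff spine (LF ≤20°)
stiff_total = 16   # all patients with a stiff spine

ppv = true_pos / flagged              # positive predictive value
sensitivity = true_pos / stiff_total  # fraction of stiff spines captured

print(f"PPV = {ppv:.0%}, sensitivity = {sensitivity:.0%}")  # PPV = 17%, sensitivity = 56%
```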
ΔSSstanding→relaxed-seated ≤10° was not correlated with a stiff spine in this cohort. Utilising this simplified approach could lead to a six-fold overprediction of patients with a stiff lumbar spine. This, in turn, could lead to an overprediction of patients with abnormal spinopelvic mobility, unnecessary use of dual-mobility bearings and incorrect targets for component alignment. Referring to patients with ΔSSstanding→relaxed-seated ≤10° as stiff can be misleading; we thus recommend use of the flexed-seated position to effectively assess pre-operative spinopelvic mobility.
With an ageing population and an increasing number of primary arthroplasties performed, the revision burden is predicted to increase. The aims of this study were to 1) determine the revision burden in an academic hospital over an 11-year period; 2) identify the direct hospital cost associated with delivering a revision service; and 3) ascertain factors associated with increased cost.
This is an IRB-approved, retrospective, single tertiary referral center, consecutive case series. Using the hospital data warehouse, all patients who underwent revision hip or knee arthroplasty surgery between 2008-2018 were identified. 1632 revisions were identified (1304 patients), consisting of 1061 hip and 571 knee revisions. The majority of revisions were performed for mechanical-related problems and aseptic loosening (n=903; 55.3%), followed by periprosthetic joint infection (n=553; 33.9%) and periprosthetic fractures (n=176; 10.8%). Cost and length of stay were determined for all patients. The direct in-hospital costs were converted to 2020 inflation-adjusted Canadian dollars. Several patient factors (age, gender, HOMR and ASA scores, hemoglobin level) and surgical factors (indication for surgery, surgical site) were tested for possible associations.
The number of revisions more than doubled over the study period (83 in 2008 vs. 174 in 2018). Revision indications changed over the study period: the prevalence of fracture increased 4.6-fold (5 in 2008 vs. 23 in 2018), with an accompanying reduction in mechanical-related reasons, whilst revisions for infection remained constant. The mean annual cost over the entire study period was 3.9 MMCAD (range: 2.4–5.1 MMCAD). Costs rose 1.5-fold over the study period, from 2.4 MMCAD in 2008 to 3.6 MMCAD in 2018. Revisions for fracture had the greatest length of stay and the highest mean age, HOMR score, ASA score and cost compared with other revision indications (p<0.001). Patient factors associated with cost and length of stay included ASA and HOMR scores, Charlson Comorbidity score and age.
The revision burden increased 1.5-fold over the study period and so did the direct cost of care delivery. The increased cost is primarily related to prolonged hospital stay and increased surgical cost. For tertiary care units, these findings indicate a need to identify strategies to improve efficiency whilst maintaining the quality of patient care (e.g. efficient ways of reducing acute hospital stay) and to curb the rising economic burden on a publicly funded health system.
The study of spinopelvic anatomy and movement has received great interest, as these characteristics influence the biomechanical behavior (and outcome) following hip arthroplasty. However, to date there is little knowledge of what “normal” is and how this varies with age. This study aims to determine how dynamic spino-pelvic characteristics change with age in individuals with well-functioning hips, and to assess how these changes are influenced by the presence of hip arthritis.
This is an IRB-approved, cross-sectional, cohort study; 100 volunteers (asymptomatic hips, Oxford Hip Score >45) [age: 53 ± 17 (24-87) years; 51% female; BMI: 28 ± 5] and 200 patients with end-stage hip arthritis [age: 56 ± 19 (16-89) years; 55% female; BMI: 28 ± 5] were studied. All participants underwent lateral spino-pelvic radiographs in the standing and deep-seated positions to determine maximum hip and spine flexion. Parameters measured included lumbar lordosis (LL), pelvic incidence (PI), pelvic tilt (PT) and pelvic-femoral angles (PFA). Lumbar flexion (ΔLL), hip flexion (ΔPFA) and pelvic movement (ΔPT) were calculated. The prevalence of spinopelvic imbalance (PI–LL>10°) was determined.
There were no differences in any of the spino-pelvic characteristics or movements between sexes. With advancing age, standing LL reduced and standing PT increased (no differences between groups). With advancing age, both hip (4%/decade) and lumbar (8%/decade) flexion reduced (p<0.001) (no difference between groups). ΔLL did not correlate with ΔPFA (rho=0.1). Hip arthritis was associated with significantly reduced hip flexion (82 ± 22° vs. 90 ± 17°; p=0.003) and pelvic movement (1 ± 16° vs. 8 ± 16°; p=0.002) at all ages, and with an increased prevalence of spinopelvic imbalance (OR: 2.6; 95% CI: 1.2-5.7).
With aging, the lumbar spine loses its lordosis and flexion to a greater extent than the hip; as a result, the hip's relative contribution to overall sagittal movement increases. With hip arthritis, the reduced hip flexion and the necessary compensatory increase in pelvic movement are likely contributors to the development of hip-spine syndrome and of spino-pelvic imbalance.
Increased femoral head size reduces the rate of dislocation after total hip arthroplasty (THA). With the introduction of highly crosslinked polyethylene (HXLPE) liners in THA, there has been a trend towards using larger femoral heads in relatively smaller cup sizes, theoretically increasing the risk of liner fracture, wear, or aseptic loosening. Short- to medium-term follow-up studies have not demonstrated a negative effect of using thinner HXLPE liners. However, there is concern that these liners may fail prematurely in the long term, especially the thinnest ones. The aim of this study was to evaluate the long-term survival and revision rates of HXLPE liners in primary THA, as well as the effect of liner thickness on these outcomes. We hypothesized that there would be no significant differences between the different liner thicknesses.
We performed a retrospective database analysis from a single center of all primary total hip replacements using HXLPE liners performed in 2010 or earlier, including all femoral head sizes. All procedures were performed by fellowship-trained arthroplasty surgeons. Patient characteristics, implant details including liner thickness, death, and revisions (all causes) were recorded. Patients were grouped for analysis by each millimeter of PE thickness (e.g. 4.0-4.9mm, 5.0-5.9mm). Kaplan-Meier survival was estimated with all-cause and aseptic revision as the endpoints.
A total of 2354 patients (2584 hips) were included (mean age 64.3 years, range 19-96). Mean BMI was 29.0 and 47.6% were female. Mean follow-up was 13.2 years (range 11.0-18.8). Liner thickness varied from 4.9 to 12.7 mm. Seven patients had a liner thickness <5.0mm and 859 had a liner thickness <6.0mm. Head sizes were 28mm (n=85, 3.3%), 32mm (n=1214, 47.0%), 36mm (n=1176, 45.5%) and 40mm (n=109, 4.2%); 98.4% were metal heads. There were 101 revisions, and in 78 of these cases the liner was revised. Reasons for revision were instability/dislocation (n=34), pseudotumor/aseptic lymphocyte-dominant vasculitis-associated lesion (n=18), fracture (n=17), early loosening (n=11), infection (n=7), aseptic loosening (n=4) and other (n=10). When grouped by liner thickness, there were no significant differences between the groups for all-cause revision (p=0.112) or aseptic revision (p=0.116).
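Survival comparisons like these rest on the Kaplan-Meier product-limit estimator; a minimal pure-Python sketch on toy data (hypothetical follow-up times, not the study cohort) shows the calculation:

```python
from collections import Counter

def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times: follow-up in years; events: 1 = revision, 0 = censored."""
    deaths = Counter(t for t, e in zip(times, events) if e)
    leaving = Counter(times)          # everyone exiting the risk set at t
    n_at_risk = len(times)
    s, curve = 1.0, []
    for t in sorted(leaving):
        if deaths[t]:
            s *= 1 - deaths[t] / n_at_risk
            curve.append((t, s))      # survival drops only at event times
        n_at_risk -= leaving[t]
    return curve

# toy data: revisions at 1, 2 and 3 years; censoring at 2, 5 and 8 years
curve = kaplan_meier([1, 2, 2, 3, 5, 8], [1, 0, 1, 1, 0, 0])
```

Grouping hips by liner thickness and comparing the resulting curves (e.g. with a log-rank test) reproduces the kind of analysis described above.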
In our cohort, there were no significant differences in all-cause or aseptic revisions between any of the liner thickness groups at long-term follow-up. Our results indicate that using thinner HXLPE liners to maximize femoral head size in THA does not lead to increased complications or liner failures at medium- to long-term follow-up. As such, orthopedic surgeons can consider the use of larger heads at the cost of liner thickness a safe practice for reducing the risk of dislocation after THA when using HXLPE liners.
Periprosthetic joint infection (PJI) occurs in 0.2-2% of primary hip and knee arthroplasty and is a leading cause of revision surgery, impaired function, and increased morbidity and mortality. Topical, intrawound vancomycin administration allows for high local drug concentrations at the surgical site and has demonstrated good results in prevention of surgical site infection after spinal surgery. It is a promising treatment to prevent infection following hip and knee arthroplasty. Prior studies have been limited by small sample sizes and the low incidence of PJI. This systematic review and meta-analysis was performed to determine the effectiveness of topical vancomycin for the primary prevention of PJI in hip and knee arthroplasty.
A search of Embase, MEDLINE, and PubMed databases as of June 2020 was performed according to PRISMA guidelines. Studies comparing topical vancomycin to standard perioperative intravenous antibiotics in primary THA and TKA with a minimum of three months follow-up were identified. The results from applicable studies were meta-analysed to determine the impact of topical vancomycin on PJI rates as well as wound-related and overall complications. Results were expressed as odds ratios (ORs) and 95% confidence intervals.
Nine comparative observational studies were eligible for inclusion. 3371 patients treated with 0.5-2g of topical vancomycin were compared to 2884 patients treated with standard care. Only one of nine studies found a significantly lower rate of PJI after primary THA or TKA (OR 0.09-1.97, p=0.04 for one study, p>0.05 for eight of nine studies), though meta-analysis showed a significant benefit, with vancomycin lowering PJI rates from 1.6% in controls to 0.7% in the experimental group (OR 0.47, p=0.02, Figure 1). Individually, only one of five studies showed a significant benefit to topical vancomycin in THA, while none of seven studies investigating PJI after TKA showed a benefit to topical vancomycin. In meta-analysis of our subgroups, there was a significant reduction in PJI with vancomycin in THA (OR 0.34, p=0.04), but there was no significant difference in PJI after TKA (OR 0.60, p = 0.13). In six studies which reported complication rates other than PJI, there were no significant differences in overall complication rates with vancomycin administration for any study individually (OR 0.48-0.94, p>0.05 for all studies), but meta-analysis found a significant difference in complications, with a 6.7% overall complication rate in controls compared to 4.8% after topical vancomycin, largely driven by a lower PJI incidence (OR 0.76, p=0.04).
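As a crude plausibility check on the headline estimate, an aggregate odds ratio can be reconstructed from the pooled rates quoted above (0.7% of 3,371 vs. 1.6% of 2,884); this illustrative arithmetic will not exactly reproduce the study-weighted random-effects OR of 0.47:

```python
import math

# event counts reconstructed from the reported aggregate rates (approximate)
n_vanc, n_ctrl = 3371, 2884
a = round(n_vanc * 0.007)   # PJIs in the vancomycin arm
c = round(n_ctrl * 0.016)   # PJIs in the control arm
b, d = n_vanc - a, n_ctrl - c

or_crude = (a * d) / (b * c)                # crude pooled odds ratio, ~0.44
se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of the log odds ratio
lo, hi = (math.exp(math.log(or_crude) + z * se_log) for z in (-1.96, 1.96))
```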
Topical vancomycin is protective against PJI after hip and knee arthroplasty. No increase in wound-related or overall complication rates was found with topical vancomycin. This meta-analysis is the largest to date and includes multiple recent comparative studies while excluding other confounding interventions (such as povidone-iodine irrigation). However, included studies were predominantly retrospective and no randomized-controlled trials have been published. The limited evidence summarized here indicates topical vancomycin may be a promising modality to decrease PJI, but there is insufficient evidence to conclusively show a decrease in PJI or to demonstrate safety. A prospective, randomized-controlled trial is ongoing to better answer this question.
For any figures or tables, please contact the authors directly.
This study used model-based radiostereometric analysis (MBRSA) to compare migration of a recently introduced cementless hip stem to an established hip stem of similar design. Novel design features of the newer hip stem included a greater thickness of hydroxyapatite coating and a blended compaction extraction femoral broach.
Fifty-seven patients requiring primary total hip arthroplasty (THA) were enrolled at a single centre. Patients were randomized to receive either an Avenir collarless stem and Trilogy IT cup (ZimmerBiomet) or a Corail collarless stem and Pinnacle cup (DePuy Synthes) via a posterior or lateral approach. Both stems are broach-only femoral bone preparation. RSA beads (Halifax Biomedical) were inserted into the proximal femur during surgery. Patients underwent supine RSA imaging a 6 weeks (baseline), 6, 12, and 24 months following surgery. The primary study outcome was total subsidence of the hip stem from baseline to 24 months as well as progression of subsidence between 12 and 24 months. These values were compared against published migration thresholds for well-performing hip stems (0.5mm). The detection limit, or precision, of MBRSA was calculated based on duplicate examinations taken at baseline. Patient reported outcome measures were collected throughout the study and included the Oxford-12 Hip Score (OHS), EuroQoL EQ-5D-5L, Hip Osteoarthritis Score (HOOS) as well as visual analogue scales (VAS) for thigh pain and satisfaction. Analysis comprised of paired and unpaired t-tests with significance set at p≤0.05.
Forty-eight patients (30 males) were included for analysis; 7 patients received a non-study hip stem intra-operatively, 1 patient suffered a traumatic dislocation within three weeks of surgery, and 1 patient died within 12 months post-surgery. RSA data was obtained for 45 patients as three patients did not receive RSA beads intra-operatively. Our patient cohort had a mean age of 65.9 years (±;7.2) at the time of surgery and body mass index of 30.5 kg/m2 (±;5.2). No statistical difference in total stem migration was found between the Avenir and Corail stems at 12 months (p=0.045, 95%CI: −0.046 to 0.088) and 24 months (p=0.936, 95% CI: −0.098 to 0.090). Progression of subsidence from 12-24 months was 0.011mm and 0.034mm for the Avenir and Corail groups which were not statistically different (p=0.163, 95%CI: −0.100 to 0.008) between groups and significantly less than the 0.5mm threshold (pNo statistically significant differences existed between study groups for any pre-operative function scores (p>0.05). All patients showed significant functional improvement from pre- to post-surgery and no outcome measures were different between study groups with exception of EQ-5D-5L health visual analogue scale at 12 months which showed marginally superior (p=0.036) scores in the Avenir group. This study was not powered to detect differences in clinical outcomes.
This study has demonstrated no statistical difference in subsidence or patient-reported outcomes between the Corail hip stem and the more recently introduced Avenir hip stem. This result is predictable as both stems are of a triple-tapered design, are coated with hydroxyapatite, and utilize a broach-only bone preparation technique. Both stem designs demonstrate migration below 0.5mm suggesting both are low-risk for aseptic loosening in the long-term.
Demand for total knee arthroplasty (TKA) is increasing as it remains the gold-standard treatment for end-stage osteoarthritis (OA) of the knee. While magnetic-resonance imaging (MRI) scans of the knee are not indicated for diagnosing knee OA, they are commonly ordered prior to the referral to an orthopaedic surgeon. The purpose of this study was to determine the proportion of patients who underwent an MRI in the two years prior to their primary TKA for OA. Secondary outcomes included determining patient and physician associations with increased MRI usage.
This is a population-based cohort study using billing codes in Ontario, Canada. All patients over 40 years-old who underwent a primary TKA between April 1, 2008 and March 31, 2017 were included. Statistical analyses were performed using SAS and included the Cochran-Armitage test for trend of MRI prior to surgery, and predictive multivariable regression model. Significance was set to p<0.05.
There were 172,689 eligible first-time TKA recipients, of which 34,140 (19.8%) received an MRI in the two years prior to their surgery. The majority of these (70.8%) were ordered by primary care physicians, followed by orthopaedic surgeons (22.5%). Patients who received an MRI were younger and had fewer comorbidities than patients who did not (p<0.001). MRI use prior to TKA increased from 15.9% in 2008 to 20.1% in 2017 (p<0.0001).
Despite MRIs rarely being indicated for the work-up of knee OA, nearly one in five patients have an MRI in the two years prior to their TKA. Reducing the use of this prior to TKA may help reduce wait-times for surgery.
The demand for revision total knee arthroplasty (TKA) has grown significantly in recent years. The two major fixation methods for stems in revision TKA include cemented and ‘hybrid’ fixation. We explore the optimal fixation method using data from recent, well-designed comparative studies.
We performed a systematic review of comparative studies published within the last 10 years with a minimum follow-up of 24 months. To allow for missing data, a random-effects meta-analysis of all available cases was performed. The odds ratio (OR) for the relevant outcome was calculated with 95% confidence intervals. The effects of small studies were analyzed using a funnel plot, and asymmetry was assessed using Egger's test. The primary outcome measure was all-cause failure. Secondary outcome measures included all-cause revision, aseptic revision and radiographic failure.
There was a significantly lower failure rate for hybrid stems when compared to cemented stems (OR 0.61, 95% CI 0.42-0.87; p=0.006). Heterogeneity was 4.3% and non-significant (p=0.39). There was a trend toward superior hybrid performance for all other outcome measures, including all-cause re-revision, aseptic re-revision and radiographic failure.
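When a comparison is summarized only as an odds ratio with a 95% CI, the standard error and p-value can be recovered on the log scale; applied to the figures above (OR 0.61, 95% CI 0.42-0.87) this approximately reproduces the reported p-value, up to rounding of the published estimates:

```python
import math

or_, lo, hi = 0.61, 0.42, 0.87   # reported hybrid-vs-cemented failure OR and 95% CI

se = (math.log(hi) - math.log(lo)) / (2 * 1.96)    # SE of log-OR from CI width
z = math.log(or_) / se                             # Wald z statistic
p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided p-value
```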
Recent evidence suggests a significantly lower failure rate for hybrid stems in revision TKA. There is also a trend favoring the use of hybrid stems for all outcome variables assessed in this study. This is the first time a significant difference in outcome has been demonstrated through systematic review of these two modes of stem fixation. We therefore recommend the use, where possible, of hybrid stems in revision TKA.
The benefits of HXLPE in total knee arthroplasty (TKA) have not been as evident as total hip arthroplasty (THA). A systematic review and meta-analysis to assess the impact of highly-crosslinked polyethylene (HXLPE) on TKA outcomes compared to conventional polyethylene (CPE) is described.
All studies comparing HXLPE with CPE for primary TKA were included for analysis. The minimum dataset included revision rates, indication for revision, aseptic component loosening and follow-up time. The primary outcome variables were all-cause revision, aseptic revision, revision for loosening, radiographic component loosening, osteolysis and incidence of radiolucent lines. Secondary outcome measures included postoperative functional knee scores. A random-effects meta-analysis allowing for all missing data was performed for all primary outcome variables.
Six studies met the inclusion criteria. In total, there were 2,234 knees (1,105 HXLPE and 1,129 CPE). The combined mean follow-up for all studies was 6 years. The aseptic revision rate in the HXLPE group was 1.02% compared to 1.97% in the CPE group. There was no difference in the rate of all-cause revision (p=0.131), aseptic revision (p=0.298) or revision for component loosening (p=0.206) between the two groups. Radiographic loosening (p=0.200), radiolucent lines (p=0.123) and osteolysis (p=0.604) were similar between both groups, as were functional outcomes.
The use of HXLPE in TKA yields similar results for clinical and radiographic outcomes when compared to CPE at midterm follow-up. HXLPE does not confer the same advantages to TKA as seen in THA.
We recently performed a clinical trial comparing motor sparing blocks (MSB) to periarticular infiltration (PAI) following total knee arthroplasty (TKA). We found that MSBs provided longer analgesia (8.8 hours) than PAI with retention of quadriceps strength, and with similar function, satisfaction, and length of hospital stay. However, its potential increased cost could serve as a barrier to its adoption. Therefore, our aim was to compare the costs of MSBs to PAI following TKA.
We conducted a retrospective review of data from our previous RCT. There were 82 patients included in the RCT (n=41 MSB group, n=41 PAI group). We compared the mean total costs associated with each group until hospital discharge including intervention costs, healthcare professional service fees, intraoperative medications, length of stay, and postoperative opioid use.
Seventy patients were included (n=35 MSB group, n=35 PAI group). The mean total cost for the MSB group was significantly higher ($1959.46 ± 755.40) compared to the PAI group ($1616.25 ± 488.33), with a mean difference of $343.21 (95% CI = $73.28 to $664.11, p=0.03). The total perioperative intervention costs for performing the MSB were also significantly higher; however, postoperative inpatient costs, including length of stay and total opioid use, did not differ significantly.
Motor sparing blocks had significantly higher mean total and perioperative costs compared to PAI, with no significant difference in postoperative inpatient costs. However, their quadriceps-sparing nature and previously demonstrated prolonged postoperative analgesia can be used to facilitate an outpatient TKA pathway, thereby offsetting the increased costs.
Primary hip and knee joint replacements in Canada have been estimated to cost over $1.4 billion dollars annually, with revision surgery costing $177 million. The most common cause of revision arthroplasty surgery in Canada is infection. Periprosthetic joint infections (PJIs) are a devastating though preventable complication following arthroplasty. Though variably used, antibiotic laden bone cement (ALBC) has been demonstrated to decrease PJIs following primary total knee arthroplasty (TKA). Unfortunately, ALBC is costlier than regular bone cement (RBC). Therefore, the aim of this study was to determine if the routine use of ALBC in primary TKA surgery is a cost-effective practice from the perspective of the Canadian healthcare system.
A decision tree was constructed using decision analysis software (TreeAge Software, Williamstown, Massachusetts) with a two-year time horizon, comparing primary TKA with either ALBC or RBC from the perspective of a single-payer healthcare system. All costs were in 2020 Canadian dollars. Health utilities were in the form of quality-adjusted life years (QALYs). Model inputs for cost were derived from regional and national databases. Health utilities and probability parameters were derived from the latest literature. One-way deterministic sensitivity analysis was performed on all model parameters. The primary outcome of this analysis was the incremental cost-effectiveness ratio (ICER), with a willingness-to-pay (WTP) threshold of $50,000 per QALY.
Primary TKA with ALBC (TKA-ALBC) was found to be more cost-effective than primary TKA with RBC (TKA-RBC). More specifically, TKA-ALBC dominated TKA-RBC, as it was less costly over the long term ($11,118 vs. $11,160) while providing the same QALYs (1.66). The ICER of this cost-utility analysis (CUA) was $-11,049.72 per QALY, well below the WTP threshold of $50,000 per QALY. The model was sensitive to the cost of ALBC-TKA as well as the probability of PJI following ALBC-TKA and RBC-TKA. ALBC ceased to be cost-effective once the cost of ALBC exceeded $223.08 CAD per bag of cement.
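The dominance logic behind this result can be sketched as a small decision rule (the values used here are illustrative stand-ins, not the model's actual inputs):

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio in $/QALY.
    Returns None when the new option dominates (no costlier, at least as effective)."""
    d_cost, d_qaly = cost_new - cost_old, qaly_new - qaly_old
    if d_cost <= 0 and d_qaly >= 0:
        return None                  # dominant: adopt regardless of WTP threshold
    if d_qaly <= 0:
        return float("inf")          # dominated, or extra cost with no health gain
    return d_cost / d_qaly           # compare against the WTP threshold

WTP = 50_000  # willingness-to-pay threshold, $/QALY

# same QALYs at slightly lower cost -> the new option dominates (hypothetical figures)
assert icer(11_118, 1.66, 11_160, 1.66) is None
```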
The routine use of ALBC in primary TKA is a cost-effective practice in the context of the Canadian healthcare system, as long as the cost of ALBC remains reasonable and published studies continue to support the efficacy of ALBC in decreasing PJI following primary TKA. Further, this analysis is very conservative, and ALBC is likely much more cost-effective than presented, because the model's revision surgery cost parameter was based on the average cost of all revision TKA surgery in Canada, regardless of etiology. Considering that many PJIs require two-stage revisions, the cost parameter used in this analysis for revision surgery underestimates the true cost. Ultimately, this is the first cost-effectiveness study evaluating this topic from the perspective of the Canadian healthcare system, and it can inform future national guidelines on the subject matter.
Canada is second only to the United States worldwide in the number of opioid prescriptions per capita. Despite this, little is known about prescription patterns for patients undergoing total joint arthroplasty (TJA). The purpose of this study was to detail preoperative opioid use patterns and investigate the effect it has on perioperative quality outcomes in patients undergoing elective total hip and total knee arthroplasty surgery (THA and TKA).
The study cohort was constructed from hospital Discharge Abstract Data (DAD) and National Ambulatory Care Reporting System (NACRS) data, using Canadian Classification of Health Intervention codes to select all primary THA and TKA procedures from 2017-2020 in Nova Scotia. Opioid use was defined as any prescription filled at discharge, as identified in the Nova Scotia Drug Information System (DIS). Emergency Department (ED) and Family Doctor (FD) visits for pain were ascertained from Physician Claims data. Multivariate logistic regression was used to test for associations while controlling for confounders. Chi-squared statistics at the 95% confidence level were used to test for statistical significance.
In total, 14,819 TJA patients were analysed, and 4,306 patients (29.0%) had at least one opioid prescription in the year prior to surgery. Overall, there was no significant difference in preoperative opioid use between patients undergoing TKA and THA (28.8% vs 29.4%). During 2017-2019 we observed a declining year-on-year trend in preoperative opioid use. Interestingly, this trend failed to continue into 2020, when preoperative opioid use increased by 15% and exceeded 2017 levels. Within the first 90 days of discharge, 22.9% of TKA and 20.9% of THA patients presented to the ED or their FD with pain-related issues. Preoperative opioid use was a statistically significant predictor of these presentations (TKA: odds ratio [OR], 1.45; 95% confidence interval [CI], 1.29 to 1.62; THA: OR, 1.46; 95% CI, 1.28 to 1.65).
Preoperative opioid consumption in TJA remains high and is independently associated with a higher risk of 90-day return to the FD or ED. The widespread dissemination of opioid reduction strategies introduced during the middle of the last decade may have reduced preoperative opioid utilisation. Access barriers and practice changes due to the COVID-19 pandemic may now have reversed this effect.
During total knee arthroplasty (TKA), a tourniquet is often used intraoperatively. There are proposed benefits of tourniquet use including shorter duration of surgery, improved surgical field visualization and increased cement penetration which may improve implant longevity. However, there are also cited side effects that include increased post-operative pain, slowed recovery, skin bruising, neurovascular injury and quadriceps weakness. Randomized controlled trials have demonstrated no differences in implant longevity, however they are limited by short follow-up and small sample sizes. The objective of the current study was to evaluate the rates of revision surgery among patients undergoing cemented TKA with or without an intraoperative tourniquet and to understand the causes and risk factors for failure.
A retrospective cohort study was undertaken of all patients who received a primary, cemented TKA at a high-volume arthroplasty centre from January 1999 to December 2010. Patients who underwent surgery without the use of a tourniquet and those who had a tourniquet inflated for the entirety of the case were included. The causes and timing of revision surgery were recorded and cross referenced with the Canadian Institute of Health Information Discharge Abstract Database to reduce the loss to follow-up. Survivorship analysis was performed with the use of Kaplan-Meier curves to determine overall survival rates at final follow-up. A Cox proportional hazards model was utilized to evaluate independent predictors of revision surgery.
Data from 3939 cases of primary cemented TKA were available for analysis. There were 2276 (58%) cases in which a tourniquet was used for the duration of the surgery and 1663 (42%) cases in which a tourniquet was not utilized. Mean time from the primary TKA was 14.7 years (range 0 days - 22.8 years) when censored by death or revision surgery. There were 150 recorded revisions in the entire cohort, with periprosthetic joint infection (n=50) and aseptic loosening (n=41) being the most common causes for revision. Cumulative survival at final follow-up was 93.8% in the tourniquetless group and 96.9% in the tourniquet group. Tourniquetless surgery was an independent predictor of all-cause revision, with an HR of 1.53 (95% CI 1.1-2.1, p=0.011). Younger age and male sex were also independent predictors of all-cause revision.
The results of the current study demonstrate higher all-cause revision rates with tourniquetless surgery in a large cohort of patients undergoing primary cemented TKA. The available literature consists of short-term trials and registry data, which have inherent limitations. Potential causes for increased revision rates in the tourniquetless group include reduced cement penetration, increased intraoperative blood loss and longer surgical time. The results of the current study should be taken into consideration, alongside the known risks and benefits of tourniquet use, when considering intraoperative tourniquet use in cemented TKA.
With the rising rates, and associated costs, of total knee arthroplasty (TKA), enhanced clarity regarding patient appropriateness for TKA is warranted. Towards addressing this gap, we elucidated in qualitative research that surgeons and osteoarthritis (OA) patients considered TKA need, readiness/willingness, health status, and expectations of TKA most important in determining patient appropriateness for TKA. The current study evaluated the predictive validity of pre-TKA measures of these appropriateness domains for attainment of a good TKA outcome.
This prospective cohort study recruited knee OA patients aged 30+ years referred for TKA at two hip/knee surgery centers in Alberta, Canada. Those receiving primary, unilateral TKA completed questionnaires pre-TKA assessing TKA need (WOMAC-pain, ICOAP-pain, NRS-pain, KOOS-physical function, Perceived Arthritis Coping Efficacy, prior OA treatment), TKA readiness/willingness (Patient Acceptable Symptom State (PASS), willingness to undergo TKA), health status (PHQ-8, BMI, MSK and non-MSK comorbidities), TKA expectations (HSS KR Expectations survey items) and contextual factors (e.g., age, gender, employment status). One-year post-TKA, we assessed for a ‘good outcome’ (yes/no), defined as improved knee symptoms (OARSI-OMERACT responder criteria) AND overall satisfaction with TKA results. Multiple logistic regression, stepwise variable selection, and best possible subsets regression were used to identify the model with the smallest number of independent variables and greatest discriminant validity for our outcome. Receiver Operating Characteristic (ROC) curves were generated to compare the discriminative ability of each appropriateness domain based on the ‘area under the ROC curve’ (AUC). Multivariable robust Poisson regression was used to assess the relationship of the variables to achievement of a good outcome.
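The AUC reported from ROC analysis of this kind can be computed directly via its Mann-Whitney interpretation: the probability that a randomly chosen patient who achieved a good outcome was assigned a higher predicted score than a randomly chosen patient who did not. A sketch with hypothetical predicted probabilities:

```python
# ROC AUC via the Mann-Whitney interpretation. Scores are hypothetical
# predicted probabilities, not study data.

def auc(scores_pos, scores_neg):
    wins = 0.0
    for p in scores_pos:          # predicted scores, good outcome
        for n in scores_neg:      # predicted scores, poor outcome
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5       # ties count half
    return wins / (len(scores_pos) * len(scores_neg))

a = auc([0.9, 0.8, 0.7, 0.6], [0.5, 0.4, 0.8])  # ~0.79: "fair" discrimination
```

On this scale, 0.5 is chance-level discrimination and 1.0 is perfect, which is the sense in which the AUCs below are described as fair or good.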
Of 1,275 TKA recipients, 1,053 (82.6%) had complete data for analyses (mean age 66.9 years [SD 8.8]; 58.6% female). Mean WOMAC pain and KOOS-PS scores were 11.5/20 (SD 3.5) and 52.8/100 (SD 17.1), respectively. 78.1% (95% CI 75.4–80.5%) achieved a good outcome. Stepwise variable selection identified that optimal discrimination was achieved with 13 variables. The three best 13-variable models included measures of TKA need (WOMAC pain, KOOS-PS), readiness/willingness (PASS, TKA willingness), health status (PHQ-8, troublesome hips, contralateral knee, low back), TKA expectations (the importance of improved psychological well-being, ability to go up stairs, kneel, and participate in recreational activities as TKA outcomes), and patient age. Model discrimination was fair for TKA need (AUC 0.68, 95% CI 0.63-0.72), TKA readiness/willingness (AUC 0.61, 95% CI 0.57-0.65), health status (AUC 0.59, 95% CI 0.54-0.63) and TKA expectations (AUC 0.58, 95% CI 0.54-0.62), but the model with all appropriateness variables had good discrimination (AUC 0.72, 95% CI 0.685-0.76). The likelihood of achieving a good outcome was significantly higher for those with greater knee pain, disability, unacceptable knee symptoms, definite willingness to undergo TKA, and less depression, and for those who considered improved ability to perform recreational activities or climb stairs ‘very important’ TKA outcomes; it was lower in those who considered it important that TKA improve psychological wellbeing or ability to kneel.
Beyond surgical need (OA symptoms) and health status, assessment of patients’ readiness and willingness to undergo, and their expectations for, TKA, should be incorporated into assessment of patient appropriateness for surgery.
Tourniquet use in total knee arthroplasty (TKA) remains a subject of considerable debate. A recent study questioned the need for tourniquets based on associated risks. However, the study omitted analysis of crucial tourniquet-related parameters which have been demonstrated in numerous studies to be associated with safe tourniquet use and reduction of adverse events. The current utilization and preferences of tourniquet use in Canada remain unknown. Our primary aim was to determine the current practices, patterns of use, and opinions of tourniquet use in TKA among members of the Canadian Arthroplasty Society (CAS). Additionally, we sought to determine the need for updated best practice guidelines to inform optimal tourniquet use and to identify areas requiring further research.
A self-administered survey was emailed to members of the CAS in October 2021 (six-week period). The response rate was 57% (91/161). Skip logic branching was used to administer a maximum of 59 questions related to tourniquet use, beliefs, and practices. All respondents were staff surgeons and 88% were arthroplasty fellowship-trained. Sixty-five percent have been in practice for ≥11 years and only 16% for 50 TKA/year, 59% have an academic practice, and >67% prefer cemented TKA.
Sixty-six percent currently use tourniquets, 25% no longer do but previously did, and 9% never used tourniquets. For those not using tourniquets, the most common reasons are potential harm/risks and publications/conferences. Among current users, 48% use in all cases and an additional 37% use in 76-99% of cases. The top reason for use was improved visualization/bloodless field (88%), followed by performing a cemented TKA, use in training, and faster operative times. The main patient factor influencing selective tourniquet use was peripheral vascular disease, and the main surgical factors were operative duration and cementless TKA. The most frequent adverse events reported were bruising/pinching under the tourniquet and short-term pain, which the majority believed were related to improper tourniquet use (prolonged time, high pressures, poor cuff fit), yet only 8% use contoured tourniquets and 32% do not use limb protection. Despite substantial evidence in the literature that tourniquet safety and probability of harm are affected by tourniquet time and pressure, only 83% and 72% of respondents believe that reducing tourniquet time and pressure, respectively, reduces the probability of harm. In addition, no surgeon utilizes personalized limb occlusion pressure, which has been demonstrated to substantially reduce tourniquet pressure while being safe and effective. Furthermore, 62% always use fixed pressure and 37% will modify the pressure based on patient parameters, most often systolic blood pressure and limb size. Almost all (88%) were interested in new evidence-based guidelines regarding these parameters.
Tourniquet use in TKA remains prevalent among arthroplasty surgeons in the CAS; however, tremendous practice variability exists regarding several key parameters required for optimal use. Current best practices of tourniquet use regarding personalized pressures, time, and type are not being utilized across Canada. There is considerable interest and need for further research and updated guidelines regarding key parameters of safe tourniquet usage to optimize tourniquet use in TKA.
The coronavirus pandemic has reduced the capability of Canadian hospitals to offer elective orthopaedic surgery requiring admission, despite ongoing and increasing demands for elective total hip and total knee arthroplasty surgery (THA and TKA). We sought to determine if the coronavirus pandemic resulted in more outpatient THA and TKA in Nova Scotia, and if so, what effect increased outpatient surgery had on 90-day post-operative readmissions or Emergency Department (ED)/Family Doctor (FD) visits.
The study cohort was constructed from hospital Discharge Abstract Data (DAD) inpatient admissions and National Ambulatory Care Reporting System (NACRS) day surgery observations, using Canadian Classification of Health Intervention codes to select all primary hip and knee procedures from 2005-2020 in Nova Scotia. Emergency Department and General Practitioner visits were identified from the Physician Billings data, and re-admissions from the DAD and NACRS. Rates were calculated by dividing the number of cases with any visit within 90 days after discharge by the total number of cases. Chi-squared statistics at the 95% confidence level were used to test for statistical significance. Knee and hip procedures were modelled separately.
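A chi-squared comparison of 90-day visit rates of the kind described reduces to a Pearson statistic on a 2x2 table. A sketch with invented counts (rows are two surgical settings, columns are cases with and without a 90-day visit); these are not study data:

```python
# Pearson chi-squared statistic for a 2x2 table, the form of test used to
# compare 90-day visit rates. Counts are invented for illustration.

def chi_squared_2x2(a, b, c, d):
    """Table rows [a, b] and [c, d]; returns the Pearson X^2 statistic."""
    n = a + b + c + d
    observed = [a, b, c, d]
    expected = [(a + b) * (a + c) / n, (a + b) * (b + d) / n,
                (c + d) * (a + c) / n, (c + d) * (b + d) / n]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

x2 = chi_squared_2x2(30, 70, 40, 160)  # ~3.73
```

With one degree of freedom, an X² above 3.84 corresponds to p < 0.05, which is the significance threshold used in the study.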
There was a reduction in THA and TKA surgery in Nova Scotia during the coronavirus pandemic in 2020. Outpatient arthroplasty surgery in Nova Scotia in the years prior to 2020 was relatively stable. However, in 2020 there was a significant increase in the proportion and absolute number of outpatient THA and TKA. The proportion of THA increased from 1% in 2019 to 14% in 2020, while the proportion of TKA increased from 1% in 2019 to 11% in 2020. The absolute number of outpatient THA increased from 16 cases in 2019 to 163 cases in 2020. Outpatient TKA cases increased from 21 in 2019 to 173 in 2020. The increase in outpatient surgery resulted in an increase in 90-day ED presentations following TKA, but not THA, although this was not statistically significant. For outpatient THA and TKA, there was a decrease in 90-day readmissions and a statistically significant decrease in FD presentations.
Outpatient THA and TKA increased significantly in 2020, likely due to the restrictions imposed during the coronavirus pandemic on elective orthopaedic surgery requiring hospital admission. The increase in outpatient surgery resulted in an increase in 90-day ED presentations for TKA, and a decrease in 90-day readmissions and FD presentations for THA and TKA. Reducing the inpatient surgical burden may shift some post-operative burden to the ED, but does not appear to have caused an increase in hospital readmission rates.
To describe the longitudinal trends in patients with obesity and Metabolic Syndrome (MetS) undergoing TKA and the associated impact on complications and lengths of hospital stay.
We identified patients who underwent primary TKA between 2006 – 2017 within the American College of Surgeons National Surgical Quality Improvement Program (NSQIP) database. We recorded patient demographics, length of stay (LOS), and 30-day major and minor complications. We labelled those with an obese Body Mass Index (BMI ≥ 30), hypertension, and diabetes as having MetS. We evaluated mean BMI, LOS, and 30-day complication rates in all patients, obese patients, and those with MetS from 2006-2017. We used multivariable regression to evaluate the trends in BMI, complications, and LOS over time in all patients and those with MetS, and the effect of BMI and MetS on complication rates and LOS, stratified by year.
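The results below are reported as odds ratios with confidence intervals from multivariable regression. For a single factor, an unadjusted odds ratio with a Wald 95% CI can be sketched from a 2x2 table; the counts here are hypothetical and the calculation is illustrative, not the study's adjusted analysis:

```python
# Unadjusted odds ratio with a Wald 95% CI from a 2x2 table. Counts are
# hypothetical; the study's estimates come from multivariable regression.
import math

def odds_ratio_ci(a, b, c, d):
    """a, b: events / non-events in the exposed group;
    c, d: events / non-events in the unexposed group."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(40, 160, 30, 270)  # OR 2.25, CI roughly 1.35-3.76
```

A CI that excludes 1.0, as here, corresponds to a statistically significant association at the 0.05 level.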
270,846 patients underwent primary TKA at hospitals participating in the NSQIP database. 63.71% of patients were obese (n = 172,333), 15.21% were morbidly obese (n = 41,130), and 12.37% met criteria for MetS (n = 33,470). Mean BMI in TKA patients increased at a rate of 0.03 per year (0.02-0.05; p < 0.0001). Despite this, the rate of adverse events in obese patients decreased: major complications by an odds ratio (OR) of 0.94 (0.93-0.96; p < 0.0001) and minor complications by 0.94 (0.93-0.95; p < 0.001). LOS also decreased over time at an average rate of −0.058 days per year (-0.059 to −0.057; p < 0.0001). The proportion of patients with MetS did not increase; however, similar improvements in major complications (OR 0.94 [0.91-0.97]; p < 0.0001), minor complications (OR 0.97 [0.94-1.00]; p = 0.0330), and LOS (mean −0.055 [-0.056 to −0.054]; p < 0.0001) were found. In morbidly obese patients (BMI ≥ 40), there was a decreased proportion per year (OR 0.989 [0.98-0.994]; p < 0.0001). Factors specifically associated with major complications in obese patients included COPD (OR 1.75 [1.55-2.00]; p < 0.0001) and diabetes (OR 1.10 [1.02-1.1]; p = 0.017). Hypertension (OR 1.12 [1.03-1.21]; p = 0.0079) was associated with minor complications. Similarly, in patients with MetS, major complications were associated with COPD (OR 1.72 [1.35-2.18]; p < 0.0001). Neuraxial anesthesia was associated with a lower risk for major complications in the obese cohort (OR 0.87 [0.81-0.92]; p < 0.0001). BMI ≥ 40 was associated with a greater risk for minor complications (OR 1.37 [1.26-1.50]; p < 0.0001), major complications (1.11 [1.02-1.21]; p = 0.015), and increased LOS (+0.08 days [0.07-0.09]; p < 0.0001).
Mean BMI in patients undergoing primary TKA increased from 2006 - 2017. MetS comorbidities such as diabetes and hypertension elevated the risk for complications in obese patients. COPD contributed to higher rates of major complications. The obesity-specific risk reduction with spinal anesthesia suggests an improved post-anesthetic clinical course in obese patients with pre-existing pulmonary pathology. Encouragingly, the overall rates of complications and LOS in patients with obesity and MetS exhibited a longitudinal decline. This finding may be related to the decreased proportion of patients with BMI ≥ 40 treated over the same period, possibly the result of quality improvement initiatives aimed at delaying high-risk surgery in morbidly obese patients until healthy weight loss is achieved. These findings may also reflect increased awareness and improved management of these patients and their elevated risk profiles.
Open debridement and Outerbridge and Kashiwagi debridement arthroplasty (OK procedure) are common surgical treatments for elbow arthritis. However, the literature contains little information on the long-term survivorship of these procedures. The purpose of this study was to determine the survivorship after elbow debridement techniques until conversion to total elbow arthroplasty and revision surgery.
We performed a retrospective chart review of patients who underwent open elbow surgical debridement (open debridement, OK procedure) between 2000 and 2015. Patients were diagnosed with primary elbow osteoarthritis, post-traumatic arthritis, or inflammatory arthritis. A total of 320 patients had primary surgery including open debridement (n=142) and OK procedure (n=178), and of these 33 patients required a secondary revision surgery (open debridement, n=14 and OK procedure, n=19). The average follow-up time was 11.5 years (5.5 - 21.5 years). Survivorship was analyzed with Kaplan-Meier curves and the Log Rank test. A Cox proportional hazards model was used to assess the likelihood of conversion to total elbow arthroplasty or revision surgery while adjusting for covariates (age, gender, diagnosis). Significance was set at p<0.05.
Kaplan-Meier survival curves showed survivorship free of conversion to total elbow arthroplasty was 100.00% at 1 year, 99.25% at 5 years, and 98.49% at 10 years for open debridement, and 100.00% at 1 year, 98.80% at 5 years, and 97.97% at 10 years for the OK procedure (p=0.87). There was no difference in survivorship between procedures after adjusting for significant covariates with the Cox proportional hazards model. The rate of revision for open debridement and the OK procedure was similar at 11.31% and 11.48% after 10 years, respectively. There were higher rates of revision surgery in patients with open debridement (hazard ratio, 4.84 CI 1.29 - 18.17, p = 0.019) compared to the OK procedure after adjusting for covariates. We also performed a stratified analysis with radiographic severity as an effect modifier and showed grade 3 arthritis did better with the OK procedure compared to open debridement for survivorship until revision surgery (p=0.05). However, this difference was not found for grade 1 or grade 2 arthritis. This may suggest that performing the OK procedure for more severe grade 3 arthritis could decrease reoperation rates. Further investigations are needed to better understand the indications for each surgical technique.
This study is the largest cohort of open debridement and OK procedure with long-term follow-up. We showed that open elbow debridement and the OK procedure have excellent survivorship until conversion to total elbow arthroplasty and are viable options in the treatment of primary elbow osteoarthritis and post-traumatic arthritis. The OK procedure also has lower rates of revision surgery than open debridement, especially with more severe radiographic arthritis.
The Accolade® TMZF is a taper-wedge cementless metaphyseal-coated femoral stem widely utilized from 2002-2012. In recent years, there have been reports of early catastrophic failure of this implant. Establishing a deeper understanding of the rate and causes of revision in patients who developed aseptic failure in stems with documented concerns about high failure rates is critical. Understanding any potential patient or implant factors which are risk factors for failure is important to inform both clinicians and patients. We propose a study to establish the long-term survival of this stem and analyze patients who underwent aseptic revision to understand the causes and risk factors for failure.
A retrospective review was undertaken of all patients who received a primary total hip arthroplasty with an Accolade® TMZF stem at a high-volume arthroplasty center. The causes and timing of revision surgery were documented and cross-referenced with the Canadian Institute of Health Information Discharge Abstract Database to minimize loss to follow-up. Survivorship analysis was performed with the use of Kaplan-Meier curves to determine the overall and aseptic survival rates at final follow-up. Patient and implant factors commonly associated with aseptic failure were extracted, and a Cox proportional hazards model was used.
A consecutive series of 2609 unilateral primary THA patients implanted with an Accolade® TMZF femoral hip stem were included. Mean time from primary surgery was 12.4 years (range 22 days to 19.5 years). Cumulative survival was 96.1% ± 0.2 at final follow-up. One hundred and seven patients underwent revision surgery; aseptic loosening of the femoral component was the most common cause of aseptic failure in this cohort (33/2609, 1.3%). Younger age and larger femoral head offset were independent risk factors for aseptic failure.
To our knowledge, this is the largest series representing the longest follow-up of this taper-wedge cementless femoral implant. Despite early concerns, the Accolade® TMZF stem has excellent survivorship in this cohort. Trunnionosis as a recognized cause for revision surgery was rare. Younger age and larger femoral head offset were independent risk factors for aseptic failure.
Operative management of clavicle fractures is increasingly common. In the context of explaining the risks and benefits of surgery, understanding the impact of incisional numbness as it relates to the patient experience is key to shared decision making. This study aims to determine the prevalence, extent, and recovery of sensory changes associated with supraclavicular nerve injury after open reduction and plate internal fixation of middle or lateral clavicle shaft fractures.
Eighty-six patients were identified retrospectively and completed a patient experience survey assessing sensory symptoms, perceived post-operative function, and satisfaction. Correlations between demographic factors and outcomes, as well as subgroup analyses were completed to identify factors impacting patient satisfaction.
Ninety percent of patients experienced sensory changes post-operatively. Numbness was the most common symptom (64%) and complete resolution occurred in 32% of patients over an average of 19 months. Patients who experienced burning were less satisfied overall with the outcome of their surgery whereas those who were informed of the risk of sensory changes pre-operatively were more satisfied overall.
Post-operative sensory disturbance is common. While most patients improve, some symptoms persist in the majority of patients without significant negative effects on satisfaction. Patients should always be advised of the risk of persistent sensory alterations around the surgical site to increase the likelihood of their satisfaction post-operatively.
To document and assess the available evidence regarding single bundle, hamstrings autograft preparation techniques for Anterior Cruciate Ligament reconstruction (ACLR) and provide graft preparation options for different clinical scenarios.
Three online databases (Embase, PubMed and Ovid [MEDLINE]) were searched from database inception until April 10, 2021. The inclusion criteria were English language studies, human studies, and operative technique studies for single bundle hamstrings autograft preparation for ACLR. Descriptive characteristics, the number of tendons, number of strands, tendon length, graft length and graft diameter were recorded. The methodological quality was assessed using the Methodological Index for Non-Randomized Studies (MINORS) instrument and the Grading of Recommendations Assessment, Development and Evaluation (GRADE) system for non-randomized and randomized studies, respectively.
The initial search yielded 5485 studies, of which 32 met the inclusion criteria. The mean MINORS score across all nonrandomized studies was 8.2 (standard deviation, SD 6.6) indicating an overall low quality of evidence. The mean MINORS score for comparative studies was 17.4 (SD 3.2) indicating a fair quality of evidence. The GRADE assessment for risk of bias in the included randomized study was low.
There were 2138 knees in 1881 participants, including 1296 (78.1%) males and 363 (21.9%) females. The mean age was 30.3 years. The mean follow-up time was 43.9 months when reported (range 16-55 months). Eleven studies utilized the semitendinosus tendon alone, while 21 studies used both semitendinosus and gracilis tendons. There were 82 (3.8%) two-strand grafts, 158 (7.4%) three-strand grafts, 1044 (48.8%) four-strand grafts, 546 (25.5%) five-strand grafts, and 308 (14.4%) six-strand grafts included. Overall, 372 (19.7%) participants had a single-tendon ACLR compared to 1509 (80.2%) participants who had a two-tendon ACLR.
The mean graft diameter was 9.4mm when reported. The minimum semitendinosus and gracilis tendon lengths necessary ranged from 210-280mm and 160-280mm respectively. The minimum graft length necessary ranged from 63-120mm except for an all-epiphyseal graft in the paediatric population that required a minimum length of 50mm. The minimum femoral, tibial, and intra-articular graft length ranged from 15-25mm, 15-35mm and 20-30mm respectively. Thirteen studies detailed intra-operative strategies to increase graft size such as adding an extra strand or altering the tibial and/or femoral fixation strategies to shorten and widen the graft.
Two studies reported ACL reinjury or graft failure rate. One study found no difference in the re-injury rate between four-, five- and six-strand grafts (p = 0.06) and the other found no difference in the failure rate between four- and five-strand grafts (p = 0.55). There was no difference in the post-operative Lysholm score in three studies that compared four- and five-strand ACLR. One of the five studies that compared post-operative IKDC scores between graft types found a difference between two- and three-strand grafts, favoring three-strand grafts.
There are many single bundle hamstrings autograft preparation techniques for ACLR that have been used successfully with minimal differences in clinical outcomes. There are different configurations that may be utilized interchangeably depending on the number, size and length of tendons harvested to obtain an adequate graft diameter and successful ACLR.
Direct oral anticoagulant (DOAC) use is becoming more widespread in the geriatric population. Depending on the type of DOAC, several days are required for its anticoagulant effects to dissipate, which may lead to surgical delays. This can have an important impact on hip fracture patients who require surgery. The goal of the current study is to compare surgical delays, mortality and complications for hip fracture patients who were on a DOAC to those who were not.
A retrospective cohort study was conducted at a university hospital in Sherbrooke. All hip fracture patients between 2012 and 2018 who were on a DOAC prior to their surgery were included. These patients were matched with similar patients who were not on an anticoagulant (non-DOAC) for age, sex, type of fracture and date of operation. Demographic and clinical data were collected for all patients. Surgical delay was defined as time of admission to time of surgery. Mortality and complications up to one year postoperative were also noted.
Each cohort comprised 74 patients. There were no statistically significant differences in Charlson Comorbidity Index and American Society of Anesthesiologists scores between cohorts. Surgical delay was significantly longer for DOAC patients (36.3±22.2 hours vs. 18.6±18.9 hours, p < 0.001). Mortality (6.1%) and overall complication (33.8%) rates were similar between the two cohorts. However, there were more surgical reinterventions in DOAC patients than non-DOAC ones (16.2% vs. 0.0%, p < 0.001). Among DOAC patients, mortality was greater for those operated after 48 hours (23.1% vs. 3.3%, p < 0.05) and complications were more frequent for those operated after 24 hours (52.0% vs. 37.5%, p < 0.05).
Direct oral anticoagulant (DOAC) use in hip fracture patients is associated with longer surgical delays. Longer delays to surgery are associated with higher mortality and complication rates in hip fracture patients taking a DOAC. Hip fracture patients should have their surgery performed as soon as medically possible, regardless of anticoagulant use.
While controversy remains as to the relative benefit of operative (OM) versus non-operative management (NOM) of Achilles tendon ruptures (ATR), few studies have examined the effect on high-impact maneuvers such as jumping and hopping. The purpose of this study is to compare functional performance and musculotendinous morphology in patients following OM or NOM for acute ATR.
Eligible patients were aged 18-65 years with an ATR, underwent OM or NOM within three weeks of injury, and were at least one year post-injury. Gastrocnemius muscle thickness and Achilles tendon length and thickness were assessed with ultrasound. Functional performance was examined with single-leg hop tests and isokinetic plantar flexion strength at 60°/s and 120°/s.
Twenty-four participants completed testing (12 per group). Medial (OM: 2.2 ± 0.4 cm vs 1.9 ± 0.3 cm, NOM 2.15 ± 0.5 cm vs 1.7 ± 0.5 cm; p = 0.002) and lateral (OM 1.8 ± 0.3 cm vs 1.5 ± 0.4 cm, NOM 1.6 ± 0.4 cm vs 1.3 ± 0.5 cm; p = 0.008) gastrocnemius thickness were reduced on the affected limb. The Achilles tendon was longer (OM: 19.9 ± 2.2 cm vs 21.9 ± 1.6 cm; NOM: 19.0 ± 3.7 cm vs 21.4 ± 2.9 cm; p = 0.009) and thicker (OM: 0.48 ± 0.16 cm vs 1.24 ± 0.20 cm; NOM: 0.54 ± 0.08 cm vs 1.13 ± 0.23 cm; p < 0.001) on the affected limb with no differences between groups. Affected-limb plantar flexion torque at 20° plantar flexion was reduced at 60°/s (OM: 55.6 ± 20.2 Nm vs 47.8 ± 18.3 Nm; NOM: 59.5 ± 27.5 Nm vs 44.7 ± 21.0 Nm; p = 0.06) and 120°/s (OM: 44.6 ± 17.9 Nm vs 36.6 ± 15.0 Nm; NOM: 48.6 ± 16.9 Nm vs 35.8 ± 10.7 Nm; p = 0.028) with no group effect. There was no difference in single-leg hop performance. Achilles tendon length explained 31.6% (p = 0.003) and 18.0% (p = 0.025) of the variance in plantar flexion peak torque limb symmetry index (LSI) at 60°/s and 120°/s respectively. Tendon length explained 28.6% (p=0.006) and 9.5% (p = 0.087) of LSI when torque was measured at 20° plantar flexion at 60°/s and 120°/s respectively. Conversely, tendon length did not predict affected-limb plantar flexion peak torque (Nm), angle-specific torque at 20° plantar flexion (Nm), affected-limb single-leg hop distance (cm), or LSI (%).
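The limb symmetry index referred to above is conventionally the affected-limb value expressed as a percentage of the unaffected limb. A sketch applying that convention to the OM group's mean peak torques at 120°/s from the results (an illustrative calculation on group means, not an individual patient's LSI):

```python
# Limb symmetry index (LSI): affected-limb value as a percentage of the
# unaffected limb. Inputs are the OM group's mean torques at 120°/s from the
# results above; applying the formula to group means is illustrative only.

def lsi(affected, unaffected):
    return 100.0 * affected / unaffected

value = lsi(36.6, 44.6)  # ~82%: a roughly 18% plantar flexion deficit
```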
There was no difference in tendon length between treatment groups, and deficits in gastrocnemius thickness and strength persisted. Deficits in the plantar flexion strength LSI are partially explained by increased tendon length following Achilles tendon rupture, regardless of treatment strategy. Hop test performance was maintained and may reflect compensatory movements at other joints despite persistent plantar flexion weakness.
The aim of this study was to determine the incidence, annual trend, and perioperative outcomes of early-onset (≤ 90 days) deep surgical site infection (SSI) following primary total knee arthroplasty (TKA) for osteoarthritis, and to identify risk factors for early-onset deep SSI.
We performed a retrospective population-based cohort study using prospectively collected patient-level data from several provincial administrative data repositories between January 2013 and March 2020. The diagnosis of early-onset deep SSI was based on published Centers for Disease Control and Prevention/National Healthcare Safety Network (CDC/NHSN) definitions. The Mann-Kendall Trend Test was used to detect monotonic trends in early-onset deep SSI rates over time. The effects of various patient and surgical risk factors for early-onset deep SSI were analyzed using multiple logistic regression. Secondary outcomes were 90-day mortality and 90-day readmission.
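The Mann-Kendall test named in the methods is built on the S statistic: the sum of the signs of all pairwise differences in the time series. A pure-Python sketch; the annual infection rates below are invented for illustration:

```python
# Mann-Kendall S statistic, the basis of the trend test named above. The
# annual rates are invented for illustration; they are not study data.

def mann_kendall_s(series):
    """S well above 0 suggests an increasing monotonic trend, well below 0
    a decreasing one, and S near 0 no trend."""
    s = 0
    for i in range(len(series) - 1):
        for j in range(i + 1, len(series)):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)  # sign of each pairwise difference
    return s

# Seven hypothetical annual infection rates (%) with no clear trend:
s = mann_kendall_s([0.20, 0.18, 0.21, 0.19, 0.20, 0.17, 0.21])
```

An S near zero, as here, is consistent with no monotonic trend, matching the study's finding that the annual infection rate did not change over the study period.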
A total of 20,580 patients underwent primary TKA for osteoarthritis. Forty patients had a confirmed deep SSI within 90 days of surgery, representing a cumulative incidence of 0.19%. The annual infection rate did not change over the 7-year study period (p = 0.879). Risk factors associated with early-onset deep SSI included blood transfusions (OR, 3.93 [95% CI 1.34-9.20]; p=0.004), drug or alcohol abuse (OR, 4.91 [95% CI 1.85-10.93]; p<0.001), and surgeon volume of less than 30 TKA per year (OR, 4.45 [1.07-12.43]; p=0.013). Early-onset deep SSI was not associated with 90-day mortality (OR, 11.68 [0.09-90.58]; p=0.217), but was associated with an increased chance of 90-day readmission (OR, 50.78 [26.47-102.02]; p<0.001).
This study establishes a reliable baseline infection rate for early-onset deep SSI after TKA for osteoarthritis through a robust methodological process. Several risk factors for early-onset deep SSI are potentially modifiable or can be optimized prior to surgery, which may be effective in reducing the incidence of early-onset SSI. This could guide the formulation of provincial screening programs and identify patients at high risk for SSI.
This study aims to 1) determine reported cannabis use among patients waiting for thoracolumbar surgery and to 2) identify demographics and health differences between cannabis-users and non-cannabis users.
This observational cohort study is a retrospective national multicenter review of data from the Canadian Spine Outcomes and Research Network registry. Patients were dichotomized as cannabis users and non-users. Variables of interest were: age, sex, BMI, smoking status, education, work status, exercise, modified Oswestry Disability Index (mODI), the Numerical Rating Scales (NRS) for leg and back pain, tingling/numbness scale, SF-12 Quality of Life Questionnaire - Mental Health Component (MCS), use of prescription cannabis, recreational cannabis, and narcotic pain medication. Continuous variables were compared using an independent t-test and categorical variables were compared using chi-square analyses.
Cannabis use was reported by 28.4% of pre-operative patients (N=704), 47% of whom used prescription cannabis. Cannabis use was reported most often by patients in Alberta (43.55%), British Columbia (38.09%), and New Brunswick (33.73%). Patients who reported using cannabis were significantly younger (mean 52.9 versus 61.21 years). There was a higher percentage of concurrent narcotic use (51.54%) and smoking (21.5%) in cannabis users in comparison to non-users (41.09%, p=0.001; 9.51%, p=0.001, respectively). There were significant differences in cannabis use based on pathology (p=0.01). Patients who reported using cannabis had significantly worse MCS scores (difference=3.93, p=0.001) and PHQ-8 scores (difference=2.51, p=0.001). There was a significant difference in work status (p=0.002), with cannabis users reporting higher rates (20%) of being employed but not working compared to non-users (11.13%). Non-users were more likely to be retired (45.92%) compared to cannabis users (31.31%). There were no significant differences based on cannabis use for sex, education, exercise, NRS-back, NRS-leg, tingling-leg, mODI, or health state.
Thoracolumbar spine surgery patients are using cannabis before surgery, both recreationally and by prescription. Patients using cannabis pre-operatively did not differ from non-users with regard to reported pain or disability, though they did differ in demographic and mental health variables.
Diffuse-type Tenosynovial Giant-Cell Tumour (d-TGCT) of large joints is a rare, locally aggressive, soft tissue tumour affecting predominantly the knee. Previously classified as Pigmented Villonodular Synovitis (PVNS), this monoarticular disease arises from the synovial lining and is more common in younger adults. Given the diffuse and aggressive nature of this tumour, local control is often difficult and recurrence rates are high. The current literature comprises primarily small, and a few larger but heterogeneous, observational studies. Both arthroscopic and open synovectomy techniques, or combinations thereof, have been described for the treatment of d-TGCT of the knee.
There is, however, no consensus on the best approach to minimize recurrence of d-TGCT of the knee. Limited evidence suggests that a staged, open, anterior and posterior synovectomy may reduce recurrence. To our knowledge, no case series has specifically examined the recurrence rate of d-TGCT of the knee following a staged, open, posterior and anterior approach. We hypothesized that this approach may yield lower recurrence rates, as suggested by larger, more heterogeneous series.
A retrospective review of the local pathology database was performed to identify all cases of d-TGCT or PVNS of the knee treated surgically at our institution over the past 15 years. All cases were treated by a single fellowship-trained orthopaedic oncology surgeon, using a consistent, staged, open, posterior and anterior approach for synovectomy. All cases were confirmed by histopathology and followed-up with regular repeat MRI to monitor for recurrence. Medical records of these patients were reviewed to extract demographic information, as well as outcomes data, specifically recurrence rate and complications. Any adjuvant treatments or subsequent surgical interventions were noted.
Twenty-three patients with a minimum follow-up of two years were identified. Mean age at the time of treatment was 36.3 years. There were 10 females and 13 males. Mean follow-up was seven and a half years. Fourteen of 23 (60.9%) had no previous treatment; five had a previous arthroscopic synovectomy, one a previous combined anterior arthroscopic and posterior open synovectomy, and three a previous open synovectomy. Mean time between stages was 87 days (2.9 months). Seven of 23 (30.4%) patients had a recurrence. Of these, three of seven (42.9%) were treated with Imatinib and four of seven (57.1%) with repeat surgery (three arthroscopic, one open).
Recurrence rates of d-TGCT in the literature vary widely but tend to be high. In our retrospective study, a staged, open, anterior and posterior synovectomy provided recurrence rates lower than those previously reported in the literature. These findings support prior data suggesting this approach may result in lower recurrence rates for this highly recurrent, difficult-to-treat tumour.
Orthopaedic surgeons prescribe more opioids than any other surgical speciality. Opioids remain the analgesic of choice following arthroscopic knee and shoulder surgery. There is growing evidence that opioid-sparing protocols may reduce postoperative opioid consumption while adequately addressing patients’ pain. However, there is a lack of prospective, comparative trials evaluating their effectiveness. The objective of the current randomized controlled trial (RCT) was to evaluate the efficacy of a multi-modal, opioid-sparing approach to postoperative pain management in patients undergoing arthroscopic shoulder and knee surgery.
The NO PAin trial is a pragmatic, definitive RCT (NCT04566250) enrolling 200 adult patients undergoing outpatient shoulder or knee arthroscopy. Patients are randomly assigned in a 1:1 ratio to an opioid-sparing group or standard of care. The opioid-sparing group receives a three-pronged prescription package consisting of 1) a non-opioid prescription: naproxen, acetaminophen and pantoprazole, 2) a limited opioid “rescue prescription” of hydromorphone, and 3) a patient education infographic. The control group is the current standard of care as per the treating surgeon, which consists of an opioid analgesic. The primary outcome of interest is oral morphine equivalent (OME) consumption up to 6 weeks postoperatively. The secondary outcomes are postoperative pain scores, patient satisfaction, quantity of OMEs prescribed and number of opioid refills. Patients are followed at both 2 and 6 weeks postoperatively. Data analysts and outcome assessors are blinded to the treatment groups.
As of December 1, 2021 we have enrolled 166 patients, reaching 83% of target enrolment. Based on the current recruitment rate, we anticipate that enrolment will be completed by the end of January 2022 with final follow-up and study close out completed by March of 2022. The final results will be released at the Canadian Orthopaedic Association Meeting in June 2022 and be presented as follows. The mean difference in OME consumption was XX (95%CI: YY-YY, p=X). The mean difference in OMEs prescribed was XX (95%CI: YY-YY, p=X). The mean difference in Visual Analogue Pain Scores (VAS) and patient satisfaction are XX (95%CI: YY-YY, p=X). The absolute difference in opioid refills was XX (95%CI: YY-YY, p=X).
The results of the current study will demonstrate whether an opioid sparing approach to postoperative outpatient pain management is effective at reducing opioid consumption while adequately addressing postoperative pain in patients undergoing outpatient shoulder and knee arthroscopy. This study is novel in the field of arthroscopic surgery, and its results will help to guide appropriate postoperative analgesic management following these widely performed procedures.
Meniscal tears are the most common knee injuries, occurring in acute ruptures or in chronic degenerative conditions. Meniscectomy and meniscal repair are two surgical treatment options. Meniscectomy is easier, faster, and the patient can return to their normal activities earlier. However, this procedure has long-term consequences in the development of degenerative changes in the knee, potentially leading to knee replacement. On the other hand, meniscal repair can offer prolonged benefits to the patients, but it is difficult to perform and requires longer rehabilitation.
Sutures are used for meniscal repairs, but they have limitations. They induce tissue damage when passing through the meniscus. Furthermore, under dynamic loading of the knee, they can cause tissue shearing and potentially lead to meniscal repair failure.
Our team has developed a new technology of resistant adhesive hydrogels to coat the suture used to repair meniscal tissue.
The objective of this study is to biomechanically compare two suture types on bovine menisci specimens: 1) pristine sutures and 2) gel adhesive puncture sealing (GAPS) sutures, on a repaired radial tear under cyclic tensile testing.
Five bovine knees were dissected to retrieve the menisci. A complete radial tear was created in each of the 10 menisci. They were separated into two groups and repaired using either pristine sutures (2-0 Vicryl) or GAPS sutures (2-0 Vicryl coated with adhesive hydrogel), each with a single stitch and five knots.
The repaired menisci were clamped on an Instron machine. The specimens were cyclically preconditioned between one and 10 newtons for 10 cycles and then cyclically loaded for 500 cycles between five and 25 newtons at a frequency of 0.16 Hz. The gap formed between the edges of the tear after 500 cycles was then measured using an electronic measurement device. The suture loop before and after testing was also measured to ensure that there was no suture elongation or loosening of the knot.
The groups were compared statistically using Mann-Whitney tests for nonparametric data. The level of significance was set to 0.05.
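A minimal sketch of the Mann-Whitney U statistic underlying this comparison is shown below; the gap values are hypothetical, not the study's measurements:

```python
def mann_whitney_u(x, y):
    """U statistic for group x versus group y, assigning average
    ranks to tied values."""
    combined = sorted((v, i) for i, v in enumerate(x + y))
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):
        j = i
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1  # extend over a run of tied values
        avg_rank = (i + j) / 2 + 1  # average 1-based rank of the tie group
        for k in range(i, j + 1):
            ranks[combined[k][1]] = avg_rank
        i = j + 1
    r_x = sum(ranks[:len(x)])  # rank sum of the first group
    return r_x - len(x) * (len(x) + 1) / 2

# Hypothetical gap measurements (mm): pristine vs GAPS sutures.
u = mann_whitney_u([5.2, 6.1, 4.8, 7.3, 4.6], [2.3, 2.5, 2.2, 2.6, 2.4])
```

When every value in one group exceeds every value in the other, U reaches its extreme (here 25, the product of the group sizes), which is what drives a small p-value.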
The mean gap formation of the pristine sutures was 5.61 mm (SD = 2.097) after 500 cycles of tensile testing and 2.38 mm (SD = 0.176) for the GAPS sutures. Comparing both groups, the gap formed with the coated sutures was significantly smaller (p = 0.009) than with pristine sutures. The length of the loop was equal before and after loading. Further investigation of tissue damage indicated that the gap was formed by suture filament cutting into the meniscal tissue.
The long-term objective of this research is to design a meniscal repair toolbox from which the surgeon can adapt their procedure for each meniscal tear. This preliminary experimentation on bovine menisci is promising because the new GAPS sutures seem to keep the edges of the meniscal tear together better than pristine sutures, with hopes of a clinical correlation with enhanced meniscal healing.
Prolonged bedrest in hospitalized patients is a major risk factor for venous thromboembolism (VTE), especially in high risk patients with hip fracture. Thrombelastography (TEG) is a whole blood viscoelastic hemostatic assay with evidence that an elevated maximal amplitude (MA), a measure of clot strength, is predictive of VTE in orthopaedic trauma patients. The objective of this study was to compare the TEG MA parameter between patients with hip fracture who were more mobile post-operatively and discharged from hospital early to patients with hip fracture with reduced mobility and prolonged hospitalizations post-operatively.
In this prospective cohort study, TEG analysis was performed in patients with hip fracture every 24 hours from admission until post-operative day (POD) 5, then at 2 and 6 weeks post-operatively. Hypercoagulability was defined by MA > 65. Patients were divided into an early (within 5 days) and a late (after 5 days) discharge group, an inpatient-at-2-weeks group, and groups discharged to MSK rehabilitation (MSK rehab) and long-term care (LTC). A two-sample t-test was used to analyze differences in MA between the early discharge and less mobile groups. All statistical tests were two-sided, and p-values < 0.05 were considered statistically significant.
In total, 121 patients with a median age of 81.0 were included. Patients in the early discharge group (n=15) were younger (median age 64.0) and more likely to ambulate without gait aids pre-injury (86.7%) compared to patients in the late discharge group (n=105), inpatients at 2-weeks (n=48), discharged to MSK rehab (n=30), and LTC (n=20). At two weeks post-operative, the early discharge group was significantly less hypercoagulable (MA=68.9, SD 3.0) compared to patients in the other four groups. At 6-weeks post-operative, the early discharge group was the only group to demonstrate a trend towards mean MA below the MA > 65 hypercoagulable threshold (MA=64.4, p=0.45). Symptomatic VTE events were detected in three patients (2.5%) post-operatively. All three patients had hospitalizations longer than five days after surgery.
In conclusion, our analysis of hypercoagulability secondary to reduced post-operative mobility demonstrates that patients with hip fracture who were able to mobilize independently sooner after surgery have a reduced peak hypercoagulable state. In addition, there is a trend towards earlier return to normal coagulation status as determined by the TEG MA parameter. Post-operative mobility status may play a role in determining individualized duration of thromboprophylaxis following hip fracture surgery. Future studies comparing TEG to clinically validated mobility tools may more closely evaluate the contribution of venous stasis due to reduced mobility on hypercoagulation following hip fracture surgery.
Neuromuscular scoliosis patients face rates of major complications of up to 49%. Along with pre-operative risk reduction strategies (including nutritional and bone health optimization), intra-operative strategies to decrease blood loss and surgical time may help mitigate these risks. A major contributor to blood loss and surgical time is the insertion of instrumentation, which is challenging in neuromuscular patients given their abnormal vertebral and pelvic anatomy. Standard pre-operative radiographs provide minimal information regarding pedicle diameter, length, blocks to pedicle entry (e.g. iliac crest overhang), or iliac crest orientation. To minimize blood loss and surgical time, we developed an “ultra-low dose” CT protocol without sedation for neuromuscular patients.
Our prospective quality improvement study aimed to determine:
1) whether ultra-low dose CT without sedation was feasible given the movement disorders in this population;
2) what the radiation exposure was compared to standard pre-operative imaging;
3) whether the images allowed accurate assessment of the anatomy and intra-operative navigation, given the ultra-low dose and potential movement during the scan.
Fifteen non-ambulatory surgical patients with neuromuscular scoliosis received the standard spine XR and an ultra-low dose CT scan. Charts were reviewed for the etiology of neuromuscular scoliosis and medical co-morbidities. The CT protocol was a high-speed, high-pitch, tube-current-modulated acquisition at a fixed tube voltage. Adaptive statistical iterative reconstruction was applied to soft-tissue and bone kernels to mitigate noise. Radiation dose was quantified using reported dose indices (computed tomography dose index (CTDIvol) and dose-length product (DLP)) and effective dose (E), calculated through Monte Carlo simulation. Statistical analysis was completed using a paired Student's t-test (α = 0.05). CT image quality was assessed for its use in preoperative planning and intraoperative navigation using the 7D Surgical System Spine Module (7D Surgical, Toronto, Canada).
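The per-patient comparison of XR and CT effective doses can be sketched with a stdlib-only paired t statistic; the dose values below are hypothetical, not the study's data:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(a, b):
    """t statistic for paired samples a and b."""
    diffs = [x - y for x, y in zip(a, b)]
    # t = mean difference divided by its standard error
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Hypothetical effective doses (mSv) per patient: XR vs ultra-low dose CT.
xr = [0.5, 0.8, 0.3, 0.6, 0.4, 0.7]
ct = [0.6, 0.7, 0.5, 0.6, 0.5, 0.8]
t = paired_t(xr, ct)
```

With 5 degrees of freedom, |t| must exceed about 2.57 for p < 0.05; a smaller |t| is consistent with the non-significant dose difference the study reports.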
Eight males and seven females were included in the study, with an average age of 14±2 years, preoperative Cobb angle of 95±21 degrees, and kyphosis of 60±18 degrees. One patient was unable to undergo the ultra-low dose CT protocol without sedation due to a co-diagnosis of severe autism. The average XR radiation dose was 0.5±0.3 mSv. Variability in radiographic dose was due to a wide range in patient size, positioning (supine, sitting), number of views, imaging technique, and body habitus. Associated CT radiation metrics were CTDIvol = 0.46±0.14 mGy, DLP = 26.2±8.1 mGy.cm and E = 0.6±0.2 mSv. CT radiation variability was due to body habitus and arm orientation. The radiation dose differences between radiographic and CT imaging were not statistically significant. All CT scans had adequate quality for preoperative assessment of pedicle diameter and orientation, obstacles impeding pedicle entry, S2-Alar screw orientation, and intra-operative navigation.
“Ultra-low dose” CT scans without sedation were feasible in paediatric patients with neuromuscular scoliosis. The effective dose was similar between the standard preoperative spinal XR and “ultra-low dose” CT scans. The “ultra-low dose” CT scan allowed accurate assessment of the anatomy, aided in pre-operative planning, and allowed intra-operative navigation despite the movement disorders in this patient population.
Meniscal root tears can result from traumatic injury to the knee or gradual degeneration. When the root is injured, the meniscus becomes de-functioned, resulting in abnormal distribution of hoop stresses, extrusion of the meniscus, and altered knee kinematics. If left untreated, this can cause articular cartilage damage and rapid progression of osteoarthritis. Multiple repair strategies have been described; however, no best fixation practice has been established. To our knowledge, no study has compared suture button, interference screw, and HEALICOIL KNOTLESS fixation techniques for meniscal root repairs. The goal of this study is to understand the biomechanical properties of these fixation techniques and distinguish any advantages of certain techniques over others. Knowledge of fixation robustness will aid in surgical decision making, potentially reducing failure rates, and improving clinical outcomes.
Nineteen fresh porcine tibias with intact medial menisci were randomly assigned to four groups: 1) native posterior medial meniscus root (PMMR) (n = 7), 2) suture button (n = 4), 3) interference screw (n = 4), or 4) HEALICOIL KNOTLESS (n = 4). In 12 specimens, the PMMR was severed and then refixed by the specified group technique. The remaining seven specimens were left intact. All specimens underwent cyclic loading followed by load-to-failure testing. Elongation rate; displacement after 100, 500, and 1000 cycles; stiffness; and maximum load were recorded.
Repaired specimens had greater elongation rates and displacements after 100, 500, and 1000 cycles than native PMMR specimens (p < 0.05). The native PMMR showed a greater maximum load than all repair techniques (p < 0.05). In interference screw and HEALICOIL KNOTLESS specimens, failure occurred as the suture was displaced from the fixation and tension was gradually lost. In suture button specimens, the suture was either displaced or completely separated from the button. In some cases, tear formation and partial failure also occurred at the meniscus luggage tag knot. Native PMMR specimens failed through meniscus or meniscus root tearing.
All fixation techniques showed similar biomechanical properties and performed inferiorly to the native PMMR. The absence of significant differences between fixation techniques suggests that the HEALICOIL KNOTLESS technique may be an additional option for fixation in meniscal root repairs. While preliminary in vitro evidence suggests the fixation techniques are comparable, further research is required to determine whether clinical outcomes differ.
Despite the current trend favoring surgical treatment of displaced intra-articular calcaneal fractures (DIACFs), studies have not been able to demonstrate superior functional outcomes when compared to non-operative treatment. These fractures are notoriously difficult to reduce. Studies investigating surgical fixation often lack information about the quality of reduction, even though it may play an important role in the success of this procedure. We wanted to establish whether, among surgically treated DIACFs, an anatomic reduction led to improved functional outcomes at 12 months.
From July 2011 to December 2020, at a level I trauma center, 84 patients with an isolated DIACF scheduled for surgical fixation with plate and screws using a lateral extensile approach were enrolled in this prospective cohort study and followed over a 12-month period. Post-operative computed tomography (CT) imaging of bilateral feet was obtained to assess surgical reduction using a combination of pre-determined parameters: Böhler's angle, calcaneal height, congruence and articular step-off of the posterior facet and calcaneocuboid (CC) joint. Reduction was judged anatomic when Böhler's angle and calcaneal height were within 20% of the contralateral foot while the posterior facet and CC joint had to be congruent with a step-off less than 2 mm. Several functional scores related to foot and ankle pathology were used to evaluate functional outcomes (American Orthopedic Foot and Ankle Score - AOFAS, Lower Extremity Functional Score - LEFS, Olerud and Molander Ankle Score - OMAS, Calcaneal Functional Scoring System - CFSS, Visual Analog Scale for pain – VAS) and were compared between anatomic and nonanatomic DIAFCs using Student's t-test. Demographic data and information about injury severity were collected for each patient.
Among the 84 enrolled patients, 6 were excluded while 11 were lost to follow-up. Thirty-nine patients had a nonanatomic reduction while 35 patients (47%) had an anatomic reduction. Baseline characteristics were similar in both groups. When we compared injury severity as defined by the Sanders classification, we did not find a significant difference; in other words, the nonanatomic group did not have a greater proportion of complex fractures. Anatomically reduced DIACFs showed significantly superior results at 12 months for all but one scoring system (mean difference at 12 months: AOFAS 3.97, p = 0.12; LEFS 7.46, p = 0.003; OMAS 13.6, p = 0.002; CFSS 7.5, p = 0.037; VAS −1.53, p = 0.005). Univariate analyses did not show that smoking status, worker's compensation or body mass index were associated with functional outcomes. Moreover, fracture severity could not predict functional outcomes at 12 months.
This study showed superior functional outcomes in patients with a DIACF when an anatomic reduction is achieved regardless of the injury severity.
Acute spinal cord injury (SCI) is most often secondary to trauma, and frequently presents with associated injuries. A neurological examination is routinely performed during trauma assessment, including through Advanced Trauma Life Support (ATLS). However, there is no standard neurological assessment tool specifically used for trauma patients to detect and characterize SCI during the initial evaluation. The International Standards for Neurological Classification of Spinal Cord Injury (ISNCSCI) is the most comprehensive and popular tool for assessing SCI, but it is not adapted to acute trauma patients and is therefore not routinely used in that setting. The objective was thus to develop a new tool that can be used routinely in the initial evaluation of trauma patients to detect and characterize acute SCI, while preserving basic principles of the ISNCSCI.
The completion rate of the ISNCSCI during the initial evaluation after an acute traumatic SCI was first estimated. Using a modified Delphi technique, we designed the Montreal Acute Classification of Spinal Cord Injuries (MAC-SCI), a new tool to detect and characterize the completeness (grade) and level of SCI in the polytrauma patient. The ability of the MAC-SCI to detect and characterize SCI was validated in a cohort of 35 individuals who had sustained an acute traumatic SCI. The completeness and neurological level of injury (NLI) were assessed by two independent assessors using the MAC-SCI, and compared to those obtained with the ISNCSCI.
Only 33% of patients admitted after an acute traumatic SCI had a complete ISNCSCI performed at initial presentation. The MAC-SCI includes 53 of the 134 original elements of the ISNCSCI, a 60% reduction. There was 100% concordance between the severity grades derived from the MAC-SCI and from the ISNCSCI. Concordance of the NLI within two levels of that obtained from the ISNCSCI was observed in 100% of patients with the MAC-SCI, and within one level in 91% of patients. The ability of the MAC-SCI to discriminate between cervical (C0 to C7) vs. thoracic (T1 to T9) vs. thoraco-lumbar (T10 to L2) vs. lumbosacral (L3 to S5) injuries was 100% with respect to the ISNCSCI.
The rate of completion of the ISNCSCI is low at initial presentation after an acute traumatic SCI. The MAC-SCI is a streamlined tool proposed to detect and characterize acute SCI in polytrauma patients that is specifically adapted to the acute trauma setting. It is accurate for determining the completeness of the SCI and localizing the NLI (cervical vs. thoracic vs. lumbar). It could be implemented in the initial trauma assessment protocol to guide the acute management of SCI patients.
Distal radius fractures (DRFs) are common injuries that represent 17% of all adult upper extremity fractures. Some fractures deemed appropriate for nonsurgical management following closed reduction and casting exhibit delayed secondary displacement (greater than two weeks from injury) and require late surgical intervention, which can delay rehabilitation and functional recovery. This study aimed to determine which demographic and radiographic features can be used to predict delayed fracture displacement.
This is a multicentre retrospective case-control study using radiographs extracted from our Analytics Data Integration, Measurement and Reporting (DIMR) database, using diagnostic and therapeutic codes. Skeletally mature patients aged 18 years or older with an isolated DRF treated with surgical intervention between two and four weeks from initial injury, with two or more follow-up visits prior to surgical intervention, were included. Exclusion criteria were multiple injuries, surgical treatment with fewer than two clinical assessments prior to surgery, or surgical treatment within two weeks of injury. The proportion of patients with delayed fracture displacement requiring surgical treatment is reported as a percentage of all identified DRFs within the study period. A multivariable conditional logistic regression analysis was used to assess case-control comparisons, in order to determine the parameters most likely to predict delayed fracture displacement leading to surgical management. Intra- and inter-rater reliability for each radiographic parameter was also calculated.
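The adjusted odds ratios in the results come from conditional logistic regression; as a simpler, hypothetical illustration of how an odds ratio and its 95% confidence interval are derived from a 2x2 exposure table (Woolf's method, made-up counts, not the study's model):

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI for the 2x2 table [[a, b], [c, d]]
    (rows: cases, controls; columns: exposed, unexposed)."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Hypothetical exposure counts among cases and matched controls.
or_, lo, hi = odds_ratio_ci(40, 44, 20, 64)
```

A confidence interval that excludes 1 marks the predictor as significant, which is the criterion applied to the radiographic parameters reported in the results.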
A total of 84 age- and sex-matched pairs were identified (n=168) over a 5-year period, with 87% being female and a mean age of 48.9 (SD=14.5) years. Variables assessed in the model included pre-reduction and post-reduction radial height, radial inclination, radial tilt, volar cortical displacement, injury classification, intra-articular step or gap, ulnar variance, radiocarpal alignment, and cast index, as well as the difference between pre- and post-reduction parameters. Decreased pre-reduction radial inclination (Odds Ratio [OR] = 0.54; Confidence Interval [CI] = 0.43 – 0.64) and increased pre-reduction volar cortical displacement (OR = 1.31; CI = 1.10 – 1.60) were significant predictors of delayed fracture displacement beyond a minimum of 2-week follow-up. Similarly, an increased difference between pre-reduction and immediate post reduction radial height (OR = 1.67; CI = 1.31 – 2.18) and ulnar variance (OR = 1.48; CI = 1.24 – 1.81) were also significant predictors of delayed fracture displacement.
Cast immobilization is not without risks, and delayed surgical treatment can result in a prolonged recovery. Therefore, if reliable and reproducible radiographic parameters can be identified that predict delayed fracture displacement, this information will aid in earlier identification of patients with DRFs at risk of late displacement. This could lead to earlier, appropriate surgical management, rehabilitation, and return to work and function.
Despite total knee arthroplasty demonstrating high levels of success, 20% of patients report dissatisfaction with their result.
Wellness Stasis Socks are embedded with a proprietary pattern of neuro-receptor activation points that have been proven to activate a precise neuro-response, according to the pattern theory of haptic perception, which stimulates improvements in pain and function.
Technologies that manipulate this sensory environment, such as textured insoles, have proven effective in improving gait patterns in patients with knee osteoarthritis. For patients undergoing TKA, this new technology may prove beneficial as an adjunct to recovery, as many patients suffer further proprioceptive deficits caused by ligamentous damage and alterations to mechanoreceptors during the procedure. We hypothesized that Wellness Stasis Socks are a safe, cost-effective, and easily scalable strategy to support TKA patients through their recovery.
This was a double-blinded, placebo-controlled randomized trial. Randomization was performed using a computer-generated program. All study coordinators, healthcare personnel, and patients were blinded to patient groups. All surgical procedures were conducted using the same technique and by the same orthopaedic surgeon. Intervention group:
Cramér's V analysis indicated that sex, BMI, ASA classification, and age were not statistically different between the control and intervention groups.
There was no statistical difference between groups in preoperative WOMAC scores.
The data showed a consistent improvement in WOMAC pain and stiffness scores at two weeks post-op in the intervention group over the control group.
The WOMAC physical function scores showed a consistent improvement at both two and six weeks post-op in the intervention group compared to the control group.
There were no complications associated with sock use in either group.
The intervention proved to be a low-cost and safe addition to post-operative care after TKA, helping patients improve with regard to pain, stiffness, and physical function.
This study suggests this modality can be added to the list of commonly used post-operative interventions, such as cryocuffs, physiotherapy, and relaxation techniques, as a safe way to help patients improve after TKA and as an adjunct in providing non-narcotic pain control.
Large cartilage lesions in younger patients can be treated by fresh osteochondral allograft transplantation, a surgical technique that relies on stable initial fixation and a minimum chondrocyte viability of 70% in the donor tissue to be successful. The Missouri Osteochondral Allograft Preservation System (MOPS) may extend the time when stored osteochondral tissues remain viable. This study aimed to provide an independent evaluation of MOPS storage by evaluating chondrocyte viability, chondrocyte metabolism, and the cartilage extracellular matrix using an ovine model.
Femoral condyles from twelve female Arcott sheep (6 years, 70 ± 15 kg) were assigned to storage times of 0 (control), 14, 28, or 56 days. Sheep were assigned to standard of care [SOC, Lactated Ringer's solution, cefazolin (1 g/L), bacitracin (50,000 U/L), 4°C storage] or MOPS [proprietary media, 22-25°C storage]. Samples underwent weekly media changes. Chondrocyte viability was assessed using Calcein AM/Ethidium Homodimer and reported as percent live cells and viable cell density (VCD). Metabolism was evaluated with the Alamar blue assay and reported as Relative Fluorescent Units (RFU)/mg. Electromechanical properties were measured with the Arthro-BST, a device used to non-destructively compress cartilage and calculate a quantitative parameter (QP) that is inversely proportional to stiffness. Proteoglycan content was quantified using the dimethylmethylene blue assay of digested cartilage and distribution visualized by Safranin-O/Fast Green staining of histological sections. A two-way ANOVA and Tukey's post hoc were performed.
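The study used a two-way ANOVA (storage method by time) with Tukey's post hoc; as a simplified, hypothetical sketch of the underlying variance partitioning, a one-way F statistic across storage groups at a single time point can be computed as:

```python
from statistics import mean

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA across the given groups."""
    grand = mean(v for g in groups for v in g)
    k = len(groups)
    n = sum(len(g) for g in groups)
    # between-group and within-group sums of squares
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical percent-live-cell values for three storage conditions.
f = one_way_anova_f([92, 90, 91], [85, 84, 86], [60, 58, 62])
```

A large F relative to the critical value for the corresponding degrees of freedom indicates that at least one group mean differs, which is then localized with a post hoc test such as Tukey's.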
Compared to controls, MOPS samples had fewer live cells (p=0.0002) and lower VCD (p=0.0004) after 56 days of storage, while SOC samples had fewer live cells (p=0.0004, 28 days; p=0.0002, 56 days) and lower VCD (p=0.0002, 28 days; p=0.0001, 56 days) after both 28 and 56 days (Table 1). At 14 days, the percentage of viable cells in SOC samples was statistically the same as controls, but VCD was lower (p=0.0197). Cell metabolism in MOPS samples remained the same over the study duration, but SOC had lower RFU/mg after 28 (p=0.0005) and 56 (p=0.0001) days in storage compared to controls. These data show that MOPS maintained viability up to 28 days yet metabolism was sustained for 56 days, suggesting that the conditions provided by MOPS storage allowed fewer cells to achieve the same metabolic levels as fresh cartilage. Electromechanical QP measurements revealed no differences between storage methods at any individual time point. QP data could not be used to interpret changes over time because a mix of medial and lateral condyles were used, and they have intrinsically different properties. Proteoglycan content in MOPS samples remained the same over time, but SOC was significantly lower after 56 days (p=0.0086) compared to controls. Safranin-O/Fast Green staining showed proteoglycan diminished gradually beginning at the articular surface and progressing towards bone in SOC samples, while MOPS maintained proteoglycan over the study duration (Figure 1).
MOPS exhibited superior viability, metabolic activity and proteoglycan retention compared to SOC, but did not maintain viability for 56 days. Elucidating the effects of prolonged MOPS storage on cartilage properties supports efforts to increase the supply of fresh osteochondral allografts for clinical use.
For any figures or tables, please contact the authors directly.
Analyzing shoulder kinematics is challenging because the shoulder comprises a complex group of highly mobile joints. Unlike the elbow or knee, which have a primary flexion/extension axis, both primary shoulder joints (glenohumeral and scapulothoracic) have a large range of motion (ROM) in all three directions. As such, the shoulder joints have six degrees of freedom (DoF) (three translations and three rotations), and all of these parameters must be defined to fully describe shoulder motion. Despite the importance of glenohumeral and scapulothoracic coordination, the glenohumeral joint is the most studied joint in the shoulder. Additionally, the limited research on the scapulothoracic joint primarily focuses on planar motions such as abduction or flexion. More complex motions, such as internal rotation to the back, are rarely studied despite their importance for activities of daily living. A technique for analyzing shoulder kinematics using 4DCT has been developed and validated, and was used for this analysis. The objective of this study is to characterize glenohumeral and scapulothoracic motion during active internal rotation to the back, in a healthy young population, using a novel 4DCT approach.
Eight male participants over 18 years of age with healthy shoulder ROM were recruited. For the dynamic scan, participants performed internal rotation to the back: the hand starts on the abdomen and is moved around the torso and up the back as far as possible. The motion was unconstrained to examine variability in the motion pathway. Bone models were made from the dynamic scans and registered to neutral models from a static scan to calculate six-DoF kinematics. The resultant kinematic pathways measured over the entire motion were used to calculate the ROM for each DoF.
Results indicate that anterior tilting is the most important DoF of the scapula; all participants followed similar paths with low variation. Conversely, protraction/retraction of the scapula appears less important for internal rotation to the back: not only was its ROM the lowest, but its pathways had the highest variation between participants. Regarding glenohumeral motion, internal rotation was by far the DoF with the highest ROM, but there was also high variation in the pathways. Summation of ROM values revealed an average glenohumeral-to-scapulothoracic ratio of 1.8:1, closely matching the common 2:1 ratio other studies have measured during abduction.
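The ROM and glenohumeral-to-scapulothoracic ratio computations reduce to peak-to-peak excursions over each kinematic pathway; a minimal sketch (names and data layout are illustrative assumptions, not the authors' code):

```python
def range_of_motion(angles_deg):
    """ROM for one DoF: peak-to-peak excursion over the motion pathway."""
    return max(angles_deg) - min(angles_deg)

def gh_to_st_ratio(gh_paths, st_paths):
    """Ratio of summed glenohumeral ROM to summed scapulothoracic ROM,
    each summed over that joint's DoF pathways."""
    gh_total = sum(range_of_motion(p) for p in gh_paths)
    st_total = sum(range_of_motion(p) for p in st_paths)
    return gh_total / st_total
```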
Due to the unconstrained nature of the motion, the complex relationship between the glenohumeral and scapulothoracic joints leads to high variation in kinematic pathways. The shoulder has redundant degrees of freedom: the same end position can result from different joint angles and positions. Therefore, some individuals may rely more on scapular motion while others rely primarily on humeral motion to achieve a specific movement. More analysis is needed to identify whether direct correlations can be drawn between scapulothoracic and glenohumeral DoF. Analyzing the kinematics of the glenohumeral and scapulothoracic joints throughout motion will further improve understanding of shoulder mechanics, and future work will examine differences with age.
Giant cell tumors of bone (GCTs) are locally aggressive tumors with recurrence potential that represent up to 10% of primary bone tumors. GCT pathogenesis is driven by neoplastic mononuclear stromal cells that overexpress receptor activator of nuclear factor kappa-B ligand (RANKL). Treatment with a specific anti-RANKL antibody (denosumab) was recently introduced, used either as a neo-adjuvant in resectable tumors or as a stand-alone treatment in unresectable tumors. While denosumab has been increasingly used, a percentage of patients do not improve after treatment. Here, we aim to determine molecular and histological patterns that would help predict GCT response to denosumab and improve personalized treatment.
Nine pre-treatment biopsies of patients with spinal GCT were collected at two centres. Denosumab was used as a neo-adjuvant in four patients, as a stand-alone treatment in three, and as an adjuvant in two. Clinical data were extracted retrospectively. Total mRNA was extracted using a formalin-fixed, paraffin-embedded extraction kit, and the transcript profile of 730 immuno-oncology related genes was determined using the Pan Cancer Immune Profiling panel (NanoString). Gene expression was compared between patients with good and poor responses to denosumab using the nSolver Analysis Software (NanoString). Immunohistochemistry was performed on tissue slides to characterize cell populations and the immune response in GCTs.
Two of the nine patients showed a poor clinical response, with tumor progression and metastasis. Unsupervised hierarchical clustering revealed differences in pre-treatment gene expression between poor and good responders. Poorly responding lesions were characterized by increased expression of inflammatory cytokines such as IL8, IL1, and interferons alpha and gamma, among a myriad of cytokines and chemokines (CCL25, IL5, IL26, IL25, IL13, CCL20, IL24, IL22, etc.), while good responders were characterized by elevated expression of platelet (CD31 and PECAM), coagulation (CD74, F13A1), and classical complement pathway (C1QB, C1R, C1QBP, C1S, C2) markers, together with extracellular matrix proteins (COL3A1, FN1). The T-cell response also differed between groups: poorly responding lesions had increased Th1 and Th2 components, whereas good responders had an increased Th17 component. Interestingly, the immune checkpoint PD1 (PDCD1) was increased ~10-fold in poor responders.
This preliminary study using a novel experimental approach revealed differences in the immune response in GCTs associated with clinical response to denosumab. The increased activity of checkpoint inhibitor PD1 in poor responders to denosumab treatment may have implications for therapy, raising the potential to investigate immunotherapy as is currently used in other neoplasms. Further validation using a larger independent cohort will be required but these results could potentially identify the patients who would most benefit from denosumab therapy.
Total shoulder arthroplasty (TSA) implants have evolved to include more anatomically shaped components that replicate the native state. The geometry of the humeral head is non-spherical, with the sagittal diameter of the base of the head up to 6% (2.1-3.9 mm) larger than the frontal diameter. Despite this, many TSA humeral head implants are spherical, meaning that either the diameter must be oversized to achieve complete coverage, resulting in articular overhang, or portions of the resection plane will remain uncovered. It is suspected that load transfer between the backside of the humeral head and the cortex on the resection plane would improve if resection coverage were properly matched without overhang, thereby mitigating proximal stress shielding.
Eight paired cadaveric humeri were prepared for reconstruction with a short-stem total shoulder arthroplasty by an orthopaedic surgeon, who selected and prepared the anatomic humeral resection plane using a cutting guide and a reciprocating sagittal saw. The humeral head was resected, and the resulting cortical boundary of the resection plane was digitized using a stylus and an optical tracking system with submillimeter accuracy (Optotrak, NDI, Waterloo, ON). A plane was fit to the trace and the viewpoint was transformed to be perpendicular to the plane. To simulate optimal sizing of both circular and elliptical humeral heads, circles and ellipses were fit to the filtered traces using a least-squares error method. Two extreme scenarios were also investigated: upsizing until 100% total coverage and downsizing until 0% overhang.
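As an illustration of the sizing step, a least-squares circle fit (the Kåsa method) can size a circular head to a digitized resection trace; the ellipse fit requires a five-parameter conic and is omitted for brevity. This is a sketch under our own assumptions, not the authors' code:

```python
import math

def fit_circle(points):
    """Kasa least-squares circle fit: solve the 3x3 normal equations for
    D, E, F in x^2 + y^2 + D*x + E*y + F = 0, then recover center/radius."""
    S = [[0.0] * 4 for _ in range(3)]  # augmented 3x4 normal-equation system
    for x, y in points:
        row = (x, y, 1.0)
        b = -(x * x + y * y)
        for i in range(3):
            for j in range(3):
                S[i][j] += row[i] * row[j]
            S[i][3] += row[i] * b
    for c in range(3):  # Gauss-Jordan elimination with partial pivoting
        p = max(range(c, 3), key=lambda r: abs(S[r][c]))
        S[c], S[p] = S[p], S[c]
        for r in range(3):
            if r != c:
                f = S[r][c] / S[c][c]
                for j in range(c, 4):
                    S[r][j] -= f * S[c][j]
    D, E, F = (S[i][3] / S[i][i] for i in range(3))
    cx, cy = -D / 2.0, -E / 2.0
    return cx, cy, math.sqrt(cx * cx + cy * cy - F)
```

With the fitted circle (or ellipse) in hand, coverage and overhang follow by comparing the enclosed area against the digitized resection boundary.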
Total resection plane coverage was 98.2±0.6% for the fitted ellipses and 95.9±0.9% for the fitted circles. Cortical coverage was 79.8±8.2% and 60.4±6.9% for ellipses and circles, respectively. By switching to an ellipsoid humeral head, a small 2.3±0.3% (P < 0.001) increase in total coverage led to a 19.5±1.3% (P < 0.001) increase in cortical coverage. The overhang, defined as the percentage of the total enclosed area exceeding the bounds of the humeral resection, was 1.7±0.7% for fitted ellipses and 3.8±0.8% for fitted circles; circular heads produced 2.0±0.1% (P < 0.001) greater overhang. When upsizing until 100% resection coverage, the ellipse produced 5.4±3.5% (P < 0.001) less overhang than the circle. When upsizing, overhang increases less rapidly for the ellipsoid humeral head than the circular one (Figure 1), and full head coverage is achieved more quickly. When downsizing until 0% overhang, total coverage and cortical coverage were 7.5±2.8% (P < 0.001) and 7.9±8.2% (P = 0.01) greater for the ellipse, respectively. Cortical coverage exhibited a crossover point at −2.25% downsizing, beyond which the circular head provided more cortical coverage.
Reconstruction with ellipsoids can provide greater total resection and cortical coverage than circular humeral heads while avoiding excessive overhang. Elliptical head cortical coverage can be inferior when undersized. These initial findings suggest resection-matched humeral heads may yield benefits worth pursuing in the next generation of TSA implant design.
For any figures or tables, please contact the authors directly.
Staphylococcus aureus is the most frequently isolated organism in periprosthetic joint infections. The mechanism by which synovial fluid (SF) kills bacteria has not yet been elucidated, and a better understanding of its antibacterial characteristics is needed. We sought to analyze the antimicrobial properties of exogenous copper in human SF against S. aureus.
SF samples were collected from patients undergoing elective total knee or hip arthroplasty. Two S. aureus strains previously found to be sensitive (UAMS-1) and resistant (USA300 WT) to SF were used. We performed in-vitro growth and viability assays to determine the capability of S. aureus to survive in SF with the addition of 10µM copper. We determined the minimum bactericidal concentration of copper (MBC-Cu) and evaluated sensitivity to killing, comparing WT and CopAZB-deficient USA300 strains.
UAMS-1 showed greater sensitivity to SF than USA300 WT at 12 (p=0.001) and 24 hours (p=0.027). UAMS-1 was significantly killed by 24 hours (p=0.017), whereas USA300 WT survived. UAMS-1 was also more susceptible to the addition of copper at 4 (p=0.001), 12 (p=0.005), and 24 hours (p=0.006). We confirmed high sensitivity to killing with the addition of exogenous copper in both strains at 4 (p=0.011), 12 (p=0.011), and 24 hours (p=0.011). Both WT and CopAZB-deficient USA300 strains were significantly killed in SF, with an MBC-Cu of 50µM against USA300 WT (p=0.011).
SF has antimicrobial properties against S. aureus, and UAMS-1 was more sensitive than USA300 WT. The addition of 10µM copper was highly toxic to both strains, confirming its bactericidal effect. We demonstrated the involvement of CopAZB proteins in copper efflux by showing the high sensitivity of the mutant strain to lower copper concentrations. Thus, we propose CopAZB proteins as potential targets and exogenous copper as a possible treatment alternative against S. aureus.
Patients receiving reverse total shoulder arthroplasty (RTSA) often have osseous erosions due to glenohumeral arthritis, leading to increased surgical complexity. Glenoid implant fixation is a primary predictor of the success of RTSA and affects micromotion at the bone-implant interface. Augmented implants incorporating specific geometry to address superior erosion are currently available, but their clinical outcomes are still considered short-term. The objective of this study was to investigate micromotion at the glenoid-baseplate interface for standard, 3 mm lateralized, 6 mm lateralized, half-wedge, and full-wedge baseplates. It was hypothesized that the mechanism of load distribution from the baseplate to the glenoid would differ between implants, and that these varying mechanisms would affect overall baseplate micromotion.
Clinical CT scans of seven shoulders (mean age 69 years, 10°-19° glenoid inclination) classified as having E2-type glenoid erosions were used to generate 3D scapula models using MIMICS image processing software (Materialise, Belgium) with a 0.75 mm mesh size. Each scapula was then repeatedly virtually reconstructed with the five implant types (standard, 3 mm lateralized, 6 mm lateralized, and half/full wedge; Fig.1) positioned in neutral version and inclination with full backside contact. The reconstructed scapulae were then imported into ABAQUS (SIMULIA, U.S.) finite element software, and loads were applied simulating 15°, 30°, 45°, 60°, 75°, and 90° of abduction based on published instrumented in-vivo implant data. The micromotion normal and tangential to the bone surface, and the effective load transfer area, were recorded for each implant and abduction angle. A repeated-measures ANOVA was used for statistical analysis.
Maximum normal micromotion was significantly less with the standard baseplate (5±4 μm) than with the full-wedge (16±7 μm, p=0.004), 3 mm lateralized (10±6 μm, p=0.017), and 6 mm lateralized (16±8 μm, p=0.007) baseplates (Fig.2). The half-wedge baseplate (11±7 μm) also produced significantly less micromotion than the full-wedge (p=0.003), and the 3 mm lateralized produced less than the full-wedge (p=0.026) and 6 mm lateralized (p=0.003). Similarly, maximum tangential micromotion was significantly less with the standard baseplate (7±4 μm) than with the half-wedge (12±5 μm, p=0.014), 3 mm lateralized (10±5 μm, p=0.003), and 6 mm lateralized (13±6 μm, p=0.003) baseplates (Fig.2). The full-wedge (11±3 μm), half-wedge, and 3 mm lateralized baseplates also produced significantly less tangential micromotion than the 6 mm lateralized (p=0.027, p=0.012, p=0.02, respectively). Both normal and tangential micromotion were highest at the 30° and 45° abduction angles (Fig.2). The effective load transfer area (ELTA) was lowest for the full-wedge, followed by the half-wedge, 6 mm, 3 mm, and standard baseplates (Fig.3), and increased with abduction angle.
Glenoid baseplates with reduced lateralization and flat backside geometries produced the best outcomes with regard to normal and tangential micromotion. However, these implants are not always feasible due to the required amount of bone removal and the medialization of the bone-implant interface. Future work should study the acceptable levels of bone removal for patients with E-type glenoid erosion and the corresponding best implant selections for such cases.
For any figures or tables, please contact the authors directly.
Shoulder arthroplasty is effective at restoring function and relieving pain in patients suffering from glenohumeral arthritis; however, cortex thinning has been significantly associated with larger press-fit stems (fill ratio = 0.57 vs 0.48; P = 0.013)1. Additionally, excessively stiff implant-bone constructs are considered undesirable, as the high initial stiffness of rigid fracture fixation implants has been related to premature loosening and ultimate failure of the implant-bone interface2. Consequently, one objective driving the evolution of humeral stem design has been the reduction of stress-shielding-induced bone resorption; this, in part, has led to the introduction of short stems, which rely on metaphyseal fixation. However, the selection of short-stem diametral (i.e., thickness) sizing remains subjective, and its impact on the resulting stem-bone construct stiffness has yet to be quantified.
Eight paired cadaveric humeri (age = 75±15 years) were reconstructed with surgeon-selected ‘standard’ sized and 2mm ‘oversized’ short-stemmed implants. Standard stem sizing was based on a haptic assessment of stem and broach stability per typical surgical practice. Anteroposterior radiographs were taken, and the metaphyseal and diaphyseal fill ratios were quantified. Each humerus was then potted in polymethyl methacrylate bone cement and subjected to 2000 cycles of compressive loading representing 90º forward flexion to simulate postoperative seating. Following this, a custom 3D-printed metal implant adapter was affixed to the stem, allowing compressive loading in line with the stem axis (Fig.1). Each stem was then forced to subside 5mm at a rate of 1mm/min, and the compressive stiffness of the stem-bone construct was quantified as the slope of the linear portion of the resulting force-displacement curve.
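The stiffness measure above is the slope of the linear region of the force-displacement curve; a minimal ordinary-least-squares sketch (the fractional window used to isolate the linear region is our assumption, not stated in the abstract):

```python
def construct_stiffness(disp_mm, force_N, lo=0.25, hi=0.75):
    """Stiffness (N/mm) as the OLS slope of the central, assumed-linear
    portion of a force-displacement curve (window given as data fractions)."""
    i0, i1 = int(lo * len(disp_mm)), int(hi * len(disp_mm))
    xs, ys = disp_mm[i0:i1], force_N[i0:i1]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return num / sum((x - mx) ** 2 for x in xs)
```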
The metaphyseal and diaphyseal fill ratios were 0.50±0.10 and 0.45±0.07 for the standard-sized stems and 0.50±0.06 and 0.52±0.06 for the oversized stems, respectively. Neither correlated significantly with the stem-bone construct stiffness (metaphysis: P = 0.259; diaphysis: P = 0.529); however, the diaphyseal fill ratio differed significantly between standard and oversized stems (P < 0.001, power = 1.0). Increasing the stem size by 2mm had a significant impact on the stiffness of the stem-bone construct (P = 0.003, power = 0.971; Fig.2). Stem oversizing yielded a construct stiffness of −741±243 N/mm, more than double that of the standard stems (−334±120 N/mm).
The fill ratios reported in the present investigation match well with those of a finite element assessment of oversizing short humeral stems3. This work complements that investigation's conclusion, that small reductions in diaphyseal fill ratio may reduce the likelihood of stress shielding, by also demonstrating that oversizing stems by 2mm dramatically increases the stiffness of the resulting implant-bone construct, as stiffer implants have been associated with decreased bone stimulus4 and premature loosening2. The present findings suggest that even a small, 2mm, variation in the thickness of short stem humeral components can have a marked influence on the resulting stiffness of the implant-bone construct. This highlights the need for more objective intraoperative methods for selecting stem size to provide guidelines for appropriate diametral sizing.
For any figures or tables, please contact the authors directly.
Pain and disability following wrist trauma are highly prevalent; however, the mechanisms underlying pain are poorly understood. Recent studies in the knee have demonstrated that altered joint contact may induce changes to subchondral bone density and associated pain following trauma, due to the vascularity of the subchondral bone. To examine these changes, a depth-specific imaging technique using quantitative computed tomography (QCT) has been used. We have demonstrated the utility of QCT in measuring volumetric bone mineral density (vBMD) according to static joint contact and found differences in vBMD between healthy and previously injured wrists. However, analyzing a static joint in a neutral position is not necessarily indicative of higher or lower vBMD. Therefore, the purpose of this study is to explore the relationship between subchondral vBMD and kinematic joint contact using the same imaging technique.
To examine the relationship between kinematic joint contact and subchondral vBMD using QCT, we analyzed the wrists of n = 10 participants (n = 5 healthy and n = 5 with previous wrist trauma). Participants underwent 4DCT scans while moving from flexion to extension to estimate radiocarpal (specifically radiolunate (RL) and radioscaphoid (RS)) joint contact area (JCa) between the articulating surfaces. Participants also underwent a static CT scan with a calibration phantom of known material densities, which was used to estimate subchondral vBMD of the distal radius. Joint contact area (mm2) is measured by calculating inter-bone distances using a previously validated algorithm. Subchondral vBMD is presented as mean vBMD (mg K2HPO4/cm3) at three normalized depths from the subchondral surface (0 to 2.5, 2.5 to 5, and 5 to 7.5 mm) of the distal radius.
The participants in the healthy cohort demonstrated a larger JCa in the RS joint during both extension and flexion, while the trauma cohort demonstrated a larger JCa in the RL during extension and flexion. With regards to vBMD, the healthy cohort demonstrated a higher vBMD for all three normalized depths from the subchondral surface when compared to the trauma cohort.
Results from our preliminary analysis demonstrate that in the RL joint specifically, a larger JCa throughout flexion and extension was associated with an overall lower vBMD across all three normalized depths. A potential explanation is that altered joint contact mechanics following wrist trauma led to overloading of the RL region, and this overloading may have decreased the underlying vBMD relative to a healthy wrist. However, because our analysis is cross-sectional, we cannot conclude whether this is a transient decrease in vBMD associated with the acute healing phase following trauma. Future work should therefore analyze kinematic JCa and vBMD longitudinally to better understand how changes in kinematic JCa over time, and the healing process following wrist trauma, impact the underlying subchondral bone in the acute and long-term phases of recovery.
Revision surgeries for orthopaedic infections are done in two stages: one surgery to implant an antibiotic spacer to clear the infection and another to install a permanent implant. A permanent porous implant that can be loaded with antibiotics and allow for single-stage revision surgery would benefit patients and save healthcare resources. Gyroid structures can be constructed with high porosity, without the stress concentrations that can develop in other periodic porous structures [1] [2]. The purpose of this research is to compare the resulting bone and prosthesis stress distributions when porous versus solid stems are implanted into three proximal humeri with varying bone densities, using finite element models (FEM).
Porous humeral stems were constructed with a gyroid structure at porosities of 60%, 70%, and 80% using computer-aided design (CAD) software. These CAD models were analyzed using FEM (Abaqus) to examine the stress distributions within the proximal humerus and the stem components, with loads and boundary conditions representing the arm actively maintained at 120˚ of flexion. The stem was assumed to be made of titanium (Ti-6Al-4V). Three different bone densities were investigated, representing a healthy, an osteopenic, and an osteoporotic humerus, with an average bone shape created using a statistical shape and density model (SSDM) based on 75 cadaveric shoulders (57 males and 18 females, 73 ± 12 years) [3]. The Young's moduli (E) of the cortical and trabecular bone were defined on an element-by-element basis, with a minimum allowable E of 15 MPa. The von Mises stress distributions in the bone and the stems were compared between stem scenarios for each bone density model.
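The element-by-element modulus assignment can be sketched as a density-to-modulus power law with the stated 15 MPa floor; the power-law coefficients below are a commonly used literature-style relation chosen for illustration, not values given in the abstract:

```python
def element_modulus(rho_g_cm3, a=6850.0, b=1.49, e_min=15.0):
    """Element-wise Young's modulus (MPa) from apparent density (g/cm^3)
    via a power law E = a * rho**b, floored at the minimum allowable
    15 MPa stated in the abstract. Coefficients a, b are assumptions."""
    return max(a * rho_g_cm3 ** b, e_min)
```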
A preliminary analysis shows an increase in stress values at the proximal-lateral region of the humerus when using the porous stems compared to the solid stem, which becomes more prominent as bone density decreases. With the exception of a few mesh dependent singularities, all three porous stems show stress distributions below the fatigue strength of Ti-6Al-4V (410 MPa) for this loading scenario when employed in the osteopenic and osteoporotic humeri [4]. The 80% porosity stem had a single strut exceeding the fatigue strength when employed in the healthy bone.
The results of this study indicate that the more compliant nature of the porous stem geometries may allow for better load transmission through the proximal humeral bone, better matching the stress distributions of the intact bone and possibly mitigating stress-shielding effects. Importantly, this study also indicates that these porous stems have adequate strength for long-term use, as none were predicted to fail catastrophically under physiologically relevant loads. Although these results are limited to a single bony geometry, it is based on the average shape of 75 shoulders, and different bone densities are considered. Future work could leverage the shape model for probabilistic models exploring the effect of stem porosity across a broader population. The development of these models is instrumental in determining whether these structures are a viable solution to combatting orthopaedic implant infections.
Tibial plateau fracture reduction involves restoration of alignment and articular congruity. Restorations of sagittal alignment (tibial slope) of medial and lateral condyles of the tibial plateau are independent of each other in the fracture setting. Limited independent assessment of medial and lateral tibial plateau sagittal alignment has been performed to date. Our objective was to characterize medial and lateral tibial slopes using fluoroscopy and to correlate X-ray and CT findings.
Phase One: Eight cadaveric knees were mounted in extension. C-arm fluoroscopy was used to acquire an AP image, and the C-arm was adjusted in the sagittal plane from 15° of cephalad tilt to 15° of caudad tilt, with images captured at 0.5° increments. The “perfect AP” angle, defined as the angle that most accurately profiled the articular surface, was determined for the medial and lateral condyles of each tibia by five surgeons. Because the surgeons agreed that more than one angle provided an adequate profile of each compartment, a range of AP angles corresponding to adequate images was recorded. Phase Two: Perfect AP angles from Phase One were projected onto sagittal CT images in Horos software at the mid-medial and mid-lateral compartments to determine the precise tangent subchondral anatomic structures seen on CT, which served as dominant bony landmarks in a protocol for calculating medial and lateral tibial slopes on CT. Phase Three: 46 additional cadaveric knees were imaged with CT, and tibial slopes were determined in all 54 specimens.
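As an illustration of the Phase Two slope calculation, a tibial slope can be computed from anterior and posterior subchondral landmark points on a sagittal slice; the two-point landmark convention below is a simplified assumption, not the authors' full protocol:

```python
import math

def tibial_slope_deg(ant, post):
    """Posterior tibial slope (deg) from anterior and posterior subchondral
    landmarks (x, y) on a sagittal slice, with x increasing posteriorly and
    y increasing proximally; positive means the plateau drops posteriorly."""
    dx = post[0] - ant[0]
    drop = ant[1] - post[1]  # height lost from anterior to posterior lip
    return math.degrees(math.atan2(drop, dx))
```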
Phase One: Based on the perfect AP angle on X-ray, the mean medial slope was 4.2°+/-2.6° posterior and the mean lateral slope was 5.0°+/-3.8° posterior in eight knees. A range of AP angles adequately profiled each compartment in all specimens; this range was wider in the lateral (3.9°+/-3.8°) than the medial compartment (1.8°+/-0.7°, p=0.002). Phase Two: In plateaus with a concave shape, the perfect AP angle on X-ray corresponded with a line between the superiormost edges of the anterior and posterior lips of the plateau on CT. In plateaus with a flat or convex shape, the perfect AP angle aligned with a tangent to the subchondral surface extending from the center to the posterior plateau on CT. Phase Three: Based on the CT protocol created in Phase Two, the mean medial slope (5.2°+/-2.3° posterior) was significantly less than the lateral slope (7.5°+/-3.0° posterior) in 54 knees (p<0.001). In individual specimens, the difference between medial and lateral slopes was variable, ranging from 6.8° more laterally to 3.1° more medially. In a paired comparison of right and left knees from the same cadaver, no differences were noted between sides (medial p=0.43; lateral p=0.62).
On average there is slightly more tibial slope in the lateral plateau than medial plateau (2° greater). However, individual patients may have substantially more lateral slope (up to 6.8°) or even more medial slope (up to 3.1°). Since tibial slope was similar between contralateral limbs, evaluating slope on the uninjured side provides a template for sagittal plane reduction of tibial plateau fractures.
The opposable thumb is one of the defining characteristics of human anatomy and is involved in most activities of daily life. Lack of optimal thumb motion results in pain, weakness, and decreased quality of life. The first carpometacarpal (CMC1) joint is one of the most common sites of osteoarthritis (OA). Current clinical diagnosis and monitoring of CMC1 OA are primarily aided by X-ray radiography; however, many studies have reported discrepancies between radiographic evidence of CMC1 OA and patient-reported outcomes of pain and disability. Radiographs lack soft-tissue contrast and are insufficient for detecting early characteristics of OA, such as synovitis, which play a key role in CMC1 OA disease progression. Magnetic resonance imaging (MRI) and two-dimensional ultrasound (2D-US) are alternative options that are excellent for imaging soft-tissue pathology. However, MRI has high operating costs and long wait times, while 2D-US is highly operator dependent and provides 2D images of 3D anatomical structures. Three-dimensional ultrasound (3D-US) may address the clinical need for a rapid and safe point-of-care imaging device. The purpose of this research is to validate the use of mechanically translated 3D-US in CMC1 OA patients, assessing the measurement capabilities of the device in a clinically diverse population in comparison to MRI.
Four CMC1 OA patients were scanned using the 3D-US device, which was attached to a Canon Aplio i700 US machine with a 14L5 linear transducer (10MHz operating frequency, 58mm). Complementary MR images were acquired using a 3.0 T MRI system with an LT 3D coronal proton density cube fat-suppression sequence. The volume of the synovium was segmented from both 3D-US and MR images by two raters, and the measured volumes were compared to find volume percent differences. Paired-sample t-tests were used to determine any statistically significant differences between the volumetric measurements of the two raters and between MRI and 3D-US. Intraclass correlation coefficients (ICCs) were used to determine inter- and intra-rater reliability.
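The volume percent difference used to compare raters and modalities can be sketched as a difference relative to the mean of the two measurements (the naming and the mean-referenced convention are our assumptions):

```python
def volume_percent_difference(v1_mm3, v2_mm3):
    """Percent difference between two segmented synovial volumes,
    expressed relative to the mean of the two measurements."""
    return 100.0 * abs(v1_mm3 - v2_mm3) / ((v1_mm3 + v2_mm3) / 2.0)
```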
The mean volume percent difference between the two raters was 1.77% for 3D-US and 4.76% for MRI. The smallest percent difference in volume between raters was 0.91%, from an MR image. A paired-sample t-test demonstrated no significant difference between the volumetric values from MRI and 3D-US. ICC values of 0.99 and 0.98 for 3D-US and MRI, respectively, indicate excellent inter-rater reliability.
A novel 3D-US acquisition device was evaluated in a CMC1 OA patient population to determine its clinical feasibility and measurement capabilities in comparison to MRI. As this device is compatible with any commercially available ultrasound machine, it is accessible and easy to use, while providing a method for overcoming some of the limitations associated with radiography, MRI, and 2D-US. 3D-US has the potential to provide clinicians with a tool to quantitatively measure and monitor OA progression at the patient's bedside.
Massive irreparable rotator cuff tears often lead to superior migration of the humeral head, which can markedly impair glenohumeral kinematics and function. Although treatments exist for such pathology, no clear choice exists for the middle-aged patient demographic. Therefore, a metallic subacromial implant was developed to restore normal glenohumeral kinematics and function. The objective of this study was to determine this implant's ability to restore normal humeral head position. It was hypothesized that (1) the implant would restore near-normal humeral head position and (2) the implant shape could be optimized to improve restoration of normal humeral head position.
A titanium implant was designed and 3D printed in four variants that differed in thickness (5mm and 8mm) and curvature of the humeral articulating surface (high constraint and low constraint). Each design was assessed sequentially in a cadaver-based biomechanical testing protocol. Eight cadaver specimens (64 ± 13 years old) were loaded at 0, 30, and 60 degrees of glenohumeral abduction using a previously developed shoulder simulator. An 80N load was equally distributed across all three deltoid heads while a 10N load was applied to each rotator cuff muscle. Testing states included a fully intact rotator cuff state, a posterosuperior massive rotator cuff tear state (cuff deficient state), and the four implant designs. An optical tracking system (Northern Digital, Ontario, Canada) was used to record the translation of the humeral head relative to the glenoid in both superior-inferior and anterior-posterior directions.
Superior-Inferior Translation
The creation of a posterosuperior massive rotator cuff tear resulted in significant superior translation of the humeral head relative to the intact cuff state (P=0.016). No significant differences were observed between each implant design and the intact cuff state, as all implants decreased the superior migration of the humeral head observed in the cuff deficient state. On average, the 5mm low and high constraint implant models were most effective at restoring the humeral head to its intact cuff position (-1.3 ± 2.0mm, P=0.223 and -1.5 ± 2.3mm, P=0.928, respectively).
Anterior-Posterior Translation
No significant differences were observed across all test states for anterior-posterior translation of the humeral head. The cuff deficient state on average resulted in posterior translation of the humeral head; however, this was not statistically significant (P=0.128). The 5mm and 8mm high constraint implant designs were found to be most effective at restoring humeral head position to that of the intact cuff state, on average resulting in a small anterior offset (5mm high constraint: 2.0 ± 4.7mm, P=1.000; 8mm high constraint: 1.6 ± 4.9mm, P=1.000).
The 5mm high constraint implant was most effective in restoring normal humeral head position in both the superior-inferior and anterior-posterior directions. The results from this study suggest the implant may be an effective treatment for restoring normal glenohumeral kinematics and function in patients with massive irreparable rotator cuff tears. Future studies are needed to address the mechanical efficiency related to arm abduction which is a significant issue related to patient outcomes.
Degenerative disc disease (DDD) is a common cause of lower back pain. Calcification of the intervertebral disc (IVD) has been correlated with DDD, and is especially prevalent in scoliotic discs. The appearance of calcium deposits has been shown to increase with age, and its occurrence has been associated with several other disorders such as hyperparathyroidism, chondrocalcinosis, and arthritis. Trauma, vertebral fusion and infection have also been shown to increase the incidence of IVD calcification. Our data indicate that Ca2+ and expression of the extracellular calcium-sensing receptor (CaSR) are significantly increased in mild to severely degenerative human IVDs. In this study, we evaluated the effects of Ca2+ and CaSR on the degeneration and calcification of IVDs.
Human donor lumbar spines of Thompson grade 2, 3 and 4 were obtained through organ donations within 24 hours after death. IVD cells from the nucleus pulposus (NP) and annulus fibrosus (AF) were isolated from tissue by sequential digestion with Pronase followed by Collagenase. Cells were expanded for 7 days under standard cell culture conditions. Immunohistochemistry was performed on IVD tissue to validate the grade and expression of CaSR. Free calcium levels were also measured and compared between grades. Immunocytochemistry, Western blotting and RT-qPCR were performed on cultured NP and AF cells to demonstrate expression of CaSR, the matrix proteins aggrecan and collagen, catabolic enzymes and calcification markers. IVD cells were cultured in increasing concentrations of Ca2+ [1.0-5.0 mM], a CaSR allosteric agonist (cinacalcet, 1 uM), or IL-1b [5 ng/mL] for 7 days. Ex vivo IVD organ cultures were prepared using the PrimeGrowth Disc Isolation System (Wisent Bioproducts, Montreal, Quebec). IVDs were cultured in 1.0 or 2.5 mM Ca2+ or with cinacalcet for 21 days to determine effects on disc degeneration, calcification and biomechanics. Complex modulus and structural stiffness of disc tissues were determined using the MACH-1 mechanical testing system (Biomomentum, Laval, Quebec).
Ca2+ dose-dependently decreased matrix protein synthesis of proteoglycan and Col II in NP and AF cells, similar to treatment with IL-1b (n = 4). In contrast to IL-1b, Ca2+ and cinacalcet did not significantly increase the expression of catabolic enzymes save ADAMTS5. Similar effects were observed in whole organ cultures, as Ca2+ and cinacalcet decreased proteoglycan and collagen content. Although both Ca2+ and cinacalcet increased the expression of alkaline phosphatase (ALP), only in Ca2+-treated IVDs was there evidence of calcium deposits in NP and AF tissues as determined by von Kossa staining. Biomechanical studies on Ca2+- and cinacalcet-treated IVDs demonstrated decreases in complex modulus (p<0.01 and p<0.001, respectively; n=5); however, only in Ca2+-treated IVDs were there significant increases in stiffness in NP and AF tissues (p<0.001 and p<0.05, respectively; n=3).
Our results suggest that changes in the local concentration of calcium and activation of CaSR affect matrix protein synthesis, calcification and IVD biomechanics. Ca2+ may be a contributing factor in IVD degeneration and calcification.
Emerging evidence suggests preoperative opioid use may increase the risk of negative outcomes following orthopedic procedures. This systematic review evaluated the impact of preoperative opioid use in patients undergoing shoulder surgery with respect to preoperative clinical outcomes, postoperative complications, and postoperative dependence on opioids.
EMBASE, MEDLINE, CENTRAL, and CINAHL were searched from inception to April 2021 for studies reporting preoperative opioid use and its effect on postoperative outcomes or opioid use. The search, data extraction and methodologic assessment were performed in duplicate for all included studies.
Twenty-one studies with a total of 257,301 patients were included in the final synthesis; 17 were level III evidence. Overall, 51.5% of the patients reported preoperative opioid use. Fourteen studies (66.7%) reported a higher likelihood of opioid use at follow-up among patients who used opioids preoperatively compared to preoperatively opioid-naïve patients. Eight studies (38.1%) showed lower functional measurements and range of motion in the opioid group compared to the non-opioid group postoperatively.
Preoperative opioid use in patients undergoing shoulder surgery is associated with lower postoperative functional scores and range of motion. Most concerning, preoperative opioid use may predict increased postoperative opioid requirements and potential for misuse.
The 2020-2021 Canadian Residency Matching Service (CaRMS) match year was altered on an unprecedented scale. Visiting electives were cancelled at a national level, and the CaRMS interview tour was moved to a virtual model. These changes posed a significant challenge to both prospective students and program directors (PDs), requiring each party to employ alternative strategies to distinguish themselves throughout the match process. For a variety of reasons, including a decline in applicant interest secondary to reduced job prospects, the field of orthopaedic surgery was identified as vulnerable to many of these changes, creating a window of opportunity to evaluate their impacts on students and recruiting residency programs.
This longitudinal survey study was disseminated to match-year medical students (3rd and 4th year) with an interest in orthopaedic surgery, as well as orthopaedic surgery program directors. Responses to the survey were collected using an electronic form designed in Qualtrics (Qualtrics, 2021, Provo, Utah, USA). Students were contacted through social media posts, as well as by snowball sampling methods through appropriate medical student leadership intermediates. The survey was disseminated to all 17 orthopedic surgery program directors in Canada.
A pre-match and post-match iteration of this survey were designed to identify whether expectations differed from reality regarding the effect of the COVID-19 pandemic on the CaRMS match 2020-2021 process. A similar package was disseminated to Canadian orthopaedic surgery program directors pre-match, with an option to opt-in for a post-match survey follow-up. This survey had a focus on program directors’ opinions of various novel communication, recruitment, and assessment strategies, in the wake of the COVID-19 pandemic.
Students’ responses to the loss of visiting electives were negative: despite a reduction in financial stress associated with the reduced need to travel (p=0.001), visiting electives were identified as a core component of the clerkship experience. In the case of virtual interviews, students’ initial trepidation pre-CaRMS turned into a positive outlook post-CaRMS (significant improvement, p=0.009), indicating overall satisfaction with the virtual interview format despite some concerns about a reduced capacity to network. Program directors and selection committee faculty also felt positively about the virtual interview format. Both students and program directors were overwhelmingly positive about virtual events put on by school programs and student-led initiatives to complement the CaRMS tour.
CaRMS was initially developed to facilitate the matching process for both students and programs alike. We hope to continue this tradition of student-led and student-informed change by providing three evidence-based recommendations. First, visiting electives should not be discontinued in future iterations of CaRMS if at all possible. Second, virtual interviews should be considered as an alternative approach to the CaRMS interview tour moving forward. And third, ongoing virtual events should be associated with a centralized platform from which programs can easily communicate virtual sessions to their target audience.
Novel immersive virtual reality (IVR) technologies are revolutionizing medical education. Virtual anatomy education using head-mounted displays allows users to interact with virtual anatomical objects, move within the virtual rooms, and interact with other virtual users. While IVR has been shown to be more effective than textbook learning and 3D computer models presented in 2D screens, the effectiveness of IVR compared to cadaveric models in anatomy education is currently unknown. In this study, we aim to compare the effectiveness of IVR with direct cadaveric bone models in teaching upper and lower limb anatomy for first-year medical students.
A randomized, double-blind crossover non-inferiority trial was conducted. Participants were first-year medical students from a single university. Exclusion criteria included students who had undertaken prior undergraduate or graduate degrees in anatomy. In the first stage of the study, students were randomized in a 1:1 ratio to IVR or cadaveric bone groups studying upper limb skeletal anatomy. All students were then crossed over and used cadaveric bone or IVR to study lower limb skeletal anatomy. All students in both groups completed a pre- and post-intervention knowledge test. The educational content was based on the University of Toronto Medical Anatomy Curriculum. Oculus Quest 2 headsets (Meta Technologies) and the PrecisionOS Anatomy application (PrecisionOS Technology) were utilized for the virtual reality component. The primary endpoint of the study was student performance on the pre- and post-intervention knowledge tests. We hypothesized that student performance in the IVR group would be comparable to the cadaveric bone group.
Fifty first-year medical students met inclusion criteria and were computer randomized (1:1 ratio) to the IVR or cadaveric bone group for upper limb skeletal anatomy education. Forty-six students attended the study; 21 completed the upper limb modules, and 19 completed the lower limb modules. Among all students, the average score on the pre-intervention knowledge test was 14.6% (standard deviation (SD)=18.2%) for the upper limb and 25.0% (SD=17%) for the lower limb. The increase in students’ scores between the pre- and post-intervention knowledge tests for the upper limb was 15% for IVR and 16.7% for cadaveric bones (p = 0.2861); for the lower limb, the increase was 22.6% in the IVR group and 22.5% in the cadaveric bone group (p = 0.9356).
In this non-inferiority crossover randomized controlled trial, we found no significant difference between student performance in knowledge tests after using IVR or cadaveric bones. Immersive virtual reality and cadaveric bones were equally effective in skeletal anatomy education. Going forward, with advances in VR technologies and anatomy applications, we can expect to see further improvements in the effectiveness of these technologies in anatomy and surgical education. These findings have implications for medical schools having challenges in acquiring cadavers and cadaveric parts.
Excessive resident duty hours (RDH) are a recognized issue with implications for physician well-being and patient safety. A major component of the RDH concern is on-call duty. While considerable work has been done to reduce resident call workload, there is a paucity of research in optimizing resident call scheduling. Call coverage is scheduled manually rather than demand-based, which generally leads to over-scheduling to prevent a service gap. Machine learning (ML) has been widely applied in other industries to prevent such issues of a supply-demand mismatch. However, the healthcare field has been slow to adopt these innovations. As such, the aim of this study was to use ML models to 1) predict demand on orthopaedic surgery residents at a level I trauma centre and 2) identify variables key to demand prediction.
Daily surgical handover emails over an eight-year (2012-2019) period at a level I trauma centre were collected. The following data were used to calculate demand: spine call coverage, date, and the number of operating rooms (ORs), traumas, admissions and consults completed. Various ML models (linear, tree-based and neural networks) were trained to predict the workload, and their results were compared to the current scheduling approach. Model quality was assessed using the area under the receiver operating characteristic curve (AUC) and prediction accuracy. The ten most important variables were extracted from the most successful model.
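The two quality metrics named above can be computed from model scores without any ML library; the labels and scores below are invented for illustration (the study's actual demand definition and features are not reproduced here):

```python
def auc(labels, scores):
    """Area under the ROC curve via the rank (Mann-Whitney U) formulation:
    the probability a random positive is scored above a random negative."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
               for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

def accuracy(labels, scores, threshold=0.5):
    """Fraction of correct high/low-demand calls at a fixed score threshold."""
    return sum((s >= threshold) == bool(y)
               for y, s in zip(labels, scores)) / len(labels)

# Hypothetical high-demand labels for four call days and model scores:
y = [0, 0, 1, 1]
p = [0.1, 0.4, 0.35, 0.8]
print(auc(y, p), accuracy(y, p))  # 0.75 0.75
```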
During training, the model with the highest AUC and accuracy was the multivariate adaptive regression splines (MARS) model, with an AUC of 0.78±0.03 and accuracy of 71.7%±3.1%. During testing, the model with the highest AUC and accuracy was the neural network model, with an AUC of 0.81 and accuracy of 73.7%. All models outperformed the current approach, which had an AUC of 0.50 and accuracy of 50.1%. Key variables used by the neural network model were (in descending order): spine call duty, year, weekday/weekend, month, and day of the week.
This was the first study attempting to use ML to predict the service demand on orthopaedic surgery residents at a major level I trauma centre. Multiple ML models were shown to be more appropriate and accurate at predicting the demand on surgical residents compared to the current scheduling approach. Future work should look to incorporate predictive models with optimization strategies to match scheduling with demand in order to improve resident well-being and patient care.
Most cost containment efforts in public health systems have focused on regulating the use of hospital resources, especially operative time, so maximizing the efficiency of limited operative time is important. Typically, hospital operating room (OR) time is scheduled in two tiers: 1) master surgical scheduling (annual allocation of time between surgical services and surgeons) and 2) daily scheduling (a surgeon's selection of cases per operative day). Master surgical scheduling is based on a hospital's annual case mix and depends on the annual throughput rate per case type, which in turn depends on the efficiency of surgeons’ daily scheduling. However, daily scheduling is predominantly performed manually, requiring the human planner to reason simultaneously about unknowns such as case-specific length of surgery and its variability while attempting to maximize throughput. This often leads to OR overtime and likely sub-optimal throughput. In contrast, scheduling using mathematical optimization methods can produce maximum system efficiency and is extensively used in the business world. As such, the purpose of our study was to compare the efficiency of 1) manual and 2) optimized OR scheduling at an academic-affiliated community hospital representative of most North American centres.
Historic OR data were collected over a four-year period for seven surgeons. The actual schedule, surgical durations, overtime and number of OR days were extracted. These data were first configured to represent the historic manual scheduling process and were then used as input to an integer linear programming model with the goal of determining the minimum number of OR days needed to complete the same number of cases without exceeding the historic overtime values. Parameters included the use of a different quantile of each case type's surgical duration to ensure a schedule within five percent of the historic overtime value per OR day.
All surgeons saw a median 10% (range: 9.2% to 18.3%) reduction in the number of OR days needed to complete their annual case-load compared to their historical scheduling practices. Meanwhile, the OR overtime varied by a maximum of 5%. The daily OR configurations differed from historic configurations in 87% of cases. In addition, the number of configurations per surgeon was reduced from an average of six to four.
Our study demonstrates a significant increase in OR throughput rate (10%) with no change in operative time required. This has considerable implications in terms of cost reduction, surgical wait lists and surgeon satisfaction. A limitation of this study is that the potential gains are relative to the efficiency of the pre-existing manual scheduling at our hospital. However, given the range of scenarios tested, the number of surgeons included and the similarity of our hospital's size and configuration to the majority of North American hospitals with an orthopedic service, these results are generalizable. Further optimization may be achieved by taking into account factors that could predict case duration, such as surgeon experience, patient characteristics, and institutional attributes, via machine learning.
In patients admitted to hospital with a hip fracture, urinary issues are common. Despite guidelines that recommend avoiding Foley catheter use when possible, it remains a common part of perioperative care. To date, there are no prospective data on the safety of and satisfaction associated with catheter use in this cohort. The aim of this study was to evaluate patient satisfaction with the use of a Foley catheter while awaiting hip fracture surgery, and the safety associated with catheter use.
From our prospectively collected database, 587 patients were admitted to our tertiary care center over a 1-year period. Most patients (328) were catheterized within the first 24 hours of admission, with the catheter primarily inserted in the ED. Of these, 119 patients (61 catheterized and 58 non-catheterized) completed a questionnaire about their perioperative management with Foley catheter use, administered on day 1 of admission. This was used to determine satisfaction with catheter use (if catheterized) and pain levels (associated with the catheter, or with transferring/voiding if not catheterized). Adverse effects related to catheter use included urinary tract infection (UTI) and post-operative urinary retention (POUR).
Ninety-five percent of catheterized patients found the catheter to be convenient, and only 5% reported any pain with catheter use. In contrast, 47.5% of non-catheterized patients found it difficult to move to the bathroom and 30.4% found it difficult to urinate. Catheterized patients had significantly less pain than non-catheterized patients (0.62/10 vs 2.45/10, p < 0.001). The use of a nerve block reduced pain levels amongst catheterized patients but was not associated with reduced pain levels or improved satisfaction amongst non-catheterized patients. Catheter use was not associated with an increased risk of UTI (17.5% catheterized vs 13.3% non-catheterized, p = 0.541) or POUR (6.8% catheterized vs 11.1% non-catheterized, p = 0.406).
This study illustrates the benefits and safety associated with the use of urinary catheters in the preoperative period amongst hip fracture patients. Catheter use was associated with reduced pain and greater satisfaction without increasing postoperative UTI or POUR. These findings suggest that preoperative catheter use benefits patients awaiting hip surgery for whom other measures, such as nerve blocks, are unlikely to reduce the discomfort associated with the mobility required to void. A prospective randomized controlled study could lead to a more evidence-based approach to perioperative Foley catheter use in hip fracture patients.
Conferences centered around surgery suffer from gender disparity, with male faculty having a more dominant presence in meetings compared to female faculty. Of all surgical specialties, orthopedic surgery possibly suffers the most from this problem, which is reflective of a gender disparity in the field. The objective of this study was to investigate the prevalence of “manels”, or male-only sessions, in eight major orthopedic surgery meetings hosted in 2021 and to quantify the differences in location of practice, academic position, years of practice, and research qualifications between male and female faculty.
Eight orthopedic conferences organized by major orthopedic associations (AAOS, COA, OTA, EFORT, AAHKS, ORS, NASS, and AOSSM) from February 2021 to November 2021 were analyzed. Meeting information was retrieved from the conference agendas, and details of chairs and speakers were obtained from LinkedIn, Doximity, the CPSO, personal websites, and Web of Science. Primary outcomes included: 1) the percentage of male faculty in all included sessions and 2) the overall percentage of manels. Secondary outcomes included: 1) the percentage of male speakers and chairs in all included sessions and 2) the overall percentage of male-chair-only and male-speaker-only sessions. Outcomes were compared between conferences and session topics (adult reconstruction hip, adult reconstruction knee, practice management/rehabilitation, trauma, sports, general, pediatrics, upper extremity, musculoskeletal oncology, foot and ankle, spine, and miscellaneous). The mean number of sessions for male and female faculty was compared after stratification into quartiles based on publications, sum of times cited, and H-index. Data were analyzed with non-parametric analyses, chi-square tests, or independent samples t-tests using SPSS version 28.0.0.0, with a p-value of < 0.05 considered statistically significant.
Of 193 included sessions, 121 (62.3%) were manels, and the mean percentage of included faculty who were male was 88.9%. Apart from the topics of practice management/rehabilitation and musculoskeletal oncology, male representation was very high. Additionally, most included conferences had an extremely high percentage of male representation, apart from meetings hosted by the COA and ORS. Non-manel sessions had a greater mean number of chairs (p=0.006), speakers (p < 0.001), and faculty (p < 0.001) than manel sessions. Of 1080 total included faculty members, 960 (88.9%) were male. Male faculty were more likely than female faculty to be orthopedic surgeons (p < 0.001) and to hold academic rank as a professor. The mean number of sessions between male and female faculty within their respective quartiles of H-index, sum of times cited, and number of publications did not differ significantly. Mean years of practice between male and female faculty was also not significantly different.
There is a high prevalence of manels and an overall lack of female representation in orthopedic meetings. Orthopedic associations should make efforts to increase gender equity in future meetings.
While surgeon-industry relationships in orthopaedics have a critical role in advancing techniques and patient outcomes, they also present the potential for conflict of interest (COI) and increased risk of bias in surgical education. Consequently, robust processes for disclosing and mitigating potential COI have been adopted across educational institutions, professional societies, and specialty journals. Recent years have seen marked growth in the use of online video-based surgical education platforms that are commonly used by both trainees and practicing surgeons. However, it is unclear to what extent the same COI disclosure and mitigation principles are adhered to on these platforms. Thus, the purpose of the present study was to evaluate the frequency and adequacy of potential COI disclosure on orthopaedic online video-based educational platforms.
We retrospectively reviewed videos from a single, publicly-accessible online peer-to-peer orthopaedic educational video platform (VuMedi) that is used as an educational resource by a large number of orthopaedic trainees across North America. The 25 highest-viewed videos were identified for each of 6 subspecialty areas (hip reconstruction, knee reconstruction, shoulder/elbow, foot and ankle, spine and sports). A standardized case report form was developed based on the COI disclosure guidelines of the American Academy of Orthopaedic Surgery (AAOS) and the Journal of Bone and Joint Surgery. Two reviewers watched and assessed each video for presentation of any identifiable commercial products or brand names, disclosure of funding source for video, and presenter's potential conflict of interest. Additionally, presenter disclosures were cross-referenced against commercial relationships reported in the AAOS disclosure database to determine adequacy of disclosure. Any discrepancies between reviewers were resolved by consensus wherever possible, or with adjudication by a third reviewer when necessary.
Out of 150 reviewed videos, only 37 (25%) included a disclosure statement of any kind. Sixty-nine (46%) videos involved the presentation of a readily identifiable commercial orthopaedic device, implant or brand. Despite this, only 13 of these (19%) included a disclosure of any kind, and only 8 were considered adequate when compared to the presenter's disclosures in the AAOS database. In contrast, 83% of the presenters of the videos included in this study reported one or more commercial relationships in the AAOS disclosure database.
Videos of presentations given at conferences and/or academic meetings had significantly greater rates of disclosure as compared to those that were not (41% vs 14%; p=0.004). Similarly, disclosures associated with conference/meeting presentations had significantly greater rates of adequacy (21% vs 7%; p=0.018). Even so, less than half of the educational videos originating from a conference or meeting included a disclosure of any kind, and only about half of these were deemed adequate. No differences were seen in the rate of disclosures between orthopaedic subspecialties (p=0.791).
Online orthopaedic educational videos commonly involve presentation of specific, identifiable commercial products and brands, and the large majority of presenters have existing financial relationships with potential for conflict of interest. Despite this, the overall rate of disclosure of potential conflict of interest in these educational videos is low, and many of these disclosures are incomplete or inadequate. Further work is needed to better understand the impact of this low rate of disclosure on orthopaedic education both in-training and in practice.
Ankle fractures are common orthopedic injuries, often requiring operative intervention to restore joint stability, improve alignment, and reduce the risk of post-traumatic ankle arthritis. However, ankle fracture surgeries (AFSs) are associated with significant postoperative pain, typically requiring postoperative opioid analgesics. In addition to putting patients at risk of opioid dependence, the adverse effects of opioids include nausea, vomiting, and altered mental status which may delay recovery. Peripheral nerve blocks (PNBs) offer notable benefits to the postoperative pain profile when compared to general or spinal anaesthesia alone and may help improve recovery. The primary objective of this quality improvement (QI) study was to increase PNB administration for AFS at our institution to above 50% by January 2021.
A root cause analysis was performed by a multidisciplinary team to identify barriers to PNB administration. Four interventions were chosen and implemented: recruitment and training of anesthesiologists with expertise in regional anesthesia techniques, procurement of additional ultrasound machines, implementation of a dedicated block room with training to create an enhanced learning environment, and development of an educational pamphlet for patients outlining strategies to manage rebound pain, instructions on the use of oral multimodal analgesia, and the potential for transient motor block of the leg.
The primary outcome was the percentage of patients who received PNB for AFS. Secondary outcome measures included total hospitalization length of stay (LOS), post-anesthesia care unit (PACU) and 24-hour postoperative opioid consumption (mean oral morphine equivalent [OME]), proportion of patients requiring opioid analgesic in PACU, and proportion of patients experiencing post-operative nausea and/or vomiting (PONV) requiring antiemetic in PACU. Thirty-day post-operative emergency department (ED) visits were collected as a balance measure.
The groups receiving and not receiving PNB included 78 and 157 patients, respectively, with no significant differences in age, gender, or ASA class between groups. PNB administration increased from less than 10% to 53% following implementation of the improvement bundle. Mean total hospital LOS did not vary significantly between the PNB and no-PNB groups (1.04 days vs. 1.42 days, P = 0.410). Both mean PACU and mean 24-hour postoperative opioid analgesic consumption were significantly lower in the PNB group than in the no-PNB group (OME in PACU 38.96mg vs. 55.42mg [P = 0.001]; 24-hour OME 44.74mg vs. 37.71mg [P = 0.008]). A greater proportion of patients in the PNB group did not require any PACU opioid analgesics compared to the no-PNB group (62.8% vs. 27.4%, P < 0.001). The proportions of patients experiencing PONV and requiring antiemetic in the PACU did not vary significantly across groups. Thirty-day postoperative ED visits did not vary significantly across groups.
By performing a root cause analysis and implementing a multidisciplinary, patient-centered QI bundle, we achieved significant increases in PNB administration for AFS. As a result, there were significant improvements in the recovery of patients following AFS, specifically reduced use of postoperative opioid analgesia. This multi-faceted approach provides a framework for an individualized QI approach to increase PNB administration and achieve improved patient outcomes following AFS.
Increased collection of patient-reported outcome measures (PROM) in registries enables international comparison of patient-centered outcomes after knee and hip replacement. We aimed to investigate 1) variations in PROM improvement, 2) the possible confounding factor of BMI, and 3) differences in comorbidity distributions between registries.
Registries affiliated with the International Society of Arthroplasty Registries (ISAR) or OECD membership countries were invited to report aggregate EQ-5D, OKS, OHS, HOOS-PS and KOOS-PS values. Eligible patients underwent primary total, unilateral knee or hip replacement for osteoarthritis within three years and had completed PROMs preoperatively and either 6 or 12 months postoperatively, excluding patients with subsequent revisions. For each PROM cohort, Chi-square tests were performed for BMI distributions across registries and 12 predefined PROM strata (male/female, age 20-64/65-74/>75, high or low preoperative PROM scores). Comorbidity distributions were reported for available comorbidity indexes.
Thirteen registries from 9 countries contributed data: n≈130,000 knee (range 140 to 79,848) and n≈113,000 hip (range 137 to 85,281). Mean EQ-5D index values (10 registries) ranged from 0.53 to 0.71 (knee) and 0.50 to 0.70 (hip) preoperatively, and 0.78 to 0.85 (knee) and 0.83 to 0.87 (hip) postoperatively. Mean OKS (6 registries) ranged from 19.3 to 23.6 preoperatively and 36.2 to 41.2 postoperatively. Mean OHS (7 registries) ranged from 18.0 to 23.2 preoperatively and 39.8 to 44.2 postoperatively. Four registries reported KOOS-PS and three reported HOOS-PS. Proportions of patients with BMI >30 ranged from 35 to 62% (10 knee registries) and 16 to 43% (11 hip registries). For both knee and hip registries, distributions of patients across six BMI categories differed significantly among registries. The highest proportions of patients with BMI >30 were in the youngest age groups (20 to 64 and 65 to 74 years) with the lowest baseline scores. Additionally, females with the lowest preoperative PROM scores had the highest BMI. These findings were echoed in the OHS and OKS cohorts. Proportions of patients with ASA scores ≥3 ranged from 7 to 42% (9 knee registries) and 6 to 35% (8 hip registries).
PROM-score improvement varies between international registries, which may be partially explained by differences in age, sex and preoperative scores. BMI and comorbidity may be relevant to adjust for.
The purpose of this study was to assess the knowledge acquired from completing online case-based e-learning modules. A secondary objective was to identify how students use these independent resources and gauge their level of support for this novel instructional strategy.
Fourth year medical students were randomized to either a module or control group. Both groups received the standard musculoskeletal medical school curriculum, while students in the module group were also given access to case-based online modules created to illustrate and teach important orthopaedic concepts related to unique clinical presentations. The first module depicted an athlete with an acute knee dislocation, while the second module portrayed a patient with hip pain secondary to femoroacetabular impingement (FAI). All participating students completed a knowledge quiz designed to evaluate the material presented in the module topics, as well as general musculoskeletal concepts taught in the standard curriculum. Following the quiz, the students were invited to share their thoughts on the learning process in a focus-group setting, as well as through an individual survey. Demographic data were also collected to gauge students' exposure to and interest in orthopaedics, emergency medicine, and anatomy, and any prior relevant experience outside of medicine.
Twenty-five fourth year medical students participated in the study with 12 randomized to the module group and 13 to the control group. The regression revealed students in the module group did on average 18.5 and 31.4 percentage points better on the knee and hip quizzes respectively, compared to the control group, which were both significant with a p-value < 0.01. Additionally, students who had completed an orthopaedics elective did 20 percentage points better than those who had not, while there was no significant improvement in students who had just completed their core orthopaedics rotation. The feedback collected from the survey and small group discussion was positive with students wishing more modules were available prior to musculoskeletal clinical skills sessions and their orthopaedics rotations.
Medical students given access to online case-based e-learning modules enjoyed the innovative teaching strategy and performed significantly better on knowledge quizzes than their classmates who only received the standard musculoskeletal curriculum.
Musculoskeletal (MSK) disorders continue to be a major cause of pain and disability worldwide. The mission statement of the Canadian Orthopaedic Association (COA) is to “promote excellence in orthopaedic and musculoskeletal health for Canadians,” and orthopaedic surgeons serve as leaders in addressing and improving musculoskeletal health.
However, patients with MSK complaints most commonly present first to a primary care physician. According to a survey of family physicians in British Columbia, 13.7-27.8% of patients present with a chief complaint that is MSK-related (Pinney and Regan, 2001). Therefore, providing excellent MSK care to Canadians requires that all physicians, especially those involved in primary care, be adequately trained to diagnose and treat common MSK conditions. To date, there has been no assessment of the total mandatory MSK training Canadian family medicine residents receive. It is also unclear, despite the prevalence of MSK complaints among Canadian patients, whether current family physicians are competent or confident in their ability to provide fundamental MSK care. The purpose of this study is to determine the amount of mandatory MSK training Canadian family medicine residents are currently receiving.
Web-based research was used to determine how many weeks of mandatory MSK training were incorporated into current Canadian family medicine residency training programs. This information was gathered from either the Canadian Resident Matching Service website (carms.ca) or the residency program's individual website. If the information was not available on a program's website, a program administrator was contacted via email to ascertain it directly. MSK training was considered to be any rotation in orthopaedic surgery, spine surgery, sports medicine, or physiatry.
A total of 156 Canadian family medicine residency training sites were identified. Information pertaining to mandatory MSK education was collected for 150 sites (95.5%). Of the 150 training sites, 102 (68%) did not incorporate any mandatory MSK training into their curriculum. Of the 48 programs that did, the average amount of MSK training was 3.37 weeks. Thirty-two of the 48 programs (66.7%) included 4 weeks of MSK training, which represents 3.8% of a 2-year training program.
Current Canadian family medicine residents are not receiving sufficient musculoskeletal training when compared to the overall frequency of musculoskeletal presentations in the primary care setting. Understanding current family medicine physicians’ surveyed confidence and measured competence with respect to diagnosing and treating common musculoskeletal disorders could also prove helpful in demonstrating the need for increased musculoskeletal education. Future orthopaedic initiatives could help enhance family medicine MSK training.
The number of women entering medical school has been steadily increasing over the past two decades; however, the number of women pursuing careers in orthopaedic surgery has not increased at the same rate. One of the suggested reasons for this discrepancy is the perceived incompatibility of having a family while upholding the demands of a surgical career in orthopaedics. A growing body of scientific literature has also outlined the increased rate of infertility and pregnancy complications in women surgeons. The extent to which these factors play a role in the recruitment and retention of women in orthopaedic surgery is unknown. Understanding pregnancy and parenthood in orthopaedic surgery is a critical first step in addressing this issue.
A scoping review was conducted to identify literature pertaining to the perceptions and experiences of pregnancy and/or parenthood of women in orthopaedic surgery. Embase, MEDLINE and PsycINFO were searched on June 7th, 2021 with Boolean operators to combine the following terms: orthop?e*, pregnancy, maternity, motherhood, parenthood, parental, and parenting. Studies pertaining to orthopaedic surgery residents, fellows and staff were included. The Arksey and O'Malley framework for scoping studies was followed. Descriptive statistics were used to quantify the included studies while thematic analysis as described by Braun and Clarke was used to analyze the qualitative data.
A total of 17 studies from 2006 to 2021 met inclusion criteria. Over half of the available research was conducted within the last two years (n=9, 53%). The majority of studies were conducted in the United States (n=15, 88%) and the United Kingdom (n=2, 12%). The most commonly used study design was survey-based research (n=13, 76%), followed by review studies (n=3, 18%), and case series (n=1, 6%). Thematic analysis revealed five key themes contributing to the women's experiences of pregnancy and/or parenthood in orthopaedics: (1) women are subtly or blatantly discouraged from becoming pregnant by their colleagues and superiors, (2) women delay childbearing to preserve their professional reputation, (3) there are higher rates of infertility and preterm labor in orthopaedic surgeons than in the general population, (4) the orthopaedic work environment can be hazardous and challenging for the pregnant woman, but accommodations are possible to mitigate risks, and (5) overall, there is limited support for pregnant and/or parenting women in orthopaedics throughout their career.
The first woman to be board-certified in orthopaedic surgery in the United States was Ruth Jackson in 1937. Eighty-four years later, orthopaedic surgery has the lowest number of women of the surgical specialties. The barriers related to pregnancy and/or parenthood during a woman's career in orthopaedics may be one cause. This study identified five themes related to pregnancy and parenthood that warrant further investigation. Qualitative research approaches can be used to elucidate the details of women's experiences and to provide suggestions for structural changes in the orthopaedic work environment.
In the current healthcare environment, cost containment has become more important than ever. Perioperative services are often scrutinized as they consume more than 30% of North American hospitals’ budgets. The procurement, processing, and use of sterile surgical inventory is a major component of the perioperative care budget and has been recognized as an area of operational inefficiency. Although a recent systematic review supported the optimization of surgical inventory reprocessing as a means to increase efficiency and eliminate waste, there is a paucity of data on how to actually implement this change. A well-studied and established approach to implementing organizational change is Kotter's Change Model (KCM). The KCM process posits that organizational change can be facilitated by a dynamic 8-step approach and has been increasingly applied in healthcare settings to facilitate the implementation of quality improvement (QI) interventions. We performed an inventory optimization (IO) to improve inventory and instrument reprocessing efficiency for the purpose of cost containment using the KCM framework. The purpose of this QI project was to implement the IO using the KCM, overcome organizational barriers to change, and measure key outcome metrics related to surgical inventory and corresponding clinician satisfaction. We hypothesized that the KCM would be an effective method of implementing the IO.
This study was conducted at a tertiary academic hospital across the four highest-volume surgical services: Orthopedics, Otolaryngology, General Surgery, and Gynecology. The IO was implemented using the steps outlined by the KCM (Figure 1): 1) create a coalition, 2) create a vision for change, 3) establish urgency, 4) communicate the vision, 5) empower broad-based action, 6) generate short-term wins, 7) consolidate gains, and 8) anchor change. This process was evaluated using inventory metrics (total inventory reduction and depreciation cost savings), operational efficiency metrics (reprocessing labour efficiency and case cancellation rate), and clinician satisfaction.
The implementation of KCM is described in Table 1. Total inventory was reduced by 37.7% with an average tray size reduction of 18.0%. This led to a total reprocessing time savings of 1333 hours per annum and labour cost savings of $39 995 per annum. Depreciation cost savings was $64 320 per annum. Case cancellation rate due to instrument-related errors decreased from 3.9% to 0.2%. The proportion of staff completely satisfied with the inventory was 1.7% pre-IO and 80% post-IO.
This was the first study to show the success of applying KCM to facilitate change in the perioperative setting with respect to surgical inventory. We have outlined the important organizational obstacles faced when making changes to surgical inventory. The same KCM protocol can be followed for optimization processes for disposable versus reusable surgical device purchasing or perioperative scheduling. Although increasing efforts are being dedicated to quality improvement and efficiency, institutions will need an organized and systematic approach such as the KCM to successfully enact changes.
For any figures or tables, please contact the authors directly.
Knee arthroscopy with debridement is commonly performed to treat osteoarthritis and degenerative meniscal tears in older adults; however, robust evidence does not support a sustained benefit from this procedure. Current Canadian guidelines advise against its use as first-line treatment. Characterizing the use of this low-value procedure will facilitate efforts to maximize quality of care, minimize harm, and decrease healthcare costs. We sought to understand:
1) the volume and variations of arthroscopic knee debridement across Canada;
2) the costs associated with potentially unnecessary arthroscopy; and
3) the characteristics of surgeons performing knee arthroscopy in older adults.
Data were derived from the National Ambulatory Care Reporting System (NACRS), the Discharge Abstract Database (DAD), and the National Physician Database for the years 2011-12 to 2019-20. The study included all elective knee arthroscopies (CCI codes 1.VG.80.DA, 1.VG.80.FY, and 1.VG.87.DA) performed in day surgery and acute care settings in 9 provinces and 3 territories of Canada. Quebec was not included in the analysis due to different reporting methods. We set a threshold of 60 years of age, above which it would be highly unlikely that a patient would undergo arthroscopy to treat anything other than osteoarthritis or a degenerative meniscal tear. Trends at national and provincial levels were analyzed using regression. Costs were estimated separately using the 2020 case mix groups (CMG) and comprehensive ambulatory care classification system (CACS) methodologies. Surgeons were classified by decade of graduation from medical school (1989 and prior, 1990-99, 2000-09, and 2010+) and categorized based on whether the proportion of their patients ≥ 60 years of age was above (“high proportion inappropriate”) or below (“low proportion inappropriate”) the overall national proportion.
The number of knee arthroscopies decreased by 37% (42,785 in 2011-12 to 27,034 in 2019-20) overall and by 39% (11,103 in 2011-12 to 6,772 in 2019-20) in those 60 years and older. A substantial proportion of surgeons performed more than 25% of their knee arthroscopies in patients 60 years and older. Fifty-four percent of surgeons who graduated prior to 1989 were considered high proportion inappropriate, whereas only 30.1% of surgeons who graduated in 2010 or later were considered high proportion inappropriate (p < 0.0001).
Knee arthroscopy continues to be a common procedure in patients over 60 despite strong evidence of a lack of benefit. Lower rates in this population in some provinces are encouraging and point to an opportunity for improvement. Efforts at practice change should be targeted at surgeons who have been in practice the longest. Canada spends over $12,000,000 per year on this procedure; decreasing its use could allow these resources to be directed to other areas of orthopaedics that provide higher-value care.
The use of cannabis is increasingly medically relevant as it is legalized and gains broader acceptance. However, the effects of marijuana use on postoperative outcomes following orthopaedic surgery have not been well characterized. This study seeks to illuminate the relationship between marijuana use and the incidence of postoperative complications, including DVT, PE, nonunion, and infection, following common orthopaedic procedures.
This study was conducted using a national orthopaedic claims insurance database. We identified all patients undergoing knee arthroscopy, shoulder arthroscopy, operatively managed long bone fractures (humerus, femur, tibia and/or fibula, and radius and/or ulna), and single-level lumbar fusion. The proportion of patients within each surgery cohort who had a diagnostic code for marijuana dependence was assessed. The rates of DVT, PE, and infection within 90 days were assessed for all patients. The rate of nonunion was assessed for the long bone fracture and lumbar fusion cohorts. Univariate analyses of marijuana dependence on all outcomes were performed, followed by a multivariate logistic regression analysis controlling for known patient comorbidities.
We identified 1,113,944 knee arthroscopy, 747,938 shoulder arthroscopy, 88,891 lumbar fusion, and 37,163 long bone fracture patients. Out of the 1,987,936 patients, 24,404 patients had a diagnostic code for marijuana dependence. Within all four surgical subgroups, the marijuana dependence cohort experienced increased rates of infection, PE, and DVT, as well as increased rates of nonunion in the lumbar fusion and long bone fracture populations. In the multivariate analyses controlling for a variety of patient risk factors including tobacco use, marijuana dependence was identified as an independent risk factor for infection within all four surgical subgroups (Knee: OR 1.85, p < 0.001; Shoulder: OR 1.65, p < 0.001; Spine: OR 1.45, p < 0.001; Long bone: OR 1.28, p < 0.001), and for nonunion in the lumbar fusion (OR 1.38, p < 0.001) and long bone fracture (OR 1.31, p < 0.001) subgroups.
Our data suggest that marijuana dependence may be associated with increased rates of infection and nonunion following a variety of orthopaedic procedures. During preoperative evaluation, surgeons may consider marijuana use as a potential risk factor for postoperative complications, especially within the context of marijuana legalization. Future research into this relationship is necessary.
In Canada, hip and knee replacements are each among the top three surgeries performed annually. In 2020, surgeries across the country were cancelled in response to the COVID-19 pandemic. We examined the impact on these joint replacement surgeries throughout the year.
Using the Discharge Abstract Database and National Ambulatory Care Reporting System, we developed a dataset of all 208,041 hip and knee replacements performed in Canada (excluding Quebec) between January 1, 2019 and December 31, 2020. We compared patient and surgical characteristics in 2020 versus 2019, including sex, age, main diagnosis, and type of surgery (planned/urgent, primary/revision, inpatient/day surgery).
In 2020, hip and knee replacement volumes decreased by 18.8% compared to 2019. In April and May 2020, hip and knee replacements fell by 69.4% and 93.8%, respectively, compared to the same period in 2019. During those months, 66.5% of hip replacements were performed to treat hip fracture versus 20.2% in April and May 2019, and 64.5% of knee replacements were primaries versus 93.0% in April and May 2019. Patterns by patient age group and sex were similar to 2019, and these patterns held across all provinces. By the summer, planned surgeries had resumed across the country, and volumes mostly returned to pre-pandemic monthly levels by the end of the year. We also found an increase in the proportion of hip and knee replacements done as day surgery (4% in 2020 versus 1% in 2019). Patients undergoing day surgery replacement for osteoarthritis were older, with a median age of 64 for hip patients and 65 for knee patients, versus 63 for both joints the previous year.
As a result of the COVID-19 pandemic, there was a notable drop in the number of hip and knee replacements performed in Canada in 2020. With the demand for joint replacements continuing to grow, the resulting backlog will have an immediate, significant impact on wait lists and patient quality of life. The shift toward a greater proportion of joint replacements performed as day surgeries may affect patient outcomes as well as access to care. It will be important to continue to monitor patient outcomes following day surgery and the impact on patients for whom day surgery was not an option.
In recent literature, the fragility index (FI) has been used to evaluate the robustness of statistically significant findings for dichotomous outcomes. This metric is defined as the minimum number of outcome events that would need to change to flip study conclusions from significant to nonsignificant. The orthopaedic literature is frequently found to be fragile, with a median FI of 2 in 150 RCTs across spine, hand, sports medicine, trauma, and orthopaedic oncology studies. While many papers discuss limitations of the FI, we aimed to further characterize it by introducing the Fragility Likelihood (FL), a new metric that incorporates the probability of the event occurring and quantifies the likelihood that this fragility threshold is reached.
We systematically reviewed all randomized controlled trials in the Journal of Bone and Joint Surgery (Am) over 10 years. The FL was calculated with the following formula: A × B × C × 100%, where A = FI, B = probability of the event in the group with the smallest number of events, and C = probability of the non-event in the group with the highest number of events. A smaller FL indicates more robust results; conversely, a larger FL indicates a higher likelihood that the fragility threshold is reached, and therefore more fragile findings.
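As a sketch of how the formula above could be applied, the following Python function computes the FL from a trial's two-group outcome counts. The function name and the example counts are hypothetical illustrations, not data from the study; with equal group sizes, the group with the fewest events is also the group with the lowest event probability.

```python
def fragility_likelihood(fi, events_a, n_a, events_b, n_b):
    """FL = A x B x C x 100%, where A is the fragility index (FI),
    B is the probability of the event in the group with fewer events,
    and C is the probability of the non-event in the group with more
    events (formula as stated in the abstract)."""
    if events_a / n_a <= events_b / n_b:
        small, large = (events_a, n_a), (events_b, n_b)
    else:
        small, large = (events_b, n_b), (events_a, n_a)
    p_event_small = small[0] / small[1]          # B
    p_nonevent_large = 1 - large[0] / large[1]   # C
    return fi * p_event_small * p_nonevent_large * 100

# Two hypothetical trials with the same FI of 2 but different event
# rates yield very different FLs, illustrating the abstract's point
# that equal FIs can mask unequal robustness.
print(fragility_likelihood(2, 5, 100, 20, 100))   # 2 * 0.05 * 0.80 * 100 = 8.0
print(fragility_likelihood(2, 30, 100, 45, 100))  # 2 * 0.30 * 0.55 * 100 = 33.0
```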
The median FI for the statistically significant outcomes was 2 (mean 3.8; range 0-23). The median FL for the statistically significant outcomes was 11% (mean 22%; range 2%-73%). This means that, when the probability of the event occurring is considered, the probability of reaching non-significance is only 11%. When comparing studies with the same FI, we found the FL to range from 3% to 43%. This illustrates the large differences in robustness between trials with equal FI once the likelihood of the event is taken into account.
As orthopaedic studies are frequently reported as fragile, we found that, by calculating the FL, studies may be more robust than previously assumed based on the FI alone. Using the FL in conjunction with the FI and p-values will provide additional insight into the robustness of reported outcomes. Our results indicate that study conclusions are stronger than what the FI alone predicts. Although conducting RCTs in surgery can be challenging, we must endeavor to critically evaluate our results so we can answer important orthopaedic questions with certainty.
Despite the current trend favoring surgical treatment of displaced intra-articular calcaneal fractures (DIACFs), studies have not demonstrated superior functional outcomes compared to non-operative treatment. These fractures are notoriously difficult to reduce. Studies investigating surgical fixation often lack information about the quality of reduction, even though it may play an important role in the success of the procedure. We wanted to establish whether, among surgically treated DIACFs, an anatomic reduction led to improved functional outcomes at 12 months.
From July 2011 to December 2020, at a level I trauma center, 84 patients with an isolated DIACF scheduled for surgical fixation with plate and screws using a lateral extensile approach were enrolled in this prospective cohort study and followed over a 12-month period. Post-operative computed tomography (CT) imaging of bilateral feet was obtained to assess surgical reduction using a combination of pre-determined parameters: Böhler's angle, calcaneal height, congruence and articular step-off of the posterior facet and calcaneocuboid (CC) joint. Reduction was judged anatomic when Böhler's angle and calcaneal height were within 20% of the contralateral foot while the posterior facet and CC joint had to be congruent with a step-off less than 2 mm. Several functional scores related to foot and ankle pathology were used to evaluate functional outcomes (American Orthopedic Foot and Ankle Score - AOFAS, Lower Extremity Functional Score - LEFS, Olerud and Molander Ankle Score - OMAS, Calcaneal Functional Scoring System - CFSS, Visual Analog Scale for pain - VAS) and were compared between anatomic and nonanatomic DIACFs using Student's t-test. Demographic data and information about injury severity were collected for each patient.
Among the 84 enrolled patients, 6 were excluded and 11 were lost to follow-up. Thirty-nine patients had a nonanatomic reduction while 35 patients had an anatomic reduction (47%). Baseline characteristics were similar in both groups. When we compared injury severity as defined by the Sanders classification, we did not find a significant difference; in other words, the nonanatomic group did not have a greater proportion of complex fractures. Anatomically reduced DIACFs showed significantly superior results at 12 months for all but one scoring system (mean difference at 12 months: AOFAS 3.97, p = 0.12; LEFS 7.46, p = 0.003; OMAS 13.6, p = 0.002; CFSS 7.5, p = 0.037; VAS −1.53, p = 0.005). Univariate analyses did not show that smoking status, worker's compensation, or body mass index were associated with functional outcomes. Moreover, fracture severity did not predict functional outcomes at 12 months.
This study showed superior functional outcomes in patients with a DIACF when an anatomic reduction is achieved regardless of the injury severity.
Reported wound complication rates in below-knee surgery can be quite high. A recent study demonstrated that increased blood loss and hematoma formation increase wound complications, especially in foot and ankle surgery. Despite the evidence of the benefit of tranexamic acid (TXA) on blood loss in TKA and THA, it is not routinely used by surgeons in below-knee surgery.
This study aimed to assess the efficacy and safety of TXA in reducing wound complications and blood loss, and its risk of thromboembolic complications, in patients undergoing below-knee surgery. A systematic literature search of PubMed, Embase, Ovid, the Cochrane Library, and AAOS and AOFAS conference proceedings was conducted. The primary outcome was the rate of wound complications. Data were analyzed using Review Manager 5.3 software.
Nine studies involving 861 patients met the inclusion criteria. The meta-analysis indicated that TXA, when compared to a control group, reduced wound complications (OR 0.54; 95% CI, 0.31 to 0.95; p = 0.03), blood loss (MD = −149.4 ml; 95% CI, −205.3 ml to −93.6 ml), post-operative drainage (MD = −169.8 ml; 95% CI, −176.7 to −162.9 ml), and hemoglobin drop (MD = −8.75 g/dL; 95% CI, −9.6 g/dL to −7.8 g/dL). There was no significant difference in thromboembolic events (RR 0.53; 95% CI, 0.15 to 1.90; p = 0.33).
This study demonstrated that TXA could be used in below-knee surgery to reduce wound complications and blood loss without increasing thromboembolic complications. The small number of studies limits interpretation of the findings. Further studies are needed to confirm these results.
There is limited literature on the effects of socioeconomic factors on outcomes after total ankle arthroplasty (TAA). In the setting of hip or knee arthroplasty, patients of lower socioeconomic status demonstrate poorer post-operative satisfaction, longer lengths of stay, and greater functional limitations. It is important to ascertain whether this phenomenon is present in ankle arthritis patients. This is the first study to examine how patients across socioeconomic classes differ in the benefit they derive from ankle arthroplasty.
This is a retrospective cohort study of 447 patients who underwent a TAA. Primary outcomes included pre-operative and final follow-up AAOS pain, AAOS disability, and SF-36 scores. We used postal codes to determine median household income from Canadian 2015 census data. Incomes were divided into five equal-width groups spanning the range of incomes, a method previously used to study medical conditions such as COPD and cardiac disease. These income groups were then compared for differences in outcome measures. Statistical analysis was done using unpaired t-tests.
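A minimal sketch of the income grouping described above, using hypothetical income values rather than the study's census-derived data. Note that equal-width bands over the income range, unlike equal-count quintiles, can produce groups of very different sizes, consistent with the uneven group counts in the results.

```python
# Hypothetical median household incomes (in $1000s); the study derived
# real values from postal codes and 2015 Canadian census data.
incomes = [32, 45, 51, 58, 64, 70, 77, 85, 93, 102]

# Five equal-width bands across the income range ("equal amounts over
# the range of incomes"), not equal-count quintiles.
lo, hi = min(incomes), max(incomes)
width = (hi - lo) / 5

def income_band(x):
    # floor-divide the offset by the band width; clamp the maximum
    # value into the top band (index 4)
    return min(int((x - lo) // width), 4)

counts = [0] * 5
for x in incomes:
    counts[income_band(x)] += 1
```

With real, skewed income data the same procedure yields unequal group sizes, which is why the reported groups range from 36 to 207 patients.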
A total of 447 patients were divided into five income groups. From lowest to highest income, the groups had 54, 207, 86, 64, and 36 patients, respectively. The average time from surgery to final follow-up was 85.6 months. Interestingly, we found that patients within the middle household income groups had significantly lower AAOS disability scores compared to the lowest income groups at final follow-up (26.41 vs 35.70, p=0.035). Furthermore, there was a trend toward lower post-operative AAOS pain scores in middle income households compared to the lowest income group (19.57 vs 26.65, p=0.063). There was also a trend toward poorer AAOS disability scores when comparing middle income groups to high income groups post-operatively (26.41 vs 32.27, p=0.058). Pre-operatively, patients within the middle-income group had more pain compared to the lowest and highest income groups. No significant differences in SF-36 scores were observed. There were no significant differences between middle and highest income groups in post-operative AAOS pain, and no significant differences in pre-operative AAOS disability scores between income groups.
Patients from middle income groups who underwent TAA demonstrated poorer function, and possibly more pain, compared to lower and higher income groups. This suggests that TAA is a viable option for lower socioeconomic groups and should not be a source of discouragement for surgeons; in this circumstance, there is no real disparity between the rich and the poor. Further investigation is needed to explore the reasons for the diminished outcomes in middle-income patients.
Diabetes mellitus is a risk factor for complications after operative management of ankle fractures. Generally, diabetic sequelae such as neuropathy and nephropathy portend greater risk; however, the degree of risk resulting from these patient factors is poorly defined. We sought to evaluate the effects of the diabetic sequelae of neuropathy, chronic kidney disease (CKD), and peripheral vascular disease (PVD) on the risk of complications following operative management of ankle fractures.
Using a national claims-based database, we analyzed patients who had undergone operative management of an ankle fracture and who remained active in the database for at least two years thereafter. Patients were divided into two cohorts: those with a diagnosis of diabetes and those without. Each cohort was further stratified into five groups: neuropathy, CKD, PVD, multiple sequelae, and no sequelae. The multiple sequelae group included patients with more than one of the three sequelae of interest: CKD, PVD, and neuropathy. Postoperative complications were queried for two years following surgery. The main complications of interest were deep vein thrombosis (DVT), surgical site infection, hospital readmission within 90 days, revision internal fixation, conversion to ankle fusion, and below knee amputation (BKA).
We identified 210,069 patients who underwent operative ankle fracture treatment; 174,803 had no history of diabetes, and 35,266 were diabetic. The diabetic cohort was subdivided as follows: 7,506 without identified sequelae, 8,994 neuropathy, 4,961 CKD, 1,498 PVD, and 12,307 with multiple sequelae.
Compared to non-diabetics, diabetics without sequelae had significantly higher odds of DVT, infection, readmission, revision internal fixation, and conversion to ankle fusion (OR range 1.21 – 1.58). Compared to uncomplicated diabetics, diabetics with neuropathy alone and diabetics with multiple sequelae had significantly higher odds of all complications (OR range 1.18 – 31.94, p values range < 0.001 - 0.034). Diabetics with CKD had significantly higher odds of DVT, readmission, and BKA (OR range 1.34 – 4.28, p values range < 0.001 - 0.002). Finally, diabetics with PVD had significantly higher odds of DVT, readmission, conversion to ankle fusion, and BKA (OR range 1.62 - 9.69, p values range < 0.001 - 0.039).
Diabetic patients with sequelae of neuropathy, CKD, or PVD generally had higher complication rates than diabetic patients without these diagnoses. Unsurprisingly, diabetic patients with multiple sequelae were at the highest risk and had the highest odds ratios across all complications. While neuropathy is known to be associated with postoperative complications, our analysis demonstrates that CKD represents a significant risk factor for multiple complications following the operative management of ankle fractures, one that has rarely been discussed in prior studies.
Progressive collapsing foot deformity (PCFD) is a complex foot deformity with varying degrees of hindfoot valgus, forefoot abduction, forefoot varus, and collapse or hypermobility of the medial column. Muscle and tendon balancing is important in addressing the deformity. Peroneus brevis is the primary evertor of the foot and the strongest antagonist to the tibialis posterior, while peroneus longus is an important stabilizer of the medial column. To our knowledge, the role of peroneus brevis to peroneus longus tendon transfer in PCFD has not been reported.
This study evaluates patient reported outcomes including pain scores and any associated surgical complications for patients with PCFD undergoing isolated peroneus brevis to longus tendon transfer and gastrocnemius recession.
Patients with symptomatic PCFD who had failed non-operative treatment and underwent isolated soft tissue correction with peroneus brevis to longus tendon transfer and gastrocnemius recession were included. Procedures were performed by a single surgeon at a large university-affiliated teaching hospital between January 1, 2016 and March 31, 2021. Patients younger than 18 years old, or those whose surgical correction for PCFD included osseous procedures, were excluded.
Patient demographics, medical comorbidities, procedures performed, and pre- and post-operative patient-reported outcomes were collected via medical chart review and the appropriate questionnaires.
Outcomes assessed included Visual Analogue Scale (VAS) for foot and ankle pain as well as sinus tarsi pain (0-10), patient reported outcomes on EQ-5D, and documented complications.
Changes in VAS and EQ-5D outcomes were analysed using a paired t-test, with p<0.05 considered statistically significant.
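For readers unfamiliar with the paired t-test used here, the computation on pre/post pairs can be sketched as follows. The scores below are illustrative values only, not the study's data, and the helper names are our own:

```python
import math
from statistics import mean, stdev

# Hypothetical pre/post foot-and-ankle VAS pain scores for six feet
# (illustrative only; not the study's data).
pre = [7, 8, 6, 7, 5, 8]
post = [3, 4, 2, 4, 3, 3]

# Paired t-test works on the per-subject differences.
diffs = [a - b for a, b in zip(pre, post)]
n = len(diffs)
d_bar = mean(diffs)                # mean paired difference
se = stdev(diffs) / math.sqrt(n)   # standard error of the mean difference
t = d_bar / se                     # paired t-statistic, df = n - 1

# 95% CI for the mean difference; 2.571 is the two-tailed
# t critical value for df = 5 (hard-coded to stay stdlib-only).
t_crit = 2.571
ci = (d_bar - t_crit * se, d_bar + t_crit * se)
print(f"t = {t:.2f}, mean diff = {d_bar:.2f}, "
      f"95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

In practice a library routine such as `scipy.stats.ttest_rel` would also return the p-value directly, which is then compared against the 0.05 threshold.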
We analysed 43 feet in 39 adults who fulfilled the inclusion criteria. Mean age was 55.4 ± 14.5 years. Mean patient-reported outcomes and statistical analysis are shown in Table one below. Mean pre- and post-operative foot and ankle VAS pain scores were 6.73 and 3.13, respectively, a mean difference of 3.6 (p<0.001, 95% CI 2.6, 4.6). Mean pre- and post-operative sinus tarsi VAS pain scores were 6.03 and 3.88, respectively, a mean difference of 2.1 (p<0.001, 95% CI 0.9, 3.4). Mean pre- and post-operative EQ-5D pain scores were 2.19 and 1.83, respectively, a mean difference of 0.4 (p=0.008, 95% CI 0.1, 0.6). Mean follow-up was 18.8 ± 18.4 months.
Peroneus brevis to longus tendon transfer and gastrocnemius recession in the management of symptomatic progressive collapsing foot deformity significantly improved sinus tarsi and overall foot and ankle pain. Most EQ-5D scores improved but, with the exception of the pain score, did not reach statistical significance; this may reflect the limited cohort size. To our knowledge, this is the first report in the literature describing patient-reported outcomes following treatment with this combination of isolated soft tissue procedures for PCFD.
For any figures or tables, please contact the authors directly.