Bone & Joint Open
Vol. 3, Issue 5 | Pages 359 - 366
1 May 2022
Sadekar V Watts AT Moulder E Souroullas P Hadland Y Barron E Muir R Sharma HK

Aims. The timing of circular frame removal is crucial: removal too early risks refracture or deformity, while removal too late increases patient morbidity and delays return to work. This study was designed to assess the effectiveness of a staged reloading protocol. We report the incidence of mechanical failure following both single-stage and two-stage reloading protocols and analyze the associated risk factors. Methods. We identified consecutive patients from our departmental database. Both trauma and elective cases were included, of all ages, frame types, and pathologies, who underwent circular frame treatment. Our protocol is either a single-stage or a two-stage process implemented by defunctioning the frame, in order to progressively increase the weightbearing load through the bone and promote full loading prior to frame removal. Before progression through the process, we monitor patients for any increase in pain and assess radiographs for deformity or refracture. Results. There were 244 frames (230 patients) included in the analyses, of which 90 were Ilizarov-type frames and 154 were hexapods. There were 149 frames which underwent single-stage reloading and 95 frames which underwent a two-stage reloading protocol. Mechanical failure (refracture) occurred after frame removal in 13 frames (5%). There were no cases of change in alignment. There was no difference in refracture between patients who underwent single-stage and two-stage reloading protocols (p = 0.772). In all, 14 patients had failure prevented through identification with the reloading protocol. Conclusion. Our reloading protocol is a simple and effective way to confirm the timing of frame removal and minimize the rate of mechanical failure. Similar failure rates occurred between patients undergoing single-stage and two-stage reloading protocols.
If the surgeon is confident with clinical and radiological assessment, it may be possible to progress directly to stage two and decrease frame time and patient morbidity. Cite this article: Bone Jt Open 2022;3(5):359–366


Orthopaedic Proceedings
Vol. 104-B, Issue SUPP_13 | Pages 10 - 10
1 Dec 2022
Behman A Bradley C Maddock C Sharma S Kelley S

There is no consensus regarding the optimum frequency of ultrasound for monitoring the response to Pavlik harness (PH) treatment in developmental dysplasia of the hip (DDH). The purpose of our study was to determine whether a limited-frequency hip ultrasound (USS) assessment in children undergoing PH treatment for DDH had an adverse effect on treatment outcomes when compared to traditional comprehensive ultrasound monitoring. This study was a single-center non-inferiority randomized controlled trial. Children aged less than six months with dislocated, dislocatable and stable dysplastic hips undergoing a standardized treatment program with a PH were randomized, once stability had been achieved, to our current standard USS monitoring protocol (every clinic visit) or to a limited-frequency ultrasound protocol (USS only until hip stability and then at the end of treatment). Groups were compared on alpha angle at the end of treatment, acetabular indices (AI) and IHDI grade on follow-up radiographs at one year post-harness, and on complication rates. The premise was that if there were no differences in these outcomes, either protocol could be deemed safe and effective. One hundred patients were recruited to the study; after exclusions, 42 patients completed the standard protocol (SP) and 36 completed the limited protocol (LP). There was no significant difference in mean age between the groups at follow-up radiograph (SP: 17.8 months; LP: 16.6 months; p=0.26). There was no difference between the groups in mean alpha angle at the end of treatment (SP: 69°; LP: 68.1°; p=0.25). There was no significant difference in mean right AI at follow-up (SP: 23.1°; LP: 22.0°; p=0.26), nor on the left (SP: 23.3°; LP: 22.8°; p=0.59). All hips in both groups were IHDI grade 1 at follow-up. The only complication was one femoral nerve palsy in the SP group. In addition, the LP group underwent a 60% reduction in USS use once stable.
We found that once dysplastic or dislocated hips were reduced and stable on USS, a limited-frequency ultrasound protocol was not associated with an inferior complication or radiographic outcome profile compared to a standardized PH treatment pathway. Our study supports reducing the frequency of ultrasound assessment during PH treatment of hip dysplasia. Minimizing the need for expensive, time-consuming, in-person health care interventions is critical to reducing health care costs, improving patient experience and assisting the move to remote care. Removing the need for USS assessment at every PH check will expand care to centers where USS is not routinely available and will facilitate the establishment of virtual care clinics where clinical examination may be performed remotely.


Orthopaedic Proceedings
Vol. 105-B, Issue SUPP_2 | Pages 108 - 108
10 Feb 2023
Guo J Blyth P Clifford K Hooper N Crawford H

Augmented reality simulators offer opportunities to practice orthopaedic procedures outside of theatre environments. We developed an augmented reality simulator that allows trainees to practice pinning of paediatric supracondylar humeral fractures (SCHF) in a radiation-free environment at no extra risk to patients. The simulator is composed of a tangible child's elbow model and simulated fluoroscopy on a tablet device. The treatment of these fractures is likely one of the first procedures involving X-ray-guided wire insertion that trainee orthopaedic surgeons will encounter. This study aims to examine the extent to which simulator training improves real-world operating theatre performance. This multi-centre study will involve four cohorts of New Zealand orthopaedic trainees in their SET1 year. Trainees with no simulator exposure in 2019–2021 will form the comparator cohort. Trainees in 2022 will receive additional, regular simulator training as the intervention cohort. The comparator cohort's performance in paediatric SCHF surgery will be retrospectively audited using routinely collected operative outcomes and parameters over a six-month period. The performance of the intervention cohort will be collected in the same way over a comparable period. The data collected for both groups will be used to examine whether additional training with an augmented reality simulator shows improved real-world surgical outcomes compared to traditional surgical training. This protocol has been approved by the University of Otago Health Ethics committee, and the study is due for completion in 2024. This study is the first nationwide transfer-validity study of a surgical simulator in New Zealand. As of September 2022, all trainees in the intervention cohort have been recruited, along with eight retrospective trainees via email. We present this protocol to maintain transparency of the prespecified research plans and ensure robust scientific methods. This protocol may also assist other researchers conducting similar studies within small populations.


Orthopaedic Proceedings
Vol. 105-B, Issue SUPP_15 | Pages 6 - 6
7 Nov 2023
Jeffrey H Samuel T Hayter E Lee G Little M Hardman J Anakwe R

We undertook this study to investigate the outcomes of surgical treatment for acute carpal tunnel syndrome following our protocol of concurrent nerve decompression and skeletal stabilization for bony wrist trauma, to be undertaken within 48 hours. We identified all patients treated at our trauma centre under this protocol between 1 January 2014 and 31 December 2019. All patients were clinically reviewed at least 12 months following surgery and assessed using the Brief Michigan Hand Outcomes Questionnaire (bMHQ), the Boston Carpal Tunnel Questionnaire (BCTQ) and sensory assessment with Semmes-Weinstein monofilament testing. The study group was made up of 35 patients. Thirty-three patients were treated within 36 hours. Patients treated with our unit protocol for early surgery, comprising nerve decompression and bony stabilization within 36 hours, report excellent outcomes at medium-term follow-up. We propose that nerve decompression and bony surgical stabilization should be undertaken as soon as practically possible once the diagnosis is made. This is emergent treatment to protect and preserve nerve function. In our experience, the vast majority of patients were treated within 24 hours.


Orthopaedic Proceedings
Vol. 105-B, Issue SUPP_17 | Pages 45 - 45
24 Nov 2023
Dendoncker K Putzeys G Cornu O Nieuwenhuizen T Bertrand M Valster H Croes K

Aim. Local antibiotic release from a carrier is a commonly used technique to prevent infection in orthopaedic procedures. An interesting carrier in aseptic bone reconstructive surgery is bone chips impregnated with antibiotic (AB) solution. Systemically administered cefazolin (CFZ) is used for surgical site infection prophylaxis; however, an in vitro study showed that fresh-frozen and processed bone chips impregnated with CFZ solution completely release the CFZ within a few hours. On the other hand, irradiated freeze-dried bone chips treated with supercritical CO2 (scCO2) have been shown to be an efficient carrier for the antibiotics vancomycin and tobramycin. With this pilot study we wanted to investigate whether CFZ impregnation of bone chips treated with scCO2 shows a more favorable release pattern of CFZ. Method. The bone chips were prepared using the standard scCO2 protocol and were impregnated with 100 mg/ml cefazolin at different timepoints during the process: before freeze drying (BC type A), after freeze drying (BC type B) and after gamma-irradiation. 0.5 g of the impregnated bone grafts was incubated with 5 ml of fetal calf serum (FCS) at 37°C. At 2, 4, 6, 8 and 24 h of incubation, 200 µl of eluate was taken for analysis. After 24 h the remaining FCS was removed, the bone grafts were washed and fresh FCS (5 ml) was added. Consecutive eluate samples were taken at 48, 72 and 96 h of incubation. The concentration of CFZ in the eluates was measured with a validated UPLC-DAD method. Analysis was performed in triplicate. Results. The mean concentration of CFZ in the eluate obtained from BC type A incubated for 2 h was higher than that from BC type B: 581 mg/l versus 297 mg/l, respectively. However, the elution profile is the same for both types: within the first 24 h the CFZ concentration in the eluates drops from 581 mg/l to 365 mg/l (37%) for BC type A and from 297 mg/l to 132 mg/l (56%) for BC type B. After 24 h no further significant CFZ release is seen.
Impregnation of the bone chips before or after gamma irradiation did not affect this elution profile. Conclusions. Bone chips treated with scCO2 show an elution pattern comparable to that of non-scCO2-treated bone chips. AB release depends on the properties of the AB, making it impossible to copy the same impregnation protocol across different antibiotics. The stability of CFZ in solution at 37°C and its release are major concerns when establishing an impregnation protocol with CFZ.
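As a quick arithmetic check of the elution figures above (reading the bracketed percentages as the fractional fall in eluate concentration over the first 24 h, which matches both reported pairs of values):

```python
def percent_drop(initial_mg_l: float, final_mg_l: float) -> float:
    """Percentage fall in eluate concentration over an interval."""
    return (initial_mg_l - final_mg_l) / initial_mg_l * 100

# Figures reported in the abstract, first 24 h of incubation:
drop_a = percent_drop(581, 365)  # BC type A: ~37%
drop_b = percent_drop(297, 132)  # BC type B: ~56%
```

Both computed drops round to the percentages quoted in the results, supporting that reading.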


Orthopaedic Proceedings
Vol. 99-B, Issue SUPP_22 | Pages 21 - 21
1 Dec 2017
Semenistyy A Obolenskiy V Semenistyy A Konnov A

Aim. Chronic osteomyelitis of long bones is one of the most severe complications in orthopedics. Different options exist for the treatment of this disease; however, there is still no generally accepted comprehensive protocol to guide each particular step. Many classifications have been designed to support clinical decision-making, yet even the most widely used, the Cierny-Mader classification, does not account for more than half of the factors whose assessment is essential for choosing the best treatment plan. This may be explained by the complexity of the disease process, the diversity of treatment options and the multistage approach to the management of these patients. Therefore, the purpose of this study was to develop a treatment protocol and clinical classification system to improve final outcomes in patients with chronic osteomyelitis of long bones. Method. Three orthopedic surgeons and one general surgeon who specialize in bone and joint infection independently reviewed the literature on chronic osteomyelitis. Each surgeon created a list of factors that are essential to assess for successful treatment of chronic osteomyelitis. After the four lists were thoroughly matched and discussed, the 10 most important factors were defined. Each surgeon then proposed his own treatment protocol, based on existing data and personal experience. All four protocols were discussed and analyzed to produce a single, more comprehensive protocol. Once the list of factors and the protocol were created, the surgeons independently defined the most important factors for every stage of the new protocol. Thus a new multi-stage classification of chronic osteomyelitis (MSC-CO) was proposed. Results. We defined the most important factors influencing decision-making in the treatment of chronic osteomyelitis of long bones.
The new comprehensive protocol and multi-stage clinical classification were developed. Conclusions. We believe that the proposed tools may improve the results of chronic osteomyelitis treatment. However, clinical trials should be conducted to assess the utility of the new treatment protocol and MSC-CO in daily practice.


Orthopaedic Proceedings
Vol. 100-B, Issue SUPP_6 | Pages 17 - 17
1 Apr 2018
Pascual SR Gheduzzi S Miles A Keogh P

Back pain is a significant socio-economic problem affecting around 80% of the population at some point during their lives. Chronic back pain leads to millions of days of work absence per year, posing a burden to health services around the world. In order to assess surgical interventions, such as disc replacements and spinal instrumentation, to treat chronic back pain, it is important to understand the biomechanics of the spine and the intervertebral disc (IVD). A wide range of testing protocols, machines and parameters are employed to characterise the IVD, making it difficult to compare data across laboratories. The aim of this study was to compare the two most commonly used testing protocols in the literature, the stiffness and the flexibility protocols, and determine whether they produce the same data when testing porcine specimens in six degrees of freedom under the same testing conditions. In theory, the stiffness and the flexibility protocols should produce equivalent data; however, no detailed comparison study is available in the literature for the IVD, which is a very complex composite structure. Tests were performed using the unique six-axis simulator at the University of Bath on twelve porcine lumbar functional spinal unit (FSU) specimens at 0.1 Hz under a 400 N preload. The specimens were divided into two groups of six and each group was tested using one of the two testing protocols. To ensure the same conditions were used, tests were first carried out using the stiffness protocol, and the equivalent loading amplitudes were then applied using the flexibility protocol. The results from the two protocols were analysed to produce load-displacement graphs and stiffness matrices. The load-displacement graphs of the translational axes show that the stiffness protocol produces less spread between specimens than the flexibility protocol. However, for the rotational axes there is large variability between specimens in both protocols.
Additionally, a comparison was made between the six main diagonal terms of the stiffness matrices using the Mann-Whitney test, since the data were not normally distributed. No statistically significant difference was found between the stiffness terms produced by each protocol. However, the stiffness protocol generally produced larger stiffnesses and less variation between specimens. This study has shown that when testing porcine FSU specimens at 0.1 Hz and 400 N preload, there is no statistically significant difference between the main diagonal stiffness terms produced by the stiffness and the flexibility protocols. This is an important result, because it means that at this specific testing condition, using the same testing parameters and environment, both the stiffness and flexibility methods can be used to characterise the behaviour of the spine, and the results can be compared across the two protocols. Future work should investigate whether the same findings occur at other testing conditions.
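The diagonal stiffness terms above were compared with the Mann-Whitney test. As an illustrative sketch only (not the authors' analysis code, which would typically use a statistics package such as `scipy.stats.mannwhitneyu`), the U statistic for two small samples can be computed with the standard library, using midranks for ties:

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic: rank the pooled samples (midranks for
    ties), sum the ranks of group x, and return the smaller of U1/U2."""
    combined = sorted([(v, 0) for v in x] + [(v, 1) for v in y])
    ranks, i, n = [], 0, len(combined)
    while i < n:
        j = i
        while j < n and combined[j][0] == combined[i][0]:
            j += 1
        ranks.extend([(i + 1 + j) / 2] * (j - i))  # midrank for the tie run
        i = j
    r_x = sum(r for r, (_, g) in zip(ranks, combined) if g == 0)
    n1, n2 = len(x), len(y)
    u1 = n1 * n2 + n1 * (n1 + 1) / 2 - r_x
    return min(u1, n1 * n2 - u1)

# Fully separated samples give U = 0; interleaved samples give a larger U.
mann_whitney_u([1, 2, 3], [4, 5, 6])  # -> 0.0
```

The p-value is then read from the exact U distribution (or a normal approximation for larger n), which is what the library routine provides.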


Orthopaedic Proceedings
Vol. 102-B, Issue SUPP_1 | Pages 140 - 140
1 Feb 2020
Fassihi S Kraekel SM Soderquist MC Unger A

Introduction. Enhanced Recovery After Surgery (ERAS) is a multi-disciplinary approach to establishing procedure-specific, evidence-based perioperative protocols to optimize patient outcomes. ERAS evidence is predominantly from non-orthopaedic procedures. We review the impact of ERAS protocol implementation on total joint arthroplasty (TJA) outcomes at our institution. Methods. All primary total hip and knee arthroplasties performed one year before and after ERAS implementation were identified by current procedural terminology code. Length of stay (LOS), disposition, readmission and opioid usage were analyzed before and after ERAS implementation using Student's t-test and the chi-square test. Results. 2105 patients were identified (967 THA: 494 pre-ERAS and 473 post-ERAS; 1138 TKA: 575 pre-ERAS and 563 post-ERAS). TKA. After ERAS implementation, opioid consumption decreased for hospital day one (45.5 MME to 36.2 MME; p=0.000) and overall hospitalization (101.9 MME to 83.9 MME; p=0.000). Average LOS decreased (73.28 hrs to 66.44 hrs; p=0.000), blood transfusion rate trended down (3.3% to 1.95%; p=0.155), and disposition to home over a skilled nursing facility increased (57.8% to 71.6%; p=0.000). Unplanned return-to-hospital encounters were unchanged (13.22% to 12.79%; p=0.8504). 30-day and 90-day readmission rates decreased (7.30% to 3.02%; p=0.0020 and 8.5% to 4.8%; p=0.0185, respectively). THA. After ERAS implementation, opioid consumption decreased for hospital day one (49.5 MME to 35.4 MME; p=0.000) and overall hospitalization (79.5 MME to 59.5 MME; p=0.000). Average LOS decreased (57.84 hrs to 51.87 hrs; p=0.011), blood transfusion rate was unchanged (4.25% to 3.81%; p=0.725), and disposition to home over a skilled nursing facility increased (80.4% to 82.5%; p=0.022). Unplanned return-to-hospital encounters were unchanged (8.51% to 8.88%; p=0.8486). Readmission trended up during postoperative days 0–30 (1.42% to 2.96%; p=0.1074) and trended down during postoperative days 31–90 (1.21% to 0.85%; p=0.5748). Conclusion. ERAS protocols reduce postoperative opioid consumption, decrease hospital LOS, and increase patient disposition to home without adversely affecting short-term readmission rates.


Orthopaedic Proceedings
Vol. 103-B, Issue SUPP_3 | Pages 60 - 60
1 Mar 2021
Jodoin M Rouleau D Provost C Bellemare A Sandman E Leduc S De Beaumont L

Acute pain is one of the most common symptoms among patients who have suffered an orthopedic trauma such as an isolated upper limb fracture (IULF). Developing interventions with limited side effects to prevent the establishment of chronic pain is critical, as persistent pain is associated with an increased risk of opioid dependence, medical complications, staggering financial burdens and diminished quality of life. Theta burst stimulation (TBS), a non-invasive magnetic brain stimulation technique with minimal side effects, has shown promising results in patients experiencing various chronic pain conditions, as it precisely targets brain regions involved in pain processing. Surprisingly, its impact on acute pain has never been investigated. This study aims to assess the longitudinal effects of a 10-day continuous TBS (cTBS) protocol applied in the acute phase of an IULF on key functional outcomes. Patients with an IULF aged 18 to 60 years were recruited within 7 days post-accident at a Level I trauma center. Exclusion criteria included a history of brain injury, neurological disorders, musculoskeletal complications, and open fractures. To assess longitudinal changes, questionnaires measuring the intensity and characteristics of pain (Numerical Rating Scale, NRS; McGill Pain Questionnaire, MPQ) as well as functional disability (DASH) were completed by all patients at three time points: prior to the start of the TBS program, and at 72 hours and 3 months post-intervention. Patients were randomly assigned to the active TBS protocol (active group) or to the placebo protocol (sham group). The stimulation site for each participant corresponded to the motor cortex contralateral to the injured arm. Fifty patients were recruited (female: 24; age: 40.38 years), of whom 25 were in the active group and 25 in the sham group.
Both groups were equivalent in age, sex, type of injury, and surgical procedures (p>0.05). The intervention protocol was introduced on average 6.18 days post-accident. In comparison to the sham group, the active group showed a significant decrease in pain intensity (NRS) at 72 hours (F=6.02; p=0.02) and 3 months (F=6.37; p=0.02) post-intervention. No group difference was found at 72 hours post-intervention in pain characteristics (MPQ; F=3.90; p=0.06) or functional disability (DASH; F=0.48; p=0.49). At three months post-intervention, the active group showed statistically significant improvement on the MPQ (F=5.02; p=0.04) and the DASH (F=5.88; p=0.02) compared to the placebo group. No complications related to the treatment were reported. Results from this study show that patients who underwent active cTBS reported less pain and better functional status shortly after the end of the TBS protocol compared to sham patients, and treatment effects were maintained at three months post-intervention. Given that acute pain intensity is an excellent predictor of chronic pain development, this safe technique, available in numerous centers in Canada, may help prevent chronic pain development when administered during the acute post-injury phase. Future studies should continue to investigate the mechanisms involved, to optimize this technique for the orthopedic trauma population and to reduce opioid consumption.


Orthopaedic Proceedings
Vol. 104-B, Issue SUPP_13 | Pages 71 - 71
1 Dec 2022
Gazendam A Ekhtiari S Ayeni OR

Orthopaedic surgeons prescribe more opioids than any other surgical speciality. Opioids remain the analgesic of choice following arthroscopic knee and shoulder surgery. There is growing evidence that opioid-sparing protocols may reduce postoperative opioid consumption while adequately addressing patients’ pain. However, there is a lack of prospective, comparative trials evaluating their effectiveness. The objective of the current randomized controlled trial (RCT) was to evaluate the efficacy of a multimodal, opioid-sparing approach to postoperative pain management in patients undergoing arthroscopic shoulder and knee surgery. The NO PAin trial is a pragmatic, definitive RCT (NCT04566250) enrolling 200 adult patients undergoing outpatient shoulder or knee arthroscopy. Patients are randomly assigned in a 1:1 ratio to an opioid-sparing group or standard of care. The opioid-sparing group receives a three-pronged prescription package consisting of 1) a non-opioid prescription: naproxen, acetaminophen and pantoprazole, 2) a limited opioid “rescue prescription” of hydromorphone, and 3) a patient education infographic. The control group receives the current standard of care as per the treating surgeon, which consists of an opioid analgesic. The primary outcome of interest is oral morphine equivalent (OME) consumption up to 6 weeks postoperatively. The secondary outcomes are postoperative pain scores, patient satisfaction, quantity of OMEs prescribed and number of opioid refills. Patients are followed at both 2 and 6 weeks postoperatively. Data analysts and outcome assessors are blinded to the treatment groups. As of December 1, 2021 we have enrolled 166 patients, reaching 83% of target enrolment. Based on the current recruitment rate, we anticipate that enrolment will be completed by the end of January 2022, with final follow-up and study close-out completed by March 2022.
The final results will be released at the Canadian Orthopaedic Association Meeting in June 2022 and will be presented as follows. The mean difference in OME consumption was XX (95% CI: YY-YY, p=X). The mean difference in OMEs prescribed was XX (95% CI: YY-YY, p=X). The mean differences in Visual Analogue Scale (VAS) pain scores and patient satisfaction are XX (95% CI: YY-YY, p=X). The absolute difference in opioid refills was XX (95% CI: YY-YY, p=X). The results of the current study will demonstrate whether an opioid-sparing approach to postoperative outpatient pain management is effective at reducing opioid consumption while adequately addressing postoperative pain in patients undergoing outpatient shoulder and knee arthroscopy. This study is novel in the field of arthroscopic surgery, and its results will help to guide appropriate postoperative analgesic management following these widely performed procedures.
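OME consumption, the primary outcome above, is obtained by converting each opioid dose to oral morphine equivalents via a conversion table. The factors below follow commonly used CDC equianalgesic tables and are illustrative assumptions only; the trial's own conversion table is not given in the abstract:

```python
# Assumed oral morphine-milligram-equivalent factors per mg of drug
# (hydromorphone ~4x follows commonly cited CDC tables; illustrative only).
OME_FACTOR = {"morphine": 1.0, "hydromorphone": 4.0, "oxycodone": 1.5}

def total_ome(doses):
    """Sum oral morphine equivalents over (drug, mg) dose records."""
    return sum(mg * OME_FACTOR[drug] for drug, mg in doses)

# e.g. six 2 mg hydromorphone rescue tablets over the follow-up period:
total_ome([("hydromorphone", 2.0)] * 6)  # -> 48.0 OME
```

Summing such records over the 6-week follow-up gives the per-patient OME consumption that the trial compares between arms.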


Orthopaedic Proceedings
Vol. 100-B, Issue SUPP_17 | Pages 14 - 14
1 Dec 2018
Hellebrekers P Rentenaar R McNally M Hietbrink F Houwert M Leenen L Govaert G

Aim. Fracture-related infection (FRI) is an important complication of surgical fracture management. Key to successful treatment is an accurate diagnosis, for which microbiological identification remains the gold standard. Although a structured approach to sampling specimens for microbiology seems logical, there is no consensus on a culture protocol for FRI. The aim of this study is to evaluate the effect of a structured microbiology sampling protocol for fracture-related infection compared to ad-hoc culture sampling. Method. We conducted a pre-/post-implementation cohort study comparing the effects of implementing a structured FRI sampling protocol. The protocol included strict criteria for sampling and interpretation of tissue cultures for microbiology. All intraoperative samples from suspected or confirmed FRI were compared for culture results. Adherence to the protocol was described for the post-implementation cohort. Results. In total 101 patients were included, 49 pre-implementation and 52 post-implementation. From these patients 175 intraoperative culture sets were obtained, 96 and 79 pre- and post-implementation, respectively. The pre-implementation cohort showed significantly more antibiotic use during culture sampling (p = 0.002). The post-implementation cohort showed a tendency toward more positive culture sets (69% vs. 63%, p = 0.353), with a significant difference in open wounds (86% vs. 67%, p = 0.034). In all post-implementation culture sets, causative pathogens were cultured more than once per set, in contrast to pre-implementation (p < 0.001). Despite stricter tissue sampling and culture interpretation criteria, the number of polymicrobial infections was similar in both cohorts: approximately 29% of all culture sets and 44% of all positive culture sets. Significantly more polymicrobial cultures were found in early infections in the post-implementation cohort (p = 0.048). This indicates a better yield with the new protocol.
Conclusions. A standardised protocol for intraoperative sampling for bacterial identification in FRI is superior to an ad-hoc approach. This was the combined effect of withholding antibiotics around sampling, more tissue samples taken with the ‘no-touch’ technique, increased awareness among both surgeons and microbiologists, and stricter diagnostic criteria. It resulted in more microbiologically confirmed infections and more certainty when identifying causative pathogens.


Orthopaedic Proceedings
Vol. 103-B, Issue SUPP_5 | Pages 4 - 4
1 Mar 2021
Rosell CC Goma-Camps MV Mateu CA Calderer LC Pérez-Cardona PC

Aim. The reconstruction of critical-size bone defects of the tibia is one of the most complex therapeutic challenges in the orthopedic field. This study aims to describe and evaluate our three-stage surgical protocol for reconstruction of infected defects of the tibia, with emphasis on limb salvage rate, resolution of infection, functional outcome and patient satisfaction. Method. A retrospective review was performed of all cases of complex infected tibia fracture with combined soft tissue and bone loss treated in a specialized limb reconstruction center between 2010 and 2018. In all cases, a three-stage protocol was performed: 1) infected-limb damage control with radical debridement, 2) soft tissue coverage with a vascularized or local flap, 3) bone reconstruction. The minimum follow-up required was 12 months after external fixator removal. Results. Twenty-eight patients with a mean age of 42 years were included. The mean soft tissue defect was 91.7 cm² and the mean bone defect was 5.8 cm. 67.85% of the cases were classified as type IV B-local osteomyelitis. The median global treatment time was 456 days. The external fixator time (EFT) was 419, 284 and 235 days for the bone transport, shortening-lengthening and acute shortening groups, respectively. The median Bone Healing Index (BHI) was 1.82 months/cm in the bone transport group and 2.15 months/cm in the shortening-lengthening group. The limb salvage rate was 92.85%. The infection resolution rate was 96.42%. We achieved bone union in 92.85% of cases. Regarding the ASAMI bone score, 92.8% of cases were “good-or-excellent”. Two patients underwent delayed amputation. Eight cases of non-progressive docking site (DS) healing were observed. Nineteen unexpected reinterventions were performed. Functional data: the mean VAS score was 1.0; the mean LEFS score was 55.88/80. Regarding the ASAMI functional score, 78.6% of cases were “good-or-excellent”. More than 80% of patients could return to work.
100% of patients were “very satisfied” or “moderately satisfied” (75% and 25%, respectively). Conclusions. Our results demonstrate that our three-stage surgical approach to infected tibial bone defects with soft tissue damage can achieve high infection resolution, good functional outcome, good patient satisfaction and an acceptable limb salvage rate, despite the long treatment time and unexpected reinterventions.
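The Bone Healing Index quoted above is external fixation time divided by the length of bone regenerated, in months/cm. A minimal sketch of that arithmetic, assuming a 30.44-day average month (an assumption; the abstract does not state its conversion):

```python
def bone_healing_index(eft_days: float, defect_cm: float,
                       days_per_month: float = 30.44) -> float:
    """Bone Healing Index (months/cm): external-fixator time over defect length."""
    return (eft_days / days_per_month) / defect_cm

# Illustration only, pairing the bone-transport group's EFT (419 days)
# with the whole cohort's mean bone defect (5.8 cm):
round(bone_healing_index(419, 5.8), 2)  # ~2.37 months/cm
```

This ratio of group-level summaries need not match the reported median BHI of 1.82 months/cm, which is a median of per-patient indices computed from each patient's own fixator time and defect length.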


Orthopaedic Proceedings
Vol. 99-B, Issue SUPP_5 | Pages 78 - 78
1 Mar 2017
Pasko K Hall R Neville A Tipper J

Surgical interventions for the treatment of chronic neck pain, which affects 330 million people globally, include fusion and cervical total disc replacement (CTDR). Most currently clinically available CTDR designs include a metal-on-polymer (MoP) bearing. Numerous studies suggest that MoP CTDRs are associated with issues similar to those affecting other MoP joint replacement devices, including excessive wear and wear particle-related inflammation and osteolysis. A standard ISO testing protocol was employed to investigate a device with a metal-on-metal (MoM) bearing. Moreover, with findings in the literature suggesting that the testing protocol specified by ISO-18192-1 may result in overestimated wear rates, additional tests with reduced kinematics were conducted. Six MoM CTDRs made from high-carbon cobalt-chromium (CoCr) were tested in a six-axis spine simulator under the ISO-18192-1 protocol for a duration of 4 million cycles (MC), followed by 2 MC under modified testing conditions, which applied the same axial force as specified in ISO-18192-1 (50–150 N) but reduced ranges of motion (ROM), i.e. ±3° flexion/extension (reduced from ±7.5°) and ±2° lateral bending (reduced from ±6°) and axial rotation (reduced from ±4°). Foetal bovine serum (25% v/v), used as a lubricant, was changed every 3.3×10⁵ cycles and stored at −20°C for particle analysis. Components were measured after each 1×10⁶ cycles; surface roughness, damage modes, and gravimetric wear were assessed. The wear and roughness data were presented as mean ± 95% confidence interval and analysed by one-way analysis of variance (ANOVA) (p=0.05). The mean wear rate of the MoM CTDRs tested under the ISO protocol was 0.246 ± 0.054 mm³/MC, with a total wear volume of 0.977 ± 0.102 mm³ lost over the test duration (Fig. 1). The modified testing protocol resulted in a significantly lower mean volumetric wear rate of 0.039 ± 0.015 mm³/MC (p=0.002), with a total wear volume of 0.078 ± 0.036 mm³
lost over the 2 MC test duration. Under both test conditions, the volumetric wear was linear, with no significant bedding-in period observed (Fig. 1). The mean pre-test surface roughness decreased from 0.019 ± 0.03 µm to 0.012 ± 0.002 µm (p=0.001) after 4 MC of testing; however, surface roughness increased to 0.015 ± 0.002 µm (p=0.009) after the additional 2 MC under the modified test conditions. Following 4 MC of testing, the polishing marks observed prior to testing had been removed. Consistently across all components, surface discolouration and multidirectional, criss-crossing, curvilinear and circular wear tracks, caused by abrasive wear, were observed. Testing with reduced ROMs caused similar types of damage; however, the circular wear tracks were smaller than those produced during testing under the ISO protocol. The wear rates exhibited by the MoM CTDRs tested under the ISO-18192-1 protocol (0.246 mm³/MC) were lower than those of CTDR designs incorporating MoP bearings, as well as MoM lumbar TDRs. Wear rates generated under the modified ISO testing protocol were reduced tenfold, similar to findings previously reported in the literature, and support the hypothesis that the testing protocol specified by ISO-18192-1 may overestimate wear rates. Characterisation of the particles generated by MoM CTDRs, and the biological consequences thereof, remains to be determined. For figures/tables, please contact authors directly.
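The gravimetric wear assessment described above can be sketched numerically. This is an illustrative sketch, not the authors' code; the density used for the mass-to-volume conversion is an assumed value for high-carbon CoCr (~8.3 mg/mm³), not taken from the abstract.

```python
# Illustrative sketch (not the authors' code) of the gravimetric wear
# calculation behind the reported volumes. The density of high-carbon
# CoCr alloy is an assumed value (~8.3 mg/mm^3), not from the abstract.

COCR_DENSITY_MG_PER_MM3 = 8.3  # assumed density of high-carbon CoCr

def volumetric_wear(mass_loss_mg: float) -> float:
    """Convert gravimetric mass loss (mg) to volumetric wear (mm^3)."""
    return mass_loss_mg / COCR_DENSITY_MG_PER_MM3

def wear_rate(total_wear_mm3: float, duration_mc: float) -> float:
    """Mean volumetric wear rate in mm^3 per million cycles (MC)."""
    return total_wear_mm3 / duration_mc

# Reported totals from the abstract:
iso_rate = wear_rate(0.977, 4.0)      # ISO-18192-1 protocol, 4 MC
reduced_rate = wear_rate(0.078, 2.0)  # reduced-ROM conditions, 2 MC
```

Dividing the reported total volumes by the test durations reproduces the quoted mean rates to within rounding (0.977/4 ≈ 0.244 vs 0.246 mm³/MC; 0.078/2 = 0.039 mm³/MC).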


Orthopaedic Proceedings
Vol. 99-B, Issue SUPP_4 | Pages 18 - 18
1 Feb 2017
Hood B Greatens M Urquhart A Maratt J
Full Access

Introduction. There is no consensus on the ideal pain management strategy following total hip arthroplasty (THA). This study sought to identify immediate changes in the hospital course of patients undergoing primary THA following implementation of a rapid recovery anesthesia and multimodal management of pain (RAMP) protocol. For this study, rapid recovery anesthesia describes the use of preoperative non-narcotic medication in conjunction with neuraxial anesthesia techniques confined to the operating room only. The multimodal pain regimen consists of pre- and postoperative high-dose nonsteroidal anti-inflammatories (NSAIDs), gabapentin, and antiemetics, with or without intraoperative periarticular anesthetic injection. We hypothesized that the implementation of a RAMP protocol would lead to decreased reported pain scores, decreased narcotic use, and a shorter hospital stay in patients undergoing primary THA. Methods. This retrospective cohort study, performed at a multi-surgeon high-volume institution, reviewed the records of 81 consecutive patients who underwent primary THA using traditional anesthesia and opioid-dependent pain management techniques between June and September 2014, compared to 78 patients who underwent primary THA after implementation of the RAMP protocol between November 2014 and February 2015. The length of stay (LOS), pain scores, narcotic use, and other clinical data were recorded for each study group. Equality of variance was confirmed prior to statistical analysis using t-tests for equality of means. Results. There was no significant difference in demographics, body mass index, ASA classification, or Charlson Comorbidity Index between the two cohorts. The average LOS was significantly shorter after implementation of the RAMP protocol, with a mean of 2.10 ± 1.1 days versus 2.89 ± 1.0 days. The total amount of narcotics used was significantly less with the RAMP protocol (Fig 1).
Patient-reported pain scores were improved throughout the hospital stay, with patients receiving the RAMP protocol reporting significantly less pain beginning on the morning of POD 1 (Fig 2). No significant decrease in kidney function was seen following the use of high-dose NSAIDs (p=0.18). Conclusions. The implementation of a RAMP protocol in patients undergoing primary total hip arthroplasty resulted in an immediately shorter length of stay, with less narcotic use and improved reported pain scores. To our knowledge, this is the first study demonstrating a statistically significantly shorter length of stay associated with a multimodal pain protocol. This study supports the benefits of broad implementation of a RAMP protocol for total hip arthroplasty.


Orthopaedic Proceedings
Vol. 99-B, Issue SUPP_22 | Pages 50 - 50
1 Dec 2017
Shahi A Boe R Oliashirazi S Salava J Oliashirazi A

Aim. Persistent wound drainage has been recognized as one of the major risk factors for periprosthetic joint infection (PJI). Currently, there is no consensus on the management protocol for patients who develop wound drainage after total joint arthroplasty (TJA). The objective of our study was to describe a multimodal protocol for managing draining wounds after TJA and to assess its outcomes. Methods. We conducted a retrospective study of 4,873 primary TJAs performed between 2008 and 2015. Using an institutional database, patients with persistent wound drainage (>48 hours) were identified, and a review of the medical records was then performed to confirm persistent drainage. Draining wounds were first managed by instituting local wound care measures. In patients in whom drainage persisted beyond 7 days, a superficial irrigation and debridement (I&D) was performed if the fascia was intact; if the fascia was not intact, modular parts were exchanged. TJAs that underwent subsequent I&D, revision surgery, or developed PJI within one year were identified. Results. Draining wounds were identified in 6.2% (302/4,873) of all TJAs. Overall, 65% (196/302) of patients with draining wounds did not require any surgical procedure. Of the patients with persistent drainage, 9.8% underwent I&D and 25.0% underwent revision arthroplasty. Moreover, 15.9% of these patients developed PJI within one year. Compared to those without wound drainage, TJAs complicated by wound drainage demonstrated an odds ratio of 16.9 (95% CI: 9.1–31.6) for developing PJI and 18.0 (95% CI: 11.3–28.7) for undergoing subsequent surgery. Conclusions. Wound drainage after TJA is a major risk factor for subsequent PJI, and its proper management is of paramount importance. Our results demonstrated that drainage ceased spontaneously with local wound care measures alone in 65% of patients. Wounds with persistent drainage were at substantially higher risk of PJI than those that healed uneventfully.


Orthopaedic Proceedings
Vol. 102-B, Issue SUPP_6 | Pages 136 - 136
1 Jul 2020
Tushinski D Winemaker MJ De Beer J Petruccelli D Mertz D Main C Piccirillo E

Prosthetic joint infections (PJI) are amongst the most feared postoperative complications of total joint replacement (TJR). PJIs are associated with significant morbidity ranging from functional impairment to amputation. Staphylococcus aureus (S. aureus) is one of the most common causative organisms involved in PJI. More than one quarter of the general population are S. aureus carriers, and carrier status has been shown to increase the risk of developing surgical site infections, including PJIs. Decolonization of S. aureus carriers prior to surgery has demonstrated promising results in general surgery; however, solid evidence supporting decolonization in orthopaedic patients is lacking. We aimed to seek further evidence supporting preoperative screening and S. aureus decolonization in patients undergoing primary or revision hip and knee TJR. A quasi-experimental quality improvement study was conducted to compare the 5-year baseline rate of deep PJI with that during a one-year screening and decolonization intervention period. All consecutive patients who underwent primary or revision TJR at one tertiary care hospital in Hamilton, ON, Canada were included in both study periods. Nasal and throat screening for S. aureus carriage of all eligible TJR patients in the preoperative clinic was implemented as standard of care. Patients who tested positive were contacted and provided with details of the S. aureus decolonization protocol. Decolonization followed a standardized treatment protocol of 2% intranasal mupirocin twice daily for the five days prior to the surgery date (excluding the day of surgery), and chlorhexidine gluconate (2%) wipes used once daily for the 4 days prior to the surgery date and on the morning of surgery. Regardless of colonization status at the preoperative clinic visit, all patients were re-swabbed on the day of surgery. The primary outcome of interest was the rate of deep PJI, as per CDC/NHSN criteria, at one-year postoperative follow-up.
Secondary outcomes included the rate of deep PJIs due to S. aureus, adherence to the decolonization protocol, the proportion of S. aureus carriers successfully decolonized, and the proportion of patients deemed non-carriers on the preoperative swab who were subsequently identified as carriers on the day of surgery. A total of 8,505 patients were included in the 5-year control group and 1,883 during the intervention period, of whom 424 (22.5%) were identified as S. aureus carriers. The deep PJI rate was similar in the two groups: 0.4% (7/1,883) in the intervention group and 0.5% (42/8,505) in the control group (OR 0.75, 95% CI 0.34–1.67, p=0.58). More importantly, we found a significant reduction in PJI due to S. aureus, to only one case in the intervention period (0.05%) compared with 29 cases (0.3%) in the historic control group (OR 0.15, 95% CI 0.004–0.94, p=0.0376). We found a significant reduction in PJIs due to S. aureus by decolonizing S. aureus carriers prior to surgery; however, no significant difference in overall infection rates was observed. In conclusion, routine implementation of active screening for S. aureus and decolonization of carriers before TJR is feasible and helps to reduce PJI due to S. aureus.
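The reported odds ratios can be reproduced from the raw counts given in the abstract. The sketch below computes point estimates only: with a single event in the intervention group, the quoted 95% CI (0.004–0.94) was presumably derived by an exact method, so the usual normal approximation for the interval is not attempted here.

```python
# Reproducing the reported odds ratios from the raw counts in the
# abstract (point estimates only; the quoted CIs likely used an exact
# method, which is not replicated here).

def odds_ratio(events_a: int, total_a: int, events_b: int, total_b: int) -> float:
    """Odds ratio for group A versus group B from event counts."""
    odds_a = events_a / (total_a - events_a)
    odds_b = events_b / (total_b - events_b)
    return odds_a / odds_b

# Deep PJI overall: 7/1,883 (intervention) vs 42/8,505 (control)
or_overall = odds_ratio(7, 1883, 42, 8505)   # ~0.75, as reported
# Deep PJI due to S. aureus: 1/1,883 vs 29/8,505
or_saureus = odds_ratio(1, 1883, 29, 8505)   # ~0.15, as reported
```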


Orthopaedic Proceedings
Vol. 94-B, Issue SUPP_XXXIV | Pages 30 - 30
1 Jul 2012
Blocker O Singh S Lau S Ahuja S

The aim of the study was to highlight an important pitfall, currently absent from the Advanced Trauma Life Support (ATLS) protocol, in the application of a rigid collar to patients with a potentially unstable cervical spine injury. We present a case series of two patients with ankylosed cervical spines who developed neurological complications following application of a rigid collar for cervical spine injuries as per the ATLS protocol. This was followed up with a survey of A&E and T&O doctors who regularly apply cervical collars for suspected unstable cervical spine injuries. The survey was conducted by telephone using a standard questionnaire. 75 doctors completed the questionnaire: A&E = 42, T&O = 33; junior grade = 38, middle grade = 37; frontline trauma management experience >1 year = 50, <1 year = 25. Of the 75 respondents, 68/75 (90.6%) would follow the ATLS protocol in applying a rigid collar in potentially unstable cervical spine injuries, and 58/75 (77.3%) would clinically assess the patient prior to applying the collar. Only 43/75 (57.3%) thought the patient's relevant past medical history would influence collar application. Respondents were asked whether they were aware of any pitfalls to rigid collar application in suspected neck injuries: 34/75 (45.3%) stated that they were NOT aware of any pitfalls. The lack of awareness was even higher, at 17/25 (68%), amongst doctors with less than 12 months of frontline experience. When directly asked whether ankylosing spondylitis should be regarded as a pitfall, only 43/75 (57.3%) answered in the affirmative. We would like to emphasise the disastrous consequences of applying a rigid collar in patients with an ankylosed cervical spine. The survey demonstrates the lack of awareness (∼50%) amongst A&E and T&O doctors regarding pitfalls to collar application. We recommend that the ATLS manual highlight a pitfall regarding application of rigid collars in patients with ankylosed spines and suspected cervical spine injuries.


Orthopaedic Proceedings
Vol. 95-B, Issue SUPP_18 | Pages 22 - 22
1 Apr 2013
Hosny H Srinivasan S Keenan J Fekry H

The Medicines and Healthcare products Regulatory Agency (MHRA) released an alert in 2010 regarding metal-on-metal (MoM) bearings in hip arthroplasty, owing to soft tissue reactions to metal debris. Following this, we adopted a targeted screening protocol to review patients with this bearing couple. 218 patients (252 hips), mean age 53.2 (25–71) years, were assessed clinically using the Oxford hip score (OHS) and by radiographic examination. The mean follow-up was 44.5 (12–71) months. Patients were considered at higher risk (118 patients/133 hips) if they had deterioration of the OHS (50 hips), small-sized heads <50 mm (114 hips), acetabular inclination >50° (37 hips), or neck thinning (17 hips). Of these, 107/118 patients (120/133 hips) were further investigated by measurement of blood metal ion levels and magnetic resonance imaging (MRI). The mean blood levels of cobalt and chromium in this group were 6.7 and 8.62 µg/L respectively. Metal ion levels increased significantly with high acetabular inclination angles (p=0.01 and 0.004 respectively) but were not affected by head size (p=0.13). MRI showed periprosthetic lesions around 28 hips (26 fluid collections, 2 pseudotumours). The screening protocol detected all patients who subsequently required elective revision. We believe that this protocol was beneficial in detecting problematic MoM hip replacements.
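The targeted-screening criteria described above can be summarised as a simple triage rule. This is an illustrative sketch only; the function name and parameters are hypothetical and do not represent the authors' software.

```python
# Hypothetical sketch of the targeted-screening triage described in the
# abstract: a MoM hip is escalated to metal ion measurement and MRI if
# any higher-risk feature is present.

def needs_further_investigation(ohs_deteriorated: bool,
                                head_diameter_mm: float,
                                acetabular_inclination_deg: float,
                                neck_thinning: bool) -> bool:
    """Flag a MoM hip for metal ion testing and MRI if any of the
    higher-risk features from the screening protocol is present."""
    return (ohs_deteriorated
            or head_diameter_mm < 50
            or acetabular_inclination_deg > 50
            or neck_thinning)
```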


Orthopaedic Proceedings
Vol. 104-B, Issue SUPP_11 | Pages 44 - 44
1 Nov 2022
Khadabadi N Murrell J Selzer G Moores T Hossain F

Abstract

Introduction

We aimed to compare the outcomes of elderly patients with periarticular distal femur or supracondylar periprosthetic fractures treated with either open reduction internal fixation or distal femoral replacement.

Methods

A retrospective review of patients over 65 years with AO type B and C fractures of the distal femur, or Su type I and II periprosthetic fractures, treated with either a DFR or ORIF was undertaken. Outcomes including length of stay, PROMs (Oxford Knee Score and EQ-5D), infection, union, mortality, complication, and reoperation rates were assessed. Data on confounding variables were also collected for multivariate analysis. Patients below 65 years and those with extra-articular fractures were excluded.


Orthopaedic Proceedings
Vol. 95-B, Issue SUPP_29 | Pages 64 - 64
1 Aug 2013
Sabnis B Maheshwari R Walmsley P Brenkel I

Blood loss following total hip replacement is a major contributor to increased morbidity and length of stay. Various techniques have been described to reduce it. We now follow a set protocol combining rivaroxaban for thromboprophylaxis and tranexamic acid to reduce immediate postoperative bleeding. Patients and methods: Using prospectively collected data, we looked at two groups of consecutive patients undergoing THR; the protocol was the only factor changed during the period studied. Initially we used subcutaneous dalteparin injections and continued aspirin in the perioperative period following total hip replacement (Group I, 317 patients). A new protocol was then introduced involving rivaroxaban for thromboprophylaxis, with its first dose at least 8 hours after skin closure, and stopping aspirin at least 7 days before operation. In addition, tranexamic acid was given at a dose of 500 mg (or 1 g in obese patients) intravenously just prior to incision (Group II, 348 patients). We compared the two groups with regard to Hb drop at 24 hours and blood transfusion requirement. Results: The average Hb drop at 24 hours postoperatively was 3.08 g/dl in Group I compared to 2.31 g/dl in Group II (p<0.001). 62 (19.6%) patients in Group I required blood transfusion compared to 11 (3.2%) in Group II (p=0.001). The reductions in perioperative blood loss and length of stay were also significant. There was no increase in the number of DVT/PE events, although the sample size was too small to assess this statistically. Conclusion: This protocol drastically reduces the requirement for postoperative blood transfusion, helping to reduce length of stay following hip replacement.
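One way to check the reported transfusion-rate difference from the raw counts is a pooled two-proportion z-test; this is an illustrative sketch and not necessarily the test the authors used.

```python
# One way (not necessarily the authors' method) to test the reported
# difference in transfusion rates: a pooled two-proportion z-test on
# the raw counts from the abstract.
from math import sqrt

def two_proportion_z(events_a: int, n_a: int, events_b: int, n_b: int) -> float:
    """Pooled two-proportion z-statistic for comparing event rates."""
    p_a, p_b = events_a / n_a, events_b / n_b
    pooled = (events_a + events_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Transfusion: 62/317 (Group I, 19.6%) vs 11/348 (Group II, 3.2%)
z = two_proportion_z(62, 317, 11, 348)  # z well above conventional thresholds
```

The resulting z-statistic is far beyond the 1.96 threshold for p < 0.05, consistent with the significant difference reported.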