Lumbar fusion remains the gold standard for the treatment of discogenic back pain. Total disc replacement has fallen out of favor in many institutions, and other motion preservation alternatives, such as nucleus replacement, have had limited success; none are commercially available at this time. We conducted two prospective, nonrandomized, multicenter studies of lumbar disc nucleus replacement using the PerQdisc 2.0 nucleus replacement device in patients with lumbar discogenic back pain, and present early clinical results here. A total of 16 patients from 4 international sites (Germany, Paraguay, Canada, and Belgium) were enrolled between May 2019 and February 2021. Data collection points included baseline and 1, 2, 6, and 12 months postoperatively. Clinical outcome measures were obtained from the Visual Analog Scale (VAS) for back and leg pain, Oswestry Disability Index (ODI), SF-12v2, Analgesic Score (AS), and radiographic assessments. We prospectively gathered data on patient-reported outcomes, neurological outcome, surgical results, radiological analysis, and adverse events. All 16 patients had successful implantation of the device, and there have been no expulsions. Early postoperative results are available for 13/16 patients at 6 months and 11/16 patients at 12 months. There have been 4 (25%) revision surgeries 3–12 months post implantation between the two trials. 12 of 13 (92%) patients reached the Minimal Clinically Important Difference (MCID) in ODI at 6 months and 10 of 11 (91%) at 12 months. Mean decrease in ODI from baseline to 12 months was 44.8 points. At 12 months, 8 (73%) patients were not taking pain medication and 1 (9%) patient was taking a narcotic for pain management; 73% of patients were working without restrictions. Early clinical and technical results are encouraging. Long-term follow-up is essential and is forthcoming; additional patient recruitment and data collection are ongoing.
There is a paucity of published Canadian literature comparing lumbar total disc arthroplasty (LDA) to fusion on patient outcomes in degenerative spondylosis. The purpose of this study was to quantify and compare long-term patient-reported outcomes following LDA and matched fusion procedures. We conducted a matched-cohort study comparing consecutive patients enrolled by CSORN who underwent standalone primary LDA or hybrid techniques for degenerative disc disease between 2015 and 2019. Fusion patients were included by a primary diagnosis of degenerative disc disease and a chief complaint of back pain, and received a primary fusion irrespective of technique. Fusion patients were matched to their LDA counterparts by number of involved surgical levels. Outcome scores and patient satisfaction were assessed preoperatively and 2 years postoperatively. 97 patients (39 female, 58 male) underwent LDA or a hybrid construct of up to 4 levels, and 94 patients (52 female, 42 male) who underwent lumbar fusion were selected based on the inclusion criteria. 36 LDA and 57 fusion patients underwent 1-level surgery; 39 LDA and 25 fusion patients, 2-level surgery; 18 LDA and 7 fusion patients, 3-level surgery; and 4 LDA and 5 fusion patients, a 4-level procedure. A slight difference in average cohort age was found (LDA 43.4 yrs, fusion 49.8 yrs, p<0.01). Preoperative BMI (LDA 27.0 kg/m2, fusion 27.9 kg/m2, p=0.29) and total comorbidities (LDA 2.6, fusion 2.1, p=0.05) demonstrated no clinically significant differences. At 2-year follow-up, no differences were found between cohorts in ODI improvement (LDA 20.32 pts, fusion 17.02 pts, p=0.36), numerical back-pain improvement (LDA 3.5 pts, fusion 3.06 pts, p=0.40), numerical leg-pain improvement (LDA 1.67 pts, fusion 1.87 pts, p=0.76), or Health Scale improvement (LDA 17.12, fusion 10.73, p=0.20). Similar findings were observed in subgroups stratified by number of surgical levels.
Satisfaction rates at 2 years were 86.7% and 82.4% for LDA and fusion patients, respectively. There did not appear to be significant differences in outcomes or satisfaction through 2 years between patients who underwent LDA (whether in isolation or as part of a hybrid construct) for debilitating degenerative disc disease and those who underwent isolated spinal fusion for back-dominant pain.
The objective of this paper is to demonstrate the difference in post-operative complication rates between computer-assisted surgery (CAS) and conventional techniques in spine surgery. Several studies have shown that the accuracy of pedicle screw placement improves significantly with the use of CAS, yet few studies have compared the incidence of post-operative complications between CAS and conventional techniques. The American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP) database was used to identify patients who underwent posterior lumbar fusion from 2011 to 2013. Multivariate analysis was conducted to compare post-operative complication rates between CAS and conventional techniques. Out of 15,222 patients, 14,382 (95.1%) were operated on with conventional techniques and 740 (4.90%) with CAS. Multivariate analysis showed that patients in the CAS group had lower odds of experiencing adverse events post-operatively (OR 0.57, p<0.001). This paper examined the complications of lumbar spinal surgery with or without the use of CAS; the results suggest that CAS may provide a safer technique for implant placement in lumbar fusion surgeries.
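As a rough illustration of how an odds ratio such as the OR 0.57 above is read, the sketch below computes an unadjusted odds ratio from a 2x2 table. The event counts are hypothetical (chosen only so the group totals match the 740 CAS and 14,382 conventional patients reported); the study's figure came from a multivariate model, which this sketch does not reproduce.

```python
# Illustrative only: event counts are hypothetical, not ACS-NSQIP data.
# An unadjusted odds ratio from a 2x2 (technique x adverse-event) table;
# an OR below 1 means lower odds of the event in the exposed group.

def odds_ratio(a, b, c, d):
    """OR = (a/b) / (c/d): a = exposed with event, b = exposed without,
    c = unexposed with event, d = unexposed without."""
    return (a / b) / (c / d)

# Hypothetical: 40 of 740 CAS patients and 1,250 of 14,382 conventional
# patients with an adverse event.
or_cas = odds_ratio(40, 700, 1250, 13132)
print(round(or_cas, 2))
```

An OR of about 0.6 in this toy table would be read the same way as the study's adjusted OR 0.57: roughly 40% lower odds of an adverse event in the CAS group.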
Cervical spine fusions have gained interest in the literature, as these procedures are now ever more frequently performed in an outpatient setting with few complications and acceptable results. The purpose of this study was to assess the rate of blood transfusion after cervical fusion surgery and its effect, if any, on complication rates. The American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP) database was used to identify patients who underwent cervical fusion surgery from 2010 to 2013. Univariate and multivariate regression analyses were used to determine post-operative complications associated with transfusion and cervical fusion. We identified 11,588 patients who had cervical spine fusion between 2010 and 2013; the overall transfusion rate was 1.47%. Transfused patients overall had increased risk of venous thromboembolism (VTE) (OR 3.19, CI: 1.16–8.77), myocardial infarction (MI) (OR 9.12, CI: 2.53–32.8), increased length of stay (LOS) (OR 28.03, CI: 14.28–55.01), and mortality (OR 4.14, CI: 1.44–11.93). Transfused single-level fusion patients had increased risk of VTE (OR 3.37, CI: 1.01–11.33), MI (OR 10.5, CI: 1.88–59.89), and increased LOS (OR 14.79, CI: 8.2–26.67). Transfused multilevel fusion patients had increased risk of VTE (OR 5.64, CI: 1.15–27.6), surgical site infection (OR 16.29, CI: 3.34–79.49), MI (OR 10.84, CI: 2.01–58.55), increased LOS (OR 26.56, CI: 11.8–59.78), and mortality (OR 10.24, CI: 2.45–42.71). Transfused ACDF patients had increased risk of VTE (OR 4.87, CI: 1.04–22.82), surgical site infection (OR 9.73, CI: 2.14–44.1), MI (OR 9.88, CI: 1.87–52.2), increased LOS (OR 28.34, CI: 13.79–58.21), and mortality (OR 6.3, CI: 1.76–22.48). Transfused posterior fusion patients had increased risk of MI (OR 10.45, CI: 1.42–77.12) and increased LOS (OR 4.42, CI: 2.68–7.29).
Our results demonstrate that although cervical fusions can be done as outpatient procedures, special precautions and investigations should be undertaken for patients who receive a transfusion after cervical fusion surgery. These patients have higher rates of MI, venous thromboembolism, wound infection, and mortality than those who do not receive a transfusion.
Hemorrhage and transfusion requirements are common in spine surgery, especially in thoracic and lumbar fusion procedures. The purpose of this paper is to determine predictive factors for transfusion and their effect on short-term post-operative outcomes for thoracic and lumbar fusions. The American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP) database was used to identify patients who underwent lumbar or thoracic fusion surgery from 2010 to 2013. Univariate and multivariate regression analyses were used to determine predictive factors and post-operative complications associated with transfusion. A total of 14,249 patients were included in this study; 13,586 had lumbar fusion and 663 had thoracic fusion surgery. The prevalence of transfusion was 35% for thoracic fusion and 17.5% for lumbar fusion. Multivariate analysis showed that age 50–60 (OR 1.38, CI: 1.23–1.54), age 61–70 (OR 1.65, CI: 1.40–1.95), dyspnea (OR 1.11, CI: 1.02–1.23), hypertension (OR 1.14, CI: 1.02–1.27), ASA class (OR 1.73, CI: 1.18–1.45), pre-operative blood transfusion (OR 1.91, CI: 1.04–3.49), and extended surgical time (OR 4.51, CI: 4.09–4.98) were predictors of blood transfusion requirements for lumbar fusion, while only pre-operative BUN (OR 1.04, CI: 1.01–1.06) and extended surgical time (OR 4.70, CI: 3.12–6.96) were predictors for thoracic fusion. In contrast, higher pre-operative hematocrit was protective against transfusion. Transfused patients who underwent lumbar fusion were at increased risk of superficial wound infection, deep wound infection, venous thromboembolism, and myocardial infarction, and had a longer length of hospital stay. Transfused patients who underwent thoracic fusion were more likely to have venous thromboembolism and an extended length of hospital stay. Mortality, however, was not associated with blood transfusion.
This study used a large database to characterise the incidence, predictors, and post-operative complications associated with blood transfusion in thoracic and lumbar fusion surgeries. Pre- and post-operative planning for patients deemed at high risk of requiring blood transfusion should be considered to reduce post-operative complications in this population.
Linear spinal cord distraction, in animal models, leads to elevated intra-compartmental spinal cord pressure. We developed an in vitro model of distraction, with increasing tensile force, to demonstrate the relationship between the degree of spinal curvature and the proportional elevation of intra-compartmental pressure. Six porcine spinal cord sections (two cervical, two thoracic, and two lumbar) were harvested from 30 kg pigs. These cord sections were individually stretched in a saline solution with increasing tensile force applied, and cord interstitial pressure (CIP) was monitored with an arterial line pressure monitor. The sections were each tested six times fresh, then frozen, thawed, and tested an additional six times. An additional ten freshly thawed cords were tested in linear distraction and over forty-five-degree and ninety-degree curved surfaces with CIP monitoring. Increased tension, applied by adding progressively heavier distraction weights, led to a proportionally elevated CIP in the linear model (R=0.986). Paired t-testing at the 99% confidence level demonstrated no significant difference between fresh specimens and recently thawed cords. As the degree of spinal curvature increased from a linear model to a forty-five-degree and then a ninety-degree (Cobb) curve, there were significant increases in CIP at the same distraction force. The more severe the curve, the greater the CIP for each increment in distraction force; ninety-degree curves produced 2.3x higher pressure than linear distraction. High CIP (>140 mm Hg) can be achieved through spinal cord distraction. This CIP is not only directly proportional to tension but also proportionally magnified by the degree of spinal curvature, and is not affected by freezing and thawing. This may suggest that spinal cord compartment syndrome is a potential mechanism of spinal cord distraction injury, and that these distraction pressures are potentially magnified in the setting of scoliosis.
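The relationship reported above can be sketched numerically: CIP proportional to distraction tension, scaled by a curvature-dependent multiplier. In this minimal sketch the slope is hypothetical; only the 2.3x ninety-degree multiplier is taken from the results, and the linear form is assumed from the reported proportionality.

```python
# Illustrative only: the slope below is hypothetical, not a measured value.
# Sketch of the proportional tension -> pressure relationship, with the
# ~2.3x magnification the study observed for a ninety-degree curve.

def fitted_cip(tension_g, slope_mmhg_per_g, curvature_factor=1.0):
    # CIP proportional to distraction tension, scaled by curvature
    return slope_mmhg_per_g * tension_g * curvature_factor

linear = fitted_cip(500, 0.125)         # linear distraction, hypothetical slope
curved = fitted_cip(500, 0.125, 2.3)    # same tension over a ninety-degree curve
print(linear, curved)
```

At the same hypothetical 500 g of tension, the curved model yields 2.3 times the pressure of the linear model, mirroring the study's finding that curvature magnifies, rather than adds to, the distraction pressure.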