Introduction
Despite the widespread use of opioids for pain control in post-operative joint arthroplasty patients, data regarding actual opioid consumption by opioid-naïve patients during the recovery period are limited. We sought to determine post-operative opioid consumption for opioid-naïve patients undergoing total knee and total hip arthroplasty (TKA and THA) procedures.
Methods
The study cohort consisted of 55 patients (29 females, 26 males) who underwent either primary unilateral TKA (n=28) or THA (n=27). Prior to discharge, patients were provided with a medication log on which to track daily consumption of pain medication. Patients were asked to provide details regarding the type of pain medication, quantity and frequency of use, and pain score at the time of use. Patients were contacted weekly by a member of the study team to monitor compliance. Specific opioid prescription information was obtained for each subject from the electronic medical record. Subjects returned the completed logs once they ceased opioid use post-operatively. Daily opioid quantity was converted to a daily Morphine Equivalent Dose (MED). Average daily, weekly, and total post-operative use was calculated for all opioid variables. Descriptive statistics (mean, frequency, standard deviation) were used to analyze the opioid data. All dependent variables were compared between TKA and THA patients using separate independent-samples t-tests or chi-square tests.
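A minimal sketch of the MED conversion and TKA-vs-THA comparison described above, assuming hypothetical conversion factors, file name, and column layout (none of which are specified in the abstract):

```python
# Sketch: convert logged daily opioid doses to morphine equivalent dose (MED)
# and compare TKA vs. THA consumption. Conversion factors and column names
# (drug, daily_mg, procedure) are illustrative assumptions, not from the study.
import pandas as pd
from scipy import stats

# Hypothetical oral morphine equivalence factors (mg morphine per mg drug)
MED_FACTORS = {"oxycodone": 1.5, "hydrocodone": 1.0, "tramadol": 0.1, "morphine": 1.0}

def daily_med(row):
    """Daily quantity (mg) x conversion factor = daily MED (mg)."""
    return row["daily_mg"] * MED_FACTORS[row["drug"]]

log = pd.read_csv("medication_logs.csv")       # one row per patient-day (assumed layout)
log["daily_med"] = log.apply(daily_med, axis=1)

# Total post-operative MED per patient, then descriptive statistics by procedure
per_patient = log.groupby(["patient_id", "procedure"])["daily_med"].sum().reset_index()
print(per_patient.groupby("procedure")["daily_med"].agg(["mean", "std"]))

# Independent-samples t-test: TKA vs. THA total MED
tka = per_patient.loc[per_patient["procedure"] == "TKA", "daily_med"]
tha = per_patient.loc[per_patient["procedure"] == "THA", "daily_med"]
t, p = stats.ttest_ind(tka, tha)
print(f"t = {t:.2f}, p = {p:.3f}")
```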
Introduction
Opioids are an integral part of pain management following total joint replacement procedures; however, to date, no evidence-based guidelines exist to regulate opioid prescribing practices. In order to determine the appropriate quantity of opioids required to control pain in post-arthroplasty patients, it is important to understand why patients are using them. We sought to identify the causes of pain that necessitated opioid consumption for patients following total knee (TKA) and total hip (THA) arthroplasty.
Methods
The study cohort consisted of 55 patients (29 females, 26 males) who underwent either primary unilateral TKA (n=28) or THA (n=27). Prior to discharge, patients were provided with a pain diary in which to record details regarding the type of pain medication used, the time of use, the pain score at the time of use, and the specific reason for use. Subjects returned the completed diaries once they ceased opioid use post-operatively. Based on responses, we categorized reasons for use into either Activity, which was further classified into ADL and Exercise, or Rest, which was further classified into Sleeping, Sitting, and Lying Down. Average opioid consumption and frequency of use, along with the pain score at the time of use, were calculated for each category. All dependent variables were compared between TKA and THA patients using separate independent-samples t-tests or chi-square tests.
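A sketch of how diary entries might be mapped into the Activity/Rest hierarchy and compared between procedures; the free-text reason strings, file name, and column names are illustrative assumptions:

```python
# Sketch: map each diary entry's stated reason to a category/subcategory and
# compare the distribution of use between TKA and THA with a chi-square test.
# Reason strings and column names are assumptions for illustration.
import pandas as pd
from scipy.stats import chi2_contingency

SUBCATEGORY = {
    "activities of daily living": ("Activity", "ADL"),
    "physical therapy exercise":  ("Activity", "Exercise"),
    "sleeping":                   ("Rest", "Sleeping"),
    "sitting":                    ("Rest", "Sitting"),
    "lying down":                 ("Rest", "Lying Down"),
}

diary = pd.read_csv("pain_diaries.csv")   # one row per logged dose (assumed layout)
mapped = diary["reason"].str.lower().map(SUBCATEGORY)
diary["category"] = mapped.str[0]
diary["subcategory"] = mapped.str[1]

# Frequency of use and mean pain score at the time of use, per category
print(diary.groupby(["category", "subcategory"])["pain_score"].agg(["count", "mean"]))

# Chi-square test: does the Activity/Rest split differ between TKA and THA?
table = pd.crosstab(diary["procedure"], diary["category"])
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```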
Background
Multiple retrospective studies have compared ultracongruent (UC) bearings with traditional bearings and shown comparable results when looking at clinical and radiologic variables, complication rates, and implant survivorship; however, debate still exists regarding the optimal bearing surface. The present study seeks to determine whether any preoperative patient demographic or medical factors, or anatomic variables including femoral condylar offset and tibial slope, predict the use of a UC bearing compared with a standard cruciate-retaining (CR) bearing.
Methods
The study cohort consisted of 117 patients (41 males, 76 females) who underwent primary TKA with the senior author. The implants utilized were either the CR or UC polyethylene components of the Zimmer Persona Total Knee System. Insert selection was based on intraoperative assessment of PCL integrity and soft-tissue balancing. Patient demographics (age, gender, BMI) and co-morbidities (hypertension, diabetes, depression, cardiac disease, and lung disease) were recorded. Intraoperative variables of interest included extension and flexion range of motion, estimated blood loss (EBL), tourniquet time, and polyethylene and femoral component sizes. We calculated the change in tibial slope and femoral condylar offset from pre- to post-surgery and computed the percentage of patients in whom tibial slope or femoral condylar offset increased. Postoperative variables, including length of stay, complication rates, and reoperation rates, were recorded. All dependent variables were compared between patients who received the UC component and patients who received the CR component. Continuous variables were assessed using independent-samples t-tests, while categorical variables were compared using the chi-square test of independence.
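A sketch of the pre-to-post change calculation and the UC-vs-CR comparisons, under an assumed data layout (the abstract does not state the software or file structure used):

```python
# Sketch: compute change in tibial slope and femoral condylar offset from
# pre- to post-operative measurements, the percentage of patients with an
# increase, and compare UC vs. CR groups. Column names are assumed.
import pandas as pd
from scipy import stats

df = pd.read_csv("tka_cohort.csv")   # one row per patient (assumed layout)

for var in ("tibial_slope", "condylar_offset"):
    df[f"{var}_change"] = df[f"{var}_post"] - df[f"{var}_pre"]
    df[f"{var}_increased"] = df[f"{var}_change"] > 0
    pct = 100 * df[f"{var}_increased"].mean()
    print(f"{var}: {pct:.1f}% of patients increased")

# Continuous variable (e.g., change in tibial slope): independent-samples t-test
uc = df[df["insert"] == "UC"]
cr = df[df["insert"] == "CR"]
t, p = stats.ttest_ind(uc["tibial_slope_change"], cr["tibial_slope_change"])
print(f"slope change, UC vs CR: t = {t:.2f}, p = {p:.3f}")

# Categorical variable (e.g., slope increased yes/no): chi-square test
table = pd.crosstab(df["insert"], df["tibial_slope_increased"])
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"increase in slope, UC vs CR: chi2 = {chi2:.2f}, p = {p:.3f}")
```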
Introduction
Interferon (IFN)-based treatments for chronic hepatitis C virus (HCV) infection were the standard of care until 2014, when direct-acting antiviral agents (DAA) were introduced. Patients with HCV have extremely high complication rates after total hip arthroplasty (THA). It is unknown whether HCV is a modifiable risk factor for these complications prior to THA. The purpose of this study was 1) to compare perioperative complication rates between untreated and treated HCV patients undergoing THA and 2) to compare these rates between patients treated with two different therapies (IFN vs. DAA).
Methods
A multicenter retrospective database query was used to identify patients diagnosed with chronic HCV infection who underwent THA from 2006–2016. All patients identified (n=105) were included and were divided into two groups: untreated HCV (n=63) and treated HCV (n=42); the treated group was further subdivided into those receiving IFN-based therapies (n=16) or DAA therapies (n=26). Comparisons between the treated and untreated groups were made with respect to demographic data, comorbidities, preoperative viral load, MELD score, and all surgical (≤1 year) and medical (≤90 day) complications; a sub-group analysis of the treated patients was also performed. Separate independent-samples t-tests were conducted for dependent variables that were normally distributed, and Mann-Whitney U tests were conducted for variables that were not normally distributed. Categorical variables were compared using the chi-square test of independence. The level of statistical significance was set at p<0.05.
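A sketch of the test-selection logic described above; the Shapiro-Wilk test is used here as the normality check, which the abstract does not specify, and the example data are made up:

```python
# Sketch: for each continuous outcome, choose an independent-samples t-test
# when both groups look normally distributed and a Mann-Whitney U test
# otherwise. The normality check (Shapiro-Wilk) is an assumption; the
# abstract does not name the method used.
from scipy import stats

ALPHA = 0.05

def compare_groups(untreated, treated):
    """Return (test_name, statistic, p_value) for two independent samples."""
    normal = (stats.shapiro(untreated).pvalue > ALPHA and
              stats.shapiro(treated).pvalue > ALPHA)
    if normal:
        res = stats.ttest_ind(untreated, treated)
        return "t-test", res.statistic, res.pvalue
    res = stats.mannwhitneyu(untreated, treated, alternative="two-sided")
    return "Mann-Whitney U", res.statistic, res.pvalue

# Example with made-up MELD scores (illustrative only)
untreated_meld = [8, 9, 7, 10, 12, 9, 8, 11]
treated_meld = [7, 8, 6, 9, 10, 8, 7, 9]
print(compare_groups(untreated_meld, treated_meld))
```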
Introduction:
The sealing function of the acetabular labrum is central to the stability of the hip and the health of the joint. Disruption of the labrum has been shown to reduce intra-articular pressure and increase the rate of cartilage consolidation during static loading. Functional activities require movement of the hip through wide ranges of joint motion that disrupt joint congruency and thus may alter the seal. This study was performed to test the hypothesis that the sealing function of the labrum varies with the position of the hip during functional activities.
Methods:
Six fresh cadaveric hip joint specimens were obtained from donors with an average age of 45.5 ± 16.1 years (range, 25–63 years). Each specimen was dissected free of soft tissue, leaving the capsule and labrum intact, potted in mounting fixtures, and placed in a loading apparatus. Catheters were inserted into the central and peripheral compartments of each hip to allow infusion of fluid and monitoring of compartment pressures via miniature transducers (OMEGA Engineering, Inc.). After application of a joint load of 0.50 body weight (BW), fluid was introduced into the central compartment at a constant rate until transport was indicated by a rise in pressure within the peripheral compartment. These measurements were performed with each hip placed in 10 functional positions ranging from −5 to 105 degrees of flexion, −5 to 13 degrees of abduction, and −25 to 35 degrees of external rotation, simulating the sequential stages of gait, stooping, and pivoting. Motion analysis was performed via reflective marker arrays attached to the femur and pelvis to allow computer visualization of the position of the pelvis and femur using CT reconstructions. In each hip position, we measured the peak pressure (kPa) developed within the central compartment prior to fluid transfer to the peripheral compartment.
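One way the peak central-compartment pressure prior to fluid transfer might be extracted from synchronized pressure recordings is sketched below; the sampling layout, baseline window, and rise threshold are assumptions, not the authors' protocol:

```python
# Sketch: detect the onset of fluid transport as the first rise in
# peripheral-compartment pressure above its baseline, then report the peak
# central-compartment pressure (kPa) recorded before that point. The rise
# threshold and baseline window are illustrative assumptions.
import numpy as np

def peak_pressure_before_transfer(central_kpa, peripheral_kpa,
                                  baseline_samples=100, rise_threshold_kpa=0.5):
    central = np.asarray(central_kpa, dtype=float)
    peripheral = np.asarray(peripheral_kpa, dtype=float)

    baseline = peripheral[:baseline_samples].mean()
    above = np.flatnonzero(peripheral > baseline + rise_threshold_kpa)
    if above.size == 0:
        return central.max()            # no transport detected during the ramp
    onset = above[0]                    # first sample indicating transport
    return central[:onset + 1].max()    # peak central pressure up to onset

# Example with synthetic traces (steady infusion ramp, delayed peripheral rise)
t = np.arange(0, 10, 0.01)
central = 2.0 * t
peripheral = np.where(t > 6, 0.2 * (t - 6), 0.0)
print(f"peak central pressure before transfer: "
      f"{peak_pressure_before_transfer(central, peripheral):.1f} kPa")
```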
Objectives:
Experimental disruption of the labrum has been shown to compromise its sealing function and alter cartilage lubrication. However, it is not known whether pathological changes to the labrum secondary to femoro-acetabular impingement (FAI) have a similar impact on labral function. This study was performed to determine the effect of natural labral damage secondary to abnormal femoral morphology on the labral seal.
Methods:
Ten intact hip specimens were obtained from male donors (mean age, 47.8 ± 1.5 years). CT reconstructions demonstrated that 6 specimens were of normal morphology, while 4 displayed morphology typical of cam-FAI. Specimens were dissected free of the overlying soft tissue, leaving the capsule and labrum intact. Each specimen was potted and placed in a loading apparatus under a joint load of 0.5 BW. Pressures developed within the central and peripheral compartments were monitored with miniature pressure transducers. The sealing capacity of the labrum was measured by introducing fluid into the central compartment at a constant rate until transport was detected from the central to the peripheral compartment. These measurements were performed in 10 functional positions simulating the sequential stages of gait, stooping, and pivoting. During testing, the 3D motion of the femoral head in the acetabulum was measured with motion analysis combined with computer visualization. Peak pressures were compared between specimens with and without labral damage for each of the three activities (significance level, p < 0.05).
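A sketch of the per-activity comparison between intact and cam-FAI specimens, using an independent-samples t-test per activity as one reasonable reading of the stated p < 0.05 criterion; the file name and data layout are assumed:

```python
# Sketch: compare peak sealing pressures between specimens with and without
# labral damage separately for each activity (gait, stooping, pivoting).
# The data layout and the per-activity t-test are assumptions.
import pandas as pd
from scipy import stats

peaks = pd.read_csv("peak_pressures.csv")  # columns: specimen, labrum, activity, peak_kpa (assumed)

for activity, grp in peaks.groupby("activity"):
    intact = grp.loc[grp["labrum"] == "intact", "peak_kpa"]
    damaged = grp.loc[grp["labrum"] == "damaged", "peak_kpa"]
    t, p = stats.ttest_ind(intact, damaged)
    flag = "significant" if p < 0.05 else "not significant"
    print(f"{activity}: t = {t:.2f}, p = {p:.3f} ({flag})")
```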