Aims. The use of technology to assess balance and alignment during total knee surgery can present the surgeon with an overload of numerical data. At the same time, this quantification holds the potential to clarify and guide the surgical decision process when selecting the appropriate bone recut or soft tissue adjustment while balancing a total knee. Therefore, this paper evaluates the potential of deploying supervised
Degenerative lumbar spondylolisthesis (DLS) is a common condition with many available treatment options. The Degenerative Spondylolisthesis Instability Classification (DSIC) scheme, based on a systematic review of the best available evidence, was proposed by Simmonds et al. in 2015. This classification scheme proposes that the stability of the patient's pathology be determined by a surgeon based on quantitative and qualitative clinical and radiographic parameters. The purpose of this study is to utilise
Introduction & Aims. Patient recovery after total knee arthroplasty remains highly variable. Despite the growing interest in and implementation of patient-reported outcome measures (e.g. Knee Society Score, Oxford Knee Score), the recovery process of the individual patient is poorly monitored. Unfortunately, patient-reported outcomes represent a complex interaction of multiple physiological and psychological aspects; they are also limited by the discrete time intervals at which they are administered. The use of wearable sensors presents a potential alternative by continuously monitoring a patient's physical activity. These sensors, however, present their own challenges. This paper deals with the interpretation of the high-frequency time signals acquired when using accelerometer-based wearable sensors. Method. During a preliminary validation, five healthy subjects were equipped with two wireless inertial measurement units (IMUs). Using adhesive tape, these IMU sensors were attached to the thigh and shank, respectively. All subjects performed a series of supervised activities of daily living (ADL) in their everyday environment (1: walking, 2: stair ascent, 3: stair descent, 4: sitting, 5: lying, 6: standing). The supervisor timestamped the performed activities, such that the raw IMU signals could be uniquely linked to the performed activities. Subsequently, the acquired signals were reduced in Python. Each five-second time window was characterized by the minimum, maximum and mean acceleration per sensor node. In addition, the frequency response was analyzed per sensor node as well as the correlation between both sensor nodes. Various
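The window reduction described above can be sketched as follows. This is a minimal illustration, not the study's code: the 100 Hz sampling rate and the use of a single acceleration magnitude stream per node are assumptions.

```python
import numpy as np

def window_features(acc_a, acc_b, fs=100, win_s=5):
    """Reduce two synchronised acceleration streams to per-window features.

    acc_a, acc_b : acceleration magnitude per sensor node (thigh, shank)
    fs           : sampling rate in Hz (assumed value)
    win_s        : window length in seconds
    Returns one row per window: min/max/mean per node, dominant
    frequency per node, and the correlation between the two nodes.
    """
    acc_a = np.asarray(acc_a, dtype=float)
    acc_b = np.asarray(acc_b, dtype=float)
    n = int(fs * win_s)
    feats = []
    for start in range(0, min(len(acc_a), len(acc_b)) - n + 1, n):
        wa, wb = acc_a[start:start + n], acc_b[start:start + n]
        row = [wa.min(), wa.max(), wa.mean(),
               wb.min(), wb.max(), wb.mean()]
        # Dominant frequency per node from the magnitude spectrum
        # (mean removed so the DC bin does not dominate).
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        row += [freqs[np.argmax(np.abs(np.fft.rfft(wa - wa.mean())))],
                freqs[np.argmax(np.abs(np.fft.rfft(wb - wb.mean())))]]
        # Pearson correlation between the two sensor nodes.
        row.append(np.corrcoef(wa, wb)[0, 1])
        feats.append(row)
    return np.asarray(feats)
```

Each returned row can then serve as the feature vector for one five-second window in a downstream activity classifier.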
Single level discectomy (SLD) is one of the most commonly performed spinal surgery procedures. Two key drivers of its cost of care are duration of surgery (DOS) and postoperative length of stay (LOS). Therefore, the ability to preoperatively predict SLD DOS and LOS has substantial implications for both hospital and healthcare system finances, scheduling and resource allocation. As such, the goal of this study was to predict DOS and LOS for SLD using
Disorders of human joints manifest during dynamic movement, yet no objective tools are widely available for clinicians to assess or diagnose abnormal joint motion during functional activity.
Background. The advent of value-based conscientiousness and rapid-recovery discharge pathways presents surgeons, hospitals, and payers with the challenge of providing the same total hip arthroplasty episode of care in the safest and most economic fashion for the same fee, despite patient differences. Various predictive analytic techniques have been applied to medical risk models, such as sepsis risk scores, but none have been applied or validated in the elective primary total hip arthroplasty (THA) setting for key payment-based metrics. The objective of this study was to develop and validate a predictive
Objective. Wearable sensors have enabled objective functional data collection from patients before total knee replacement (TKR) and at clinical follow-ups post-surgery whereas traditional evaluation has solely relied on self-reported subjective measures. The timed-up-and-go (TUG) test has been used to evaluate function but is commonly measured using only total completion time, which does not assess joint function or test completion strategy. The current work employs
The success of a cementless Total Hip Arthroplasty (THA) depends not only on initial micromotion, but also on long-term failure mechanisms, e.g., implant-bone interface stresses and stress shielding. Any preclinical investigation aimed at designing a femoral implant needs to account for the temporal evolution of the interfacial condition while dealing with these failure mechanisms. The goal of the present multi-criteria optimization study was to search for the optimum implant geometry by implementing a novel
Pedicle screw fixation is a technically demanding procedure with potential difficulties, and reoperation rates are currently on the order of 11%. The most common intraoperative practice for position assessment of pedicle screws is biplanar fluoroscopic imaging, which is limited to two dimensions and is associated with low accuracy. We have previously introduced a full-dimensional position assessment framework based on registering intraoperative X-rays to preoperative volumetric images with sufficient accuracy. However, the framework requires a semi-manual process of pedicle screw segmentation, and the intraoperative X-rays have to be taken from defined positions in space in order to avoid occlusion by the pedicle screw head. This motivated us to develop advancements to the system to achieve a higher level of automation in the hope of greater clinical feasibility. In this study, we developed an automatic segmentation and X-ray adequacy assessment protocol. An artificial neural network was trained on a dataset of digitally reconstructed radiographs representing pedicle screw projections from different points of view. This model was able to segment the projection of any pedicle screw in a given X-ray with 93% pixel-wise accuracy. Once the pedicle screw was segmented, a number of descriptive geometric features were extracted from the isolated blob. These segmented images were manually labelled as ‘adequate’ or ‘not adequate’ depending on the visibility of the screw axis. The extracted features, along with their corresponding labels, were used to train a decision tree model that could classify each X-ray by adequacy with accuracy on the order of 95%. In conclusion, we presented here a robust, fast and automated pedicle screw segmentation process, combined with an accurate and automatic algorithm for classifying views of pedicle screws as adequate or not.
These tools represent a useful step towards full automation of our pedicle screw positioning assessment system.
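The adequacy-classification step, a decision tree trained on geometric features of the segmented screw blob, can be sketched as below. The feature set, its distributions, and the labels are invented for illustration only and are not the study's data.

```python
# Hypothetical sketch of the decision-tree adequacy classifier.
# Features per segmented screw blob (all values synthetic):
#   [blob area (px), elongation (major/minor axis), axis-visibility score]
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
adequate     = rng.normal([1200, 4.0, 0.9], [150, 0.5, 0.05], size=(50, 3))
not_adequate = rng.normal([ 900, 1.5, 0.3], [150, 0.5, 0.10], size=(50, 3))
X = np.vstack([adequate, not_adequate])
y = np.array([1] * 50 + [0] * 50)   # 1 = 'adequate', 0 = 'not adequate'

# Shallow tree: interpretable splits on the geometric descriptors.
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
```

In practice the features would come from the isolated blob of the neural-network segmentation, and accuracy would be estimated on a held-out set rather than the training data shown here.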
External validation of
Approximately 20% of patients feel unsatisfied 12 months after primary total knee arthroplasty (TKA). Current predictive tools for TKA focus on the clinician as the intended user rather than the patient. The aim of this study is to develop a tool that patients can use without clinician assistance to predict health-related quality of life (HRQoL) outcomes 12 months after total knee arthroplasty (TKA). All patients with primary TKAs for osteoarthritis between 2012 and 2019 at a tertiary institutional registry were analysed. The predictive outcome was improvement in Veterans-RAND 12 utility score at 12 months after surgery. Potential predictors included patient demographics, co-morbidities, and patient-reported outcome scores at baseline. Logistic regression and three
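A logistic-regression baseline of the kind described can be sketched with scikit-learn. Everything below is an assumption for demonstration: the predictor set, the synthetic cohort, and the mechanism linking baseline utility to improvement are invented, not the registry's variables.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 400
age  = rng.normal(68, 9, n)          # years (synthetic)
bmi  = rng.normal(30, 5, n)          # kg/m^2 (synthetic)
vr12 = rng.normal(0.45, 0.12, n)     # baseline VR-12 utility (synthetic)

# Toy outcome: lower baseline utility leaves more room to improve.
logit = -8.0 * (vr12 - 0.45)
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([age, bmi, vr12])
model = LogisticRegression(max_iter=1000).fit(X, y)
pred = model.predict_proba(X)[:, 1]   # predicted probability of improvement
```

A patient-facing tool would wrap such a fitted model behind a simple questionnaire, returning `pred` for the individual's inputs; real development would also require held-out validation and calibration checks.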
Introduction.
Aims. This study explored the shared genetic traits and molecular interactions between postmenopausal osteoporosis (POMP) and sarcopenia, both of which substantially degrade the health and quality of life of the elderly. We hypothesized that these motor system diseases overlap in pathophysiology and regulatory mechanisms. Methods. We analyzed microarray data from the Gene Expression Omnibus (GEO) database using weighted gene co-expression network analysis (WGCNA),
Advances in cancer therapy have prolonged patient survival even in the presence of disseminated disease, and an increasing number of cancer patients are living with metastatic bone disease (MBD). The proximal femur is the most common long bone involved in MBD, and pathologic fractures of the femur are associated with significant morbidity, mortality and loss of quality of life (QoL). Successful prophylactic surgery for an impending fracture of the proximal femur has been shown in multiple cohort studies to result in longer survival, preserved mobility, lower transfusion rates and shorter post-operative hospital stays. However, there is currently no optimal method to predict a pathologic fracture. The most well-known tool, Mirels' criteria, was established in 1989 and is of limited use in guiding clinical practice due to poor specificity and sensitivity. The ideal clinical decision support tool will be of the highest sensitivity and specificity, non-invasive, generalizable to all patients, and not a burden on hospital resources or the patient's time. Our research uses novel
Excessive resident duty hours (RDH) are a recognized issue with implications for physician well-being and patient safety. A major component of the RDH concern is on-call duty. While considerable work has been done to reduce resident call workload, there is a paucity of research in optimizing resident call scheduling. Call coverage is scheduled manually rather than demand-based, which generally leads to over-scheduling to prevent a service gap.
Most cost containment efforts in public health systems have focused on regulating the use of hospital resources, especially operative time. As such, attempting to maximize the efficiency of limited operative time is important. Typically, hospital operating room (OR) scheduling of time is performed in two tiers: 1) master surgical scheduling (annual allocation of time between surgical services and surgeons) and 2) daily scheduling (a surgeon's selection of cases per operative day). Master surgical scheduling is based on a hospital's annual case mix and depends on the annual throughput rate per case type. This throughput rate depends on the efficiency of surgeons' daily scheduling. However, daily scheduling is predominantly performed manually, which requires that the human planner simultaneously reason about unknowns such as case-specific length of surgery and its variability while attempting to maximize throughput. This often leads to OR overtime and a likely sub-optimal throughput rate. In contrast, scheduling using mathematical optimization methods can produce maximum system efficiency, and is extensively used in the business world. As such, the purpose of our study was to compare the efficiency of 1) manual and 2) optimized OR scheduling at an academic-affiliated community hospital representative of most North American centres. Historic OR data was collected over a four-year period for seven surgeons. The actual scheduling, surgical duration, overtime and number of OR days were extracted. This data was first configured to represent the historic manual scheduling process. Following this, the data was used as input to an integer linear programming model with the goal of determining the minimum number of OR days needed to complete the same number of cases while not exceeding the historic overtime values.
Parameters included the use of a different quantile for each case type's surgical duration in order to ensure a schedule within five percent of the historic overtime value per OR day. All surgeons saw a median 10% reduction (range: 9.2% to 18.3%) in the number of OR days needed to complete their annual case-load compared to their historical scheduling practices. Meanwhile, OR overtime varied by a maximum of 5%. The daily OR configurations differed from historic configurations in 87% of cases. In addition, the number of configurations per surgeon was reduced from an average of six to four. Our study demonstrates a significant increase in OR throughput rate (10%) with no change in operative time required. This has considerable implications in terms of cost reduction, surgical wait lists and surgeon satisfaction. A limitation of this study is that the potential gains are relative to the efficiency of the pre-existing manual scheduling at our hospital. However, given the range of scenarios tested, the number of surgeons included and the similarity of our hospital's size and configuration to the majority of North American hospitals with an orthopedic service, these results are generalizable. Further optimization may be achieved by taking into account factors that could predict case duration such as surgeon experience, patient characteristics, and institutional attributes via
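The daily-scheduling formulation above can be illustrated as a small integer linear program: minimise the number of OR days opened while every case is assigned to exactly one day and no day exceeds its capacity. The case durations, 8-hour capacity, and 4-day horizon below are invented toy values, and the sketch uses `scipy.optimize.milp` (SciPy >= 1.9) rather than the study's actual solver.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

dur = np.array([3, 3, 2, 2, 4, 4, 2])   # assumed case durations (hours)
cap, max_days = 8, 4                    # assumed 8 h capacity per OR day
nC, nD = len(dur), max_days
nx = nC * nD                            # x[c, d] flattened row-major
# Decision variables: x[c, d] = case c runs on day d; u[d] = day d opened.
c = np.concatenate([np.zeros(nx), np.ones(nD)])  # minimise days opened

# Each case is assigned to exactly one day.
A_eq = np.zeros((nC, nx + nD))
for ci in range(nC):
    A_eq[ci, ci * nD:(ci + 1) * nD] = 1

# A day's total duration fits within capacity, and only if the day is open.
A_cap = np.zeros((nD, nx + nD))
for d in range(nD):
    for ci in range(nC):
        A_cap[d, ci * nD + d] = dur[ci]
    A_cap[d, nx + d] = -cap

res = milp(c,
           constraints=[LinearConstraint(A_eq, 1, 1),
                        LinearConstraint(A_cap, -np.inf, 0)],
           integrality=np.ones(nx + nD),   # all variables binary
           bounds=Bounds(0, 1))
days_used = int(round(res.fun))
```

For the toy durations above (20 hours total), the solver packs the cases into the minimum three 8-hour days; the study's model additionally constrains overtime against historic values per day.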
Total knee and hip arthroplasty (TKA and THA) are two of the highest volume and resource intensive surgical procedures. Key drivers of the cost of surgical care are duration of surgery (DOS) and postoperative inpatient length of stay (LOS). The ability to predict TKA and THA DOS and LOS has substantial implications for hospital finances, scheduling and resource allocation. The goal of this study was to predict DOS and LOS for elective unilateral TKAs and THAs using
Aim. While metagenomic (microbial DNA) sequencing technologies can detect the presence of microbes in a clinical sample, it is unknown whether this signal represents dead or live organisms. Metatranscriptomics (sequencing of RNA) offers the potential to detect transcriptionally “active” organisms within a microbial community, and to map expressed genes to functional pathways of interest (e.g. antibiotic resistance). We used this approach to evaluate the utility of metatranscriptomics to diagnose PJI and predict antibiotic resistance. Method. In this prospective study, samples were collected from 20 patients undergoing revision TJA (10 aseptic and 10 infected) and 10 primary TJA. Synovial fluid and peripheral blood samples were obtained at the time of surgery, as well as negative field controls (skin swabs, air swabs, sterile water). All samples were shipped to the laboratory for metatranscriptomic analysis. Following microbial RNA extraction and host analyte subtraction, metatranscriptomic sequencing was performed. Bioinformatic analyses were implemented prior to mapping against curated microbial sequence databases to generate taxonomic expression profiles. Principal Coordinates Analysis (PCoA) and Partial Least Squares-Discriminant Analysis were utilized to ordinate metatranscriptomic profiles, using the 2018 definition of PJI as the gold standard. Results. After RNA metatranscriptomic analysis, blinded PCoA modeling revealed accurate and distinct clustering of samples into three separate cohorts (infected, aseptic, and primary joints) based on their active transcriptomic profiles, both in synovial fluid and blood (synovial ANOSIM p=0.001; blood ANOSIM p=0.034). Differential metatranscriptomic signatures for infected versus noninfected cohorts enabled us to train
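As background for the ordination step, PCoA (classical multidimensional scaling) can be sketched in a few lines of NumPy. The three toy cohorts, the profile dimensions, and the Euclidean distance choice below are illustrative assumptions, not the study's data or pipeline.

```python
import numpy as np

def pcoa(D, k=2):
    """Principal Coordinates Analysis of a sample-by-sample distance matrix.

    Double-centres the squared distances, eigendecomposes the resulting
    Gram matrix, and returns coordinates on the top-k axes.
    """
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1]             # largest eigenvalues first
    vals, vecs = vals[order][:k], vecs[:, order][:, :k]
    return vecs * np.sqrt(np.maximum(vals, 0))

# Synthetic taxonomic-expression profiles for three toy cohorts.
rng = np.random.default_rng(2)
infected = rng.normal( 5, 1, (10, 40))
aseptic  = rng.normal( 0, 1, (10, 40))
primary  = rng.normal(-5, 1, (10, 40))
X = np.vstack([infected, aseptic, primary])
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # Euclidean
coords = pcoa(D, k=2)   # one 2-D point per sample for plotting
```

With Euclidean distances this reduces to PCA of the profiles; real metatranscriptomic workflows typically use ecological dissimilarities (e.g. Bray-Curtis) in place of the Euclidean matrix here, and ANOSIM then tests the between-cohort separation seen in the ordination.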
Introduction. Clinical decision support tools are software that match the input characteristics of an individual patient to an established knowledge base to create patient-specific assessments that support and better inform individualized healthcare decisions. Clinical decision support tools can facilitate better evidence-based care and offer the potential for improved treatment quality and selection and shared decision making, while also standardizing patient expectations. Methods. Predict+ is a novel clinical decision support tool that leverages clinical data from the Exactech Equinoxe shoulder clinical outcomes database, which is composed of >11,000 shoulder arthroplasty patients using one specific implant type from more than 30 different clinical sites using standardized forms. Predict+ utilizes multiple coordinated and locked supervised
To explore a novel