Polyethylene (PE) wear particle induced osteolysis remains a major cause of failure in total hip arthroplasty (THA), so routine clinical measurement of wear remains important. Crosslinked PE promises very low wear rates, making measurement accuracy increasingly important for distinguishing between alternative materials. The rising use of large femoral heads also lowers linear head penetration, likewise demanding improved accuracy. Digital x-rays and wear measurement software have become standard, but during archiving and exchange of x-rays, image format, resolution or compression are often changed without knowledge of the effects on wear measurement. This study investigates the effect of digital x-ray resolution and compression on the accuracy of two software programs for measuring wear.

The 8-year post-op digital x-rays of 24 THA patients (Stryker ABG-II, 28 mm metal femoral head against Duration or conventional PE) were taken from the hospital PACS (Philips Diagnost H, AGFA ADC Solo, Siemens Medview) as DICOM at 5.1 MPix resolution. Images were converted to uncompressed TIFF format using Irfanview V4.1. Wear (linear head penetration) was measured using Roman V1.7 and the Martell Hip Analysis Suite 7.14. The x-rays were smoothed (Irfanview V4.1, median filter: 3), as recommended in the literature for compatibility with Martell's edge detection algorithm. Wear was measured twice by two independent observers at the original format and resolution, and then once by a single observer at three successively halved resolutions (2.6, 1.3, 0.65 MPix) and three JPEG compression levels (90%, 50%, 20% quality). Intra- and inter-observer reliability (R) was compared to the reliability of measuring the manipulated images (Pearson's r), and the mean absolute wear differences (AD) were calculated versus the original x-ray.
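The image manipulations were performed interactively in Irfanview; purely as an illustration of the protocol, an equivalent scripted pipeline using the Pillow library might look like the following sketch (file names and the resampling filter are assumptions, not details of the study):

```python
from PIL import Image, ImageFilter

# Load the uncompressed TIFF exported from the DICOM original
# (file name is hypothetical).
img = Image.open("thr_ap_original_5.1mpix.tif")
img = img.convert("L")  # 8-bit greyscale; JPEG export requires <= 8 bit

# Successively halve the pixel count: each step scales both axes by
# 1/sqrt(2), giving 5.1 -> 2.6 -> 1.3 -> 0.65 MPix.
for step in (1, 2, 3):
    f = 2 ** (-step / 2)
    small = img.resize((round(img.width * f), round(img.height * f)),
                       Image.LANCZOS)
    small.save(f"resolution_step{step}.tif")

# JPEG compression of the full-resolution image at three quality levels.
for quality in (90, 50, 20):
    img.save(f"compressed_q{quality}.jpg", quality=quality)

# 3x3 median smoothing, as applied before Martell's edge detection.
img.filter(ImageFilter.MedianFilter(size=3)).save("smoothed_median3.tif")
```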
The mean total wear was 0.98 ± 0.59 mm (range 0.3–2.4 mm), equaling an annual wear rate of 0.11 mm/yr. Using Roman, Intra-R (0.97) and Inter-R (0.96) were high and AD was low (0.10 and 0.20 mm). Reduced image resolution caused R to drop only slightly, to 0.95 (2.6 MPix), 0.92 (1.3 MPix) and 0.94 (0.65 MPix), while AD remained low (<0.20 mm). Compression likewise hardly affected R (90%: 0.96, 50%: 0.94, 20%: 0.93) or AD (<0.20 mm).
Using Martell, Intra-R (0.99) and Inter-R (0.87) were also high, but R dropped with decreasing resolution (0.82, 0.72, 0.34; AD: 0.4–1.1 mm) and hardly with increasing compression (0.95, 0.92, 0.94; AD <0.20 mm).
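The reliability comparison described above reduces to Pearson's r between paired wear measurements plus a mean absolute difference against the original x-ray. A minimal sketch of that computation (the values are placeholders, not study data):

```python
import numpy as np
from scipy.stats import pearsonr

# Paired wear measurements (mm) per patient: original x-ray vs. a
# manipulated version (placeholder values for illustration only).
wear_original = np.array([0.31, 0.54, 0.98, 1.42, 2.35])
wear_modified = np.array([0.35, 0.49, 1.05, 1.38, 2.21])

# Reliability as Pearson's r between the two measurement series.
r, _ = pearsonr(wear_original, wear_modified)

# Mean absolute wear difference (AD) versus the original x-ray.
ad = np.mean(np.abs(wear_modified - wear_original))

print(f"R = {r:.2f}, AD = {ad:.2f} mm")
```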
Low resolution and high compression need not be critical to wear measurement accuracy and reliability when edge detection is performed by a trained human eye: interpolating the head and cup perimeters and locating their centers can then be performed with an accuracy below the pixel size (ca. 0.40 mm at 0.65 MPix). Automatic edge detection is less robust to reduced resolution but performs well at high compression. If image file size must be reduced, compression is preferable to reducing resolution.
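The abstract does not state how the manual software derives the circle centers; one standard approach consistent with the sub-pixel claim is an algebraic least-squares circle fit over many interpolated perimeter points, since averaging N rim points pushes the center estimate well below a single pixel. A generic sketch (not the documented Roman or Martell algorithm):

```python
import numpy as np

def fit_circle(points):
    """Algebraic least-squares (Kasa) circle fit.

    points: (N, 2) array of perimeter coordinates (e.g. in mm).
    Returns (center_x, center_y, radius).
    """
    x, y = points[:, 0], points[:, 1]
    # Solve x^2 + y^2 = a*x + b*y + c in the least-squares sense.
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = x ** 2 + y ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = a / 2, b / 2
    radius = np.sqrt(c + cx ** 2 + cy ** 2)
    return cx, cy, radius
```

For scale: halving the pixel count three times enlarges the linear pixel pitch by a factor of sqrt(8) ≈ 2.8, consistent with the ca. 0.40 mm quoted at 0.65 MPix for an original pitch of roughly 0.14 mm at 5.1 MPix.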