Orthopaedic Proceedings
Vol. 94-B, Issue SUPP_XLIV | Pages 51 - 51
1 Oct 2012
Claasen G, Martin P, Picard F

Over the past fifteen years, computer-assisted surgery systems have become increasingly common, especially in joint arthroplasty. They allow greater accuracy and precision in surgical procedures and should therefore improve outcomes and long-term results.

New instruments such as guided handheld tools have recently been developed with the ultimate aim of eliminating the need for drilling, cutting, or milling guides.

To make sure that the handheld tool cuts and/or drills in the desired plane, it has to be servo-controlled. For this purpose, the tool joints are actuated by computer-controlled motors. A tracking system provides the tool's position and orientation, and a computer calculates the corrections the motors need to keep the tool in the desired plane.
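
As a rough illustration of this servo-control idea (a minimal sketch, not the authors' implementation), the code below computes the out-of-plane error of a tracked tool tip and a simple proportional correction for the actuated joints; all names, units, and the gain k_p are hypothetical.

```python
import numpy as np

def plane_error(tip_pos, plane_point, plane_normal):
    """Signed distance (metres) of the tool tip from the desired cutting plane."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return float(np.dot(tip_pos - plane_point, n))

def correction_command(tip_pos, plane_point, plane_normal, k_p=50.0):
    """Proportional correction (hypothetical gain k_p) pushing the tip back
    towards the plane along the plane normal."""
    e = plane_error(tip_pos, plane_point, plane_normal)
    n = plane_normal / np.linalg.norm(plane_normal)
    return -k_p * e * n

# Example: tip 2 mm above a horizontal plane through the origin.
cmd = correction_command(np.array([0.0, 0.0, 0.002]),
                         np.array([0.0, 0.0, 0.0]),
                         np.array([0.0, 0.0, 1.0]))
print(cmd)   # roughly [0, 0, -0.1]: drive the tip 0.1 m/s back into the plane
```

In the real system such a correction would be recomputed at every tracking update, which is why the tracking rate and latency requirements discussed next matter.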

For this servo-control, a very fast tracking system is necessary: it should be fast enough to follow human motion. Current optical tracking systems used for computer-assisted surgery have a bandwidth of about 10–60 Hz [3]. For servo-control, a bandwidth of about 200–300 Hz would be required to be faster than human reaction, and the latency of the system should also be small, about 2–3 ms. Optical tracking systems with a higher bandwidth exist but are too expensive for surgical applications; moreover, their latency, owing to the complex computer-vision processing involved, is too high.
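
To make these figures concrete, the short calculation below lists the interval between samples at the rates quoted above; a motion that begins just after a sample can remain unseen for up to one full interval, which at 10–60 Hz dwarfs the 2–3 ms latency target.

```python
# Worst case: a motion starting just after a sample is only seen one full
# sample interval later (processing latency comes on top of this).
for rate_hz in (10, 60, 250, 300):
    print(f"{rate_hz:3d} Hz -> sample interval {1000.0 / rate_hz:5.1f} ms")
# 10 Hz -> 100.0 ms, 60 Hz -> 16.7 ms, 250 Hz -> 4.0 ms, 300 Hz -> 3.3 ms
```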

We have developed a hybrid tracking system consisting of two cameras pointed at the operating field and a sensor unit which can be attached to a handheld tool.

The sensor unit is made up of an inertial measurement unit (IMU) and multiple optical markers. The data from the IMU (three gyroscopes and three accelerometers arranged so that their measurement axes are mutually perpendicular) and the marker images from the cameras observing the optical markers are fed to a data fusion algorithm. This algorithm calculates the position and orientation of any handheld tool to which the unit is attached. It can do so at the higher of the two sensor sample rates, which in our case is the IMU sample rate.
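
The abstract does not name the fusion algorithm; the sketch below illustrates the general idea under the assumption of a Kalman-style predict/update scheme, reduced to a single axis: IMU samples drive a high-rate prediction of position and velocity, and each (much slower) camera-derived position measurement corrects it. The noise values and the simulated trajectory are purely illustrative.

```python
import numpy as np

dt_imu = 1.0 / 250.0                     # IMU sample period (250 Hz)
F = np.array([[1.0, dt_imu],             # constant-velocity state transition
              [0.0, 1.0]])
B = np.array([0.5 * dt_imu**2, dt_imu])  # accelerometer input mapping
Q = 1e-4 * np.eye(2)                     # process noise (hypothetical)
H = np.array([[1.0, 0.0]])               # camera measures position only
R = np.array([[1e-4]])                   # camera noise (hypothetical)

x = np.zeros(2)                          # state [position, velocity], at rest
P = 1e-3 * np.eye(2)                     # initial covariance

def predict(x, P, accel):
    """High-rate prediction step driven by one accelerometer sample."""
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    """Low-rate correction step using a camera-derived position measurement."""
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

rng = np.random.default_rng(0)
true_pos, true_vel = 0.0, 0.0
for k in range(250):                              # one second of data
    accel = 1.0 if k < 50 else 0.0                # short burst of acceleration
    true_vel += accel * dt_imu
    true_pos += true_vel * dt_imu
    x, P = predict(x, P, accel + rng.normal(0.0, 0.05))   # noisy IMU sample
    if k % 15 == 14:                              # ~16.7 Hz camera measurement
        z = np.array([true_pos + rng.normal(0.0, 0.01)])
        x, P = update(x, P, z)

print("true position:", round(true_pos, 3), " fused estimate:", round(float(x[0]), 3))
```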

Our experimental setup consists of an ADIS 16355 IMU, which runs at a sample rate of 250 Hz, and a pair of stereo cameras sampled at 16.7 Hz. The data collected from these sensors are processed offline by the data fusion algorithm. To compare the results of our hybrid system with those of a purely optical tracking system, we also use only the marker image data to recalculate the sensor unit's position by triangulation.
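
For this optical-only reference, the sensor unit position is recovered from the two marker images by triangulation. A minimal two-view linear (DLT-style) triangulation sketch is given below; the camera matrices and pixel coordinates are placeholders, not the calibration of the actual setup.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one marker from a stereo pair.
    P1, P2: 3x4 camera projection matrices; uv1, uv2: pixel coordinates."""
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]                  # homogeneous -> Euclidean 3D point

# Placeholder calibration: focal length 800 px, 10 cm stereo baseline.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

# Project a known 3D point into both views, then recover it.
Xw = np.array([0.05, -0.02, 1.0, 1.0])
uv1 = (P1 @ Xw)[:2] / (P1 @ Xw)[2]
uv2 = (P2 @ Xw)[:2] / (P2 @ Xw)[2]
print(triangulate(P1, P2, uv1, uv2))    # ~[0.05, -0.02, 1.0]
```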

The experiment we conducted was a fast motion in a horizontal direction starting from a rest position. The sensor unit position was calculated from the experimental data by both the hybrid system and the optical tracking system. The fast motion started right after the optical sample at t1, and the hybrid system detected it at once. The optical tracking system, on the other hand, only saw the motion at the next optical sample time t2.

These results show that our hybrid system is able to follow a fast motion of the sensor unit, whereas a purely optical tracking system is not.

The proposed hybrid tracking system calculates the position and orientation of any handheld tool at a high rate of 250 Hz and thus makes it possible to servo-control the tool to keep it in the desired plane.

Several similar systems fusing optical and inertial data have been described in the literature. They all use processed optical data, i.e. 3D marker positions. Our algorithm uses raw image data, which considerably reduces computation time. This hybrid tracking system can be used with any handheld tool developed to replace the existing drilling, cutting, or milling instruments used in orthopaedic surgery, and particularly in arthroplasty.
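
The abstract does not detail how the raw pixel data enter the fusion; in a typical tightly coupled formulation the measurement model simply projects the predicted marker position into each camera, so the filter compares predicted and measured pixel coordinates directly and no triangulation step is needed inside the loop. A minimal sketch of such a measurement function, with placeholder calibration:

```python
import numpy as np

def predicted_pixel(K, R, t, marker_pos):
    """Project a predicted 3D marker position into one camera (pinhole model).
    The innovation fed to the fusion filter is then simply
    (measured pixel) - (predicted pixel)."""
    X_cam = R @ marker_pos + t          # world -> camera coordinates
    uvw = K @ X_cam                     # camera -> homogeneous pixel coordinates
    return uvw[:2] / uvw[2]

# Placeholder calibration: focal length 800 px, principal point (320, 240).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
uv_pred = predicted_pixel(K, np.eye(3), np.zeros(3), np.array([0.05, -0.02, 1.0]))
print(uv_pred)   # ~[360., 224.]
```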

The sensor unit can easily be integrated into an existing optical tracking system. For the surgeon, the only change is a small additional inertial sensor alongside the optical markers already attached to the tool.

The authors would like to thank the AXA Research Fund for funding G.C. Claasen's work with a doctoral grant and Guillaume Picard for his contributions to the experimental setup.