Surgical case identification for an image-guided interventional system

Tamás Haidegger, Peter Kazanzides, Balázs Benyó, L. Kovács, Z. Benyó

Research output: Chapter in Book/Report/Conference proceeding (Conference contribution)

4 Citations (Scopus)

Abstract

Image-guided surgery offers great advantages to surgeons by making it possible to track tools in 3D space and to navigate based on a virtual model of the patient. In robot-assisted procedures, both the inherent accuracy of the system components and the quality of the registration procedures are critical to delivering high-precision treatment. One of the major barriers toward more technology-integrated procedures is that alterations in the operating room environment can fundamentally change the performance of the system, decrease its accuracy, and therefore pose significant danger to the patient. From the control point of view, surgical events may include motion of the robot, motion of the camera, or motion of the patient. This paper describes a new concept for handling these events: tracking and automatically compensating for abrupt changes that may affect the accuracy of a robot-integrated interventional system. Our solution uses all information available at a given time, including the intra-operative tracker's internal base frame, to distinguish between different surgical events. The concept was developed and tested on the neurosurgical robot system at Johns Hopkins University. Initial experiments on data recordings from simulated scenarios showed that the algorithm correctly identified the cases.
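
The abstract's core idea is to compare the poses of tracked rigid bodies, expressed in the intra-operative tracker's internal base frame, to decide whether the robot, the patient, or the camera itself moved. The sketch below is a minimal, hypothetical illustration of that idea in Python; it is not the authors' implementation, and the marker names, displacement threshold, and coherence test are assumptions introduced only for illustration.

# Illustrative sketch only, NOT the method published in the paper.
# Assumes an optical tracker reporting 3D positions of two rigid-body markers
# (one on the robot, one on the patient) in the tracker's own base frame.
import numpy as np

MOTION_THRESHOLD_MM = 2.0  # hypothetical displacement threshold

def displacement(p_before, p_after):
    """Euclidean displacement (mm) between two 3D marker positions."""
    return float(np.linalg.norm(np.asarray(p_after) - np.asarray(p_before)))

def classify_event(robot_before, robot_after, patient_before, patient_after):
    """Label an abrupt change as robot, patient, or camera (tracker) motion.

    All positions are expressed in the tracker's internal base frame. If both
    markers appear to move by nearly the same vector, the most likely
    explanation is that the camera (tracker) itself moved.
    """
    d_robot = displacement(robot_before, robot_after)
    d_patient = displacement(patient_before, patient_after)
    robot_moved = d_robot > MOTION_THRESHOLD_MM
    patient_moved = d_patient > MOTION_THRESHOLD_MM

    if robot_moved and patient_moved:
        # Coherent apparent motion of every marker suggests the tracker moved.
        delta_robot = np.asarray(robot_after) - np.asarray(robot_before)
        delta_patient = np.asarray(patient_after) - np.asarray(patient_before)
        if np.linalg.norm(delta_robot - delta_patient) < MOTION_THRESHOLD_MM:
            return "camera motion"
        return "robot and patient motion"
    if robot_moved:
        return "robot motion"
    if patient_moved:
        return "patient motion"
    return "no significant motion"

if __name__ == "__main__":
    # Example: both markers shift by the same 5 mm vector -> camera motion.
    print(classify_event([0.0, 0.0, 0.0], [5.0, 0.0, 0.0],
                         [100.0, 0.0, 0.0], [105.0, 0.0, 0.0]))

In a real system the comparison would involve full 6-DOF transforms and the robot's own kinematic feedback rather than bare 3D positions, but the decision structure (which relative poses changed, and whether they changed coherently) is the same.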

Original language: English
Title of host publication: IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings
Pages: 1831-1836
Number of pages: 6
DOI: 10.1109/IROS.2010.5650396
ISBN: 9781424466757
Publication status: Published - 2010
Event: 23rd IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Taipei, Taiwan, Province of China
Duration: Oct 18, 2010 - Oct 22, 2010

Other

Other: 23rd IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010
Country: Taiwan, Province of China
City: Taipei
Period: 10/18/10 - 10/22/10

Fingerprint

Robots
Operating rooms
Data recording
Surgery
Cameras
Experiments

ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction
  • Control and Systems Engineering

Cite this

Haidegger, T., Kazanzides, P., Benyó, B., Kovács, L., & Benyó, Z. (2010). Surgical case identification for an image-guided interventional system. In IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings (pp. 1831-1836). [5650396] https://doi.org/10.1109/IROS.2010.5650396
