Funded by: LGA Nordbayern

This project is a follow-up to the SoloAssist project and aims to automate camera guidance using the robot arm developed there. To pursue this goal, MITI has joined forces with AKTORmed GmbH, the Caritas Krankenhaus St. Josef in Regensburg, GEFASOFT Automatisierung und Software GmbH, the Faculty of Mechanical Engineering of the University of Applied Sciences Deggendorf, the Laboratory for Digital Technology and Automation of the University of Applied Sciences Regensburg, and Sensorik-Bayern GmbH to form a sensor technology cluster.

As a first step, the tiring manual camera guidance performed by the assisting surgeon during minimally invasive surgery was replaced by a robot arm (SoloAssist) that can be steered effortlessly with a joystick. Although this avoids the physical fatigue of holding the camera, the assistant still has to follow the entire operation attentively and direct the SoloAssist, or the surgeon has to steer the camera himself, which can cause delays.

To relieve the assistant of this monotonous work, the TeKaMIC project aims to automate camera guidance. Surgeries are analyzed in order to determine the instrument position favoured by the surgeon. With that knowledge, the instrument can be tracked and followed autonomously. Given the number of possibilities and contingencies that influence the course of a surgery, however, a fully automated system is not desirable. Instead, a semi-automatic camera tracking is planned that puts patient safety first and relieves the surgeon whenever possible.
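In its simplest form, such a semi-automatic guidance policy could center the camera on the instrument only when it drifts out of a central dead zone, and always yield to manual control. The following sketch is purely illustrative: the function name, the dead-zone threshold, and the proportional gain are assumptions for demonstration, not the project's actual control law.

```python
def camera_command(instr_xy, deadzone=0.15, gain=0.5, override=False):
    """Sketch of a semi-automatic camera guidance step (hypothetical).

    instr_xy: instrument tip position in normalized image coordinates,
    measured relative to the image center (roughly -1..1 per axis).
    Returns a (pan, tilt) velocity command for the robot arm.
    """
    if override:
        # Manual input (e.g. the surgeon's joystick) always suppresses
        # the automation, keeping the human in control for safety.
        return (0.0, 0.0)
    x, y = instr_xy
    if max(abs(x), abs(y)) <= deadzone:
        # Instrument is near the image center: hold the camera still,
        # so small instrument motions do not cause constant re-centering.
        return (0.0, 0.0)
    # Proportional command steering the view back toward the instrument.
    return (gain * x, gain * y)
```

For example, an instrument slightly off-center produces no motion, while one near the image border yields a proportional re-centering command, and any override input freezes the automation.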

To accomplish this, the coordinates of the instrument are determined simultaneously with several tracking methods. Tracking performance is thus not solely dependent on image processing techniques; electromagnetic tracking systems, for example, deliver reliable results even when the image is blurred or the lens is dirty. The particular challenge here is the fusion of this redundant data so that the camera can be readjusted optimally at any time.
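One standard way to fuse such redundant position estimates is inverse-covariance (information) weighting, so that a modality reporting high uncertainty, such as an image-based tracker facing a blurred or smudged lens, contributes less to the fused result. The sketch below is a generic illustration of that idea under assumed numbers; the source does not specify the project's actual fusion algorithm or sensor models.

```python
import numpy as np

def fuse_positions(estimates):
    """Fuse redundant 3-D instrument position estimates.

    estimates: list of (position, covariance) pairs, one per tracking
    modality (e.g. image-based, electromagnetic). Each estimate is
    weighted by its inverse covariance, so uncertain measurements
    influence the fused position less.
    """
    info = np.zeros((3, 3))   # accumulated information matrix
    weighted = np.zeros(3)    # accumulated information-weighted positions
    for pos, cov in estimates:
        inv = np.linalg.inv(cov)
        info += inv
        weighted += inv @ np.asarray(pos, dtype=float)
    fused_cov = np.linalg.inv(info)
    return fused_cov @ weighted, fused_cov

# Illustrative values: an unreliable optical frame (large variance)
# versus a still-reliable electromagnetic reading (small variance).
optical = (np.array([10.2, 5.1, 30.0]), np.eye(3) * 25.0)
em = (np.array([10.0, 5.0, 30.4]), np.eye(3) * 1.0)
fused, cov = fuse_positions([optical, em])
```

With these assumed covariances, the fused position lies close to the electromagnetic estimate, mirroring the text's point that electromagnetic tracking can carry the system through moments when the image is unusable.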