Over the past 20 years, robotic systems have demonstrated benefits in surgical therapy, offering greater accuracy and consistency of tool trajectories than manual interventions. Deploying these machines in the operating theatre, in close proximity to patients and staff, has been a major achievement, and commercial systems are now available. Tool guidance is most often derived from pre-operative scan data, with optional control by the surgeon. Important steps in the future development of these devices are to enable tool guidance based on sensory perception at the tool-point, and to produce smaller devices that require little set-up time and less supporting infrastructure in the operating room.
Our research focuses on controlling tool-point interaction in small, even hand-supported, devices. The ability to automatically discriminate different working conditions and the state of the tool-point and tissues will augment the surgeon's perception at this small scale and enable precise, consistent results. This offers great benefit in the many surgical procedures that now operate on small tissue targets, often through difficult access.
Currently we are investigating two generic solutions for smart tool-points:
Here, control is achieved by discriminating the interaction between tissues and the tool-point. These interaction descriptions constitute information, offering great benefit over the use of raw sensor data alone. Using this approach, it is also possible to infer the state of the tissue and tool-point in real time.
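As an illustrative sketch of such real-time discrimination, a sliding-window classifier over a tool-point force signal could map raw samples to interaction states. All signal values, thresholds, function names, and state labels below are hypothetical assumptions for illustration, not details taken from the source:

```python
# Hypothetical sketch: discriminating tool-point contact state from a
# force signal via sliding-window features and simple thresholds.
# Thresholds and labels are illustrative, not from the source text.

from statistics import mean

def classify_window(forces, contact_thresh=0.5, hard_thresh=3.0):
    """Label one window of force samples (in newtons) as a contact state."""
    f = mean(forces)
    if f < contact_thresh:
        return "free"          # tool moving in free space
    if f < hard_thresh:
        return "soft-tissue"   # compliant tissue contact
    return "hard-contact"      # e.g. contact with bone

def stream_states(signal, window=5):
    """Slide a fixed window over the sampled signal, one state per window."""
    states = []
    for i in range(0, len(signal) - window + 1, window):
        states.append(classify_window(signal[i:i + window]))
    return states

# Simulated force trace: free motion, then soft tissue, then hard contact.
trace = [0.1] * 5 + [1.2] * 5 + [4.0] * 5
print(stream_states(trace))   # -> ['free', 'soft-tissue', 'hard-contact']
```

In a real device the thresholds would be replaced by a trained classifier over richer features (force derivatives, vibration spectra), but the structure, turning a raw data stream into state information, is the point of the approach described above.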