Biosensor Integration and Image Guidance for Robotic Surgery
Brady King and Luke Reisner
Contributors: Dr. Abhilash Pandya, Dr. Gregory Auner, Dr. Michael Klein, and the people of SSIM (Smart Sensors and Integrated Microsystems) and CARES (Computer-Assisted Robot-Enhanced Systems)
About Us: Ph.D. students in the ECE department at Wayne State University, with M.S. degrees in Computer Engineering from the University of Michigan–Dearborn. Part of SSIM/CARES at WSU, working on sensor development, sensor applications, and robotic applications.
Engineering in Medicine: problems in medicine, and their solutions. Surgeons have limited accuracy (surgical robots); patients are difficult to navigate (image-guided surgery); identification of tissue is difficult (biological sensors, or biosensors). Why engineers? They are needed to develop these solutions to medicine’s problems.
What We’re Doing: developing systems that help surgeons treat patients. In particular, we are trying to improve surgical cancer treatment. To accomplish this, we are combining image-guided surgery, medical robotics, and a Raman biosensor. The following slides provide background on each of these topics.
Image-Guided Surgery (IGS): surgery that utilizes imaging and advanced visualization techniques (patient imaging, cameras, tracked surgical tools); typically minimally invasive. Virtual reality: computer-generated 3D views of the patient. Augmented reality: mixed camera and computer views. Superman’s X-ray vision!
Image-Guided Surgery Video
IGS Applications: surgical path planning, which enables the surgeon to plot the best course to the tissue of interest. Minimally invasive surgery, for which IGS is required: cameras and tools are threaded through tiny holes in the body, and IGS allows the surgeon to see what’s inside the body. Surgeon training, since procedures can be replayed and practiced for instructional purposes.
Robotic Surgery: any surgery that utilizes robots. The robots can retrieve tools, position tools, position cameras, etc. Typically, the surgeon controls a robot through a pair of joysticks; the robot’s arms act as an extension of the surgeon’s arms.
Benefits of Robotic Surgery: minimally invasive; motion scaling and tremor filtration; enhanced accuracy; more comfortable for the surgeon; lower risk of infection; faster recovery and reduced hospital stays; enables remote surgical procedures. We believe that all types of surgery are moving towards using this technology.
Robotic Surgery Pictures
Robotic Surgery Procedures: common types of robotic surgeries include brain, gastrointestinal, prostate, heart, and orthopedic procedures. The most popular robot (da Vinci) was used in 48,000+ cases last year.
Surgical Robots: popular surgical robots include the Aesop, Zeus, and da Vinci; Intuitive Surgical manufactures all of these. Our lab currently has Aesop and Zeus systems.
Raman Spectroscopy: a spectroscopic technique that uses laser light to determine molecular composition. Shifts in the energy of the scattered light produce a spectrum, and samples have distinct spectral “fingerprints”. [Energy-level diagram: laser excitation, Rayleigh scattering, and Raman scattering.]
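As a concrete illustration of the energy-shift idea, the short sketch below converts an excitation wavelength and a scattered wavelength into a Raman shift in wavenumbers; the 785 nm and 885 nm values are assumed examples, not necessarily our probe’s actual laser or a real tissue band.

```python
def raman_shift_cm1(excitation_nm: float, scattered_nm: float) -> float:
    """Raman shift (cm^-1) from excitation and scattered wavelengths (nm).

    Uses delta_nu = 1/lambda_excitation - 1/lambda_scattered, with the
    wavelengths converted from nanometers to centimeters (1 nm = 1e-7 cm).
    """
    return 1.0 / (excitation_nm * 1e-7) - 1.0 / (scattered_nm * 1e-7)

# Assumed example: a 785 nm laser and light scattered at 885 nm give a
# shift of roughly 1439 cm^-1.
print(round(raman_shift_cm1(785.0, 885.0)))
```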
Raman Spectra
Raman Applications: bomb detection (detecting trace amounts of explosives); space applications (analysis of Martian surface material); tissue diagnosis (detecting cancer markers within cells).
Medical Benefits of Raman: near real-time results (conventional biopsies require at least 20 minutes); requires no sample preparation or contrast-enhancing agents; not harmful to tissue; fiber-optic probe suitable for minimally invasive surgery; accurate tissue diagnosis.
Our Research Goals: integrate image guidance with robotic surgery, providing better interfaces for the surgeon to see and navigate the body and utilizing virtual/augmented reality; integrate biosensor information with (robotic) surgery, using a Raman probe to detect cancer and developing tissue classification algorithms.
Robot Feedback/Control: we are reverse-engineering the Aesop and Zeus to track and control them. Joint feedback is read directly from the robots, and we have developed a custom motion controller for complete computerized control. Computer control allows new control interfaces, task-specific motion scaling, and automation of sensor placement or other repetitive surgical tasks.
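To make the motion scaling and tremor filtration ideas concrete, here is a minimal sketch (not our actual controller code) that scales surgeon hand motion by a task-specific factor and applies a simple exponential low-pass filter to suppress tremor; the 0.1 scaling factor and 0.8 smoothing value are assumed for illustration.

```python
import numpy as np

class ScaledFilteredController:
    """Minimal sketch: scale hand motion and low-pass filter out tremor."""

    def __init__(self, scale=0.1, smoothing=0.8):
        self.scale = scale          # task-specific motion scaling factor
        self.smoothing = smoothing  # 0..1; higher = stronger tremor filtering
        self._filtered_delta = np.zeros(3)

    def update(self, hand_delta_mm):
        """Map a raw hand displacement (mm) to a robot tool displacement (mm)."""
        hand_delta_mm = np.asarray(hand_delta_mm, dtype=float)
        # An exponential moving average acts as a simple low-pass (tremor) filter.
        self._filtered_delta = (self.smoothing * self._filtered_delta
                                + (1.0 - self.smoothing) * hand_delta_mm)
        return self.scale * self._filtered_delta

controller = ScaledFilteredController()
print(controller.update([2.0, -1.0, 0.5]))  # small, smoothed tool motion
```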
Robot Feedback Video
Robotic Image Guidance: our preliminary image-guided system uses a tracked mechanical arm, incorporates medical imaging, and presents a 3D virtual or augmented display. We are adding Aesop and Zeus feedback to the system, so the robot arms (holding surgical tools) will be tracked and displayed on medical imaging.
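Displaying a tracked robot arm on medical imaging amounts to chaining coordinate transforms: joint feedback gives the tool tip in the robot’s base frame, and a registration transform maps that frame into the image (patient) frame. The sketch below shows the idea with 4x4 homogeneous transforms; the matrices are assumed placeholders, not our actual registration.

```python
import numpy as np

def to_homogeneous(point_xyz):
    """Append 1 so a 3-D point can be multiplied by a 4x4 transform."""
    return np.append(np.asarray(point_xyz, dtype=float), 1.0)

# Assumed example transform: identity rotation plus a translation that
# registers the robot base frame into the image (patient) frame.
T_image_from_robot = np.eye(4)
T_image_from_robot[:3, 3] = [120.0, -35.0, 80.0]  # mm, placeholder registration

# Tool tip position in the robot base frame, as computed from joint feedback.
tip_in_robot = [10.0, 0.0, 250.0]  # mm

tip_in_image = T_image_from_robot @ to_homogeneous(tip_in_robot)
print("Tool tip in image coordinates (mm):", tip_in_image[:3])
```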
Eye Tracking: we have a system that tracks where the user is looking, and it is being integrated with our robotic control and visualization systems. Current systems require the surgeon to manually reposition the camera; by tracking the surgeon’s eyes, we can command the Aesop to position the camera automatically.
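A minimal sketch of the gaze-to-camera idea, assuming the eye tracker reports a normalized gaze point on the display (0 to 1 in each axis): if the gaze drifts far enough from the image center, the camera-holding robot is commanded to pan and tilt toward it. The dead-zone size and gain are assumed values, and send_pan_tilt is a hypothetical stand-in for the actual robot command interface.

```python
def gaze_to_camera_command(gaze_x, gaze_y, dead_zone=0.15, gain=10.0):
    """Convert a normalized gaze point (0..1, 0..1) into pan/tilt commands (degrees).

    Gaze near the screen center produces no motion (dead zone); otherwise the
    camera is nudged toward where the surgeon is looking.
    """
    dx = gaze_x - 0.5  # horizontal offset from screen center
    dy = gaze_y - 0.5  # vertical offset from screen center
    pan = gain * dx if abs(dx) > dead_zone else 0.0
    tilt = -gain * dy if abs(dy) > dead_zone else 0.0
    return pan, tilt

def send_pan_tilt(pan_deg, tilt_deg):
    # Hypothetical placeholder for the camera-holding robot's command interface.
    print(f"pan {pan_deg:+.1f} deg, tilt {tilt_deg:+.1f} deg")

send_pan_tilt(*gaze_to_camera_command(0.85, 0.40))  # looking right -> pan right
```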
Raman Biosensors: we currently have three Raman spectrometers, including a portable probe that can be used inside the body. Our partnership with the Detroit Medical Center has allowed us to create a database of hundreds of tissue samples; this data lets us produce accurate tissue classification techniques.
Robotic Raman Integration: our portable Raman probe is currently tracked in our IGS system. Next, the probe will be attached to the arm of one of our medical robots, enabling controlled, accurate positioning of the probe; tissue scans and classification will be completely automated.
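Once the probe is mounted on a robot arm, an automated scan reduces to a loop: move to a grid point, acquire a spectrum, classify it, and record the result. The sketch below shows that loop; move_probe_to, acquire_spectrum, and classify_spectrum are hypothetical placeholders for the robot, spectrometer, and classifier interfaces.

```python
import numpy as np

def scan_region(grid_points, move_probe_to, acquire_spectrum, classify_spectrum):
    """Automated Raman scan over a list of (x, y, z) positions.

    The three callables are assumed interfaces for robot motion, spectrometer
    acquisition, and tissue classification. Returns (position, label) pairs.
    """
    results = []
    for point in grid_points:
        move_probe_to(point)                 # position the probe with the robot arm
        spectrum = acquire_spectrum()        # collect one Raman spectrum
        label = classify_spectrum(spectrum)  # e.g. "normal" or "cancer"
        results.append((tuple(point), label))
    return results

# Example with stub implementations of the assumed interfaces:
grid = [(x, y, 0.0) for x in np.linspace(0, 10, 3) for y in np.linspace(0, 10, 3)]
stub = scan_region(grid,
                   move_probe_to=lambda p: None,
                   acquire_spectrum=lambda: np.zeros(1024),
                   classify_spectrum=lambda s: "normal")
print(stub[:2])
```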
Raman Spectra Processing: before we can classify a Raman spectrum, we must perform several signal processing steps: background subtraction, noise filtering, normalization, and dimensionality reduction.
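A minimal sketch of such a preprocessing chain, using NumPy/SciPy/scikit-learn rather than our in-house implementation: a low-order polynomial fit for background subtraction, a Savitzky-Golay filter for noise, unit-norm scaling for normalization, and PCA for dimensionality reduction. The polynomial order, filter window, and component count are assumed example values.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA

def preprocess_spectra(spectra, baseline_order=3, window=11, n_components=10):
    """spectra: (n_samples, n_points) array of raw Raman intensities."""
    x = np.arange(spectra.shape[1])
    processed = []
    for spectrum in spectra:
        # 1. Background subtraction: remove a fitted low-order polynomial baseline.
        baseline = np.polyval(np.polyfit(x, spectrum, baseline_order), x)
        corrected = spectrum - baseline
        # 2. Noise filtering: Savitzky-Golay smoothing.
        smoothed = savgol_filter(corrected, window_length=window, polyorder=2)
        # 3. Normalization: unit Euclidean norm so intensities are comparable.
        processed.append(smoothed / (np.linalg.norm(smoothed) + 1e-12))
    processed = np.vstack(processed)
    # 4. Dimensionality reduction: keep the leading principal components.
    return PCA(n_components=n_components).fit_transform(processed)

# Example with random stand-in data (20 spectra of 1024 points each):
features = preprocess_spectra(np.random.rand(20, 1024))
print(features.shape)  # (20, 10)
```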
Raman Classification: processed Raman spectra must be classified as normal, cancer, etc. This is accomplished via learning/statistical algorithms; we are currently using artificial neural networks. The algorithms are trained using our database, and our software uses the training data to classify tissue spectra in real time.
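As a sketch of this step, here is a small multilayer perceptron trained on labeled, preprocessed spectra with scikit-learn; it stands in for (and is not) our actual network architecture or training pipeline, and the stand-in data, layer size, and labels are assumed.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Assumed stand-in data: rows are preprocessed spectra (e.g. PCA features),
# labels are tissue classes such as those in our histopathology database.
X = np.random.rand(200, 10)
y = np.random.choice(["normal", "cancer"], size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)

# A small artificial neural network (one hidden layer of 32 units).
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)

print("held-out accuracy:", clf.score(X_test, y_test))
print("prediction for one new spectrum:", clf.predict(X_test[:1])[0])
```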
Raman Image Guidance: once the tissue is classified, we need to present the result to the surgeon. We have integrated our image-guided display system with the tracked Raman probe; colored markers in the virtual view denote the tissue type. We are investigating better ways to present the data to the surgeon.
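A minimal sketch of the marker idea, assuming a simple label-to-color table and a hypothetical add_marker call standing in for the visualization system’s actual API: each classified scan is drawn at the tracked probe tip in the virtual view.

```python
# Assumed label-to-color table (RGBA); the actual palette is a design choice.
TISSUE_COLORS = {
    "normal": (0.0, 1.0, 0.0, 0.8),   # green
    "cancer": (1.0, 0.0, 0.0, 0.8),   # red
    "unknown": (1.0, 1.0, 0.0, 0.8),  # yellow
}

def add_marker(position_mm, rgba):
    # Hypothetical stand-in for the visualization system's marker call.
    print(f"marker at {position_mm} with color {rgba}")

def show_classification(probe_tip_mm, label):
    """Place a colored marker at the tracked probe tip for a classified scan."""
    add_marker(probe_tip_mm, TISSUE_COLORS.get(label, TISSUE_COLORS["unknown"]))

show_classification((12.3, -4.1, 87.0), "cancer")
```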
Raman Visualization Our visualization system showing 3D models, the tracked Raman probe, collected Raman spectra, and colored classification markers
Future Work: we have developed an augmented reality path planning system that we intend to implement for the Aesop and Zeus. Other plans include improving the Raman classification algorithms, human (surgeon) factors testing, animal and human trials, and other secret ideas.
Robotic Surgery of the Future: the robotic arm will automatically scan tissue and present the results to the surgeon in a 3D display. The system will help the surgeon plan the best path to remove the lesion. The robot’s interface will enable the surgeon to easily perform the surgery, and repetitive tasks will be automated.
Acknowledgements: SSIM, Children’s Hospital of Michigan, Wayne State University, Dr. Raja Rabah, Dr. Janet Poulik, and IEEE
Feedback: thanks for listening! Any questions, comments, or suggestions?
CARES: NASA Project: astronauts remotely control a dexterous manipulator (robotic arm) to insert orbital replacement units on the space station. We are enhancing the control, visualization, and path planning capabilities of the system.
CARES: Military Project: soldiers remotely control a robot to detect explosives under cars. We are improving the control interface and adding explosives detection sensors and equipment to the robot.
CARES: Microsurgery Project: surgeons use a robot to perform minimally invasive surgery. We are researching and enhancing the surgeon’s interfaces with the robot.