Imagine robots performing surgery on patients on the battlefield, or on astronauts in space, with little or no human guidance. As far-fetched as it might seem, the day when this practice becomes commonplace is drawing near, thanks in part to the efforts of a small group of engineers and researchers at Duke University (Durham, North Carolina).
The group said that the results of feasibility studies conducted in its lab offer perhaps the first concrete evidence toward making this vision a reality.
"We're looking at a 10-year-horizon for this to occur," Stephen Smith, director of the Duke University Ultrasound Transducer Group and senior member of the research team, told Medical Device Daily. "Not in the immediate future, but probably in the next 10 years. This procedure could be used on the battlefield or for the space program."
Specifically, the procedure uses 3-D ultrasound probes, a robot and an artificial intelligence program. Together, these elements gave the team a road map toward enabling remote robotic surgery.
The team used a robot system to direct catheters inside synthetic blood vessels. In the latest experiment, the robot was able to direct a needle on the end of the robotic arm to touch the tip of another needle within a blood vessel graft. The robot was guided by a tiny 3-D ultrasound transducer, the "wand" that collects the 3-D images, attached to a catheter commonly used in angioplasty procedures.
The Transducer Group also took a 3-D image of a simulated tumor and sent the images to a computer; the computer, using what Smith calls a "crude" artificial intelligence program, found the center of the tumor and directed the robot to stab it with a needle.
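The article does not describe how the "crude" program located the tumor's center, but a minimal sketch of that kind of target-finding, assuming the tumor shows up as a bright region in a 3-D intensity volume, might look like this (the function name and threshold are illustrative, not from the Duke system):

```python
import numpy as np

def find_tumor_center(volume, threshold=0.5):
    """Locate the centroid of bright (tumor-like) voxels in a 3-D volume.

    A crude approach: threshold the image, then average the coordinates
    of the voxels that survive. The result is a target point that could
    be handed to a robot controller.
    """
    mask = volume > threshold
    if not mask.any():
        return None                  # nothing bright enough to target
    coords = np.argwhere(mask)       # (N, 3) array of z, y, x indices
    return coords.mean(axis=0)       # centroid of the bright region

# Synthetic 3-D "image": a bright blob centered at (10, 20, 30)
volume = np.zeros((32, 40, 48))
volume[8:13, 18:23, 28:33] = 1.0

center = find_tumor_center(volume)
print(center)  # → [10. 20. 30.]
```

A centroid is robust to noise at the voxel level but assumes the tumor is roughly convex; a real guidance system would need segmentation far beyond a single global threshold.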
The results of a series of experiments on the robot system directing catheters inside synthetic blood vessels were published online in the journal IEEE Transactions on Ultrasonics, Ferroelectrics and Frequency Control. A second study, published in April in the journal Ultrasonic Imaging, chronicled the simulated needle biopsy.
"We've been involved with 3-D ultrasound for quite sometime now," Smith told MDD.
In fact, researchers at Duke first developed the 3-D application in 1987 for imaging the heart from outside the body. As the technology enabled ever smaller ultrasound arrays, the researchers engineered tiny probes that could be implanted inside catheters threaded through blood vessels to image the vasculature and heart from the inside out.
Here's how the 3-D ultrasound works.
The ultrasound probe packs 500 tiny cables and sensors into a tube 12 millimeters in diameter, the size required to fit into trocars, the surgical instruments surgeons use for easy exchange of laparoscopic tools. By comparison, 2-D ultrasound probes use just 64 cables, according to a description of how the device works in a Duke news release.
The cables carry electrical signals from the scanner to the sensors at the tip of the tube, which then send pulses of acoustic waves into the surrounding tissue, according to Smith. The sensors pick up the returning echoes and relay the sounds back to the scanner, which in turn produces an image of the moving tissue or organ. The scanner uses parallel processing to listen to echoes of each pulse in 16 directions at once.
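The pulse-echo arithmetic behind that description is simple: depth is recovered from the echo's round-trip time, and 16-way parallel receive means each transmitted pulse yields 16 scan lines instead of one. The sketch below is purely illustrative (the function names, timings and the 1,540 m/s tissue sound speed are standard textbook assumptions, not details from the Duke scanner):

```python
# Illustrative pulse-echo ranging, not the actual Duke system.
SPEED_OF_SOUND = 1540.0  # m/s, typical assumed value for soft tissue

def echo_depth(round_trip_s):
    """Depth of a reflector, given the echo's round-trip travel time.

    The pulse travels to the reflector and back, so the one-way
    distance is half of speed * time.
    """
    return SPEED_OF_SOUND * round_trip_s / 2.0

def parallel_receive(echo_times_by_direction):
    """Form one depth estimate per receive direction for a single pulse.

    Real scanners do this in hardware; listening in 16 directions at
    once multiplies the scan-line rate by 16, which is what makes
    real-time 3-D volumes feasible.
    """
    return [echo_depth(t) for t in echo_times_by_direction]

# One transmit pulse, 16 receive directions, echoes arriving ~65 µs out
times = [65e-6 + i * 1e-6 for i in range(16)]
depths = parallel_receive(times)
print(len(depths), "scan lines;", depths[0], "m")  # 16 lines, first ~5 cm deep
```

The 16x parallelism trades some beam quality for frame rate, a standard compromise in real-time 3-D ultrasound.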
3-D ultrasound's advantage over endoscopic surgical methods is that endoscopic views are two-dimensional and limited, which can impede surgeons' depth perception and make such procedures difficult to master.
Although robots have mostly been used to aid human operators, acting as a sort of extension of the person, Smith's group has shown that robots can go it alone. Ultimately, the group wants to use the technology to access hard-to-reach terrain.
"All this is strictly for our academic research," he said adding that there are no plans at this point for the team to seek any type of regulatory approval for the procedure.
As to what's next for the application, well, the future is wide open, he said.
"In a number of tasks, the computer was able to direct the robot's actions," Smith said in a statement. "We believe that this is the first proof-of-concept for this approach. Given that we achieved these early results with a rudimentary robot and a basic artificial intelligence program, the technology will advance to the point where robots without the guidance of the doctor can someday operate on people."