NASA NEEMO Topside Report: Mission Days 12 and 13: Sunday, April 16th, 2006
Mission: April 2006 Saturation
Today we accomplished the last of the CMAS (Centre for Minimal Access Surgery) research objectives that were waiting to be run. The first involved emergency treatment of joint injuries using ultrasound and telementored arthroscopy; the second was an investigation of haptics. A joint injury, such as a torn meniscus or a dislocation, is the kind of injury that could require emergency treatment by other members of the mission crew. Joint injuries are frequently diagnosed by ultrasound examination and, depending on the injury, may then be treated by arthroscopy. This minimally invasive technique involves creating a number of small incisions through which the surgeon inserts a camera and the surgical instruments necessary to repair the injury. In this experiment, the aquanauts used a portable ultrasound device to perform a diagnostic ultrasound examination on a crewmember’s knee, working from a specially designed training manual with guidance via telementoring from an expert orthopedic surgeon in Hamilton, Ontario. With step-by-step telementoring from Dr. Anthony Adili, they then attempted to repair a simulated joint injury (a torn meniscus) using a medical training model of a knee.
Since telementoring relies on the transmission of video images over a telecommunications network, time delay (latency) becomes an issue when images are sent over very long distances. To study the effect of latencies similar to those that would be experienced during telementoring from Earth to the Moon, the aquanauts also attempted the arthroscopic joint repair with telementoring over a telecommunications network that mimics lunar latency (a 2-second time delay).
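To see why a delay on this order is the right one to simulate, consider the light travel time between Earth and the Moon. The quick calculation below uses the average Earth-Moon distance and the speed of light; these are standard textbook figures, not values taken from the mission's actual network setup.

```python
# Back-of-the-envelope estimate of Earth-Moon signal delay.
# Figures are standard averages, not NEEMO 9 network parameters.

EARTH_MOON_KM = 384_400        # average Earth-Moon distance
LIGHT_SPEED_KM_S = 299_792     # speed of light in vacuum

one_way = EARTH_MOON_KM / LIGHT_SPEED_KM_S   # about 1.3 seconds
round_trip = 2 * one_way                     # about 2.6 seconds

print(f"One-way signal delay: {one_way:.2f} s")
print(f"Round-trip delay:     {round_trip:.2f} s")
```

In other words, a mentor on Earth would not see the result of an instruction until more than two seconds after giving it, which is why the experiment builds a comparable delay into the link.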
Image above: Dave Williams performs a simulated telementored arthroscopic knee surgery. Credit: NASA
Image above: Nicole Stott demonstrates a diagnostic ultrasound on Tim Broderick’s knee. Credit: NASA
This technology may one day enable expert surgeons to guide non-physicians through the procedures necessary to provide emergency surgical care to astronauts injured during space exploration missions, and to patients in remote locations without any access to a physician.
You’ve seen examples so far this mission of a surgeon in a remote location (Hamilton, Ontario) performing surgical techniques using a robotic device. You may be asking, “How does the surgeon feel what the robot is doing?” After all, the feedback from the tools to the hands is a big cue to a surgeon doing his work. Doesn’t he lose all sense of feel when working with a robot? To give the operator the ability to feel, these robotic devices employ a technology called “haptics.” Haptics is the science of applying touch (tactile) sensation and control to interaction with robotic devices. By using special input/output devices, the user can receive feedback from robotic devices in the form of felt sensations in the hand. So, for instance, if the robotic manipulator hit something, the control in the operator’s hand would push back, so that he could feel the contact from the manipulator, thousands of kilometers away.

However, there is a downside to this type of technology: a large enough time delay affects haptics to the point where the user cannot control the device. Time delays are unavoidable whenever large distances are involved, and the larger the distance, the larger the delay. During NEEMO 9, we evaluated a new technology called TiDeC, a time delay compensator that allows a haptic-enabled device to be controlled from a distance of nearly 1,300 miles. Dr. Anvari was again in Hamilton and, using TiDeC-assisted haptics, guided the crew through a series of tasks in which each side could feel every move the other made. Thanks for staying with us!
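The report doesn't describe how TiDeC works internally, but the general idea behind delay compensation in a haptic loop can be sketched: instead of rendering forces from the remote arm's position as it was a delay ago, the controller predicts where the arm is now. The sketch below illustrates that idea with simple linear extrapolation; the class name, update rate, delay, and stiffness values are all illustrative assumptions, not TiDeC parameters.

```python
import collections

# Illustrative sketch of time-delay compensation in a haptic loop.
# TiDeC's actual algorithm is not described in this report; this
# shows one generic approach -- linearly extrapolating the delayed
# remote position to estimate its current value. All names, rates,
# and delays here are hypothetical.

LOOP_HZ = 1000                         # assumed haptic update rate
DELAY_S = 0.15                         # assumed one-way network delay
DELAY_TICKS = int(DELAY_S * LOOP_HZ)   # delay expressed in loop ticks

class DelayCompensator:
    """Predicts the remote manipulator's current position from
    samples that arrive DELAY_TICKS ticks late."""

    def __init__(self):
        self.history = collections.deque(maxlen=2)

    def update(self, delayed_sample: float) -> float:
        self.history.append(delayed_sample)
        if len(self.history) < 2:
            return delayed_sample
        # Estimate velocity from the last two delayed samples and
        # extrapolate forward across the known delay.
        velocity = self.history[-1] - self.history[-2]  # per tick
        return self.history[-1] + velocity * DELAY_TICKS

def feedback_force(operator_pos: float, predicted_remote_pos: float,
                   stiffness: float = 0.5) -> float:
    # Simple spring coupling: the farther the predicted remote
    # position is from the operator's hand, the harder the control
    # pushes back.
    return stiffness * (predicted_remote_pos - operator_pos)

# Tiny demo: a remote position ramping upward, observed with delay.
comp = DelayCompensator()
for delayed_pos in [0.000, 0.001, 0.002]:
    estimate = comp.update(delayed_pos)
print(f"Delayed reading: {delayed_pos:.3f}, predicted now: {estimate:.3f}")
```

The extrapolation step is the point: the operator's hand feels forces based on an estimate of the manipulator's current position rather than its stale, delayed one, which is one way a compensator can keep a long-distance haptic loop controllable.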
– NEEMO 9 Topside Team