Nursebot 3000, Please Report to the O.R., Stat

At work last night, R.J. (the only coworker I can hold a moderately intelligent conversation with) and I were discussing robots/cyborgs, the current limitations of technology and cutting-edge robotics research, and the definition of humanity. And by “R.J. and I,” I mostly mean R.J. would say something, then I would explain why he was wrong or elaborate on the topic for him.

Which is funny, given that, when I got home today, I was browsing one of my science news sites and discovered an interesting article about robotics that relates directly to the world of health care.

Juan Pablo Wachs, an assistant professor of industrial engineering at Purdue University, is working on a robotic scrub nurse.

That’s right… a nursebot.

While not the first robotic nurse to hit the hospital scene [Do hospitals even have scenes? In my experience, the answer is no. Latex gloves, sure. Irritating, beeping machines and oxygen hookups? Oh yes. A “scene”? Unless that scene involves disinfectant and intense periods of boredom, I’m gonna have to go with a no.], Wachs’ robonurse has something the others don’t: hand gesture recognition technology.

You know… like Tom Cruise uses in Minority Report.

What was once relegated solely to the land of science fiction films could now become a reality. But… why use this kind of tech to replace nurses?

The reasoning behind Wachs’ research centers on improving operating room efficiency. He’s hoping that the robotic scrub nurse could help speed up surgeries and, more importantly, lower the infection risk in the O.R.

In terms of potential infection risk, simply removing another germy human from the room isn’t the only benefit of the nurse-o-tron. Wachs’ nurse would not only recognize hand gestures and hand surgeons the proper tool; it would also display medical records and images. Currently, surgeons have to step away from the operating table and handle a keyboard and mouse to access these files, which both delays the surgery and increases the risk of spreading infectious bacteria. The robot nurse would provide hands-free access to this information right at the operating table. Which, I’ll admit, would be terribly useful.

Of course, this technology isn’t quite ready yet. Anyone who has handled a Wii or Kinect or whatever-the-fuck-the-Playstation-version-of-this-shit-is knows about the limitations of commercial motion capture systems. And, while Wachs’ version is much more precise than the video game versions, it’s still imperfect. One of the biggest challenges he’s facing is developing “…the proper shapes of hand poses and the proper hand trajectory movements to reflect and express certain medical functions,” Wachs said. “You want to use intuitive and natural gestures for the surgeon, to express medical image navigation activities, but you also need to consider cultural and physical differences between surgeons. They may have different preferences regarding what gestures they may want to use.”

Not only that, but Wachs’ algorithms need to include a way for the computer to understand the context of the gestures, namely the ability to discriminate between purposeful and accidental gestures.

“Say the surgeon starts talking to another person in the operating room and makes conversational gestures,” Wachs said. “You don’t want the robot handing the surgeon a hemostat.”
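To give a rough sense of what “discriminating between purposeful and accidental gestures” could look like, here’s a toy sketch. This is emphatically not Wachs’ actual algorithm; the command names, pose templates, dwell-time threshold, and match tolerance are all invented for illustration. The basic idea it demonstrates is dwell-time gating: only treat a gesture as a command if the hand holds a recognized pose steadily, since conversational hand-waving jumps around too much to settle.

```python
# Toy sketch of the "purposeful vs. accidental gesture" problem.
# Poses are simplified to a made-up (openness, tilt) pair; real systems
# would work with full hand-tracking data.

COMMANDS = {
    "scalpel": (0.9, 0.1),   # invented pose templates
    "hemostat": (0.2, 0.8),
}

DWELL_FRAMES = 15   # pose must be held ~half a second at 30 fps
TOLERANCE = 0.15    # how closely a pose must match a template

def match(pose):
    """Return the command whose template is closest to the pose, if close enough."""
    best, best_dist = None, TOLERANCE
    for name, (openness, tilt) in COMMANDS.items():
        dist = max(abs(pose[0] - openness), abs(pose[1] - tilt))
        if dist < best_dist:
            best, best_dist = name, dist
    return best

def interpret(frames):
    """Emit a command only if the same match is held for DWELL_FRAMES straight."""
    streak, current = 0, None
    for pose in frames:
        cmd = match(pose)
        if cmd is not None and cmd == current:
            streak += 1
            if streak >= DWELL_FRAMES:
                return cmd
        else:
            current, streak = cmd, (1 if cmd else 0)
    return None  # jittery conversational gestures never settle: no command

# A steady "hemostat" pose triggers; talking-with-your-hands doesn't.
steady = [(0.2, 0.8)] * 20
waving = [(0.2, 0.8), (0.5, 0.4), (0.9, 0.1), (0.3, 0.6)] * 5
```

So the surgeon chatting with the anesthesiologist never racks up fifteen consecutive matching frames, and the robot keeps its hemostat to itself.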

He’s also working on giving the robot prediction abilities, so it would be able to anticipate what images or tools the surgeons would need next.
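The article doesn’t say how that anticipation would work, but one of the simplest conceivable versions is just counting what instrument usually follows what, from logs of past surgeries. The sketch below is my own invention, not Wachs’ method, and the logged sequences are made up; it’s a bare first-order “most common next instrument” guess.

```python
# Toy sketch of anticipation: predict the next instrument from the
# current one using counts from (fictional) past surgery logs.
from collections import Counter, defaultdict

def train(sequences):
    """Count how often each instrument follows each other instrument."""
    following = defaultdict(Counter)
    for seq in sequences:
        for current, nxt in zip(seq, seq[1:]):
            following[current][nxt] += 1
    return following

def predict(following, current):
    """Return the most common follower of the current instrument, or None."""
    if not following[current]:
        return None
    return following[current].most_common(1)[0][0]

# Invented example logs: each list is one surgery's instrument order.
logs = [
    ["scalpel", "hemostat", "suture"],
    ["scalpel", "hemostat", "sponge", "suture"],
    ["scalpel", "retractor", "hemostat", "suture"],
]
model = train(logs)
```

A real system would obviously need to fold in the type of procedure, the stage of the surgery, and the individual surgeon’s habits, which is where this stops being a weekend script and starts being a research program.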

I’m torn on this issue. I’m a health care worker with a more menial job than a scrub nurse. If anyone is going to be replaced by a robot, I feel it would be me. To see them focusing on jobs more important than mine makes me feel like I should be starting the job hunt again soon, as there’s no hope my job will last much longer (not like I want to be in this damn job much longer, but that’s my choice, dammit, not some usurping robot’s).

As a tech geek, however, I think this is awesome.

On a practical note, though, this is tech that’s a long way from seeing completion. Without the anticipatory abilities of an actual human being, the robot nurse would only serve to slow down the surgical process. A good scrub nurse knows their shit and can accurately predict what a surgeon will need next probably 85% of the time (critical emergency situations aside). This actually speeds the surgery up, because the surgeon doesn’t have to waste time asking for the next implement.

And based on my recent readings on robotics and A.I. research (were I not so tired, this post would have covered that in more depth), the outlook isn’t great. Scientists are currently attempting to create a robot that can learn via the look-and-follow method all children use (a primary way children learn to identify objects is by tracking their parents’ head movements to determine which object is being referred to), but they have yet to produce a solid working model. Since the anticipatory features of Wachs’ scrub bot rest on the same principles as that line of A.I. research, I just don’t see this becoming a feasible, working nursebot for at least another year or two.

So… I guess my job’s probably safe for a while.

Yay?
