
Gesture Recognition Could Control Future Robot Scrub Nurses

By HospiMedica International staff writers
Posted on 17 Feb 2011
An innovative support system uses a camera and specialized algorithms to recognize hand gestures as commands to instruct a computer or robot.

Researchers at Purdue University (West Lafayette, IN, USA), the Naval Postgraduate School (Monterey, CA, USA), and Ben-Gurion University of the Negev (Beer Sheva, Israel) have developed a prototype robotic scrub nurse that can respond to gestures, based on three-dimensional (3D) hand-gesture recognition using the Microsoft (Redmond, WA, USA) Kinect camera. The 3D space-sensing camera is currently found in consumer video games, where it tracks a player's hands without the need for a controller wand.

However, the enhanced accuracy and gesture-recognition speed needed for surgery depend on advanced software algorithms being developed by the researchers: algorithms that isolate the hands and apply "anthropometry," predicting the position of the hands based on knowledge of where the surgeon's head is located. Prerequisites include using a small vocabulary of simple, easily recognizable gestures; not requiring the user to wear special virtual reality gloves or particular types of clothing; being responsive enough to keep up with the speed of a surgeon's hand gestures; and letting the user know whether a gesture has been understood by providing audible feedback.
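As a rough, hypothetical illustration of that anthropometry step (the function names, proportions, and data format below are assumptions for illustration, not the researchers' published algorithm), a system might restrict its hand search to a 3D region predicted from the tracked head position:

import numpy as np

# Minimal sketch of the anthropometry idea: restrict the hand search to a
# region predicted from the tracked head position. All names, proportions,
# and the point-cloud format are illustrative assumptions.

def hand_search_box(head_xyz, arm_span_m=1.7):
    """Predict a 3D box where the hands are likely to appear, using the
    anthropometric rule of thumb that reach is about half the arm span."""
    head = np.asarray(head_xyz, dtype=float)
    reach = arm_span_m / 2.0
    lo = head - np.array([reach, reach, reach])
    hi = head + np.array([reach, 0.2, reach])  # hands rarely rise far above the head
    return lo, hi

def isolate_hand_points(points_xyz, box):
    """Keep only 3D points that fall inside the predicted box."""
    lo, hi = box
    pts = np.asarray(points_xyz, dtype=float)
    mask = np.all((pts >= lo) & (pts <= hi), axis=1)
    return pts[mask]

# Example: with the head at (0, 1.6, 2.0) m, a point near chest height
# survives the filter while a point across the room is discarded.
box = hand_search_box((0.0, 1.6, 2.0))
print(isolate_hand_points([(0.3, 1.2, 2.1), (2.5, 1.0, 4.0)], box))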

The researchers are developing gestures that are easy for surgeons to learn, remember, and carry out with little physical exertion, basing them on intuitive motions such as two fingers held apart to mimic a pair of scissors. The system also needs to disregard unintended gestures by the surgeon (perhaps made in conversation with colleagues in the operating room), and it must be able to configure itself quickly to work properly in different operating rooms, under various lighting conditions, and with other variable factors. The study describing the system was published in the February 2011 issue of Communications of the Association for Computing Machinery (ACM).
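One plausible way to reject incidental gestures (a sketch under assumed parameters, not the published method) is to require that a recognized gesture be held steadily for a run of consecutive camera frames before it is accepted as a command:

from collections import deque

# Hypothetical debouncing filter: a command fires only when the same gesture
# is recognized with high confidence over several consecutive frames, which
# helps reject incidental hand movements made during conversation.

class GestureDebouncer:
    def __init__(self, hold_frames=15, min_confidence=0.8):
        self.hold_frames = hold_frames
        self.min_confidence = min_confidence
        self.history = deque(maxlen=hold_frames)

    def update(self, gesture, confidence):
        """Return the gesture label once it has been held steadily; else None."""
        self.history.append(gesture if confidence >= self.min_confidence else None)
        if len(self.history) == self.hold_frames and len(set(self.history)) == 1:
            committed = self.history[0]
            self.history.clear()  # require a fresh hold before the next command
            return committed
        return None

# Example: "scissors" must be held for 15 frames (about 0.5 s at 30 fps).
debouncer = GestureDebouncer()
command = None
for frame_gesture, conf in [("scissors", 0.95)] * 15:
    command = debouncer.update(frame_gesture, conf)
print(command)  # -> "scissors"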

"While it will be very difficult using a robot to achieve the same level of performance as an experienced nurse who has been working with the same surgeon for years, often scrub nurses have had very limited experience with a particular surgeon, maximizing the chances for misunderstandings, delays and sometimes mistakes in the operating room," said codeveloper Juan Pablo Wachs, PhD, an assistant professor of industrial engineering at Purdue. "In that case, a robotic scrub nurse could be better."

"Another contribution is that by tracking a surgical instrument inside the patient's body, we can predict the most likely area that the surgeon may want to inspect using the electronic image medical record, and therefore saving browsing time between the images," added Dr. Wachs. "This is done using a different sensor mounted over the surgical lights."

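A minimal sketch of that image-browsing idea, assuming hypothetical stored slice positions and a tracked instrument-tip coordinate (none of which are specified in the article), might simply select the stored image closest to the instrument:

import numpy as np

# Illustrative only: pick the stored image slice whose recorded position is
# closest to the tracked instrument tip, so the surgeon's likely region of
# interest can be preloaded. Positions and tracking input are assumptions.

def nearest_image_slice(tip_xyz, slice_positions):
    """Return the index of the slice closest to the tip (Euclidean distance)."""
    offsets = np.asarray(slice_positions, dtype=float) - np.asarray(tip_xyz, dtype=float)
    return int(np.argmin(np.linalg.norm(offsets, axis=1)))

# Example: three slices along the z-axis; the tip is nearest slice 1.
slices = [(0.0, 0.0, 0.10), (0.0, 0.0, 0.15), (0.0, 0.0, 0.20)]
print(nearest_image_slice((0.01, 0.0, 0.14), slices))  # -> 1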
Related Links:
Purdue University
Naval Postgraduate School
Ben-Gurion University of the Negev
Microsoft


