AI Study Sees through Walls and Occlusions

By HospiMedica International staff writers
Posted on 28 Jun 2018
A new study describes how artificial intelligence (AI) can analyze radio signals reflected off people's bodies in order to estimate posture and movement, even through walls.

The Massachusetts Institute of Technology (MIT, Cambridge, MA, USA) RF-Pose project is based on a deep neural network that parses wireless signals in WiFi frequency bands in order to estimate human poses and postures. One stumbling block is that teaching AI networks to identify visual patterns normally relies on human annotation; since radio signals cannot be annotated by hand, the researchers instead used a state-of-the-art vision model to provide cross-modal supervision.

Image: A new study shows how artificial intelligence can identify human motion and posture, even through walls (Photo courtesy of CSAIL).

This involved collecting thousands of examples of wireless device data paired with photographic images of people doing activities such as walking, talking, sitting, opening doors, and waiting for elevators. The researchers then used the images to extract stick figures, which they showed to the neural network along with the corresponding radio signal. The paired data enabled the system to learn the association between the radio signal and the stick figures of the people in a given scene. Once trained, the network used only the wireless signal for pose estimation.
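The cross-modal supervision described above can be sketched in miniature: a "teacher" vision model converts camera frames into pose targets, and a "student" model that sees only the paired radio signal is trained to reproduce those targets. The sketch below is illustrative only, with toy linear models and synthetic data standing in for the actual RF-Pose networks; all names (`vision_teacher`, dimensions, keypoint count) are assumptions, not the study's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy paired dataset: each "radio signal" is a feature vector; the matched
# camera view of the same scene encodes the same underlying pose.
n_samples, rf_dim, n_keypoints = 200, 32, 14  # 14 stick-figure keypoints (assumed)
rf_signals = rng.normal(size=(n_samples, rf_dim))

# Hidden mapping from RF features to pose, unknown to the student.
true_W = rng.normal(size=(rf_dim, n_keypoints))

def vision_teacher(rf):
    """Stand-in for the vision model: emits keypoint targets per frame."""
    return rf @ true_W

# Cross-modal labels come from the teacher, not from human annotators.
targets = vision_teacher(rf_signals)

# Student: a linear model trained by gradient descent to match the teacher
# using ONLY the radio signal as input.
W = np.zeros((rf_dim, n_keypoints))
lr = 0.1
for _ in range(500):
    pred = rf_signals @ W
    grad = rf_signals.T @ (pred - targets) / n_samples
    W -= lr * grad

mse = np.mean((rf_signals @ W - targets) ** 2)
print(f"student MSE vs teacher targets: {mse:.6f}")
```

After training, the student's predictions closely match the teacher's pose targets even though it never sees the images, which is the essence of the supervision transfer the study relies on.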

The results showed that, when tested on visible scenes, the radio-based system is almost as accurate as the vision-based system used to train it. But unlike vision-based pose estimation, the radio-based system can also estimate two-dimensional (2D) poses through walls, despite never having been trained on such scenarios. The researchers suggest the system could monitor patients with Parkinson's disease, multiple sclerosis (MS), and other conditions, as well as provide added security for seniors at home by monitoring falls, injuries, and changes in activity patterns. The study was presented at the annual conference on Computer Vision and Pattern Recognition (CVPR), held during June 2018 in Salt Lake City (UT, USA).

“Just like how cellphones and Wi-Fi routers have become essential parts of today's households, I believe that wireless technologies like these will help power the homes of the future,” said senior author Professor Dina Katabi, PhD, of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). “We've seen that monitoring patients' walking speed and ability to do basic activities on their own gives healthcare providers a window into their lives that they didn't have before, which could be meaningful for a whole range of diseases.”

Related Links:
Massachusetts Institute of Technology
