AI Study Sees through Walls and Occlusions
By HospiMedica International staff writers | Posted on 28 Jun 2018

Image: A new study shows how artificial intelligence can identify human motion and posture, even through walls (Photo courtesy of CSAIL).
A new study describes how artificial intelligence (AI) can be used to analyze radio signals bouncing off people's bodies so as to study posture and movement, even through walls.
The Massachusetts Institute of Technology (MIT, Cambridge, MA, USA) RF-Pose project is based on a deep neural network that parses wireless signals at WiFi frequencies to estimate human poses and postures. One stumbling block is that teaching neural networks to identify visual patterns relies on human annotation; since radio signals cannot be annotated directly by humans, the researchers instead used a state-of-the-art vision model to provide cross-modal supervision.
This involved collecting thousands of examples of both wireless device data and matched photographic images of people doing activities like walking, talking, sitting, opening doors, and waiting for elevators. They then used the images to extract stick figures, which they showed to the AI neural network along with the corresponding radio signal. The combined data enabled the AI system to learn the association between the radio signal and the stick figures of the people in a given scene. Once trained, the network used only the wireless signal for pose estimation.
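The cross-modal supervision described above can be illustrated with a toy sketch. Everything here is hypothetical: the one-dimensional linear "student", the `teacher_pose_from_image` stand-in, and the simulated paired data are illustrative assumptions, not the paper's architecture (which uses deep convolutional networks and real RF heatmaps).

```python
import random

# Hypothetical sketch of cross-modal supervision: a vision "teacher"
# labels paired (image, radio) samples, and a radio "student" learns
# to predict those labels from the radio signal alone.

random.seed(0)

def teacher_pose_from_image(image):
    # Stand-in for an off-the-shelf vision pose estimator: in this toy
    # setup the "pose" is just a known linear function of the scene.
    return 2.0 * image + 1.0

# Paired training data: each scene yields an image feature and a
# correlated radio feature (radio = image + noise in this toy setup).
scenes = [random.uniform(-1, 1) for _ in range(200)]
pairs = [(s, s + random.gauss(0, 0.05)) for s in scenes]

# Student: predict the teacher's pose label from the radio signal only,
# trained by plain stochastic gradient descent on squared error.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    for image, radio in pairs:
        target = teacher_pose_from_image(image)   # cross-modal label
        pred = w * radio + b
        err = pred - target
        w -= lr * err * radio                     # gradient step on w
        b -= lr * err                             # gradient step on b

# After training, the student approximates the teacher's output while
# looking only at the radio channel -- the images are no longer needed.
image, radio = pairs[0]
print(abs((w * radio + b) - teacher_pose_from_image(image)) < 0.5)
```

The key property mirrored here is the one the article describes: once trained, the radio-side model is queried with the wireless signal alone, and the vision model is only needed at training time.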
The results showed that when tested on visible scenes, the radio-based system is almost as accurate as the vision-based system used to train it. But unlike vision-based pose estimation, the radio-based system can also estimate two-dimensional (2D) poses through walls, despite never being trained on such scenarios. The researchers suggest the system could monitor patients with Parkinson's disease, multiple sclerosis (MS), and other conditions, as well as provide added security for seniors at home by monitoring falls, injuries, and changes in activity patterns. The study was presented at the annual conference on Computer Vision and Pattern Recognition (CVPR), held during June 2018 in Salt Lake City (UT, USA).
“Just like how cellphones and Wi-Fi routers have become essential parts of today's households, I believe that wireless technologies like these will help power the homes of the future,” said senior author Professor Dina Katabi, PhD, of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). “We've seen that monitoring patients' walking speed and ability to do basic activities on their own gives healthcare providers a window into their lives that they didn't have before, which could be meaningful for a whole range of diseases.”
Related Links:
Massachusetts Institute of Technology