Intelligent Camera System Continuously Monitors Premature Babies in NICU

By HospiMedica International staff writers
Posted on 09 Feb 2026

The neonatal period is a critical stage of life, with premature babies facing a higher risk of short- and long-term complications that require close observation. In neonatal intensive care units (NICUs), nurses must care for multiple infants while managing feeds, procedures, and documentation, making continuous visual monitoring impossible. As a result, subtle changes in infant posture or movement can be missed. Researchers have now demonstrated a non-contact, artificial intelligence (AI)-based visual monitoring approach that provides continuous insight into infant behavior without disrupting care.

Researchers at the University of Cambridge (Cambridge, UK) have developed a novel 3D camera setup that uniquely combines Red, Green, and Blue (RGB) imaging, depth sensing, and infrared imaging to monitor premature babies in neonatal intensive care units. This multi-modal system was designed to function reliably across real-world NICU conditions, including darkness, incubator coverings, and frequent clinical interventions.


Image: What the 3D camera …

Using machine learning and pose estimation, the system automatically tracks key body points such as shoulders and hips to determine infant position and behavior. The researchers recorded one-hour and 24-hour sessions from 24 babies in the NICU, capturing real clinical scenarios including parental contact, nursing care, lighting changes, and equipment interference. Combining multiple image types enabled effective monitoring even when babies were covered or the room was dark.
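As an illustration of the kind of inference keypoint tracking enables (this is a hypothetical sketch, not the researchers' published code), depth values at tracked shoulder points could be used to distinguish a side-lying infant from one lying flat. The keypoint names and threshold below are assumptions for illustration only:

```python
# Illustrative sketch (not the authors' method): inferring lying position
# from pose keypoints that carry a depth value, as an RGB-D pose model
# might produce. Keypoint names and the threshold are assumptions.

def classify_position(keypoints, side_threshold_mm=40.0):
    """Classify lying position from shoulder keypoints.

    keypoints: dict mapping name -> (x_px, y_px, depth_mm).
    Returns "side" if the shoulders sit at clearly different depths
    (the body is rolled over), else "flat" (back or stomach, which
    depth alone cannot distinguish).
    """
    ls = keypoints["left_shoulder"]
    rs = keypoints["right_shoulder"]
    depth_gap_mm = abs(ls[2] - rs[2])
    return "side" if depth_gap_mm > side_threshold_mm else "flat"

# Shoulders at similar depth -> lying flat on back or stomach
flat_pose = {
    "left_shoulder": (120, 80, 510.0),
    "right_shoulder": (180, 82, 505.0),
}
# One shoulder much closer to the camera -> side-lying
side_pose = {
    "left_shoulder": (130, 85, 450.0),
    "right_shoulder": (150, 88, 520.0),
}

print(classify_position(flat_pose))  # flat
print(classify_position(side_pose))  # side
```

A real system would of course smooth such decisions over time and handle occluded or low-confidence keypoints, which the study identifies as the hard cases.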

The AI models performed better than the average human annotator when babies were uncovered or partially covered and lying on their backs or stomachs. Performance was lower for fully covered babies and those lying on their side, highlighting challenges unique to the NICU environment. Importantly, the findings, published in npj Digital Medicine, showed that combining RGB, depth, and infrared data consistently outperformed single-image approaches.
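One simple way to combine predictions from multiple image types is late fusion, where each modality's model runs independently and, per keypoint, the most confident estimate wins. The following is a hypothetical sketch of that idea (the paper does not specify its fusion strategy, and the modality names and confidence values here are invented for illustration):

```python
# Illustrative sketch (assumption, not the published method): late fusion
# of keypoint predictions from RGB, depth, and infrared models, keeping
# the most confident estimate for each keypoint.

def fuse_keypoints(predictions):
    """predictions: dict modality -> {keypoint: (x, y, confidence)}.

    Returns {keypoint: (x, y, confidence)}, taking for each keypoint
    the estimate from whichever modality was most confident.
    """
    fused = {}
    for per_keypoint in predictions.values():
        for name, (x, y, conf) in per_keypoint.items():
            if name not in fused or conf > fused[name][2]:
                fused[name] = (x, y, conf)
    return fused

preds = {
    "rgb":   {"left_hip": (50, 90, 0.30)},  # dark room: RGB is unreliable
    "depth": {"left_hip": (52, 91, 0.75)},
    "ir":    {"left_hip": (51, 92, 0.60)},
}
print(fuse_keypoints(preds))  # {'left_hip': (52, 91, 0.75)}
```

This captures why multi-modal input helps in practice: when lighting changes or covers defeat one sensor, another can still supply a usable estimate.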

Continuous, automated pose estimation could support early detection of developmental abnormalities and improve long-term monitoring of premature infants. The researchers are now extending the system to estimate vital signs such as heart rate and respiratory rate and to develop neonatal-specific scoring systems based on clinician-labeled data. Future work will also focus on detecting limb movement and identifying meaningful motion patterns to support clinical decision-making and reduce staff burden.

“This clinical study was a chance for us to try something new, with our 24-hour recordings offering a better representation of the real-life NICU environment,” said co-author Dr Alex Grafton, who led the study. “Accurate pose estimation can contribute to the early detection of developmental abnormalities and ongoing monitoring of premature babies’ health. These advancements offer valuable insights that can inform clinical decisions and interventions, ultimately improving the standard of care for newborn babies.”

Related Links:
University of Cambridge

