There is an increasing need to remotely monitor people in daily life using radio-frequency probe signals. However, conventional systems are difficult to deploy in real-world settings because they typically require subjects either to cooperate deliberately or to carry an active wireless device or identification tag. To accomplish complicated successive tasks in real time with a single device, we propose a smart metasurface that acts simultaneously as an imager and a recognizer, empowered by a network of artificial neural networks (ANNs) that adaptively control the data flow. Three ANNs are employed in an integrated hierarchy: the first transforms measured microwave data into images of the whole human body, the second locates specifically designated spots (hand and chest) within the whole image, and the third instantly recognizes human hand signs, all at a Wi-Fi frequency of 2.4 GHz. Instantaneous in situ full-scene imaging and adaptive recognition of the hand signs and vital signs of multiple non-cooperative people were demonstrated experimentally. We also show that the proposed intelligent metasurface system works well even when it is passively excited by the stray Wi-Fi signals that exist ubiquitously in our daily lives. The reported strategy could open a new avenue for future smart cities, smart homes, human-device interaction interfaces, health monitoring, and safety screening free of visual privacy issues.
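The three-ANN hierarchy described above can be pictured as a pipeline in which each stage gates the next: raw microwave data become a body image, the image is searched for designated spots, and only the located hand region is passed to the recognizer. The following is a minimal, hypothetical sketch of that data flow; the function names, shapes, and stub "networks" are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of the hierarchical data flow (stub "ANNs", not real models).

def imaging_ann(microwave_data):
    """Stage 1 (hypothetical): map raw microwave measurements
    to a whole-body image; here just a magnitude map."""
    return [[abs(v) for v in row] for row in microwave_data]

def spot_classifier_ann(body_image):
    """Stage 2 (hypothetical): locate designated spots in the image.
    Here the brightest pixel stands in for the detected hand spot."""
    flat = [(val, r, c) for r, row in enumerate(body_image)
                        for c, val in enumerate(row)]
    _, r, c = max(flat)
    return {"hand": (r, c)}

def sign_recognizer_ann(image_patch):
    """Stage 3 (hypothetical): classify a hand sign from a local patch."""
    return "sign_A" if sum(image_patch) > 0 else "none"

def pipeline(microwave_data):
    """Adaptive flow: later stages consume only what earlier stages select."""
    image = imaging_ann(microwave_data)
    spots = spot_classifier_ann(image)
    r, _ = spots["hand"]
    patch = image[r]  # illustrative: a local region around the hand spot
    return sign_recognizer_ann(patch)
```

The point of the sketch is the control structure, not the models: the recognizer never sees the full scene, only the region the classifier singles out, which is what lets one device chain successive tasks in real time.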
Machine learning: Metasurfaces gain the power of recognition
Combining radio-frequency imaging with artificial intelligence could make it easier for computers to interact with individuals using non-verbal cues, such as sign language. Lianlin Li from Peking University in Beijing, China, and Tie Jun Cui from Southeast University in Nanjing, China, and co-workers fabricated a meter-scale flat panel containing 'meta-atoms', tiny electronic devices that manipulate the phases of electromagnetic waves, arranged in a grid-like pattern. By emitting microwave signals or manipulating stray Wi-Fi signals and detecting the echoes bounced back, the metasurface can collect high-resolution imaging data on multiple non-cooperative subjects, even those behind solid walls. The team fed the microwave data to a series of artificial intelligence algorithms that first identify human shapes, then modify the signal distributions to focus on specific body parts, and finally recognize people's hand signs and vital signs. Experiments showed that this setup could continuously monitor hand signals and breathing, even using the stray Wi-Fi signals that exist ubiquitously in daily life.