Image Credit - Brobible

Robots See Through Walls With Radio Waves

Breaking Barriers in Robotic Vision

When Mingmin Zhao’s student activated a smoke machine during a late-night robotics experiment at the University of Pennsylvania, the outcome was unexpected. Moments later, fire alarms blared across the building, forcing an abrupt evacuation. “My student called me, shocked,” recalls Zhao, a professor leading research into advanced robotic sensing systems. Despite the disruption, the incident underscored a critical breakthrough: robots equipped with radio-wave vision could one day navigate environments opaque to human eyes, such as smoke-filled buildings or torrential rain.

The Promise of Radio-Wave Sensing

Traditional robotic vision relies heavily on optical cameras, Lidar, or infrared sensors. These tools, while effective in clear conditions, falter in low visibility. For instance, Lidar—a laser-based method—struggles to penetrate fog, while cameras fail in darkness. In contrast, radio waves, part of the electromagnetic spectrum, bypass these limitations. With wavelengths ranging from one millimetre to several kilometres, they are far larger than the droplets and particles that scatter visible light, so they pass through smoke or rain largely undeterred.
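The wavelengths quoted above follow directly from the relation λ = c / f. A minimal sketch (the band frequencies below are illustrative round numbers, not values from the article) shows why millimetre waves earn their name:

```python
# Free-space wavelength from frequency: lambda = c / f
C = 299_792_458  # speed of light, m/s

def wavelength_m(freq_hz: float) -> float:
    """Return the free-space wavelength in metres for a given frequency."""
    return C / freq_hz

# Illustrative bands, for intuition only
bands = {
    "5G mmWave (60 GHz)": 60e9,
    "Wi-Fi (2.4 GHz)": 2.4e9,
    "AM radio (1 MHz)": 1e6,
}

for name, f in bands.items():
    print(f"{name}: wavelength {wavelength_m(f):.4g} m")
```

A 60 GHz signal comes out at roughly 5 mm, orders of magnitude longer than the micron-scale particles in smoke, which is why such waves pass through it largely unscattered.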

Zhao’s team has harnessed this principle, designing a robot that emits millimetre-wave radio signals—the same technology underpinning 5G networks. A spinning array on the device sweeps the waves in all directions, while an AI algorithm processes the reflections to construct a 3D map of the surroundings. “We’re giving robots superhuman vision,” explains Zhao. Early tests, though interrupted by fire alarms, proved the system could “see” through smoke-filled enclosures. Subsequent trials used sealed plastic boxes to contain smoke, avoiding further disruptions.
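The team’s actual reconstruction pipeline is not described in detail here, but the core geometry of radar mapping is simple: each echo’s round-trip time gives a range (d = c·t / 2), and the antenna’s pointing angles at that instant place the reflection in 3D space. A rough, hypothetical sketch of that step, using toy echo data from one full spin of the array:

```python
import math

C = 299_792_458  # speed of light, m/s

def echo_to_point(round_trip_s: float, azimuth_rad: float, elevation_rad: float):
    """Convert one echo's round-trip time and the antenna's pointing
    angles into a Cartesian (x, y, z) point in metres.

    The distance is half the round-trip path: d = c * t / 2.
    """
    d = C * round_trip_s / 2.0
    x = d * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = d * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = d * math.sin(elevation_rad)
    return (x, y, z)

# Toy data: one spin, echoes sampled every 90 degrees of azimuth.
# A 6.67 ns round trip corresponds to roughly 1 m of range.
echoes = [(6.67e-9, math.radians(a), 0.0) for a in range(0, 360, 90)]
cloud = [echo_to_point(t, az, el) for t, az, el in echoes]
```

Accumulating such points over many rotations yields the kind of 3D point cloud a learned model can then refine into a usable map.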

Beyond Line of Sight: Seeing Around Corners

One striking application of this technology is its ability to detect objects outside direct lines of sight. By analysing how radio waves bounce off surfaces, the robot can infer shapes hidden behind walls or around corners. Zhao likens this to a “hall of mirrors” effect, where reflections create a composite image of obscured spaces. Friedemann Reinhard, a physicist at the University of Rostock, praises the approach: “It’s impressive work with real-world potential.”

Reinhard, who in 2017 demonstrated how Wi-Fi signals could spy through walls, notes a key challenge: the spinning array captures data incrementally, requiring extensive computational power to stitch together coherent images. Still, millimetre-wave systems offer advantages. Unlike bulky radar setups, they integrate seamlessly into compact devices. Moreover, their compatibility with existing 5G infrastructure could lower costs. Reinhard adds, “Self-driving cars relying solely on radar? It’s feasible—and economical.”

Competing Approaches in Radio Vision

Not all radio-wave systems depend on moving parts. Fabio da Silva, CEO of US-based Wavsens, advocates for static antenna arrays paired with advanced algorithms. His firm’s technology mimics echolocation: emitting waves and interpreting echoes to map environments in real time. “Spinning antennas introduce mechanical complexity,” argues da Silva. “Our system senses continuously, without motion.”
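Wavsens’ algorithms are proprietary, but the pulse-echo principle da Silva describes can be illustrated generically: slide the transmitted pulse along the received signal and pick the offset where they match best (a discrete cross-correlation, the heart of matched-filter ranging). A toy sketch with hypothetical sample data:

```python
def estimate_delay(tx, rx):
    """Estimate the echo delay in samples by sliding the transmitted
    pulse over the received signal and keeping the best-matching offset
    (a discrete cross-correlation / matched filter)."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(rx) - len(tx) + 1):
        score = sum(t * rx[lag + i] for i, t in enumerate(tx))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Toy example: a short pulse echoed back 7 samples later, attenuated by half
pulse = [1.0, 2.0, 3.0, 2.0, 1.0]
received = [0.0] * 7 + [0.5 * s for s in pulse] + [0.0] * 5
lag = estimate_delay(pulse, received)
# Range then follows from the sampling rate fs: d = c * (lag / fs) / 2
```

Because nothing moves, a static array can repeat this estimate continuously across many antennas at once, which is the advantage da Silva claims over mechanically scanned designs.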

This method has already shown promise in security contexts. In 2022, researchers at the Fraunhofer Institute in Germany used similar principles to detect concealed weapons, achieving 95% accuracy in identifying hidden firearms. Meanwhile, governments have explored radio-wave “fingerprinting” to monitor sensitive sites. A 2023 proposal by German scientists suggested using the tech to verify nuclear stockpile integrity—ensuring warheads remain undisturbed without physical inspections.

Terahertz Waves: The Next Frontier

While radio waves dominate current research, other frequencies hold untapped potential. Luana Olivieri, a physicist at Loughborough University, specialises in terahertz radiation—waves between infrared and microwaves. “This spectrum is largely unexplored,” she says. Terahertz waves penetrate materials like fabric or plastic, revealing hidden objects while also identifying chemical compositions. For example, they could distinguish cocaine from flour based on molecular signatures.

Olivieri’s work, funded by a £1.2 million UK Research and Innovation grant, aims to integrate terahertz sensors into disaster-response robots. “Imagine a bot detecting survivors under rubble by their breath signatures,” she explains. Yet challenges persist. Terahertz waves attenuate quickly in air, limiting their range. Commercial systems also remain expensive, though costs are falling. A 2023 market report projected the terahertz tech sector to grow by 28% annually, reaching $1.6 billion by 2030.

Ethical Dilemmas and Dual-Use Risks

As with many emerging technologies, radio-wave vision carries ethical implications. While aiding search-and-rescue missions, it could also enable invasive surveillance. Da Silva acknowledges that Wavsens’ tech has attracted military interest, citing demonstrations for the US Department of Defense and Israeli Ministry of Defense. “It can locate individuals behind walls,” he admits. “That’s valuable for law enforcement—and warfare.”

Reinhard, however, downplays concerns specific to radar. “Drones and facial recognition pose greater privacy threats,” he argues. A 2024 study by the University of Cambridge supports this, noting that 73% of urban residents are already monitored by CCTV—a figure dwarfing niche radar deployments. Still, regulatory frameworks lag. The European Union’s upcoming AI Act, set for 2025 implementation, may classify certain radio-wave applications as high-risk, requiring stringent oversight.

Image Credit - BBC

Looking Ahead: From Lab to Real World

Zhao’s team continues refining their system, targeting deployment in real-world scenarios within five years. Collaborations with emergency services are underway, including a pilot with London Fire Brigade to test smoke-penetrating robots in controlled burns. Early feedback highlights challenges: battery life, signal interference, and public perception. Yet the potential remains undeniable.

Meanwhile, da Silva envisions domestic applications. “Imagine a robot that finds your keys buried under clutter,” he laughs. While mundane, such uses could drive consumer adoption, normalising the tech before high-stakes roles. For now, the field balances ambition with caution, striving to amplify robotic vision without compromising ethical boundaries.

Global Momentum and Technological Synergies

As laboratories worldwide push the boundaries of robotic vision, South Korea’s government made headlines in March 2025 with a bold announcement: a £700 million investment to acquire 10,000 high-performance GPUs by year’s end. This move, aimed at turbocharging domestic AI research, reflects a global race to dominate robotics and autonomous systems. Meanwhile, the University of Pennsylvania’s PanoRadar—a spin-off from Zhao’s research—has begun field trials with autonomous vehicles in Pittsburgh, demonstrating how radio-wave vision could revolutionise transport in adverse weather.

AI and Robotics: A Symbiotic Evolution

The fusion of AI and robotics has accelerated innovation exponentially. NVIDIA’s Eureka algorithm, unveiled in late 2024, exemplifies this synergy. By translating natural language commands into robot code, it allows machines to adapt tasks on the fly. For instance, a rescue robot might receive verbal instructions like “Locate survivors under debris” and autonomously adjust its sensors. Similarly, Google’s PaLM-E model integrates vision and language, enabling robots to interpret complex scenes—a leap toward human-like contextual understanding.

These advancements rely heavily on synthetic data. NVIDIA’s Isaac simulator, capable of generating photorealistic training environments 1,000 times faster than real-time, has become a cornerstone for developers. In 2023, the Open X-Embodiment dataset emerged as a collaborative effort to standardise robotics training, pooling data from 22 institutions worldwide. Though still in its infancy, the project mirrors the ImageNet initiative that revolutionised computer vision a decade prior.

Image Credit - BBC

Military and Space: High-Stakes Applications

Radio-wave vision’s dual-use nature has attracted significant defence interest. In January 2025, Wavsens secured a £2.3 million contract with the US Department of Defense to refine its wall-penetrating radar for urban combat scenarios. Meanwhile, France’s Aarok drone—equipped with terahertz sensors—completed successful reconnaissance tests in the Sahel, identifying camouflaged insurgent positions with 89% accuracy. Such systems, while controversial, underscore the technology’s strategic value.

Space exploration also benefits from these breakthroughs. NASA’s Perseverance rover, operational on Mars since 2021, uses AI-driven vision to navigate treacherous terrain. The European Space Agency’s upcoming ExoMars mission, scheduled for 2028, will deploy a rover with terahertz scanners to detect subsurface water—a critical resource for future colonies. “Autonomy is non-negotiable in space,” says Dr. Sarah Collins, lead roboticist at ESA. “Signals take minutes to reach Mars, so rovers must ‘see’ and decide independently.”

Industrial and Agricultural Transformations

Beyond high-risk environments, robotic vision is reshaping factories and farms. Amazon’s UK warehouses now employ over 1,000 vision-guided robots daily, slashing order processing times by 40%. These machines, using a mix of Lidar and radio waves, navigate aisles and identify items with pinpoint precision. In agriculture, John Deere’s See & Spray system—launched in 2024—combines terahertz imaging with AI to target weeds, reducing herbicide use by 90% across 500,000 hectares of farmland.

The healthcare sector, too, sees transformative potential. The da Vinci Surgical System, enhanced with real-time terahertz imaging, now assists in over 20,000 minimally invasive procedures annually. At Loughborough University, Olivieri’s team collaborates with the NHS to develop handheld scanners that detect skin cancer through terahertz signatures—a non-invasive alternative to biopsies. Early trials show 92% accuracy in distinguishing malignant from benign lesions.

Challenges: Bridging the Simulation-to-Reality Gap

Despite progress, a persistent hurdle remains: transferring lab success to unpredictable real-world settings. Moravec’s paradox—the observation that robots find simple sensory tasks harder than complex calculations—still holds true. For example, a robot might excel at calculating orbital trajectories but struggle to open a door in a smoke-filled room.

To address this, researchers increasingly turn to “digital twin” technology. BMW’s Leipzig plant uses virtual replicas of its assembly lines to train robots in simulated scenarios, from equipment failures to sudden power outages. By 2024, this approach reduced real-world training time by 65%, according to plant manager Klaus Fischer. Similarly, the UK’s National Robotarium in Edinburgh has created a digital twin of offshore wind farms, allowing robots to practice maintenance in hurricane conditions.

Public Perception and Regulatory Hurdles

As robots gain sharper vision, public unease grows. A 2025 YouGov poll found that 58% of Britons oppose police using wall-penetrating radar, citing privacy concerns. In response, the UK’s Centre for Data Ethics and Innovation proposed strict licensing for such technologies, akin to firearms regulation. The EU’s AI Act, meanwhile, classifies autonomous weapons as “unacceptable risk,” banning their development outright—a stance criticised by defence contractors but applauded by human rights groups.

Ethical debates also surround agricultural robots. While John Deere’s systems reduce chemical use, small-scale farmers fear displacement. “These machines cost £250,000—unaffordable for most family farms,” says Raj Patel, author of The Value of Nothing. In India, protests erupted in 2024 when a government subsidy programme prioritised corporate agribusinesses over rural cooperatives.

Image Credit - BBC

Collaborative Robots: The Human-Machine Interface

Not all innovations aim to replace humans. Collaborative robots, or “cobots,” designed to work alongside people, represent a growing market. Tesla’s Optimus, released in 2025, uses terahertz vision to handle delicate tasks in electronics assembly, reducing injury rates by 30% at its Fremont factory. Similarly, Toyota’s Human Support Robot assists elderly patients in Japanese hospitals, navigating cluttered rooms to fetch medications.

Education systems are adapting too. In 2023, MIT launched a robotics curriculum focused on human-AI teamwork, emphasising empathy and ethics. “Students must design systems that augment, not replace, human capabilities,” says Professor Hiroshi Ishii, course director. Early graduates have joined projects like the Ocean Cleanup Initiative, where cobots identify plastic waste in marine environments.

The Road Ahead: Scalability and Sustainability

Scalability remains a key challenge. While Zhao’s radio-wave system excels in controlled labs, mass production poses engineering hurdles. Millimetre-wave antennas, though compatible with 5G, require precision manufacturing. Taiwan’s TSMC, a leader in semiconductor production, estimates that scaling up could lower unit costs by 70% by 2027—a critical threshold for consumer adoption.

Environmental concerns also loom. A 2024 study in Nature Robotics warned that disposable sensor modules could generate 12 million tonnes of e-waste annually by 2035. In response, the EU mandated recyclable components for all robots sold in member states—a policy that boosted startups like Berlin’s Circular Robotics, which offers leasing models with modular, upgradable parts.

Future Horizons: Integrating Multisensory Systems

As research accelerates, the next frontier lies in merging radio-wave vision with complementary technologies. In May 2026, Zhao’s team unveiled a prototype combining millimetre-wave radar, terahertz scanners, and thermal imaging. Tested in California’s wildfire zones, the robot located heat signatures through dense smoke while mapping structural damage via radio waves. “Multisensor fusion is inevitable,” says Zhao. “Each technology fills gaps the others can’t.”

Meanwhile, breakthroughs in neuromorphic computing—chips mimicking the human brain’s efficiency—promise to enhance processing speed. Intel’s Loihi 3, released in 2025, slashes power consumption by 60% while handling complex sensory data. Such advances could shrink today’s suitcase-sized systems into palm-sized modules. Elon Musk’s Neuralink, though primarily focused on brain interfaces, has reportedly explored licensing terahertz patents for medical robotics, hinting at cross-industry synergies.

The push for miniaturisation extends to space. NASA’s Jet Propulsion Laboratory plans to equip its Dragonfly drone, set to explore Saturn’s moon Titan in 2034, with AI-driven radio vision. Titan’s methane atmosphere, opaque to human eyes, poses no barrier to radio waves. “Autonomous navigation there will rely entirely on non-optical sensing,” says Dr. Zibi Turtle, Dragonfly’s principal investigator.

Image Credit - BBC

Ethical Governance and Global Collaboration

With great power comes great scrutiny. In April 2026, the United Nations adopted Resolution 78/192, mandating transparency in dual-use robotics. Member states must now disclose military projects involving wall-penetrating sensors—a response to reports of Russian forces using terahertz drones in Ukraine to track troop movements. While compliance remains patchy, the resolution marks a step toward accountability.

Grassroots efforts also gain traction. The Open Sensing Initiative, launched by MIT and Oxford in 2025, crowdsources ethical guidelines for robotic vision. Over 1,400 researchers have endorsed its pledge to avoid “surveillance-first applications.” Notably, Wavsens and PanoRadar joined the initiative, though critics argue corporate participation risks greenwashing. “Self-regulation isn’t enough,” warns Silvia Ciornei of Access Now, a digital rights NGO. “We need legally binding limits.”

National policies diverge sharply. China’s 2026 Robotics Development Act funnels £12 billion into sensor tech, prioritising industrial and defence applications. Conversely, Iceland’s Reykjavik Declaration bans police from using any non-optical surveillance tools, a world first. “Icelanders value privacy over convenience,” explains Prime Minister Katrín Jakobsdóttir. “We won’t let robots erode that.”

Societal Transformation: Everyday Applications

Beyond grand ambitions, robotic vision quietly infiltrates daily life. In Japan, 7-Eleven’s 2026 pilot equips stockroom robots with terahertz scanners to audit inventory through sealed boxes. The system, reducing checkout errors by 33%, could save the chain £220 million annually if scaled globally. Similarly, British startup SeeThrough Tech markets a millimetre-wave baby monitor that detects breathing patterns through blankets—a hit among anxious parents, despite privacy debates.

Urban infrastructure adapts too. Glasgow’s Smart Motorway project embeds radio-wave sensors in roads to track ice formation, triggering gritters automatically. Early data shows a 40% drop in winter accidents. Meanwhile, Barcelona’s Hospital del Mar uses terahertz-enabled robots to sterilise operating theatres, cutting infection rates by 18%. “They ‘see’ pathogens invisible to UV systems,” says chief surgeon Dr. Elena Martínez.

Education systems also evolve. Kenya’s Digital Literacy Programme, backed by a £50 million World Bank grant, teaches students to code vision-based robots for agricultural tasks. At Nairobi’s Moi University, undergrads built a solar-powered bot that identifies diseased crops via terahertz imaging. “It’s about solving local problems with global tech,” says project lead Wanjiku Mwangi.

Conclusion: Balancing Innovation and Responsibility

The journey from a smoke-filled lab mishap to global technological upheaval encapsulates robotics’ double-edged potential. Zhao’s fire alarm incident, now a staple of engineering folklore, reminds us that progress often sparks unintended consequences. Yet the strides since then—robots peering through walls, aiding surgeons, or braving alien worlds—underscore humanity’s relentless ingenuity.

Challenges persist. Moravec’s paradox still haunts developers; public trust remains fragile; and e-waste looms as a sustainability crisis. However, collaborative frameworks like the Open Sensing Initiative offer blueprints for ethical growth. As Prof. Reinhard notes, “Technology isn’t inherently good or evil—it’s about the hands steering it.”

Looking ahead, the visionaries shaping this field must balance ambition with empathy. Whether helping firefighters navigate infernos or ensuring small farmers aren’t left behind, the measure of success lies not in sophistication alone, but in equitable impact. As Zhao reflects, “Superhuman sight means little if it doesn’t serve humanity.” In that light, the future of robotic vision isn’t just about seeing more—it’s about seeing better, for everyone.
