Kev

Robot Vacuums Suck Up Sensitive Audio in ‘LidarPhone’ Hack


Posted


 

Researchers have unveiled an attack that lets adversaries eavesdrop on people inside their homes through the LiDAR sensors on their robot vacuums.

 

Researchers have uncovered a new attack that lets bad actors snoop on homeowners’ private conversations through their robot vacuums.

 

The vacuums, which use smart sensors to operate autonomously, have gained traction over the past few years. The attack, dubbed “LidarPhone” by the researchers, specifically targets vacuums with LiDAR sensors, as the name suggests. LiDAR, which stands for Light Detection and Ranging, is a remote-sensing method that uses light in the form of a pulsed laser to measure distances to nearby objects. The technology helps the vacuums navigate around obstacles on the floor while they clean.
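To make the ranging principle concrete, here is a minimal sketch (assuming a simple pulsed-laser model; it is not the vacuum’s firmware or the researchers’ code) of how a pulse’s round-trip time becomes a distance:

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Return the one-way distance to an object from a laser pulse's round-trip time."""
    # The pulse travels out to the object and back, so halve the round-trip path.
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# Example: a reflection arriving 20 nanoseconds after emission is ~3 metres away.
print(distance_from_round_trip(20e-9))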

 

The good news is that the attack is complex: Attackers would need to have already compromised the device itself (in their proof of concept, the researchers relied on a previously disclosed exploit for the vacuum cleaners). Additionally, attackers would need to be on the victim’s local network to launch the attack.

 

Quote

“We develop a system to repurpose the LiDAR sensor to sense acoustic signals in the environment, remotely harvest the data from the cloud and process the raw signal to extract information. We call this eavesdropping system LidarPhone,” said the team of researchers from the University of Maryland, College Park and the National University of Singapore, in research released Wednesday.

 

Threatpost has reached out to the researchers for further information on the specific equipment used to launch the attack, as well as its complexity, and will update this article accordingly.

 

The core idea behind the attack is to remotely access the vacuum cleaner’s LiDAR readings and analyze the collected sound signals. This would allow an attacker to listen in on private conversations, researchers said, which could reveal victims’ credit-card data or yield potentially incriminating information that could be used for blackmail.
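As a rough illustration of that idea, the sketch below treats a stream of reflected-light intensity readings as a noisy, audio-like signal and cleans it up with a high-pass filter. The function name, sample-rate handling and filter parameters are assumptions for illustration, not the researchers’ actual processing pipeline:

import numpy as np
from scipy.signal import butter, filtfilt

def lidar_intensity_to_audio(intensity: np.ndarray, sample_rate: float) -> np.ndarray:
    """Turn raw LiDAR intensity samples into a zero-mean, high-pass-filtered trace."""
    # Remove the DC offset and slow drift from the rotating-sensor geometry,
    # keeping only the fast fluctuations that vibrating surfaces induce.
    b, a = butter(4, 100.0 / (sample_rate / 2.0), btype="highpass")
    filtered = filtfilt(b, a, intensity - np.mean(intensity))
    # Normalise so downstream feature extraction sees a consistent amplitude scale.
    return filtered / (np.max(np.abs(filtered)) + 1e-12)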

 

Researchers were able to mount LidarPhone against a Xiaomi Roborock vacuum cleaning robot as a proof of concept (PoC). First, they reverse-engineered the robot’s ARM Cortex-M based firmware. They then leveraged an issue in the Dustcloud software stack, a proxy/endpoint server for the devices, to gain root access to the system. That rooting technique is based on prior research released at DEFCON 26 in 2018.

 

robot-vacuum-300x201.png

The robot vacuum attack. Credit: National University of Singapore

 

 

Quote

“The robot is typically connected to the Xiaomi cloud ecosystem for its standard operations and data exchange,” said researchers. “We override this interface with the Valetudo software stack on the rooted device and control the robot over a local network.”

 

The researchers then collected spoken digits, as well as music played by a computer speaker and a TV sound bar, totaling more than 30,000 utterances over 19 hours of recorded audio. They said that LidarPhone achieves average accuracies of approximately 91 percent for digit classification and 90 percent for music classification.
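For a sense of how such traces could be classified, here is an illustrative-only pipeline that turns each recovered trace into spectrogram features and fits a standard SVM. It is a generic sketch under assumed parameters, not the model behind the accuracy figures reported above:

import numpy as np
from scipy.signal import spectrogram
from sklearn.svm import SVC

def spectrogram_features(trace: np.ndarray, sample_rate: float) -> np.ndarray:
    """Flatten a log-magnitude spectrogram of one equal-length trace into a feature vector."""
    _, _, sxx = spectrogram(trace, fs=sample_rate, nperseg=256)
    return np.log(sxx + 1e-12).ravel()

def train_classifier(traces: list, labels: list, sample_rate: float) -> SVC:
    """Fit an SVM on labelled traces, e.g. spoken digits 0-9 or a set of music clips."""
    features = np.stack([spectrogram_features(t, sample_rate) for t in traces])
    return SVC(kernel="rbf").fit(features, labels)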

 

For instance, researchers were able to distinguish different sounds around the household, from a cloth rug to the trash to the intro-music sequences of popular TV news channels such as FOX, CNN and PBS, and even to predict the gender of the people speaking.

 

At the same time, the attack has limitations. Several conditions in the household can make it less effective: the distance of a noise source from the vacuum cleaner and its volume affect the results, as do background noise levels and lighting conditions.

 

Researchers said that the attack can be mitigated by reducing the signal-to-noise ratio (SNR) of the LiDAR signal: “This may be possible if the robot vacuum-cleaner LiDARs are manufactured with a hardware interlock, such that its lasers cannot be transmitted below a certain rotation rate, with no option to override this feature in software,” they said.
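The interlock they describe amounts to a gate that refuses to fire the laser below a minimum spin rate. A toy sketch of that logic follows; the threshold and names are assumptions, and a real interlock would live in hardware rather than in software like this:

MIN_ROTATION_HZ = 5.0  # assumed minimum turret spin rate before the laser may fire

def laser_may_fire(rotation_hz: float) -> bool:
    """Gate the laser on the measured turret rotation rate."""
    # Below the threshold the LiDAR dwells on one spot long enough to pick up
    # vibration-induced intensity changes, so firing is refused outright.
    return rotation_hz >= MIN_ROTATION_HZ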

Regardless, the attack serves as an important reminder that the proliferation of smart sensing devices in our homes opens up many opportunities for acoustic side-channel attacks on private conversations.

 

Quote

“While we investigate LiDAR on robot vacuum cleaners as an exemplary case, our findings may be extended to many other active light sensors, including smartphone time-of-flight [ToF] sensors,” said researchers. ToF cameras use infrared rays that bounce off objects and return to the hardware. The time that this light takes to leave and then return to the device (the time of flight) allows the camera to sense depth, thus creating a 3D ‘map’ of a space.
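Applied per pixel, that is the same halved round-trip calculation as before; a minimal sketch, assuming a 2D array of measured flight times, is:

import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def depth_map(round_trip_times: np.ndarray) -> np.ndarray:
    """Convert a 2D array of per-pixel round-trip times (seconds) into depths (metres)."""
    # Each pixel's pulse travels to the scene and back, so halve its path length.
    return SPEED_OF_LIGHT * round_trip_times / 2.0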

 

Via threatpost.com
