Smart speakers hacked without any network connection - using a laser

All the major smart assistants - Google Assistant, Amazon Alexa and Siri - proved vulnerable to the new attack. An anonymous group of hackers discovered that a laser aimed at a smart speaker's microphone can make the device respond as if it had heard a voice command, and thus perform a range of functions, from reading out the weather forecast to accessing email.

The hackers honestly admit that they do not fully understand the physics of the process. Apparently, the laser pulse exerts pressure on the membrane of the microelectromechanical sensor, a key component of the "smart microphone". The sensor interprets this as air vibration, that is, as an acoustic signal. All that remains is to find a pulse sequence that mimics a command such as "Hey Siri, what time is it?"
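The idea described above - encoding a voice command in the brightness of a beam - amounts to simple amplitude modulation. A minimal sketch, assuming audio samples normalized to [-1, 1] and a laser driver that takes an intensity in [0, 1] (the function name and parameters here are illustrative, not from any real attack toolkit):

```python
import math

def amplitude_modulate(audio, bias=0.5, depth=0.4):
    """Map audio samples in [-1, 1] to laser intensity in [0, 1].

    The command's waveform modulates the beam's brightness; the MEMS
    diaphragm reportedly responds to those intensity changes as if
    they were changes in sound pressure.
    """
    return [bias + depth * s for s in audio]

# A 1 kHz test tone sampled at 16 kHz stands in for a recorded command.
rate, freq = 16000, 1000.0
tone = [math.sin(2 * math.pi * freq * n / rate) for n in range(rate // 100)]
intensity = amplitude_modulate(tone)
print(min(intensity) >= 0.0 and max(intensity) <= 1.0)  # True
```

The bias keeps the laser always on; only the depth of the brightness swing carries the signal, which is why a cheap modulated diode is enough.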

The test setup cost the hackers a handful of dollars, and with it they managed to hack several speaker models at a distance of 110 m. True, this only works with a clear line of sight, and focusing the beam is far from trivial. Most importantly, the attack succeeds only if the device has no protection and simply reacts to any voice.

Experts say it is hard to call this a full-fledged vulnerability; rather, it is a quirk of the technology that manufacturers will have to take into account. If the speaker has even a basic speaker-recognition system, or accepts a command only when it arrives through several microphones at once, this attack no longer works. But that only covers smart speakers - what about the rest of the IoT, which may well be vulnerable to attacks their developers never considered?
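The multi-microphone defense mentioned above can be sketched as a plausibility check: a focused laser excites a single sensor, while a real voice reaches every microphone at comparable levels. This is an illustrative sketch, not an actual vendor implementation; the threshold and spread values are assumptions.

```python
def accept_command(mic_levels, threshold=0.2, max_spread=0.5):
    """Accept a wake word only if all microphones heard it.

    A focused laser drives one MEMS sensor, while a genuine voice
    arrives at all microphones at similar loudness, so a large gap
    between the loudest and quietest channel is suspicious.
    """
    if min(mic_levels) < threshold:
        return False  # at least one mic heard nothing
    return (max(mic_levels) - min(mic_levels)) <= max_spread

print(accept_command([0.8, 0.75, 0.7]))   # voice: all mics agree -> True
print(accept_command([0.9, 0.02, 0.01]))  # laser: one mic only -> False
```

Requiring agreement across the array is cheap for the vendor, since multi-microphone arrays already exist in these devices for beamforming.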