Almost all media outlets covered last week's story about hacking smart speakers from a distance with a laser (news, project site, scientific paper, post on Habr). The topic is genuinely attention-grabbing: it is a hack straight out of a James Bond film. Researchers from universities in the USA and Japan demonstrated that voice commands can be transmitted to a smart speaker from up to 110 meters away (possibly farther, but this was not tested) using a directed laser beam. The study tested the Google Home and Amazon Echo smart speakers, but in practice any device that recognizes voice commands and is equipped with a highly sensitive MEMS microphone is vulnerable.
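The attack works because the diaphragm of a MEMS microphone responds to fluctuations in light intensity much as it responds to sound pressure, so amplitude-modulating a laser's brightness with an audio recording makes the microphone "hear" that recording. Below is a minimal sketch of the modulation step in Python; the function name, DC bias, and modulation depth are illustrative assumptions for this post, not parameters taken from the researchers' paper.

```python
import numpy as np

def am_modulate_laser(audio: np.ndarray, dc_bias: float = 0.5,
                      mod_depth: float = 0.4) -> np.ndarray:
    """Turn an audio waveform into a laser intensity signal.

    Simple amplitude modulation: intensity(t) = bias + depth * audio(t).
    In real attack hardware this signal would drive the laser diode's
    current; the bias and depth values here are purely illustrative.
    """
    # Normalize the audio to [-1, 1] so modulation depth is meaningful.
    audio = audio / np.max(np.abs(audio))
    intensity = dc_bias + mod_depth * audio
    # Light intensity cannot be negative; clip to stay physical.
    return np.clip(intensity, 0.0, 1.0)

# Example: modulate a 1 kHz test tone sampled at 44.1 kHz.
if __name__ == "__main__":
    sample_rate = 44_100
    t = np.linspace(0, 1, sample_rate, endpoint=False)
    tone = np.sin(2 * np.pi * 1000 * t)
    signal = am_modulate_laser(tone)
    print(signal.min(), signal.max())  # stays within [0, 1]
```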
All of this is, of course, very cool, but keep one important point in mind: in practice, no one is going to shine a laser at your microphone. Just as no one is going to reconstruct the image on your monitor by analyzing the emissions of its power supply. What the study demonstrated, once again, is that voice assistants have a fundamental weakness in authorizing their owner, and that the attack can be carried out from a considerable distance. The further we go, the more we will depend on how accurately computers recognize the world around them. And this problem is not limited to human speech: for safety, a machine's ability to correctly identify a speed-limit sign or the markings on the asphalt matters far more.