Warning: Alexa, Google Home or Siri can be hacked with just a laser beam


Network security | Since virtual assistants launched a few years ago, security experts have worried that systems like Siri and Alexa could easily become privacy-threatening tools in the hands of hackers.


Scientists have found a way to hack Alexa, Google Home or Siri with just one laser

But few people have paid attention to the threat posed by a beam of light. On Monday, researchers in Japan and at the University of Michigan announced that they had found a way to hijack devices like Google Home, Amazon Alexa or Apple Siri from hundreds of meters away by shining a laser or flashlight onto the products' microphones.


 
In one example, the research team said they succeeded in opening a garage door by aiming a laser at the connected voice assistant. They also climbed a bell tower more than 42 meters high at the University of Michigan and took control of a Google Home device on the fourth floor of an office building 70 meters away. The team added that it is possible to control virtual assistants from more than 106 meters away by using a telephoto lens to focus the beam.

Opening a garage door is just one simple trick. Using this method, the researchers could control any smart electronic device connected to the virtual assistant.

They could easily switch lights on and off, make online purchases, or open doors fitted with smart locks. They could even remotely unlock and start cars linked to the virtual assistant.

Kevin Fu, a professor of electrical engineering and computer science at the University of Michigan, said: "This opens up a whole new class of vulnerability. It is currently very difficult to get an accurate count of the devices affected by this flaw, because its mechanism is so fundamental."

The findings were published on Monday in a scientific study by Takeshi Sugawara, a researcher in computer science and electrical engineering at the University of Electro-Communications in Japan, together with Fu, Daniel Genkin, Sara Rampazzi and Benjamin Cyr of the University of Michigan.

Daniel Genkin is also one of the researchers who discovered the two major security vulnerabilities known as Meltdown and Spectre, which affect the microprocessors of most computers in the world. That discovery caused a 5% decline in the stock of chipmaker Intel.

The team spent seven months studying the light-related flaw and discovered that the microphones of the aforementioned devices respond to light in the same way they respond to sound. Inside each microphone is a small plate, the diaphragm, that moves when sound waves hit it.

That motion can be replicated by focusing a laser beam or flashlight on the diaphragm; the light signal is then converted into an electrical signal, and the rest of the system responds to it exactly as it would respond to sound.
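The core idea, as the study describes it, is amplitude modulation: the intensity of the light is varied in step with an audio command waveform, so the microphone's diaphragm "hears" the light. The minimal sketch below illustrates that principle only (the function name, bias and depth parameters are illustrative assumptions, not from the study), by mapping an audio signal in [-1, 1] onto a light intensity in [0, 1]:

```python
import numpy as np

def modulate_light(audio, bias=0.5, depth=0.4):
    """Amplitude-modulate a light source's intensity with an audio signal.

    audio: samples normalized to [-1, 1] (the spoken command)
    bias:  constant (DC) drive level of the light source, in [0, 1]
    depth: modulation depth; bias +/- depth must stay within [0, 1]
    Returns intensity samples in [0, 1] that a driver circuit could emit.
    """
    audio = np.clip(np.asarray(audio, dtype=float), -1.0, 1.0)
    intensity = bias + depth * audio
    # Light intensity cannot be negative, so the signal rides on a DC bias.
    return np.clip(intensity, 0.0, 1.0)

# A 1 kHz test tone standing in for a recorded voice command.
fs = 16_000                      # sample rate in Hz
t = np.arange(fs) / fs           # one second of samples
tone = np.sin(2 * np.pi * 1_000 * t)
drive = modulate_light(tone)     # values oscillate between 0.1 and 0.9
```

The DC bias is the key design point: unlike a loudspeaker signal, light intensity cannot go negative, so the audio waveform must be shifted to ride on a constant offset before it can drive a laser.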

The team said they informed Tesla, Ford, Amazon, Apple and Google about the flaw. All of the companies said they are studying the findings.

According to the researchers who discovered the problem, most microphones would have to be redesigned to fix it; simple shielding will not work. Fu also said that the dust cover on some virtual assistants does not block the attack.

Recounting the history of dreaded vulnerabilities in internet-connected devices would take quite a long time. But experts often note that while these weaknesses are unexpected, they usually represent worst-case scenarios with very low probability. And so far there is no sign that this light-based security hole has been exploited by hackers.

This is not the first time unexpected holes have been discovered in virtual assistants. Previously, scientists in China and the United States demonstrated that these devices can be commanded using signals the human ear cannot hear.

However, in an era of internet-connected devices, the finding is a reminder for consumers to stay vigilant about security.

"This is just the tip of the iceberg. There is still a big gap between what computers are supposed to do and what they actually do. And with everything connected to the internet, they can do things without us knowing. This is just one example."

An Amazon spokesperson said the company had not seen any evidence of light-based attacks outside this study, and that customers using its virtual assistant can take some simple precautions. For example, they can set up a voice PIN for shopping requests or any other sensitive commands to smart home devices. Users can also use the microphone mute button to cut power to the voice receiver.

Another way to protect yourself is to keep the virtual assistant out of sight from outside, and "don't connect it to anything you don't want someone else to have access to," Genkin added.