Rocksolid Light


Subject (Author)
* This is way cool: hacking Siri/Alexa and so on with laser beams (trw)
`* Re: This is way cool: hacking Siri/Alexa and so on with laser beams (AnonUser)
 `* Re: This is way cool: hacking Siri/Alexa and so on with laser beams (Guest)
  `- Re: This is way cool: hacking Siri/Alexa and so on with laser beams (Guest)

This is way cool: hacking Siri/Alexa and so on with laser beams

<qpsmrv$lp0$1@i2pn2.org>


https://rocksolidbbs.com/rocksolid/article-flat.php?id=91&group=rocksolid.shared.hacking#91

Path: i2pn2.org!.POSTED!not-for-mail
From: trw@i2pmail.org (trw)
Newsgroups: rocksolid.shared.hacking
Subject: This is way cool: hacking Siri/Alexa and so on with laser beams
Date: Tue, 05 Nov 2019 15:42:31 -0500
Organization: Dancing elephants
Lines: 144
Message-ID: <qpsmrv$lp0$1@i2pn2.org>
Reply-To: trw <trw@i2pmail.org>
Mime-Version: 1.0
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Tue, 5 Nov 2019 20:42:42 -0000 (UTC)
Injection-Info: i2pn2.org; posting-account="def.i2p";
logging-data="22304"; mail-complaints-to="usenet@i2pn2.org"
User-Agent: FUDforum 3.0.7
X-FUDforum: 6666cd76f96956469e7be39d750cc7d9 <114078>
 by: trw - Tue, 5 Nov 2019 20:42 UTC

Even possible through glass windows and over larger distances, apparently. OK, these systems have always been crappy and full of holes, but this opens up new possibilities. Leave your mobile on the table somewhere, and bam, the hacker sitting in a car 50 m away has ordered some drugs and paid with your online account. And you're none the wiser...
Read the article, it has some nice pictures too:
https://lightcommands.com/
I will just paste the text here:
Laser-Based Audio Injection on Voice-Controllable Systems

Light Commands is a vulnerability of MEMS microphones that allows attackers to remotely inject inaudible and invisible commands into voice assistants, such as Google Assistant, Amazon Alexa, Facebook Portal, and Apple Siri, using light.

In our paper we demonstrate this effect, successfully using light to inject malicious commands into several voice-controlled devices such as smart speakers, tablets, and phones across large distances and through glass windows.

The implications of injecting unauthorized voice commands vary in severity based on the type of commands that can be executed through voice. As an example, in our paper we show how an attacker can use light-injected voice commands to unlock the victim's smart-lock-protected home doors, or even to locate, unlock, and start various vehicles.

Team

Light Commands were discovered by the following team of academic researchers:

Takeshi Sugawara at The University of Electro-Communications (Tokyo)
Benjamin Cyr at University of Michigan
Sara Rampazzi at University of Michigan
Daniel Genkin at University of Michigan
Kevin Fu at University of Michigan

Contact us at LightCommandsTeam@gmail.com
Q&A

How do Light Commands work?

By shining a laser through a window at the microphones inside smart speakers, tablets, or phones, a faraway attacker can remotely send inaudible and potentially invisible commands that are then acted upon by Alexa, Portal, Google Assistant, or Siri.

Making things worse, once an attacker has gained control over a voice assistant, the attacker can use it to break other systems. For example, the attacker can:
Control smart home switches
Open smart garage doors
Make online purchases
Remotely unlock and start certain vehicles
Open smart locks by stealthily brute-forcing the user's PIN.

But why does this happen?

Microphones convert sound into electrical signals. The main discovery behind Light Commands is that, in addition to sound, microphones also react to light aimed directly at them. Thus, by modulating the intensity of a light beam with an audio signal, attackers can trick microphones into producing electrical signals as if they were receiving genuine audio.
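
As a rough sketch of that idea (our illustration, not code from the paper), the audio waveform can be used to amplitude-modulate the laser's normalized drive level, so the light intensity, and with it the microphone's output, follows the sound:

import numpy as np

# Amplitude-modulate a normalized laser drive level with an audio signal.
# 'bias' keeps the diode above its lasing threshold; 'depth' sets how
# strongly the light intensity follows the audio waveform.
def am_drive_signal(audio, bias=0.5, depth=0.4):
    audio = np.clip(audio, -1.0, 1.0)
    return np.clip(bias + depth * audio, 0.0, 1.0)

# Example: a 1 kHz test tone sampled at 48 kHz.
fs = 48_000
t = np.arange(fs) / fs
drive = am_drive_signal(np.sin(2 * np.pi * 1_000 * t))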

Ok, but what do voice assistants have to do with this?

Voice assistants inherently rely on voice to interact with the user. By shining a laser on their microphones, an attacker can effectively hijack the voice assistant and send inaudible commands to Alexa, Siri, Portal, or Google Assistant.

What is the range of Light Commands?

Light can easily travel long distances, so the attacker is limited only by the ability to focus and aim the laser beam. We have demonstrated the attack at 110 meters, the length of the longest hallway available to us at the time of writing.

But how can I aim the laser accurately, and at such distances?

Careful aiming and laser focusing are indeed required for Light Commands to work. To focus the laser across large distances one can use a commercially available telephoto lens. Aiming can be done with a geared tripod head, which greatly increases accuracy, and an attacker can use a telescope or binoculars to see the device's microphone ports at large distances.
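
To get a feel for why focusing optics matter (our back-of-the-envelope numbers, not the paper's), an unfocused beam spreads roughly linearly with its divergence, quickly outgrowing a millimeter-scale microphone port:

# Approximate beam diameter after propagating a given distance, for a
# small assumed initial diameter and a typical ~1 mrad pointer divergence
# (1 mrad spreads the beam by about 1 mm per meter traveled).
def spot_diameter_mm(distance_m, initial_mm=2.0, divergence_mrad=1.0):
    return initial_mm + divergence_mrad * distance_m

for d in (5, 50, 110):
    print(f"{d:>3} m: ~{spot_diameter_mm(d):.0f} mm beam diameter")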

Which devices are susceptible to Light Commands?

In our experiments, we test our attack on the most popular voice recognition systems, namely Amazon Alexa, Apple Siri, Facebook Portal, and Google Assistant. We benchmark multiple devices such as smart speakers, phones, and tablets as well as third-party devices with built-in speech recognition.
Device                     Voice Recognition   Minimum Laser Power   Max Distance    Max Distance
                           System              at 30 cm [mW]         at 60 mW [m]*   at 5 mW [m]**
Google Home                Google Assistant    0.5                   50+             110+
Google Home Mini           Google Assistant    16                    20              -
Google NEST Cam IQ         Google Assistant    9                     50+             -
Echo Plus 1st Generation   Amazon Alexa        2.4                   50+             110+
Echo Plus 2nd Generation   Amazon Alexa        2.9                   50+             50
Echo                       Amazon Alexa        25                    50+             -
Echo Dot 2nd Generation    Amazon Alexa        7                     50+             -
Echo Dot 3rd Generation    Amazon Alexa        9                     50+             -
Echo Show 5                Amazon Alexa        17                    50+             -
Echo Spot                  Amazon Alexa        29                    50+             -
Facebook Portal Mini       Alexa + Portal      18                    5               -
Fire Cube TV               Amazon Alexa        13                    20              -
EcoBee 4                   Amazon Alexa        1.7                   50+             70
iPhone XR                  Siri                21                    10              -
iPad 6th Gen               Siri                27                    20              -
Samsung Galaxy S9          Google Assistant    60                    5               -
Google Pixel 2             Google Assistant    46                    5               -

While we do not claim that our list of tested devices is exhaustive, we do argue that it provides some intuition about the vulnerability of popular voice recognition systems to Light Commands.
Note:
* Limited to a 50 m long corridor.
** Limited to a 110 m long corridor.

Can speaker recognition protect me from Light Commands?

At the time of writing, speaker recognition is off by default for smart speakers and is only enabled by default on devices like phones and tablets. Thus, Light Commands can be used on smart speakers without imitating the owner's voice. Moreover, even when enabled, speaker recognition only verifies that the wake-up words (e.g., "Ok Google" or "Alexa") are said in the owner's voice, not the rest of the command. This means that a single "Ok Google" or "Alexa" spoken by the owner can be reused to compromise all subsequent commands. Finally, as we show in our work, speaker recognition for wake-up words is often weak and can sometimes be bypassed by an attacker using online text-to-speech synthesis tools to imitate the owner's voice.

Do Light Commands require special equipment? How can I build such a setup?

The Light Commands attack can be mounted using a simple laser pointer ($13.99, $16.99, and $17.99 on Amazon), a laser driver (Wavelength Electronics LD5CHA, $339), and a sound amplifier (Neoteck NTK059, $27.99 on Amazon). A telephoto lens (Opteka 650-1300mm, $199.95 on Amazon) can be used to focus the laser for long range attacks.

How vulnerable are other voice controllable systems?

While our paper focuses on Alexa, Siri, Portal, and Google Assistant, the basic vulnerability exploited by Light Commands stems from design issues in MEMS microphones. As such, any system that uses MEMS microphones and acts on this data without additional user confirmation might be vulnerable.

How can I detect if someone used Light Commands against me?

While command injection via light makes no sound, an attentive user can notice the attacker's light beam reflected on the target device. Alternatively, one can attempt to monitor the device's verbal response and light pattern changes, both of which serve as command confirmation.

Have Light Commands been abused in the wild?

So far we have not seen any indication that this attack has been maliciously exploited.

Does the effect depend on laser color or wavelength?

During our experiments, we found the effect to be largely independent of color and wavelength. Although blue and red light lie at opposite ends of the visible spectrum, the levels of the injected audio signal are in the same range, and the shapes of the frequency-response curves are also similar.

Do I have to use a laser? What about other light sources?

In principle, any sufficiently bright light can be used to mount our attack. For example, the Acebeam W30 laser-excited flashlight can be used as an alternative to a laser diode.

Is it possible to mitigate this issue?

An additional layer of authentication can somewhat mitigate the attack. Alternatively, assuming the attacker cannot eavesdrop on the device's response, having the device ask the user a simple randomized question before executing a command can effectively prevent the attacker from achieving successful command execution.
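
A toy sketch of that challenge-response idea (our illustration; ask() and listen() are hypothetical stand-ins for the device's text-to-speech and speech recognition):

import random

SENSITIVE = {"unlock the door", "open the garage", "place an order"}

def confirm(command, ask, listen):
    # Non-sensitive commands pass through unchanged.
    if command not in SENSITIVE:
        return True
    # A randomized spoken challenge is useless to an attacker who
    # cannot hear the device's response.
    a, b = random.randint(2, 9), random.randint(2, 9)
    ask(f"To confirm, what is {a} plus {b}?")
    return listen().strip() == str(a + b)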

Manufacturers can also attempt to use sensor-fusion techniques, such as acquiring audio from multiple microphones. When the attacker uses a single laser, only a single microphone receives a signal while the others receive nothing, so manufacturers can attempt to detect such anomalies and ignore the injected commands.
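
A minimal sketch of such an anomaly check (our illustration, assuming the firmware can see the raw per-microphone streams):

import numpy as np

def looks_like_light_injection(channels, ratio=10.0):
    # channels: array of shape (n_mics, n_samples). Airborne sound
    # reaches every microphone, while a single laser drives essentially
    # one, so flag commands whose energy sits in a single channel.
    rms = np.sqrt(np.mean(np.square(channels), axis=1))
    loudest = np.max(rms)
    runner_up = np.partition(rms, -2)[-2]
    return runner_up < 1e-9 or loudest / runner_up > ratio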

Another approach is to reduce the amount of light reaching the microphone's diaphragm, either with a barrier that physically blocks straight light beams, eliminating the line of sight to the diaphragm, or with a non-transparent cover on top of the microphone hole that attenuates the light hitting the microphone. However, we note that such physical barriers are only effective up to a point, as an attacker can always increase the laser power to compensate for the cover-induced attenuation or to burn through the barrier, creating a new light path.

Is the laser beam used in the attack safe?


(Article truncated here; the complete text is available at https://lightcommands.com/)
Re: This is way cool: hacking Siri/Alexa and so on with laser beams

<10a58b1b9e3e9d97fa329a8240c01a61$1@rslight.i2p>


https://rocksolidbbs.com/rocksolid/article-flat.php?id=92&group=rocksolid.shared.hacking#92

Path: i2pn2.org!rocksolid2!.POSTED.localhost!not-for-mail
From: AnonUser@rslight.i2p (AnonUser)
Newsgroups: rocksolid.shared.hacking
Subject: Re: This is way cool: hacking Siri/Alexa and so on with laser beams
Date: Mon, 11 Nov 2019 13:37:04 -0000 (UTC)
Organization: Rocksolid Light
Message-ID: <10a58b1b9e3e9d97fa329a8240c01a61$1@rslight.i2p>
References: <qpsmrv$lp0$1@i2pn2.org>
Mime-Version: 1.0
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Mon, 11 Nov 2019 13:37:04 -0000 (UTC)
Injection-Info: novabbs.com; posting-account="retrobbs1"; posting-host="localhost:127.0.0.1";
logging-data="26586"; mail-complaints-to="usenet@novabbs.com"
User-Agent: rslight (http://news.novabbs.com)
X-Spam-Checker-Version: SpamAssassin 3.4.2 (2018-09-13) on novabbs.com
X-Rslight-Site: $2y$10$3Uq44sBjIz8LoDB9r7MX2edP9pvQajZZwXi.jRIv1qi8fqjj.xPrG
 by: AnonUser - Mon, 11 Nov 2019 13:37 UTC

Add this little info to the recipe, just for fun:

https://www.engadget.com/2019/11/02/florida-police-obtain-alexa-recordings-in-murder-case/?guccounter=1

Mix the two, and there is no reality anymore, just like what already happened with deepfake software that lets you fabricate all kinds of video and audio material that passes as real...
Some device records something somewhere, and by order of law, it becomes evidence...

See the text of the article:

Florida police obtain Alexa recordings in murder investigation
This time, though, officers are more realistic about what they may find.
Jon Fingas, @jonfingas

Police have once again obtained Alexa voice recordings as part of an
investigation, although they're not necessarily expecting a treasure trove
of information this time around. Law enforcement in Hallandale Beach,
Florida, has used a search warrant to collect Alexa recordings from two
Echo Dots as part of a murder case. Investigators want to know if the
smart speakers inadvertently picked up audio of a July altercation between
Adam Crespo and his wife Silvia Crespo. She died of a spear wound to the
chest; Adam maintained that it was the result of an accident that snapped
the spear, but detectives want to know if Alexa preserved any evidence of
possible foul play.

Unlike a pioneering murder case in Arkansas, Hallandale police weren't
expecting a complete audio capture. The search warrant indicated that cops
obtained "Amazon Echo Recordings w/ Alexa Voice Command," suggesting that
they were only hoping that one or both of the Crespos may have
inadvertently set off the Echo Dots during the incident. Outside of
security exploits, there's no substantial evidence that Echo speakers
record continuously -- they're only supposed to capture audio in a brief
window of time after someone says Alexa's wake word.

Adam Crespo's attorney, Christopher O'Toole, was happy to see the
recordings turned over, as he believed they would support his client's
version of events.

As in the past, Amazon stressed in a statement to CBS that it doesn't hand
over customer information unless mandated by a "legally valid and binding
order," and that it resists "overbroad or otherwise inappropriate"
requests.

It's uncertain whether these kinds of requests will continue to grow in
the future. While smart speakers continue to sell in large numbers, police
are also increasingly aware of their limitations. Moreover, users
themselves increasingly have control over their data. An Alexa user can
delete the day's voice recordings, for instance. Although many people
won't think (or need) to do that, there's now a chance that any relevant
clips will have vanished before police can listen to them.

--
Posted on Rocksolid Light

Re: This is way cool: hacking Siri/Alexa and so on with laser beams

<qqjseg$27c$1@i2pn2.org>


https://rocksolidbbs.com/rocksolid/article-flat.php?id=93&group=rocksolid.shared.hacking#93

Path: i2pn2.org!.POSTED!not-for-mail
From: guest@retrobbs.rocksolidbbs.com (Guest)
Newsgroups: rocksolid.shared.hacking
Subject: Re: This is way cool: hacking Siri/Alexa and so on with laser beams
Date: Thu, 14 Nov 2019 10:38:57 -0500
Organization: Dancing elephants
Lines: 0
Message-ID: <qqjseg$27c$1@i2pn2.org>
References: <10a58b1b9e3e9d97fa329a8240c01a61$1@rslight.i2p>
Reply-To: Guest <guest@retrobbs.rocksolidbbs.com>
Mime-Version: 1.0
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Thu, 14 Nov 2019 15:38:57 -0000 (UTC)
Injection-Info: i2pn2.org; posting-account="def.i2p";
logging-data="2284"; mail-complaints-to="usenet@i2pn2.org"
User-Agent: FUDforum 3.0.7
X-FUDforum: 6666cd76f96956469e7be39d750cc7d9 <132608>
 by: Guest - Thu, 14 Nov 2019 15:38 UTC

Listening to a conversation from outside by pointing a laser at a window and measuring the vibration is an old technique, in use since the 1970s. It works very well out to about 500-800 meters, depending on the weather. What is new here is actually injecting a conversation by vibrating a surface. In those days "people" bought white noise machines and plastic films (similar to laser radar detector scramblers). You could also play loud music, since they could filter out TV channels.
Posted on def3

Re: This is way cool: hacking Siri/Alexa and so on with laser beams

<qqjsnf$2mq$1@i2pn2.org>


https://rocksolidbbs.com/rocksolid/article-flat.php?id=94&group=rocksolid.shared.hacking#94

Path: i2pn2.org!.POSTED!not-for-mail
From: guest@retrobbs.rocksolidbbs.com (Guest)
Newsgroups: rocksolid.shared.hacking
Subject: Re: This is way cool: hacking Siri/Alexa and so on with laser beams
Date: Thu, 14 Nov 2019 10:43:44 -0500
Organization: Dancing elephants
Lines: 3
Message-ID: <qqjsnf$2mq$1@i2pn2.org>
References: <qqjseg$27c$1@i2pn2.org>
Reply-To: Guest <guest@retrobbs.rocksolidbbs.com>
Mime-Version: 1.0
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Thu, 14 Nov 2019 15:43:44 -0000 (UTC)
Injection-Info: i2pn2.org; posting-account="def.i2p";
logging-data="2778"; mail-complaints-to="usenet@i2pn2.org"
User-Agent: FUDforum 3.0.7
X-FUDforum: 6666cd76f96956469e7be39d750cc7d9 <132616>
 by: Guest - Thu, 14 Nov 2019 15:43 UTC

https://www.radarbusters.com/The-Laser-Shield-p/lasershield.htm
https://en.wikipedia.org/wiki/White_noise_machine

https://www.instructables.com/id/LASER-MICROPHONE/
Posted on def3
