Hear no evil: Ultrasound attacks on voice assistants

How your voice assistant could do the bidding of a hacker – without you ever hearing a thing

Regular WeLiveSecurity readers won’t be stunned to read that cyberattacks and their methods keep evolving as bad actors continue to enhance their repertoire. It’s also become a common refrain that as security vulnerabilities are found and patched (alas, sometimes after being exploited), malicious actors find new chinks in the software armor.

Sometimes, however, it is not “just” a(nother) security loophole that makes the headlines, but a new form of attack. This was also the case recently with a rather unconventional attack method dubbed NUIT. The good news? NUIT was unearthed by academics and there are no reports of anybody exploiting it for pranks or outright cybercrime. That said, it doesn’t hurt to be aware of another way your privacy and security could be at risk – as well as of the fact that NUIT can actually come in two forms.

How NUIT saw the light of day

NUIT, or Near-Ultrasound Inaudible Trojan, is a class of attack that could be deployed to launch silent and remote takeovers of devices that use or are powered by voice assistants such as Siri, Google Assistant, Cortana, and Amazon Alexa. As a result, any device accepting voice commands – think your phone, tablet or smart speaker – could be fair game. Ultimately, the attack could have some dire consequences, ranging from a breach of privacy and loss of trust to even the compromise of a company’s infrastructure, which could, in turn, result in hefty monetary losses.


Described by a team of researchers at the University of Texas at San Antonio (UTSA) and the University of Colorado Colorado Springs (UCCS), NUIT is possible because microphones in digital assistants can respond to near-ultrasound waves played from a speaker. While inaudible to you, this sound command would prompt the always-on voice assistant to perform an action – let’s say, turn off an alarm or open the front door secured by a smart lock.
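
To put “near-ultrasound” in concrete terms: these commands sit roughly in the 16–20 kHz band, which most adults cannot hear but which ordinary device microphones still register. The minimal Python sketch below – assuming NumPy and SciPy are installed, and using a hypothetical recording named suspect.wav – estimates how much of an audio file’s energy falls in that band, a quick way to check whether a clip carries content you would never actually hear.

    import numpy as np
    from scipy.io import wavfile

    # Load the recording to inspect ("suspect.wav" is a placeholder, not from the article).
    rate, samples = wavfile.read("suspect.wav")

    # Keep a single channel and work in floating point.
    if samples.ndim > 1:
        samples = samples[:, 0]
    samples = samples.astype(np.float64)

    # Magnitude spectrum of the whole clip.
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)

    # Compare the energy between 16 and 20 kHz with the total energy.
    band = (freqs >= 16_000) & (freqs <= 20_000)
    ratio = np.sum(spectrum[band] ** 2) / np.sum(spectrum ** 2)

    print(f"Share of signal energy between 16 and 20 kHz: {ratio:.1%}")

Note that the recording needs a sample rate of at least about 40 kHz (the common 44.1 kHz is fine) for that band to be captured at all.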

To be sure, NUIT isn’t the first acoustic attack to have made waves over the years. Previously, attacks with similarly intriguing names have been described – think SurfingAttack, DolphinAttack, LipRead and SlickLogin – including some other inaudible attacks that, too, targeted smart-home assistants.

Night, night

As mentioned, NUIT comes in two forms:


  • NUIT 1

    This is when the device is both the source and the target of an attack. In such cases, all it takes is a user playing an audio file on their phone that causes the device to perform an action, like sending a text message with its location.

  • NUIT 2

    This attack is launched from a device with a speaker against another device with a microphone, like from your PC to a smart speaker.

As an example, let’s say you are watching a webinar on Teams or Zoom. Another attendee could unmute themselves and play a sound that would then be picked up by your phone, prompting it to visit a dangerous website and compromising the device with malware.

Alternatively, you could be playing YouTube videos on your phone through its loudspeaker, and the phone would then perform an unwarranted action. From the user’s perspective, this attack does not require any specific interaction, which makes it all the worse.

Should NUIT keep you up at night?

What does it take to perform such an attack? Not much: for NUIT to work, the speaker from which it is launched needs to be set above a certain volume, and the malicious command needs to last less than a second (0.77s).

Moreover, you obviously need to have your voice assistant enabled. According to the researchers, out of the 17 devices tested, only Apple Siri-enabled devices were harder to crack. This was because a hacker would need to steal your unique voice fingerprint first to get the phone to accept commands.

This is why everyone should set up their assistant to work only with their own voice. Alternatively, consider switching your voice assistant off when it’s not needed; indeed, keep your cyber-wits about you when using any IoT devices, as all sorts of smart gizmos can be easy prey for cybercriminals.

The doctor’s orders

The researchers, who will also present their NUIT research at the 32nd USENIX Security Symposium, recommend that users monitor their devices for random microphone activations. Both Android and iOS devices display an indicator when the microphone is active – usually a green dot on Android and an orange dot on iOS, in the upper part of the screen. While you’re at it, consider reviewing your app permissions for microphone access, as not every app needs to hear your surroundings.
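
For Android users comfortable with a command line, one rough way to automate that review is sketched below in Python. It assumes the Android Debug Bridge (adb) is installed and a device is connected with USB debugging enabled, and that the dumpsys output (whose exact format can vary between Android versions) contains the permission grant line shown.

    import subprocess

    def adb(*args: str) -> str:
        """Run an adb command and return its standard output as text."""
        return subprocess.run(["adb", *args], capture_output=True, text=True).stdout

    # "pm list packages -3" lists third-party (user-installed) packages.
    packages = [
        line.removeprefix("package:").strip()
        for line in adb("shell", "pm", "list", "packages", "-3").splitlines()
        if line.startswith("package:")
    ]

    # Flag any of them that have been granted the microphone permission.
    for pkg in packages:
        details = adb("shell", "dumpsys", "package", pkg)
        if "android.permission.RECORD_AUDIO: granted=true" in details:
            print(f"{pkg} can record audio")

The same information is available without any tooling in each platform’s privacy settings: the microphone section of the permission manager on Android, and the Microphone list under Privacy on iOS.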

Likewise, listen to audio using earphones or headsets, as that way you are less likely to share sound with your surroundings, protecting against an attack of this nature.

This is also a good time to make sure you have the cybersecurity basics covered – keep all your devices and software updated, enable two-factor authentication on all of your online accounts, and use reputable security software across all your devices.
