Artificial Imposters—Cybercriminals Turn to AI Voice Cloning for a New Breed of Scam

Three seconds of audio is all it takes.

Cybercriminals have taken up newly forged artificial intelligence (AI) voice cloning tools and created a new breed of scam.

With a small sample of audio, they can clone the voice of nearly anyone and send bogus messages by voicemail or voice messaging texts.
 


The aim, most often, is to trick people out of hundreds, if not thousands, of dollars.



The rise of AI voice cloning attacks


Our recent global study found that out of 7,000 people surveyed, one in four said that they had experienced an AI voice cloning scam or knew someone who had. Further, our research team at McAfee Labs discovered just how easily cybercriminals can pull off these scams.


With a small sample of a person's voice and a script cooked up by a cybercriminal, these voice clone messages sound convincing. In fact, 70% of people in our worldwide survey said they weren't confident they could tell the difference between a cloned voice and the real thing.


Cybercriminals create the kind of messages you might expect, ones full of urgency and distress. They will use the cloning tool to impersonate a victim's friend or family member with a voice message that says they've been in a car accident, or maybe that they've been robbed or injured. Either way, the bogus message often says they need money right away.


In all, the approach has proven quite effective so far. One in ten people surveyed in our study said they received a message from an AI voice clone, and 77% of those victims said they lost money as a result.



The cost of AI voice cloning attacks


Of the people who reported losing money, 36% said they lost between $500 and $3,000, while 7% got taken for sums anywhere between $5,000 and $15,000.


Of course, a clone needs an original. Cybercriminals have no difficulty sourcing original voice files to create their clones. Our study found that 53% of adults said they share their voice data online or in recorded notes at least once a week, and 49% do so up to ten times a week. All this activity generates voice recordings that could be subject to hacking, theft, or sharing, whether accidental or malicious.


 


Consider that people post videos of themselves on YouTube, share reels on social media, and perhaps even participate in podcasts. Even by accessing relatively public sources, cybercriminals can stockpile their arsenals with powerful source material.


Nearly half (45%) of our survey respondents said they would reply to a voicemail or voice message purporting to be from a friend or loved one in need of money, particularly if they thought the request had come from their partner or spouse (40%), mother (24%), or child (20%).


Further, they reported they'd likely respond to one of these messages if the message sender said:


  • They've been in a car accident (48%).
  • They've been robbed (47%).
  • They've lost their phone or wallet (43%).
  • They needed help while traveling abroad (41%).


These messages are the latest examples of targeted "spear phishing" attacks, which target specific people with specific information that seems just credible enough to act on. Cybercriminals will often source this information from public social media profiles and other places online where people post about themselves, their families, their travels, and so on, and then attempt to cash in.


 


Payment methods vary, yet cybercriminals often ask for forms that are difficult to trace or recover, such as gift cards, wire transfers, reloadable debit cards, and even cryptocurrency. As always, requests for these kinds of payments raise a major red flag. It could very well be a scam.



AI voice cloning tools, freely available to cybercriminals


In conjunction with this survey, researchers at McAfee Labs spent two weeks investigating the accessibility, ease of use, and efficacy of AI voice cloning tools. They readily found more than a dozen freely available on the internet.


These tools required only a basic level of experience and expertise to use. In one instance, just three seconds of audio was enough to produce a clone with an 85% voice match to the original (based on the benchmarking and assessment of McAfee security researchers). Further effort can increase the accuracy yet more. By training the data models, McAfee researchers achieved a 95% voice match based on just a small number of audio files.


McAfee's researchers also discovered that they could easily replicate accents from around the world, whether they were from the US, UK, India, or Australia. However, more distinctive voices were more challenging to copy, such as people who speak with an unusual pace, rhythm, or style. (Think of actor Christopher Walken.) Such voices require more effort to clone accurately, and people with them are less likely to get cloned, at least given where the AI technology stands currently, comedic impersonations aside.


 


The research team stated that this is yet one more way that AI has lowered the barrier to entry for cybercriminals. Whether that's using it to create malware, write deceptive messages in romance scams, or now to run spear phishing attacks with voice cloning technology, it has never been easier to commit sophisticated-looking, and sounding, cybercrime.


Likewise, the study also found that the rise of deepfakes and other disinformation created with AI tools has made people more skeptical of what they see online. Now, 32% of adults said their trust in social media is less than it's ever been before.



Protect yourself from AI voice clone attacks



  1. Set a verbal codeword with kids, family members, or trusted close friends. Make sure it's one only you and those closest to you know. (Banks and alarm companies often set up accounts with a codeword in the same way to ensure that you're really you when you speak with them.) Make sure everyone knows and uses it in messages when they ask for help.


  2. Always question the source. In addition to voice cloning tools, cybercriminals have other tools that can spoof phone numbers so that they look legitimate. Even if it's a voicemail or text from a number you recognize, stop, pause, and think. Does that really sound like the person you think it is? Hang up and call the person directly, or try to verify the information before responding.


  3. Think before you click and share. Who is in your social media network? How well do you really know and trust them? The wider your connections, the more risk you may be opening yourself up to when sharing content about yourself. Be thoughtful about the friends and connections you have online, and set your profiles to "friends and families" only so your content isn't available to the greater public.


  4. Protect your identity. Identity monitoring services can notify you if your personal information makes its way to the dark web and provide guidance for protective measures. This can help shut down other ways that a scammer can attempt to pose as you.


  5. Clear your name from data broker sites. How'd that scammer get your phone number anyway? It's possible they pulled that information off a data broker site. Data brokers buy, collect, and sell detailed personal information, which they compile from several public and private sources, such as local, state, and federal records, in addition to third parties. Our Personal Data Cleanup service scans some of the riskiest data broker sites and shows you which ones are selling your personal info.



Get the full story


A lot can come from a three-second audio clip.


With the advent of AI-driven voice cloning tools, cybercriminals have created a new form of scam. With arguably stunning accuracy, these tools can let cybercriminals clone the voice of nearly anyone. All they need is a short audio clip to kick off the cloning process.


Yet like all scams, you have ways you can protect yourself. A sharp sense of what seems right and wrong, along with a few straightforward security steps, can help keep you and your loved ones from falling for these AI voice clone scams.


For a closer look at the survey data, along with a nation-by-nation breakdown, download a copy of our report here.



Survey methodology


The survey was conducted between January 27th and February 1st, 2023 by market research company MSI-ACI, with people aged 18 years and older invited to complete an online questionnaire. In total, 7,000 people completed the survey from nine countries: the United States, United Kingdom, France, Germany, Australia, India, Japan, Brazil, and Mexico.
