02x05 - My Identity

Episode transcripts for the TV show "Dark Net". Aired: January 2016 to May 2017.


"Dark Net" explores murky corners of the Internet using examples of unsettling digital phenomena to ponder larger questions, like whether and how the digital age might be changing us as a species.


[Talley] I've been told that
I look like an average person.

But some of the characteristics

that somewhat stick out are,

I would say, I have a big nose.

It's not very symmetrical.

My eyes are spaced apart.

I do have distinct moles
on the right side of my face.

I think a lot of people
probably look like that.

[narrator] The face...

it's the way we recognize each other.

But to machines, we are code.

We are being transformed
into a face print,


a unique set of points that converts

our very humanity into data,

traceable, trackable,

forever,

online or on the streets.

Will you ever be a face
in the crowd again?



[Shipp] On September th,
we opened the bank at : .

We had a line, probably five
or six people in line.


The next client that came to the window

handed me a piece of paper.

It was enclosed in a plastic envelope.

And I'm going, "Oh, my gosh," you know.

And it's like my heart just sunk.

[narrator] On September ,
, Bonita Shipp


came face-to-face with a bank robber.

[Shipp] As I was getting
the money ready,


I studied his face.

I tried to recognize

any markings at all.

He had sunglasses on.

And he had a cap on.

So I couldn't see anything
from his eyes up.

But I looked at his nose.
He had thin lips.

He had no tattoos, no scars, nothing.

So I took all of the money
out of my drawer,

put it all in an envelope.

And I handed it to him.
And then he left.

He got away with just
a little under $ , .

[Atick] One of the most
prominent things about humans


is that they use their vision
to do many things.


We recognize friend from foe.

We recognize familiar faces.

We recognize objects.

We use our vision in a way that's very,

very essential to our survival.

[narrator] Could computers
one day recognize faces?


That's the question

that haunted face recognition
pioneer Joseph Atick.


[Atick] I grew up in Jerusalem,

where it was necessary

to always travel with an I.D.,

to present it two or three times a day

to go from one town to the other.

I believed that we needed something

that was more effective
in securing the world.


So I applied mathematical techniques

to the study of the human brain.

There are four elements

in facial-recognition technology.

One is the algorithm,

which is inspired by the human brain.

Second is the need
to have a camera in order

to allow vision, like the eye.

You needed to also have the database.

You needed to have the memory

of people who are known.

Humans, for example,

we remember about , people.

That's our database.

And in order to run the algorithm,

we need processing power.

We do it effortlessly
because we don't think of it.

But, in fact, when I look
at your face, many,

many calculations happen in my brain.

It is so innate that
we don't even think about it.

It took us three or four years
of mathematical work

to develop the first
facial-recognition algorithm.
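
[editor's note] Atick's four elements (algorithm, camera, database, processing power) map onto the structure of any face-recognition pipeline. The sketch below is only an illustration of how the pieces fit together, not FaceIt or any real system; every name in it (capture_frame, embed_face, Gallery) is an assumption made up for this example.

# Illustrative sketch of the four elements; not FaceIt or any real product.
import numpy as np

def capture_frame():
    """Element 2, the camera: stand-in that returns a random 'image'."""
    return np.random.rand(112, 112)

def embed_face(image):
    """Element 1, the algorithm: reduce a face image to a feature vector.
    A real system would use a trained model; here we just flatten and
    normalize so the example runs."""
    v = image.flatten()
    return v / np.linalg.norm(v)

class Gallery:
    """Element 3, the database: the 'memory of people who are known'."""
    def __init__(self):
        self.names, self.vectors = [], []

    def enroll(self, name, image):
        self.names.append(name)
        self.vectors.append(embed_face(image))

    def identify(self, image):
        """Element 4, processing power: compare against every known face."""
        probe = embed_face(image)
        scores = [float(np.dot(probe, v)) for v in self.vectors]
        best = int(np.argmax(scores))
        return self.names[best], scores[best]

gallery = Gallery()
gallery.enroll("known person", capture_frame())
print(gallery.identify(capture_frame()))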

[narrator] This is FaceIt,

the first commercial product

using face-recognition
technology developed by Atick


and his team in .

It opened a new window
to our digital world.


The genie's now out of the bottle.

And any attempt
to put it back, technologically,

is doomed

because I started getting a lot of calls

from intelligence agencies
around the world

who thought this would be
quite useful for their mission.

Facial recognition is simply the ability

for law enforcement

to electronically search

against millions of photo images.

[narrator] Next Generation
Identification,

the FBI's biometric network,

overseen by Assistant
Director Stephen Morris.


[Morris] The system is looking
for key features of a face.

It'll measure the distance
between the eyes...

...the distance between the ears,

the ears in relation to the mouth.

And then it's looking for
other images in that repository

that have those same measurements.
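
[editor's note] Morris is describing matching on geometric measurements between facial landmarks. The FBI's actual feature set and scoring are not public, so what follows is only a hedged toy sketch of the idea: turn a few inter-landmark distances into a vector and rank a repository by how close its entries are to the probe.

# Toy illustration of distance-based matching; the real NGI features and
# scoring are not public, so everything here is assumed.
import numpy as np

def measurements(landmarks):
    """Build a small feature vector from inter-landmark distances,
    e.g. eye-to-eye, ear-to-ear, ear-to-mouth (as described above)."""
    d = lambda a, b: np.linalg.norm(np.array(landmarks[a]) - np.array(landmarks[b]))
    eye = d("left_eye", "right_eye")
    # Normalize by eye distance so the measurements are scale-invariant.
    return np.array([
        d("left_ear", "right_ear") / eye,
        d("left_ear", "mouth") / eye,
        d("right_ear", "mouth") / eye,
    ])

def rank_repository(probe_landmarks, repository, top_k=3):
    """Return the candidates whose measurements are closest to the probe."""
    probe = measurements(probe_landmarks)
    scored = [(float(np.linalg.norm(probe - measurements(lm))), name)
              for name, lm in repository.items()]
    return sorted(scored)[:top_k]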

Our database consists of around

million mug-shot photos.

There are also repositories
of photographs

that have been lawfully collected,

such as visa photos, travel documents

and driver's license photo files.

So we're talking about more
than just million photos

that can be searched using
facial-recognition technology.

The potential is unlimited.

[narrator] About half of American adults

are in a law-enforcement database.

Most don't even know it.

To store the world's largest
biometrics database


is a , -square-foot data center,

about the size of
a lower Manhattan block.


[Dispatcher] Just be advised,

there was an RP accident
at precinct.


I've been working in the -
for about / ,

going on years.

We have the pictures of people

who were involved in a shooting.

So when you come into contact

with that person, they know who we are.

But now we have a step up
because we know who you are.

[narrator] About , officers

patrol the streets of New York City.

But even the nation's
largest police force


can use an extra pair of eyes...

...or , of them.

These cameras aren't just watching.

With the help of live analytics,

they can detect what
the human eye cannot,


a shot fired,

a suspicious package

or a suspect running from a crime.

And they transmit this data

directly to the
real-time crime center.


[man] The main purpose of
this unit is to help identify


anybody who's unknown
in a criminal investigation.


[narrator] The software
enhances surveillance footage.


We're able to convert
a 2-D image to a 3-D image.

And it's going to convert that image

to a more proper pose
that we're going to need,

similar to a driver's license
or a mug shot.
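
[editor's note] The pose correction described here is far more involved than a short example can show, but the underlying idea, aligning detected landmarks toward a canonical mug-shot-style view, can be hinted at with a 2-D similarity transform. This is a deliberately simplified stand-in for the 2-D-to-3-D conversion the unit describes; the canonical eye positions and helper names are assumptions.

# Greatly simplified stand-in for pose normalization: align detected eye
# landmarks to assumed canonical (mug-shot-style) positions with a 2-D
# similarity transform. Real 2-D-to-3-D frontalization is far more complex.
import numpy as np

CANONICAL_EYES = np.array([[38.0, 52.0], [74.0, 52.0]])  # assumed template

def similarity_transform(src, dst):
    """Solve for the scale, rotation and translation mapping src to dst."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    scale = np.linalg.norm(dst_c) / np.linalg.norm(src_c)
    u, _, vt = np.linalg.svd(dst_c.T @ src_c)
    rot = u @ vt
    if np.linalg.det(rot) < 0:   # avoid a reflection
        vt[-1] *= -1
        rot = u @ vt
    t = dst.mean(0) - scale * (rot @ src.mean(0))
    return scale, rot, t

def align(points, detected_eyes):
    """Map landmark points from the surveillance frame into the canonical frame."""
    s, r, t = similarity_transform(np.asarray(detected_eyes, float), CANONICAL_EYES)
    return s * (np.asarray(points, float) @ r.T) + t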

Facial recognition has
tremendous potential

when you're looking for the proverbial

needle in a haystack.

It provides you a lead,

particularly in an instance
where you don't know

who it is you're looking for.

[Talley] My name is Steve Talley.

I lived in Colorado.
I had a beautiful family.

I was a loving father.

I have two kids.

I have a daughter
who is this February.

I have a son who had just
turned last week.

We lived in a very nice,
family-oriented community.

And I had a great career
in financial services.

And I was, at that time,
excited about my life

and my prospects and the future.

I was living the American dream...

...or so I thought.

My life changed dramatically.

I got divorced from my ex-wife.

And then I got laid off
due to corporate restructuring.

I had financial obligations.

I still had child-support payments

of about $ , a month.

But I considered myself
to still be an average,

law-abiding citizen.

[narrator] But the authorities thought

he was living a double life
as a serial bank robber.


And even local news joined in the hunt.

[reporter] Do you recognize this man?

Denver police are
looking for him tonight.


They say he robbed the U.S. Bank

on South Colorado Boulevard

near East Mississippi
and Glendale last week.


Have a look at his picture

taken by surveillance
cameras in the bank.

Police say he may be armed with a gun.

[Talley] One day, there was
a pounding at my door.


All of a sudden, I see
a gentleman in an FBI jacket.

He handcuffed me behind my back.

He said, "Do you know
why we're arresting you?"

He said, "We're arresting you
for two armed bank robberies

and assaulting a police officer."

I was driven to the detention center.

I was in prison
in a maximum-security pod

because they had very
strong facial recognition

that proved that I was the guy.

But I always said I was innocent.

I had an air-tight alibi.
I shared it with everyone.

I had my own witnesses

for the alibi come in
and prove I was there.

But my only crime is I,

apparently, look like someone else.

I really have nothing to hide.

[narrator] Nothing to hide,
nothing to fear, right?


Face rec gets the bad guys
off the streets.


So what's the harm?

I grew up in a world

where identity was part
of our daily experience

at a time when the
world was in conflict.

In societies where there was
an oppressive regime,

there was a chilling factor.

People did not express
themselves freely

because there was a fear
that they would be persecuted.

Now we have a different
kind of chilling factor.

And it is driven not by governments,

necessarily, but by
the surveillance camera.

And that chilling factor,

it means we're going
to change our behavior.

And we no longer live in a free society.

Will that be a society
that we will accept?

[man] Zoom in.

Try to get as close up as you can.

Take a quick snapshot.

[Wade] By walking into a casino,

you have effectively
given up your privacy


because, in a casino,

you really want to know your customer.

So facial recognition

dramatically improved the ability

to actively track card counters,

high-net-worth individuals,

cheaters and all sorts
of other individuals


that the casinos
are interested in tracking.


[narrator] The man
behind this technology


is Wyly Wade,

who provides it to
casinos across the country.


[Wade] Facial recognition
is contactless.


It is noninvasive.

And it is the link
from your digital environment


to your physical environment.

Part of this is a very
personal issue to me.


I've got a daughter with special needs.

She's deaf. She's autistic.

My daughter loves to climb.

She flips. She twirls.

I'm the human jungle gym.

Oh, now you're going to run away, huh?

She laughs. She plays.

She runs. She hides.

But she doesn't have this filter

on what is good and what is bad.

So we need an extra level
of security to re-create

some of that filter for her.

We have cameras that
automatically rotate,

pan,

tilt, zoom,

all based off of either sound

or motion.

If that camera detects
that there's a face there,

then what it does is it drops it down

to our security system

and then compares that to the people

that I don't want in our house

so that way you
can be prewarned or pre-confirm

whether or not that person
is a rapist or sex offender

because we have hundreds of thousands

of registered sex offenders
in the United States.

So if the UPS driver happens to be

a registered sex offender,

yeah, I'd like to be
notified about that.

Facial recognition gives
us peace of mind.
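
[editor's note] As Wade describes it, the home setup is a watchlist check: when a camera sees a face, compare it against the stored faces of people you do not want at the door and alert above some confidence threshold. A minimal sketch of that decision step follows; the threshold, the embedding format and the function names are assumptions, not his actual product.

# Minimal sketch of a watchlist check as described above; the threshold,
# the embeddings and the names are all assumptions for illustration.
import numpy as np

ALERT_THRESHOLD = 0.6  # assumed; real systems tune this to limit false alarms

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_visitor(face_embedding, watchlist):
    """watchlist: dict mapping name -> stored embedding (numpy vector)."""
    hits = [(cosine(face_embedding, emb), name)
            for name, emb in watchlist.items()]
    score, name = max(hits)
    if score >= ALERT_THRESHOLD:
        return f"ALERT: visitor resembles {name} (score {score:.2f})"
    return "no match on watchlist"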

You don't know where
the real danger lies.

You don't know who the hooligans are.

You probably don't even
know your neighbors.

Facial recognition is not
positive identification.

If you're looking for an individual,

and you submit a search, and you
get back a gallery of candidates,

there could be wrong people in there.

And that is where it's up
to the investigator

to take the information

that comes with that picture.

After the robbery, the police

and the FBI came,
and they interviewed me,

and they wanted me
to identify the person

and do a photo lineup.

There were six people on that page.


You know, I said, "Well, this
kind of looks like the guy."

And they asked me like,
"What percentage do you think?"

And I said, "Well, probably maybe %,

but I'm not 100% sure."

[Talley] Instead of using my
mug shot in this photo lineup,


which they typically do because
it's the most recent picture,


they actually used the picture

where I got my DUI in .

They're using this picture

from years ago where I look younger.

[narrator] That same DUI mug
shot is what the FBI used


to compare Talley to the robber.

I didn't think he looked like me.

But I could see some similarities.

And I think he probably
looked like a gazillion people.

[narrator] But in court,
the eyewitness faces


the suspect not in pixels

but in the flesh.

[Shipp] I insisted

I would have to see Mr. Talley in person

because I want to make sure
that my decision is right.

The robber had no markings, no moles.

And when you see this Mr. Talley,

he has a mole on his cheek.

Also, Mr. Talley had a long nose.

The other guy's nose was shorter.

And Mr. Talley is a big man.

And the robber was not.

And I told the judge,

I said, "If you're asking me

if this was the guy
that was the robber,"

I said, "I am absolutely,
100% positive it is not."

[Talley] The FBI and local police

said it was me based
on the similar features.


But they totally ignored features

that would exclude me
as being a suspect.

Finally, they wanted
to do a height analysis.

The bank robber was foot.

And they determined I was ' / ".

So I was inches too tall.

That proved I couldn't
possibly be the guy.

[narrator] His face got him arrested.

But his height set him free.

I did get dismissed.

But they seemed to be
trying to build a case

where there really wasn't a case there.

They didn't have any,
not even a shred, of evidence,

except that I looked
like the bank robber.

This is really scary stuff,
this technology.

It could be a great shortcut
to help with investigations

but at the expense of possibly,

you know, targeting the wrong people.

[narrator] It could happen to you.

As the database of faces grows larger,

so does your chance
of having a doppelganger.


[Atick] There are more cameras
now than humans in the world.


With people having
cameras in their pocket,


cameras in the street,

there are billions
of cameras in the world.

That is a natural progression.

And we saw some of it happening.

But one thing we did not see,

nor did anyone see or count on,

was the emergence of social media.

If, in the ' s,
I told you that, one day,

there was going to be a database

where people voluntarily
add their faces

and the faces of their friends
and the faces of their children,

and that database
could be used to identify

every single one
of those billion people,

I would be laughed at back then.

But now Facebook
is essentially a database

that is the dream of a big brother.

[narrator] . billion users,

million pictures uploaded every day,

we are the ones training
Facebook's facial recognition.


How?

By tagging ourselves and our friends.

Their algorithm is so strong,

Facebook can I.D. you even
if your face is hidden.


There is money to be made

from the recognition of your face.

Imagine if you walked around,

and on top of your head
it said your name,

how much money you have, what you like,

what your preferences in life are.

Machines that run algorithms

will start to make decisions for us.

And as a consequence,

we may lose the battle of privacy.

[Wade] I think privacy is
a misnomer at this point.


We have to get over the idea

that you own your data
and your identity.


We gave up our right
to most of our data

every time we signed up
for Google or for Facebook

because it was convenient.

It improved your life in a lot of ways.

But it also broke down a lot of barriers

to where that data that you claim

is your identity is no longer your data.

You don't own it.

We've almost gone to a post-privacy era.

[narrator] What does
post privacy look like?


Just ask the Russians.

[woman] FindFace, an app that
allows you to find any person


in the largest Russian social network.

[narrator] It lets a user
photograph a stranger,


upload that picture

and compare it to social media profiles

to unmask the person's identity.

[woman] The service allows you

to not only find the desired user

but also to send them messages
and other information.


[narrator] In other words,

it's a dream come true for stalkers.

[Atick] To see face recognition

being abused in a certain way

means that we no longer
live in a free society.

But face recognition can be good

as long as there's oversight

to make sure no innocent individuals

are accused.

[Talley] Now I'll wake up
in the middle of the night,


and I won't know where I'm at.

All of a sudden, it dawns on me.

This nightmare is my life.

Here I am. I'm actually in
a shelter right now.

Even though they had
no case, it dragged on.

It's two years later,
and I am still struggling.

Because of this incident,
I lost my housing.

And even just being
associated with bank robberies

has, basically, effectively,

blackballed me or destroyed
my career in financial services.

No one's going to touch me.

The biggest thing
is my custody situation.

I haven't been able
to see my kids in two years.

I've missed the Christmases,

the birthdays, the Thanksgivings,

all the memories of them growing up.

I'm always concerned about them,

what they're feeling, what their
thoughts are about me.

Do they feel that
they've been abandoned,

that their father doesn't love them?

Being homeless is extremely hard.

You know, I feel isolated.

I feel like I'm invisible, too,

because a lot of people
don't like to look at the homeless.

I'm in a pretty big hole right now

that I have to dig myself out of.

I'm fighting for my life back.

That's really what this is
about: fighting to get my life

back to where it was,

to fight for myself, for my family.

[Wade] There's a lot of good
use of this technology.


I use this to protect my family.

Are there bad uses
of facial recognition?

Absolutely, because we don't
live in a perfect world.

Facial recognition is going
to create problems.

There's going to be
some collateral damage.

I guess I was collateral damage.

My family was collateral damage.

It happened to me.

Why couldn't it happen to other people?

[narrator] Chances are your
face can be used against you.


We all feed the network.

The more data we upload,

the more powerful it becomes,

entangling our physical selves
into a digital web.


We, as technologists,
have to be responsible

for the creations that we've made.

And we have to explain

to the world the inherent danger.

And that is the missing link

between our online persona

and our offline persona.

The offline persona is our only persona.

It's unique. It's us; it makes us human.

And the ability to protect that

will depend on the ability
to stop face recognition

from recognizing us without consent.

And I don't believe we can afford

to lose control

over the most precious thing...

...which is our identity.