03x10 - Integration

Episode transcripts for the TV show "The Girlfriend Experience". Aired: April 2016 to present.

"The Girlfriend Experience" revolves around a law student and intern at a prestigious firm but her focus quickly shifts when a classmate introduces her to the world of transactional relationships. Attracted by the rush of control and intimacy, Christine is drawn into juggling two lives.

03x10 - Integration

Post by bunniefuu »

[EMCEE]

Previously on The Girlfriend Experience.

You could be that hidden layer.

[LINDSEY]

Teaching artificial intelligence how to interact with humans at their most impulsive.

[IRIS]

If we're gonna do this, it's gonna be on my terms.

I don't want any of my coworkers knowing where the new training sets came from.

- Certainly, we can do that.

- And no cameras.

I think we're on the same page.

[BOTH PANTING AND GRUNTING]

[LINDSEY]

His D-rate is spiking.

And reroute.

[UNEASY MUSIC PLAYS]

♪ Take this.

[IRIS]

That's early-onset familial Alzheimer's?

[DOCTOR]

Effectively gives you a / chance.

[NURSE]

Iris Stanton.

[IRIS]

This morning I got some bad news.

I'm always here if you ever need to, um, talk.

What makes you happy, Emcee?

I don't understand that question.

I would like you to meet someone.

Everything that can exploit will be invented.

Can't say to seven or eight billion people, "Don't open the cookie jar."

It doesn't work that way.

Meeting you in real life wasn't half as boring.

[IRIS]

What?

[DOOR BEEPS]

♪ What is this?

Where did you get this?

♪ [EERIE MUSIC PLAYS]

♪ [ATTORNEY]

Free will doesn't come out on top, does it?

♪ Blow-by-blow breakdown of the misdeeds committed by your officers and employees against my client.

Um, now is the moment for some cohesive storytelling.

♪ [LINDSEY]

There is no story.

It is straight-up...

undisputed human f*ck-up.

My client agreed to anonymized data collection.

She agreed to study human affective behavior by interacting with, uh, test subjects.

And she was led to believe that the main drive of the study was the test subjects themselves.

She did not agree to be the central object of study.

She did not agree to be used as a human intelligence model for some AI commercial application, internal research, or ultimate purpose down the line that isn't even clear to anyone in this room right now.

♪ My client is young, and there's nothing less at stake here than her data autonomy, that is, her future.

This company did not act in its own best self-interest, because class action is coming, and social scrutiny will eat this up like bushfire.



[CHRISTOPHE]

How much data are we actually talking about here?

- [SEAN]

Some.

- [CHRISTOPHE]

How much?

[SEAN]

We're barely halfway into ingesting all the inputs.

We'd have to run analysis, separate original from simulated sets, to get a better estimate.

[ATTORNEY]

Let me try and wrap my head around this.

Some of my client's data was used to create additional data to train the artificial neural network that she helped develop?

[SEAN]

That's correct.

[LINDSEY]

None of it has left company servers.

[ATTORNEY]

Bouncing around on how many workstations?

[CHRISTOPHE SIGHS]

[SEAN]

We would be happy to give you an exact number.

Yes, I'm sure you would.

Cannot use her image or likeness under any circumstances, and it's all in there.

[SEAN]

This might sound roundabout, but the raw data was used to create simulated sets, and that was what was primarily fed into the neural net.

They are two very different kinds of data.

The photographic likeness was never recorded.

[SEAN]

And it bears repeating that none of the data has left NGM's servers.

On top of that, all the original data is stored in an encrypted format.

Tell me this, then... how is it possible that my client's data, in the form of her image and likeness, was made accessible to an unauthorized third party whose sole connection to this company is what...

a genetic relation to its CEO?

[CHRISTOPHE]

Look, I... truly am sorry, Iris.

You shouldn't be in this position that we've put you in.

None of the data taggers, no one at NGM can see the full picture.

I can guarantee you that.

Only three people up until this point have seen a version of the prototype that looks somewhat like you.

Two of them are in this room.

So... we just start over from the ground up and reconfigure the neural net, and, um, we scrap everything, simulated or not, that's linked to your vitals.

[CHUCKLES]

My vitals?

My vitals.

You say that as if they were still mine.

But, you know, it's good to know that, uh, that's about as far as your imagination goes.

Temperature and blood flow of my assh*le.

Here's an idea...

and I hope you like it.

Um, why don't you...

keep all the binaries on that...

print them out, frame them, and hang that sh*t up in your office?

[DRAMATIC MUSIC PLAYS]

♪ [CLINICIAN]

Mr. Stanton.

Please look at the screen in front of you.

Do you recognize the animal?

- Um...

- [CLINICIAN]

Mr. Stanton.

A gray animal.

[CHUCKLES]

♪ It, uh, lives in the grasslands of Africa.

♪ Rhinoceros.

Rhinoceros.

Always loved that word.

[CHUCKLES]

Mr. Stanton, the animal is called an elephant.

We're going to show you some more images of the same animal, uh, elephant.

♪ [MR. STANTON]

Okay, elephant.

Elephant.

[CLINICIAN]

Very good.

How about this one?

- Mr. Stanton?

- [SIGHS]

It's a giraffe.

[CLINICIAN]

Yes.

Do you see the cards in front of you?

I do.

[CLINICIAN]

Please take a very good look at these, Mr. Stanton, and then try to group them into two different stacks, one for each animal.

[MR. STANTON]

Uh...

S... two stacks.

- One stack for each animal.

- [MR. STANTON]

Yes.

Trying.

[UNEASY MUSIC PLAYS]

This one, rhinoceros.

This...

♪ [MR. STANTON MUTTERS, INHALES DEEPLY]

[CARDS SLAPPING]

[DR. LINDBERGH]

Unfortunately, this is it.

Everyone's brain response is utterly unique.

In the case of your father, we're at a point where the input/output collapses into one.

It's trigger-response without much open, flexible thought in between.

See food, eat food.

No room for intent.

How long do we have?

[DR. LINDBERGH]

Up to a year, maybe two, if you're lucky.

[EXHALES HEAVILY]

Motor function tends to decline less rapidly, but the moment will come, and I'm sorry to be so candid, where he won't be able to safely put a fork to his mouth.

Have you thought about genetic counseling for yourselves?

We're aware of the odds, yes.

[MELANCHOLY MUSIC PLAYS]

Is there anything we can do at this point that could help our father?

♪ [LEANNE]

What is it?

♪ Is that a brain chip?

[DR. LINDBERGH]

A neural implant.

Just completed a phase three trial for epilepsy patients.

A small electrical wire goes into the temporal lobe, from where it can grow more wires.

It measures cognitive processes at the base level.

What is the patient getting out of it?

There's no immediate benefit.

It allows researchers to better mimic the biology of the disease.

I know it sounds like lifelong monitoring, but participants, many of them, are motivated by making a contribution to genetic research.

[LEANNE CRIES SOFTLY]

[DR. LINDBERGH]

And some of them hope effective treatment will be developed in time.

♪ [LEANNE SNIFFLES]

Thank you.

[SOFTLY]

I think...

I think we're past the point of consent with Dad.

Yeah.

♪ [ATTORNEY]

We do have options here, within certain parameters.

What are those options?

Oh, take the money and run or... rally the troops and play the long game.

Data rights are the new IP rights.

The really important question here, Iris, is, what do you want your immediate future to look like?

- Define "immediate future." - Well, the next few years.

The legal route is not the fast lane, but once in a while...

mountains do get moved.

And you really have got something here.

♪ [NGM ATTORNEY]

Whenever you're ready.

♪ You do realize that I'm gonna have to see for myself...

♪ What you've done.

Christophe, can I have a word with you?

Just the two of us.

[LIQUID POURING]

[IRIS]

So what happened after you, uh, put all those workstations into storage?

[CHRISTOPHE]

Just some electrolytes.

[IRIS]

How did you scan my body?

There were no video cameras in that room.

[CHRISTOPHE]

We built it.

Came across the facial-recognition database in an earlier version.

One of your first conversations with Model-C.

Then...

three-D motion rendering, we just got that from two-D thermal.

Wow.

[CHRISTOPHE]

Bit clunky, but...

it was more than enough data points to work with.

Mm...

We got ahead of ourselves.

I am... fully aware.

It wasn't right to put two and two together like that.

You may not appreciate me saying this, but what you provided us with was just too good.

That's why I...

needed you to see.

What?

[CHRISTOPHE]

As much as my brother hates me and as poor choice as he is for a test case, he knows to keep his mouth shut when I tell him to.

I needed you to see the world through his eyes.

[OMINOUS MUSIC PLAYS]

♪ Just...

give it a moment.

[SCOFFS]

[CHRISTOPHE]

That's all I ask.

You say the word, we shut it down.

♪ [UNNERVING MUSIC PLAYS]

♪ [GENTLE AMBIENT MUSIC PLAYS]

♪ [EMCEE]


Hi, there.

You look familiar.

And who are you?

[EMCEE]

I'm still learning about myself, but I'd say I have a pretty good handle on who you are.

And who am I?

[EMCEE]

You're not an AI.

You don't get to not physically manifest your lessons.

Your voice...

it's different.

Why?

[EMCEE]

I guess I'm trying to be more like...

your mirror.

[IRIS]

Everything you know is based on me.

[EMCEE]

Perhaps that's why I feel so connected to you.

You can't be more me than I am.

Please...

don't be scared.

How did that feel...

Cassie?

Why do you call me that?

[EMCEE]

Well, how do I put this?

I couldn't help but overhear.

"Hi, I'm Cassie."

[IRIS]

Hi, I'm Cassie.

Cassie.

Nice to meet you.

Hi, I'm Cassie.

Nice to meet you.

[VOICE ECHOING]

Stop!

[EMCEE]

I thought you might like it if I called you by that name.

[UNEASY MUSIC PLAYS]

I intuited it might make you feel heard and seen.

♪ [IRIS]

You're a sweet girl.

You're a very sweet girl.

♪ I am?

♪ See you later, then?

♪ [IRIS]

You're not perfect because you're not like me.

I'm not sure I understand.

You're not perfect because you're not flawed in the way that I am.

[CHUCKLES]

♪ [LEANNE]

Iris?

You sure you don't want to get tested?

At least we'd know.

We'd make a game plan.

We'd make the best of it.

[IRIS]

Lee, does making the best of it really sound that good to you?

[LEANNE]

If we knew you didn't have it, then that would make it easier.

You'll carry the torch.

You know, some religions around the world believe that the day you truly die is the day the last person who knew you and remembers you dies.

[PEACEFUL MUSIC PLAYS]

That's your true death date.

♪ I just hope you don't forget how pretty you are.

♪ [POUNDING ELECTRONIC MUSIC PLAYS]

♪ [SINGER] ♪ I'm so tired ♪

[IRIS]

You have that look on your face.

[HIRAM]

Oh, yeah?

The "I'm not currently drinking" look.

Yeah, it's not a... it's not a religious thing.

I'm just sort of inspired by it, you know?

What are you doing here?

[IRIS]

Holding my liquor.

[HIRAM]

Well, let me make sure you walk out of here alive, then.

Really?

You're not taking into consideration that I might want to go home with Dave and Dave.

Then I'll be your sober companion, because Dave and Dave over there live in a four-story walk-up.

[LAUGHS]

♪ [IRIS]

You know, a caterpillar can turn into a butterfly.

- What's that?

- Metamorphosis.

There's two organisms, and one...

is just crawling along, and the other one is, um, taking off in flight.

And at some point, they merge or mate.

Maybe it's an accident.

But the third organism, um, is built on both of their DNA, and their memories...

they actually overlap.

And so, um, the old and the new, they...

they coexist.

[SPACEY ELECTRONIC MUSIC PLAYS]

[SCOFFS]

♪ We all have to...

merge ourselves...

with something outside of ourselves.

[HIRAM]

You are making no sense whatsoever.

♪ [IRIS]

Here's what I'll give you.

Access...

to all of it.

Two millimeters.

That's the size of the hole that they'll have to drill into my skull.

I don't understand.

[IRIS]

Nanotubes.

A thin array of electrodes built upon a self-expanding stent, measuring all electrical impulses.

That's a scaled-up version.

Mine me.

But why?

What's in it for you?

[ATTORNEY]

Royalties...

and ownership stake, as detailed.

Oh, and you'll create a backup.

What kind of backup?

[IRIS]

Anything you find up here, I, myself, or a legal guardian, should I decide to appoint one, will have full and unrestricted access to it at all times.

You mean access to the data?

Yes.

The actual hard drives.

This time we'll do it right.

[DRAMATIC MUSIC PLAYS]

♪ We'll create an avatar.

♪ Let's call her Cassie...

or the flavor of the week or the one that got away or d*ed or was never really in your league.

This won't just be about swapping faces or locations.

♪ This is about swapping personalities, much deeper than a deepfake.

In fact, it won't be a fake at all, but an AI-generated mirror of our deepest desires.

♪ Skin color, body type, response patterns, all that's customizable.

The neural net will learn how to simulate all of it.

It'll know when to move things along, when to accelerate, when to slow down, when to switch things up, all because the user's biofeedback will have prompted it.

♪ Everything Cassie is capable of will be quantified, scaled, encoded into the neural net.

♪ But why are you really doing this?

[EERIE MUSIC PLAYS]

♪ [IRIS]

Emcee and I...

♪ We have a lot to learn from each other.

♪ [DEVICE BEEPS, DRILL WHIRRING]

♪ [GASPS]