Pre-Crime Policing: How Cops Are Using Algorithms to Predict Crimes

The LAPD is one of a growing number of police departments using algorithms to try to predict crimes before they happen. Proponents say these tools give cops new ways to keep their cities safe -- but critics argue it's just another form of profiling.

Released on 05/22/2018

Transcript

(police radio)

[Issie] Police in Los Angeles

are trying to predict the future.

We know where crime happened yesterday.

But where is it going to happen tomorrow and the next day?

[Issie] And they're not alone.

More and more departments are using data-driven algorithms

to forecast crime.

It's all about improving accuracy,

and improving our effectiveness and our efficiency,

and our service to the community.

When you see more police officers,

you see the lights, you hear the sirens.

The high visibility of officers

does deter crime in certain areas.

Proponents say it's helping to reduce crime.

But others wonder: at what cost?

I think it targets and it just justifies

in their eyes

racial profiling.

I just really worry about the world

that my son is gonna enter into,

and become a young man in.

I don't want people to look at him and prejudge him.

That could be life or death

if someone calls the police on you.

The algorithm is always going to augment

the system that it's in.

And if the system itself is biased,

is unjust,

then the algorithm's gonna replicate that.

Predictive policing is changing policing

in your city now.

And when you start asking the hard questions:

How does it impact civil rights?

How does it impact privacy?

How does it impact suspects?

How does it impact officers doing their job?

Alright, here's how we're gonna work for tonight.

Sergeant Flores will be out as 160.

[Issie] Say you're living in Los Angeles.

And you just got out of prison.

The LAPD could be keeping tabs on you

through a program known as Operation LASER.

It uses crime, arrest and field data

to determine where violent crimes are likely to take place,

and who will perpetrate them.

LASER predicts who will commit a crime

before crime actually happens.

Groups like the Stop LAPD Spying Coalition

are sounding the alarm in local communities.

They say predictive policing strategies like LASER

disproportionately target low-income people

and communities of color.

Why do you believe that this data is subjective?

The data is subjective and the data is biased

because policing practices have always impacted

black and brown

and poor people.

Now we have predictive policing

coming in with a veneer of science

trying to tell us

that it's somehow making policing more objective.

But what we're finding is

that it's only augmenting the way that policing operates.

The LASER program tries to predict

who might commit violent crime using a point system.

For example,

if you're on probation, that's five points on your record.

If you're in a gang that's another five.

A stop by police can get you another point.

The people with the highest scores

in each of LA's 21 divisions

are placed on a list called the Chronic Offender Bulletin.

You tally up all those numbers.

Some folks end up with

two points.

Some folks end up with 40 points.

Those that end up with the highest number of points,

we'll take a look and see.

Where are they now?
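In code terms, the scoring described above is a simple weighted tally followed by a per-division ranking. Here is a minimal Python sketch of that idea; the field names and the cutoff for how many people make the bulletin are assumptions for illustration, not details from LAPD. Only the three point values mentioned in the transcript are taken as given.

def chronic_offender_score(person):
    # Only the probation (5), gang (5), and police-stop (1) values
    # come from the transcript; the field names are assumed.
    score = 0
    if person.get("on_probation"):
        score += 5
    if person.get("gang_affiliated"):
        score += 5
    score += person.get("police_stops", 0)  # one point per stop
    return score

def chronic_offender_bulletin(people, top_n=12):
    # Rank everyone in a division and keep the highest scorers.
    # top_n is a placeholder; the transcript doesn't say how many
    # people are listed in each of LA's 21 divisions.
    ranked = sorted(people, key=chronic_offender_score, reverse=True)
    return ranked[:top_n]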

The most troubling thing

about the Chronic Offender Bulletin

is that it's re-criminalizing people who have already been on parole,

who have served their time.

It seems kind of hard to avoid police contact

when that's your home and this is your history.

And the contact is coming to you.

Well not necessarily.

If you live in the area,

and you did time,

and you've got a job,

and you're doing everything you're supposed to be doing,

then you're not gonna be getting

that constant police contact.

[Issie] Police say they're just being proactive.

What we don't want is for that person

to recommit another crime.

Go back to jail and keep on the cycle.

We try to get ahead of it.

[Issie] The problem is the public has no way

of knowing who's being targeted,

which is why the Stop LAPD Spying Coalition

is suing the city for more information.

Police are supposed to inform individuals

if they're on the list.

But we reached out

to the ACLU, Legal Aid and several defense attorneys

and none of them knew of anyone

who had been contacted by LAPD.

We feel that the people have the right

to know what is going on in this program.

They may not know the name of LASER.

And they may not know the specifics of how it operates.

But I think they do know exactly what's happening:

that they're being monitored.

'Cause people in the community, they speak about it.

They say, the officer came up to me and said,

I see you, I know your brother.

I got your brother, I'm gonna get you.

It is one part social control.

Hey we're watching you.

And it's one part information gathering.

We are seeing who you are associating with.

The car you're driving.

Who you're hanging out with in your neighborhood.

And they're recording all that,

and putting it back into a larger, ever-growing database

that's used for investigative purposes.

[Issie] And LASER is just one

of several new smart policing strategies.

Some departments are also considering drone surveillance,

and body cameras equipped

with facial recognition technology.

This video from the maker of an AI-powered body cam

demonstrates how the technology could be used

to find a missing child.

Not shown,

how they could use it to track suspects.

Meanwhile, cities across the country

have embraced a predictive policing approach

that focuses not on high risk people

but on high risk places.

Putting entire neighborhoods under watch.

Critics worry it's targeting areas

that are already over-policed.

The problem is that if you're already policing

in a certain neighborhood,

you're gonna be predisposed to seeing data

that's now gonna tell you

to keep putting more and more officers in those areas.

[Issie] Bari Williams is an attorney

who advises startups in the Bay Area.

I absolutely love technology.

I work in the sector.

But I think that we need to be mindful

of how technology is created.

And how it's used.

And how it can be used for various purposes.

[Issie] One popular tool focused

on where crime will happen is PredPol.

Today more than 60 departments nationwide

use it to forecast crimes

like burglaries and car break-ins.

Everybody has their predictive policing already.

[Issie] In the LAPD's Foothill Division,

one of the first to use PredPol,

it starts with the daily roll call.

Officers get a map labeled with 10 to 20 hotspots

they're encouraged to visit during their shifts.

The more time they spend in those areas,

the more likely they are to deter crime.

Or so the thinking goes.

[Officer] Alright, let's go to work.

These are today's predictive policing boxes.

We're gonna cover the three boxes that are in my area.

In the north end of Pacoima.

[Issie] PredPol forecasts crimes

based on patterns from the last several years.

It analyzes three elements of crime data,

the crime type, the location and the time.

It then runs them through an algorithm

and spits out 500-by-500-foot hotspots, or boxes,

that officers should keep an eye on.
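To make the place-based approach concrete, here is a rough Python sketch of the general idea: snap recent incidents of one crime type to a grid of 500-by-500-foot cells and flag the busiest cells as the day's boxes. This is only an illustration under assumed data fields and an assumed recency weighting; PredPol's actual model is proprietary and not detailed in the piece.

from collections import Counter

CELL_FEET = 500  # boxes are described as 500 by 500 feet

def cell_for(x_feet, y_feet):
    # Snap a location (in feet, in a local projection) to its grid cell.
    return (int(x_feet // CELL_FEET), int(y_feet // CELL_FEET))

def hotspot_boxes(incidents, crime_type, top_n=20):
    # incidents: dicts with "type", "x_feet", "y_feet", "days_ago".
    # The recency weighting below is an assumption, not PredPol's formula.
    counts = Counter()
    for inc in incidents:
        if inc["type"] != crime_type:
            continue
        weight = 1.0 / (1.0 + inc["days_ago"])
        counts[cell_for(inc["x_feet"], inc["y_feet"])] += weight
    return [cell for cell, _ in counts.most_common(top_n)]

The top_n of 20 mirrors the 10 to 20 hotspots officers see at roll call, but the real system builds its boxes from years of crime type, location, and time data rather than this simple tally.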

Right now we're experiencing a lot of property crime.

That's what our predictive policing boxes

are gonna focus on.

[Issie] Officer Steve Nunez has worked

in the Foothill Division for 16 years.

What are you looking for as we cruise around?

Looking for anything suspicious.

Even sometimes looking for cars

that look like they've been sitting here for a while.

Anyone that is maybe trying to open car doors.

Just anything that doesn't fit.

At first we were kind of,

I don't know if we're gonna follow this predictive policing.

Having a computer tell us where to go.

[Issie] But he's become a believer in PredPol.

Crime has gone down dramatically.

Part of it I believe is predictive policing.

Part of it's just the community engagement.

Has to do with the officers

going out there and being proactive.

Don't leave it out here for a month

and all that stuff like that.

Like I said, I won't impound it,

cause I know it's his.

So I'll let you guys deal with it.

But at least get it off the street.

Alright, man. Cool, man.

You guys be good, alright?

You too.

[Issie] The fundamental question here is,

can we trust the data and algorithms

that police are using to predict crimes?

Algorithms and big data

simply take inputs,

crunch 'em up to create outputs.

If your inputs are biased,

your outputs are gonna be biased.

[Issie] You don't need to look far

to find troubling examples of algorithmic bias.

In 2016, Microsoft introduced

an AI chat bot called Tay,

which quickly started repeating

racist and anti-Semitic comments

it picked up on Twitter.

Then there were those problematic Google auto-completes,

which are based on popular human searches.

In January, a Google search for 'islamists are'

turned up results like 'islamists are evil.'

And 'hitler is' turned up 'hitler is my hero.'

And there was also the time,

the Google Images algorithm

mistook black people for gorillas.

What happens if prejudiced data

makes its way into police work?

It's a seven.

[Issie] For Bari Williams that risk is personal.

The biggest fear I have

with predictive policing and my son

is that he will be profiled.

I'm sure he probably will experience that

at some point in his life already.

But I don't need something making that easier.

Did you have a good day?

Yeah.

[Issie] Williams lives in Oakland,

a city that so far has decided not to use PredPol.

Still she says,

the image of black children being fatally mislabeled

as a threat by police looms large in her mind.

In the case of Tamir Rice,

you have a child who was playing with a BB gun.

And the police just rolled up.

And within three seconds shot him.

That is a real concern for me.

My son and I like to go certain places, and

if the neighborhoods are not awesome,

and predictive policing is telling you

to send more police into that area,

and we're there,

I'm not gonna allow my child to go play in the street.

I'm not gonna allow my child to run in front of me.

I don't want anything to give the police

a false sense that he could've been

engaged in something he wasn't.

The activists and the police

are all coming at this from the point of view that makes sense for them.

I think from the perspective of a community

that feels over-policed,

that feels like there is racial injustice in policing,

it's wise to be concerned about this new technology

that's going to give police more power.

If you're a police officer,

you understand you have an incredibly difficult job.

And adding new technologies might help.

You kinda wanna get the benefit of the doubt.

We've had crime rises over the last few years.

We've had crime reduction.

It is a tool.

It's not a panacea for crime.

That's what a lot of people believe.

They believe that you utilize PredPol,

crime's gonna go away.

False.

That is a farce.

That is not gonna happen.

It's an indicator.

It takes you there.

It could happen.

It could not.

It's a probability.

The question about whether predictive policing works

is still open.

[Issie] One study found that PredPol's algorithms

correctly predicted crime just 4.7% of the time.

Still that's higher than the human analysts

who got it right just 2.1% of the time.

It's worth noting that

the study was done by the company's co-founder.

And then there's the question of bias.

You start coming into the Pacoima area,

which is south of the 210 freeway.

Now you're dealing with predominantly Hispanic families.

A lot of immigrants in that area.

And I notice there are more boxes there.

The day we visited, most

of the PredPol boxes in the Foothill Division

were in Hispanic neighborhoods.

But police say that's by chance.

I do believe the data is objective.

We're simply using the crime numbers.

The type of crime.

The location.

And

the time

of when we thought that crime,

or when that crime was reported to us.

It has nothing to do with anybody's ethnic makeup.

Or the neighborhood.

Or anything else.

[Issie] According to research by PredPol's co-founder,

their algorithms don't lead to more arrests of minorities

than human policing.

But that isn't exactly comforting

to critics of traditional policing.

The issue is that crime tends to happen in all areas.

But the problem is that

marginalized communities are where police

are usually congregating.

[Issie] And there is evidence of that disparity.

Today black and Hispanic Americans

account for about a quarter of the total population,

but make up more than half of the prison population.

And there may not be a more infamous symbol

of unjust policing

than the Rodney King beating,

which happened right here

in LAPD's Foothill Division 27 years ago.

That beating and the riots sparked

by the acquittal of the officers involved

ignited a national conversation

around police brutality that continues today.

Members of the LAPD say that predictive policing

is helping them move forward as an organization.

Instead of being an occupying force,

like we were referred to once upon a time,

where there was a problem in the area

and we had to come in with tons of cops,

we're tryna be more strategic.

[Issie] But activists with the Youth Justice Coalition

say that new technologies will only make things worse.

It's basically allowing them to racially profile

black and brown youth.

I live in a low income neighborhood.

Mostly black and brown people that live there.

There's always police on my block.

What PredPol, I think, is doing

is targeting communities that they've already been targeting.

[Issie] The Stop LAPD Spying Coalition

is still pursuing its lawsuit against the LAPD.

And for now the most that concerned citizens can hope for

is more transparency.

If there was a dialogue

between these police departments

and the communities that they're serving,

explaining to them how they're using these data points,

how they're using this technology,

it may go a long way.

Predictive policing is here to stay.

And so what we should do is

figure out how to identify the risks,

see them coming, and address them

before it's too late.