Typically, police arrive at the scene of a crime after it occurs. But rather than send cops to yesterday’s crime, a new trend in law enforcement is using computers to predict where tomorrow’s crimes will be — and then try to head them off.
The software uses past statistics to project where crime is moving. Police in Los Angeles say it’s worked well in predicting property crimes there. Now Seattle is about to expand it for use in predicting gun violence.
It all started as a research project. Jeff Brantingham, an anthropologist at UCLA, wanted to see if computers could model future crime the same way they model earthquake aftershocks. Turns out they can.
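The earthquake analogy refers to self-exciting point processes, the class of statistical models seismologists use for aftershocks: each event temporarily raises the chance of another event nearby, and that extra risk fades over time. Below is a minimal sketch of that idea with made-up parameter values; it illustrates the general technique, not PredPol's actual model.

```python
import math

def intensity(t, past_events, mu=0.5, alpha=0.3, beta=1.0):
    """Conditional intensity of a self-exciting (Hawkes) point process:
    a constant background rate (mu) plus a burst of extra risk that
    decays exponentially after each past event -- the same structure
    used to model earthquake aftershocks. All parameters here are
    illustrative, not fitted to any real crime data."""
    excitation = sum(
        alpha * math.exp(-beta * (t - t_i))
        for t_i in past_events
        if t_i < t
    )
    return mu + excitation

# A burglary at time 0 raises the predicted rate shortly afterward,
# and the boost fades as time passes with no new events.
events = [0.0]
print(round(intensity(0.1, events), 3))  # shortly after: elevated risk
print(round(intensity(5.0, events), 3))  # much later: near background
```

In practice these models are fit over both space and time, so a recent break-in raises the predicted rate in the surrounding blocks for the next several hours, which is what lets the software rank locations for the coming shift.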
“It predicts sort of twice as much crime as any other existing system, even going head-to-head with a crime analyst,” Brantingham says.
Checking The Red Boxes
Older systems, like the famous CompStat in New York, show where crime has been. This system looks forward.
“The model will actually predict other locations, that effectively say, even though there was a crime somewhere else in your environment, the risk is still greatest in this location today for the next 10 hours or the next 12 hours,” Brantingham explains.
Brantingham and his colleagues are now selling the predictive system to police departments with the name PredPol. At this point, you may be thinking about the sci-fi movie Minority Report. But this is different. No psychics sleeping in bathtubs, for one. More to the point, this doesn’t predict who will commit a future crime, just where it is likely to happen.
In Seattle, police Sgt. Christi Robbin zooms in on a map of the city. Earlier this year, Seattle started using PredPol to predict property crimes. It’s now the first place to try predicting gun violence with the software.
“These red boxes [on the map] are predictions of where the next crimes are likely to occur,” Robbin explains.
At the start of every shift, patrol cops are assigned to those red boxes. "So we're asking that they spend time in that 500-foot-by-500-foot box, doing whatever proactive work they can to prevent that crime," Robbin says.
On a recent shift, officer Philip Monzon pulls up inside his box; today, it’s a city block near the Seattle waterfront.
“[The police] want visibility, they want contacts with businesses as are appropriate and anyone who’s wandering through the area,” Monzon explains.
This area has parking lots, and PredPol’s forecast includes car thefts. As Monzon passes a green Honda, he pauses. The guy inside seems to be ducking under the dashboard.
“[I] wanna make sure to see if he’s got the key or if he’s gonna pull out anytime soon,” Monzon says.
The car starts — the guy probably does have the key. But why didn’t Monzon challenge him, just in case?
“I don’t really have enough — I’m not just going to single out one guy in a Honda,” he explains.
Computer Models And ‘Reasonable Suspicion’
And this is where this gets tricky. The courts say police need “reasonable suspicion” in order to stop somebody. That suspicion can come from a lot of things — even someone’s “furtive movements,” as police like to say.
But can it come from the fact that someone is occupying an imaginary red box drawn by a computer?
“Ah — No. No. I don’t know. I wouldn’t make a stop solely on that,” Monzon says.
That’s probably the right answer, says Andrew Guthrie Ferguson, a law professor at the University of the District of Columbia who has taken a special interest in the constitutional implications of PredPol. He says the departments using it have told police not to use it as a basis for stops. But he also wonders how long that can last.
“The idea that you wouldn’t use something that is actually part of the officer’s suspicion and not put that in — [that] may come to a head when that officer is testifying,” Ferguson says. Either that officer will have to omit the fact that he or she was prompted by PredPol, he says, or that officer will admit it on the stand. “Then the issue will be raised for the court to address.”
And it may be that PredPol is a constitutional basis for stopping someone. Some might consider it more objective than an individual police officer’s judgment — less prone to racism or other kinds of profiling, for example.
Ferguson says that argument may have merit, but that police and society still need to be careful.
“I think most people are gonna defer to the black box,” he says. “Which means we need to focus on what’s going into that black box, how accurate it is and what transparency and accountability measures we have [for] it.”
In other words, even if the computers themselves aren't biased, the statistics feeding them might be. And if police are going to follow an algorithm, we should at least be willing to check the math.