The Los Angeles Police Department, like many urban police forces today, is both heavily armed and thoroughly computerised. The Real-Time Analysis and Critical Response Division in downtown LA is its central processor. Rows of crime analysts and technologists sit before a wall covered in video screens stretching more than 10 metres wide. Multiple news broadcasts are playing simultaneously, and a real-time earthquake map is tracking the region’s seismic activity. Half a dozen security cameras are focused on the Hollywood sign, the city’s icon. In the centre of this video menagerie is an oversized satellite map showing some of the most recent arrests made across the city – a couple of burglaries, a few assaults, a shooting.
On a slightly smaller screen the division’s top official, Captain John Romero, mans the keyboard and zooms in on a micro-scale section of LA, just 500 feet by 500 feet. Over the past six months, this sub-block section of the city has seen three vehicle burglaries and two property burglaries – an atypical concentration. And, according to a new algorithm crunching crime numbers in LA and dozens of other cities worldwide, it’s a sign that yet more crime is likely to occur right here in this tiny pocket of the city.
The algorithm at play is performing what’s commonly referred to as predictive policing. Using years’ – and sometimes decades’ – worth of crime reports, the algorithm analyses the data to identify areas with high probabilities for certain types of crime, placing little red boxes on maps of the city that are streamed into patrol cars. “Burglars tend to be territorial, so once they find a neighbourhood where they get good stuff, they come back again and again,” Romero says. “And that assists the algorithm in placing the boxes.”
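The basic idea Romero describes – recent break-ins inside a small grid cell raising the odds of more – can be sketched as a toy recency-weighted hotspot score. PredPol’s actual algorithm is proprietary, so this is purely an illustration under invented assumptions: a 500-foot grid like the one on Romero’s screen, an exponential decay so recent crimes count more than old ones, and made-up function names.

```python
from collections import defaultdict

CELL_FT = 500  # the grid resolution shown on the division's map


def cell_of(x_ft, y_ft):
    """Map a location (in feet) to its 500 ft x 500 ft grid cell."""
    return (int(x_ft // CELL_FT), int(y_ft // CELL_FT))


def hotspot_scores(events, now_days, halflife_days=30.0):
    """Score each cell by a recency-weighted count of crime reports.

    events: list of (x_ft, y_ft, t_days) crime reports.
    A crime's weight halves every `halflife_days`, a crude stand-in for
    the 'burglars come back again and again' effect Romero describes.
    """
    decay = 0.5 ** (1.0 / halflife_days)
    scores = defaultdict(float)
    for x, y, t in events:
        age = now_days - t
        scores[cell_of(x, y)] += decay ** age
    return scores


def red_boxes(events, now_days, top_k=3):
    """Return the top-k highest-scoring cells: the map's 'red boxes'."""
    scores = hotspot_scores(events, now_days)
    return sorted(scores, key=scores.get, reverse=True)[:top_k]
```

Feeding in a handful of reports clustered in one cell makes that cell the top “red box”, while an isolated incident elsewhere scores far lower.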
Romero likens the process to an amateur fisherman using a fish finder device to help identify where the fish are in a lake. An experienced fisherman would probably know where to look simply from the fish species, the time of day, and so on. “Similarly, a really good officer would be able to go out and find these boxes. This kind of makes the average guy’s ability to find the crime a little bit better.”
Predictive policing is just one tool in this new, tech-enhanced and data-fortified era of fighting and preventing crime. As the ability to collect, store and analyse data becomes cheaper and easier, law enforcement agencies all over the world are adopting techniques that harness the potential of technology to provide more and better information. But while these new tools have been welcomed by law enforcement agencies, they’re raising concerns about privacy, surveillance and how much power should be given over to computer algorithms.
P Jeffrey Brantingham is a professor of anthropology at UCLA who helped develop the predictive policing system that is now licensed to dozens of police departments under the brand name PredPol. “This is not Minority Report,” he’s quick to say, referring to the science-fiction story often associated with PredPol’s technique and proprietary algorithm. “Minority Report is about predicting who will commit a crime before they commit it. This is about predicting where and when crime is most likely to occur, not who will commit it.”
PredPol is now being used in a third of the LA Police Department’s 21 geographic policing divisions, and officers on patrol are equipped with maps sprinkled with a dozen or more red boxes indicating high probabilities of criminal activity. For now, the LAPD is focusing on burglary, vehicle break-ins and car theft – three types of crime that last year made up more than half of the roughly 104,000 crimes recorded in LA.
Dozens of other cities across the US and beyond are using the PredPol software to predict a handful of other crimes, including gang activity, drug crimes and shootings. Police in Atlanta use PredPol to predict robberies. Seattle police are using it to target gun violence. In England, Kent police have used PredPol to predict drug crimes and robberies. Brantingham notes that Kent police are taking a more proactive approach by not only concentrating officers in prediction areas, but also civilian public safety volunteers and drug intervention workers.
The prediction algorithm is constantly reacting to crime reports in these cities, and a red box predicting crime can move at any moment. But although officers in the divisions using PredPol are required to spend a certain amount of time in those red boxes every patrol, they’re not just blindly following the orders of the crime map. “The officer still has a lot of discretion. It’s not just the algorithm,” Romero says. “The officer still has to know the area well enough to know when to adjust and go back into manual.”
Clicking on a few of the boxes for more detail, Romero brings up Google Street View images of the predicted crime areas. Two are centred on the car parks of big-box stores, not particularly surprising places for car break-ins and thefts, says Romero. But Brantingham contends that the algorithm is doing much more than just telling cops what they already know.
“Crime hotspots are incredibly dynamic,” he says. “Yes, there are bad sides of town and good sides of town, but within those broad distinctions crime hotspots pop up and spread and disappear and pop up again in really complicated ways that are just very, very difficult, if not impossible, for the individual to intuit.”
Not that they don’t try. In the mid-1990s, the New York City Police Department began running statistical analyses of the city’s crime reports, arrests and other police activity – a programme known as Compstat. Law-enforcement agencies around the world have since implemented their own data-driven approaches to tracking and adapting to crime trends. Though police have long mapped out crime hotspots, Brantingham says the increased attention to data and analytics has been a major step up. He calls predictive policing the next iteration of that advancement.
“It’s using much larger collections of data, and processing it in a much more sophisticated mathematical way that allows you to produce significant boosts over just hotspot mapping alone,” Brantingham says. A 21-month single-blind randomised controlled trial in three LAPD divisions found PredPol to accurately predict twice as much crime as existing best practices, according to Brantingham. However, critics have argued that a larger study across multiple cities would be needed to test its effectiveness more accurately.
“Predictive policing is at the cutting edge of policing today. The problem is that, historically, the cutting edge of policing is dull,” says John Eck, a professor of criminology at the University of Cincinnati. He is sceptical about what predictive policing can do to actually prevent crime in the long term. “If crime at locations is highly predictable over long periods, there is often something fundamentally wrong with how the place is managed by its owner that makes it a crime hotspot. And it is the owner who has the responsibility to correct things. We do not have to stop a lot of innocent people and intrude in their lives.
“[Predictive policing] fosters a whack-a-mole policing mentality,” Eck says. “Not that whacking moles doesn’t work – but it is unnecessarily intrusive in people’s lives, which erodes their limited confidence in the police, and undermines something fundamental to our ideals of democracy.”
Widespread data collection efforts have faced scrutiny in recent years, especially since information about the surveillance tactics of the US National Security Agency was leaked by Edward Snowden last June. And though much of that data collection focuses on digital communications, a growing amount of information is being gathered in the physical world.
Police departments across the US are considering using drones to assist in policing and surveillance. The LAPD has access to more than 1,000 closed-circuit security cameras. Number-plate readers have been installed on police cars in a number of cities, including Los Angeles. The LA County Sheriff’s Department recently tested an aeroplane-mounted system that recorded real-time activity across the entire city of Compton. And by this summer, the FBI plans to have a fully operational face recognition system that will eventually contain upwards of 52 million records.
According to documents released by the FBI as a result of a lawsuit filed by civil liberties organisation the Electronic Frontier Foundation (EFF), the database will include not only criminal mugshots but also millions of photos taken for non-criminal reasons such as employment background checks. That lawsuit was filed by Jennifer Lynch, senior staff attorney at the EFF, who says that the FBI’s facial recognition database is hardly the only one. Law enforcement agencies in San Diego County and in Maricopa County, Arizona, have contracts with private firms to build their own local facial recognition databases, while the New York Police Department has partnered with Microsoft to create a “Domain Awareness System” capable of accessing more than 3,000 surveillance cameras as well as a trove of crime data, 911 calls, number-plate readers and even radiation detectors.
Law enforcement agencies now have access to hundreds of millions of records, many of them belonging to non-criminal members of the general public. While these tools have been useful for police, they are not failsafe. A California woman recently won a civil rights lawsuit against the San Francisco Police Department after a number-plate reader misidentified her car as stolen and she was held at gunpoint by officers, forced to her knees and detained for 20 minutes.
In Los Angeles, Lynch and the EFF have partnered with the American Civil Liberties Union of Southern California on a lawsuit seeking information from the LAPD and the LA County Sheriff about how exactly they collect number plate data. “The law enforcement agencies have not even been willing to talk to us about that,” Lynch says. “And I think that’s really problematic. If we can’t get information on how they’re using this data and what kind of surveillance they’re doing, I don’t see why we should accept the fact that they’re doing it.”
Lynch worries that there is too much unquestioning acceptance of these technologies by the public, without consideration of exactly how this data is collected and used. She says that predictive policing, with its claims of reducing crime, will be given something of a free pass.
“What starts to happen is people think the results that come out of that must be accurate because there’s technology involved,” Lynch says. “But what we forget is that the information that went in may have been the subject of bias, may have been based on inaccurate assumptions about people, may have been collected in certain communities more than other communities. The problem is technology legitimises somehow the problematic policing that was the origination of the data to begin with.”
Brantingham says that because PredPol only concerns itself with the spatial and temporal aspects of crime, it wouldn’t be skewed by social factors. And Romero says the LAPD has been conscientious about how it collects and uses the increasing amount of data in its storage drives.
“We’re pretty careful about what people do here,” Romero says. “I care about civil liberties and freedom. And I know that our constitution was not written to protect us from gang members and thieves and thugs; it was written to protect us from the government and overreach of the government.”
But concerns persist. Gary T Marx, professor emeritus of sociology at the Massachusetts Institute of Technology, says technology such as predictive policing creates “categorical suspicion” of people in predicted crime areas, which can lead to unnecessary questioning or excessive stopping-and-searching. And as data-driven policing expands, Marx worries that analysis and decision-making by machine will lead to what he calls “the tyranny of the algorithm”.
“The Soviet Union had remarkably little street crime when it was at the worst of its totalitarian, authoritarian controls,” Marx says. “But, my god, at what price?”