
By Jordan Robertson – Aug 15, 2013 12:01 AM 

Think of it as big data meets “Minority Report.” (Scary stuff!)

While working as the chief privacy officer at Intelius, an online provider of background checks, Jim Adler created software that demonstrates how just a few details about a person could be used to estimate the chances of someone committing a felony. Accurately, he says.

[Photo: Jim Adler, who created the felony-prediction program. Photographer: Rex Ziak]

If that sounds like the stuff of science fiction, akin to the Tom Cruise movie in which people are arrested before their crimes happen, early forms of this kind of predictive policing are possible today because of the enormous amounts of digital data being collected and analyzed about individuals.

To test his computer program, Adler began with tens of thousands of criminal records owned by Intelius and focused only on a few details about each person, including gender, eye and skin color, the number of traffic tickets and minor offenses, and whether the individual has tattoos. Based on that data, and excluding any information about a felony conviction, he said his algorithm determined with reasonable accuracy whether a person had committed a serious crime.
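The article does not say what technique Adler used, but the setup he describes, a handful of demographic and minor-offense attributes predicting a hidden felony label, is an ordinary supervised classification problem. Below is a minimal sketch of that setup in Python with scikit-learn; the file name, column names, and the choice of logistic regression are all assumptions for illustration, not Adler's actual method.

```python
# Hypothetical sketch of the kind of classifier the article describes:
# a few demographic and minor-offense features predicting a binary
# "has felony record" label. File and column names are invented.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.model_selection import train_test_split

records = pd.read_csv("court_records.csv")   # hypothetical export of case records

categorical = ["gender", "eye_color", "skin_color", "has_tattoos"]
numeric = ["traffic_tickets", "misdemeanors"]

X = records[categorical + numeric]
y = records["has_felony"]                    # the label is excluded from the features

model = Pipeline([
    ("encode", ColumnTransformer(
        [("cat", OneHotEncoder(handle_unknown="ignore"), categorical)],
        remainder="passthrough")),           # numeric counts pass through unchanged
    ("clf", LogisticRegression(max_iter=1000)),
])

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)
model.fit(X_train, y_train)
print(model.score(X_test, y_test))           # rough accuracy on held-out records
```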

While Adler points out there was “sample bias” in the data and that his program is “not ready for prime time,” he said a bigger sample with more historical information about individuals could be used to create a felon predictor — software that gives the statistical likelihood of someone committing a serious crime in the future. Scores could even be assigned to individuals.

Adler, who has testified before Congress and the Federal Trade Commission on big data and privacy issues, created the program to show both the potential benefits of using big data to stop trouble before it happens, as well as the possible dangers of going too far with using predictive technologies.

“It’s important that geeks and suits and wonks get together and talk about these things,” said Adler, who is now a vice president at Metanautix, a data analytics startup. “Because geeks like me can do stuff like this, we can make stuff work – it’s not our job to figure out if it’s right or not. We often don’t know.”

This type of predictive policing is already causing alarm for some civil libertarians.

“When we start using data to make decisions that imprison people and execute people and impact their freedom, that is a reason to be enormously careful,” said Jules Polonetsky, executive director of the Future of Privacy Forum, a Washington-based group. “Every individual has the free will to not be a criminal, despite what the statistics said yesterday.”

Abuse of big data by government is a growing concern as law enforcement and intelligence agencies amass more personal information from e-mail providers, social networks and financial institutions. Controversial investigative tactics such as New York City’s stop-and-frisk program, which a judge ruled this week unlawfully targets minorities, along with the National Security Agency’s vast electronic spying are fueling fears that the expanding dossiers of personal data are being misused to profile innocent people.

Still, some law enforcement agencies say they’re finding success with predictive software. Los Angeles has recorded declines in property crimes, while Memphis, Tennessee, has seen a drop in robberies, burglaries and rapes with the help of these programs, which analyze broad patterns to help police identify hot spots for trouble.

Adler’s program goes deeper, he says, by pinpointing specific people. He does this by drawing upon a source of data most researchers don’t have access to — Intelius’s trove of records on 630 million criminal cases and 40 million defendants in the U.S.

To create his software, Adler examined court records dating back to the early 1980s of everyone who had brushes with the law in Kentucky and whose information was in Intelius’s database. The dataset, which he chose because it was small and easier to work with, included people who didn’t have felonies but did have traffic tickets and misdemeanors.

Certain features correlated highly with having a felony record: being male, having hazel eyes, minor offenses beyond traffic tickets, and tattoos. Another was being light-skinned. Eighty-nine percent of Kentucky’s residents and 74 percent of its inmates are white. Adler wanted to see how many felons he could identify if the fact of the felony record was hidden.
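If such a model were fit with something like the hypothetical pipeline sketched above, the relative influence of each feature could be read off the fitted coefficients. Again, this is a sketch under those assumptions, not Adler's analysis.

```python
# Inspect which (hypothetical) features the fitted model weights most heavily.
feature_names = model.named_steps["encode"].get_feature_names_out()
coefs = model.named_steps["clf"].coef_[0]

for name, weight in sorted(zip(feature_names, coefs), key=lambda p: -abs(p[1])):
    print(f"{name:30s} {weight:+.3f}")
```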

The accuracy of the software depends on the number of false positives one is willing to tolerate, a range that Adler calls the “anarchy to tyranny” spectrum. At its most aggressive, his program can correctly identify all 51,246 felons while misidentifying 2,220 non-felons, numbers an iron-fisted ruler could live with. At a more lenient setting, it can correctly identify 37,842 felons while misidentifying 152 non-felons — a smaller number of false positives, but still far from perfect.
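That "anarchy to tyranny" range corresponds to sweeping the decision threshold on a classifier's predicted probability: a low threshold flags nearly every felon but misidentifies more non-felons, while a high threshold does the reverse. Here is a sketch of that sweep, reusing the hypothetical model and held-out data from above.

```python
# Sweep the decision threshold to trace the "anarchy to tyranny" trade-off:
# each threshold yields a count of correctly flagged felons (true positives)
# and misidentified non-felons (false positives).
import numpy as np

probs = model.predict_proba(X_test)[:, 1]    # predicted probability of a felony record
labels = y_test.to_numpy()

for threshold in np.linspace(0.1, 0.9, 9):
    flagged = probs >= threshold
    true_pos = int(np.sum(flagged & (labels == 1)))
    false_pos = int(np.sum(flagged & (labels == 0)))
    print(f"threshold={threshold:.1f}  felons caught={true_pos}  non-felons flagged={false_pos}")
```

Plotting true positives against false positives across all thresholds is the standard ROC view of the same trade-off.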

“If we can do this smarter and faster and the right way, then certainly, especially when it revolves around gun violence and gang violence, it could be a useful tool,” said Bruce Ferrell, a retired homicide and gang investigator with the Omaha Police Department and now president of the National Alliance of Gang Investigators’ Associations.

The real test of Adler’s algorithm will be when it’s used with other states’ data and modified to include their defendants’ information. He doesn’t plan to do that since he no longer has access to Intelius’s data.

The use of physical characteristics such as hair, eye and skin color to predict future crimes would raise “giant red privacy flags” since they are a proxy for race and could reinforce discriminatory practices in hiring, lending or law enforcement, said Chi Chi Wu, staff attorney at the National Consumer Law Center.

Read the full article at Bloomberg.com.


Blog Publisher / Head of Data Science Search

Founder & Head of Data Science Search at Starbridge Partners, LLC.