To Predict and Serve: Predictive Intelligence Analysis, Part II

Posted on 6 July 2011



In Part I of this two-part article, Sgt. Christopher Fulcher discussed the need for predictive intelligence analysis. In Part II, he gives specific examples, tips, and guidelines. If you would like to contribute non-commercial content in your area of expertise to Police Led Intelligence, please let us know.


In 2006, the Memphis Police Department launched its predictive analytics program, Crime Reduction Utilizing Statistical History, or Blue CRUSH. The department credits the program with a 31% decrease in crime and an 863% return on its initial investment.

The system began with historical data on gun crimes and has since expanded to a digital map of all incidents within the city. Incidents are analyzed over 28-day and seven-day windows to identify crime hot spots, which are reviewed around the clock and used to adjust officer deployment.
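The internals of Blue CRUSH are not public, so the following is only a rough sketch of the dual-window hot-spot idea, in Python. The incident records, grid-cell IDs, and dates are all hypothetical; a real system would first geocode each incident into a grid cell.

```python
from collections import Counter
from datetime import date, timedelta

# Hypothetical incident records: (date, grid_cell_id).
incidents = [
    (date(2011, 6, 30), "B4"), (date(2011, 7, 1), "B4"),
    (date(2011, 7, 2), "C7"), (date(2011, 7, 3), "B4"),
    (date(2011, 6, 10), "C7"), (date(2011, 6, 12), "A1"),
]

def hot_spots(incidents, as_of, window_days, top_n=3):
    """Count incidents per grid cell within a trailing time window."""
    cutoff = as_of - timedelta(days=window_days)
    counts = Counter(cell for when, cell in incidents if cutoff < when <= as_of)
    return counts.most_common(top_n)

today = date(2011, 7, 4)
print("7-day hot spots: ", hot_spots(incidents, today, 7))   # recent flare-ups
print("28-day hot spots:", hot_spots(incidents, today, 28))  # persistent areas
```

Running the two windows side by side is what lets analysts distinguish a short-lived flare-up from a persistent problem area.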

In 2010, the Chicago Police Department initiated its Citizen Law Enforcement Analysis and Reporting (CLEAR) mapping system. CLEAR started as a web-based crime mapping application for the public but has evolved into the department's own predictive analytic engine. Building on the same technologies behind CLEAR, the department's data mining now tracks every incident linked to a known gang member.

Results from the analysis allow the department to deploy resources more intelligently into affected areas.[4] In a short time, Chicago has reported a significant reduction in gang-related activity and crime.

The National Institute of Justice (NIJ) also sees the benefits of this technology, and in 2009 it awarded planning grants to seven large metropolitan police departments across the country for pilot predictive analytics programs. Each has shown promising results that could be implemented by other agencies nationwide.

We Don’t Know What We Don’t Know
No system, no matter how successful or advanced, can be considered a true solution to crime or terrorism. Predictive analytics is not meant to replace tried-and-true police techniques; instead, it builds on the essential elements of all policing strategies for the greater good.

It is not perfect and has its own shortfalls. The analysis of infrequent events is one such danger: because data mining relies on statistical data, the smaller the sample size, the more unreliable the result.

Infrequent events are especially problematic when models are in use, because they produce grossly unbalanced sample distributions.

This is especially important to keep in mind when attempting to predict terrorism-related events at the local level. Because the frequency of such incidents in any given municipality is usually low, there may not be sufficient data available, which can lead to false-positive hot spots.

Outliers (unusual subjects, events, or items) can significantly skew analytic results, especially where events are infrequent.

Suppose we analyze ten traffic stops near a local critical infrastructure site and find that three of them involved drivers of foreign descent with suspicious items in their vehicles. Would it be prudent to say that 30% of all encounters with subjects near this site involve someone suspected of intending harm to it?

Do we suddenly increase security in and around the site as a precaution?

The answer to both is no. Additional analysis would be needed, with the understanding that these three events may have been isolated.
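To make the small-sample danger concrete, here is a minimal sketch, not drawn from any of the programs above, that computes a 95% Wilson score confidence interval for the three-of-ten example in plain Python:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - margin, center + margin

# Three "suspicious" stops out of ten observed:
low, high = wilson_interval(3, 10)
print(f"point estimate: 30%, plausible range: {low:.0%} to {high:.0%}")
# -> point estimate: 30%, plausible range: 11% to 60%
```

The true rate consistent with those ten stops could plausibly be anywhere from about one in ten to three in five, which is exactly why no deployment decision should rest on a sample this small.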

Determining baseline data is critical when interpreting crime and intelligence information. Without a baseline there can be no comparison against future events or feedback from past events. Baseline data can include location, population, demographics, and many other factors.

Domain knowledge is also connected to baseline data, since baselines usually differ from one jurisdiction to the next. Failing to establish a baseline when analyzing local data leads to assumptions and uneducated guesses, neither of which results in valuable intelligence.
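As a simple illustration of comparing against a baseline, the sketch below asks how surprising a week's count is given the historical weekly average. The figures and the Poisson model are assumptions for illustration, not drawn from any agency's data.

```python
import math

def poisson_tail(observed, baseline_rate):
    """P(X >= observed) under a Poisson(baseline_rate) model."""
    p_below = sum(
        math.exp(-baseline_rate) * baseline_rate**k / math.factorial(k)
        for k in range(observed)
    )
    return 1.0 - p_below

# Hypothetical beat averaging 4 burglaries per week that sees 9 this week:
print(f"chance of >= 9 given a baseline of 4/week: {poisson_tail(9, 4.0):.1%}")
# -> about 2%, unusual enough to warrant a closer look
```

Without the baseline of four per week, the count of nine is just a number; with the baseline, the deviation becomes measurable.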

The most important thing to understand about analytics is that it is only a tool. Analytics, particularly data mining, provides a glimpse into how people, places, and events come together in threats and incidents. No single analyst could possibly see how all these atypical and imperfect bits of evidence fit together, and that pattern-finding is exactly where these processes are strongest.

Machines can assemble these intelligence puzzle pieces into a bigger picture at an unmatched rate, but they reach a threshold where analysts must take over. The analyst will always excel at applying human judgment and reasoning, which is critical to the intelligence process. “Otherwise judgment and decision never arrive, connections are never made, and red flags are never raised.”[6]

It’s all semantics
While the technology that drives data mining continues to improve, a “next generation” of analytics is gaining momentum. Semantic web technology is being used by a growing number of organizations to manage, integrate, and gain intelligence from multiple data sources.

What sets semantic technology apart is its ability to exceed the limits of other technologies and approach the automatic understanding of text. Creating a web of understood word meanings and connections brings data mining to a whole new level. What began as keyword-based search systems evolved into “second generation” social media interaction and has now become the semantic web. Imagine countless law enforcement agencies conducting searches of locations, people, and more, and seeing those connections all in one place.

Semantic web technology not only brings more power to the analytic process but also attempts to address the gaps in intelligence procedures and sharing processes that are seen throughout the country.[6]
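As a minimal sketch of that idea, the snippet below uses the open-source rdflib library; the vocabulary, subject identifiers, and site names are entirely hypothetical. Two agencies contribute facts as RDF triples, and a single SPARQL query over the combined graph surfaces a connection neither agency could see alone.

```python
from rdflib import Graph, Namespace

# Hypothetical vocabulary; real deployments would use a shared ontology.
EX = Namespace("http://example.org/le/")
g = Graph()

# Agency A's contribution:
g.add((EX.subject42, EX.stoppedNear, EX.waterTreatmentPlant))
# Agency B's contribution:
g.add((EX.subject17, EX.stoppedNear, EX.waterTreatmentPlant))
g.add((EX.subject42, EX.knownAssociateOf, EX.subject17))

# One query over the merged graph finds associates seen near the same site.
q = """
PREFIX ex: <http://example.org/le/>
SELECT ?a ?b ?site WHERE {
    ?a ex:stoppedNear ?site .
    ?b ex:stoppedNear ?site .
    ?a ex:knownAssociateOf ?b .
}
"""
for row in g.query(q):
    print(f"{row.a} and {row.b} are associates, both seen near {row.site}")
```

Because the meaning of each relationship is encoded in the data itself, a query written once works against every agency's contribution.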

Conclusion
Tactical crime analysis, risk and threat assessment, behavioral analysis of violent crime, and proactive deployment strategies are just some of the benefits predictive analytics provides. At a time when agencies all over the country are expected to accomplish the same law enforcement goals with less money, and sometimes less manpower, the benefits of predictive analytics become even more pronounced.

The technology also lends itself to data sharing, not only among agencies at the local level but also at the state, federal, and military levels.

For too long, al-Qaeda has modified its tactics to take advantage of a disjointed national intelligence system. A system that has had difficulty analyzing intelligence from anomalous sources. A system that took too much time to provide crucial answers.

Predictive analytics can change that. It can be the force multiplier that is so desperately needed to help win this war.

It can help us predict, plan, and act.

It can tell us where terrorists will strike next.


Christopher Fulcher is a Sergeant with the Vineland Police Department in Vineland, NJ. He has worked for the police department for 15 years, the last six as a supervisor and the last four assigned to the Services Division, functioning as a Chief Technology Officer. Fulcher has been involved with his department's technology and network management since 2001. Through several educational seminars and classes, he has also recently begun performing tactical, strategic, and geospatial intelligence analysis for his department. In April 2011 he graduated from the US Department of Homeland Security Intermediate Fusion Center Analyst Training (IFCAT) program.