Predictive analytics and event processing: The future of BPM?

They may change how business decisions are made, and together they can enhance applications and vertical niches. But success hinges on linking them to strategic goals—and finding people with the right skills.

Seeing into the future goes back a long way. Once it was done with omens and portents. Now it’s done with streaming databases, neural network algorithms and MapReduce data shuffles. What hasn’t changed is that obtaining reliable and “actionable” predictions remains challenging—especially for businesses.

Wall Street has long been at the technology forefront. Outside of capital markets, the pace may be slower, but event processing and predictive analytics technologies—some people refer to them as strategies—are being tried in other industries as different types of businesses seek new competitive advantages. Both approaches, along with related “big data” technologies such as Hadoop, may be poised to change how business decisions are made.

Event processing and predictive analytics are “pretty complementary strategies,” according to Neil Ward-Dutton, research director of MWD Advisors, based in the United Kingdom. Beyond capital markets, these strategies can be employed in tandem in online sales and marketing, call centers, telecommunications and other industries, he said. The explosion in Web data is a major driver of greater use of event processing and predictive analytics.

These now-sizzling technologies can be both strange and familiar. Event processing is a distinct branch of computing that tracks “streams” of occurrences and looks for underlying patterns. The complex version—aptly named complex event processing, or CEP—combines events from multiple sources to detect more intricate patterns. The prime example has long been the trading system that spots valuable stock opportunities as it responds to unexpected market conditions.
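The idea of combining streams from multiple sources can be sketched in a few lines. The following toy detector correlates two illustrative streams—price ticks and volume ticks—and flags a “complex event” when a sharp price drop is followed by a volume spike; the thresholds, window sizes and event shapes are assumptions for illustration, not taken from any particular CEP product.

```python
from collections import deque

WINDOW = 5  # number of recent price-drop events to retain

def detect_complex_events(events):
    """events: time-ordered (stream, timestamp, value) tuples."""
    recent_drops = deque(maxlen=WINDOW)
    alerts = []
    for stream, ts, value in events:
        if stream == "price" and value <= -0.05:     # a 5%-or-worse drop
            recent_drops.append(ts)
        elif stream == "volume" and value >= 2.0:    # volume at 2x normal
            # Correlate across streams: a drop within the last 3 ticks?
            if any(ts - drop_ts <= 3 for drop_ts in recent_drops):
                alerts.append(ts)
    return alerts

events = [("price", 1, -0.06), ("volume", 2, 2.5),
          ("price", 4, -0.01), ("volume", 9, 3.0)]
print(detect_complex_events(events))  # [2]
```

Real CEP engines express such correlations declaratively and run them continuously over unbounded streams, but the map from individual events to a higher-level “complex” event is the same.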

Meanwhile, predictive analytics is a methodology that applies specialized algorithms to data sets to create a probability-based predictive model of anticipated activity, not unlike the actuarial tables used to calculate risk and compute auto insurance rates for different types of people.
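The actuarial-table analogy can be made concrete with a minimal sketch: estimate an outcome rate per segment from historical records, then use those rates to score new cases. The segment names and data below are invented for illustration.

```python
from collections import defaultdict

def build_rate_table(history):
    """history: list of (segment, had_outcome) pairs."""
    counts = defaultdict(lambda: [0, 0])  # segment -> [outcomes, total]
    for segment, had_outcome in history:
        counts[segment][0] += int(had_outcome)
        counts[segment][1] += 1
    return {seg: outcomes / total
            for seg, (outcomes, total) in counts.items()}

history = [
    ("driver_under_25", True), ("driver_under_25", True),
    ("driver_under_25", False), ("driver_under_25", False),
    ("driver_over_25", False), ("driver_over_25", False),
    ("driver_over_25", False), ("driver_over_25", True),
]
rates = build_rate_table(history)
print(rates["driver_under_25"])  # 0.5
print(rates["driver_over_25"])   # 0.25
```

Production predictive models replace the simple frequency count with regression, decision trees or neural networks, but the shape is the same: historical data in, a probability-scoring function out.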

Events: Our digital footprints, ourselves

Events have always been there. They just weren’t digital. “People didn’t leave the digital footprints they now do,” Ward-Dutton said. “Now pretty much everything leaves a footprint.”

The addition of predictive analytics opens new possibilities for the processing of digital events, he said. But it’s important to connect both event processing and predictive analytics to business objectives.  

Tying event technology to operations and analysis results to application integration efforts and useful business activity is important to practitioners of business process management (BPM). Event processing technology is driven largely by a desire for better operations, as indicated by the most recent ebizQ survey of business and IT professionals. A majority of respondents (69%) cited “operational efficiency” as the main goal of their event processing technology efforts. But analytics has a foothold, too: Twenty-eight percent of respondents cited “support for business intelligence/analytics programs” as a primary driver of event processing work. Meanwhile, in the same survey, “integration with other application/systems” (71%) leads the list of challenges to decision management, with “tying analysis to action” placing second (44%) among such challenges.

What do combined event processing and predictive analytics look like in practice? The applications and vertical niches can be diverse:

  • Manufacturers predicting factory machine failures and preordering replacement parts.
  • Software employed to speed up medical device clinical trials, shortening time to market.
  • Programs predicting the likelihood that a hospitalized infant may contract blood poisoning.
  • A financial services company watching website logs and correlating activity to dangerous security events.
  • An automaker setting up highly automated fraud forecasting filters that better select leasing candidates for scrutiny.
  • A travel site that steers users of Apple products to different promotional offers based on the established likelihood that they are willing to spend more on vacations.

In his technology predictions for this year, Ward-Dutton emphasized the importance of the trend for increased aggregation of core BPM capabilities with complementary platform elements “including event processing, business rules and real-time/predictive analytics.”

Understanding events and triggers and deriving insight from combining those event streams with models of, for example, customer segmentations, can be a powerful tool, he said.

CEP: Slow going outside of capital markets

As organizations of all sizes deal with more Web-created data, CEP will come under consideration. Yet to date, CEP’s major base remains banking and stock trading.

CEP took off in the capital market sector for activities such as algorithmic trading, but it has had some success stories beyond the original constituency, said Philip Howard, research director for Bloor in the United Kingdom. He sees a handful of success stories connected by common themes.

First, he said, a number of companies have invested in CEP to support BPM and SOA.

Second, security information and event management has been a specialized operations niche for CEP. And finally, there is CEP matched with real-time analytics.

“My basic view is that event processing is very valuable for predictive analytics, but the market hasn’t taken off yet,” Howard said.

Ward-Dutton adds supply-chain optimization and fleet-logistics optimization to the list of CEP success stories. Where CEP coupled with predictive analytics has shown value is in telecommunications, he said. There, the churn rate—that is, the level of customer attrition—is a crucial issue.

“For a mobile cell company, say, if you can take intelligence and combine it with an understanding of event streams, then you can detect, for example, that a customer has a string of four dropped calls in a week,” he said. “You may know from that customer’s attributes that they are likely to drop their service as a result.”

But with the right software in place, “you can take an action, make an offer,” Ward-Dutton said. “This is where SOA, BPM and events get linked together, too.”
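The dropped-call scenario Ward-Dutton describes can be sketched as a simple stream watcher: accumulate dropped-call events per customer in a sliding one-week window and trigger a retention action at the fourth drop. The threshold, window and event shape are assumptions for illustration.

```python
from collections import defaultdict, deque

WEEK = 7 * 24 * 3600   # sliding window, in seconds
DROP_THRESHOLD = 4     # dropped calls that trigger an action

def churn_watch(events):
    """events: time-ordered (timestamp, customer_id, status) tuples."""
    drops = defaultdict(deque)  # customer -> timestamps of dropped calls
    offers = []
    for ts, customer, status in events:
        if status != "dropped":
            continue
        window = drops[customer]
        window.append(ts)
        while window and ts - window[0] > WEEK:
            window.popleft()                 # expire old drops
        if len(window) >= DROP_THRESHOLD:
            offers.append((ts, customer))    # trigger a retention offer
            window.clear()                   # don't re-fire on the same run
    return offers

events = [(0, "c1", "dropped"), (1000, "c1", "dropped"),
          (2000, "c1", "completed"), (3000, "c1", "dropped"),
          (4000, "c1", "dropped")]
print(churn_watch(events))  # [(4000, 'c1')]
```

In the integrated picture Ward-Dutton describes, the action taken here would itself be a BPM-managed process—make the offer, route it, track the response—which is where events, SOA and BPM link up.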

Hadoop, PMML take the stage

Both complex event processing and predictive analytics, as practiced today, require specialized skills. That can greatly add to the expense of implementation and put additional pressure on software architects to get it right.

At the same time, emerging standards show some promise for bringing a bit more commonality to these styles of development. One of these is the Predictive Model Markup Language (PMML), which provides a portable, XML-based notation for describing predictive models and pattern filters.
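Because PMML is XML, a model exported by one tool can in principle be scored by another. Below is a minimal, hand-written fragment in the spirit of PMML’s RegressionTable and NumericPredictor elements (header, namespace and schema details are omitted for brevity), plus a tiny scorer for it; this is a sketch of the portability idea, not a conformant PMML consumer.

```python
import xml.etree.ElementTree as ET

PMML_DOC = """
<PMML version="4.2">
  <RegressionModel functionName="regression">
    <RegressionTable intercept="1.5">
      <NumericPredictor name="age" coefficient="0.2"/>
      <NumericPredictor name="income" coefficient="0.001"/>
    </RegressionTable>
  </RegressionModel>
</PMML>
"""

def score(pmml_text, inputs):
    """Evaluate a linear regression table against a dict of inputs."""
    table = ET.fromstring(pmml_text).find(".//RegressionTable")
    result = float(table.get("intercept"))
    for pred in table.findall("NumericPredictor"):
        result += float(pred.get("coefficient")) * inputs[pred.get("name")]
    return result

print(score(PMML_DOC, {"age": 30, "income": 50000}))  # 57.5
```

The point of the standard is exactly this separation: the statistics tool that trains the model and the operational system that scores it need only agree on the document, not on each other’s internals.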

A more widely discussed technology is the Apache Hadoop MapReduce software framework. This Java-based framework, built atop a distributed file system, is naturally tailored to run on clusters of commodity machines. Some might even call Hadoop the poster child for big data.

“Hadoop is really good at embarrassingly parallel problems with massive amounts of data,” said Douglas Moore, principal consultant and architect for Think Big Analytics, a professional services firm based in Mountain View, Calif.
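The map/shuffle/reduce shape Hadoop runs across a cluster can be shown in miniature, single-process, on the framework’s canonical example, word count. Because each map call and each reduce call is independent, the work parallelizes trivially—the “embarrassingly parallel” property Moore refers to.

```python
from collections import defaultdict

def map_phase(lines):
    """Emit a (word, 1) pair for every word in the input."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    """Group emitted values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Combine each key's values into a final result."""
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big plans", "big data"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["big"])   # 3
print(counts["data"])  # 2
```

In real Hadoop the mappers and reducers are distributed across machines and the shuffle moves data over the network, but the programmer writes essentially these two functions.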

A central premise of Hadoop is a reduced cost of software and development, which will appeal to line-of-business leaders and could enable wholly new applications. The framework has already appeared as an adjunct to some CEP engines, and predictive analytics has shown up on Hadoop clusters as well.

Naturally, there are disadvantages too.

First, it’s still new and raw, which forces developers to build their own tools for many requirements. Perhaps most important: its newness draws programmers back into low-level data integration work in ways not seen for many years. Some will find the programming model restrictive.

It can be difficult for developers well versed in SQL to jump to Hadoop. It can be an even more difficult leap for relational database administrators.

Of course, it’s still early.

“The majority are still testing out Hadoop. Some people are doing it for real,” Bloor’s Howard said. “What worries me is all the emphasis is on Hadoop and people not looking at the alternatives. It’s almost like people are all looking in the Hadoop direction and putting blinders on.”

About the Author

Jack Vaughan, co-editor of Business Agility Insights, is editor in chief. Email him at [email protected]
