
Thursday, January 17, 2013

Business Signal Optimization

@DouglasMerrill of @ZestFinance (via @dhinchcliffe) tells us A Practical Approach to Reading Signals in Data (HBR Blogs November 2012)

If we think of data in tabular form, there are two obvious ways of increasing the size of the table - increasing the number of rows (a greater volume of cases) or increasing the number of columns (a greater volume of signals). The latter can involve either a greater variety of variables, as Merrill advocates, or a higher frequency of the same variable. I have talked in the past about the impact of increased granularity on Big Data.
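To make the rows-versus-columns distinction concrete, here is a toy sketch (my own example, with invented field names and values) using pandas:

```python
import pandas as pd

# A small table of cases (rows) and signals (columns).
cases = pd.DataFrame({
    "income": [32000, 41000, 27000],
    "missed_payments": [0, 2, 1],
})

# More rows: a greater volume of cases.
more_cases = pd.concat(
    [cases, pd.DataFrame({"income": [55000], "missed_payments": [0]})],
    ignore_index=True,
)

# More columns: a greater volume of signals per case -
# either a new variable, or the same variable at higher frequency.
more_cases["time_on_application_secs"] = [95, 310, 44, 120]   # new variable
more_cases["missed_payments_last_90d"] = [0, 1, 1, 0]         # higher-frequency slice

print(more_cases.shape)   # (4, 4): the table has grown in both directions
```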

As I understand it, Merrill's company sells Big Data solutions for credit underwriting, and its algorithms use thousands of different indicators to calculate risk.

The first question I always have about such sophisticated decision-support technologies is what the feedback and monitoring loop looks like. If the decision is fully automated, then it would be good to have some mechanism for monitoring the accuracy of the algorithm's predictions. The difficulty here is that there is usually no experimental control, so there is no direct way of learning whether the algorithm is being over-cautious. I call this one-sided learning.
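Here is a minimal sketch of why the learning is one-sided (purely illustrative: the scoring function, threshold and field names are all invented). Outcomes are only ever observed for the cases the algorithm accepts, so there is no feedback at all on the cases it rejects.

```python
import random

random.seed(1)

def predicted_risk(applicant):
    """Stand-in for the scoring algorithm: returns an estimated default probability."""
    return applicant["risk_signal"]

# 1000 hypothetical applicants, each with a single underlying risk signal.
applicants = [{"id": i, "risk_signal": random.random()} for i in range(1000)]

observed = []     # feedback we can actually collect: outcomes of accepted cases
unobserved = []   # rejected cases: no outcome is ever recorded

for a in applicants:
    if predicted_risk(a) < 0.4:                            # accept only low predicted risk
        defaulted = random.random() < a["risk_signal"]     # outcome revealed later
        observed.append(defaulted)
    else:
        unobserved.append(a["id"])   # we never learn whether this rejection was over-cautious

default_rate = sum(observed) / len(observed)
print(f"accepted: {len(observed)}, default rate among accepted: {default_rate:.1%}")
print(f"rejected, with no feedback at all: {len(unobserved)}")
```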

Where the decision involves some human intervention, this gives us some further things to think about in evaluating the effectiveness of the decision-support. What are the statistical patterns of human intervention, and how do these relate to the way the decision-support software presents its recommendations?

Suppose that statistical analysis shows that the humans are basing their decisions on a much smaller subset of indicators, and that much of the data being presented to the human decision-makers is being systematically ignored. This could mean either that the software is too complicated (over-engineered) or that the humans are too simple-minded (under-trained). I have asked many CIOs whether they carry out this kind of statistical analysis, but most of them seem to think their responsibility for information management ends once they have provided the users with the requested information or service; how that information or service is then used is not their problem.
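For what it's worth, here is a rough sketch of what such an analysis might look like (synthetic data and an arbitrary coefficient threshold, not anyone's actual practice): fit a simple model of the human decisions against all the indicators displayed, and see which indicators carry essentially no weight.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# 500 past decisions; 20 indicators were displayed to the decision-makers.
X = rng.normal(size=(500, 20))

# Assumption for the sketch: the humans really only attended to indicators 0 and 1.
decisions = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=500) > 0).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, decisions)

# Indicators whose fitted coefficients are near zero apparently had no influence:
# either the display is over-engineered or the users are under-trained.
ignored = [i for i, w in enumerate(model.coef_[0]) if abs(w) < 0.2]
print(f"indicators displayed: {X.shape[1]}, apparently ignored: {len(ignored)}")
```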

Meanwhile, the users may well have alternative sources of information, such as social media. One of the challenges Dion Hinchcliffe raises is how these richer sources of information can be integrated with the tabular data on which the traditional decision-support tools are based. I think this is what Dion means by "closing the clue gap".




Dion Hinchcliffe, The enterprise opportunity of Big Data: Closing the "clue gap" (ZDNet August 2011)

Dion Hinchcliffe, How social data is changing the way we do business (ZDNet Nov 2012)

Douglas Merrill, A Practical Approach to Reading Signals in Data (HBR Blogs November 2012)





Places are still available on my forthcoming workshops Business Awareness (Jan 28), Business Architecture (Jan 29-31), Organizational Intelligence (Feb 1).


Monday, March 07, 2011

TIBCO platform for organizational intelligence

By adding tibbr to its established software portfolio, TIBCO has now extended its range of organizational intelligence technologies. Last week I spoke with Stefan Farestam of TIBCO to discuss the present and future prospects for TIBCO customers linking these technologies together in interesting ways.

We talked about three main technology areas: Complex Event Processing (CEP), Business Process Management (BPM) and Enterprise 2.0. For TIBCO at least, these technologies are at different stages of adoption and maturity. TIBCO's CEP and BPM tools have been around for a while, and there is a fairly decent body of experience using these tools to solve business problems. Although the first wave of deployment typically uses each tool in a relatively isolated fashion, Stefan believes these technologies are slowly coming together, as customers start to combine CEP and BPM to solve more complex business problems.

Much of the experience with CEP has been in tracking real-time operations. For example, telecommunications companies such as Vodafone can use complex event processing to monitor and control service disruptions. This is a critical business concern for these companies, as service disruptions have a strong influence on customer satisfaction and churn. CEP is also used for autodetecting various kinds of process anomalies, from manufacturing defects to fraud.
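As a rough illustration of the kind of rule a CEP engine evaluates (plain Python, not TIBCO BusinessEvents syntax, with an invented threshold and window): raise an alert when dropped-call events for a cell exceed a threshold within a sliding time window.

```python
from collections import deque, defaultdict

WINDOW_SECS = 60   # assumed window length, purely for illustration
THRESHOLD = 5      # assumed alert threshold

recent = defaultdict(deque)   # cell_id -> timestamps of recent dropped-call events

def on_dropped_call(cell_id, timestamp):
    """Process one event; alert if the sliding-window count crosses the threshold."""
    window = recent[cell_id]
    window.append(timestamp)
    while window and window[0] < timestamp - WINDOW_SECS:
        window.popleft()                      # expire events outside the window
    if len(window) >= THRESHOLD:
        print(f"ALERT: possible service disruption in cell {cell_id}")

# Sample stream of (cell, time) events
for cell, t in [("A", 1), ("A", 5), ("A", 12), ("A", 20), ("A", 31), ("B", 40)]:
    on_dropped_call(cell, t)
```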

One of the interesting things about Business Process Management is that it operates at several different tempi, with different feedback loops.
  • A modelling and discovery tempo, in which the essential and variable elements of the process are worked out. Oftentimes, full discovery of a complex process involves a degree of trial and error.
  • An optimization and fine-tuning tempo, using business intelligence and analytics and simulation tools to refine decisions and actions, and improve business outcomes.
  • An execution tempo, which applies (and possibly customizes) the process to specific cases.

The events detected by CEP can then be passed into the BPM arena, where they are used to trigger various workflows and manual processes. This is one of the ways in which CEP and BPM can be integrated.
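Here is a minimal sketch of that hand-off (invented event and workflow names, not an actual TIBCO API): the CEP side publishes a detected event, and the BPM side reacts by starting the appropriate workflow.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class DetectedEvent:
    kind: str
    payload: dict

# Minimal "BPM" side: a registry mapping event kinds to workflow starters.
workflow_triggers: dict[str, Callable[[DetectedEvent], None]] = {}

def on_event(kind: str):
    def register(fn):
        workflow_triggers[kind] = fn
        return fn
    return register

@on_event("suspected_fraud")
def start_fraud_review(event: DetectedEvent):
    # In a real BPM suite this would create a case and route manual tasks.
    print(f"Workflow started: fraud review for account {event.payload['account']}")

# Minimal "CEP" side: when a rule fires, the event is handed to the BPM layer.
def publish(event: DetectedEvent):
    handler = workflow_triggers.get(event.kind)
    if handler:
        handler(event)

publish(DetectedEvent("suspected_fraud", {"account": "ACME-42"}))
```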

Social software and Enterprise 2.0 can also operate at different tempi - from a rapid and goal-directed navigation of the social network within the organization to a free-ranging and unplanned exploration of business opportunities and threats. TIBCO's new product tibbr is organized around topics, allowing and encouraging people to develop and share clusters of ideas and knowledge and experience.


Curiously, the first people inside TIBCO to start using tibbr were the finance people, who used it among other things to help coordinate the flurry of activity at quarter end. (Perhaps it helped that the finance people already shared a common language and a predefined set of topics and concerns.) However, the internal use of tibbr within TIBCO has now spread to most other parts of the organization.

The organization of Enterprise 2.0 around topics appears to provide one possible way of linking with CEP and BPM. A particularly difficult or puzzling event (for example, a recurrent manufacturing problem) can become a topic for open discussion (involving many different kinds of knowledge), leading to a coordinated response. The discussion is then distilled into a resource for solving similar problems in future.

TIBCO talks a great deal about "contextually relevant information", and this provides a common theme across all of these technologies. It helps to think about the different tempi here. In the short term, what counts as "contextually relevant" is preset, enabling critical business processes and automatic controls to be operated efficiently and effectively. In the longer term, we expect a range of feedback loops capable of extending and refining what counts as "contextually relevant".

  • On the one hand, weak signals can be detected and incorporated into routine business processes. Wide-ranging discussion via Enterprise 2.0 can help identify such weak signals.
  • On the other hand, statistical analysis of decisions can determine how much of the available information is actually being used. Where a particular item of information appears to have no influence on business decisions, its contextual relevance might need to be reassessed.
The adoption of Enterprise 2.0 within the enterprise raises different challenges to the adoption of CEP and BPM. One reason for this is the tricky question of critical mass. Whereas it is possible to conduct a meaningful pilot for CEP or BPM in a small part of the business, it is much harder to get a sense of how Enterprise 2.0 tools will work across the enterprise from a small pilot, and much harder to see concrete return on investment. However, many of TIBCO's customers already have an objective to implement some form of Enterprise 2.0, and the demand is simply to satisfy this objective in the most effective way.


My eBook on Organizational Intelligence is now available from LeanPub. leanpub.com/orgintelligence

Related posts: Two-Second Advantage (May 2010), Embedding Intelligence into the Business Process (November 2010)

Wednesday, August 06, 2008

Secret Life of New Tools

In his post, The Secret Life of CEP, Tim Bass is unimpressed by the claim that CEP vendors have dozens of unnamed customers doing unspecified things.

Twelve paying customers may not be much to boast about. For most new software tools, a typical breakdown would be as follows. Three of the companies are doing small pilot studies, three of them have finished the pilot and don't know what to do next, two of them have started a high risk project with little chance of delivering any time soon, three of them have been distracted by some other shiny new tool, and the one remaining company might possibly be using your tool on serious and achievable projects. Is CEP doing better than this? I think we should be told.

Sunday, June 15, 2008

Programming Languages

Does event processing need a new programming language? And what are the prospects of programmers adopting a new language?

The answer to both questions seems to hinge on the word "new". If we add a few constructs to an existing programming language, does that count as new?

Louis Lovas (Progress Apama) and Mark Tsimelzon (Coral8) agree that a programming language for event processing needs to be familiar, easily recognizable, explained in five minutes. In other words, new but not very new.

Louis Lovas (Progress Apama): Successful languages - show me the code please
Mark Tsimelzon (Coral8): What Makes a Programming Language Successful?
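To make this concrete, here is one way (my own sketch, not Apama's or Coral8's actual syntax) that event-processing constructs might be layered onto an already familiar language, so that a stream query reads like ordinary method-chaining code:

```python
class Stream:
    """A tiny, familiar-looking fluent interface over a stream of events."""
    def __init__(self, events):
        self.events = list(events)

    def where(self, predicate):
        """Filter events, SQL-style."""
        return Stream(e for e in self.events if predicate(e))

    def window(self, size):
        """Yield successive sliding windows of the given size."""
        return [self.events[i:i + size] for i in range(len(self.events) - size + 1)]

trades = Stream([
    {"symbol": "XYZ", "price": 101.2},
    {"symbol": "XYZ", "price": 101.9},
    {"symbol": "ABC", "price": 55.0},
    {"symbol": "XYZ", "price": 103.5},
])

# Reads like ordinary code, but expresses an event-processing idea:
# "in every window of 3 XYZ trades, flag a rise of more than 2%".
for w in trades.where(lambda e: e["symbol"] == "XYZ").window(3):
    if w[-1]["price"] > w[0]["price"] * 1.02:
        print("price spike:", [e["price"] for e in w])
```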

As individuals, programmers may be attracted by new and innovative languages, but there is a collective conservatism. There are always many interesting niche languages, sometimes with a devoted fanbase but with practically no chance of achieving critical mass; very few languages ever achieve widespread adoption and popularity. Louis attributes the current dominance of Java to a process of natural selection, but of course this is a selection based on the fact that popularity breeds popularity and does not indicate any intrinsic superiority of Java over the thousands of unheard-of alternatives.

And adoption is only half the story. Programmers may indeed be unwilling to invest more than five minutes in checking out a new programming language, and this unwillingness possibly filters out a lot of potentially valuable innovations. But what if new types of software application require new types of thinking? If programmers are only using language constructs that they are comfortable with, this suggests that they are also only using familiar styles of thinking, and may fail to pay attention to some important new aspects and issues. Consequently some errors may creep into their programs.

Obviously brilliant programmers never make any errors, but programming languages must be designed for not-so-brilliant programmers as well. So we must ask what patterns of errors are associated with using a particular language in a particular context, and what we can do to reduce the error rate.

Some of the advocates of niche programming languages get rather smug at this point. They argue that although their favourite programming language may be a bit more difficult to learn if you are only familiar with C or Java, it is far less error-prone. But that argument is pretty academic if no one is using or interoperating with these languages.

There is therefore sometimes a trade-off between adoption and correct use. A language that is easy to adopt may also be easy to use badly, whereas one that is more fail-safe in use may initially be more difficult to adopt. Language vendors undoubtedly care a little about correct use, but they obviously care a great deal more about adoption and adoptability; the users of these languages should care equally about both.

Update

In a post entitled On the Right Event Processing Language, Opher Etzion adds: "abstractions that enable to think naturally about a certain type of functions is more important than familiar syntax". Actually I'm not convinced that programmers think naturally about anything - that's what makes them programmers in the first place - but apart from that I think he's got a point.

Thursday, April 03, 2008

Poorly Paid Job?

Marco Seiriö (RuleCore) has found an advertisement for a Poorly paid Tibco CEP job. (At least Marco thinks it's poorly paid.)

I'd have thought there was an inverse relationship between the quality and ease-of-use of software and the amount you have to pay someone to get it to work properly.
  • Hard-to-use software ~ large numbers of expensive consultants.
  • Easy-to-use software ~ a few cheap contractors.
Okay Marco, here's a friendly question for you. How much is the going rate for a consultant to work with your software?