Thursday, November 27, 2014
Misunderstanding CRM and Big Data
Peter cites an Ovum survey showing that Customer Satisfaction is now the number one concern of management, and argues for what Ovum calls Intelligent CRM. (CA announced something under this label back in October 2000. Other products are available.)
Mark says that CRM and Big Data are widely misunderstood, which is certainly true. My own opinion is the first misunderstanding is to think CRM is about managing THE relationship with THE customer, and I completely agree with Clayton Christensen (via Sloan) that this isn't enough. What we really need to focus on is the job the customers are trying to get done when they use your product or service.
Who is good at CRM? Peter cites an example of a professor of marketing who got a personalized service at a certain chain of hotels and has been talking about it ever since. (That's a pretty good coup for the hotel, if we take the story at face value.) Mark cites the video game market, where both the console manufacturers and the large game publishers are able to collect and analyse huge quantities of consumer behaviour.
Is CRM with Big Data merely a new way of taking advantage of customers? Although most people seem oblivious to the privacy and trust risks, the Wall Street Journal this week suggested that the consumer is becoming more savvy and less susceptible to exploitative loyalty schemes and promotions. This might help to explain why Tesco, once a master of the science of retail, now seems to be faltering.
If there is a sustainable business model based on CRM and Big Data, it must surely involve using these technologies to engage intelligently, authentically and ethically with customers, rather than imagining that these technologies can provide a quick fix for stupid organizations to take advantage of compliant customers.
Related Blogs
Customer Orientation (May 2009)
The Science of Retail (April 2012)
Other Articles
Martha Mangelsdorf, Understanding your customer isn't enough (Sloan Review May 2009)
Shelly Banjo and Sara Germano, The End of the Impulse Shopper (Wall Street Journal 25 November 2014)
Intelligent CRM
AI-CRM "An intelligent CRM system with atuo-learning-tunning engine (sic), Aichain offers the most widely used open source business intelligence software in the world." Last updated March 2013
CA rolling out customer relationship management software (ComputerWorld October 2000)
IBA Group "maintains its focus on IT outsourcing that has become a strategy for many organizations seeking to improve their business processes"
Friday, January 25, 2013
Analytics for Adults
"Experienced analysts are somewhat like children. And that is a good thing. ... They too have curiosity and imagination. ... Analysts should be allowed their play time to explore."
In contrast, he refers to the managers and employee teams who rely on analytics as "adults". He tells a story of how he resisted adult intervention when he was a child.
"When adults poked their head in to see what I was doing, although their observations and suggestions were well intended, they confused me. I preferred to make up my own methods."
His conclusion
"I think children should have restrictions on when parents or adults can engage on what children are analyzing."
However
"Eventually managers and employee teams, the “adults” in this scenario, should get involved with seeing and understanding what that the analyst is investigating."
But many "adults" might think that the whole point of analytics was to support the managers and employee teams, not just to have fun at the organization's expense. @haydens30 asks us to think of analytics as a valuable business resource, and calls upon organizations to communicate and use analytics more effectively.
The trap here is to regard analytics as something produced by one kind of person (children or wizards) and consumed by another kind of person (adults or muggles). I think what is more useful is to think of analytics as part of a closed intelligence loop, where analytics tools and techniques are used collaboratively to support an integrated sense-making and decision-making and learning process, which may benefit from the insights and skills and collective intelligence of a diverse community of workers. Not just the exclusive preserve of a bunch of irresponsible kids.
Of course, play is an important element of intelligence and innovation. Marshall Sponder regrets a widespread corporate unwillingness to invest in creativity.
"No one wants to pay for experimentation, for the most part, they only want to pay for results, and they want to bill it by the hour, or half hour. ... Since most corporations aren't getting much out of their analytics, why would they want to pay for it (when they can get almost the same thing for free) and why would they want to pay anyone to play and experiment, in order to learn and come up with something creative, and perhaps, unforseen or expected? They don't."
But the solution to this is not to build a protective fence around the analytics guys, but to give everyone in the organization (yes, and customers as well) the opportunity to ask off-the-wall questions and ponder the answers.
Gary Cokins Analytics admittance. Adults unaccompanied by minors (SAS, July 2012)
Marshall Sponder, Playful Social Media and Web Analytics making its way in the agency world (Social Media Today, Oct 2009)
Hayden Sutherland, Are you wasting your digital analytics? (Jan 2013)
Related posts: From Networked BI to Collaborative BI (April 2016), DataOps - Organizing the Data Value Chain (October 2019)
Wednesday, January 23, 2013
Opening the Black Box: Analytics and Admissions
When kids apply to university in the USA, it is becoming increasingly common to include a link with supplementary information about the applicant - for example a project tumblr, a YouTube video, a Flickr album of artwork. The links are typically coded to track visitors, giving the applicant some idea about the level of interest the universities are showing. Chris Peterson finds this an uncomfortable experience: "As admissions officers, we are accustomed to reading applications; now, applications are reading us. ... Applicants are now armed with unprecedented insight into the processes that decide their fate."
There are several problems with this. Applicants and their parents may be misled by the tracking signals collected by these digital supplements, which may yield an entirely false picture of the university process. And yet applicants may attempt to use these signals as evidence that an application has not been properly considered. Even if the university attempts to block the analytics, this may still send the wrong message. (The absence of a signal is still a signal.)
In the past, analytics were a tool used by large organizations to monitor and control their customers. We are now seeing analytic platforms that seem to allow customers to monitor and control large organizations. Large organizations now need to understand how much information they are exposing to these platforms, and what conclusions their customers may draw. We can expect similar examples to appear in many other sectors.
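The "coding" of these links can be as simple as a distinct token per recipient in the URL; the applicant then reads the server log to see whose token appeared. A minimal sketch (all names and the log format here are invented for illustration):

```python
import uuid

# Hypothetical sketch of a "coded" supplementary link: a distinct token
# per university lets the applicant attribute visits from the server log.
def make_tracked_links(base_url, universities):
    links = {}
    for uni in universities:
        token = uuid.uuid4().hex[:8]  # short opaque tracking token
        links[uni] = (f"{base_url}?ref={token}", token)
    return links

def attribute_visits(links, visited_tokens):
    # Map tokens seen in the server log back to universities.
    token_to_uni = {token: uni for uni, (_, token) in links.items()}
    return [token_to_uni[t] for t in visited_tokens if t in token_to_uni]

links = make_tracked_links("https://example.org/portfolio", ["Alpha U", "Beta U"])
```

Note that even this toy version illustrates the asymmetry: the university cannot tell the link is personalized, but the applicant can tell exactly who clicked.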
Chris Peterson, Opening the Black Box: Analytics and Admissions (Chronicle of Higher Education, January 2013)
Updated 25 June 2015
Does Cameron's Dashboard App Improve the OrgIntelligence of Government?
Here are some quick comments from Twitter
@lesteph PM's dashboard is at best pointless, at worst dangerous, unless his briefing system has fundamentally collapsed
@dominiccampbell he may as well have it, but pretending it's anything other than a partial view and mostly for PR is daft
@willperrin rather an antediluvian counsel of despair there then. back to 'ringbinders full of..' briefing
@6loss The "dashboard vs intelligence" debate? IMHO dashboards are useless without fast feedback on action.
In a subsequent discussion on LinkedIn, @6loss and I discussed some of the intriguing questions raised by this news story.
Firstly, we could not see the imperative for real-time action and feedback. Obviously the Prime Minister needs to know whether job vacancies are going up or down, but the idea of real-time update is just ridiculous. Suppose that seventeen new job vacancies have been posted in Smartchester in the past twenty minutes. Are we supposed to believe that these seventeen vacancies urgently need to be communicated to the PM so that he can take appropriate action?
What does make sense is a dashboard that supports an OODA loop. A well-designed dashboard should not only provide aggregated data, but also provide some way of making sense of the data (data visualization may help here), and then support rapid action.
But in a well-designed organization, the responsibility for rapid action is delegated to the people in the front line, who are given the real-time intelligence and the resources/tools and the authority to solve problems effectively and efficiently. This is what the military call "Power to the Edge". A completely different order of intelligence is required at the centre, usually operating at a much slower tempo.
And since managers are often tempted to meddle with randomly varying processes (Deming called this "tampering"), a well-designed control system deliberately hides much of the volatility from senior management. (In cybernetics, this is called "attenuation".)
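Deming demonstrated tampering with his funnel experiment: a controller that compensates for every random deviation makes the output worse, not better. A quick simulation of the two rules (illustrative only, with invented parameters):

```python
import random
import statistics

random.seed(0)
NOISE = 1.0
N = 10_000

# Rule 1: never adjust; output is the target plus pure noise.
rule1 = [random.gauss(0, NOISE) for _ in range(N)]

# Rule 2 ("tampering"): after each observation, shift the aim point
# by the negative of the last error, chasing random variation.
rule2, aim = [], 0.0
for _ in range(N):
    result = aim + random.gauss(0, NOISE)
    rule2.append(result)
    aim -= result  # compensate for the observed deviation

print(statistics.stdev(rule1))  # ~1.0
print(statistics.stdev(rule2))  # ~1.41, i.e. tampering roughly doubles the variance
```

Attenuation is the complementary move: if senior management only ever sees suitably aggregated figures, there is nothing for them to tamper with.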
Secondly, I'm wondering what kind of statistics we are talking about here. When people talk about "statistics", they often mean the kind of statistics kids learn in primary school (totals and averages) rather than the kind of statistics kids learn in high school (correlation and significance). I wonder how many ministers could cope with high school statistics (let alone degree level) without a civil servant or adviser there to explain it to them? The danger of the "dashboard" is that it may eliminate the vital step of interpretation and sense-making, which is surely essential to evidence-based management.
Thirdly, I'm wondering about the planned rollout of this App. Are we to suppose that all ministers and senior civil servants are going to be watching the same set of indicators, or does collective responsibility entail that each minister is watching a different set of indicators? In a typical control room, there are many people each watching a different dashboard or controlling a different sector: it would seem a bit redundant if they were all watching the same one. Meanwhile, the supervisor sits in his cubicle playing Angry Birds, or sending texts to his neighbours.
A few weeks after this discussion, writing in the New York Times, Will Wiles compared this dashboard with the Viable Systems Model implementation in Chile under Salvador Allende. He pointed out that the dashboard is not truly cybernetic because it lacks a mechanism to translate all that data into action. Quite so.
Update: See also Shannon Mattern, Mission Control: A History of the Urban Dashboard (Places Journal, March 2015)
Tuesday, January 22, 2013
OrgIntelligence - Are Better Tools the Answer?
Information Gathering
Managers spend up to two hours a day searching for information, and more than 50 percent of the information they obtain has no value to them. In addition, only half of all managers believe their companies do a good job in governing information distribution or have established adequate processes to determine what data each part of an organization needs. The average interaction worker spends an estimated 28 percent of the workweek managing e-mail and nearly 20 percent looking for internal information or tracking down colleagues who can help with specific tasks.
Knowledge Management
Traditional knowledge management has failed to address the problem of knowledge worker productivity. Tools that have been developed in KM focused on information management and do not support many of the key knowledge work processes. Knowledge workers have therefore adapted the email client to suit their needs. It has become the most successful knowledge work tool because it combines personal control with personalisation and integrated communication.
Big Data
By 2018, the United States alone could face a shortage of 140,000 to 190,000 people with deep analytical skills as well as 1.5 million managers and analysts with the know-how to use the analysis of big data to make effective decisions.
Big data can create big value. But like all the big-data predecessors – i.e., databases, data warehousing, data mining, data analytics and business intelligence – you need to know what you’re looking for, why you’re looking for it, what’s it worth to you, and how will you take advantage of it BEFORE you start. Otherwise, big data will just be a big waste of money.
Social Media
Social media is addictive. And if you’re not too careful, it can seriously eat into your productivity.
Places are still available on my Organizational Intelligence Workshop (Feb 1st).
Thursday, January 17, 2013
Business Signal Optimization
If we think of data in tabular form, there are two obvious ways of increasing the size of the table - increasing the number of rows (greater volume of cases) or increasing the number of columns (greater volume of signals). This can either involve a greater variety of variables, as Merrill advocates, or a higher frequency of the same variable. I have talked in the past about the impact of increased granularity on Big Data.
As I understand it, Merrill's company sells Big Data solutions to the insurance underwriting industry, and its algorithms use thousands of different indicators to calculate risk.
The first question I always have in regard to such sophisticated decision-support technologies is what the feedback and monitoring loop looks like. If the decision is fully automated, then it would be good to have some mechanism to monitor the accuracy of the algorithm's predictions. The difficulty here is that there is usually no experimental control, so there is no direct way of learning whether the algorithm is being over-cautious. I call this one-sided learning.
Where the decision involves some human intervention, this gives us some further things to think about in evaluating the effectiveness of the decision-support. What are the statistical patterns of human intervention, and how do these relate to the way the decision-support software presents its recommendations?
Suppose that statistical analysis shows that the humans are basing their decisions on a much smaller subset of indicators, and that much of the data being presented to the human decision-makers is being systematically ignored. This could mean either that the software is too complicated (over-engineered) or that the humans are too simple-minded (under-trained). I have asked many CIOs whether they carry out this kind of statistical analysis, but most of them seem to think their responsibility for information management ends once they have provided the users with the requested information or service; how this information or service is then used is not their problem.
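One simple version of this analysis correlates each indicator with the recorded human decisions, flagging indicators whose statistical influence is negligible. A rough sketch (the data, names and threshold here are all invented):

```python
import math
import random

def pearson(xs, ys):
    # Plain Pearson correlation coefficient.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def ignored_indicators(indicators, decisions, threshold=0.1):
    """Flag indicators whose correlation with the recorded human
    decisions is negligible: candidates for being systematically ignored."""
    return [name for name, values in indicators.items()
            if abs(pearson(values, decisions)) < threshold]

# Synthetic example: decisions track indicator "a" but not "b".
random.seed(1)
a = [random.random() for _ in range(500)]
b = [random.random() for _ in range(500)]
decisions = [1.0 if x > 0.5 else 0.0 for x in a]
print(ignored_indicators({"a": a, "b": b}, decisions))
```

A real analysis would need to control for correlated indicators and sample size, but even this crude check would tell a CIO something about which of the supplied signals are actually consumed.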
Meanwhile, the users may well have alternative sources of information, such as social media. One of the challenges Dion Hinchcliffe raises is how these richer sources of information can be integrated with the tabular data on which the traditional decision-support tools are based. I think this is what Dion means by "closing the clue gap".
Dion Hinchcliffe, The enterprise opportunity of Big Data: Closing the "clue gap" (ZDNet August 2011)
Dion Hinchcliffe, How social data is changing the way we do business (ZDNet Nov 2012)
Douglas Merrill, A Practical Approach to Reading Signals in Data (HBR Blogs November 2012)
Places are still available on my forthcoming workshops Business Awareness (Jan 28), Business Architecture (Jan 29-31), Organizational Intelligence (Feb 1).
Monday, October 22, 2012
Embedding Intelligence into Technical Systems
Smart Grid
One area where intelligence is being embedded into technical systems is in electric power infrastructure - the Grid. The demand for intelligence in this context includes several factors
- efficiency (delivering more with less)
- reliability and robustness (fault tolerance)
- affordable growth
- systemic risk (climate change, national security)
Sources
US Department of Energy, Smart Grid (pdf)
Gridwise Alliance
Raymond Kelley, Architecting Intelligence into AMI systems (Penn Energy)
Gene Wolf, Embedded Intelligence (April 2007)
Monday, April 16, 2012
Embedding Intelligence in the Business Process - Product Links
- Embedding business intelligence (BI) into the business process.
- Embedding Enterprise 2.0 into the business process.
- Embedding knowledge (content) into the business process.
- Embedding learning into the business process.
- Embedding collaboration into the business process -
- Embedding intelligence into business capabilities
There are various hardware and software vendors who offer one or more of these. Here is an unscientific selection.
Intel - Embedded@Intel
Oracle - Fusion CRM Applications - Embedded Intelligence

Textron Defense Systems - "sophisticated embedded intelligence applications that enable warfighters to do more, better and faster, with their current assets". Features of this embedded intelligence apparently include augmented futures, social networking, anthropology, resource optimization, HCI and force multiplication. (Textron website)
Workday - "Embedded Intelligence within business processes allows for real-time contextual insight. Workday enables pre-built or custom worklets to be embedded within selected business processes. These worklets provide relevant, real-time business insight in context at the point of decision." (Press release August 2011)
Machine Analytics, a company in the Boston area, offers a solution methodology called Amitie for embedding intelligence into the business process. Amitie stands for Analyze, Model, IMplement, Test, Interface and Evaluate. The embedding takes place in the final two steps: interfacing the implemented and tested model with the client's business process, and evaluating/monitoring performance of the embedded model within the client's business process environment.
This material is partly based on some posts from November-December 2010.
- Embedding Intelligence in the Business Process 1
- Embedding Intelligence in the Business Process 2
- Joined-up Collaboration
- Embedding Intelligence into a Business Capability
Monday, March 07, 2011
TIBCO platform for organizational intelligence
We talked about three main technology areas: Complex Event Processing (CEP), Business Process Management (BPM) and Enterprise 2.0. For TIBCO at least, these technologies are at different stages of adoption and maturity. TIBCO's CEP and BPM tools have been around for a while, and there is a fairly decent body of experience using these tools to solve business problems. Although the first wave of deployment typically uses each tool in a relatively isolated fashion, Stefan believes these technologies are slowly coming together, as customers start to combine CEP and BPM together to solve more complex business problems.
Much of the experience with CEP has been in tracking real-time operations. For example, telecommunications companies such as Vodafone can use complex event processing to monitor and control service disruptions. This is a critical business concern for these companies, as service disruptions have a strong influence on customer satisfaction and churn. CEP is also used for autodetecting various kinds of process anomalies, from manufacturing defects to fraud.
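A generic sketch of that kind of anomaly detection (not TIBCO's product, just the sliding-window pattern, with invented event names and thresholds):

```python
from collections import deque

class DisruptionDetector:
    """Minimal complex-event-processing sketch: raise a composite
    'service disruption' event when too many dropped-call events
    arrive within a sliding time window."""

    def __init__(self, window_seconds=60, threshold=5):
        self.window = window_seconds
        self.threshold = threshold
        self.events = deque()  # timestamps of recent dropped calls

    def on_dropped_call(self, timestamp):
        self.events.append(timestamp)
        # Evict events that have fallen out of the window.
        while self.events and self.events[0] <= timestamp - self.window:
            self.events.popleft()
        if len(self.events) >= self.threshold:
            return ("SERVICE_DISRUPTION", timestamp, len(self.events))
        return None

detector = DisruptionDetector(window_seconds=60, threshold=3)
# Alerts fire at t=20 and t=110; the quiet gap in between resets the window.
alerts = [a for t in (0, 10, 20, 100, 105, 110) if (a := detector.on_dropped_call(t))]
```

The composite event returned here is exactly the kind of thing that can be handed on to a BPM workflow, as discussed below.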
One of the interesting things about Business Process Management is that it operates at several different tempi, with different feedback loops.
- A modelling and discovery tempo, in which the essential and variable elements of the process are worked out. Oftentimes, full discovery of a complex process involves a degree of trial and error.
- An optimization and fine-tuning tempo, using business intelligence and analytics and simulation tools to refine decisions and actions, and improve business outcomes.
- An execution tempo, which applies (and possibly customizes) the process to specific cases.
The events detected by CEP can then be passed into the BPM arena, where they are used to trigger various workflows and manual processes. This is one of the ways in which CEP and BPM can be integrated.
Social software and Enterprise 2.0 can also operate at different tempi - from a rapid and goal-directed navigation of the social network within the organization to a free-ranging and unplanned exploration of business opportunities and threats. TIBCO's new product tibbr is organized around topics, allowing and encouraging people to develop and share clusters of ideas and knowledge and experience.
Curiously, the first people inside TIBCO to start using tibbr were the finance people, who used it among other things to help coordinate the flurry of activity at quarter end. (Perhaps it helped that the finance people already shared a common language and a predefined set of topics and concerns.) However, the internal use of tibbr within TIBCO has now spread to most other parts of the organization.
The organization of Enterprise 2.0 around topics appears to provide one possible way of linking with CEP and BPM. A particularly difficult or puzzling event (for example, a recurrent manufacturing problem) can become a topic for open discussion (involving many different kinds of knowledge), leading to a coordinated response. The discussion is then distilled into a resource for solving similar problems in future.
TIBCO talks a great deal about "contextually relevant information", and this provides a common theme across all of these technologies. It helps to think about the different tempi here. In the short term, what counts as "contextually relevant" is preset, enabling critical business processes and automatic controls to be operated efficiently and effectively. In the longer term, we expect a range of feedback loops capable of extending and refining what counts as "contextually relevant".
- On the one hand, weak signals can be detected and incorporated into routine business processes. Wide-ranging discussion via Enterprise 2.0 can help identify such weak signals.
- On the other hand, statistical analysis of decisions can determine how much of the available information is actually being used. Where a particular item of information appears to have no influence on business decisions, its contextual relevance might need to be reassessed.
My eBook on Organizational Intelligence is now available from LeanPub. leanpub.com/orgintelligence
Related posts: Two-Second Advantage (May 2010), Embedding Intelligence into the Business Process (November 2010)
Friday, February 18, 2011
Jeopardy and Risk
In his short blogpost, Cser doesn't offer an answer to this question. He merely makes one assertion and one prediction.
Firstly he asserts an easy and superficial connection between the game of Jeopardy and the profession of security, based on "the complexity, amount of unstructured background information, and the real-time need to make decisions." Based on this connection, he makes a bold prediction on behalf of Forrester.
"Forrester predicts that the same levels of Watson's sophistication will appear in pattern recognition in fraud management and data protection. If Watson can answer a Jeopardy riddle in real time, it will certainly be able to find patterns of data loss, clustering security incidents, and events, and find root causes of them. Mitigation and/or removal of those root causes will be easy, compared to identifying them."
As this is presented as a corporate prediction rather than merely a personal opinion, I'm assuming that this has gone through some kind of internal peer review, and is based on an analytical reasoning process supported by detailed discussions with the IBM team responsible for Watson. I'm assuming Forrester has a robust model of decision-making that justifies Cser's confidence that the Jeopardy victory can be easily translated into the fraud management and data protection domain within the current generation of technology. (Note that the prediction refers to what Watson will be able to do, not what some future computer might be able to do.)
For my part, I have not yet had the opportunity to talk with the IBM team and congratulate them on their victory, but there are some important questions to explore. I think one of the most interesting elements of the Watson victory is not the complexity - which other commentators such as Paul Miller of Engadget have downplayed - but the apparent ability to outwit the other competitors. This ability may well be relevant to a more agile and intelligent approach to security, but that's a long way from the simplistic connection identified by Cser. Meanwhile, I look forward to seeing the evidence that Watson is capable of analysing root causes, which would be a lot harder than winning at Jeopardy.
Paul Miller, Watson wins it all, humans still can do some other cool things (Engadget 16 Feb 2011)
IBM's Watson supercomputer crowned Jeopardy king (BBC News 17 Feb 2011)
Thursday, November 25, 2010
Future of IT
Maybe we shouldn't expect a single coherent vision to cover all this stuff, but I believe organizational intelligence provides a useful framework for joining the following fragments, and connecting them to lasting business value.
- Collaboration platforms become people-centric (Forrester)
- Process-Centric Data and Intelligence (Forrester)
- BPM will be Web-2.0-enabled (Forrester)
- Business impact of social computing (Gartner)
- Complex systems engineering (ZapThink)
Elements
- Enterprise Architecture. “Static frameworks give way to continuous business transformation best practices. ” (ZapThink)
- Business Process Management. “Organizations begin truly managing their business processes.” (ZapThink)
- Social computing. The technologies and principles behind Facebook, Twitter and LinkedIn “will be implemented across and between all organizations.” (Gartner)
Business Pay-Off
- “It will unleash yet to be realized productivity growth, it will contribute to economic growth.” (Gartner)
- “Organizations ... achieve their goals in the context of an ever-changing business environment.” (ZapThink)
Architectural Framework
Of course, these and similar benefits have been claimed for any number of previous technological innovations. It is not clear from these brief quotes from some of the leading IT analyst firms exactly how they believe specific combinations of specific elements would produce these outcomes, and how a CIO (or software salesperson) could reason about the likely return on investment. I'm guessing they charge buckets for that kind of insight.

Monday, November 22, 2010
Embedding Intelligence into the Business Process 2
- Embedding business intelligence (BI) into the business process.
- Embedding Enterprise 2.0 into the business process.
In this post, I'm going to talk about two further aspects of this.
- Embedding knowledge (content) into the business process.
- Embedding learning into the business process.
Embedded Knowledge Content
There are various levels at which knowledge can be embedded in a business process.

Firstly, some forms of procedural knowledge can be encapsulated as static rules, which can be either written into the process (as software or bureaucratic procedure) or stored in a form that can be easily and automatically referenced by software components or knowledge workers. There is a considerable software literature on so-called business rules - see for example my review of Business Rule Concepts.

Secondly, more complex forms of knowledge can be represented as models. For example, the business processes associated with operating a complex industrial process or communications network require some representation of the physical structure and processes involved, while business processes in the finance world may use economic models that help to predict market trends and risks. These models may be buried within complicated algorithms, or represented visually in dashboards and control room displays. See my post on OrgIntelligence in the Control Room.
Thirdly there is contextual knowledge - an appreciation of the specific circumstances and general trends relevant to a business decision or action. This kind of knowledge is dynamic and typically requires human mediation and interpretation, although it may be possible to codify and even automate some limited kinds of contextual knowledge. When discussing the contribution of Enterprise 2.0 to the American security services, Dennis Howlett comments that "content without context in process is meaningless". (See my post on Connecting the Dots).
In her post on The Future of HRM Software, Naomi Bloom talks about embedded intelligence that integrates the rule-based and the contextual knowledge into a software agent she calls "Naomi". She claims that embedded intelligence can achieve several things.
- It "replaces what we lost" when we reduced or eliminated the interaction between experts and the rest of the organization. (In her piece, the experts are HR professionals.)
- It improves upon human embedded intelligence by removing human error.
- Automated embedded intelligence improves compliance to rules/policies/regulations and reduces the organization’s exposure to risk.
- Commercial Web sites (Landsend.com, Amazon.com) and social Web sites (Wikipedia) set expectations of the embedded intelligence to be found in any self service environment.
One of the ways that enterprise architects can think strategically about business capabilities and business processes is in terms of knowledge intensity - in other words, the quantity and quality of knowledge required in a given capability or process to differentiate the enterprise from its competitors. The "core" activities of an enterprise are those requiring high levels of knowledge intensity and specificity, other activities can be regarded as "peripheral" and may be commoditized or outsourced. See my post on Ecosystem SOA, which draws on the work of Amin and Cohendet.
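The simplest of these levels, procedural knowledge encapsulated as static rules, can be sketched as rules kept as data and evaluated by the process at run time rather than hard-coded into it. The rule names and thresholds below are invented for illustration:

```python
# Each rule is data: a condition over the case, and an action label.
# Storing rules this way lets software components (or knowledge workers
# via a rule browser) reference the same rules without re-coding them.
RULES = [
    {"name": "high_value_review",
     "condition": lambda case: case["amount"] > 10_000,
     "action": "route_to_manual_review"},
    {"name": "auto_approve_small",
     "condition": lambda case: case["amount"] <= 10_000 and case["risk"] == "low",
     "action": "auto_approve"},
]

def apply_rules(case, rules=RULES):
    """Return the actions of all rules whose condition the case satisfies."""
    return [r["action"] for r in rules if r["condition"](case)]

print(apply_rules({"amount": 25_000, "risk": "low"}))  # ['route_to_manual_review']
```

The point of the separation is that the rule set can be inspected, audited and changed without rebuilding the process that consumes it, which is precisely what the business rules literature argues for.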
Embedded Learning
In my post on Learning by Doing, I pointed out that such characteristics as adaptability, agility, flexibility, responsiveness (supposed to be the benefits of various technologies including SOA) imply processes of adaptation and learning. So we need to ask: How do business systems (both organizational and technical) improve? Where is the learning located? What is the nature of the feedback loop?
- The learning loop goes through the software developers. (The software development acts as a gate/brake on the learning process.)
- A learning process is contained in the software or service. (Learning can take place in real-time, but only for things that have been explicitly anticipated in software development.)
- The learning process is distributed across the community usage of the software or service.
Places are still available for my Organizational Intelligence Workshop on December 8th.
Friday, November 19, 2010
Embedding Intelligence into the Business Process
In this post, I'm going to talk about two specific aspects of this.
- Embedding business intelligence (BI) into the business process.
- Embedding Enterprise 2.0 into the business process.
Embedded BI
The idea of embedded business intelligence has been around for many years. See my blog on Service-Oriented Business Intelligence from September 2005. See also my slideshare presentation.
When software vendors talk about embedded BI, they often mean embedding BI functionality in other pieces of software - for example ERP applications - to allow these applications to produce more interesting reports. There are several niche BI producers in this space, including Jaspersoft, Pentaho and Yellowfin. Brian Gentile of Jaspersoft talks about this in his recent article The BI Revolution: Business Intelligence's Future. TDWI November 10, 2010. For an article explaining the difference between Embedded BI and Integrated BI, see Execution MIH.
For BI to be embedded in the business process, we need to have an understanding of the business process that includes some cognitive task, such as a complex decision, where some business intelligence capability can be used specifically to support this cognitive task. In some cases, the aim might be to make the process faster and more efficient, but more usually the aim is to make the process more powerful and effective.
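To make this concrete, here is a minimal sketch (in Python, with invented names and thresholds - none of this comes from any real product) of a process step whose cognitive task, a complex approval decision, is supported by an embedded analytic: routine cases flow straight through, while borderline cases are routed to a human analyst.

```python
# Hypothetical sketch: a single business process step whose cognitive
# task (an approval decision) is supported by an embedded analytic.
# The scoring function, weights and thresholds are all illustrative.

def risk_score(case):
    """Stand-in for an embedded BI/analytics service."""
    return 0.6 * case["late_payments"] + 0.4 * case["exposure"]

def approval_step(case, review_queue):
    score = risk_score(case)
    if score < 1.0:
        return "auto-approve"         # routine case: straight through
    elif score < 3.0:
        review_queue.append(case)     # analytic routes it to a human
        return "refer-to-analyst"
    else:
        return "auto-decline"

queue = []
print(approval_step({"late_payments": 0, "exposure": 1.0}, queue))  # auto-approve
print(approval_step({"late_payments": 2, "exposure": 2.0}, queue))  # refer-to-analyst
```

The point is not the scoring logic, which is trivially simple here, but where it sits: inside the process step itself, shaping which cases ever reach a human decision-maker.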
Embedded BI in this sense can also be related to intelligent event processing, where analytic capability embedded in one process can trigger automatic as well as human responses in other processes. Brian Gentile talks about this in an earlier article, The BI Revolution: A New Generation of Analytic Applications (TDWI, October 20, 2010).
Beyond embedding BI in the business process, we might look forward to a state in which analytics is embedded in the entire enterprise, what Tom Davenport and his colleagues call the Analytic Organization. (See my review of Competing on Analytics.) This is the proper meaning of the term Pervasive BI, which Dave Mittereder defined in 2005 as "empowering everyone in the organization, at all levels, with analytics, alerts and feedback mechanisms" (Pervasive Business Intelligence: Enhancing Key Performance Indicators Information Management Magazine, April 2005).
Embedded Enterprise 2.0
In her piece Time For Enterprise 2.0 To Get Enterprisey, Sandy Kemsley takes a sceptical look at the extent to which Enterprise 2.0 is supporting the core business.
"You hear great stories about social software being used to strengthen weak ties through internal social networking, or fostering social production by using a wiki for project documents, but many less stories about using social software to actually run the essential business processes."
She quotes Andrew McAfee:
[The CIOs] weren’t too worried that their people would use the tools to waste time or goof off. In fact, quite the opposite; they were concerned that the busy knowledge workers within their companies might not have enough time to participate.
And comments:
The fact that the knowledge workers had a choice of whether to participate tells me that the use of social business software is still somewhat discretionary in these companies, that is, it’s not running the core business operations; if it were, there wouldn’t be a question of participation.
It seems to me that there are two possible interpretations of McAfee's remark. Sandy's interpretation is that busy knowledge workers simply don't find time to do any Enterprise 2.0 stuff at all, and she concludes that if the business process can still work without it, then the Enterprise 2.0 stuff is discretionary. An alternative interpretation might be that the business knowledge workers don't have enough time to do enough Enterprise 2.0 stuff to get as much intelligence (requisite variety) into the business process as the business really needs. (I happen to prefer the second interpretation, but I don't know whether this is what McAfee really meant.) In other words, it could be that there is a trickle of benefit rather than a decent flow.
I'm presuming that the way Enterprise 2.0 is used within the business process is to support specific cognitive tasks, such as interpreting and making sense of events, and making complex decisions. These tasks may be done by an individual knowledge worker, possibly drawing on knowledge made available by co-workers, or may be done collectively by a network of knowledge workers. The quality of sense-making and decision-making doesn't necessarily increase just because you have more people spending more time on it, but with highly complex business situations the opposite is almost certainly true - the quality will be impaired if you have too few people devoting insufficient time and attention.
But I worry a bit when technology vendors merely invoke the magic words "business process" without demonstrating any real understanding. For example, Sandy's blog links to Klint Finley's piece on Tying Enterprise 2.0 to Business Processes, or Creating New Processes for the Social Enterprise, which doesn't say anything I can recognize as being about business process; as far as I can see, it is largely about activity stream filtering as a technical solution for integrating pieces of software. Finley's piece links in turn to a Monday Musing by R "Ray" Wang which states that
Organizations seeking a marketing edge must digest, interpret, and asses (sic) large volumes of meta data from sources such as Facebook Open Graph.
I think there may possibly be a business process implicit in there somewhere, but exactly how this business process would be supported by Enterprise 2.0 is left to the imagination. I hope "asses" isn't a Freudian slip.
To be clear, I can see how the technologies Klint and "Ray" are talking about might possibly be embedded into a sociotechnical system to support a real business process. But they aren't actually making the connection, nor are they providing any evidence that anyone else is doing so. Even Michael Idinopulos, who at least sounds as if he knows what he is talking about in The End of the Culture 2.0 Crusade? fails to provide any concrete examples. He may have seen some evidence, but he's not telling us. So (not for the first time) it's left to Tom Davenport to say something useful. In a short blogpost for HBR, he provides a couple of examples of what can be done when the social and structuring aspects of technology are combined (Want Value from Social? Add Structure).
Note: some of the larger software vendors have a stake in several of these areas, and are trying to integrate different product lines. For example, IBM adds predictive analytics and social networking in Cognos 10 (SearchBusinessAnalytics.com, 25 Oct 2010). Meanwhile, the niche software providers may be developing interesting partnerships and collaborations - Brian Gentile emails me with a note about the embedding of Jaspersoft within eBuilder, a Swedish provider of an end-to-end B2B suite of Cloud Supply-Chain Management Processes, to produce what they call a Strategic Management Tool.
Places are still available for my Organizational Intelligence Workshop on December 8th.
Wednesday, November 03, 2010
Collaboration Chasm
Rogers himself modelled adoption as a continuous S-curve, but Moore's notion of a chasm is popular with supply-side marketing people, because it suggests a single heroic leap from an experimental product to a commercial success.
In the context of collaboration technologies within the enterprise, the "chasm" metaphor can be interpreted in multiple ways, all of which are discussed or implied in the CISCO document.
- The categorical difference between individual early adopters and everyone else. A simplified model of adoption would regard "early adopter" as a personality type, predicting a person's attitude to any kind of innovation, similar to the Adaptor/Innovator scale developed by Michael Kirton. (Rogers himself recognized that a person could easily be an early adopter of one technology and a late adopter of another.) Additionally, some writers may wish to characterize particular organizations as early adopters.
- The categorical difference between Generation X and Generation Y. The assumption that because younger people are likely to be more comfortable with certain classes of technology, they will therefore be more positively inclined to the adoption and use of these technologies in the workplace.
- The difference between social and workplace use of these technologies. Jagdish Vasishtha thinks this has something to do with personal choice, saying "there is a growing chasm between how people now communicate in their personal space and how they are forced to communicate in a corporate environment" [Crossing the Chasm, May 2009 (pdf)]. The CISCO document points out several reasons why technologies such as social networking or instant messaging don't necessarily transfer easily from one’s personal life to the workplace, and quotes Ray Ozzie on the "chilling effect" of organizational dynamics [Churchill Club, June 2009]. See also Richard Dennison, BT 2.0 Case Study, November 2007.
- The step from undirected ("bottom-up") investigation to directed ("top-down") business process change. "Carefully shaping a subset of collaboration initiatives in a top-down fashion to align them with business priorities provides the required structure to scale up an organization’s efforts into the performance stage."
- The step from isolated experimental use to integrated use. CISCO describes a 3-stage development strategy (1: Investigative, 2: Performance, 3: Transformational), and positions the "chasm" between stages 1 and 2. For an SAP example of the "chasm" between stand-alone collaboration and embedded collaboration, see Irwin Lazar, Collaboration in Context (May 2010).
- "Collaboration creates shifts in the organizational mindset." This might possibly qualify as a second chasm between CISCO stages 2 and 3.
However, there are some misalignments between these different notions. For example, the fact that many young people are familiar with social networking in their private lives doesn't necessarily imply that they will be better able than their parents to use social networking effectively in their professional lives. Effective use in a given social context depends on purpose and style, and social and organizational experience may be more relevant here than technical skill and enthusiasm.
In my work on technology adoption within the enterprise, I make the distinction between broad adoption and deep adoption. Broad adoption basically means that a given product or platform is used by a lot of different people in a lot of places, but this usage may be casual, occasional and uncommitted. Deep adoption means that the product or platform is used intensively, is embedded in processes and working practices, as well as being integrated with other relevant technologies, but may only involve a few people or departments.
The distinction between broad adoption and deep adoption implies two "chasms" at right angles - one between narrow and broad, and one between shallow and deep. The tactics required to encourage broad adoption are significantly different from the tactics needed to implement deep adoption. CISCO's basic 3-step strategy appears to involve crossing both of these chasms at the same time, but the document also refers to some alternative strategies, including a "cultivation" strategy followed by Statoil. Some adoption strategies may permit different aspects of technology adoption to be decoupled; indeed, a number of the examples cited from CISCO's own internal processes involve localized collaboration within specialized processes, although this may be because enterprise-wide cross-process collaboration is harder to explain.
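The two-axis distinction can be sketched as a simple classification. The metrics used here (share of staff using the tool, and how much of their core work flows through it) and the 0.5 thresholds are assumptions for illustration only:

```python
# A minimal sketch of the broad/deep adoption distinction as a
# two-axis classification. Metrics and thresholds are invented.

def adoption_quadrant(user_share, intensity):
    """user_share: fraction of staff using the tool;
       intensity: fraction of their core work that flows through it."""
    breadth = "broad" if user_share >= 0.5 else "narrow"
    depth = "deep" if intensity >= 0.5 else "shallow"
    return f"{breadth}/{depth}"

print(adoption_quadrant(0.8, 0.1))  # broad/shallow: many casual users
print(adoption_quadrant(0.1, 0.9))  # narrow/deep: a few committed teams
```

The two "chasms" at right angles are then the two threshold crossings: a broad/shallow platform and a narrow/deep one are both halfway there, but call for quite different tactics.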
The distinction between broad adoption and deep adoption may also encourage us to look at early adopters more critically. Those who constantly quest for technological novelties may not appreciate or experience the full power of a revolutionary innovation, and may not be the best people to lead serious and sustained commitment to its enterprise-wide and system-deep adoption. By the time the organization is moving into CISCO's stage three, the so-called early adopters may have switched their attention and allegiance to something else.
Thursday, September 16, 2010
From water cooler to Web 2.0
Rob expressed some scepticism about formal systems for organizational intelligence, and speculated that the water-cooler might be the most important tool for knowledge management. But obviously a literal water cooler can only support a small number of people at a single location. So what is the internet or intranet equivalent, and what are the organizational and cultural requirements for making a metaphorical water cooler work as effectively as a real one?
As Richard asked
Does this virtual model make the "water cooler effect" a myth since the organisation itself may be small but its partners may be dispersed? Is the "water cooler" actually a personal network that spans organisations? What effect does Web 2.0 have on this (like LinkedIn!)?
Let's start by understanding the value of the "water cooler" to the enterprise. The first point is that people don't just rely on formal information systems and dashboards to know what is going on, they also use a range of informal communication mechanisms including casual and serendipitous chit-chat, as well as Management-By-Walking-Around (MBWA). Some of these mechanisms can be replicated or extended by Web 2.0; in any case, the water cooler merely stands in for anywhere (real or virtual) where these exchanges can take place.
Many IT architects concentrate on building and integrating formal systems (although this task is perhaps increasingly delegated to ERP or SaaS vendors and similar) but organizational intelligence raises the question about the relationship between formal systems and informal systems.
But although Web 2.0 may enable all kinds of communication and sharing that weren't possible before, both inside and outside the organization boundaries, I don't see technology as the efficient cause of change, but merely providing support for change in the organization itself and its processes and capabilities.
Richard made an important observation about strong inward-looking implications of the water cooler. Interestingly, the water cooler metaphor echoes a much older idea of the village pump being the locus of social interaction. (Several Bible stories take place near a well.)
Richard also notes that senior executives tend to rely more on traditional personal networks than on Web 2.0. Of course there are some obvious limitations with Web 2.0, at least as currently available. I posted a fictional example of the Old Boy Network on my blog (Social Networking as Reuse) and suggested it might take a while before Linked-In and Facebook can replicate the kind of affordance offered by more traditional methods.
Ian averred that in 30 years of consulting he never came across an organization where people gathered around a water cooler, and asked if it really happened?
The main problem we seem to have with the traditional methods of networking is that they are not scalable or interoperable. Each executive has his/her own personal network of friends and information sources, but that typically results in intelligence silos, and thus may not be enough in the face of really large and complex problems.
The village pump presupposes an age when time passed more slowly; sadly, grabbing a coffee and taking it back to your desk is more likely these days. Of course the village pump was also a major transmitter of disease, as untreated sewage would have been piled in middens just yards away, polluting the water source - just as the water cooler/coffee machine can be a source for the rapid spread of baseless rumours.
The main problem we seem to have with Internet-based methods is that they are ungrounded. Poor quality information (rumour) has always existed, but now it can be disseminated globally with a single well-timed Tweet. A great deal of Internet discussion lacks rigour, relevance or respect, and is sometimes quite incomprehensible.
The Internet may therefore simultaneously amplify both intelligence and stupidity, and is a constant battleground between them. This is now a large part of the environment in which organizational intelligence must operate.
Thursday, July 15, 2010
Social Networks and Received Opinion
The internet has not become the great leveller that many once thought it could be. Contrary to the original utopian vision, users of the web focus on information from a handful of wealthy countries. It's making us 'imaginary cosmopolitans'.
Social networks make the problem worse with the majority of people sharing information with folk who share their world-view. Our world-view might actually be narrowing.
Tools like Twitter trap people in so-called "filter bubbles". The internet is too big to understand as a whole, so we get a picture of it that's similar to what our friends see. If you turn to your friends, eventually you get the wisdom of the flock.
The term "filter bubbles" is credited to political activist @elipariser. See Ethan's earlier post Eli Pariser on Filter Bubbles (March 2010).
This phenomenon is important from many perspectives. One question that particularly interests me is the way that these networks can create the illusion of improved intelligence, while actually doing no such thing.
| | wish, illusion | actual |
| --- | --- | --- |
| information gathering | availability: fast, rich, high quality, unmediated, diverse | homogeneous, filtered |
| sense-making & decision-making | open, creative | closed, doctrinaire |
| knowledge | complete, consistent, strong, independent | partial, partisan, weak, received opinion |
| learning | progressive | pseudo-learning |
| communication | authentic | vapid |
Obviously it would be crazy to write off social networking and the internet as an inevitable producer of these effects - that would be the kind of crude technological determinism that gets the tabloid newspapers bewailing the Perils of Facebook.
Instead, the challenge is both to use the available human and technical networks more wisely, and to develop sociotechnical mechanisms that help to realise the original vision of these technologies and contribute to a greater and better distributed intelligence and understanding. Zuckerman talks ambitiously about mechanisms for amplifying underrepresented voices, and for discovering content through serendipity. He also talks about important new roles - for example curators to collect the content, xenophiles to bridge different cultures, working together to put content into context.
But even if we cannot transform the world overnight, we (ourselves and our organizations) can at least start to use these technologies in a more contingent manner, and with greater awareness of their strengths and weaknesses.
See also Polarizing Filters (March 2021)
Tuesday, May 25, 2010
From Buzz to Actionable Intelligence
Why would anyone want to do this? The first obvious interest is in tracking mindshare: how many people are talking about your product versus its competitors?
But it's not enough just to count the mentions of your product. When Microsoft launched the Zune, this was almost universally compared with the Apple iPod, so within a day or two there were thousands of webpages mentioning both. But unsurprising information is of little value; what's potentially significant here is not the absolute numbers but the relative shifts.
There are some important questions here about the volatility of buzz data. If mindshare fluctuates, is this a significant movement, or just random noise? The challenge is to build up enough statistical history to be able to set realistic action thresholds, and to identify potentially important weak signals for further investigation.
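One simple way to operationalize such thresholds, sketched here in Python with invented data, is a control-chart-style rule: flag a day's mention count only when it deviates from a trailing baseline by more than k standard deviations, so ordinary day-to-day noise passes silently.

```python
# Sketch of setting action thresholds on a noisy buzz series:
# flag a day only when mentions deviate from a trailing baseline
# by more than k standard deviations. Data, window and k are invented.

import statistics

def flag_signals(series, window=7, k=2.0):
    flags = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mean = statistics.mean(baseline)
        sd = statistics.stdev(baseline)
        if sd > 0 and abs(series[i] - mean) > k * sd:
            flags.append(i)
    return flags

mentions = [100, 104, 98, 101, 99, 103, 100, 102, 180]  # day 8 spikes
print(flag_signals(mentions))  # [8]
```

A longer statistical history lets you tune the window and k so that the flagged days are genuinely worth an analyst's attention rather than random noise.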
It might seem useful to know exactly what people were saying about the two products - which one they preferred and why. Until recently it has been almost impossible for software (and not always easy for humans, especially in unfamiliar cultural settings) to distinguish an enthusiastic "brilliant" from a sarcastic "brilliant", but Israeli researchers are now claiming a 77% precision in detecting sarcasm.
Joe McKendrick, New algorithm spots sarcasm in customer testimonials (Smart Planet, May 2010)
MacGregor Campbell, Just what we need: sarcasm software (New Scientist, May 2010)
However, tagging mentions according to sentiment still looks a pretty inexact science. Some vendors operating in this space don't include automated sentiment analysis at all (e.g. ConMetrics); others provide simple trends only, leaving humans to do the detailed analysis (e.g. Lexalytics).
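To see why, consider a deliberately naive lexicon tagger (the word lists below are invented): aggregated over thousands of mentions it can produce a usable trend, but a sarcastic "brilliant" scores exactly like a sincere one.

```python
# A deliberately naive lexicon-based sentiment tagger, to show why
# trend-level output is the honest limit of simple approaches.
# The word lists are invented; sarcasm defeats it completely.

import re

POSITIVE = {"brilliant", "love", "great"}
NEGATIVE = {"awful", "hate", "broken"}

def polarity(text):
    words = re.findall(r"[a-z']+", text.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(polarity("love the screen, battery life is great"))  # 2
print(polarity("brilliant, it's broken again"))            # 0 - sarcasm misread
```

The second example is the sarcasm problem in miniature: the tagger cancels "brilliant" against "broken" and calls the mention neutral, when any human reader would score it strongly negative.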
But never mind the technical detail. The point of this kind of business intelligence is that it is actionable. Companies can get an early indication of the success of a marketing campaign, long before mindshare feeds through into sales.
Because we aren't just interested in product mentions - we can also track discussion of particular design features of the product. How many people are talking about battery life or screen size or capacity or cost? This kind of detailed information helps identify the features that the marketing campaigns should emphasize, and may also feed into product development. Obviously if battery life is the most talked-about feature of this class of product, then that's a valuable item of intelligence for product designers as well as for sales and marketing. (I wonder how easy it would be to integrate this kind of business intelligence with a requirements engineering tool/method such as Quality Function Deployment, or a statistical technique such as MaxDiff? See Eric Almquist and Jason Lee, What Do Customers Really Want?, Harvard Business Review, April 2009)
If you have enough high-quality data, with all the automatic replication and spam stripped out, then you can also track the influence paths across the Internet over time. Not only identifying the pages that talk about the Zune versus iPod, but which pages came out first, and which of the earlier pages are strongly referenced by later pages. Not just individual thought leaders but also communities or geographies - for example, a given buzz might start on university campuses before spreading to other demographic sectors. That tells you where you should conduct market trials if you want rapid dissemination, and also where you should go for a relatively isolated trial of some high-risk venture. It also tells you which websites to watch for potential trouble.
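A minimal sketch of the influence-path idea, with invented pages and dates: score each page by the citations it receives from later pages, so that the earliest sources of a buzz surface at the top.

```python
# Sketch of tracing influence paths: given pages with publication
# dates and the pages they cite, score each page by citations it
# receives from later pages. All data here is invented.

from collections import Counter

pages = [
    # (page, day_published, cites)
    ("campus-blog", 1, []),
    ("tech-site", 3, ["campus-blog"]),
    ("national-news", 5, ["campus-blog", "tech-site"]),
]

influence = Counter()
published = {name: day for name, day, _ in pages}
for name, day, cites in pages:
    for cited in cites:
        if published[cited] < day:   # only count forward-in-time links
            influence[cited] += 1

print(influence.most_common(1))  # [('campus-blog', 2)]
```

With real data the interesting structure is not this raw count but the communities and geographies the early pages belong to, which is what tells you where to run a market trial.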
What interests me most about this kind of innovation is not the technical details but the potential for transforming the business process - to develop greater organizational intelligence. Two years ago, Onalytica founder Flemming Madsen laid out a vision in his blog Predicting Sales from Online Buzz (Jan 2008) and Predicting Sales from Online Buzz - 2 (April 2008).
- predicting sales, market share and other outcomes
- detect changes in competitors’ behaviour
- setting targets known as “influence budgets”
- using “influence budgets” to predict whether an organization is on track to meet its actual revenue or market share targets, and take remedial action if required
But here's the thing I found most exciting. If an organization can develop sufficient confidence in the reliability of the predictions resulting from this kind of business intelligence, then the visible growth of influence and mindshare may enable it to sustain longer-term programmes and campaigns, instead of cancelling projects that don't deliver an immediate commercial return. Some people might imagine that an organization driven by buzz would be excessively short-termist - but the champions of this approach insist that good use of buzz by a truly intelligent organization could have quite the opposite effect.
I have talked to one large organization using this technology, and I'm hoping to publish this as a case study in the near future. In the meantime, I should be delighted to talk to any other organizations, to see what is actually happening in practice.
See also Just Shut Up and Listen, by Kishore S. Swaminathan of Accenture.
Thursday, May 20, 2010
From Organizational Stupidity to IT Disaster
Tony's article extracted several key points from the Public Accounts Committee report (pdf) on the C-Nomis project. As he points out, C-Nomis is by no means an isolated example of failure, and much the same could be said of other big IT-based change programmes such as the NPfIT. So I thought I'd try and map his key points against the Symptoms of Organizational Stupidity I outlined a few days ago.
On a preliminary analysis of Tony's summary, at least six of these symptoms are strongly indicated, and can be clearly linked to a very poor outcome. I should be very interested to carry out a more detailed analysis.
Denial
- Bending the truth. "The programme team running C-NOMIS reported that the programme was delivering on time and to budget, when it was not."
- Over-optimistic 'good news' culture.
- NOMS significantly underestimated the technical complexity of the project.
Guesswork
- No-one was actively monitoring the budget.
- NOMS cannot provide the detail.
Meddle
- There was no sustained effort by NOMS to simplify and standardise its business processes reflecting management's misplaced confidence in C-NOMIS, their unrealistic expectations of what could be achieved by an IT solution and their underestimation of the time and costs to deliver it.
- "Prison and probation information requirements were quite different and each of the 42 probation areas had different ways of working. End-to-end offender management was little more than a concept, and what it meant in practice and the IT needed to support it had not been worked through."
Muddle
- Remarkable lack of insight and rigour, coupled with naivety and over-optimism.
- No-one has been held to account. ... The vacuum of leadership within NOMS contributed to confusion and created challenges for suppliers and the project team.
Repetition
- Poor decision taking and weak project management on many occasions. The same lessons have still not been learnt.
- It is deeply depressing that after numerous highly critical PAC reports on IT projects in recent years, the same mistakes have occurred once again.
Short-sighted
- Serious failure to understand the magnitude and cost of the changes which would be needed.
Monday, May 17, 2010
Are BPM professionals experts in collaboration?
1. To understand the factors supporting collaboration between knowledge workers, a questionnaire was sent to BPM professionals. The paper doesn't make clear whether they were being asked about their own personal collaboration, or about their opinions about collaboration in general. In any case, we might imagine that BPM professionals have a particular perspective on collaboration, which might distort the survey.
2. Nearly a quarter of the BPM professionals couldn't make sense of the collaboration model on which the survey was based, and were unable to answer all the questions. Instead of treating this as a sign that there might be a problem with the model, the researchers chose to exclude these from the analysis. They then argue that the remaining responses validate their model.
3. The survey doesn't measure collaboration, it measures opinions about collaboration, from a carefully selected group of knowledge workers. Perhaps not surprisingly, the opinions are pretty consistent with the kind of management literature that these knowledge workers might be expected to have read. Except that the answers about "purpose" were all over the place (which I can well imagine, given the uncertain intentions of the question), so this factor failed a statistical test (Cronbach's Alpha) and could be quietly dropped from the model.
4. The direct relation between collaboration and the performance of an enterprise is not tested, because the questionnaire did not include any financial questions. (It would seem that financial information is "sensitive"; collaboration itself presumably isn't.) Let's hope the students manage to extend their research to include questions about the financial situation of an enterprise, allowing them to demonstrate and explain how maturity of collaboration of knowledge workers (as perceived and understood by BPM professionals) might actually help to improve the performance of an enterprise.
5. I generally regard opinion surveys as low-grade research because they usually merely recycle received opinion. While I understand that this may be the easiest and cheapest form of research, especially for students and software industry analysts, I expect to see some acknowledgement of the potential distortion, rather than merely taking the collected opinions at face value.
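For reference, Cronbach's Alpha - the statistic that the "purpose" factor failed - is straightforward to compute: it compares the summed variance of individual items with the variance of respondents' total scores. The respondent data below is invented.

```python
# Cronbach's Alpha for a set of survey items: the ratio-based
# internal-consistency statistic mentioned above. Scores are invented.

def cronbach_alpha(items):
    """items: one list of respondent scores per survey question."""
    k = len(items)                         # number of items
    n = len(items[0])                      # number of respondents
    def var(xs):                           # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = sum(var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_vars / var(totals))

# Three items answered consistently by five respondents -> high alpha
consistent = [[4, 5, 3, 4, 2], [4, 4, 3, 5, 2], [5, 5, 3, 4, 1]]
print(round(cronbach_alpha(consistent), 2))  # 0.93
```

An alpha this high says only that respondents answered the three questions consistently with each other; as with the survey itself, it says nothing about whether the questions measure anything real.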
Saturday, May 15, 2010
Competing on Analytics
Davenport and Harris (I shall call them DH) start by defining business intelligence as a set of techniques and processes that use data to understand and analyse business performance, from data access and reporting to analytics proper, together addressing a spectrum of questions about an organization's business activities. They position analytics at the higher-value and more proactive end of this spectrum, and offer a graph that appears to correlate the degree of intelligence with competitive advantage (DH pp7-8).
| Data Access and Reporting | Analytics |
| --- | --- |
| Standard reports - What happened? | Statistical analysis - Why is this happening? |
| Ad hoc reports - How many, how often, where? | Forecasting/extrapolation - What if these trends continue? |
| Query / drill-down - Where exactly is the problem? | Predictive modelling - What will happen next? |
| Alerts - What actions are needed? | Optimization - What is the best that can happen? |
However, intelligence is not just about asking clever questions, but includes a number of other capabilities described in the book including fact-based decision-making (DH pp 44-7) (sometimes known as evidence-based policy) and the capture of learning from organizational experiments (DH pp178-9).
The book is clearly tied to a software industry perspective on intelligence, understanding business intelligence as a collection of systems and services, and a chapter devoted to software technology is entitled "The Architecture of Business Intelligence" (Chapter 8). For my part, I might have wished that this chapter had been called "The Software Architecture of Business Intelligence", and that the wealth of material on introducing and enhancing analytical capabilities (Chapter 6) and managing analytical people (Chapter 7) had been presented as "The Organizational Architecture of Business Intelligence". Or even "The Architecture of Organizational Intelligence". But that's the book I'm writing, so I guess I shouldn't complain.
The authors clearly understand that it is not the possession but the use of this technology that is the critical differentiator. They identify four common characteristics of the most analytically sophisticated and successful firms (DH p23).
- Analytics supported a strategic, distinctive capability.
- The approach to and management of analytics was enterprise-wide.
- Senior management was committed to the use of analytics.
- The company made a significant strategic bet on analytics-based competition.
(As an aside here, I am always slightly sceptical about senior management commitment being a precondition for success, because I've generally seen it as a postcondition for success. When an initiative is successful, senior managers will appear from nowhere to take the credit.)
The book ends with an eloquent argument for analytical capability (p 186), framed in terms of optimizing efficiency and effectiveness:
- Efficient and effective marketing campaigns and promotions
- Excellent customer service, resulting in customer loyalty
- Ultraefficient supply chains, optimized inventory levels
- Precise evaluation and compensation of the workforce
- Early recognition and diagnosis of problems
The book opens with the story of Netflix. The company was founded in 1997, competing against established video rental companies like Blockbuster. "Pure folly, right?" the authors ask rhetorically, and then go on to attribute Netflix's success to the incorporation of analytics into its business operations (DH pp3 ff).
Okay, there is clearly a lot of intelligence in the way Netflix operates, but what equally interests me is the intelligence involved in creating Netflix in the first place. Obviously the authors don't regard the founding of Netflix as folly - in retrospect it was a very smart move indeed - but this kind of decision is not primarily based on the kind of analytics described in the book but involves a completely different kind of intelligence. Hence organizational intelligence is not just analytics.
Thomas H Davenport and Jeanne G Harris, Competing on Analytics, The New Science of Winning. Harvard Business School Press, 2007. Extract from the 2017 edition here, including what appears to be a more recent look at Netflix. https://www.huffpost.com/entry/how-netflix-uses-analytics-to-thrive_b_5a297879e4b053b5525db82b
See also Rhyme or Reason: The Logic of Netflix (June 2017), Does Big Data Drive Netflix Content? (January 2021)
My book is available here Building Organizational Intelligence (LeanPub, 2012)
Update February 2023. This post was flagged as breaking community guidelines. This may have been a consequence of an extended quote from the Davenport and Harris book, which I have now replaced with a bullet point summary.