Showing posts with label adoption. Show all posts

Friday, August 09, 2019

RPA - Real Value or Painful Experimentation?

In May 2017, Fran Karamouzis of Gartner stated that "96% of clients are getting real value from RPA" (Robotic Process Automation). But by October/November 2018, RPA was declared to be at the top of the Gartner "hype cycle", also known as the Peak of Inflated Expectations.

So from a peak of inflated expectations we should not be surprised to see RPA now entering a trough of disillusionment, with surveys showing significant levels of user dissatisfaction. Phil Fersht of HfS explains this in terms that will largely be familiar from previous technological innovations.
  • The over-hyping of how "easy" this is
  • Lack of real experiences being shared publicly
  • Huge translation issues between business and IT
  • Obsession with "numbers of bots deployed" versus quality of outcomes
  • Failure of the "Big iron" ERP vendors and the digital juggernauts to embrace RPA 
"You can't focus on a tools-first approach to anything." adds @jpmorgenthal

There are some generic models and patterns of technology adoption and diffusion that are largely independent of the specific technology in question. When Everett Rogers and his colleagues did the original research on the adoption of new technology by farmers in the 1950s, it made sense to identify a spectrum of attitudes, with "innovators" and "early adopters" at one end, and with "late adopters" or "laggards" at the other end. Clearly some people can be attracted by a plausible story of future potential, while others need to see convincing evidence that an innovation has already succeeded elsewhere.
Diffusion of Innovations (Source: Wikipedia)

Obviously adoption by organizations is a slightly more complicated matter than adoption by individual farmers, but we can find a similar spread of attitudes within a single large organization. There may be some limited funding to carry out early trials of selected technologies (what Fersht describes as "sometimes painful experimentation"), but in the absence of positive results it gets progressively harder to justify continued funding. Opposition from elsewhere in the organization comes not only from people who are generally sceptical about technology adoption, but also from people who wish to direct the available resources towards some even newer and sexier technology. The "pioneers" have moved on to something else, and the "settlers" aren't yet ready to settle. There is a discontinuity in the adoption curve, which Geoffrey Moore calls "crossing the chasm".

Note: The terms "pioneers" and "settlers" refer to the trimodal approach. See my post Beyond Bimodal (May 2016).

But as Fersht indicates, there are some specific challenges for RPA in particular. Although it's supposed to be about process automation, some of the use cases I've seen are simply doing localized application patching, using robots to perform ad hoc swivel-chair integration. Not even paving the cow-paths, but paving the workarounds. Tool vendors such as KOFAX recommend specific robot types for different patching requirements. The problem with this patchwork approach to automation is that while each patch may make sense in isolation, the overall architecture progressively becomes more complicated.

There is a common view of process optimization that suggests you concentrate on fixing the bottlenecks, as if the rest of the process can look after itself, and this view has been adopted by many people in the RPA world. For example, Ayshwarya Venkataraman, who describes herself on LinkedIn as a technology evangelist, asserts that "process optimization can be easily achieved by automating some tasks in a process".

But fixing a bottleneck in one place often exposes a bottleneck somewhere else. Moreover, complicated workflow solutions may be subject to Braess's paradox, which says that under certain circumstances adding capacity to a network can actually slow it down. So you really need to understand the whole end-to-end process (or system-of-systems).
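Braess's paradox is easiest to see in the standard textbook network, sketched below. The numbers (4000 drivers, cost functions n/100 and 45) are the conventional illustration, not figures from any RPA case.

```python
# The classic Braess network (textbook numbers, not from any RPA case).
# 4000 drivers go from Start to End. Two links are congestion-sensitive
# (cost n/100 minutes for n drivers); two links have a fixed cost of 45.

DRIVERS = 4000

def time_without_shortcut():
    # Route A: Start -> X (n/100), then X -> End (45)
    # Route B: Start -> Y (45),    then Y -> End (n/100)
    # By symmetry the drivers split evenly at equilibrium.
    n = DRIVERS / 2
    return n / 100 + 45          # 65 minutes per driver

def time_with_shortcut():
    # Add a zero-cost link X -> Y. For each individual driver the path
    # Start -> X -> Y -> End (n/100 + 0 + n/100) is never worse than the
    # alternatives, so at equilibrium everyone takes it - and everyone
    # travels longer than before the "improvement".
    n = DRIVERS
    return n / 100 + n / 100     # 80 minutes per driver

print(time_without_shortcut())   # 65.0
print(time_with_shortcut())      # 80.0
```

Adding capacity degrades the equilibrium from 65 to 80 minutes, which is why local fixes to a workflow need to be evaluated against the end-to-end process.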

And there's an ethical point here as well. Human-computer processes need to be designed not only for efficiency and reliability but also for job satisfaction. The robots should be configured to serve the people, not just taking over the easily-automated tasks and leaving the human with a fragmented and incoherent job serving the robots.

And as the number of bots grows (along with the number of bot licences you've bought), the challenge shifts from getting each bot to work properly to combining large numbers of bots in a meaningful and coordinated way. Adding a single robotic patch to an existing process may deliver short-term benefits, but how are users supposed to mobilize and combine hundreds of bots in a coherent and flexible manner, to deliver real lasting enterprise-scale value? Ravi Ramamurthy believes that a rich ecosystem of interoperable robots will enable a proliferation of automation - but we aren't quite there yet.



Phil Fersht, Gartner: 96% of customers are getting real value from RPA? Really? (HfS, 23 May 2017); With 44% dissatisfaction, it's time to get real about the struggles of RPA 1.0 (HfS, 31 July 2019)

Geoffrey Moore, Crossing the Chasm (1991)

Susan Moore, Gartner Says Worldwide Robotic Process Automation Software Market Grew 63% in 2018 (Gartner, 24 June 2019)

Ravi Ramamurthy, Is Robotic Automation just a patchwork? (6 December 2015)

Everett Rogers, Diffusion of Innovations (First published 1962, 5th edition 2003)

Daniel Schmidt, 4 Indispensable Types of Robots (and How to Use Them) (KOFAX Blog, 10 April 2018)

Alex Seran, More than Hype: Real Value of Robotic Process Automation (RPA) (Huron, October 2018)

Sony Shetty, Gartner Says Worldwide Spending on Robotic Process Automation Software to Reach $680 Million in 2018 (Gartner, 13 November 2018)

Ayshwarya Venkataraman, How Robotic Process Automation Renounces Swivel Chair Automation with a Digital Workforce (Aspire Systems, 5 June 2018)


Wikipedia: Braess's Paradox, Diffusion of Innovations, Technology Adoption Lifecycle


Related posts: Process Automation and Intelligence (August 2019), Automation Ethics (August 2019)

Wednesday, November 03, 2010

Collaboration Chasm

I've just been looking at a Collaboration Framework published by the CISCO community earlier this year [Insights from the Collaboration Consortium Year One], which uses the well-known Crossing-the-Chasm model of technology adoption, popularized by Geoffrey Moore and loosely based on the work of Everett Rogers. The CISCO term “collaboration chasm” refers to the notion that there is a discontinuity between the use of a limited number of technologies by early adopters and the large-scale adoption by mainstream users.

Rogers himself modelled adoption as a continuous S-curve, but Moore's notion of a chasm is popular with supply-side marketing people, because it suggests a single heroic leap from an experimental product to a commercial success.

In the context of collaboration technologies within the enterprise, the "chasm" metaphor can be interpreted in multiple ways, all of which are discussed or implied in the CISCO document.

  • The categorical difference between individual early adopters and everyone else. A simplified model of adoption would regard "early adopter" as a personality type, predicting a person's attitude to any kind of innovation, similar to the Adaptor/Innovator scale developed by Michael Kirton. (Rogers himself recognized that a person could easily be an early adopter of one technology and a late adopter of another.) Additionally, some writers may wish to characterize particular organizations as early adopters.
  • The categorical difference between Generation X and Generation Y. The assumption that because younger people are likely to be more comfortable with certain classes of technology, they will therefore be more positively inclined to the adoption and use of these technologies in the workplace.
  • The difference between social and workplace use of these technologies. Jagdish Vasishtha thinks this has something to do with personal choice, saying "there is a growing chasm between how people now communicate in their personal space and how they are forced to communicate in a corporate environment" [Crossing the Chasm, May 2009 (pdf)]. The CISCO document points out several reasons why technologies such as social networking or instant messaging don't necessarily transfer easily from one’s personal life to the workplace, and quotes Ray Ozzie on the "chilling effect" of organizational dynamics [Churchill Club, June 2009]. See also Richard Dennison, BT 2.0 Case Study, November 2007.
  • The step from undirected ("bottom-up") investigation to directed ("top-down") business process change. "Carefully shaping a subset of collaboration initiatives in a top-down fashion to align them with business priorities provides the required structure to scale up an organization’s efforts into the performance stage."
  • The step from isolated experimental use to integrated use. CISCO describes a 3-stage development strategy (1: Investigative, 2: Performance, 3: Transformational), and positions the "chasm" between stages 1 and 2. For an SAP example of the "chasm" between stand-alone collaboration and embedded collaboration, see Irwin Lazar, Collaboration in Context (May 2010).
  • "Collaboration creates shifts in the organizational mindset." This might possibly qualify as a second chasm between CISCO stages 2 and 3.

    However, there are some misalignments between these different notions. For example, the fact that many young people are familiar with social networking in their private lives doesn't necessarily imply that they will be better able than their parents to use social networking effectively in their professional lives. Effective use in a given social context depends on purpose and style, and social and organizational experience may be more relevant here than technical skill and enthusiasm.




    In my work on technology adoption within the enterprise, I make the distinction between broad adoption and deep adoption. Broad adoption basically means that a given product or platform is used by a lot of different people in a lot of places, but this usage may be casual, occasional and uncommitted. Deep adoption means that the product or platform is used intensively, is embedded in processes and working practices, as well as being integrated with other relevant technologies, but may only involve a few people or departments.

    The distinction between broad adoption and deep adoption implies two "chasms" at right angles - one between narrow and broad, and one between shallow and deep. The tactics required to encourage broad adoption are significantly different from the tactics needed to implement deep adoption. CISCO's basic 3-step strategy appears to involve crossing both of these chasms at the same time, but the document also refers to some alternative strategies, including a "cultivation" strategy followed by Statoil. Some adoption strategies may permit different aspects of technology adoption to be decoupled; indeed, a number of the examples cited from CISCO's own internal processes involve localized collaboration within specialized processes, although this may be because enterprise-wide cross-process collaboration is harder to explain.
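The two axes can be sketched as a toy classifier. All thresholds and input figures below are my own illustrative assumptions, not anything from the CISCO document.

```python
# Toy placement of a product on the two adoption axes described above.
# Thresholds (0.5, cap of 5 integrations) are illustrative assumptions.

def classify(orgs_using, total_orgs, integrations_per_org):
    breadth = orgs_using / total_orgs            # how widely it is used
    depth = min(integrations_per_org / 5, 1.0)   # how embedded it is (capped)
    return (("deep" if depth >= 0.5 else "shallow") + ", " +
            ("broad" if breadth >= 0.5 else "narrow"))

# Used by 90 of 100 departments, but barely integrated anywhere:
print(classify(orgs_using=90, total_orgs=100, integrations_per_org=1))
# -> shallow, broad
```

A product can thus score "shallow, broad" (casual use everywhere) or "deep, narrow" (intensive use in one department), and the tactics for crossing each of the two chasms differ accordingly.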

    The distinction between broad adoption and deep adoption may also encourage us to look at early adopters more critically. Those who constantly quest for technological novelties may not appreciate or experience the full power of a revolutionary innovation, and may not be the best people to lead serious and sustained commitment to its enterprise-wide and system-deep adoption. By the time the organization is moving into CISCO's stage three, the so-called early adopters may have switched their attention and allegiance to something else.

    Thursday, December 10, 2009

    What is Technology Maturity?

    @madgreek65 asks whether cloud computing is "mature", and whether it matters (What the masses are missing about the cloud).

    I suggest that there are several characteristic features of a technology or product, indicating whether it is mature or immature.


    Product Stability
      Immature: Subject to frequent and significant improvements. In a state of "permanent beta".
      Mature: Stable. New releases are fairly predictable upgrades.

    Conceptual Stability
      Immature: Terminological disputes. Disagreements as to what the technology is "all about".
      Mature: Terminology "taken for granted".

    Technology-in-use
      Immature: A small number of early adopters trying ambitious stuff. Little consensus about how the technology should be deployed and used.
      Mature: A large user community doing similar stuff. Use of the technology has become standardized "best practice".

    Growth
      Immature: Large untapped market. Rapid growth possible, under favourable conditions.
      Mature: Relatively little scope for further growth.

    Metrics
      Immature: Absent or unreliable.
      Mature: Systematized.

    Adoption Risk
      Immature: High.
      Mature: Low.

    Adoption Benefits
      Immature: Potentially high.
      Mature: Moderate.

    This notion of technological maturity has the following consequences.

    1. It is unrelated to quality or value. A mature technology or product can be unimaginative, boring, almost obsolescent, whereas an immature technology can be visionary, exciting in conception and engineered to the highest standards.

    2. Maturity is as much to do with the community of users (technology-in-use) as about the designed products (technology-as-built).

    3. The adoption roadmap for an immature technology may be rather complicated. One of the main reasons for this is that the adoption programme needs to bridge the gap between technology-as-built and technology-in-use. There is also a common preference for a cautious stepwise approach - pilot projects, proof of concept and so on. But the stakes can be much higher.

    As Mike points out, for any technology that is in the hype phase, there is a lot of resistance to change, and this is certainly true for cloud computing. Mike suggests that a lower-risk adoption approach will win over the sceptical.
    "The reason why I encourage those who are pessimistic about the cloud to try one of these low risk scenarios is once they see how easy it is, how productive they can be, and how inexpensive the project will be, then maybe they will see the value and investigate further."
    For many people, this is the preferred approach for an immature technology. However, there are some specific risks associated with a slow adoption curve, which I shall discuss in a future post.


    See also previous post: CEP and technological maturity

    Wednesday, May 27, 2009

    Layered Architecture of Technology Adoption

    @colin_jack asked whether companies ever really change (ignoring situations where there is a big change of staff - one group leaves, another group joins). People seem to slip back into their old way of working within weeks or months. He was thinking particularly of the fast big-bang changes companies go for: Agile, SOA, and so on.

    Companies do often change their nature as they get larger and older, but this is a slow process. Managed organization change involves several loosely-coupled streams of activity, which operate on different timetables. Installing new software, sending everyone on a training course, renegotiating project charters and external service contracts, building experience and confidence in new practices - these things all happen at different speeds.

    A key principle of evolutionary change is that the slow-moving layers generally dominate the faster-moving layers. If your organization wishes to adopt "agile" or "service-orientation" or anything like that, this requires attention to the slow-moving layers as well.

    When I was working with CASE tools in the late 1980s, I and a few colleagues constructed an adoption roadmap to help with planning technology adoption. This roadmap was designed in layers or streams, not just to aid with separation of concerns, but also to manage the different characteristic pace of change in each stream. This is architectural thinking applied to organizational change. And nearly twenty years later, exactly the same principles were used by the CBDI Forum in constructing a roadmap for SOA adoption.

    Factoring in Barriers to Entry

    @toddbiske and @djbressler make some interesting points about the adoption of software tools and platforms (Factoring in Barriers to Entry). Todd's specific example is applying BPM and BPMN tools to support EA processes, but his remarks would apply in many other contexts.

    Todd's basic argument is that adoption is more important than sophistication. Better to get people started with simple tools and platforms - for example Visio and Sharepoint - than do anything that requires the IT department to get its hands dirty. (In an earlier post, Todd identified the IT department as one of the Barriers to SOA Adoption.)

    But I don't think of adoption as a simple binary event (from "unadopted" to "adopted") but as a curve (from shallow occasional use to sophisticated and seamless integration into working practice). And although that's not how Todd is using the word "adoption", I think his argument is consistent with a richer notion of adoption. For example, he acknowledges a concern that "low barrier to entry eventually become a boat anchor".

    If a vendor boasts thousands of users, and then I discover this merely means installing the trial version of the software and playing with it once, then I'm not very impressed. If a vendor has a dozen customers at the top of the curve, that's much more impressive than a thousand at the bottom of the curve.

    From this point of view, lowering the barriers to entry is only half the story. What I'm interested in is the shape of the whole adoption curve, which enables people to find an appropriate level of adoption and not get stuck on the nursery slopes. That's where I think software like Visio and Sharepoint falls down - they may be easy to get started, but they can get hairy if you want to do anything more interesting.

    Wednesday, August 06, 2008

    Secret Life of New Tools

    In his post, The Secret Life of CEP, Tim Bass is unimpressed by the claim that CEP vendors have dozens of unnamed customers doing unspecified things.

    Twelve paying customers may not be much to boast about. For most new software tools, a typical breakdown would be as follows. Three of the companies are doing small pilot studies, three of them have finished the pilot and don't know what to do next, two of them have started a high risk project with little chance of delivering before 2027, three of them have been distracted by some other shiny new tool, and the one remaining company might possibly be using your tool on serious and achievable projects. Is CEP doing better than this? I think we should be told.

    Sunday, June 15, 2008

    Programming Languages

    Does event processing need a new programming language? And what are the prospects of programmers adopting a new language?

    The answer to both questions seems to hinge on the word "new". If we add a few constructs to an existing programming language, does that count as new?

    Louis Lovas (Progress Apama) and Mark Tsimelzon (Coral8) agree that a programming language for event processing needs to be familiar, easily recognizable, explained in five minutes. In other words, new but not very new.

    Louis Lovas (Progress Apama): Successful languages - show me the code please
    Mark Tsimelzon (Coral8): What Makes a Programming Language Successful?

    As individuals, programmers may be attracted by new and innovative languages, but there is a collective conservatism. There are always many interesting niche languages, sometimes with a devoted fanbase but with practically no chance of achieving critical mass; very few languages ever achieve widespread adoption and popularity. Louis attributes the current dominance of Java to a process of natural selection, but of course this is a selection based on the fact that popularity breeds popularity and does not indicate any intrinsic superiority of Java over the thousands of unheard-of alternatives.

    And adoption is only half the story. Programmers may indeed be unwilling to invest more than five minutes in checking out a new programming language, and this unwillingness possibly filters out a lot of potentially valuable innovations. But what if new types of software application require new types of thinking? If programmers are only using language constructs that they are comfortable with, this suggests that they are also only using familiar styles of thinking, and may fail to pay attention to some important new aspects and issues. Consequently some errors may creep into their programs.

    Obviously brilliant programmers never make any errors, but programming languages must be designed for not-so-brilliant programmers as well. So we must ask what the patterns of errors are when a particular language is used in a particular context, and what we can do to reduce the error rate.

    Some of the advocates of niche programming languages get rather smug at this point. They argue that although their favourite programming language may be a bit more difficult to learn if you are only familiar with C or Java, it is far less error-prone. But that argument is pretty academic if no one is using or interoperating with these languages.

    There is sometimes therefore a trade-off between adoption and correct use. A language that is easy to adopt may also be easy to use badly. Whereas one that is more fail-safe in use may initially be more difficult to adopt. While language vendors undoubtedly care a little about correct use, they obviously care a great deal more about adoption and adoptability. Whereas the users of these languages should care equally about both.

    Update

    In a post entitled On the Right Event Processing Language, Opher Etzion adds: "abstractions that enable to think naturally about a certain type of functions is more important than familiar syntax". Actually I'm not convinced that programmers think naturally about anything - that's what makes them programmers in the first place - but apart from that I think he's got a point.

    Wednesday, April 26, 2006

    SalesForce adoption

    Howard Smith (CTO of CSC) has done an analysis of SalesForce.com, provocatively called SalesForce Dot Bomb, in which he uses what he calls Dilution Ratio (the average number of users per customer) as an adoption metric. He suggests that the dilution ratio is a reasonable measure of whether Salesforce is penetrating larger firms. 

    I think this means individual users per user organization (although I'm not sure whether this is what his figures actually represent). I think it would make more sense to call this a concentration ratio - the more individuals using SalesForce within a given organization, the higher the concentration. To my ears, the word dilution suggests the exact opposite - spreading a given number of users more thinly across a larger population of non-users. 
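Whatever we call it, the metric itself is trivial to compute. The figures below are made up for illustration; they are not Howard's actual numbers.

```python
# Hypothetical figures for illustration - not Howard Smith's actual data.

def concentration_ratio(total_users, customer_orgs):
    """Average number of individual users per customer organization."""
    return total_users / customer_orgs

# A vendor reporting 400,000 users across 20,000 customer organizations:
print(concentration_ratio(400_000, 20_000))  # 20.0
```

The interesting question is not the arithmetic but the trend: whether that average is rising (deeper penetration of each organization) or merely being held up by new-name business.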

    Howard's figures show a fairly slow increase of concentration, and he interprets this as problematic for the commercial success of SalesForce in providing services to large enterprise. I am not convinced by this interpretation. User volumes are increasing and concentration is increasing. Isn't this a good thing? 

    Technology adoption can be mapped against two axes - individual adoption and collective adoption. Traditional enterprise software vendors have always tried to increase concentration, because this reduces the cost of sales. You don't make money selling one copy of your product to Megacorp Enterprises Inc for a pilot project that requires loads of unpaid customer support. You start to make serious money when Megacorp integrates your product/service into its architecture/standards, and buys a thousand copies for distribution around the organization. That's when the salesman takes a well-earned cruise around the world. Clearly a thousand copies to Megacorp, even at a significant discount, is going to be more profitable (at least in the short term) than a thousand sales to a thousand separate pilot projects.

    But traditional software vendors have also paid a lot of attention to "new name" business. You cannot sustain growth indefinitely by increasing the concentration (penetration) within existing customer organizations. Thus in very crude terms, significant growth comes ultimately from new name customers, while profit comes from better concentration in existing customers. In some sectors of the software industry, the vendors with the best concentration are the ones with mature products and trapped customers. (Sometimes it seems as if it's not the products that are treated as cash cows but the customers.) 

    Howard is now raising a very interesting question - to what extent is this business model relevant to the new breed of software-as-a-service vendors such as Salesforce? Given that the economics of scale are different in the SaaS world, does this mean that concentration/dilution no longer has such a significant impact on a software company's profitability and commercial viability? Or perhaps concentration/dilution are still relevant, but these concepts now need to be understood differently, against a different kind of reference model? 

    Remember that we are talking here about a CRM solution. Let's say that Megacorp Enterprises Inc adopts the Salesforce.com solution for all its customer data. Let's say that Megacorp has 1,500,000 customers, with 250,000 new customers per year. That seems like a much more interesting adoption metric than the number of Megacorp employees who are registered users of the Salesforce service - particularly if some of the "users" of the Salesforce services are lumps of software rather than human beings.  

     

    via Barry Briggs

    Friday, September 16, 2005

    Software Hype Curve

    The Gartner Group produces a large range of technology trends and predictions, based on a so-called Hype Cycle model. (The term Hype Cycle implies that things come round again. But the model is not cyclic, so it is more accurate to refer to it as a Hype Curve model.) I have just been looking at a Gartner document that includes curves for 1995 and 2005.

    I have posted some comments on my Demanding Change blog (formerly known as Innovation Matters) concerning the degree of rigour and empirical support underpinning Gartner's analysis. What I want to comment on here are some specifics about some of the software technologies we've been tracking ourselves.

    Issue: Interrelated technologies
    Example: SOA is just entering the trough of disillusionment, but will be plateau-ing in 2-5 years. Web Services-enabled Business Models is a bit further behind. Meanwhile Internal Web Services is reaching the plateau of productivity.
    Comment: When technologies are interrelated, there is likely to be some temporal coupling between their dissemination and adoption.

    Issue: Implied technologies
    Example: MDD hasn't peaked yet, apparently. Some analysts are predicting that MDD will peak when Microsoft actually ships its DSL + Software Factory products.
    Comment: Gartner's selection of technologies omits some key enablers.

    Issue: Vendor-specific technologies
    Example: At present, DSL + Software Factory is a Microsoft-specific initiative.
    Comment: Gartner tries to talk about all technologies as if they were vendor-independent, but this doesn't always work.

    Issue: Absent technologies
    Example: CBD (CBSE) doesn't get a mention. Perhaps some people now see it as having been a blind alley, while others see it as common-sense design.
    Comment: CBD (CBSE) clearly means different things to different people.

    Issue: Deja vu technologies
    Example: Some might consider we have been through the MDD hype curve once. Except it was called CASE the first time. Plus ça change ...
    Comment: So maybe it should be a Hype Cycle after all!

    Based on discussion with John Dodd, Oliver Sims and Lawrence Wilkes.