I read Competing on Analytics because my boss began swearing by it, and my conversations with her were starting to get seriously confusing. So I bought a copy, and was plowing diligently through it at a local Rochester coffee shop, when a friendly woman — your inevitable next-table laptop warrior — noticed the book, came up to me, and struck up what turned out to be a very interesting conversation (which ended with her heading off to the nearest book store, to buy herself a copy). Since I’ve only ever struck up conversations over a book with random strangers twice before in my life, that struck me as an important piece of evidence in favor of the book. So here is my review-slash-summary.
The short version: well worth a read even if you think you know what analytics is about. And that’s coming from a resolutely non-data-driven guy who normally wouldn’t touch such a bean-counter-ish book with a ten-foot pole. Grudgingly, I have to admit I learned a lot, and saw more than I liked of my own flaws revealed, in the pattern of my resistance to the book’s ideas.
You should read this book if you don’t have a ready and clear answer to the question: “what are the differences among the concepts of business intelligence, data mining, analytics and six sigma?” That’s actually also a pretty good interview question for the hordes of job-seekers who are undoubtedly going to repackage themselves as analytics professionals following this book (my idea of a good answer is provided later in this review).
Competing on Analytics, by Thomas H. Davenport and Jeanne G. Harris
There are two good reasons to read this book. First, you are going to hear a lot about it wherever you work, and it is likely going to figure in your company’s next effort at introspection and change, so you might as well get ahead of the crowd. Second, there is actually a lot of good stuff in this book, whether or not you are part of the “data-driven” choir (I am not; though I work closely, kicking and screaming, with many people who are).
Here is the authors’ own definition of the term analytics (which should really be ’empiritics’):
“By analytics we mean the extensive use of data, statistical and quantitative analysis, explanatory and predictive models, and fact-based management to drive decisions and actions…analytics are part of what has come to be called business intelligence: a set of technologies and processes that use data to understand and analyze business performance.” (Page 7)
That definition is mostly justified throughout the book, except for the “explanatory” bit (which I’ll critique in a bit). The key premise of this book is that you can compete on analytics defined in this way; that it can be the basis of a sustainable competitive advantage and market differentiation.
For those who get ideas better through examples, here are a few examples whose nature as ‘analytics’ competitors is obvious: Amazon, Google, Netflix, and Progressive Insurance. A few not-so-obvious ones: Capital One, Anheuser-Busch (aka “Budweiser”), Harrah’s gaming and John Deere. Each of these companies differentiates itself from its competitors through mastery of the data flows in its operations.
What is (and isn’t) New
Analytics is about corporations learning to drink from the fire hose of cheap data that the modern IT systems wrapped around their operations can generate. So what is new? People have been championing a variety of data-driven approaches to management since Frederick Taylor, so what difference does, say, a modern point-of-sale (POS) or RFID system make? Here’s what’s new, per the authors:
- Unlike data mining, analytics is about operational interpretation and visualization, not collecting or reporting. This distinction can range from significant to irrelevant depending on the case you are talking about.
- Analytics is to be viewed as a subset of business intelligence (BI), within which it lives next to its older, stupider sibling, reporting. I am not sure this contextualization helps anybody, since BI is a massively overloaded term.
- Unlike ideas like lean six sigma and its predecessor, total quality management, analytics (when mature) is i) truly global in scope rather than globally-local, ii) about an enabling information infrastructure for all processes and functions (managed at an enterprise level) and most importantly, iii) about responding opportunistically in real time to some sort of systemic variability via feedback, rather than about reducing process variability via episodic process measurement and re-engineering.
- Unlike most management doctrines, analytics needs an enabling technology, since it relies on continuous data flows to support high-frequency operational decision-making, rather than low-frequency interventionist decision-making. Without some sort of technology that provides a breakthrough cost structure for data generation (such as remote diagnostics, RFID, retail POS data aggregation or Web services), the data flows required for competing on analytics would simply be prohibitively expensive. That explains why Web-based businesses lead in best practices.
- Unlike related technology paradigms like “Service Oriented Architecture” and “Software as a Service,” analytics is about a business competence that has its locus in people, not (just) software.
- Perhaps most important: unlike previous data-driven management paradigms, analytics can be a competitive differentiator. This is such a critical claim that it deserves some probing, so I give it its own section.
So if you are championing analytics-based competition, and someone asks you how it is different from what failed before, here is your canned answer: “Continuous data flows, cheap, operational measurement systems, enterprise-level presence, differentiating capability.” Keep examples handy for those untalented in the area of business abstractions (a.k.a. the “all buzzwords=bullshit” crowd).
The most non-trivial claim in the book is that analytics can serve as a differentiator in the market. That means that it can provide a sustainable strategic advantage that won’t just be copied in 6 months and be reduced to yet another cost of doing business, whereby everybody in the stadium stands up, nobody gets a better view, and everybody has aching legs. Take, for example, something like six sigma, which is widely viewed as (and applied as) a cost discipline paradigm. Once everybody in your industry adopts it, it provides no differentiation (the quality differentiation that can appear as a result of TQM or LSS is not sustainable). Is this claim of analytics as a differentiator credible?
I spent a lot of time scratching my head, trying to shoot down this claim, and finally concluded that it was actually credible for one really good reason: there are domains and business models where rich data analysis can create a one-of-a-kind growing mass of digested information that cannot be duplicated. An example is Amazon’s recommendation system. You could easily reverse engineer the algorithms and start to compete, but you’d never catch up unless you stole the actual data. In fact, analytical competitive advantage is grown, not built overnight. This isn’t true of all domains, despite the authors’ claims, and if you aren’t sitting on such a growing pile of information with cumulative value, analytics may not provide differentiation for you.
In fact the only way I could think of an analytical competitor being overtaken is by an equally analytical new competitor with a better idea of how to crunch the data. Besides the obvious example of Google vs. Altavista and other early search dinosaurs, you might mull a more recent one: StumbleUpon versus other social bookmarking sites like Digg and del.icio.us. In these examples, the newer competitor managed to squeeze more intelligence out of the same data as the competition. Examples from non-Web domains are harder to come by (post any you know of please!).
The Missing Pieces: Modeling, Metaphor and Deduction
Competing on Analytics is not a fatally flawed book. It does have one moderately serious flaw that you need to be aware of. The dominant dichotomy in the book is between “intuitive” management and analytical management. While winning over old “gut feel” die-hards in your organization might be of enormous practical importance, to buttress the case for analytics by beating up intuition is to tilt at windmills (a particularly pointless windmill charge is a critique of Malcolm Gladwell’s Blink early in the book).
Intuition, understood in the usual sense of “gut feel,” is important whether or not you are swimming in high-quality digested data. The real dichotomy that ought to be addressed, and isn’t, is between empiricist and deductive modes of thought. Analytics (despite its name) is an empiricist doctrine. Its antithesis isn’t navigating from the gut; it is first-principles deductive (rather than inductive) reasoning.
That is not a fatal flaw though, since induction has its place, and the book does a sound job of what it sets out to do, which is to show that analytics can work and describe a reasonable road map. You don’t have to do an either-or between deduction and induction (you should be using both). That said, however, it is definitely an incomplete book if you are looking for an idea that you can hang an entire business model on. So long as you read the book with a view to filling three gaping holes from other sources, you should be able to keep yourself out of trouble.
Unless you are the sort of person who, after a ritual incantation of “correlation is not causation,” goes on to equate R-square with “percent of variability explained,” you will find each of the analytics case studies lacking a clear underlying conceptual model of the business. Amazon recommendations are cool, but do they explain the book industry? Capital One can crunch data to find profitable customers in pools previously dismissed as unprofitable and high risk, but does that explain the nature of credit risk? Bounce rates help you manage Web sites, but do they tell you how to architect Web sites? No, No and No.
For all its talk of “models,” the blunt reality is that analytics, as understood in this book, is about smart people asking the right questions, and trying to guide their thinking with the right sorts of glorified and opaque black boxes fitted around data. Analytics by itself will not provide you a clear and sound way to tell signal apart from noise. Conceptually powerful models (like, say, classical mechanics) do that. Conceptually weak models (like regressions) don’t.
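To make the weakness of regression-as-model concrete, here is a minimal sketch (my own toy example, not from the book, using made-up random data): fit an outcome that is pure noise against a pile of equally random predictors, and R-squared still climbs toward 1. The number “explains” variability in a purely mechanical sense while explaining nothing about the world.

```python
# Toy demonstration: high R-squared from fitting noise to noise.
# With nearly as many random predictors as observations, ordinary
# least squares "explains" most of the variance of a random outcome.
import numpy as np

rng = np.random.default_rng(0)
n, k = 30, 25           # 30 observations, 25 completely random predictors
X = rng.normal(size=(n, k))
y = rng.normal(size=n)  # the "outcome" is pure noise, unrelated to X

# Ordinary least squares fit
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta

# R-squared: 1 - SS_residual / SS_total
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"R-squared from fitting noise to noise: {r2:.2f}")
```

A conceptually powerful model would tell you in advance which predictors could matter and why; a regression will happily “fit” whatever you feed it.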
All you have underlying most analytics approaches cited in the book, then, is some first-order digestion of raw data, some serendipitous metrics that are at once easy to compute and useful, and a few pretty pictures. The number-portrait of your company is nowhere near being a complete (qualitative or mathematical) model you can manage with. The key weakness of such approaches is captured by Robert Pirsig’s rather clever assertion that “data without generalization is just gossip.” All the familiar criticisms about being fooled by randomness apply. Empiricist models are prey to all sorts of human failings (confirmation bias, self-selection bias, survivorship bias). They encourage prioritization on the basis of data availability rather than importance, and encourage risk aversion. They are susceptible to rare events (both rare worst-case scenarios and rare huge opportunities), and are prone to bad-faith manipulation.
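Survivorship bias, one of the failings above, is easy to demonstrate with a toy simulation (again my own hypothetical, not the book’s): hand ten thousand “analysts” ten coin-flip market calls each, and a handful will post perfect records by luck alone. A purely data-driven scan of the track records would crown them as skilled.

```python
# Toy survivorship-bias simulation: nobody here has any skill,
# yet some "analysts" will look prescient if you only study survivors.
import random

random.seed(7)

N_ANALYSTS = 10_000
N_CALLS = 10

# Each call succeeds with probability 0.5 -- pure chance.
records = [sum(random.random() < 0.5 for _ in range(N_CALLS))
           for _ in range(N_ANALYSTS)]

perfect = sum(1 for r in records if r == N_CALLS)
print(f"Analysts with a perfect {N_CALLS}/{N_CALLS} record: {perfect}")
# By chance alone you expect roughly 10000 / 2**10, i.e. about 10 of them.
```

If your data set consists only of the survivors’ records, no amount of analytics will reveal that the “skill” is an artifact of the selection.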
But most importantly, analytics alone won’t help you think more clearly about your business. They will only help you if you first start out with a clear idea about the nature of your business. The right foundational questions won’t pop out of the data.
Can this gap be filled by borrowing other sorts of business management ideas? Since real-time feedback-driven decision making is central to analytics, the only framework that even comes close is system dynamics, and that will not do (despite an initial infatuation with system dynamics, based on the field being a descendant of my own field of control theory, I am not currently a fan). So that is an interesting research challenge for an ambitious academic: what conceptual modeling framework and organizational theory can provide a sound foundation for analytics?
Until somebody comes up with a good answer, you’ll need to make up a good qualitative mental model of your business before you apply analytics.
A good metaphor is key to insight, understanding and tasteful application for just about any major idea. Without a guiding metaphor, you are operating blind, applying cookie-cutter solutions, and setting yourself up for failure creeping up on you via serious blind spots. You have little real sense of priority or proportion, and nothing approaching “understanding.” The book screams for a couple of evocative metaphors.
One perception (possibly incorrect) that I couldn’t suppress was that the authors (and apparently most of the practitioners of the field) are so massively left-brained they should be falling down sideways. That might be why they don’t supply any metaphors (even though pattern-recognition in data is, paradoxically, primarily a right-brained function).
So let me try to help: a business competing on analytics is to one that isn’t as a plant is to an animal. Animals have a central nervous system that pipes massive amounts of data to and from a very sophisticated (and usually central) processor, to drive action on the same time scales as key opportunities and threats. A tree, on the other hand, has no way of reacting to a chain saw.
This is not a particularly original idea, but it is a massively useful one. IBM has been talking about the biological “autonomic computing” metaphor for years, and I am surprised the authors of CoA didn’t make some use of the idea.
As I said, the key debate isn’t between intuition and data. It is between deduction and induction. The failings of induction are treated well enough by entire books (like the delightful Fooled by Randomness by Nassim Nicholas Taleb, which I referenced before). But beyond that, there are places deduction can go where induction cannot. A deductive process can figure out the general logic of a given “checkmate in two moves” situation from one example, while an inductive process (like, say, reinforcement learning) never will, even given all possible examples of the situation. This isn’t to say that deduction and induction are incompatible. They are merely necessary complements, and no treatment of one is complete without a compatible treatment of the other. CoA lacks this, so you will need to supply it yourself. Remember, applying deductive analysis to the foundations of a data-driven inductive process is not the same as applying inferences to the data itself (which assumes that the induction model is valid a priori).
The book is a matrix of running examples (primarily Harrah’s gaming, Progressive insurance and Capital One credit cards, with plenty more isolated ones) woven through a set of themed chapters divided into a “fundamental concepts” part and a “roadmap” part. Here is a summary:
Part I: The Nature of Analytical Competition
Chapter 1: The Nature of Analytical Competition
This chapter introduces the key examples, sets up the premise of analytics-as-competitive-advantage, provides a list of best-practice leaders and distinguishes analytics from business intelligence in general via a neat little graphic of “Degree of Intelligence versus Competitive Advantage.” There is also a bit of a historical overview walking you through notions of enterprise resource planning, decision support systems and other buzz phrases to lead you away from the temptations of conflation, along with some well-intentioned hedging about when analytics is appropriate. Finally, there is a suspiciously extended (and not particularly useful) discussion of analytics in sports which suggests the authors might be sports fans.
Chapter 2: What Makes an Analytical Competitor?
Chapter 2 identifies the key attributes of analytical competitors (page 23), three of which are substantive and one of which is trite. Pop quiz, spot the trite one:
- Analytics supported a strategic, distinctive capability
- The approach to, and management of, analytics was enterprise-wide
- Senior management was committed to the use of analytics
- The company made a significant strategic bet on analytics
The chapter is based on a survey of 371 medium to large firms, conducted in 2005. The results are synthesized in a nice graphic of a 5-level pyramid. The five stages are: analytically impaired, localized analytics, analytical aspirations, analytical companies and analytical competitors. The authors estimate that no more than 5% of all companies belong in the final, enlightened stage.
Chapter 3: Analytics and Business Performance
This chapter is largely a mish-mash of anecdotes and unstructured notes, but it offers some good, meaty check-off lists for your use that get at what turns mere floods of data into a basis for competition.
Chapters 4-5: Competing on Analytics with Internal/External Processes
These chapters form the heart of the first part of the book. A variety of examples, organized by business function (ranging from R&D to sales and marketing) should give you all the illuminating examples you need, no matter where you are thinking of applying analytics.
Part II: Building an Analytical Capability
This part is much weaker than the first part, so it deserves much less attention. In Chapter 6, “A Roadmap to Enhanced Analytical Capabilities,” we get a fairly unsurprising map, which might, nevertheless, save you some time and fill in some of your personal blind spots if you are championing analytics. Chapter 7, “Managing Analytical People” is probably the most useful and nontrivial chapter, making, in particular, several useful points about levels of training you might need in your workforce to compete on analytics.
Chapter 8, “The Architecture of Business Intelligence,” is worth a skim, but is largely forgettable, and you are better off reading a book on BI to really understand the problem this chapter addresses. Finally, Chapter 9 is titled “Future of Analytical Competition,” but is mostly a summary and conclusion with some lip service to extrapolation.
I’ll add one final note that you might find useful if you decide to get hold of this book and use it as ammunition in some sort of daring charge: you can easily string together a very complete and educational presentation to champion analytics by using the large number of diagrams, tables and bullet lists throughout the book. In fact, a good two-hour short course based on the book could be very effectively created using the figures alone, with some cherry-picked anecdotes to make the key points associated with each picture. A shorter version, buttressed by some proof-case data from within your company, should probably win a few small battles.