Guest Post on VentureBeat on the iPad

I have a guest post up on VentureBeat.com, Why Apple’s design approach may not work with the iPad. I haven’t written about innovation in a while, so for those of you who like my old posts on that subject, you’ll probably enjoy this.

In Arthur Hailey’s 1971 novel, Wheels, the hero has an epiphany while looking at the Apollo Lunar Module: “Ugly is Beautiful.” Watching the iPad launch coverage, I realized that Apple limits its innovation potential by never building anything ugly. “Ugly is beautiful” isn’t just an epigram. It has substance as an innovation design principle. There are theoretical and empirical reasons to believe that revolutionary products are, by necessity, ugly-beautiful (as an effect, not a cause: a technology is not revolutionary simply because it is ugly).

Do head on over and comment. This was written with my work hat on, as part of the general tech scene conversation-joining blogging I am doing as part of trailmeme.com promotion. Sometimes you get to mix work and play…


About Venkatesh Rao

Venkat is the founder and editor-in-chief of ribbonfarm. Follow him on Twitter

Comments

  1. Another thought provoking piece, and a pleasure to read.

    My suspicion about smartphones/iPhones/tablets, etc., is that these are the computers everyone else has been waiting for. There’s a huge swath of people around the world, of all demographics, for whom the original PC (encompassing both the hardware and the Windows OS) just sucks to deal with. PC 1.0 is just not made for the non-technical (as anyone who has spent time in technical support, even for family members, can attest). The last innovation to help these people out was the mouse & desktop GUI implemented by Apple and MS. Touchscreens and applets provide a simplicity that cannot be overestimated.

    But in the tablet arena, Apple’s iPad is not a game-changer like the iPod and iPhone. It’s an extension of existing technology, and there’s no innovation gap that other, competing manufacturers and providers must overcome. The playing field is much more level and competitive as a result, imo. And I think the iPad is ugly, technology-wise, as the Rock v iPad comic illustrates. It’s like dating a model with a shrill voice and man-hands. Sure, it looks great and has its perks, as long as you can tolerate the flaws.

  2. Interesting post. I don’t have much to say about the business related aspects of innovating and marketing technology. However, I was struck by your sense of the young, hip Bohemians who want curated experiences, aesthetic classicism and metaphor coherence. It is a great way to describe some of my tendencies. Do you have anything else to say on this or maybe some links to others who write about this?

  3. Peter – that sort of description of subculture groups is probably more literary than anything, so I am guessing your best bet is to read novels, movies, etc. by/about the group. The only systematic study of such modern archetypes that I know of is done by marketing agencies that do psychographic profiling, such as the Nielsen Claritas psychographic segmentation. You can find a high-level description of the popular Claritas PRIZM segmentation model on Wikipedia. It actually includes a category called ‘bohemian’, but the definition is not the one I made up for Apple adopters, though there may be overlap. The PRIZM categories arise out of clustering statistical survey data and then coming up with evocative names (followed by the construction of prototypical ‘personas’ for each group…)

    These research sources aren’t cheap of course, but you might find pop-marketing articles/books that talk more about it.

  4. So you’re saying what? The iPad should have a handle? Or have a plastic case? Saying something is not some x, y, z is easier than defining what it is.

  5. I wonder if I got you right.

    1. Radical-disruptive innovations are ugly.
    2. Tablets are radical-disruptive.
    3. The iPad is beautiful. Apple’s motto is “Don’t be ugly”

    “Oh dear,” says the iPad, and vanishes in a puff of logic.

    “That was easy,” says Venkat, and goes on to prove that the iPhone, weighing less than a duck, must be a witch :)

    I have a more substantive argument which I’d like to go into in some detail. Maybe tomorrow.

  6. I agree with you tubelite. If you can’t really tell what Apple excluded to make it “beautiful” then you can’t really say that its “beauty” is a crutch. And if you think the “not-so-PC” nature of the iPhone OS is a negative thing… tell that to the number of people who are totally satisfied with it on their iPods and iPhones.

  7. Venkat,

    I think you have an interesting point about Apple as a whole. It is, however, not something Apple should be worried about. There is plenty to do even within that ‘must be beautiful’ constraint.

    I think you’re wrong about the tablet. This isn’t a disruptive, radical shift to displace tablets; tablets effectively don’t exist. There is no tablet market. The iPad is a continuation along the iPod Touch/iPhone path. No radicalism required.

    Perhaps this is Apple’s solution to the problem you outlined. Good at solving a particular class of problem? Set yourself on a path where you will encounter them. I wonder how long ago this path became clear? I imagine a convergence of phone & iPod was obvious as the iPod was gaining real momentum, & that the iPad was seen as an obvious extension of the iPhone as soon as they started putting together the software.

  8. Long post. TLDR: Apple is right about the iPad.

    Many tablet efforts tried to get a full-fledged computer onto that form factor, using the same GUI paradigms which worked in a mouse-keyboard system. Small targets (buttons, checkboxes) which need to be precisely hit, using a stylus instead of a mouse. What do you do when you need to input letters? Either you need to put down the stylus and use an on-screen or physical keyboard, or muddle along with a “stab-keys-one-at-a-time” with the stylus or some handwriting recognition thingy with comical accuracy. Microsoft tablet PCs died for many reasons, terrible usability and corporate politics being two.

    A full-blown tablet is a hard problem, and I am not even sure it is worth solving. Fortunately, Apple was smart enough not to do a wizened shrunken-head version of desktop Mac OS.

    I would consider the iPad as incremental-disruptive. All the crucial iffy innovations – capacitive multi-touch, the use of fingers as the exclusive input device, single-tasking – have been tried out and perfected in two generations of iPhones and Touches. This is a scaled-up version (didn’t you say this yourself?) and does not, ipso facto, have to be ugly. I am glad more people are recognizing that the iPad is a consumption device first and foremost, not a be-all end-all netbook-laptop killer.

    I think most folks don’t appreciate the true innovation here. Think about input devices and what we use them for:

    1. Precision pointing and selection of small targets
    2. Selecting one or more large targets
    3. Drawing
    4. Browsing (using your display as a limited viewport into a larger object) e.g. web browsing, scrolling through documents, maps..

    The mouse is good at 1 and 2, terrible at 3 (like drawing with a brick), and worse at 4. I intensely dislike the click-and-drag (+zoom-in/out) which you need to browse maps, or the Times Archive horizontal scrollbars. Scrollbars within scrollbars. The browsing scenario is a mess, with scroll-wheels and trackpad gestures being the sole redeeming innovations.

    Trailmeme is hard to browse with the mouse for precisely this reason – the 2-D map spawns horizontal and vertical scrollbars *within* a webpage which already has horizontal and vertical scrollbars.

    The stylus is good at 1, 2, 3 even a bit of 4. That’s why so many attempts have gone down this path. The problem is that the stylus encourages lazy design, merely replicating the desktop metaphors to mobile devices, with predictably bad results. Windows Mobile (pre-7) is of the shrunken-head variety, with a clutter of tabs and small targets. It also encourages resistive touch which pushes the tradeoffs in favour of high-res, stylus oriented input and away from fingers. Try using fingers with a resistive touch screen and compare with the iPhone.

    Finally, fingers and capacitive multi-touch. They are impossible at 1, poor at 3, good at 2 and *outstanding* at 4. Since media and web consumption is very heavy on browsing, this is an excellent match.

    With the mouse, the object to be manipulated is at one level of remove from the input signal. The limitations of this approach become particularly apparent in the click-and-drag scenario. The input range is different from the output range, requiring multiple awkward passes, with a zoom-in/out thrown in between. The key feature of finger-based input is that it removes this level of indirection. Consider the display as a viewport hovering over a large complex surface rendered in two dimensions, studded with interesting objects. You can manipulate the underlying surface, sliding it about with a fine degree of control and responsiveness, directly selecting the objet du désir with your finger as it swims into your ken. This is a huge deal. Browsing a map, pinching it to zoom in and out, turning a page, scrolling become absolutely natural.

    Look at the Wii. It has graphics which would have been embarrassing a decade ago, but its key innovation is again the same: get rid of the “chunky” controller where the user has to maintain a mental map of action to result. Remove the level of indirection, let the user move naturally, i.e. using technology honed over millions of years of evolution. And you’ve tapped into a huge new market. I have a Wii and can testify to the fascination it inspires at all points of the age/gaming-interest spectrum.

    Hard-core gamers will still prefer the chunky controller and pixel processing of the non-Wii consoles. Hard, pipe-hittin’ coders will still use a big screen display, keyboard and carry laptops and netbooks around. It doesn’t mean the Wii or the iPad have no market.

    The iPad has the potential to make experiences – not only linear ones like books, but non-linear ones like the web, and Trailmeme, which tries to capture a slice through the whole-sort-of-general-mish-mash in a 2-d graph – so much more usable and compelling.

    As for traditional document metaphors – what’s wrong with them? There is such a wealth of content available in the linear-document format (i.e. books) that it makes sense to present them that way. The one alternative I can think of is to go back from the book to the scroll, have content infinitely scroll vertically, with a graduated chapter-scale at one long edge for quick jumps. Then we’d have the following scenario in reverse.

    Now to your accusation that Apple favours platonic classicism over usability. I rather suspect you haven’t used Apple products much… which is why you keep bringing up the mouse. Let’s stop with the mouse already – yes, Apple’s mouse is a mistake, but not a fatal one. Other mice exist and they work on the Mac. I prefer multi-fingered gestures on trackpads anyway.

    I’m a technology omnivore – not in Stephen Fry’s class – but enough to lend empirical weight to my arguments. I use, on a daily basis, all of the big 3 – Windows XP, Mac OS and Ubuntu Linux as desktop OSs. Have been doing so for 2 years. I’ve done more than a bit of programming. My corporate phone is a Blackberry 8900, and I’m on nodding acquaintance with a couple of iPhones nearby. The iPod Touch is my constant companion. I have a 1st-gen Kindle as well. My router and media center both run Linux.

    The point of this brazen chest-thumping is – a. I’m not exactly a rabid Apple fanboy in the traditional mould and b. if there was something major missing or compromised from the Apple set which the others had, I think I would have noticed. Do you have any other examples besides the ones you mentioned?

    Target market and metaphor coherence. For some products, metaphor coherence is a luxury – for some, it’s the very essence. Look at HP’s Halo, or Cisco’s TelePresence videoconference systems. From the furniture and decor to life-size TV images, multiple cameras, excellent sound quality, and the massive bandwidth needed to deliver flawless video, they’ve gone to great lengths to make the suspension-of-disbelief threshold so low that the illusion of sharing the same room with the other set of participants is very good indeed. No pixelation, no clicks, no sharp digital discontinuities detract from the illusion.

    One could compare this with webcams, which perform essentially the same function and say, “well, metaphor coherence is for aging baby boomers and hipster doofuses” and be dead wrong. The target market for Haloesque tech (and it will become affordable, trust me) is everyone.

    And that’s what Apple has been trying to do as well. The computer is a modeling machine. So far it has been restricted to scenarios where the fact that it’s a model is very obvious. Add “metaphor coherence” – smooth organic behaviour – and it *becomes* that which it models.

    The iPhone/Pad family are closed devices, and while I don’t like it as a programmer, it’s no excuse to go off the deep end and start the Won’t Someone Think of the Children schtick (the Mark Pilgrim post). I look around me and see many devices (router, media player, PVR) which are really computers, not to mention the DVD player reproachfully gathering dust in the corner, and half a dozen phones, dead and alive. Start crying about how many computers exist in your life which you can’t tinker with and you’ll never get ready for school.

    I guess I can blame my father for buying a closed TV (“warranty void if you so much as look at this sticker”) for my lack of electronics skillz, and his deplorable habit of buying readymade furniture for my ineptness with a saw, eh?

    On the tinkering side: The MacBook comes with a real live bash shell and a shit ton of Unix utils, like Mother used to make. And a C compiler. Windows comes with Windows… and that’s it. If you want to find out who’s hogging space on your disk, you do a du -sk on Mac and Linux. On Windows XP, there’s a bloody dog which runs around with its tongue hanging open to the sound of disk thrashing. That’s Windows for you – too tinkery for its main market, which wants things to just work damnit, and not tinkery enough for advanced users.
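
A minimal sketch of that `du -sk` check, for the curious (the `/tmp/du_demo` paths here are invented for illustration):

```shell
# Make a throwaway directory tree with one big and one small subdirectory.
mkdir -p /tmp/du_demo/photos /tmp/du_demo/docs
dd if=/dev/zero of=/tmp/du_demo/photos/raw.bin bs=1024 count=200 2>/dev/null
dd if=/dev/zero of=/tmp/du_demo/docs/note.txt bs=1024 count=10 2>/dev/null

# -s summarizes each argument instead of listing every subdirectory;
# -k reports sizes in kilobytes. Sort descending to see the hogs first.
du -sk /tmp/du_demo/* | sort -rn

# Clean up the throwaway tree.
rm -rf /tmp/du_demo
```

The same two flags work on both Mac OS and Linux, which is exactly the point being made above.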

    Charlie Brooker sounds, even after you peel away the exaggeration, like someone in an abusive relationship. Some day I should figure out what made people pick up his article with such indecent glee.

    Do you want your car to reflect a “deep, atavistic sense that the world is messy”? Hell no. I want it to take me from point A to B, philosophy be damned. A former generation of tinkerers would be appalled at this statement, and appalled at our cars in general. Our generation of tinkerers will slowly get used to the fact that there will (hopefully!) be some computers which just work and don’t need blood sacrifice to get them to move.

    Finally – though I agree with your rationale for ugliness in the radical-disruptive space, I remain deeply unsympathetic to attempts at glorifying ugliness, because of frequent abuse. All too often, it’s due more to laziness and lack of imagination than to form-follows-function necessity, especially in the tech sector (I’m thinking of Windows again). The first and most valuable lesson I learned when I stepped out of India was this: Ugliness is optional. Dust, dirt and entropy, so far regarded as fundamental forces of the universe, could be tamed. Apple does the same for the tech industry: Ugliness is optional.

    I’m with Stephen Fry here. The iPad will be big, and the world will be Tlön.

    • As usual, you’ve managed to post a comment that outdoes my original post. Makes me wonder how much the conversation in the blogosphere would be elevated if the 9/10 of people with good stuff to say actually took time out to say it rather than leaving everything to us primary noisemakers.

      I agree with most of what you say, though I still think the iPad will remain a niche hit. Your point about the 4 things input devices do is deep enough to deserve a stand-alone post (interested in guesting it?)

      Only two minor quibbles. One, you suggest that XP is not tinkery enough for the geeks, and too tinkery for the ‘just work dammit’ crowd, but the product has not performed like a no-man’s-land product caught between a rock and a hard place. In fact, it is still the dominant end-user OS. We have to honestly ask why, and I think the answer is that end-user computing is still in enough of a growth phase that end users, whether they like it or not, still have to do significant tinkering. Like back in the Model T Ford days, when everybody, mechanically minded or not, had to be a bit of a mechanic to adopt the innovation. Apple, from that POV, is prematurely optimizing an immature product category… OTOH you might believe we are in the equivalent of the ’70s-’80s era of automobiles and that there is some stupid inertia effect holding us back. I don’t know.

      The same sort of point for my second quibble. You are making a time-honored sort of false analogy between cars and computers (my use of the analogy, I think, is less problematic). Your comparison is in the vein of that famous (apocryphal?) exchange between Bill Gates and GM about “if cars were like Windows”.

      The computer is fundamentally a younger product with a FAR higher maturity ceiling, one that will probably take a millennium, rather than a century, to reach. We’re going to be on the growth part of the curve for a long time. When we’re done, we may be in a place where the prototypical “computer” is unrecognizable (perhaps a diffuse ghost in the machine, permeating everything in your life, from clothes to refrigerators, via some mix of cloud intelligence, swarm intelligence in devices and so forth).

      The point of that techno-visioning is that I will NOT accept life-threatening “atavistic world is complex” design in my car (except to the extent that the car IS a soft-real-time computer), but I expect and want it in my computing. I am not settling for a temporary plateau at 50 years when I dream of a real plateau 500 years in the future… :)

      But we are now down to a deeper theological divide than the Apple-Windows divide I think.

      BTW, practically speaking, I am an agnostic too. Except that I’ve never found a use case in my personal life (barring an iPod Touch I won in a contest, which is useful for occasional quick email checks when I am too lazy to turn on my computer). That may change… my wife is getting into photography seriously, and I am telling her to consider a Mac…

      Venkat

  9. Thanks. Feel free to take the material and do what you like with it; I have no idea when I’ll be able to make even a passing stab at a real post. It takes me way longer than you prolific dudes to actually put down ideas, and then words, in any kind of sensible order.

    Windows, I think, got into its position because of the long interregnum in which no one was willing to step into the consumer PC OS space. IBM committed hara-kiri, Apple insisted on hardware control and high standards, computers provided enough value that people were willing to put up with whatever barely worked.

    I think we are in violent agreement when it comes to the general-purpose computer, the unmasked Turing machine. It is complex, and it is premature and difficult to reduce – that’s why I rail at Windows for doing a half-ass job of enabling the advanced user to deal with the complexity of the general computer. No tools and no shell to bind them.

    I agree, Man needs to internalize the laws of information like his bones know the laws of Middle Physics, before computing can become second nature.

    Where we differ is that I believe there is a huge use case for a small subset of the general computer, thanks to the Internet – media consumption and communication. And this subset can indeed be smoothed of many of the complex wrinkles which are inherent in the general case. Look at the literal “My Dad” case – he learned computers after crossing 65 and spends all his time in the browser and maybe a spreadsheet or two. He uses Firefox and Gmail, loads photos from a USB stick and movies from his mobile phone, views them and distributes them to unsuspecting relatives. He is frequently puzzled by where Windows fits into the whole picture, since everything is on the net, and by its imperious demands to update this, secure that and defragment the other.

    I think we can and should develop a “car” computer for this case which just works.

    I was thinking a bit about your Platonic ideal thingy and figured that another way of looking at the iPad/Touch is as a Platonic object factory. Dust off your Touch, bring up the calculator. Your Touch is now a Calculator. Not a calculator program, which you need to drag your mouse to focus, half obscured by other programs simulating other things, where you look down at the keyboard to hunt and peck some numbers and look up to see if it went in right. It is a Calculator, and remains so until you decide to turn it into something else.

    Go to the App Store, download Labyrinth Lite and start it up. Your Touch is now – to mix Borges metaphors in a particularly satisfying way – a Labyrinth of Tlön. “A steely round goes a-mazeing”