The Poor Usability Tell

Work can be a delight when your tools and your environment are crafted in ways that enable you to focus on the task at hand. However, I suspect most people have only limited experience with this sort of situation; it’s rather common to see everyday tasks performed with sub-optimal tools. Software engineers are in a privileged position, parallel perhaps to blacksmiths of the past, in that the same skills used for their work may be deployed toward tool improvement. Correspondingly, they pride themselves on possessing and creating excellent tools. Unfortunately, most other roles in a given business are ill-equipped and poorly positioned to effect a similarly scaled tool-chain improvement effort. Instead, they are reduced to requesting assistance from other departments or outside vendors, a relationship Kevin Simler highlighted last year in a spectacular post entitled UX and the Civilizing Process. You should read the entire piece, but the salient portion for our purposes is the following paragraph:

You might think that enterprise software would be more demanding, UX-wise, since it costs more and people are using it for higher-stakes work — but then you’d be forgetting about the perversity of enterprise sales, specifically the disconnect between users and purchasers. A consumer who gets frustrated with a free iPhone app will switch to a competitor without batting an eyelash, but that just can’t happen in the enterprise world. As a rule of thumb, the less patient your users, the better-behaved your app needs to be.

Any given software project will be improved by increased usability. Nevertheless, we’ve all witnessed moments where “more cowbell” doesn’t seem to effect the desired improvements. Usability is an unalloyed good in its tautological form (better is better); it is in the specifics that we see it fetishized as a concept.

This isn’t an accident; in fact, there can be an inverse relationship between the best user experience at the level of an individual or a small group, versus the best user experience for an organization or a network of organizations.

In poker, a tell is some sort of behavior which gives hints about the cards in a player’s hand. Poor usability is a tell which may indicate that an organization’s and a user’s needs are in conflict, and that the organization’s needs trump.

Most days, I spend my time providing software support, consultation and training to sysadmins and programmers working on the iSeries platform (aka AS/400, aka System i, aka i5, aka POWER i, aka whatever IBM marketing is calling it this week). When I train new employees on the system, they are often taken aback by the terminal emulation software with its “hacker classic” green-on-black text interfaces. “This system looks like it belongs in the ’80s,” some say, while others are perplexed as to how non-technical users could ever be expected to use such a system (these comments betray just how much cultural conditioning plays into what is or is not considered usable).

The iSeries is neither the zeitgeist nor a high-stakes consumer app. It’s rather the epitome of Simler’s “rustic enterprise,” parallel in ways to how, in the lead-up to the industrial revolution, you could observe a technological downgrade on the order of a couple of centuries by moving inland from coastal Europe.

I have no qualms about complaining about a platform’s shortcomings, but playing city mouse/country mouse is only interesting when it combines clear-eyed critique with a nostalgia-free empathy for the alien and seemingly outdated.

In these introductions I typically begin by talking about the sorts of users who work on iSeries applications. Warehousing, logistics, manufacturing and accounting departments or companies are all well-represented, and we’ll pick one that the user is most familiar with to talk over: accounting, for instance. Software, at its best, functions as an extension of the mind, and becomes transparent to its user (Heidegger famously coined the term zuhanden, or ready-to-hand, for this sort of human/tool synergy). Csikszentmihalyi’s concept of flow doesn’t detail human relationships to tools, but flow is interruptible, and one of the primary sources of interruption is tool presence (vorhanden); that is, when a tool (by its inadequacy to the task at hand, or simply by a shift in perception) is recognized as separate from the tool user.

If you ever watch a seasoned accounts payable clerk run invoices on a “green screen” application, they fly. Sometimes they outpace the screen refresh rate, and have input instructions for the next two or three screens before those screens have even loaded. The effect is akin to watching an experienced vi (or emacs) user write code, merge files, check email and otherwise manipulate the universe through effective utilization of their beloved text editor. To identify with these body extensions is to provide kindling for flame wars between different tool devotees. Given stable enough tasks and users, tools will be refined and ultimately recede until they become invisible, as their capacities become absorbed and assumed, and the sunk costs of obtaining expertise are lost to time. The parallels to etiquette are good, but at peak the better analogy would perhaps be speaking in one’s native language.

Interestingly, a side-effect of this receding-out-of-view of our best tools is that we discount their positive effects, and this discount comes to bear when the advertised usability of new tools is measured against the status quo. These are extreme examples, but I’ve seen more than one company move from thick-client terminal emulation to a web application version of the same, only to discover that the penalties were significant and sustained. Not only were the refresh times between pages longer on the “more usable” web application version, but commands and data could not be entered onto new pages until they had loaded. This incurred a double penalty: first, the extra second or two multiplied by every page load, but more significantly the cognitive disruption of losing the flow and waiting on the application in order to proceed with one’s work. One user I spoke with gave a thoroughly unscientific estimate of a 20-30% performance penalty moving from the old system on which she was experienced, to the new. If the old method was akin to a native tongue, the change could be considered akin to having to conduct office calls in colloquial French, given a 4-hour training class and two weeks of practice.
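
To get a rough sense of why the cognitive disruption dominates the raw latency cost, here is a minimal back-of-envelope sketch in Python. The extra second or two per page load comes from the paragraph above; the screens-per-invoice and invoices-per-day figures are hypothetical assumptions chosen purely for illustration, not measurements from the companies described.

```python
# Back-of-envelope: how much of the reported 20-30% slowdown can raw page
# latency alone explain? Every number below is a hypothetical assumption,
# chosen only to illustrate the shape of the argument.

screens_per_invoice = 3          # assumed screens an AP clerk touches per invoice
invoices_per_day = 150           # assumed daily invoice volume for one clerk
extra_seconds_per_screen = 1.5   # the "extra second or two" per page load
workday_seconds = 8 * 60 * 60    # an 8-hour day

raw_wait = screens_per_invoice * invoices_per_day * extra_seconds_per_screen
share_of_day = raw_wait / workday_seconds

print(f"Raw waiting time: about {raw_wait / 60:.0f} minutes per day")
print(f"That is roughly {share_of_day:.1%} of an 8-hour day")

# Under these assumptions the pure latency cost is only a few percent of the
# day; the rest of the clerk's estimated 20-30% penalty would have to come
# from losing type-ahead and having flow broken on every page, not from the
# waiting time itself.
```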

There are seemingly irrational and rational reasons for this state of affairs to come about. On the irrational side, novelty bias and a preference for beauty can distort technical comparisons, particularly when the benefits of the integration of the existing solution into your organization are difficult to quantify, and some of the costs of switching are necessarily hidden.

However, it’s the rational reasons for switching to a rhetorically “more usable” but quantitatively less efficient solution that are slightly evil and particularly interesting. We’re used to thinking at a human-level resolution, but if we zoom out to the larger agentic level, that of the organization (thank you, Mike Travers), what is perverse for the individual or the small group may conform to organizational reason.

We held as constant in the example above the stability of the work, and of the workforce employed. What if one or both of these is unstable? Or rather, what if you have other reasons for being willing to accept the costs of this instability?

The 2008 recession forced a lot of companies in the small-to-medium business (SMB) range out of business or into strategies for stemming blood loss. Layoffs, hiring freezes, and increased workloads for existing employees all helped reduce profit loss, and in the longer term many companies invested in a strategy of increased automation, with new job growth generally bifurcating between very highly skilled work and deskilled work. This is the white-collar echo of the blue-collar automation of factories and warehouses. (For companies already flush with deskilled labour, such as McDonald’s and Walmart, you saw an explicit reliance upon social services to “make up the difference” imposed by the slowed economy.)

To return to the terminal emulator example: a company switching from a thick-client terminal emulator to a browser-based alternative is a tell; it signals that the company sees its employee base and the labour market as more volatile and is anticipating a higher turnover rate, with the attendant devaluation of employee training and of technologies whose payoffs are slow initially but steady thereafter.

Similarly, in the world of business forms processing, user intervention is a given, but the types of use change considerably: instead of a department of AP clerks handling incoming invoices, for instance, you have people who simply scan documents and then people who handle the exceptions the software can’t automatically recognize or calculate. These sorts of solutions also impose a skills gap, a discontinuity that makes it more difficult for unskilled workers to work their way into skilled positions; no amount of scanning pieces of paper will, alone, prepare someone for a skilled accounting position. It is the workers in the middle who are disrupted by these sorts of technological interventions, as at the bottom labour is too cheap to meaningfully disrupt, and the top as yet requires too much independent judgment to replace.

Not all technological change falls into this category. There are technologies that are a boon to both the workers and the organizations as a whole. The point here is that when a technology is adopted that is “less usable” to workers than what it is replacing, this is simply an indication that worker and organizational interests are no longer in alignment, and the workers’ interests have been trumped.

I’ve focused on software technology in the examples above, but technology is a far more flexible concept than that. Rarely understood as technologies are the social organization of a firm, its schedules, its incentive structures, its jargon and its bureaucratic processes. Next time we’ll explore the world of undead technologies: abortive attempts at innovation that created more new problems than they solved old ones, and which were not catastrophic enough to kill their host firm.

About Jordan Peacock

Jordan Peacock is a Minneapolis-based technologist. His ribbonfarm posts ask what the gap between what is promised and what is delivered tells us about technologies and the organizations of which they are a part. He lurks on Twitter as @hewhocutsdown.

Comments

  1. Ian Gregory says

    Great article – I’ve long considered the green-screen to be superior to most of the UX I’ve installed for clients. There seems to be a paucity of competent UX specialists designing UX in enterprise software.

    I’ve been involved in the buying process for enterprise software on many occasions, and the rationale almost never considers the UX or the poor end-users (to my frustration). The primary drivers seem to be fashion, C suite preferences/politics, the effectiveness of vendor sales staff, raw functionality (rather than the quality of its implementation) and theoretical cost savings, which can usually be bent to justify most options and frequently never materialise. Because of this, the vendors have little incentive to invest in great software and often the choice is for the least bad option. Opportunity knocks for someone!

  2. Very interesting article!

    This leaves out the question of who makes these usability tells and what motivates them to make the tells. One possibility is that such people (sociopaths in the Gervais Principle sense?) can accurately predict the consequences of these tells – they want to protect themselves from changes to the environment (the labor market is more volatile). Another possibility is that the motivations are more of the form of upgrading for upgrading’s sake – “It’s 2014 already and we are STILL using text input?! How far behind the competitors must we be? What’ll the board of directors say when they see this and blame me? Can’t fob inaction off onto the lower managers [losers], better do something.”

    I prefer the latter possibility because it seems to align more with the knowledge individuals have inside the larger organism of the company/corporation – not much at all; local knowledge. And it works well with the concept of selection effects. Organizations are always ‘trying things out’ via the humans that comprise a part of It. These usability tells are just one of those random trial and error actions that occur all the time. It just so happens that before, the environment (labor market) probably gave a penalty for poor usability tells – when employee turnover is low but you don’t allow for training to create better workers you run out of skilled employees. Some companies would probably die from the inefficiencies produced. Now, this (at the company-as-an-organism level) random action or (at the sociopath-seeking-to-protect-themselves level) planned action is placed in a new environment. Now it’s much easier to observe companies taking these actions: they survive!

    The important point is that perhaps [poor] usability tells, such as moving from text interfaces to GUIs, are always occurring; the strong view is that they occur at the same rate throughout time. They ‘work’ when the environment changes, making them promote the life of the organization. A warning might be given that we might underestimate the frequency of such actions in the past, when they didn’t work, compared to now.

    • It’s a weak, not a strong claim, in that there can be other legitimate explanations for the same phenomena. My purpose was not to insist that this is what always happens, but rather that there is a strong correlation here that often gets ignored, to the [GP losers’] detriment.

      Your survival argument is a good one, but I think it underestimates just how much dead weight there is in an average organization. I’ve found it, frankly, shocking, and have compared it to a form of corporate welfare that suits the American temperament: simply not having a job is, especially in the middle class, considered bad form, but it’s perfectly acceptable to take a salary from a useless job, or one where your contribution is minor (or possibly even net negative; see The Daily WTF for numerous examples of the latter).

      This disappears the more competitive/market-driven the business environment, but in most partial-equilibrium states (either in middle markets or in markets that have constructed a ‘cartel’ format, what Fernand Braudel calls an ‘antimarket’) there’s plenty of buffer.

      More on that later, perhaps.

  3. I’m sorry but I gave up when you said “Whatever IBM is calling it this week.” AS/400 ended last century, iSeries went away 8 years ago and it’s been SIX YEARS since the O/S was renamed IBM i, a name you never even mentioned. I appreciate the point and I loathe bad interfaces, green or GUI, but I find it difficult to take seriously one who refuses to acknowledge a name that hasn’t changed this decade when we’re nearing the halfway point in it.

    • In practice, I find users (meaning system programmers, admins, etc.) colloquially refer to the systems as the iSeries (most common) or AS/400 (next most common), with all other terms combined being < 5% of references.

      My experience is not necessarily representative, nor is the broader point specific to the example I chose. I'm sure this chafes with the folks at IBM, but there's a lot of inertia going against these rebrandings, and there isn't the volume of systems in the wild or the rush to upgrade that makes rebrandings by other companies less trivial (I still routinely see folks at V5R3 or V5R4, despite those being years out of date). When you talk about Microsoft Windows 8, you are not also talking about 7, or XP, or the hardware, whereas it's unclear when you say IBM i whether that's a term that should be applied to past iterations or not, since that's how past terms have been used.

      Ancillary to the main point, but I hope this serves less as a defense of my usage than as a contextual aid for folks from outside that world. Thanks for chiming in.

  4. Great concept here. I love the idea of these kinds of “tells.” Anyway, wanted to riff on this:

    “[T]here can be an inverse relationship between the best user experience at the level of an individual or a small group, versus the best user experience for an organization or a network of organizations.”

    Yes, this is a great observation, and it would be interesting to catalogue more of the ways this can be true. I’ll just throw some out there:

    1. (From this article) Powerful or efficient over the long run, for well-trained users, vs. quickly usable, for new employees.

    2. Easy to support (IT) vs. easy to use (end user).

    3. Web-based vs. thick client. I would argue that, ceteris paribus, web apps provide some pretty serious value to organizations as a whole, based on the ease with which people can run the software (no install), share URLs, etc. Just one example: You want to show something to your boss, but she doesn’t have the thick client installed. But if it were a web app, it becomes much easier. (Either she just opens the browser and logs in, or you log in on her machine.)

    4. Consumers vs. producers. Maybe the app sucks for people who have to work inside of it, but it produces nice output (reports) for other people (perhaps we could call them “shadow users,” since they don’t show up as licensed or in the stats).

    Anyway, yeah, the “agency issues” start to get real thorny with enterprise software.

    I’m glad you called out the web app (too usable by half) vs. green screen experience (looks like shit but works great). I don’t think enough people appreciate that.

    • Emilio Cecconi says

      From what I’ve seen, many times poor usability comes from replacing a legacy system that is no longer supported. People then try and blindly copy all the features from the legacy system into a new application due to business requirements.

      • They rebuild the legacy system within the new system. I joined a team in 2000 that programmed COBOL in Java. The component was basically designed by the domain experts belonging to another team, i.e. not at all. I quit the job after 5 months. Three years later I heard the whole project got canceled one year after I left and they reverted to the legacy system.

  5. Scott Werner says

    Ever since reading this post, the line “Software engineers are in a privileged position, parallel perhaps to blacksmiths of the past, in that the same skills used for their work may be deployed toward tool improvement.” has stuck with me. I’m reminded of this Ribbonfarm post: https://www.ribbonfarm.com/2012/03/08/halls-law-the-nineteenth-century-prequel-to-moores-law/
    and wonder if the compounding growth we’re seeing in software is distinct from Moore’s Law rather than a direct consequence of it. For example, I’m not sure the drop in software engineer headcount requirements from a ~2000-era startup to now is mostly the result of better tools and software.

    • The old-school text-based applications are almost always easier and faster to use. The downside is that they take longer to learn. It’s not a question of which is better, but which is more appropriate.

      The accounts payable clerk that spends 90% of his day in four or five different “screens” should have a text-based application. When he first starts his job it will take him longer to become proficient, but in the long run he will be more efficient.

      Most of the applications we use today are not like this. We use dozens of applications every day, and don’t spend very much time on any one of them. The modern web-based or smartphone application generally takes no training at all to learn, even if an experienced user is less efficient.

      Most applications should not be text-based, but the old “green screen” app does still have its place.

      • Scott Werner says

        I was going a little off topic from the post – but I was more thinking about that line from the perspective of software engineers in general. Sure, there are big clunky interfaces that some people use, but there are also the black-screen/green-text apps that are under active development, by developers, to make development easier, better, or more productive.

  6. One topic area that I tackle in my work is the use, abuse, misuse and disuse of technology by the human[factor] in the context of a complex socio-technical system. The last one, “disuse,” strongly resonated with me after reading your column — and the non-embrace of EDF by certain ATCs in Europe seems to echo the clash between the Organization and the Individual…
    see: http://www.zdnet.com/air-traffic-control-system-is-not…/