
I Have Become a Laggard

I have long lived life on the bleeding edge, particularly where technology is concerned. I would keep up with new software, hardware and technology. My cell phone provider was no doubt driven to distraction by my endless upgrades, usually timed every six months or so.

I constantly experimented with new software, and would find some compelling need to upgrade my laptop every year or so. In between, there were several tablet computers (long before iPads were sexy), and a brief flirtation with an oh-so-small and very stylish Sony Vaio that promised much in terms of portability, even while it fell short in power and responsiveness.

The result of all this was a vibrant intake of new technology and innovation, while my employees were the beneficiaries (some would say victims) of a perverse form of technology hand-me-down program. My last phone or computer would find its way onto the desk of whoever next needed an upgrade, while I moved on to objects newer and shinier. The consequences of this were numerous. For starters, I suspect that my employees felt that unwanted solutions were being foisted on them; my slightly more idealistic and optimistic view was that I had generously ‘tested’ those solutions for them beforehand.

More challengingly, many were the times that I would once again find myself reinstalling Microsoft Windows (often a three-day exercise, once base installation, updates and software were taken into account). This would often be done in an emotional state best described as desperate panic, as I weighed the odds of finishing the install against missing some imminent flight or other. Explorations of new solutions, and decisions that some piece of technology or other was absolutely critical to my functioning and productivity, often oddly paralleled critical deadlines on papers, presentations or customer deliverables. Some might call this procrastination; I am guilty of at least once referring to it as fortuitous serendipity.

In some ways I found comfort within this chaos by proudly donning the mantle of ‘early adopter.’ In fact, I arguably pushed the boundaries and embraced the title of ‘innovator,’ more often than not being the first in line for a new technology (long before standing in lines for iPhones was considered a rite of passage, I might add). The source of these two labels, of course, is the diffusion of innovations, a theory of innovation and technology adoption developed by Everett Rogers. Rogers, a professor of communication studies, advanced the idea that a technology needs to reach a penetration rate of approximately 16% to achieve critical mass and become self-sustaining. This is the point at which penetration of a new technology moves past the innovators (the first 2.5%) and early adopters (the next 13.5%), and reaches what Rogers referred to as the ‘early majority’.

How the theory of the diffusion of innovations works is, in many ways, the story of the dot-com revolution. Innovators and early adopters serve as opinion leaders, embracing and evangelizing new solutions. This evangelism ultimately takes root through social networks and organizational adoption, until critical mass carries the innovation over the tipping point to widespread acceptance.
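For the curious, those percentages aren’t arbitrary: Rogers carved his adopter categories out of a bell curve of adoption times, one standard deviation at a time. The short Python sketch below (my own illustration, not anything from Rogers) shows how the category shares, and the roughly 16% critical mass, fall out of a standard normal distribution.

```python
# Rogers' adopter categories, derived from a normal distribution of
# adoption times. Category boundaries sit at whole standard deviations
# from the mean; the percentages are Rogers' familiar rounded figures.
from statistics import NormalDist

adoption = NormalDist(mu=0.0, sigma=1.0)  # standardized adoption time

categories = [
    ("innovators",     float("-inf"), -2.0),  # ~2.5%
    ("early adopters", -2.0, -1.0),           # ~13.5%
    ("early majority", -1.0,  0.0),           # ~34%
    ("late majority",   0.0,  1.0),           # ~34%
    ("laggards",        1.0,  float("inf")),  # ~16%
]

for name, lo, hi in categories:
    share = adoption.cdf(hi) - adoption.cdf(lo)
    print(f"{name:14s} {share:6.1%}")

# Critical mass: cumulative adoption once the early adopters are
# exhausted, i.e. innovators plus early adopters, roughly 16%.
print(f"{'critical mass':14s} {adoption.cdf(-1.0):6.1%}")
```

Run it and the familiar figures appear, with critical mass sitting right where the early adopters give way to the early majority.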

A recent re-acquaintance with Rogers’ work, however, brought me to an uncomfortable realization. I am no longer what can be considered an ‘early adopter’. If published adoption rates for the likes of iOS 8 are to be believed, in fact, I don’t even qualify for the ‘early majority.’ At best, I may be part of the ‘late majority.’ But there is, in fact, an outside risk that I have become a laggard.

The signs are many, and they are painful to accept. I still have an iPhone 5 (not even a 5S) that I have had for more than two years. My previous 3GS was nearly three years old when I finally replaced it. My laptop is a MacBook Air that I bought in late 2011 and have never reinstalled; it is still soldiering on with workmanlike reliability, and shows no sign of being replaced any time soon. While I am aware that the latest iPhone has been released, and that I can upgrade my current iPhone to iOS 8, I have shown little eagerness or appetite for acquiring either. The iOS 8 upgrade will likely occur when software I depend upon no longer runs on iOS 7; the next iPhone replacement, however, most probably won’t occur until the rapidly waning battery life of my current phone forces the issue. More than 237 million people signed up for Twitter before I did, a figure roughly equivalent to half the population of North America. I no longer wait avidly for the next product announcement or search for the next new application.

What I find personally most intriguing is why this is so. Certainly, a greater awareness of the disposability of income has had an impact; I no longer feel quite so compelled to whip out my credit card in a near-Pavlovian response to the scent of recently warm shrink-wrap. More particularly, though, I have come to appreciate and value the stability and reliability of a familiar, known—and above all operating—work environment. There have been enough forced reinstallations and failed upgrades at inopportune moments that my risk tolerance for new and shiny has declined considerably. I don’t want to spend three days reinstalling my laptop because an ill-timed and ill-judged upgrade brought it crashing to its knees. I have, dare I say it, matured.

That’s not to say that I don’t appreciate new innovations, but I don’t embrace them with the same eager abandon that I did in years past. And I think there are a few reasons for that. A big one is that innovations have become more stable and more constrained at the same time.

New versions of familiar software tend to work, and work well. I don’t update immediately, and have learned to wait to see whether the experiences of others are positive, but when I do update my current software there tend to be very few issues. Software development practices have, with some noteworthy and spectacularly pyrotechnic exceptions, evolved to the point where robust regression testing and enhanced reliability methods mean that stable software tends to stay stable.

At the same time, however, the new language of innovation has resulted in a near-ubiquitous adherence to the philosophy of a ‘minimum viable product.’ This means that new solutions are often fundamentally constrained; even while this isn’t the intent of ‘minimum viable product,’ it is its practical consequence. Gone are the days when innovators swung for the fences, developing a full solution that attempted to satisfy everyone at the outset. Now, developers cautiously hope for a solidly-hit single, putting an often barely-functioning prototype out into the market and hoping for enough acceptance to fund their next at-bat.

What’s interesting is that it was the pioneering risk of the new, wholly-crafted solution that I was often rewarding. I would look at a new laptop, a new phone, or a new piece of software and think, “This could be the one. This could do it all for me. This might be the solution that finally gives me everything I am looking for.” And so, like the innovators themselves, I would roll the dice and bet it all. Now, it is rare that I look at a new app and don’t immediately see its limitations relative to what I am currently using. There are few organizations taking the wholesale risk of making big bets. And I’m not prepared to sacrifice the comfort of what works for the inconvenience of something new that doesn’t at least make a pretence of meeting all my needs.

This has also led to a different insight into where I personally find myself innovating more. I have come to realize that the answers to my problems of organizing, working and collaborating aren’t about technology. I am often my own worst enemy, in the form of bad work habits and inefficient organizational tendencies. In other words, it’s not the software, it’s the user. So where I do tend to innovate is in how I operate. I will readily change my workflow as I find new strategies that work; I am far more reluctant to change out my technology.

And so I find myself—almost by design—running the risk of becoming an innovation laggard. And I am strangely okay with that. Innovation is now something I do more for myself, and less a compulsive addiction that I look to others to support. What I find curious is whether or not I am alone in this tendency. As the language of innovation changes and becomes more rigid, formal and consistent, are we as consumers becoming less willing to embrace the next new and shiny object? Or am I, as I fear, just maturing?
