Telirati Newsletter #27

It wouldn't be a retrospective without a few thudding failures in prediction, like this one. A complex and only tenuously relevant analysis leads to an utterly wrong conclusion.

Telirati Newsletter #27: Picking the hits.

It has been an abiding interest of mine to figure out when technologies will become widely accepted and used. Recently it was brought to my attention that my interest in determining when, for example, real customers will buy unified messaging systems parallels a fashionable trend in the analysis of the history of technology.

Writers like Jared Diamond, in seminal books like Guns, Germs, and Steel, have brought an interesting new infusion of rigor to the history of technology and even to more general topics in history. This new wave of historical analysis finds the underpinnings of historical outcomes in the geography, climate, minerals, plants, and disease conditions of the regions of the earth.

This is an attractive analysis. Genetics tells us that man is far more alike in all the ways that matter than superficial racial differences suggest. Finding alternative explanations to why dead white men, mostly English, dominate the formation of the modern world lets the air out of any racist analyses from all sides of historical debate over why this is so.

Diamond’s analyses, and those of historians with similar approaches, show that Greece, Rome, the Christian church, the Renaissance, the American Revolution, the Industrial Revolution, and everything else leading up to modern liberal capitalism may have happened where and when they happened because of the guns, germs, steel, flora, fauna, rocks, and water of the vessels that contained those events.

The applicability of this type of analysis to technology product acceptance was brought home to me by a recent article in Feed, an online publication that surveys this new wave in historical analysis. The article describes the approach used by Jared Diamond, who places great stock in geography, Paul Levinson, who takes a Darwinian approach to the history of ideas, and Brian Winston, who hitches a school of linguistics to his plow.

Winston theorizes a multi-phase process through which ideas are accepted: ideation, prototype, social necessity, invention, suppression, and adoption. These steps are inspired by Saussurean linguistics, which argues that speech is expressed through a process of social acceptance in which ideas are integrated into society before the structures needed to express those ideas become a part of language.

All of which sounds suspect to this one-time linguistics and cognitive science major. But the link, however tenuous, to a claim about how the mind works makes Winston’s analogous application of these linguistic ideas to history relatively plausible. After all, the geography of high tech seems to consist uniformly of cubicles (though you should recall that I place some credit for Microsoft’s success on putting their coders in offices). To be a bit more serious, the Internet makes Singapore and Stockholm equally likely candidates for the location of the next breakthrough.

The Web itself is an example of a technology that traversed Winston’s six steps: Early hypertext systems were heralded as a fundamentally new and better way to organize and communicate knowledge, with numerous benefits. But those early products languished. I cannot exactly map the social necessity of hypertext, but those who proclaimed hypertext in the first place were solidly convinced it was a step forward for mankind, even as their idea lay fallow. In the invention phase, an interesting thing happened that may well be a significant elaboration of a theory of product acceptance: Hypertext stopped being a product as the World Wide Web escaped the halls of academe, where it was invented and lay in suppression for a number of years. The Web just is and does, and few users think: “That hypertext stuff is a great idea!” In fact, the Web is deficient compared with earlier hypertext systems in that it does not manage broken links other than to report errors when such links are encountered. The Web has become a widely accepted idea at the same time the users of the Web forgot why it was created in the first place and forgot about the product category, hypertext, that it is part of.

My theory is that a similar thing will happen to unified messaging. Unified messaging has passed through the early stages, through prototyping in which proprietary technologies were employed, and now is poised to break out. This will happen, however, only once people stop thinking of unified messaging as a separate, high-end product. Unified messaging will suddenly and without much heralding replace voice mail as we know it. The key is to find the inflection point from the present state of suppression to the future state of ubiquity. I think that point comes when you stop hearing the words “unified messaging” and begin to hear telephone companies talking of making their business model more Internet oriented. One obvious expression of this is that my cell phone’s address book will be automatically updated from an LDAP directory, and that one access option for my voice mail will be IMAP. This will happen without hoopla, and, very importantly, it will happen without price increases. The capability, like HTTP serving today, will simply be the price of staying in the game. Acceptance is, after all, ordinariness.
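The IMAP access to voice mail imagined above can be sketched with Python's standard imaplib. Everything specific here is an assumption: no carrier in the original names a server or a mailbox, so the "Voicemail" folder and the example hostname are purely illustrative.

```python
import imaplib

# Hypothetical folder name; a real carrier could expose voice mail
# under any mailbox name it chooses.
VOICEMAIL_MAILBOX = "Voicemail"

def list_voicemail(conn):
    """Return the message IDs found in the voice mail folder.

    `conn` is anything offering the imaplib.IMAP4 select/search
    interface, so the same function works against a live server
    or a test stub.
    """
    status, _ = conn.select(VOICEMAIL_MAILBOX, readonly=True)
    if status != "OK":
        return []
    status, data = conn.search(None, "ALL")
    if status != "OK" or not data or not data[0]:
        return []
    return data[0].split()

# A live session would look like this (hostname is illustrative only):
# conn = imaplib.IMAP4_SSL("imap.example-carrier.net")
# conn.login(user, password)
# ids = list_voicemail(conn)
```

The point of the sketch is the ordinariness argued for above: voice mail becomes just another mailbox, reachable by the same client code as any other IMAP folder.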

Copyright 1998 Zigurd Mednieks. May be reproduced and redistributed with attribution and this notice intact.

