Over dinner last night, the topic of human-machine interfaces came up with a particular eye towards what we might expect to see in the next 30 years given the fairly remarkable changes[i] of the last 30. With the advent and popularity of touchscreens,[ii] personal computing has never been more accessible, more widely used, and less involving of a physical keyboard. Computing was once the sole dominion of the learned and well-connected, but the Microprocessors For All movement has put some Ambien-necessitating pretense of power in the hands of most every little twerp and every larger derp over the age of about 10.[iii] Ok, I suppose it’s not really “computing” if the user isn’t in control of the device, but whaddya want, a CSIRAC ?
A computer is a digital, programmable device that you can give instructions to and from which you can be given a predictable, if currently unknown, response. An iPad is a TV with an impressive number of stations. One opens the doors to production and progress, one opens the doors to distraction and dilettantism. One asks you to toil, tinker, and test; the other to finger, swipe, and rest. One is for work, one is for pleasure. One is for smart people, one is for, well, I think you’re starting to get the idea.
The present state of affairs, and arguably the trend going forward for consumer products, is the increasing distribution of closed-source black boxes of unknown provenance and unknowable capability that are for all practical intents and purposes disposable toys.[iv] “The People” want this, obviously, because technology was previously “hard” and now it’s becoming “easier.” This is true whether we’re discussing laptops, cell phones, or refrigerators.[v]
Now part of this movement is quite evidently intended to sweep up the individuals – that is, those who would control their own computers, and their geographicless fates along with them – into the garbage collector of democratic socialism, wherein they would be every bit as hemmed in as a pre-McNugget chicken. Another part is that, with the panem already taken care of, the social media circenses must do their best to tax the attentions of would-be thinkers at damn near 100% if anyone is to continue taking large nation states seriously.
This is not to say that individuals can be corralled so readily. Complete and absolute adherence to the norms and strictures of the group was one thing in the pre-Information Age, when social pressures could be directly and physically enforced, but it’s quite another in a digital world. Today, iconoclasts from around the world can and do band together, both happily and uninterdictably.
In any event, I foresee that the illiterate and the computer illiterate (there’s no practical difference anymore) will continue to chomp on disposable toys, perhaps with a stylus-like instrument,[vi] perhaps with a voice-to-text interface,[vii] but I’m unpersuaded that anyone needing to regularly input text into a machine will choose an awkward stylus, a tactile-less touchscreen, or soundproof-studio-requiring voice input.
So do I see the physical keyboard dying ? Absolutely not. The physical keyboard is too quiet,[viii] too efficient, too precise, too reliable, too proven, and too effective to be discarded, much less supplanted entirely. Will fewer people have them in their homes ? Sure, because most “people” didn’t know what to do with keyboards in the first place and are all too happy to use their touchscreen smartphones as their sole means of communication, tap-tap-fapping away on their glasspanel interfaces.
In this way, however, the schism will deepen and grow between those using “computer” entertainment devices and those using computer productivity tools. In much the same way that the Orthodox and Catholic Churches split over the location of The Holy Spirit,[ix] so too may computing be split over the location of the keyboard. In one camp are those who use their screen as both keyboard and viewing monitor ; in the other will be those who maintain the clickety-clackitiness of the physical keyboard as the path to Truth, Progress, and control.
Needless to say, the QWERTY layout is here to stay and you’d better believe my kids will learn to type on a physical keyboard. Even if I have to dust off an old copy of Mavis Beacon to do it.
___ ___ ___
- [i] N.B. I didn’t say positive changes, just remarkable, as in, worthy of being remarked upon. It can hardly be said that the only things worthy of remark are positive. Almost the complete opposite, in fact, as the story of human history is that of war, massacre, disease, and pestilence – none of which are exactly “positive” for the victims, even if humanity ends up being stronger for them.↩
- [ii] As foreseen by Nicholas Negroponte circa 1970, much to the chagrin of his detractors at the time. “But our fingerprints will smudge the screen !” his opponents screamed, not unlike the “But we need central banks” derps, I suppose. Negroponte understood the power of a visual message in a way many of his contemporaries obviously didn’t.↩
- [iii] I wouldn’t be nearly as bitter about everyone having a microprocessor in their pocket if it didn’t also mean that I now have to wade through the steaming swamp of shit blanketing the entire Internet, software development, hardware development, and of course Bitcoin spaces. It’s not hard to be up-to-fucking-here with the sea of idiots and their infantile pretenses dragging down the glory of the Connected Age, y’know ? Eh, not that I let it rile me up too much. Rome wasn’t built in a day and it’s kinda fun separating the lemma from the palea !↩
- [iv] Sure, there might be surveillance concerns as well, but these pale in comparison to active diddling.↩
- [v] On the home appliance front, this is known as the “Internet of Things,” which is to say that your devices are monitoring your every move and transmitting their recordings across the Internet to goodness knows who. The idea that this is in any way secure, the idea that any reasonable person would want their blender to just start whirring away because a magical packet came in across the tubes with the message “turn on now,” and the idea that “home automation” could be in any way useful enough to keep up with the vagaries and eccentricities of even the most mundane human existence are really quite offensive, and the whole enterprise has about a snowball’s chance in hell of succeeding.
Why will IoT fail specifically ? Because like Google Glass and the Apple Watch, it’s not cool. Some idiots might line up for it at first, but they’re nowhere near powerful enough to start the trend. “The People” might talk about these brainless technologies as some sort of panacea for the ailments of the post-post-modern soul, but this is little more than Lanier-esque noise designed to distract the shallowly competent but broadly curious.↩
- [vi] While it’s true that writing with a stylus-like instrument is perhaps more natural and more ergonomic, it’s markedly slower – too slow, in fact, to be practicable – even with superior handwriting recognition software.
I recall using Palm’s early character-recognition software (Graffiti) and even becoming accustomed to it. Surely this can be improved upon, but it’s still a painfully methodical interface.↩
- [vii] Voice recognition software is at a similarly primitive stage of development, having remained the same sorry excuse for a secretary over the past 20 years without appreciably narrowing the gap. The notion that AI, which would be required for complete comprehension of existing language and vocabulary, to say nothing of the made-up words often found on these pages and in human speech in general, will bridge the gap in the next two decades seems far-fetched to me. Besides, it’s not like AI is either a solution or a problem in the first place :
AI isn’t really “a problem,” you see. The world isn’t short of intelligent and programmable slaves, it’s just that a great many women and children have emotional hang-ups about treating the resources we have right in front of us as… resources. Instead, everyone is a Medal of Honour-earning hero fit to be memorialised for all eternity with a viral YouTube video.
So AI is not only struggling against a lack of sensible specification, it’s also fighting the thermodynamic inefficiency of creating artificial life compared to biological life. Mother Nature has been at this game a lot longer than we have, and in 9 months, plus another decade for training, she can produce a hearty little servant that will understand commands and be able to execute them reliably. And, thanks to the power of the decentralised factory that is the uterus and the decentralised practice facility that is the family unit, at scale too ! Isn’t life magical ?↩
- [viii] Even with a Model M. Speaking of which, snag one while you can !↩
- [ix] The Filioque is an addition to the Nicene Creed adopted by the Western half of the continental body formerly known as the Holy Roman Empire – namely Catholicism, Protestantism, and Anglicanism – and rejected by the Eastern half – namely the Orthodox Churches – stating that The Holy Spirit proceeds from both the Father and the Son, as opposed to from the Father alone.↩