At Apple’s recent special event announcing its latest MacBook Pro lineup, SVP Phil Schiller introduced the new Touch Bar feature by explaining that it was designed to provide a dynamic and adaptive replacement for the row of physical function keys that has accompanied computer keyboards since the early 1970s. Why, he asked, should interface design be constrained by the legacy of a 45-year-old technology?
Yet, just to the south of the new Touch Bar on this sleek, ultra-modern device sits a nearly 145-year-old technology that continues to artificially constrain computer interface design — one that I believe is way overdue for a radical reimagining:
The physical keyboard.
You’d probably think that, as a guy who makes his living herding words, I’d be the one yelling the loudest that you can have my keyboard when you pry it from my cold, dead hands. But before I can explain why I believe the future of writing absolutely demands the disappearance of the physical keyboard, first I need to go off on a highly pedantic tangent for just a moment.
The Way I Want to Touch You
The near-concurrent release of Microsoft’s impressive Surface Studio has inspired a lot of discussion about the divergent approaches that Microsoft and Apple are taking to interface design. The most common form of the argument is that, when it comes to their full-featured devices, Microsoft is cool with you touching the screen and Apple isn’t.
I don’t think that’s the real difference, though.
Now here’s the part where I get pedantic. macOS, Apple’s flagship operating system, has always been touch-enabled (as has Windows, for that matter). You move the mouse with your hand and press keys with your fingers, and these actions make things happen on the screen. (And this is the part where you groan and roll your eyes.)
The difference is that these are intermediated inputs. Your hand is moving a device over here that makes changes appear over there. When you put your finger directly on the screen where you want something to happen, you are disintermediating the input from the output.
What Apple seems to be saying is that it defines the boundaries of its OS ecosystems not in terms of the devices they are used on or the things people use them for, but in terms of whether the method of input is intermediated or disintermediated from the output. On an iPhone and iPad, the input and output are primarily taking place on the same plane. On a laptop or desktop, the input is taking place on a separate device from the output.
This is a perfectly valid design philosophy. Whether it will end up winning the survival-of-the-fittest competition of the marketplace to become the industry standard remains to be seen. But either way, I think it offers the potential for a coherent, consistent worldview.
The problem is, as demonstrated on the new MacBook Pros, in order for this input philosophy to have a fair shot at success, it has to ditch the physical keyboard — which, along with the monitor, is one of the things that traditionally defines what a computer is.
As it stands now, Apple’s flagship laptop has two context-sensitive touch interfaces on the horizontal slab that allow users to interact with the system using a vocabulary of gestures that are both universal to the entire system and particular to the application or even to a specific function within the application.
And between these two sophisticated, modern, highly adaptive inputs sit four rows of single-purpose, permanently affixed mechanical switches.
Apple’s demonstration of the new Touch Bar featured a parade of people using it to do all kinds of magical things. We had a photographer using the trackpad and Touch Bar to edit a photograph, a filmmaker using them to splice a video, and a musician using them to lay down tracks.
Conspicuously, none of them ever touched the alphanumeric keys that took up most of the horizontal real estate.
What if that space could have been used as a light table for the photographer, a digital moviola for the filmmaker, and a full-on mixing board and platters for the musician? And then, when it came time to name a file or type a caption, have the alphanumeric keyboard appear for just long enough for them to type it, then disappear again to be replaced once again by a suite of controls appropriate for the particular application?
And when you’re using office-y applications, the keyboard of your choice — QWERTY, Dvorak, scientific calculator, adding machine, mathematical formula, shorthand, editing symbols — would be the context-appropriate interface that appears on the horizontal input surface, just when you need it.
Apple is already exploring this philosophy on its mobile devices. Want your QWERTY keys to travel and click like they do on a physical keyboard? Haptics have you covered. Need a trackpad? Touch the on-screen keyboard with two fingers and it instantly transforms into one, then snaps back to keyboard mode when you’re done. So not only is it do-able, it already exists within the Apple design philosophy.
Touch the Sky
The QWERTY keyboard is just one of many touch-based input methods that we use to create, manipulate, save, and share what we make. Requiring that it take up permanent residence on a device in the form of physical, single-purpose mechanical switches just looks and feels increasingly antiquated. It forces designers to add modern interface tools around its periphery just as, 45 years ago, computer designers had to add keys for functions, commands, and navigation around the basic, and familiar, typewriter keyboard.
The more things change in computer interface design, in other words, the more they stay the same — as long as those physical QWERTY keys have to be there.
So if Apple and others decide to get rid of the physical keyboard in order to realize the full potential of intermediated input, what would that mean for writers?
As far as I can tell, nothing.
Hey, if you really need or want or can’t do without a physical keyboard, get a wireless one or plug one in. Otherwise, type away on your haptic-glass input panel. If it’s done really well — which is what will make or break the idea — typing on virtual keyboards will be a perfectly normal experience for most users, while also offering options that are impossible with physical keyboards — adjustable key spacing, variable sensitivity, repositioning for maximum comfort, customizable haptic responses, multiple alphabets, accuracy adjustments based on typing speed, user-customizable keys, and tons of other things I can’t even begin to imagine.
Another way to interpret Apple’s dictum that it doesn’t want you touching the screen in macOS is that it doesn’t want you touching the output screen. Apple’s laptops now provide two touch screens on the input side: the trackpad and the Touch Bar. The next step is to unify them — along with the alphanumeric keys that sit between them. Thursday’s MacBook Pro event demonstrated pretty clearly that the future of intermediated input is adaptive, context-sensitive, and virtual. And that means doing away with the mechanical QWERTY keyboard.
The keyboard is dead. Long live the keyboard.
UPDATE, 3/17/18: Yes, a year and a half later and I still haven’t given up on this quixotic vision, and now I’ve seen a glimmer of hope that I may not be alone: “Apple invention involves a Mac laptop with a ‘keyless keyboard’,” AppleWorld.Today, 3/15/18.
UPDATE, 6/3/18: Mashable’s Michael Nuñez: “It’s time for Apple to end the tyranny of the physical keyboard altogether and build a laptop with a full touchscreen keyboard.” (“It’s time for Apple’s next act of courage: Kill the MacBook keyboard”)