
The visible computer

I have a friend I would like to lend to anyone who thinks computers have gotten easier in the last 30 years.

The other evening, he asked how to host a Zoom conference. At the time, we were *in* a Zoom call, and I've seen him on many others, so he seemed competent enough.

"Di you have a Zoom account?" I said.

"How do I get that?"

I directed him to the website. No, not the window with our faces; that's the client. "Open up - what web browser do you use?"

"Er...Windows 10?"

"That's the computer's operating system. What do you use to go to a website?"

"Google?"

Did he know how to press ALT-TAB to see the open windows on his system? He did not. Not even after instruction.

But eventually he found the browser, Zoom's website, and the "Join" menu item. He created a password. The password didn't work. (No idea.) He tried to reset the password. More trouble. He decided to finish it later...

To be fair, computers *have* gotten easier. On a 1992 computer, I would have had to write my friend a list of commands to install the software, and he'd have had to type them perfectly every time and learn new commands for each program's individual interface. But the comparative ease of use of today's machines is more than offset by the increased complexity of what we're doing with them. It would never have occurred to my friend even two years ago that he could garnish his computer with a webcam and host video chats around the world.

I was reminded of this during a talk on new threats to privacy that touched on ubiquitous computing and referenced the 1991 paper The Computer for the 21st Century, by Mark Weiser, then head of the famed Xerox PARC research lab.

Weiser imagined the computer would become invisible, a theme also picked up by Donald Norman in his 1998 book, The Invisible Computer. "Invisible" here means we stop seeing it, even though it's everywhere around us. Both Weiser and Norman cited electric motors, which began as large power devices to which you attached things, and then disappeared inside thousands of small and large appliances. When computers are everywhere, they will stop commanding our attention (except when they go wrong, of course). Out of sight, out of mind - but constant sight also means out of mind, because our brains filter out the normal background to focus on the exceptional.

Weiser's group built three examples, which they called tabs (inch-scale), pads (foot-scale), and boards (yard-scale). His tabs sound rather like today's tracking tags. Like the Active Badges developed at Olivetti Research in Cambridge, which they copied (and whose privacy implications horrified the press at the time), they could be used to track people and things, direct calls, automate diary-keeping, and make presentations and research portable throughout the networked area. In 2013, when British journalist Simon Bisson revisited the same paper, he read them more broadly as sensors and effectuators. Pads, in Weiser's conception, were computerized sheets of "scrap" paper to be grabbed and used anywhere and left behind for the next person. Weiser called them an "antidote to windows", in that instead of cramming all your programs into windows on a single screen, you could spread dozens of pads across a full-sized desk (or floor) to work with. Boards were displays, more like bulletin boards, that could be written on with electronic "chalk" and shared across rooms.

"The real power of the concept comes not from any one of these devices; it emerges from the interaction of all of them," Weiser wrote.

In 2013, Bisson suggested Weiser's "embodied virtuality" was taking shape around us as sensors began enabling the Internet of Things and smartphones became the dominant interface to the Internet. But I like Weiser's imagined 21st-century computing better than what we actually have. Cloud services can make our devices more or less interchangeable as long as we have the right credentials, but that only works if broadband is uninterrupted and reliable. And even then, has anyone lost awareness of the computer - phone - in their hand or the laptop on their desk? Compare today to what Weiser thought would be the case 20 years later - which would have been 2011:

Most important, ubiquitous computers will help overcome the problem of information overload. There is more information available at our fingertips during a walk in the woods than in any computer system, yet people find a walk among trees relaxing and computers frustrating. Machines that fit the human environment, instead of forcing humans to enter theirs, will make using a computer as refreshing as taking a walk in the woods.

Who feels like that? Certainly not the friend we began with. Even my computer expert friends seem one and all convinced that their computers hate them. People in search of relaxation watch TV (granted, maybe on a computer), play guitar (even if badly), have a drink, hang with friends and family, play a game (again, maybe on a computer), work out, take a bath. In fact, the first thing people do when they want to relax is flee their computers and the prying interests that use them to spy on us. Worse, we no longer aspire to anything better. Those aspirations have all been lost to A/B testing to identify the most profitable design.


Illustrations: Windows XP's hillside wallpaper (via Wikimedia).

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.
