The edge of writing
The unexpected point made at Wednesday's The Future of Text symposium is that at any given moment text, like any other technology, has both an imaginary future, a logical extension of its present, and an unexpected future, which is what actually goes on to happen when the new technology hits a particular social and cultural context. When Sumerian accountants first began keeping track of commodity sales using marks on blobs of clay, they didn't imagine Jane Austen and Douglas Adams. Similarly, when ancient Egyptians began documenting their relationships with the gods, they did not imagine Martin Luther or the industrial revolution. But if we glance up from our word processors to look back, we can see a pair of tracks stretching back to those two motives for writing: business and philosophy (or maybe art).
Anyone who's ever struggled to organize their thoughts in Microsoft Word finds it frustrating (as I wrote in the Guardian in 2006), from James Gleick (1992) to Charlie Stross (2013). In writers' circles, people trade ideas for workarounds: put paragraphs on index cards and shuffle them on a table. Or post-it notes on a wall. Or chalked-up small slabs of rock on a floor. Or wax tablets with letters cut into them...
In other words, as I see I wrote in 2006, computers haven't solved a damn thing.
Something about this discussion - perhaps under the influence of Tom Standage, there to explain that the ancient Romans had a highly developed social network - merged insidiously with the previous day's first Open Data Institute summit. Ten years ago, our future looked to be about processor power; five years ago it was maybe about mobile and storage; today it's all about data. In any number of recent books you'll see the claim that human intuition and experience are on the way out, beaten by data analytics.
Yet the thing I took away from the summit is that *actually* it isn't about the data at all. The key story was that told by New York City's Michael Flowers. While his group now relies on patterns in inspection data that provide early warnings of fire risk, what enabled that was the labor-intensive process of talking to human inspectors about what the data they collected meant.
"We married knowledge and open data," he said. "We asked what they saw, then matched it to the data." In other words, it's the metadata - the data *about* the data - that really matters. I had an inkling of this when I watched a bunch of teens try to figure out what the headings meant in a spreadsheet displaying some government data they'd begged, borrowed, or stolen for Young Rewired State 2011. My guess is that if we don't keep collecting that metadata, at some point today's data patterns will cease to accurately predict fire risk, because the humans will have drifted in another direction.
Which, again, linked to a completely different event this week, Tuesday evening's Cybersalon event on surveillance. "Metadata" and "traffic analysis" aren't terms the general public knew much about until a couple of months ago, but of course they are at the heart of what's going on with the NSA. Concern about what's being collected and analyzed is leading even Old Net Curmudgeons into some very dark places they never thought they'd consider going, such as what Becky Hogge called the "Splinternet". A number of countries - for example, Brazil - are considering legislation to force local data to be stored locally, out of the NSA's reach. For the last two decades, the Internet pioneers have seen splitting up the Net into national islands as a terrible danger and the reason not to make a number of decisions. But which is worse? The Splinternet or the Global Surveillance Platform? Are these our only choices of future?
On Wednesday a fine rant by Ted Nelson, the father of hypertext, argued that the problem with today's word processors is the tyranny of printing. In other words: business won. What, asked the event's organizer, Frode Hegland, would it look like if some other paradigm had won? What if we were in an alternative future that enabled a different kind of thought? He is experimenting with an answer in the form of a different approach to word processors (unfortunately closed to me: it runs only in the parallel and alien Mac universe, though there are other niche other-paradigm word processors out there).
History, the British Museum's Jonathan Taylor said Wednesday, "is a series of accidents by people who don't understand what's going on." It took 300 years for writing to develop into telling stories. "You have to feel the need and the appropriateness," Taylor said.
Many engineers and cryptographers now feel the need and appropriateness of reinventing the Internet; I see pockets of people discussing how to build new, NSA-proof tools everywhere. While we're doing that - and while we're exploring this new universe of open data - we should keep Word in mind. We should remember to ask not just what kind of future we're creating, but what kind of future we're closing off.
Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.