" /> net.wars: April 2011 Archives


April 29, 2011

Searching for reality

They say that every architect has, stuck in his desk drawer, a plan for the world's tallest skyscraper; probably every computer company similarly has a plan for the world's fastest supercomputer. At one time, that particular contest was always won by Seymour Cray. Currently, the world's fastest computer is Tianhe-1A, in China. But one day soon, it's going to be Blue Waters, an IBM-built machine filling 9,000 square feet at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.

It's easy to forget - partly because Champaign-Urbana is not a place you visit by accident - how mainstream-famous NCSA and its host, UIUC, used to be. NCSA is the place from which Mosaic emerged in 1993. UIUC was where Arthur C. Clarke's HAL was turned on, on January 12, 1997. Clarke's choice was not accidental: my host, researcher Robert McGrath, tells me that Clarke visited here and saw the seminal work going on in networking and artificial intelligence. And somewhere he saw the first singing computer, an IBM 7094 haltingly rendering "Daisy Bell." (Good news for IBM: at that time they wouldn't have had to pay copyright clearance fees on a song that was, in 1961, 69 years old.)

So much was invented here: Telnet, for example.

"But what have they done for us lately?" a friend in London wondered.

NCSA's involvement with supercomputing began when Larry Smarr, having worked in Europe and admired the access non-military scientists had to high-performance computers, wrote a letter to the National Science Foundation proposing that the NSF should fund a supercomputing center for use by civilian scientists. They agreed, and the first version of NCSA was built in 1986. Typically, a supercomputer is commissioned for five years; after that it's replaced with the next fastest thing. Blue Waters will have more than 300,000 processor cores and be capable of a sustained rate of 1 petaflop and a peak rate of 10 petaflops. The transformer room underneath can provide 24 megawatts of power - as energy-efficiently as possible. Right now, the space where Blue Waters will go is a large empty white space broken up by black plug towers. It looks like a set from a 1950s science fiction film.

On the consumer end, we're at the point now where a five-year-old computer pretty much answers most normal needs. Unless you're a gamer or a home software developer, the pressure to upgrade is largely off. But this is nowhere near true at the high end of supercomputing.

"People are never satisfied for long," says Tricia Barker, who showed us around the facility. "Scientists and engineers are always thinking of new problems they want to solve, new details they want to see, and new variables they want to include." Planned applications for Blue Waters include studying storms to understand why some produce tornadoes and some don't. In the 1980s, she says, the data points were kilometers apart; Blue Waters will take the mesh down to 10 meters.

"It's why warnings systems are so hit and miss," she explains. Also on the list are more complete simulations to study climate change.

Every generation of supercomputers gets closer to simulating reality and increases the size of the systems we can simulate in a reasonable amount of time. How much further can it go?

They speculate, she said, about how, when, and whether exaflops can be reached: 2018? 2020? At all? Will the power requirements outstrip what can reasonably be supplied? How big would it have to be? And could anyone afford it?

In the end, of course, it's all about the data. The 500 petabytes of storage Blue Waters will have is only a small piece of the gigantic data sets that science is now producing. Across campus, also part of NCSA, senior research scientist Ray Plante is part of the Large Synoptic Survey Telescope project which, when it gets going, will capture a third of the sky every night with a 3-gigapixel camera with a wide field of view. The project will allow astronomers to see changes over a period of days, allowing them to look more closely at phenomena such as bursters and supernovae, and study dark energy.

Astronomers have led the way in understanding the importance of archiving and sharing data, partly because the telescopes are so expensive that scientists have no choice about sharing them. More than half the Hubble telescope papers, Plante says, are based on archival research, which means research conducted on the data after a short period in which research is restricted to those who proposed (and paid for) the project. In the case of LSST, he says, there will be no proprietary period: the data will be available to the whole community from Day One. There's a lesson here for data hogs if they care to listen.

Listening to Plante - and his nearby colleague Joe Futrelle - talk about the issues involved in storing, studying, and archiving these giant masses of data shows some of the issues that lie ahead for all of us. Many of today's astronomical studies rely on statistics, which in turn requires matching data sets that have been built into catalogues without necessarily considering who might in future need to use them: opening the data is only the first step.

So in answer to my friend: lots. I saw only about 0.1 percent of it.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

April 22, 2011


Modern life is full of so many moments when you see an apparently perfectly normal person doing something that not so long ago was the clear sign of a crazy person. They're walking down the street talking to themselves? They're *on the phone*. They think the inanimate objects in their lives are spying on them? They may be *right*.

Last week's net.wars ("The open zone") talked about the difficulty of finding the balance between usability, on the one hand, and giving users choice, flexibility, and control, on the other. And then, as if to prove this point, along comes Apple and the news that the iPhone has been storing users' location data, perhaps permanently.

The story emerged this week when two researchers at O'Reilly's Where 2.0 conference presented an open-source utility they'd written to allow users to get a look at the data the iPhone was saving. But it really begins last year, when Alex Levinson discovered the stored location data as part of his research on Apple forensics. Based on his months of studying the matter, Levinson contends that it's incorrect to say that Apple is gathering this data: rather, the device is gathering the data, storing it, and backing it up when you sync your phone. Of course, if you sync your phone to Apple's servers, then the data is transferred to your account - and it is also migrated when you purchase a new iPhone or iPad.
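Reports at the time identified the file as consolidated.db, an ordinary unencrypted SQLite database; the table and column names below follow those reports and should be treated as assumptions. A minimal sketch of the kind of query such a utility performs, run here against a mock database rather than a real iPhone backup:

```python
import sqlite3

# Build a tiny mock of the reported consolidated.db layout. The table
# and column names (CellLocation, Timestamp, Latitude, Longitude) are
# taken from contemporary reports and are assumptions here.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE CellLocation
              (Timestamp REAL, Latitude REAL, Longitude REAL)""")
db.execute("INSERT INTO CellLocation VALUES (324000000.0, 40.11, -88.24)")

# The privacy point: no decryption needed - a plain SQL query reads
# the whole location history.
rows = db.execute(
    "SELECT Timestamp, Latitude, Longitude FROM CellLocation").fetchall()
print(rows)  # [(324000000.0, 40.11, -88.24)]
```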

So the news is not quite as bad as it first sounded: your device is spying on you, but it's not telling anybody. However: the data is held in unencrypted form and appears never to expire, and this raises a whole new set of risks about the devices that no one had really focused on until now.

A few minutes after the story broke, someone posted on Twitter that they wondered how many lawyers handling divorce cases were suddenly drafting subpoenas for copies of this file from their soon-to-be-exes' iPhones. Good question (although I'd have phrased it instead as how many script ideas the wonderful, tech-savvy writers of The Good Wife are pitching involving forensically recovered location data). That is definitely one sort of risk; another, ZDNet's Adrian Kingsley-Hughes points out, is that the geolocation may be wildly inaccurate, creating a false picture that may still be very difficult to explain, either to a spouse or to law enforcement, who, as Declan McCullagh writes, know about and are increasingly interested in accessing this data.

There are a bunch of other obvious privacy things to say about this, and Privacy International has helpfully said them in an open letter to Steve Jobs.

"Companies need openness and procedures," PI's executive director, Simon Davies, said yesterday, comparing Apple's position today to Google's a couple of months before the WiFi data-sniffing scandal.

The reason, I suspect, that so many iPhone users feel so shocked and betrayed is that Apple's attention to the details of glossy industrial design and easy-to-understand user interfaces leads consumers to cuddle up to Apple in a way they don't to Microsoft or Google. I doubt Google will get nearly as much anger directed at it for the news that Android phones also collect location data (Android saves only the last 50 mobile masts and 200 WiFi networks). In either event, the key is transparency: when you post information on Twitter or Facebook about your location or turn on geo-tagging, you know you're doing it. In this case, the choice is not clear enough for users to understand what they've agreed to.
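The reported Android behaviour - keeping only the most recent 50 masts and 200 networks - amounts to a bounded cache, which caps how much history the file can ever reveal. An illustrative sketch of that design (not Android's actual code):

```python
from collections import deque

# A location log that retains only the N most recent fixes; older
# entries fall off automatically, limiting what a forensic copy of
# the file could reveal.
class BoundedLocationLog:
    def __init__(self, limit):
        self.entries = deque(maxlen=limit)

    def record(self, fix):
        self.entries.append(fix)

log = BoundedLocationLog(limit=50)
for i in range(1000):
    log.record(("mast", i))

print(len(log.entries))  # 50
print(log.entries[0])    # ('mast', 950) - everything older is gone
```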

The question is: how best can consumers be enabled to make informed decisions? Apple's current method - putting a note saying "Beware of the leopard" at the end of a 15,200-word set of terms and conditions (which are in any case drafted by the company's lawyer to protect the company, not to serve consumers) that users agree to when they sign up for iTunes - is clearly inadequate. It's been shown over and over again that consumers hate reading privacy policies, and you have only to look at Facebook's fumbling attempts to embed these choices in a comprehensible interface to realize that the task is genuinely difficult. This is especially true because, unlike the issue of user-unfriendly systems in the early 1990s, it's not particularly in any of these companies' interests to solve this intransigent and therefore expensive problem. Make it easy for consumers to opt out and they will, hardly an appetizing proposition for companies supported in whole or in part by advertising.

The answer to the question, therefore, is going to involve a number of prongs: user interface design, regulation, contract law, and industry standards, both technical and practical. The key notion, however, is that it should be feasible - even easy - for consumers to tell what information gathering they're consenting to. The most transparent way of handling that is to make opting out the default, so that consumers must take a positive action to turn these things on.

You can say - as many have - that this particular scandal is overblown. But we're going to keep seeing dust-ups like this until industry practice changes to reflect our expectations. Apple, so sensitive to the details of industrial design that will compel people to yearn to buy its products, will have to develop equal sensitivity for privacy by design.


April 15, 2011

The open zone

This week my four-year-old computer had a hissy fit and demanded, more or less simultaneously, a new graphics card, a new motherboard, and a new power supply. It was the power supply that was the culprit: when it blew it damaged the other two pieces. I blame an incident about six months ago when the power went out twice for a few seconds each time, a few minutes apart. The computer's always been a bit fussy since.

I took it to the tech guys around the corner to confirm the diagnosis, and we discussed which replacements to order and where to order them from. I am not a particularly technical person, and yet even I can repair this machine by plugging in replacement parts and updating some software. (It's fine now, thank you.)

Here's the thing: at no time did anyone say, "It's four years old. Just get a new one." Instead, the tech guys said, "It's a good computer with a good processor. Sure, replace those parts." A watershed moment: the first time a four-year-old computer is not dismissed as obsolete.

As if by magic, confirmation turned up yesterday, when the Guardian's Charles Arthur asked whether the PC market has permanently passed its peak. Arthur goes on to quote Jay Chou, a senior research analyst at IDC, suggesting that we are now in the age of "good-enough computing" and that computer manufacturers will now need to find ways to create a "compelling user experience". Apple is the clear leader in that arena, although it's likely that if I'd had a Mac instead of a PC it would have been neither so easy nor so quick and inexpensive to fix my machine and get back to work. Macs are wonders of industrial design, but as I noted in 2007 when I built this machine, building a PC is now a color-by-numbers affair: subsystem pieces that plug together in only one way. What my machine lacks in elegance compared to a Mac is more than made up for by my being able to repair it myself.

But Chou is likely right that this is not the way the world is going.

In his 1998 book The Invisible Computer, usability pioneer Donald Norman projected a future of information appliances, arguing that computers would become invisible because they would be everywhere. (He did not, however, predict the ubiquitous 20-second delay that would accompany this development. You know, it used to be you could turn something on and it would work right away because it didn't have to load software into its memory?) For his model, Norman took electric motors: in the early days you bought one electric motor and used it to power all sorts of variegated attachments; later (now) you found yourself owning dozens of electric motors, all hidden inside appliances.

The trade-off is pretty much the same: the single electric motor with attachments was much more repairable by a knowledgeable end user than today's sealed black-box appliances are. Similarly, I can rebuild my PC, but I can only really replace the hard drive on my laptop, and the battery on my smart phone. iPhone users can't even do that. Norman, whose interest is usability, doesn't - or didn't, since he's written other books since - see this as necessarily a bad deal for consumers, who just want their technology to work intuitively so they can use it to get stuff done.

Jonathan Zittrain, though, has generally taken the opposite view, arguing in his book The Future of the Internet - and How to Stop It and in talks such as the one he gave at last year's Web science meeting that the general-purpose computer, which he dates to 1977, is dying. With it, to some extent, is going the open Internet; it was at that point that, to illustrate what he meant by curated content, he did a nice little morph from the ultra-controlled main menu of CompuServe circa 1992 to today's iPhone home screen.

"How curated do we want things to be?" he asked.

It's the key question. Zittrain's view, backed up by Tim Wu in The Master Switch, is that security and copyright may be the levers used to close down general-purpose computers and the Internet, leaving us with a corporately-owned Internet that runs on black boxes to which individual consumers have little or no access. This is, ultimately, what the "Open" in Open Rights Group seems to me to be about: ensuring that the most democratic medium ever invented remains a democratic medium.

Clearly, there are limits. The earliest computer kits were open - but only to the relatively small group of people with - or willing to acquire - considerable technical skill. My computer would not be more open to me if I had to get out a soldering iron to fix my old motherboard and code my own operating system. Similarly, the skill required to deal with security threats like spam and malware attacks raises the technical bar of dealing with computers to the point where they might as well be the black boxes Zittrain fears. But somewhere between the soldering iron and the point-and-click of a TV remote control there has to be a sweet spot where the digital world is open to the most people. That's what I hope we can find.


April 8, 2011

Brought to book

JK Rowling is seriously considering releasing the Harry Potter novels as ebooks, while Amanda Hocking, who's sold a million or so ebooks, has signed a $2 million contract with St. Martin's Press. In the same week. It's hard not to conclude that ebooks are finally coming of age.

And in many ways this is a good thing. The economy surrounding the Kindle, Barnes and Noble's Nook, and other such devices is allowing more than one writer to find an audience for works that mainstream publishers might have ignored. I do think hard work and talent will usually out, and it's hard to believe that Hocking would not have found herself a good career as a writer via the usual routine of looking for agents and publishers. But she would very likely have many fewer books published at this point, and probably wouldn't be in possession of the $2 million it's estimated she's made from ebook sales.

On the other hand, assuming she had made at least a couple of book sales by now, she might be much more famous: her blog posting explaining her decision notes that a key factor is that she gets a steady stream of complaints from would-be readers that they can't buy her books in stores. She expects to lose money on the St. Martin's deal compared to what she'd make from self-publishing the same titles. To fans of disintermediation, of doing away with gatekeepers and middle men and allowing artists to control their own fates and interact directly with their audiences, Hocking is a self-made hero.

And yet...the future of ebooks may not be so simply rosy.

This might be the moment to stop and suggest reading a little background on book publishing from the smartest author I know on the topic, science fiction writer Charlie Stross. In a series of blog postings he's covered common misconceptions about publishing, why the Kindle's 2009 UK launch was bad news for writers, and misconceptions about ebooks. One of Stross's central points: epublishing platforms are not owned by publishers but by consumer electronics companies - Apple, Sony, Amazon.

If there's one thing we know about the Net and electronic media generally it's that when the audience for any particular new medium - Usenet, email, blogs, social networks - gets to be a certain size it attracts abuse. It's for this reason that every so often I argue that the Internet does not scale well.

In a fascinating posting on Patrick and Teresa Nielsen Hayden's blog Making Light, Jim Macdonald notes the case of Canadian author S K S Perry, who has been blogging on LiveJournal about his travails with a thief. Perry, having had no luck finding a publisher for his novel Darkside, had posted it for free on his Web site, where a thief copied it and issued a Kindle edition. Macdonald links this sorry tale (which seems now to have reached a happy-enough ending) with postings from Laura Hazard Owen and Mike Essex that predict a near future in which we are awash in recycled ebook...spam. As all three of these writers point out, there is no system in place to do the kind of copyright/plagiarism checking that many schools have implemented. The costs are low; the potential for recycling content vast; and the ease of gaming the ratings system extraordinary. And either way, the ebook retailer makes money.
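The schools' systems typically work by comparing overlapping word sequences ("shingles"); a sketch of that basic idea - not any vendor's actual algorithm - shows how cheaply a retailer could flag near-verbatim knock-offs:

```python
def shingles(text, n=3):
    """Return the set of overlapping n-word sequences in text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Overlap between two shingle sets: 1.0 identical, 0.0 disjoint."""
    return len(a & b) / len(a | b) if a | b else 0.0

original  = "the dark side of the street was empty and cold that night"
knockoff  = "the dark side of the street was empty and cold that evening"
unrelated = "supercomputers simulate storms at ten meter resolution"

# A near-verbatim copy scores high; unrelated text scores zero.
print(round(jaccard(shingles(original), shingles(knockoff)), 2))   # 0.82
print(jaccard(shingles(original), shingles(unrelated)))            # 0.0
```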

Macdonald's posting primarily considers this future with respect to the challenge for authors to be successful*: how will good books find audiences if they're tiny islands adrift in a sea of similar-sounding knock-offs and crap? A situation like that could send us all scurrying back into the arms of people who publish on paper. That wouldn't bother Amazon-the-bookseller; Apple and others without a stake in paper publishing are likely to care more (and promising authors and readers due care and diligence might help them build a better, differentiated ebook business).

There is a mythology that those who - like the Electronic Frontier Foundation or the Open Rights Group - oppose the extension and tightening of copyright are against copyright. This is not the case: very few people want to do away with copyright altogether. What most campaigners in this area want is a fairer deal for all concerned.

This week the issue of term extension for sound recordings in the EU revived when Denmark changed tack and announced it would support the proposals. It's long been my contention that musicians would be better served by changes in the law that would eliminate some of the less fair terms of typical contracts, that would provide for the reversion of rights to musicians when their music goes out of commercial availability, and that would alter the balance of power, even if only slightly, in favor of the musicians.

This dystopian projected future for ebooks is a similar case. It is possible to be for paying artists and even publishers and still be against the imposition of DRM and the demonization of new technologies. This moment, where ebooks are starting to kick into high gear, is the time to find better ways to help authors.

*Successful: an author who makes enough money from writing books to continue writing books.


April 1, 2011

Equal access

It is very, very difficult to understand the reasoning behind the not-so-secret plan to institute Web blocking. In a letter to the Open Rights Group (http://www.openrightsgroup.org/blog/2011/minister-confirms-voluntary-site-blocking-discussions), Ed Vaizey, the minister for culture, communications, and creative industries, confirmed that such a proposal emerged from a workshop to discuss "developing new ways for people to access content online". (Orwell would be so proud.)

We fire up Yes, Minister once again to remind everyone of the four characteristics of proposals ministers like: quick, simple, popular, cheap. Providing the underpinnings of Web site blocking is not likely to be very quick, and it's debatable whether it will be cheap. But it certainly sounds simple, and although it's almost certainly not going to be popular among the 7 million people the government claims engage in illegal file-sharing - a number PC Pro has done a nice job of dissecting - it's likely to be popular with the people Vaizey seems to care most about, rights holders.

The four opposing kiss-of-death words are: lengthy, complicated, expensive, and either courageous or controversial, depending on how soon the election is. How to convince Vaizey that it's these four words that apply and not the other four?

Well, for one thing, it's not going to be simple, it's going to be complicated. Web site blocking is essentially a security measure. You have decided that you don't want people to have access to a particular source of data, and so you block their access. Security is, as we know, not easy to implement and not easy to maintain. Security, as Bruce Schneier keeps saying, is a process, not a product. It takes a whole organization to implement the much more narrowly defined IWF system. What kind of infrastructure will be required to support the maintenance and implementation of a block list to cover copyright infringement? Self-regulatory, you say? Where will the block list, currently thought to be about 100 sites, come from? Who will maintain it? Who will oversee it to ensure that it doesn't include "innocent" sites? ISPs have other things to do, and other than limiting or charging for the bandwidth consumption of their heaviest users (who are not all file sharers by any stretch) they don't have a dog in this race. Who bears the legal liability for mistakes?

The list is most likely to originate with rights holders, whom no one trusts to be accurate because they have shown over most of the last 20 years that they care relatively little whether they scoop innocent users and sites into the net alongside infringing ones. Don't the courts have better things to do than adjudicate what percentage of a given site's traffic is copyright-infringing and whether it should be on a block list? Is this what we should be spending money on in a time of austerity? Mightn't it be...expensive?

Making the whole thing even more complicated is the obvious (to anyone who knows the Internet) fact that such a block list will start - and, according to Torrentfreak, already has started - a new arms race.
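The arms race is easy to see in miniature: an exact-match block list is defeated the moment a site reappears under a slightly different name. A toy sketch (the hostnames are made up):

```python
# A hypothetical exact-match block list, as an ISP might deploy.
BLOCKLIST = {"example-torrents.net"}

def is_blocked(hostname):
    """Case-insensitive exact match against the block list."""
    return hostname.lower() in BLOCKLIST

print(is_blocked("example-torrents.net"))    # True
print(is_blocked("Example-Torrents.NET"))    # True
print(is_blocked("example-torrents2.net"))   # False: a mirror walks right past
```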

And yet another wrinkle: among the blocking targets are cyberlockers. Yet this is a type of service that, like search, is going mainstream: Amazon.com has just launched one, which it calls Cloud Drive and which it retains the right to police rather thoroughly. Encrypted files, here we come.

At least one ISP has already called the whole idea expensive, ineffective, and rife with unintended consequences.

There are other obvious arguments, of course. It opens the way to censorship. It penalizes innocent uses of technology as well as infringing ones; torrent search sites typically have a mass of varied material and there are legitimate reasons to use torrenting technology to distribute large files. It will tend to add to calls to spy on Internet users in more intrusive ways (as Web blocking fails to stop the next generation of file-sharing technologies). It will tend to favor large (often American) services and companies over smaller ones. Google, as IsoHunt told the US Court of Appeals two weeks ago, is the largest torrent search engine. (And, of course, Google has other copyright troubles of its own; last week the court rejected the Google Books settlement.)

But the sad fact is that although these arguments are important, they're not a good fit if the main push behind Web blocking is an entrenched belief that the only way to secure economic growth is to extend and tighten copyright while restricting access to technologies and sites that might be used for infringement. Instead, we need to show that this entrenched belief is wrong.

We do not block the roads leading to car boot sales just because sometimes people sell things at them whose provenance is cloudy (at best). We do not place levies on the purchase of musical instruments because someone might play copyrighted music on them. We should not remake the Internet - a medium to benefit all of society - to serve the interests of one industrial group. It would make more sense to put the same energy and financial resources into supporting the games industry, which, as Tom Watson (Lab - West Bromwich East) has pointed out, has great potential to lift the British economy.
