
November 23, 2012

Democracy theater

So Facebook is the latest to discover that it's hard to come up with a governance structure online that functions in any meaningful way. This week, the company announced plans to disband the system of voting on privacy changes that it put in place in 2009. To be honest, I'm surprised it took this long.

TechCrunch explains the official reasons. First, with 1 billion users, it's now too easy to hit the threshold of 7,000 comments that triggers a vote on proposed changes. Second, with 1 billion users, amassing the 30 percent of the user base necessary to make the vote count has become...pretty much impossible. (Look, if you hate Facebook's policy changes, it's easier to simply stop using the system. Voting requires engagement.) The company also complained that the system as designed encourages "quantity over quality" in comments. Really, it would be hard to design an online system that didn't, short of making it so hard to use that no one would bother anyway.
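To put those two thresholds side by side - a back-of-the-envelope sketch in Python; the figures are the ones cited above, the arithmetic is mine:

```python
# Rough arithmetic on Facebook's 2009 voting rules, using the figures
# cited above: 7,000 comments on a proposed change triggered a vote,
# but the vote only bound the company if 30 percent of users took part.

users = 1_000_000_000        # roughly Facebook's user base in 2012
comment_trigger = 7_000      # comments needed to force a vote
quorum_fraction = 0.30       # turnout needed for the result to count

quorum = int(users * quorum_fraction)
trigger_share = comment_trigger / users  # share of users needed to trigger

# The trigger is a vanishingly small sliver of the user base, while the
# quorum exceeds the turnout of most national elections.
print(f"Comments to trigger a vote: {comment_trigger:,} "
      f"({trigger_share:.5%} of users)")
print(f"Votes needed to bind the company: {quorum:,}")
```

The asymmetry is the whole story: one rule gets easier to satisfy as the site grows, the other gets harder.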

The fundamental problem for any kind of online governance is that no one except some lawyers thinks governance is fun. (For an example of tedious meetings producing embarrassing results, see this week's General Synod.) Even online, where no one can tell you're a dog watching the Outdoor Channel while typing screeds of debate, it takes strong motivation to stay engaged. That in turn means that ultimately the people who participate, once the novelty has worn off, are either paid, obsessed, or awash in free time.

The people who are paid - either because they work for the company running the service or because they work for governments or NGOs whose job it is to protect consumers or enforce the law - can and do talk directly to each other. They already know each other, and they don't need fancy online governmental structures to make themselves heard.

The obsessed can be divided into two categories: people with a cause and troublemakers - trolls. Trolls can be incredibly disruptive, but they do eventually get bored and go away, IF you can get everyone else to starve them of the oxygen of attention by just ignoring them.

That leaves two groups: those with time (and patience) and those with a cause. Both tend to fall into the category Mark Twain neatly summed up: "Never argue with a man who buys his ink by the barrelful." Don't get me wrong: I'm not knocking either group. The cause may be good and righteous and deserving of having enormous amounts of time spent on it. The people with time on their hands may be smart, experienced, and expert. Nonetheless, they will tend to drown out opposing views with sheer volume and relentlessness.

All of which is to say that I don't blame Facebook if it found the comments process tedious and time-consuming, and as much of a black hole for its resources as the help desk for a company with impenetrable password policies. Others are less tolerant of the decision. History, however, is on Facebook's side: democratic governance of online communities does not work.

Even without the generic problems of online communities, which have been replicated mutatis mutandis since the first modem uploaded the first bit, Facebook was always going to face problems of scale if it kept growing. As several stories have pointed out, how do you get 300 million people to care enough to vote? It's understandable why the company set a minimum percentage: so that a small but vocal minority could not hijack the process. But scale matters, and that's why every democracy of any size has representative government rather than direct voting like that of the citizens of ancient Athens. (Pause to imagine the complexities of deciding how to divvy up Facebook into tribes: would the basic unit of membership be nation, family, or circle of friends, or should people be allocated into groups based on when they joined or perhaps their average posting rate?)

The 2009 decision to allow votes came at a time when Facebook was under recurring and frequent pressure over a multitude of changes to its privacy policies, all going one way: toward greater openness. That was the year, in fact, that the system effectively turned itself inside out. EFF has a helpful timeline of the changes from 2005 to 2010. Putting the voting system in place was certainly good PR: it made the company look like it was serious about listening to its users. But, as the Europe vs Facebook site says, the choice was always constrained to old policy or new policy, not new policy, old policy, or an entirely different policy proposed by users.

Even without all that, the underlying issue is this: what company would want democratic governance to succeed? The fact is that, as Roger Clarke observed before Facebook even existed, social networks have only one business model: to monetize their users. The pressure to do that has only increased since Facebook's IPO, even though founder Mark Zuckerberg created a dual-class structure that means his decisions cannot be effectively challenged. A commercial company - especially a *public* commercial company - cannot be run as a democracy. It's as simple as that. No matter how much their engagement makes them feel they own the place, the users are never in charge of the asylum. Not even on the WELL.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series.

November 16, 2012

Grabbing at governance

Someday the development of Internet governance will look like a continuous historical sweep whose outcome, in hindsight, is obvious. At the beginning will be one man, Jon Postel, who in the mid-1990s was, if anyone was, the god of the Internet. At the end will be...well, we don't know yet. And the sad thing is that the road to governance is so long and frankly so dull: years of meetings, committees, proposals, debate, redrafted proposals, diplomatic language, and, worst of all, remote from the mundane experience of everyday Internet users, such as spam and whether they can trust their banks' Web sites.

But if we care about the future of the Internet we must take an interest in what authority should be exercised by the International Telecommunication Union or the Internet Corporation for Assigned Names and Numbers or some other yet-to-be-defined body. In fact, we are right on top of a key moment in that developmental history: from December 3 to 14, the ITU is convening the World Conference on International Telecommunications (WCIT, pronounced "wicket"). The big subject for discussion: how and whether to revise the 1988 International Telecommunication Regulations.

Plans for WCIT have been proceeding for years. In May, civil society groups concerned with civil liberties and human rights signed a letter to ITU secretary-general Hamadoun Touré asking the ITU to open the process to more stakeholders. In June, a couple of frustrated academics changed the game by setting up WCITLeaks, asking anyone who had copies of the proposals being submitted to the ITU to send them in. Scrutiny of those proposals showed the variety and breadth of some countries' desires for regulation. On November 7, Touré wrote an op-ed for Wired arguing that nothing would be passed except by consensus.

On Monday, he got a sort of answer from the International Trade Union Confederation secretary-general, Sharon Burrow, who, together with former ICANN head Paul Twomey and, by video link, Internet pioneer Vint Cerf, launched the Stop the Net Grab campaign. The future of the Internet, they argued, is too important to too many stakeholders to leave decisions about its future up to governments bargaining in secret. The ITU, in its response, argued that Greenpeace and the ITUC have their facts wrong; after the two sides met, the ITUC reiterated its desire for some proposals to be taken off the table.

But stop and think. Opposition to the ITU is coming from Greenpeace and the ITUC?

"This is a watershed," said Twomey. "We have a completely new set of players, nothing to do with money or defending the technology. They're not priests discussing their protocols. We have a new set of experienced international political warriors saying, 'We're interested'."

Explained Burrow, "How on earth is it possible to give the workers of Bahrain or Ghana the solidarity of strategic action if governments decide unions are trouble and limit access to the Internet? We must have legislative political rights and freedoms - and that's not the work of the ITU, if it requires legislation at all."

At heart for all these years, the debate remains the same: who controls the Internet? And does governing the Internet mean regulating who pays whom or controlling what behavior is allowed? As Vint Cerf said, conflating those two is confusing content and infrastructure.

Twomey concluded, "[Certain political forces around the world] see the ITU as the place to have this discussion because it's not structured to be (nor will they let it be) fully multi-stakeholder. They have taken the opportunity of this review to bring up these desires. We should turn the question around: where is the right place to discuss this and who should be involved?"

In the journey from Postel to governance, this is the second watershed. The first step change came in 1996-1997, when it was becoming obvious that governing the Internet - which at the time primarily meant managing the allocation of domain names and numbered Internet addresses (under the aegis of the Internet Assigned Numbers Authority) - was too complex and too significant a job for one man, no matter how respected and trusted. The Internet Society and IANA formed the Internet Ad-Hoc Committee, which, in a published memorandum, outlined its new strategy. And all hell broke loose.

Long-term, the really significant change was that until that moment no one had much objected to either the decisions the Internet pioneers and engineers made or their right to make them. After some pushback, in the end the committee was disbanded and the plan scrapped, and instead a new agreement was hammered out, creating ICANN. But the lesson had been learned: there were now more people who saw themselves as Internet stakeholders than just the engineers who had created it, and they all wanted representation at the table.

In the years since, the make-up of the groups demanding to be heard has remained pretty stable, as Twomey said: engineers and technologists, plus representatives of civil society groups, usually working in some aspect of human rights, typically civil liberties, such as EFF, ORG, CDT, and Public Knowledge, all of whom signed the May letter. So yes, for labor unions and Greenpeace to decide that Internet freedoms are too fundamental to what they do to sit out the decision-making about its future is a watershed.

"We will be active as long as it takes," Burrow said Monday.


October 26, 2012

Lie to me

I thought her head was going to explode.

The discussion that kicked off this week's Parliament and Internet conference revolved around cybersecurity and trust online, harmlessly at first. Then Helen Goodman (Labour - Bishop Auckland), the shadow minister for Culture, Media, and Sport, raised a question: what was Nominet doing to get rid of anonymity online? Simon McCalla, Nominet's CTO, had some answers: primarily, they're constantly trying to improve the accuracy and reliability of the Whois database, but it's only a very small criminal element that engages in false domain name registration. Like that.

A few minutes later, Andy Smith, PSTSA Security Manager, Cabinet Office, in answer to a question about why the government was joining the Open Identity Exchange (as part of the Identity Assurance Programme), advised those assembled to protect themselves online by lying: don't give your real name, date of birth, or other information that can be used to perpetrate identity theft.

Like I say, bang! Goodman was horrified. I was sitting near enough to feel the splat.

It's the way of now that the comment was immediately tweeted, picked up by the BBC reporter in the room, published as a story, retweeted, Slashdotted, tweeted some more, and finally boomeranged back to be recontextualized from the podium. Given a reporter with a cellphone and multiple daily newspaper editions, George Osborne's contretemps in first class would still have reached the public eye the same day 15 years ago. This bit of flashback couldn't have happened even five years ago.

For the record, I think it's clear that Smith gave good security advice, and that the headline - the greater source of concern - ought to be that Goodman, an MP apparently frequently contacted by constituents complaining about anonymous cyberbullying, doesn't quite grasp that this is a nuanced issue with multiple trade-offs. (Or, possibly, how often the cyberbully is actually someone you know.) Dates of birth, mothers' maiden names, the names of first pets...these are all things that real-life friends and old schoolmates may well know, and lying about the answers is a perfectly sensible precaution given that there is often no choice about giving the real answers for more sensitive purposes, like interacting with government, medical, and financial services. It is not illegal to fake or refuse to disclose these things, and while Facebook has a real names policy it's enforced with so little rigor that it has a roster of fake accounts the size of Egypt.

Although: the Earl of Erroll might be a bit busy today changing the fake birth date - April 1, 1900 - that he cheerfully told us, and Radio 4, he uses throughout; one can only hope he doesn't use his real mother's maiden name, since that, as Tom Scott pointed out later, is in Erroll's Wikipedia entry. Since my real birth date is also in *my* Wikipedia entry and who knows what I've said where, I routinely give false answers to standardized security questions. What's the alternative? Giving potentially thousands of people the answers that will unlock your bank account? On social networking sites it's not enough for you to be taciturn; your birth date may easily be outed by well-meaning friends writing on your wall. None of this is - or should be - illegal.
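The trick, of course, is that the lie doesn't need to be memorable to you, only unguessable to everyone else. A minimal sketch of the idea - my own illustration, not anything Smith proposed, with a plain dict standing in for whatever password manager you already trust: treat each "security question" as just another password and give it a random answer.

```python
# Sketch: random per-site answers to "security questions", treated
# exactly like passwords. The `vault` dict is a hypothetical stand-in
# for a real password manager.
import secrets

vault = {}  # site -> {question: answer}

def fake_answer(site, question):
    """Generate and record a random answer to a site's security question."""
    answer = secrets.token_urlsafe(12)  # unguessable, unrelated to real life
    vault.setdefault(site, {})[question] = answer
    return answer

a1 = fake_answer("examplebank.co.uk", "Mother's maiden name?")
a2 = fake_answer("examplebank.co.uk", "First pet's name?")
# Different questions get different answers; nothing links back to facts
# a schoolmate, a Wikipedia editor, or a wall-posting friend could know.
```

The design point: once the answer is random and stored, it is no weaker than the password itself, and no Wikipedia entry can betray it.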

It turns out that it's still pretty difficult to explain to some people how the Internet works. Nominet can work as hard as it likes on verifying its own Whois database, but it is powerless over the many UK citizens and businesses that choose to register under .com, .net, and other gTLDs and country codes. Making a law to enjoin British residents and companies from registering domains outside .uk...well, how on earth would you enforce that? And then there's the whole problem of trying to check, say, registrations in Chinese characters. Computers can't read Chinese? Well, no, not really, no matter what Google Translate might lead you to believe.

Anonymity on the Net has been under fire for a long, long time. Twenty years ago, the main source of complaints was AOL, whose million-CD marketing program made it easy for anyone to get a throwaway email address for 24 hours or so until the system locked you out for providing an invalid credit card number. Then came Hotmail, and you didn't even need that. Then, as now, there are good and bad reasons for being anonymous. For every nasty troll who uses the cloak to hide there are many whistleblowers and people in private pain who need its protection.

Smith's advice only sounds outrageous if, like Goodman, you think there's a valid comparison between Nominet's registration activity and the function of the Driver and Vehicle Licensing Agency (and if you think the domain name system is the answer to ensuring a traceable online identity). And therein lies the theme of the day: the 200-odd Parliamentarians, consultants, analysts, government, and company representatives assembled repeatedly wanted incompatible things in conflicting ways. The morning speakers wanted better security, stronger online identities, and the resources to fight cybercrime; the afternoon folks were all into education and getting kids to hack and explore so they learn to build things and understand things and maybe have jobs someday, to their own benefit and that of the rest of the country. Paul Bernal has a good summary.




October 12, 2012

My identity, my self

Last week, the media were full of the story that the UK government was going to start accepting Facebook logons for authentication. This week, in several presentations at the RSA Conference, representatives of the Government Digital Service begged to differ: the list of companies that have applied to become identity providers (IDPs) will be published at the end of this month, and until then they are not confirming the presence or absence of any particular company. According to several of the spokesfolks manning the stall and giving presentations, the press simply assumed that when they saw social media companies among the categories of organization that might potentially want to offer identity authentication, that meant Facebook. We won't know for another few weeks who has actually applied.

So I can mercifully skip the rant that hooking a Facebook account to the authentication system you use for government services is a horrible idea in both directions. What they're actually saying is, what if you could choose among identification services offered by the Post Office, your bank, your mobile network operator (especially for the younger generation), your ISP, and personal data store services like Mydex or small, local businesses whose owners are known to you personally? All of these sounded possible based on this week's presentations.

The key, of course, is what standards the government chooses to create for IDPs and which organizations decide they can meet those criteria and offer a service. Those are the details the devil is in: during the 1990s battles about deploying strong cryptography, the government wanted copies of everyone's cryptography keys to be held in escrow by a Trusted Third Party. At the time, the frontrunners were banks: the government certainly trusted those, and imagined that we did, too. The strength of the disquiet over that proposal took them by surprise. Then came 2008. Those discussions are still relevant, however; someone with a long memory raised the specter of Part I of the Electronic Communications Act 2000, modified in 2005, as relevant here.

It was this historical memory that made some of us so dubious in 2010, when the US came out with proposals rather similar to the UK's present ones, the National Strategy for Trusted Identities in Cyberspace (NSTIC). Ross Anderson saw it as a sort of horror-movie sequel. On Wednesday, however, Jeremy Grant, the senior executive advisor for identity management at the US National Institute for Standards and Technology (NIST), the agency charged with overseeing the development of NSTIC, sounded a lot more reassuring.

Between then and now came both US and UK attempts to establish some form of national ID card. In the US, "Real ID" focused on the state authorities that issue driver's licenses. In the UK, it was the national ID card and accompanying database. In both countries the proposals got howled down. In the UK especially, the combination of an escalating budget, a poor record with large government IT projects, a change of government, and a desperate need to save money killed it in 2010.

Hence the new approach in both countries. From what the GDS representatives - David Rennie (head of proposition at the Cabinet Office), Steven Dunn (lead architect of the Identity Assurance Programme; Twitter: @cuica), Mike Pegman (security architect at the Department for Work and Pensions, expected to be the first user service; Twitter: @mikepegman), and others manning the GDS stall - said, the plan is much more like the structure that privacy advocates and cryptographers have been pushing for 20 years: systems that give users choice about who they trust to authenticate them for a given role and that share no more data than necessary. The notion that this might actually happen is shocking - but welcome.

None of which means we shouldn't be asking questions. We need to understand clearly the various envisioned levels of authentication. In practice, will those asking for identity assurance ask for the minimum they need or always go for the maximum they could get? For example, a bar only needs relatively low-level assurance that you are old enough to drink; but will bars prefer to ask for full identification? What will be the costs; who pays them and under what circumstances?
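The bar is the textbook case for minimal disclosure: the venue needs one bit of information - old enough or not - rather than your name and birth date. A toy contrast between the two postures (the names and structure here are mine, not the GDS's):

```python
# Toy contrast between maximum and minimum disclosure for an age check.
# The IDP holds the full record; a minimal-disclosure design answers
# only the question actually asked.
from datetime import date

record = {"name": "A. User", "dob": date(1990, 5, 1)}  # held by the IDP

def full_disclosure(record):
    # What a maximalist relying party might demand: the whole record.
    return record

def minimal_disclosure(record, on_date):
    # What the bar actually needs: one yes/no attribute.
    # (Integer division by 365 is a rough age calculation, fine for a toy.)
    age = (on_date - record["dob"]).days // 365
    return {"over_18": age >= 18}

assert minimal_disclosure(record, date(2012, 10, 12)) == {"over_18": True}
```

The policy question in the paragraph above is exactly which of these two functions relying parties will be permitted - or tempted - to call.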

Especially, we need to know the details of the standards that organizations must meet to be accepted as IDPs - in particular, what kinds of organization they exclude. The GDS as presently constituted - composed, as William Heath commented last year, of all the smart, digitally experienced people you *would* hire to reinvent government services for the digital world if you had the choice - seems to have its heart in the right place. Their proposals as outlined - conforming, as Pegman explained happily, to Kim Cameron's seven laws of identity - pay considerable homage to the idea that no one party should have all the details of any given transaction. But the surveillance-happy type of government that legislates for data retention and CCDP might also at some point think, hey, shouldn't we be requiring IDPs to retain all data (requests for authentication, and so on) so we can inspect it should we deem it necessary? We certainly want to be very careful not to build a system that could support such intimate secret surveillance - the fundamental objection all along to key escrow.



June 15, 2012

A license to print money

"It's only a draft," Julian Huppert, the Liberal Democrat MP for Cambridge, said repeatedly yesterday. He was talking about the Draft Communications Data Bill (PDF), which was published on Wednesday. Yesterday, in a room in a Parliamentary turret, Huppert convened a meeting to discuss the draft; in attendance were a variety of Parliamentarians plus experts from civil society groups such as Privacy International, the Open Rights Group, Liberty, and Big Brother Watch. Do we want to be a nation of suspects?

The Home Office characterizes the provisions in the draft bill as vital powers to help catch criminals, save lives, and protect children. Everyone else - the Guardian, ZDNet UK, and dozens more - is calling them the "Snooper's charter".

Huppert's point is important. Like the Defamation Bill before it, publishing a draft means there will be a select committee with 12 members, discussion, comments, evidence taken, a report (by November 30, 2012), and then a rewritten bill. This draft will not be voted on in Parliament. We don't have to convince 650 MPs that the bill is wrong; it's a lot easier to talk to 12 people. This bill, as is, would never pass either House in any case, he suggested.

This is the optimistic view. The cynic might suggest that since it's been clear for something like ten years that the British security services (or perhaps their civil servants) have a recurring wet dream in which their mountain of data is the envy of other governments, they're just trying to see what they can get away with. The comprehensive provisions in the first draft set the bar, softening us up to give away far more than we would have in future versions. Psychologists call this anchoring, and while probably few outside the security services would regard the wholesale surveillance and monitoring of innocent people as normal, the crucial bit is where you set the initial bar for comparison for future drafts of the legislation. However invasive the next proposals are, it will be easy for us to lose the bearings we came in with and feel that we've successfully beaten back at least some of the intrusiveness.

But Huppert is keeping his eye on the ball: maybe we can not only get the worst stuff out of this bill but make things actually better than they are now; it will amend RIPA. The Independent argues that private companies hold much more data on us overall but that article misses that this bill intends to grant government access to all of it, at any time, without notice.

The big disappointment in all this, as William Heath said yesterday, is that it marks a return to the old, bad, government IT ways of the past. We were just getting away from giant, failed public IT projects like the late unlamented NHS National Programme for IT and the even more unlamented ID card, towards agile, cheap public projects run by smart guys who know what they're doing. And now we're going to spend £1.8 billion of public money over ten years (draft bill, p92) building something no one much wants and that probably won't work? The draft bill claims - on what authority is unclear - that the expenditure will bring in £5 to £6 billion in revenues. From what? Are they planning to sell the data?

Or are they imagining the economic growth implied by the activity that will be necessary to build, install, maintain, and update the black boxes every ISP will need in order to comply with the law? The security consultant Alec Muffett has laid out the parameters for this SpookBox 5000: certified, tested, tamperproof, made by, say, three trusted British companies. Hundreds of them, legally required, with ongoing maintenance contracts. "A license to print money," he calls them. Nice work if you can get it, of course.

So we're talking - again - about spending huge sums of government money on a project that only a handful of people want and whose objectives could be better achieved by less intrusive means. Give police better training in computer forensics, for example, so they can retrieve the evidence they need from the devices they find when executing a search warrant.

Ultimately, the real enemy is the lack of detail in the draft bill. Using the excuse that the communications environment is changing rapidly and continuously, the notes argue that flexibility is absolutely necessary for Clause 1, the one that grants the government all the actual surveillance power, and so it's been drafted to include pretty much everything, like those contracts that claim copyright in perpetuity in all forms of media that exist now or may hereinafter be invented throughout the universe. This is dangerous because in recent years the use of statutory instruments to bypass Parliamentary debate has skyrocketed. No. Make the defenders of this bill prove every contention; make them show the evidence that makes every extra bit of intrusion necessary.




April 13, 2012

The people perimeter

People with jobs are used to a sharp division between their working lives and their private lives. Even in these times, when everyone carries a mobile phone and may be on call at any moment, they still tend to believe that what they say to their friends is no concern of their employer's. (Freelances tend not to have these divisions; to a much larger extent we have always been "in public" most of the time.)

These divisions were always less in small towns, where teachers or clergy had little latitude, and where even lesser folk would be well advised to leave town before doing anything they wouldn't want discussed in detail. Then came social media, which turn everywhere into a small town, where even if you behave impeccably details about you and your employer may be exposed without your knowledge.

That's all a roundabout way of leading to yesterday's London Tea camp, where the subject of discussion was developing guidelines for social media use by civil servants.

Civil servants! The supposedly faceless functionaries who, certainly at the senior levels, are probably still primarily understood by most people through the fictional constructs of TV shows like Yes, Minister and The Thick of It. All of the 50 or 60 people from across government who attended yesterday have Twitter IDs; they're on Facebook and Foursquare, and probably a few dozen other things that would horrify Sir Humphrey. And that's as it should be: the people administering the nation's benefits, transport, education, and health absolutely should live like the people they're trying to serve. That's how you get services that work for us rather than against us.

The problem with social media is the same as their benefit: they're public in a new and different way. Even if you never identify your employer, Foursquare or the geotagging on Twitter or Facebook checks you in at a postcode that's indelibly identified with the very large government building where your department is the sole occupant. Or a passerby photographs you in front of it and Facebook helpfully tags your photograph with your real name, which then pops up in outside searches. Or you say something to someone you know who tells someone else who posts it online for yet another person to identify and finally the whole thing comes back and bites you in the ass. Even if your Tweets are clearly personal, and even if your page says, "These are just my personal opinions and do not reflect those of my employer", the fact of where you can be deduced to work risks turning anything connected to you into something an - let's call it - excitable journalist can make into a scandal. Context is king.

What's new about this is the uncontrollable exposure of this context. Any Old Net Curmudgeon will tell you that the simple fact of people being caught online doing things their employers don't like goes back to the dawn of online services. Even now I'm sure someone dedicated could find appalling behavior in the Usenet archives by someone who is, 25 years on, a highly respected member of society. But Usenet was a minority pastime; Facebook, Twitter et al are mainstream.

Lots has been written by and about employers in this situation: they may suffer reputational damage, legal liability, or a breach that endangers their commercial secrets. Not enough has been written about individuals struggling to cope with sudden, unwanted exposure. Don't we have the right to private lives? someone asked yesterday. What they are experiencing is the same loss of border control that security engineers are trying to cope with. They call it "deperimeterization", because security used to mean securing the perimeter of your network and now security means coping with its loss. Adding wireless, remote access for workers at home, personal devices such as mobile phones, and links to supplier and partner networks have all blown holes in it.

There is no clear perimeter any more for networks - or individuals, either. Trying to secure one by dictating behavior, whether by education, leadership by example, or written guidelines, is inevitably doomed. There is, however, a very valid reason to have these things: to create a general understanding between employer and employee. It should be clear to all sides what you can and cannot get fired for.

In 2003, Danny O'Brien nailed a lot of this when he wrote about the loss of what he called the "private-intermediate sphere". In that vanishing country, things were private without being secret. You could have a conversation in a pub with strangers walking by and be confident that it would reach only the audience present at the time and that it would not unexpectedly be replayed or published later (see also Dan Harmon and Chevy Chase's voicemail). Instead, he wrote, the Net is binary: secret or public, no middle ground.

What's at stake here is really not private life, but *social* life. It's the addition of the online component to our social lives that has torn holes in our personal perimeters.

"We'll learn a kind of tolerance for the private conversation that is not aimed at us," O'Brien predicted, "and that overreacting to that tone will be a sign of social naivete." Maybe. For now, hard cases make bad law (and not much better guidelines), and *first* cases are almost always hard cases.


Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.


March 23, 2012

The year of the future

If there's one thing everyone seemed to agree on yesterday at Nominet's annual Internet policy conference, it's that this year, 2012, is a crucial one in the development of the Internet.

The discussion had two purposes. One is to feed into Nominet's policy-making as the body in charge of .uk, in which capacity it's currently grappling with questions such as how to respond to law enforcement demands to disappear domains. The other, which is the kind of exercise net.wars particularly enjoys and that was pioneered at the Computers, Freedom, and Privacy conference (next one spring 2013, in Washington, DC), is to peer into the future and try to prepare for it.

Vint Cerf, now Google's Chief Internet Evangelist, outlined some of that future, saying that this year, 2012, will see more dramatic changes to the Internet than anything since 1983. He had a list:

- The deployment of better authentication in the form of DNSSec;

- New certification regimes to limit damage in the event of more cases like 2011's Diginotar hack;

- internationalized domain names;

- The expansion of new generic top-level domains;

- The switch to IPv6 Internet addressing, which happens on June 6;

- Smart grids;

- The Internet of things: cars, light bulbs, surfboards (!), and anything else that can be turned into a sensor by implanting an RFID chip.

Cerf paused to throw in an update on the interplanetary Internet, the long-running project he's been thinking about since 1998 (TXT).

"It's like living in a science fiction novel," he said yesterday as he explained overcoming intense network lag by using high-density laser pulses. The really cool bit: repurposing spacecraft whose scientific missions have been completed to become part of the interplanetary backbone. Not space junk: network nodes-in-waiting.

The contrast to Ed Vaizey, the minister for culture, communications and the creative industries at the Department of Culture, Media, and Sport, couldn't have been more marked. He summed up the Internet's governance problem as the "three Ps": pornography, privacy, and piracy. It's nice rhetorical alliteration, but desperately narrow. Vaizey's characterization of 2012 as a critical year rests on the need to consider the UK's platform for the upcoming Internet Governance Forum leading to 2014's World Information Technology Forum. When Vaizey talks about regulating with a "light touch", does he mean the same things we do?

I usually place the beginning of the who-governs-the-Internet argument at 1997, the first time the engineers met rebellion when they made a technical decision (revamping the domain name system). Until then, if the pioneers had an enemy it was governments, memorably warned off by John Perry Barlow's 1996 Declaration of the Independence of Cyberspace. After 1997, it was no longer possible to ignore the new classes of stakeholders, commercial interests and consumers.

I'm old enough as a Netizen - I've been online for more than 20 years - to find it hard to believe that the Internet Governance Forum and its offshoots do much to change the course of the Internet's development: while they're talking, Google's self-driving cars rack up 200,000 miles on San Francisco's busy streets with just one accident (the car was rear-ended; not their fault) and Facebook sucks in 800 million users (if it were a country, it would be the world's third most populous nation).

But someone has to take on the job. It would be morally wrong for governments, banks, and retailers to push us all to transact with them online if they cannot promise some level of service and security for at least those parts of the Internet that they control. And let's face it: most people expect their governments to step in if they're defrauded and criminal activity is taking place, offline or on, which is why I thought Barlow's declaration absurd at the time.

Richard Allan, director of public policy for Facebook EMEA - or should we call him Lord Facebook? - had a third reason why 2012 is a critical year: at the heart of the Internet Governance Forum, he said, is the question of how to handle the mismatch between global Internet services and the cultural and regulatory expectations that nations and individuals bring with them as they travel in cyberspace. In Allan's analogy, the Internet is a collection of off-shore islands like Iceland's Surtsey, which has been left untouched to develop its own ecosystem.

Should there be international standards imposed on such sites so that all users know what to expect? Such a scheme would overcome the Balkanization problem that erupts when sites present a different face to each nation's users and the censorship problem of blocking sites considered inappropriate in a given country. But if that's the way it goes, will nations be content to aggregate the most open standards or insist on the most closed, lowest-common-denominator ones?

I'm not sure this is a choice that can be made in any single year - they were asking this same question at CFP in 1994 - but if this is truly the year in which it's made, then yes, 2012 is a critical year in the development of the Internet.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

January 21, 2012

Camping out

"Why hasn't the marvelous happened yet?" The speaker - at one of today's "unconference" sessions at this year's UK Govcamp - was complaining that with 13,000-odd data sets up on his organization's site there ought to be, you know, results.

At first glance, GovCamp seems peculiarly British: an incongruous mish-mash of government folks, coders, and activists, all brought together by the idea that technology makes it possible to remake government to serve us better. But the Web tells me that events like this are happening in various locations around Europe. James Hendler, who likes to collect government data sets from around the world (700,000 and counting now!), tells me that events like this are happening all over the US, too - except that there, an event of this size - a couple of hundred people - happens only in a city the size of New York.

That's both good and bad: a local area in the US can find many more people to throw at more discrete problems - but on the other hand the federal level is almost impossible to connect with. And, as Hendler points out, the state charters mean that there are conversations the US federal government simply cannot have with its smaller, local counterparts. In the UK, if central government wants a local authority to do something, it can just issue an order.

This year's GovCamp is a two-day affair. Today was an "unconference": dozens of sessions organized by participants to talk about...stuff. Tomorrow will be hands-on, doing things in the limited time available. By the end of the day, the Twitter feed was filling up with eagerness to get on with things.

A veteran camper - I'm not sure how to count how many there have been - tells me that everyone leaves the event full of energy, convinced that they can change the world on Monday. By later next week, they'll have come down from this exhilarated high to find they're working with the same people and the same attitudes. Wonders do not happen overnight.

Along those lines, Mike Bracken, who launched the Guardian's open data platform and is now at the Cabinet Office, acknowledged this when he thanked the crowd for the ten years of persistence and pain that created his job. The user, his colleague Mark O'Neill said recently, is at the center of everything they're working on. Are we, yet, past proving the concept?

"What should we do first?" someone I couldn't identify (never knowing who's speaking is a pitfall of unconferences) asked in the same session as the marvel-seeker. One offered answer was one any open-source programmer would recognize: ask yourself, in your daily life, what do you want to fix? The problem you want to solve - or the story you want to tell - determines the priorities and what gets published. That's if you're inside government; if you're outside, based on last summer's experience following the Osmosoft teams during Young Rewired State, often the limiting factor is what data is available and in what form.

With luck and perseverance, this should be a temporary situation. As time goes on, and open data gets built into everything, publishing it should become a natural part of everything government does. But getting there means eliminating a whole tranche of traditional culture and overcoming a lot of fear. If I open this data and others can review my decisions will I get fired? If I open this data and something goes wrong will it be my fault?

In a session on creative councils, I heard the suggestion that, in the interests of getting rid of gatekeepers who obstruct change, organizational structures should be transformed into networks with alternate routes to getting things done until the hierarchy is no longer needed. It sounds like a malcontent's dream for getting the desired technological change past a recalcitrant manager, but it's also the kind of solution that solves one problem by breaking many other things. In such a set-up, who is accountable to taxpayers? Isn't some form of hierarchy inevitable given that someone has to do the hiring and firing?

It was in a session on engagement that it became apparent that, as much as this event seems to be focused on technological fixes, the real goal is far broader. The discussion veered into consultations and how to build persistent networks of people engaged with particular topics.

"Work on a good democratic experience," advised the session's leader. Make the process more transparent, make people feel part of the process even if they don't get what they want, create the connection that makes for a truly representative democracy. In her view, what goes wrong with the consultation process now - where, for example, advocates of copyright reform find themselves writing the same ignored advice over and over again in response to the same questions - is that it's trying to compensate for the poor connections to their representatives that most people have. Building those persistent networks and relationships is only a partial answer.

"You can't activate the networks and not at the same time change how you make decisions," she said. "Without that parallel change you'll wind up disappointing people."

Marvels tomorrow, we hope.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

December 30, 2011

Ignorance is no excuse

My father was not a patient man. He could summon up some compassion for those unfortunates who were stupider than himself. What he couldn't stand was ignorance, particularly willful ignorance. The kind of thing where someone boasts about how little they know.

That said, he also couldn't abide computers. "What can you do with a computer that you can't do with a paper and pencil?" he demanded to know when I told him I was buying a friend's TRS-80 Model III in 1981. He was not impressed when I suggested that it would enable me to make changes on page 3 of a 78-page manuscript without retyping the whole thing.

My father had a valid excuse for that particular bit of ignorance or lack of imagination. It was 1981, when most people had no clue about the future of the embryonic technology they were beginning to read about. And he was 75. But I bet if he'd made it past 1984 he'd have put some effort into understanding this technology that would soon begin changing the printing industry he worked in all his life.

While computers were new on the block, and their devotees were a relatively small cult of people who could be relatively easily spotted as "other", you could see the boast "I know nothing about computers" as a replay of high school. In American movies and TV shows that would be jocks and the in-crowd on one side, a small band of miserable, bullied nerds on the other. In the UK, where for reasons I've never understood it's considered more admirable to achieve excellence without ever being seen to work hard for it, the sociology plays out a little differently. I guess here the deterrent is less being "uncool" and more being seen as having done some work to understand these machines.

Here's the problem: the people who by and large populate the ranks of politicians and the civil service are the *other* people. Recent events such as the UK's Government Digital Service launch suggest that this is changing. Perhaps computers have gained respectability at the top level from the presence of MPs who can boast that they misspent their youth playing video games rather than, like the last generation's Ian Taylor, getting their knowledge the hard way, by sweating for it in the industry.

There are several consequences of all this. The most obvious and longstanding one is that too many politicians don't "get" the Net, which is how we get legislation like the DEA, SOPA, PIPA, and so on. The less obvious and bigger one is that we - the technology-minded, the early adopters, the educated users - write them off as too stupid to talk to. We call them "congresscritters" and deride their ignorance and venality in listening to lobbyists and special interest groups.

The problem, as Emily Badger writes for Miller-McCune as part of a review of Clay Johnson's latest book, is that if we don't talk to them how can we expect them to learn anything?

This sentiment is echoed in a lecture given recently at Rutgers by the distinguished computer scientist David Farber on the technical and political evolution of the Internet (MP3) (the slides are here (PDF)). Farber's done his time in Washington, DC, as chief technical advisor to the Federal Communications Commission and as a member of the Presidential Advisory Board on Information Technology. In that talk, Farber makes a number of interesting points about what comes next technically - it's unlikely, he says, that today's Internet Protocols will be able to cope with the terabyte networks on the horizon, and reengineering is going to be a very, very hard problem because of the way humans resist change - but the more relevant stuff for this column has to do with what he learned from his time in DC.

Very few people inside the Beltway understand technology, he says there, citing the Congressman who asked him seriously, "What is the Internet?" (Well, see, it's this series of tubes...) And so we get bad - that is, poorly grounded - decisions on technology issues.

Early in the Net's history, the libertarian fantasy was that we could get on just fine without their input, thank you very much. But as Farber says, politicians are not going to stop trying to govern the Internet. And, as he doesn't quite say, it's not like we can show them that we can run a perfect world without them. Look at the problems techies have invented: spam, the flaky software infrastructure on which critical services are based, and so on. "It's hard to be at the edge in DC," Farber concludes.

So, going back to Badger's review of Johnson: the point is it's up to us. Set aside your contempt and distrust. Whether we like politicians or not, they will always be with us. For 2012, adopt your MP, your Congressman, your Senator, your local councilor. Make it your job to help them understand the bills they're voting on. Show them that even if they don't understand the technology, there are votes in those who do. It's time to stop thinking of their ignorance as solely *their* fault.


Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.


December 9, 2011

Reversal of government fortunes

What if - I say, what if? - a country in which government IT projects have always been marked as huge, expensive, lengthy failures could transform itself into a country where IT genuinely works for both government and the people? What if the cheeky guys who founded MySociety and made communicating with your MP or looking up his voting record as easy as buying a book from Amazon were given the task of digitizing government? The guys (which I use as a gender-neutral term) who made e-petitions, PledgeBank, and FixMyStreet? Who embarrassed dozens of big, fat, failed government IT projects? What would that look like?

Government IT in Britain has been an expensive calamity for so long that it's become generally accepted that it will fail, and the headlines describing the latest billions lost in taxpayers' money have become a national joke on a par with losing at sports. People complain that Andy Murray hasn't won anything big, but the near-miss is thoroughly ingrained in the British national consciousness; the complaints are as familiar and well-worn a track as the national anthem. No one is happy about it - but it's like comfort food.

It was gently explained to me this week - in a pub, of course - that my understanding of how the UK government operates, based as it is on a mish-mash of single readings of Anthony Trollope's Palliser novels, repeated viewings of the 1980s sitcom Yes, Minister, and the occasional patient explanation from friends and acquaintances needs to be updated. The show was (and remains) a brilliant exposé of the inner workings of the civil service of the day, something that until then was completely obscure. Politicians repeatedly said it was a documentary, not fiction - and then they began to change in response to it. Who saw that coming? The Blair government bypassed the civil service by hiring outside consultants - who were expensive and, above all, not disinterested. The coalition has reacted by going the other way, thinking small, and hiring people who are good at doing things with all this fancy, new technology. Cheap things. Effective things. Even some of the MySociety people. I know, right?

The fact that people like Mike Bracken, who masterminded the Guardian's open platform and who is a founder of MySociety, are working in government is kind of astonishing. And not just him: also an alumnus of Wired UK who has gone on to work for the BBC and advise Ofcom on digital strategy, and Richard Pope, another of the MySociety guys.

The question is, can a small cohort of clever people succeed in turning a lumbering ship like a national government, let alone one running a country so wedded to the traditional way of doing things as Britain is? This week, the UK government has seemed to embrace both the dysfunctional old, in the form of promising the nation's public health data to life sciences companies, and the new, in the form of launching the Government Digital Service. You almost want to make one of those old Tired/Wired tables. Tired: centralisation, big databases, the British population as assets to be sold off or given away to "users", who are large organisations. Wired: individual control, personal data stores, users who are citizens in charge of their own destinies.

Yesterday, Bracken was the one to announce the new Government Digital Service. William Heath, who founded the government consultancy Kable (since sold and now Guardian Government Computing) and, in 2004, the Ideal Government blog in pursuit of something exactly like this, could scarcely contain his excitement.

What's less encouraging is seeing health data mixed in with the Autumn Statement's open data provisions (PDF). As Heath wrote when the news broke, open data is about things, not people. Open data is: transport schedules, mapping data, lists of government assets, national statistics, and so on. This kind of data we want published as openly and in as raw a form as possible, so that it can be reused and form the basis for new businesses and economic growth. Health data, by contrast, is about people, and it is not the kind of data we want opened this way. Yes, there are many organisations that would like access to it: life sciences companies, researchers of all types, large pharmaceutical companies, and so on. This is a battle that has been going on in Europe for more than ten years and for a somewhat shorter amount of time in the US, where the lack of nationalized health insurance means that it's taken longer for the issue to come to the fore. In the UK, Ross Anderson (see also here) and Fleur Fisher are probably the longest-running campaigners against the assembling of patient records into a single national database. As the case of Wikileaks and the diplomatic cables showed, it is hopeless to think that a system accessible by 800,000 people can keep a secret.

But let's wait to see the details before we get mad. For today, enjoy the moment. Change may happen! In a good way!

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

October 14, 2011

Think of the children

Give me smut and nothing but! - Tom Lehrer

Sex always sells, which is presumably why this week's British headlines have been dominated by the news that the UK's ISPs are to operate an opt-in system for porn. The imaginary sales conversations alone are worth any amount of flawed reporting:

ISP Customer service: Would you like porn with that?

Customer: Supersize me!

Sadly, the reporting was indeed flawed. Cameron, it turns out, was merely saying that new customers signing up with the four major consumer ISPs would be asked if they want parental filtering. So much less embarrassing. So much less fun.

Even so, it gave reporters such as Violet Blue, at ZDNet UK, a chance to complain about the lack of transparency and accountability of filtering systems.

Still, the fact that so many people could imagine that it's technically possible to turn "Internet porn" on and off as if by a switch is alarming. If it were that easy, someone would have a nice business by now selling strap-on subscriptions the way cable operators do for "adult" TV channels. Instead, filtering is just one of several options for which ISPs, Web sites, and mobile phone operators do not charge.

One of the great myths of our time is that it's easy to stumble accidentally upon porn on the Internet. That, again, is television, where idly changing channels on a set-top box can indeed land you on the kind of smut that pleased Tom Lehrer. On the Internet, even with safe search turned off, it's relatively difficult to find porn accidentally - though very easy to find on purpose. (Especially since the advent of the .xxx top-level domain.)

It is, however, very easy for filtering systems to remove non-porn sites from view, which is why I generally turn off filters like "Safe search" or anything else that will interfere with my unfettered access to the Internet. I need to know that legitimate sources of information aren't being hidden by overactive filters. Plus, if it really is easy to stumble over pornography accidentally, then as a journalist who writes about the Net and generally opposes censorship, I should know that. I am better than average at constraining my searches so that they will retrieve only the information I really want, which is a definite bias in this minuscule sample of one. But I can safely say that the only time I encounter unwanted anything-like-porn is in display ads on some sites that assume their primary audience is young men.

Eli Pariser, whose The Filter Bubble: What the Internet is Hiding From You I reviewed recently for ZDNet UK, does not talk in his book about filtering systems intended to block "inappropriate" material. But surely porn filtering is a broad-brush subcase of exactly what he's talking about: automated systems that personalize the Net based on your known preferences by displaying content they already "think" you like at the expense of content they think you don't want. If the technology companies were as good at this as the filtering people would like us to think, this weekend's Singularity Summit would be celebrating the success of artificial intelligence instead of still looking 20 to 40 years out.

If I had kids now, would I want "parental controls"? No, for a variety of reasons. For one thing, I don't really believe the controls keep them safe. What keeps them safe is knowing they can ask their parents about material and people's behavior that upsets them so they can learn how to deal with it. The real world they will inhabit someday will not obligingly hide everything that might disturb their equanimity.

But more important, our children's survival in the future will depend on being able to find the choices and information that are hidden from view. Just as the children of 25 years ago should have been taught touch typing, today's children should be learning the intricacies of using search to find the unknown. If today's filters have any usefulness at all, it's as a way of testing kids' ability to think ingeniously about how to bypass them.

Because: although it's very hard to filter out only *exactly* the material that matches your individual definition of "inappropriate", it's very easy to block indiscriminately according to an agenda that cares only about what doesn't appear. Pariser worries about the control that can be exercised over us as consumers, citizens, voters, and taxpayers if the Internet is the main source of news and personalization removes the less popular but more important stories of the day from view. I worry that as people read and access only the material they already agree with our societies will grow more and more polarized with little agreement even on basic facts. Northern Ireland, where for a long time children went to Catholic or Protestant-owned schools and were taught that the other group was inevitably going to Hell, is a good example of the consequences of this kind of intellectual segregation. Or, sadly, today's American political debates, where the right and left have so little common basis for reasoning that the nation seems too polarized to solve any of its very real problems.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

September 9, 2011

The final countdown

The we-thought-it-was-dead specter of copyright term extension in sound recordings has done a Diabolique maneuver and been voted alive by the European Council. In a few days, the Council of Ministers could make it EU law because, as can happen under the inscrutable government structures of the EU, opposition has melted away.

At stake is the extension of copyright in sound recordings from 50 years to 70, something the Open Rights Group has been fighting since it was born. The push to extend it above 50 years has been with us for at least five years; originally the proposal was to take it to 95 years. An extension from 50 to 70 years is modest by comparison, but given the way these things have been going over the last 50 years, that would buy the recording industry 20 years in which to lobby for the 95 years they originally wanted, and then 25 years to lobby for the line to be moved further. Why now? A great tranche of commercially popular recordings is up for entry into the public domain: Elvis Presley's earliest recordings date to 1956, and The Beatles' first album came out in 1963; their first singles are 50 years old this year. Not long after that come all the great rock records of the 1970s.

My fellow Open Rights Group advisory council member Paul Sanders has posted a concise little analysis of what's wrong here. Basically, it's never jam today for the artists, but jam yesterday, today, and tomorrow for the recording companies. I have commented frequently on the fact that the more record companies are able to make nearly pure profit on their back catalogues whose sunk costs have long ago been paid, the more new, young artists are required to compete for their attention with an ever-expanding back catalogue. I like Sanders' language on this: "redistributive, from younger artists to older and dead ones".

In recent years, we've heard a lot of the mantra "evidence-based policy" from the UK government. So, in the interests of ensuring this evidence-based policy the UK government is so keen on, here is some. The good news is they commissioned it themselves, so it ought to carry a lot of weight with them. Right? Right.

There have been two major British government reports studying the future of copyright and intellectual property law generally in the last five years: the Gowers Review, published in 2006, and the Hargreaves report, commissioned in November 2010 and released in May 2011.

From Hargreaves:

Economic evidence is clear that the likely deadweight loss to the economy exceeds any additional incentivising effect which might result from the extension of copyright term beyond its present levels. This is doubly clear for retrospective extension to copyright term, given the impossibility of incentivising the creation of already existing works, or work from artists already dead.

Despite this, there are frequent proposals to increase term, such as the current proposal to extend protection for sound recordings in Europe from 50 to 70 or even 95 years. The UK Government assessment found it to be economically detrimental. An international study found term extension to have no impact on output.

And further:

Such an extension was opposed by the Gowers Review and by published studies commissioned by the European Commission.

Ah, yes, Gowers and its 54 recommendations, many or most of which have been largely ignored. (Government policy seems to have embraced "strengthening of IP rights, whether through clamping down on piracy" to the exclusion of things like "improving the balance and flexibility of IP rights to allow individuals, businesses, and institutions to use content in ways consistent with the digital age".)

To Gowers:

Recommendation 3: The European Commission should retain the length of protection on sound recordings and performers' rights at 50 years.

And:

Recommendation 4: Policy makers should adopt the principle that the term and scope of protection for IP rights should not be altered retrospectively.

I'd use the word "retroactive", myself, but the point is the same. Copyright is a contract with society: you get the right to exploit your intellectual property for some number of years, and in return after that number of years your work belongs to the society whose culture helped produce it. Trying to change an agreed contract retroactively usually requires you to show that the contract was not concluded in good faith, or that someone is in breach. Neither of those situations applies here, and I don't think these large companies with their in-house lawyers, many of whom participated in drafting prior copyright law, can realistically argue that they didn't understand the provisions. Of course, this recommendation cuts both ways: if we can't put Elvis's earliest recordings back into copyright, thereby robbing the public domain, we also can't shorten the copyright protection that applies to recordings created with the promise of 50 years' worth of protection.

This whole mess is a fine example of policy laundering: shopping the thing around until you either wear out the opposition or find sufficient champions. The EU, with its Hampton Court maze of interrelated institutions, could have been deliberately designed to facilitate this. You can write to your MP, or even your MEP - but the sad fact is that the shiny, new EU government is doing all this in old-style backroom deals.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

August 12, 2011

"Phony concerns about human rights"

Why can't you both condemn violent rioting and looting *and* care about civil liberties?

One comment of David Cameron's yesterday in the Commons hit a nerve: that "phony" (or "phoney", if you're British) human rights concerns would not get in the way of publishing CCTV images in the interests of bringing the looters and rioters to justice. Here's why it bothers me: even the most radical pro-privacy campaigner is not suggesting that using these images in this way is wrong. But in saying it, Cameron placed human rights on the side of lawlessness. One can oppose the privacy invasiveness of embedding crowdsourced facial recognition into Facebook and still support the use of the same techniques by law enforcement to identify criminals.

It may seem picky to focus on one phrase in a long speech in a crisis, but this kind of thinking is endemic - and, when it's coupled with bad things happening and a need for politicians to respond quickly and decisively, dangerous. Cameron shortly followed it with the suggestion that it might be appropriate to shut down access to social media sites when they are being used to plan "violence, disorder and criminality".

Consider the logic there: given the size of the population, there are probably people right now planning crimes over pints of beer in pubs, over the phone, and sitting in top-level corporate boardrooms. Fellow ORG advisory council member Kevin Marks blogs a neat comparison by Douglas Adams to cups of tea. But no, let's focus on social media.

Louise Mensch, MP and novelist, was impressive during the phone hacking hearings, aside from her big gaffe about Piers Morgan. But she's made another mistake here in suggesting that taking Twitter and/or Facebook down for an hour during an emergency is about like shutting down a road or a railway station.

First of all, shutting down the tube in the affected areas has costs: innocent bystanders were left with no means to escape their violent surroundings. (This is the same thinking that wanted to shut down the tube on New Year's Eve 1999 to keep people out of central London.)

But more important, the comparison is wrong. Shutting down social networks is the modern equivalent of shutting down radio, TV, and telephones, not transport. The comparison suggests that Mensch is someone who uses social media for self-promotion rather than, like many of us, as a real-time news source and connector to friends and family. This is someone for whom social media are a late add-on to an already-structured life; in 1992 an Internet outage was regarded as a non-issue, too. The ability to use social media in an emergency surely takes pressure off the telephone network by helping people reassure friends and family, avoid trouble areas, find ways home, and so on. Are there rumors and misinformation? Sure. That's why journalists check stuff out before publishing it (we hope). But those are vastly overshadowed by the amount of useful and timely updates.

Is barring access even possible? As Ben Rooney writes in the Wall Street Journal Europe, it's hard enough to ground one teenager these days, let alone a countryful. But let's say they decide to try. What approaches can they take?

One: The 95 percent approach. Shut down access to the biggest social media sites and hope that the crimes aren't being planned on the ones you haven't touched - such as the network the Guardian finds was really used, BlackBerry messaging.

Two: The Minority Report approach. Develop natural language processing and artificial intelligence technology to the point where it can interact on the social networks, spot prospective troublemakers, and turn them in before they commit crimes.

Three: The passive approach. Revive all the net.wars of the past two decades. Reinstate the real-world policing. One of the most important drawbacks to relying on mass surveillance technologies is that they encourage a reactive, almost passive, style of law enforcement. Knowing that the police can catch the crooks later is no comfort when your shop is being smashed up. It's a curious, schizophrenic mindset politicians have: blame social ills on new technology while imagining that other new technology can solve them.

The riots have ended - at least for now, but we will have to live for a long time with the decisions we make about what comes next. Let's not be hasty. Think of the PATRIOT Act, which will be ten years old soon.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

April 1, 2011

Equal access

It is very, very difficult to understand the reasoning behind the not-so-secret plan to institute Web blocking. In a letter to the Open Rights Group (http://www.openrightsgroup.org/blog/2011/minister-confirms-voluntary-site-blocking-discussions), Ed Vaizey, the minister for culture, communications, and creative industries, confirmed that such a proposal emerged from a workshop to discuss "developing new ways for people to access content online". (Orwell would be so proud.)

We fire up Yes, Minister once again to remind everyone of the four characteristics of proposals ministers like: quick, simple, popular, cheap. Providing the underpinnings of Web site blocking is not likely to be very quick, and it's debatable whether it will be cheap. But it certainly sounds simple, and although it's almost certainly not going to be popular among the 7 million people the government claims engage in illegal file-sharing - a number PC Pro has done a nice job of dissecting - it's likely to be popular with the people Vaizey seems to care most about, rights holders.

The four opposing kiss-of-death words are: lengthy, complicated, expensive, and either courageous or controversial, depending how soon the election is. How to convince Vaizey that it's these four words that apply and not the other four?

Well, for one thing, it's not going to be simple, it's going to be complicated. Web site blocking is essentially a security measure. You have decided that you don't want people to have access to a particular source of data, and so you block their access. Security is, as we know, not easy to implement and not easy to maintain. Security, as Bruce Schneier keeps saying, is a process, not a product. It takes a whole organization to implement the much more narrowly defined IWF system. What kind of infrastructure will be required to support the maintenance and implementation of a block list to cover copyright infringement? Self-regulatory, you say? Where will the block list, currently thought to cover about 100 sites, come from? Who will maintain it? Who will oversee it to ensure that it doesn't include "innocent" sites? ISPs have other things to do, and other than limiting or charging for the bandwidth consumption of their heaviest users (who are not all file sharers by any stretch) they don't have a dog in this race. Who bears the legal liability for mistakes?

The list is most likely to originate with rights holders, whom no one trusts to be accurate because they have shown over most of the last 20 years that they care relatively little whether they scoop innocent users and sites into the net alongside infringing ones. Don't the courts have better things to do than adjudicate what percentage of a given site's traffic is copyright-infringing and whether it should be on a block list? Is this what we should be spending money on in a time of austerity? Mightn't it be...expensive?

Making the whole thing even more complicated is the obvious (to anyone who knows the Internet) fact that such a block list will start - and, according to TorrentFreak, already has started - a new arms race.

Yet another wrinkle: among the blocking targets are cyberlockers. But this is a service that, like search, is going mainstream: Amazon.com has just launched such a service, which it calls Cloud Drive and for which it retains the right to police rather thoroughly. Encrypted files, here we come.

At least one ISP has already called the whole idea expensive, ineffective, and rife with unintended consequences.

There are other obvious arguments, of course. It opens the way to censorship. It penalizes innocent uses of technology as well as infringing ones; torrent search sites typically have a mass of varied material and there are legitimate reasons to use torrenting technology to distribute large files. It will tend to add to calls to spy on Internet users in more intrusive ways (as Web blocking fails to stop the next generation of file-sharing technologies). It will tend to favor large (often American) services and companies over smaller ones. Google, as IsoHunt told the US Court of Appeals two weeks ago, is the largest torrent search engine. (And, of course, Google has other copyright troubles of its own; last week the court rejected the Google Books settlement.)

But the sad fact is that although these arguments are important, they're not a good fit if the main push behind Web blocking is an entrenched belief that the only way to secure economic growth is to extend and tighten copyright while restricting access to technologies and sites that might be used for infringement. Instead, we need to show that this entrenched belief is wrong.

We do not block the roads leading to car boot sales just because sometimes people sell things at them whose provenance is cloudy (at best). We do not place levies on the purchase of musical instruments because someone might play copyrighted music on them. We should not remake the Internet - a medium to benefit all of society - to serve the interests of one industrial group. It would make more sense to put the same energy and financial resources into supporting the games industry, which, as Tom Watson (Lab-West Bromwich East) has pointed out, has great potential to lift the British economy.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

March 18, 2011

Block party

When last seen in net.wars, the Internet Watch Foundation was going through the most embarrassing moment of its relatively short life: the time it blocked a Wikipedia page. It survived, of course, and on Tuesday this week it handed out copies of its latest annual report (PDF) and its strategic plan for the years 2011 to 2014 (PDF) in the Strangers Dining Room at the House of Commons.

The event was, more or less, the IWF's birthday party: in August it will be 15 years since the first outline of the IWF received its suspicious, even hostile first presentation in 1996. It was an uneasy compromise between an industry accused of facilitating child abuse, law enforcement threatening technically inept action, and politicians anxious to be seen to be doing something, all heightened by some of the worst mainstream media reporting I've ever seen.

Suspicious or not, the IWF has achieved traction. It has kept government out of the direct censorship business and politicians and law enforcement reasonably satisfied. Without - as was pointed out - cost to the taxpayer, since the IWF is funded from a mix of grants, donations, and ISPs' subscription fees.

And to be fair, it has been arguably successful at doing what it set out to do, which is to disrupt the online distribution of illegal pornographic images of children within the UK. The IWF has reported for some years now that the percentage of such images hosted within the UK is near zero. On Tuesday, it said the time it takes to get foreign-hosted content taken down has halved. Its forward plan includes more of the same, plus pushing more into international work by promoting the use of its URL list abroad and developing partnerships.

Over at The Register, Jane Fae Ozimek has done a good job of tallying up the numbers the IWF reported, and also of following up on remarks made by Culture Minister Ed Vaizey and Home Office Minister James Brokenshire that suggested the IWF or its methods might be expanded to cover other categories of material. So I won't rehash either topic here.

Instead, what struck me is the IWF's report that a significant percentage of its work now concerns sexual abuse images and videos that are commercially distributed. This news offered a brief glance into a shadowy world that is illegal for any of us to study since under UK law (and the laws of many other countries) it's illegal to access such material. If this is a correct assessment, it certainly follows the same pattern as the world of malware writing, which has progressed from the giggling, maladjusted teenager writing a bit of disruptive code in his bedroom to a highly organized, criminal, upside-down image of the commercial software world (complete, I'm told by experts from companies like Symantec and Sophos, with product trials, customer support, and update patches). Similarly, our, or at least my, image was always of like-minded amateurs exchanging copies of the things they managed to pick up rather like twisted stamp collectors.

The IWF report says it has identified 715 such commercial sources, 321 of which were active in 2010. At least 47.7 percent of the commercially branded material is produced by the top ten, and the most prolific of these brands used 862 URLs. The IWF has attempted to analyze these brands, and believes that they are operated in clusters by criminals. To quote the report:

Each of the webpages or websites is a gateway to hundreds or even thousands of individual images or videos of children being sexually abused, supported by layers of payment mechanisms, content stores, membership systems, and advertising frames. Payment systems may include pre-pay cards, credit cards, "virtual money" or e-payment systems, and may be carried out across secure webpages, text, or email.

This is not what people predicted when they warned at the original meeting that blocking access to content would drive it underground into locations that were harder to police. I don't recall anyone saying: it will be like Prohibition and create a new Mafia. How big a problem this is and how it relates to events like yesterday's shutdown of boylovers.net remains to be seen. But there's logic to it: anything that's scarce attracts a high price and anything high-priced and illegal attracts dedicated criminals. So we have to ask: would our children be safer if the IWF were less successful?

The IWF will, I think, always be a compromise. Civil libertarians will always be rightly suspicious of any organization that has the authority and power to shut down access to content, online or off. Still, the IWF's ten-person board now includes, alongside the representatives of ISPs, top content sites, and academics, a consumer representative, and seems to be less dominated by repressive law enforcement interests. There's an independent audit in the offing, and while the IWF publishes no details of its block list for researchers to examine, it advocates transparency in the form of a splash screen that tells users that a site is blocked and why. They learned a lot from the Wikipedia incident, the IWF's departing head, Peter Robbins, said in conversation.

My summary: the organization will know it has its balance exactly right when everyone on all sides has something to complain about.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

March 4, 2011

Tax returns

In 1994, when Jeff Bezos was looking for a place to put the online bookseller he intended to grow into the giant, multi-faceted online presence it is today, he began with a set of criteria that included, high up on the list, avoiding liability for sales tax as much as possible. That meant choosing a small state, so that the vast majority of the new site's customers would be elsewhere.

Bezos could make this choice because of the 1992 Supreme Court decision in Quill Corp v. North Dakota, blocking states from compelling distance sellers to collect sales tax from customers unless the seller had a substantial physical operation (a "nexus") in the customer's state. Why, the reasoning went, should a company be required to pay taxes in a state where it receives no benefit in the form of public services? The decision helped fuel the growth of first mail-order sales and then ecommerce.

And so throughout the growth of electronic commerce Americans have gone along taking advantage of the relief from sales tax afforded by online sales. This is true despite the fact that many states have laws requiring their residents to declare and pay the sales tax on purchases over a certain amount. Until the current online tax disputes blew up, few knew about these laws - I only learned of them from a reader email some years ago - and as far as I'm aware they aren't enforced. Doing so would require comprehensive surveillance of ecommerce sites.

But this is the thing when something is new: those setting up businesses can take advantage of loopholes created for very different markets and conditions. A similar situation applies in the UK with respect to DVD and CD sales. Fulfilled by subsidiaries or partners based in the Channel Islands, the DVD and CD sales of major retailers such as Amazon, Tesco, and others take advantage of tax relief rules intended to speed shipments of agricultural products. Basically, any package valued under £18 is exempt from VAT. For consumers, this represents substantial savings; for local shops, it represents a tough challenge.

Even before that, in the early 1990s, CompuServe and AOL, as US-based Internet service providers, were able to avoid charging VAT in the UK based on a rule making services taxable based on their point of origin. That gave those two companies a significant - 17.5 percent - advantage over native ISPs like Demon and Pipex. There were many objections to this situation, and eventually the loophole was closed and both CompuServe and AOL began charging VAT.

You can't really blame companies for taking advantage of the structures that are there. No one wants to pay more tax - or pay for more administration - than is required by law, and anyone running those companies would make the same decisions. But as the recession continues to bite and state, federal, and central governments all scramble to replace lost revenues from a shrinking tax base, the calls to level the playing field by closing off these tax-advantage workarounds are getting louder.

This type of argument is as old as mail order. But in the beginning there was a general view - implemented also in the US as a moratorium on taxing Internet services that was renewed as recently as 2007 - that exempting the Internet from as many taxes as possible would help the new medium take root and flourish. There was definitely some truth to the idea that this type of encouragement helped; an early FCC proposal to surcharge users for transmitting data was dropped after 10,000 users sent letters of complaint. Nonetheless, the FCC had to continue issuing denials for years as the dropped proposal continued to make the rounds as the "modem tax" hoax spam.

The arguments for requiring out-of-state sellers to collect and remit sales taxes (or VAT) are fairly obvious. Local retailers, especially small independents, are operating at a price disadvantage (even though customers must pay shipping and delivery charges when they buy online). Governments are losing one of their options for raising revenues to pay for public services. In addition, people buy online for many more reasons than saving money. Online shopping is convenient and offers greater choice. It is also true, though infrequently remembered, that the demographics of online shopping skew toward the wealthier members of our society - that is, the people who can best afford to pay the tax.

The arguments against largely boil down to the fact that collecting taxes in many jurisdictions is administratively burdensome. There are some 8,000 different tax rates across the US's 50 states, and although there are many fewer VAT rates across Europe, once your business in a country has reached a certain threshold, the rules and regulations governing each one can be byzantine and inconsistent. Creating a single, simple, and consistent tax rule to apply across the board to distance selling would answer these objections.

No one likes paying taxes (least of all us). But the fact that Amazon would apparently rather jettison the associates program that helped advertise and build its business than allow a state to claim those associates constitute a nexus exposing it to sales tax liability says volumes about how far we've come. And, therefore, how little the Net's biggest businesses now need the help.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

February 25, 2011

Wartime economy

Everyone loves a good headline, and £27 billion always makes a *great* one. In this case, that is the sum that a report written by the security consultancy firm Detica, now part of BAE Systems, and issued by the Office of Cyber Security and Information Assurance (PDF) estimates cybercrime is costing the UK economy annually. The claim was almost immediately questioned by ZDNet's Tom Espiner, who promptly checked it out with security experts. They complained that the report was full of "fake precision" (LSE professor Peter Sommer), "questionable calculations" (Harvard's Tyler Moore), and "nonsense" (Cambridge's Richard Clayton).

First, some comparisons.

Twenty-seven billion pounds (approximately $40 billion) is slightly larger than a year's worth of the International Federation of the Phonographic Industry's estimate of the cumulative retail revenue lost to piracy by the European creative industries from 2008 to 2015 (PDF) (total €240 billion, about £203 billion, over eight years: £25.4 billion a year). It is roughly the estimated cost of the BP oil spill, the amount some think Facebook will be worth at an IPO, and noticeably less than Apple's $51 billion cash hoard. But: lots smaller than the "£40 billion underworld" The Times attributed to British gangs in 2008.

Several things baffle about this report. The first is that so little information is given about the study's methodology. Who did the researchers talk to? What assumptions did they make and what statistical probabilities did they assign in creating the numbers and charts? How are they defining categories like "online scams" or "IP theft" (they're clear about one thing: they're not including file-sharing in that figure)? What is the "causal model" they developed?

We know one person they didn't talk to: Computer Weekly notes the omission of Detective Superintendent Charlie McMurdie, head of the Metropolitan Police's Central e-Crime Unit, who you'd think would be one of the first ports of call for understanding the on-the-ground experience.

One issue the report seems to gloss over is how very difficult it is to define and categorize cybercrime. Last year, the Oxford Internet Institute conducted a one-day forum on the subject, out of which came the report Mapping and Measuring Cybercrime (PDF), published in June 2010. Much of this report is given over to the difficulty of such definitions; Sommer, who participated in the forum, argued that we shouldn't worry about the means of commission - a crime is a crime. More recently - perhaps a month ago - Sommer teamed up with the OII's Ian Brown to publish a report for an OECD project on future global shocks, Reducing Systemic Cybersecurity Risk (PDF). The authors' conclusion: "very few single cyber-related events have the capacity to cause a global shock". This report also includes considerable discussion of cybercrime in assessing whether "cyberwarfare" is a genuine global threat. But the larger point about both these reports is that they disclose their methodology in detail.

And as a result, they make much more modest and measured claims, which is one reason that critics have looked at the source of the OCSIA/Detica report - BAE - and argued that the numbers are inflated and the focus largely limited to things that fit BAE's business interests (that is, IP theft and espionage; the usual demon, abuse of children, is left untouched).

The big risk here is that this report will be used in determining how policing resources are allocated.

"One of the most important things we can do is educate the public," says Sommer. "Not only about how to protect themselves but to ensure they don't leave their computers open to be formed into botnets. I am concerned that the effect of all these hugely military organizations lobbying for funding is that in the process things like Get Safe Online will suffer."

There's a broader point that begins with a personal nitpick. On page four, the report says this: "...the seeds of criminality planted by the first computer hackers 20 years ago." Leaving aside the even smaller nitpick that the *real*, original computer hackers, who built things and spent their enormous cleverness getting things to work, date to 40 and 50 years ago, it is utterly unfair to compare today's cybercrime to the (mostly) teenaged hackers of 1990, who spent their Saturday nights in their bedrooms war-dialling sites and trying out passwords. They were the computer equivalent of joy-riders, caused little harm, and were so disproportionately the targets of freaked-out, uncomprehending law enforcement that the Electronic Frontier Foundation was founded to bring some sanity to the situation. Today's cybercrime underground is composed of professional criminals who operate in an organized and methodical way. There is no more valid comparison between the two than there is between Duke Nukem and al-Qaeda.

One is not a gateway to the other - but the idea that criminals would learn computer techniques and organized crime would become active online was repeatedly used as justification for anti-society legislation from cryptographic key escrow to data retention and other surveillance. The biggest risk of a report like this is that it will be used as justification for those wrong-headed policies rather than as it might more rightfully be, as evidence of the failure of no less than five British governments to plan ahead on our behalf.


Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

November 19, 2010

Power to the people

We talk often about the fact that ten years of effort - lawsuits, legislation, technology - on the part of the copyright industries has made barely a dent in the amount of material available online as unauthorized copies. We talk less about the similar situation that applies to privacy despite years of best efforts by Privacy International, Electronic Privacy Information Center, Center for Democracy and Technology, Electronic Frontier Foundation, Open Rights Group, No2ID, and newcomer Big Brother Watch. The last ten years have built Google, and Facebook, and every organization now craves large data stores of personal information that can be mined. Meanwhile, governments are complaisant, possibly because they have subpoena power. It's been a long decade.

"Information is the oil of the 1980s," wrote Thomas McPhail and Brenda McPhail in 1987 in an article discussing the politics of the International Telecommunications Union, and everyone seems to take this encomium seriously.

William Heath, who spent his early career founding and running Kable, a consultancy specializing in government IT, where he focused on the question of how to create the ideal government for the digital era, has been saying for many months now that there's a gathering wave of change. His idea is that the *new* new thing is technologies to give us back control and up-end the current situation, in which everyone behaves as if they own all the information we give them. But it's their data only in exactly the same way that taxpayers' money belongs to the government. They call it customer relationship management; Heath calls the data we give them volunteered personal information and proposes instead vendor relationship management.

Always one to put his effort where his mouth is (Heath helped found the Open Rights Group, the Foundation for Policy Research, and the Dextrous Web as well as Kable), Heath has set up not one, but two companies. The first, Ctrl-Shift, is a research and advisory business to help organizations adjust and adapt to the power shift. The second, Mydex, is a platform now being prototyped in partnership with the Department of Work and Pensions and several UK councils (PDF). Set up as a community interest company, Mydex is asset-locked, to ensure that the company can't suddenly reverse course and betray its customers and their data.

The key element of Mydex is the personal data store, which is kept under each individual's own control. When you want to do something - renew a parking permit, change your address with a government agency, rent a car - you interact with the remote council, agency, or company via your PDS. Independent third parties verify the data you present. To rent a car, for example, you might present a token from the vehicle licensing bureau that authenticates your age and right to drive and another from your bank or credit card company verifying that you can pay for the rental. The rental company only sees the data you choose to give it.
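Purely as illustration of the selective-disclosure idea described above - the class and token names here are invented, not Mydex's actual design or API - the flow might be sketched like this:

```python
# Hypothetical sketch: an individual's personal data store (PDS) holds
# claims vouched for by third parties, and discloses only the claims a
# relying party (e.g. a car rental company) actually asks for.
from dataclasses import dataclass

@dataclass(frozen=True)
class Token:
    issuer: str   # e.g. the vehicle licensing bureau, a bank
    claim: str    # what the issuer vouches for
    subject: str  # the individual the claim is about

class PersonalDataStore:
    def __init__(self, owner: str):
        self.owner = owner
        self._tokens: list = []

    def add_token(self, token: Token) -> None:
        # Only accept tokens issued about this store's owner.
        if token.subject != self.owner:
            raise ValueError("token is not about this store's owner")
        self._tokens.append(token)

    def present(self, requested_claims: set) -> list:
        # Disclose only the tokens that were asked for; everything
        # else in the store stays private.
        return [t for t in self._tokens if t.claim in requested_claims]

pds = PersonalDataStore("alice")
pds.add_token(Token("vehicle-licensing-bureau", "licensed-driver", "alice"))
pds.add_token(Token("bank", "can-pay", "alice"))
pds.add_token(Token("council", "resident", "alice"))

# The rental company asks for just the two claims it needs:
disclosed = pds.present({"licensed-driver", "can-pay"})
print([t.claim for t in disclosed])  # -> ['licensed-driver', 'can-pay']
```

The point of the sketch is the last line: the relying party never sees the "resident" token, or anything else it didn't request.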

It's Heath's argument that such a setup would preserve individual privacy and increase transparency while simultaneously saving companies and governments enormous sums of money.

"At the moment there is a huge cost of trying to clean up personal data," he says. "There are 60 to 200 organisations all trying to keep a file on you and spending money on getting it right. If you chose, you could help them." The biggest cost, however, he says, is the lack of trust on both sides. People vanish off the electoral rolls or refuse to fill out the census forms rather than hand over information to government; governments treat us all as if we were suspected criminals when all we're trying to do is claim benefits we're entitled to.

You can certainly see the potential. Ten years ago, when they were talking about "joined-up government", MPs dealing with constituent complaints favored the notion of making it possible to change your address (for example) once and have the new information propagate automatically throughout the relevant agencies. Their idea, however, was a huge, central data store; the problem for individuals (and privacy advocates) was that centralized data stores tend to be difficult to keep accurate.

"There is an oft-repeated fallacy that existing large organizations meant to serve some different purpose would also be the ideal guardians of people's personal data," Heath says. "I think a purpose-created vehicle is a better way." Give everyone a PDS, and they can have the dream of changing their address only once - but maintain control over where it propagates.

There are, as always, key questions that can't be answered at the prototype stage. First and foremost is the question of whether and how the system can be subverted. Heath's intention is that we should be able to set our own terms and conditions for their use of our data - up-ending the present situation again. We can hope - but it's not clear that companies will see it as good business to differentiate themselves on the basis of how much data they demand from us when they don't now. At the same time, governments who feel deprived of "their" data can simply pass a law and require us to submit it.


Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

October 29, 2010

Wanted: less Sir Humphrey, more shark


Seventeen MPs showed up for Thursday's Backbenchers' Committee debate on privacy and the Internet, requested by Robert Halfon (Con-Harlow). They tell me this is a sell-out crowd. The upshot: Google, and every other Internet company, may come to rue the day it sent its Street View cars around Britain. It crossed a line.

That line is this: "Either your home is your castle or it's not," said Halfon, talking about StreetView and an email he had received from a vastly upset woman in Cornwall whose home had been captured and posted on the Web. It's easy for Americans to forget how deep the "An Englishman's home is his castle" thing goes.

Halfon's central question: are we sleepwalking into a privatized surveillance society, and can we stop it? "If no one has any right to privacy, we will live in a Big Brother society run by private companies." StreetView, he said, "is brilliant - but they did it without permission." Of equal importance to Halfon is the curious incident of the silent Information Commissioner (unlike, apparently, his equivalents everywhere else in the world) and Google's sniffed wi-fi data. The recent announcement that the sniffed data includes contents of email messages, secure Web pages, and passwords has prompted the ICO to take another look.

The response of the ICO, Halfon said, "has been more like Sir Humphrey than a shark with teeth, which is what it should be."

Google is only one offender; Julian Huppert (LibDem-Cambridge) listed some of the other troubles, including this week's release of Firesheep, a Firefox add-on designed to demonstrate Facebook's security failings. Several speakers raised the issue of the secret BT/Phorm trials. A key issue: while half the UK's population choose to be Facebook users (!), and many more voluntarily use Google daily, no one chose to be included in StreetView; we did not ask to be its customers.

So Halfon wants two things. He wants an independent commission of inquiry convened that would include MPs with "expertise in civil liberties, the Internet, and commerce" to suggest a new legal framework that would provide a means of redress, perhaps through an Internet bill of rights. What he envisions is something that polices the behavior of Internet companies the way the British Medical Association or the Law Society provides voluntary self-regulation for their fields. In cases of infringement, fines, perhaps.

In the ensuing discussion many other issues were raised. Huppert mentioned "chilling" (Labour) government surveillance, and hoped that portions of the Digital Economy Act might be repealed. Huppert has also been asking Parliamentary Questions about the is-it-still-dead? Interception Modernization Programme; he is still checking on the careful language of the replies. (Asked about it this week, the Home Office told me they can't speculate in advance about the details, which will be provided "in due course"; that what is envisioned is a "program of work on our communications abilities"; that it will be communications service providers, probably as defined in RIPA Section 2(1), storing data, not a government database; and that the legislation to safeguard against misuse will probably, but not certainly, be a statutory instrument.)

David Davis (Con-Haltemprice and Howden) wasn't too happy even with the notion of decentralized data held by CSPs, saying these would become a "target for fraudsters, hackers and terrorists". Damien Hinds (Con-East Hampshire) dissected Google's business model (including £5.5 million of taxpayers' money the UK government spent on pay-per-click advertising in 2009).

Perhaps the most significant thing about this debate is the huge rise in the level of knowledge. Many took pains to say how much they value the Internet and love Google's services. This group know - and care - about the Internet because they use it, unlike 1995, when an MP was about as likely to read his own email as he was to shoot his own dog.

Not that I agreed with all of them. Don Foster (LibDem-Bath) and Mike Weatherley (Con-Hove) were exercised about illegal file-sharing (Foster and Huppert agreed to disagree about the DEA, and Damian Collins (Con-Folkestone and Hythe) complained that Google makes money from free access to unauthorized copies). Nadine Dorries (Con-Mid Bedfordshire) wanted regulation to protect young people against suicide sites.

But still. Until recently, Parliament's definition of privacy was celebrities' need for protection from intrusive journalists. This discussion of the privacy of individuals is an extraordinary change. Pressure groups like PI, the Open Rights Group, and No2ID helped, but there's also a groundswell of constituents' complaints. Mark Lancaster (Con-Milton Keynes North) noted that a women's refuge at a secret location could not get Google to respond to its request for removal, and that the town of Broughton formed a human chain to block the StreetView car. Even the attending opposition MP, Ian Lucas (Lab-Wrexham), favored the commission idea, though he still had hopes for self-regulation.

As for next steps, Ed Vaizey (Con-Wantage and Didcot), the Minister for Communication, Culture, and the Creative Industries, said he planned to convene a meeting with Google and other Internet companies. People should have a means of redress and somewhere to turn for mediation. For Halfon that's still not enough. People should have a choice in the first place.

To be continued...


October 23, 2010

An affair to remember

Politicians change; policies remain the same. Or if they don't, they return like the monsters in horror movies that end with the epigraph, "It's still out there..."

Cut to 1994, my first outing to the Computers, Freedom, and Privacy conference. I saw: passionate discussions about the right to strong cryptography. The counterargument from government and law enforcement and security service types was that yes, strong cryptography was a fine and excellent thing for protecting communications from prying eyes, and for that very reason we needed key escrow to ensure that bad people couldn't say evil things to each other in perfect secrecy. The listing of organized crime, terrorists, drug dealers, and pedophiles as the reasons why it was vital to ensure access to cleartext became so routine that physicist Timothy May dubbed them "The Four Horsemen of the Infocalypse". Cypherpunks opposed restrictions on the use and distribution of strong crypto; government types wanted at the very least a requirement that copies of secret cryptographic keys be provided and held in escrow against the need to decrypt in case of an investigation. The US government went so far as to propose a technology of its own, complete with back door, called the Clipper chip.

Eventually, the Clipper chip was cracked by Matt Blaze, the needs of electronic commerce won out over the paranoia of the military, and restrictions on the use and export of strong crypto were removed.

Cut to 2000 and the run-up to the passage of the UK's Regulation of Investigatory Powers Act. Same Four Horsemen, same arguments. Eventually RIPA passed with the requirement that individuals disclose their cryptographic keys - but without key escrow. Note that it's just in the last couple of months that someone - a teenager - has gone to jail in the UK for the first time for refusing to disclose their key.

It is not just hype by security services seeking to evade government budget cuts to say that we now have organized cybercrime. Stuxnet rightly has scared a lot of people into recognizing the vulnerabilities of our infrastructure. And clearly we've had terrorist attacks. What we haven't had is a clear demonstration by law enforcement that encrypted communications have impeded the investigation.

A second and related strand of argument holds that communications data - that is, traffic data such as email headers and Web addresses - must be retained and stored for some lengthy period of time, again to assist law enforcement in case an investigation is needed. As the Foundation for Information Policy Research and Privacy International have consistently argued for more than ten years, such traffic data is extremely revealing. Yes, that's why law enforcement wants it; but it's also why the American Library Association has consistently opposed handing over library records. Traffic data doesn't just reveal who we talk to and care about; it also reveals what we think about. And because such information is of necessity stored without context, it can also be misleading. If you already think I'm a suspicious person, the fact that I've been reading proof-of-concept papers about future malware attacks sounds like I might be a danger to cybersociety. If you know I'm a journalist specializing in technology matters, that doesn't sound like so much of a threat.

And so to this week. The former head of the Department of Homeland Security, Michael Chertoff, at the RSA Security Conference compared today's threat of cyberattack to nuclear proliferation. The US's Secure Flight program is coming into effect, requiring airline passengers to provide personal data for the US to check 72 hours in advance (where possible). Both the US and UK security services are proposing the installation of deep packet inspection equipment at ISPs. And language in the UK government's Strategic Defence and Security Review (PDF) has led many to believe that what's planned is the revival of the we-thought-it-was-dead Interception Modernisation Programme.

Over at Light Blue Touchpaper, Ross Anderson links many of these trends and asks if we will see a resumption of the crypto wars of the mid-1990s. I hope not; I've listened to enough quivering passion over mathematics to last an Internet lifetime.

But as he says, it's hard to see one without the other. On the face of it, because the data "they" want to retain is traffic data and not content, encryption might seem irrelevant. But a number of trends are pushing people toward greater use of encryption. First and foremost is the risk of interception; many people prefer (rightly) to use secured https, SSH, or VPN connections when they're working over public wi-fi networks. Others secure their connections precisely to keep their ISP from being able to analyze their traffic. If data retention and deep packet inspection become commonplace, so will encrypted connections.

And at that point, as Anderson points out, the focus will return to long-defeated ideas like key escrow and restrictions on the use of encryption. The thought of such a revival is depressing; implementing any of them would be such a regressive step. If we're going to spend billions of pounds on the Internet infrastructure - in the UK, in the US, anywhere else - it should be spent on enhancing robustness, reliability, security, and speed, not building the technological infrastructure to enable secret, warrantless wiretapping.



October 15, 2010

The elected dictatorship

I wish I had a nickel for every time I had the following conversation with some British interlocutor in the 1970s and 1980s:

BI: You should never have gotten rid of Nixon.

wg: He was a crook.

BI: They're all crooks. He was the best foreign policy president you ever had.

As if it were somehow touchingly naïve to expect that politicians should be held to standards of behaviour in office. (Look, I don't care if they have extramarital affairs; I care if they break the law.)

It is, however, arguable that the key element of my BIs' disapproval was that Americans had the poor judgment and bad taste to broadcast the Watergate hearings live on television. (Kids, this was 1972. There was no C-Span then.) If Watergate had happened in the UK, it's highly likely no one would ever have heard about it until 50 or however many years later the Public Records Office opened the archives.

Around the time I founded The Skeptic, I became aware of the significant cultural difference in how people behave in the UK versus the US when they are unhappy about something. Britons write to their MP. Americans...make trouble. They may write letters, but they are equally likely to found an organization and create a campaign. This do-it-yourself ethic is completely logical in a relatively young country where democracy is still taking shape.

Britain, as an older - let's be polite and call it mature - country, operates instead on a sort of "gentlemen's agreement" ethos (vestiges of which survive in the US Constitution, to be sure). You can get a surprising amount done - if you know the right people. That system works perfectly for the in-group, and so to effect change you either have to become one of them (which dissipates your original desire for change) or gate-crash the party. Sometimes, it takes an American...

This was Heather Brooke's introduction to English society. The daughter of British parents and the wife of a British citizen, burned out from years of investigative reporting on murders and other types of mayhem in the American South, she took up residence in Bethnal Green with her husband. And became bewildered when repeated complaints to the council and police about local crime produced no response. Stonewalled, she turned to writing her book Your Right to Know, which led her to make her first inquiries about viewing MPs' expenses. The rest is much-aired scandal.

In her latest book, The Silent State, Brooke examines the many ways that British institutions are structured to lock out the public. The most startling revelation: things are getting worse, particularly in the courts, where the newer buildings squeeze public and press into cramped, uncomfortable spaces in a way the older buildings do not. Certainly, the airport-style security that's now required for entry into Parliament buildings sends the message that the public are both unwelcome and not to be trusted (getting into Thursday's apComms meeting required standing outside in the chill and damp for 15 minutes while staff inspected and photographed one person at a time).

Brooke scrutinizes government, judiciary, police, and data-producing agencies such as the Ordnance Survey, and each time finds the same pattern: responsibility for actions cloaked by anonymity; limited access to information (either because the information isn't available or because it's too expensive to obtain); arrogant disregard for citizens' rights. And all aided by feel-good, ass-covering PR and the loss of an independent local press to challenge it. In a democracy, she argues, it should be taken for granted that citizens have a right to get an answer when they ask how many violent attacks are taking place on their local streets, to take notes during court proceedings or Parliamentary sessions, or to access and use data whose collection they paid for. That many MPs seem to think of themselves as members of a private club rather than public servants was clearly shown by the five years of stonewalling Brooke negotiated in trying to get a look at their expenses.

In reading the book, I had a sudden sense of why electronic voting appeals to these people. It is yet another mechanism for turning what was an open system that anyone could view and audit - it doesn't take an advanced degree to be able to count pieces of paper - into one whose inner workings can effectively be kept secret. That its inner workings are also not understandable to MPs themselves is apparently a price they're willing to pay in return for removing much of the public's ability to challenge counts and demand answers. Secrecy is a habit of mind that spreads like fungus.

We talk a lot about rolling back newer initiatives like the many databases of Blair's and Brown's government, data retention, or the proliferation of CCTV cameras. But while we're trying to keep citizens from being run down by the surveillance state, we should also be examining the way government organizes its operations and blocking the build-out of further secrecy. This is a harder and more subtle thing to do, but it could make the lives of the next generation of campaigners easier.

At least one thing has changed in the last 30 years, though: people's attitudes. In 2009, when the scandal over MPs' expenses broke, you didn't hear much about how other qualities meant we should forgive MPs. Britain wanted *blood*.


July 30, 2010

Three-legged race

"If you are going to do this damn silly thing, don't do it in this damn silly way," Sir Humphrey Appleby tells Jim Hacker in a fit of unaccustomed straight talking.

We think of this often these days, largely because it seems as though lawmakers, having been belittled by impatient and malcontent geeks throughout the 1990s for being too slow to keep up with Internet time, are trying to speed through the process of creating legislation by eliminating thought, deliberation, and careful drafting. You can see why they'd want to get rid of so many civil servants, who might slow this process down.

In that particular episode of Yes, Minister, "The Writing on the Wall" (S1e05), Appleby and Hacker butt heads over who will get the final say over the wording of a draft proposal on phased Civil Service reductions (today's civil servants and ministers might want to watch episode S1e03, "The Economy Drive", for what their lives will soon be like). Hacker wins that part of the battle only to discover that his version, if implemented, will shut down his own department. Oops.

Much of the Digital Economy Act (2010) was like this: redrafted at the last minute in all sorts of unhelpful ways. But the devil is always in the details, and it was not unreasonable to hope that Ofcom, charged with defining and consulting on those details, would operate in a more measured fashion. But apparently not, and so we have a draft code of practice that's so incomplete that it could be a teenager's homework.

Both Consumer Focus and the Open Rights Group have analyses of the code's non-compliance with the act and a helpful <a href="http://e-activist.com/ea-campaign/clientcampaign.do?ea.client.id=1422&ea.campaign.id=7268">online form</a> should you wish to submit your opinions. The consultation closes today, so run, do not walk, to add your comments.

What's more notable is when it opened: May 28, only three days after the State Opening of the post-election parliamentary session, three weeks after the election, and six weeks after the day that Gordon Brown called the election. Granted, civil servants do not down pencils while the election is proceeding. But given that the act went through last-second changes and then was nodded through the House of Commons in the frantic dash to get home to start campaigning, the most time Ofcom can have had to draft this mish-mash was about six weeks. Which may explain the holes and inadequacies, but then you have to ask: why didn't they take their time and do it properly?

The Freedom bill, which is to repeal so many of the items on our wish list, is mute on the subject of the Digital Economy Act, despite a number of appearances on the Freedom bill's ideas site. (Big Brother Watch has some additional wish list items.)

The big difficulty for anyone who hates the copyright protectionist provisions in the act - the threat to open wi-fi, the disconnection or speed-limitation of Internet access ("technical measures") to be applied to anyone who is accused of copyright infringement three times ("three-strikes", or HADOPI, after the failed French law attempting to do the same) - is that what you really want is for the act to go away. Preferably back where it came from, some copyright industry lobbyist's brain. A carefully drafted code of practice that pays attention to ensuring that the evidentiary burden on copyright holders is strong enough to deter the kind of abuse seen in the US since the passage of the Digital Millennium Copyright Act (1998) is still not a good scenario, merely a least-worst one.

Still, ORG and Consumer Focus are not alone in their unhappiness. BT and TalkTalk have expressed their opposition, though for different reasons. TalkTalk is largely opposed to the whole letter-writing and copyright infringement elements; but both ISPs are unhappy about Ofcom's decision to limit the code to fixed-line ISPs with more than 400,000 customers. In the entire UK, there are only seven: TalkTalk, BT, Post Office, Virgin, Sky, Orange, and O2. Yet it makes sense to exclude mobile ISPs for now: at today's prices it's safe to guess that no one spends a lot of time downloading music over them. As for the rest: since these ISPs can only benefit if unauthorised downloading on their services decreases, don't all ISPs want the heaviest downloaders to leech off someone else's service?

LINX, the largest membership organisation for UK Internet service providers, has also objected (PDF) to the Act's apportionment of costs: ISPs, LINX's Malcolm Hutty argues, are innocent third parties, so rather than sharing the costs of writing letters and retaining the data necessary to create copyright infringement reports, ISPs should be reimbursed for not only the entire cost of implementing the necessary systems but also opportunity costs. It's unclear, LINX points out, how much change Ofcom has time to make to the draft code and still meet its statutory timetable.

So this is law on Internet time: drafted for, if not by, special interests, undemocratically rushed through Parliament, hastily written, poorly thought-out, unfairly and inequitably implemented in direct opposition to the country's longstanding commitment to digital inclusion. Surely we can do better.



June 11, 2010

Bonfire of the last government's vanities

"We have no hesitation in making the national identity card scheme an unfortunate footnote in history. There it should remain - a reminder of a less happy time when the Government allowed hubris to trump civil liberties," the Home Secretary, Theresa May, told the House of Commons at the second reading of the Identity Documents Bill 2010, which will erase the 2006 act introducing ID cards and the National Identity Register. "This will not be a literal bonfire of the last Government's vanities, but it will none the less be deeply satisfying." Estimated saving: £86 million over the next four years.

But not so fast...

An "unfortunate footnote" sounds like the perfect scrapheap on which to drop the National Identity Register and its physical manifestation, ID cards, but if there's one thing we know about ID cards it's that, like the monster in horror movies, they're always "still out there".

In 2005, Lilian Edwards, then at the Centre for Research in Intellectual Property and Law at the University of Edinburgh, invited me to give a talk, "Identifying Risks", on the history of ID cards, an idea inspired by a comment from Ross Anderson. The gist: after the wartime ID card was scrapped in 1952, attempts to bring it back were made, on average, about every two or three years. (Former cabinet minister Peter Lilley, speaking at Privacy International's 2002 conference, noted that every new IT minister put the same set of ID card proposals before the Cabinet.)

The most interesting thing about that history is that the justification for bringing in ID cards varied so much; typically, it drew on the latest horrifying public event. So, in 1974 it was the IRA bombings in Guildford and Birmingham. In 1988, football hooliganism and crime. In 1989, social security fraud. In 1993, illegal immigration, fraud, and terrorism.

Within the run of just the 2006 card, the point varied. The stated goals began with blocking benefit fraud, then moved on to include preventing terrorism and serious crime, stopping illegal immigration, and needing to comply with international standards that require biometric features in passports. It is this chameleon-like adaptation to the troubles of the day that makes ID cards so suspect as the solution to anything.

Immediately after the 9/11 attacks, Tony Blair rejected the idea of ID cards (which he had actively opposed in 1995, when John Major's government issued a green paper). But by mid-2002 a consultation paper had been published and by 2004 Blair was claiming that the civil liberties objections had vanished.

Once the 2006 ID card was introduced as a serious set of proposals in 2002, events unfolded much as Simon Davies predicted they would at that 2002 meeting. The government first clothed the ID card in user-friendly obfuscation: an entitlement card. The card's popularity in the polls, at first favourable (except, said David Blunkett, for a highly organised minority), slid inexorably as the gory details of its implementation and costs became public. Yet the (dear, departed) Labour government clung to the proposals despite admitting, from time to time, their utter irrelevance for preventing terrorism.

Part of the card's sliding popularity has been due to people's increased understanding of the costs and annoyance it would impose. Their apparent support for the card was for the goals of the card, not the card itself. Plus, since 2002 the climate has changed: the Iraq war is even less popular and even the 2005 "7/7" London attacks did not keep acceptance of the "we are at war" justification for increased surveillance from declining. And the economic climate since 2008 makes large expenditure on bureaucracy untenable.

Given the frequency with which the ID card has resurfaced in the past, it seems safe to say that the idea will reappear at some point, though likely not during this coalition government. The LibDems always opposed it; the Conservatives have been more inconsistent, but currently oppose large-scale public IT projects.

Depending how you look at it, ID cards either took 54 years to resurface (from their withdrawal in 1952 to the 2006 Identity Cards Act), or the much shorter time to the first proposals to reinstate them. Australia might be a better guide. In 1985, Bob Hawke made the "Australia card" a central plank of his government. He admitted defeat in 1987, after widespread opposition fueled by civil liberties groups. ID card proposals resurfaced in Australia in 2006, to be withdrawn again at the end of 2007. That's about 21 years - or a generation.

In 2010 Britain, it is just as important that much of the rest of the Labour government's IT edifice, such as the ContactPoint database, intended to track children throughout their school years, is being scrapped. Left in place, it might have taught today's generation of children to perceive state tracking as normal. The other good news is that many of today's tireless campaigners against the 2006 ID card will continue to fight the encroachment of the database state. In 20 years - or sooner, if (God forbid) some catastrophe makes it politically acceptable - when or if an ID card comes back, they will still be young enough to fight it. And they will remember how.



May 7, 2010

Wish list

It's 2am on election night, so of course no one can think about anything except the returns. Reported so far: 57 of 650 seats. Swing from Labour to Conservative: 4 percent.

The worst news of the night so far is that people have been turned away from polling stations because the queues couldn't be processed fast enough to get everyone through before the official closing time of 10pm. Creative poll workers locked the unvoted inside the station and let them vote. Uncreative ones sent them home, or tried to - I'm glad to see there were angry protests and, in some cases, sit-ins. Incredibly, some people couldn't vote because their stations ran out of ballot papers. In one area, hundreds of postal ballots are missing. It's an incredible shambles considering Britain's centuries of experience of running elections. Do not seize on this mess as an excuse to bring in electronic voting, something almost every IT security expert warns is a very bad idea. Print some more ballot papers, designate more polling stations, move election day to Saturday.

Reported: 69. Swing: 3.8 percent: Both Conservatives and LibDems have said they will scrap the ID card. Whether they'll follow through remains to be seen. My sense from interviews with Conservative spokespeople for articles in the last year is that they want to scrap large IT projects in favor of smaller, more manageable ones undertaken in partnership with private companies. That should spell death for the gigantic National Identity Register database and profound change for the future of NHS IT; hopefully smaller systems should give individuals more control. It does raise the question of handing over data to private companies in, most likely, other countries. The way LibDem peers suddenly switched sides on the Digital Economy Act last month dinged our image of the LibDems as the most sensible on net.wars issues of all the parties. Whoever gets in, yes, please, scrap the National Identity Register and stick to small, locally grown IT projects that serve their users. That means us, not the Whitehall civil service.

Reported: 82. Swing: 3.6 percent: Repeal the Digital Economy Act and take time out for a rethink and public debate. The copyright industries are not going to collapse without three-strikes and disconnection notices. Does the UK really want laws that France has rejected?

Reported: 104. Swing: 4.1 percent: Coincidentally, today I received a letter "inviting" me to join a study on mobile phones and brain cancer; I would be required to answer periodic surveys about my phone use. The explanatory leaflet notes: "Imperial College will review your health directly through routine medical and other health-related records" using my NHS number, name, address, and date of birth - for the next 20 to 30 years. Excuse me? Why not ask me to report relevant health issues, and request more detailed access only if I report something relevant? This Labour government has fostered this attitude of We Will Have It All. I'd participate in the study if I could choose what health information I give; I'm not handing over untrammeled right of access. New government: please cease to regard our health data as yours to hand over "for research purposes" to whomever you feel like. Do not insult our intelligence and knowledge by claiming that anonymizing data protects our privacy; such data can often be very easily reidentified.

Reported: 120. Swing: 3.9 percent: Reform libel law. Create a public interest defense for scientific criticism, streamline the process, and lower costs for defendants. Re-allocate the burden of proof to the plaintiff. Stop hearing cases with little or no connection to the UK.

Reported: 149. Swing: 4.3 percent: While you're reforming legal matters, require small claims court to hear cases in which photographers (and other freelances) pursue publishers who have infringed their copyright. Photographers say these courts typically kick such "specialist" cases up to higher levels, making it impracticably expensive to get paid.

Reported: 231. Swing: 4.8 percent: Any government that's been in power as long as Labour currently has is going to seem tired and in need of new ideas. But none of the complaints above - the massive growth in surveillance, the lack of regard for personal privacy, the sheer cluelessness about IT - knocked Labour down. Even lying about the war didn't do it. It was, as Clinton's campaign posted on its office walls, the economy. Stupid.

Reported: 327. Swing: 5 percent: Scrap ContactPoint, the (expensive, complicated) giant database intended to track children through their school days to adulthood - and, by the time they get there, most likely beyond. Expert reports the government commissioned and paid for advised against taking the risk of data breaches. Along with it, modernize data protection instead of pursuing data retention.

Reported: 626. Swing: 5.3 percent: A hung Parliament (as opposed to hanging chad). Good. For the last 36 years Britain has been ruled by an uninterrupted elected dictatorship. It is about time the parties were forced to work together again. Is anyone seriously in doubt that the problems the country has are bigger than any one party's interests? Bring on proportional representation. Like they have in Scotland.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

April 30, 2010

Child's play

In the TV show The West Wing (Season 6, Episode 17, "A Good Day") young teens tackle the president: why shouldn't they have the right to vote? There's probably no chance, but they made their point: as a society we trust kids very little and often fail to take them or their interests seriously.

That's why it was so refreshing to read in 2008's Byron Review (http://www.dcsf.gov.uk/byronreview/actionplan/) the recommendation that we should consult and listen to children in devising programs to ensure their safety online. Byron made several thoughtful, intelligent analogies: we supervise as kids learn to cross streets; we post warning signs at swimming pools but also teach kids to swim.

She also, more controversially, recommended that all computers sold for home use in the UK should have Kitemarked parental control software "which takes parents through clear prompts and explanations to help set it up and that ISPs offer and advertise this prominently when users set up their connection."

The general market has not adopted this recommendation; but it has been implemented with respect to the free laptops issued to low-income families under Becta's £300 million Home Access Laptop scheme, announced last year as part of efforts to bridge the digital divide. The recipients - 70,000 to 80,000 so far - have a choice of supplier, of ISP, and of hardware make and model. However, the laptops must meet a set of functional technical specifications, one of which is compliance with PAS 74:2008, the British Internet safety standard. That means anti-virus, access control, and filtering software: NetIntelligence.

Naturally, there are complaints; these fall precisely in line with the general problems with filtering software, which have changed little since 1996, when the passage of the Communications Decency Act inspired 17-year-old Bennett Haselton to start Peacefire to educate kids about the inner workings of blocking software - and how to bypass it. Briefly:

1. Kids are often better at figuring out ways around the filters than their parents are, giving parents a false sense of security.

2. Filtering software can't block everything parents expect it to, adding to that false sense of security.

3. Filtering software is typically overbroad, becoming a vehicle for censorship.

4. There is little or no accountability about what is blocked or the criteria for inclusion.

This case looks similar - at first. Various reports claim that as delivered NetIntelligence blocks social networking sites and even Google and Wikipedia, as well as Google's Chrome browser because the way Chrome installs allows the user to bypass the filters.

NetIntelligence says the Chrome issue is only temporary; the company expects a fix within three weeks. Marc Kelly, the company's channel manager, also notes that the laptops that were blocking sites like Google and Wikipedia were misconfigured by the supplier. "It was a manufacturer and delivery problem," he says; once the software has been reinstalled correctly, "The product does not block anything you do not want it to." Other technical support issues - trouble finding the password, for example - are arguably typical of new users struggling with unfamiliar software and inadequate technical support from their retailer.

Both Becta and NetIntelligence stress that parents can reconfigure or uninstall the software, even if some are confused about how to do it. First, they must activate the software by typing in the code the vendor provides; that gets them password access to change the blocking list or uninstall the software.

The list of blocked sites, Kelly says, comes from several sources: the Internet Watch Foundation's list and similar lists from other countries; a manual assessment team also reviews sites. Sites that feel they are wrongly blocked should email NetIntelligence support. The company has, he adds, tried to make it easier for parents to implement the policies they want; originally social networks were not broken out into their own category. Now, they are easily unblocked by clicking one button.

The simple reaction is to denounce filtering software and all who sail in her - censorship! - but the situation is arguably now more complicated than that. Research Becta conducted on the pilot group found that 70 percent of the parents surveyed felt that the built-in safety features were very important. Even the most technically advanced parents struggle to balance their legitimate concerns in protecting their children with the complex reality of their children's lives.

For example: will what today's children post to social networks damage their chances of entry into a good university or a job? What will they find? Not just pornography and hate speech; some parents object to creationist sites, some to scary science fiction, others to Fox News. Yesterday's harmless flame wars are today's more serious cyber-bullying and online harassment. We must teach kids to be more resilient, Byron said; but even then kids vary widely in their grasp of social cues, common sense, emotional make-up, and technical aptitude. Even experts struggle with these issues.

"We are progressively adding more information for parents to help them," says Kelly. "We want the people to keep the product at the end. We don't want them to just uninstall it - we want them to understand it and set the policies up the way they want them." Like all of us, Kelly thinks the ideal is for parents to engage with their children on these issues, "But those are the rules that have come along, and we're doing the best we can."


March 19, 2010

Digital exclusion: the bill

The workings of British politics are nearly as clear to foreigners as cricket; and unlike the US there's no user manual. (Although we can recommend Anthony Trollope's Palliser novels and the TV series Yes, Minister as good sources of enlightenment on the subject.) But what it all boils down to in the case of the Digital Economy Bill is that the rights of an entire nation of Internet users are about to get squeezed between a rock and an election unless something dramatic happens.

The deal is this: the bill has completed all the stages in the House of Lords, and is awaiting its second reading in the House of Commons. Best guesses are that this will happen on or about March 29 or 30. Everyone expects the election to be called around April 8, at which point Parliament disbands and everyone goes home to spend three weeks intensively disrupting the lives of their constituency's voters just as they're sitting down to dinner. Just before Parliament dissolves there's a mad dash to wind up whatever unfinished business there is, universally known as the "wash-up". The Digital Economy Bill is one of those pieces of unfinished business. The fun part: anyone who's actually standing for election is of course in a hurry to get home and start canvassing. So the people actually in the chamber during the wash-up, while the front benches are hastily agreeing to pass stuff through on the nod, are likely to be retiring MPs and others who don't have urgent election business.

"What we need," I was told last night, "is a huge, angry crowd." The Open Rights Group is trying to organize exactly that for this Wednesday, March 24.

The bill would enshrine three strikes and disconnection into law. Since the Lords' involvement, it also provides for Web censorship. It arguably up-ends at least 15 years of government policy promoting the Internet as an engine of economic growth, all to benefit a single economic sector. How would the disconnected vote, pay taxes, or engage in community politics? What happened to digital inclusion? More haste, less sense.

Last night's occasion was the 20th anniversary of Privacy International (Twitter: @privacyint), where most people were polite to speakers David Blunkett and Nick Clegg. Blunkett, who was such a front-runner for a second Lifetime Menace Big Brother Award that PI renamed the award after him, was an awfully good sport when razzed; you could tell that having his personal life hauled through the tabloid press in some detail has changed many of his views about privacy. Though the conversion is not quite complete: he's willing to dump the ID card, but only because it makes so much more sense just to make passports mandatory for everyone over 16.

But Blunkett's nearly deranged passion for the ID card was at least his own. The Digital Economy Bill, on the other hand, seems to be the result of expert lobbying by the entertainment industry, most especially the British Phonographic Industry. There's a new bit of it out this week in the form of the Building a Digital Economy report, which threatens the loss of 250,000 jobs in the UK alone (1.2 million in the EU, enough to scare any politician right before an election). Techdirt has a nice debunking summary.

A perennial problem, of course, is that bills are notoriously difficult to read. Anyone who's tried knows these days they're largely made up of amendments to previous bills, and therefore cannot be read on their own; and while they can be marked up in hypertext for intelligent Internet perusal this is not a service Parliament provides. You would almost think they don't really want us to read these things.

Speaking at the PI event, Clegg deplored the database state that has been built up over the last ten to 15 years, the resulting change in the relationship between citizen and state, and especially the omission that, "No one ever asked people to vote on giant databases." Such a profound infrastructure change, he argued, should have been a matter for public debate and consideration - and wasn't. Even Blunkett, who attributed some of his change in views to his involvement in the movie Erasing David (opening on UK cinema screens April 29), while still mostly defending the DNA database, said that "We have to operate in a democratic framework and not believe we can do whatever we want."

And here we are again with the Digital Economy Bill. There is plenty of back and forth among industry representatives. ISPs estimate the cost of the DEB's Web censorship provisions at up to £500 million. The BPI disagrees. But where is the public discussion?

But the kind of thoughtful debate that's needed cannot take place in the present circumstances with everyone gunning their car engines hoping for a quick getaway. So if you think the DEB is just about Internet freedoms, think again; the way it's been handled is an abrogation of much older, much broader freedoms. Are you angry yet?



November 20, 2009

Thou shalt not steal

As we're so fond of saying, technology moves fast, and law moves slowly. What we say far less often is that law should move slowly. It is not a sign of weakness to deliberate carefully about laws that affect millions of people's lives and will stay on the books for a long, long time. It's always seemed to me that the Founding Fathers very deliberately devised the US system to slow things down - and to ensure that the further-reaching the change the more difficult it is to enact.

Cut to today's Britain. The Internet may perceive censorship as damage and route around it, but politicians seem increasingly to view due and accountable legal process as an unnecessary waste of time and try to avoid it. Preventing this is, of course, what we have constitutions for; democracy is a relatively mature technology.

Today's Digital Economy bill is loaded with provisions for enough statutory instruments to satisfy the most frustrated politician's desire to avoid all that fuss and bother of public debate and research. Where legislation requires draft bills, public consultations, and committee work, a statutory instrument can pass both houses of Parliament on the nod. For minor regulatory changes - such as, for example, the way money is paid to pensioners (1987) - limiting the process to expert discussion and a quick vote makes sense. But when it comes to allowing the Secretary of State to change something as profound and far-reaching in impact as copyright law with a minimum of public scrutiny, it's an outrageous hijack of the democratic process.

Here is the relevant quote from the bill, talking about the Copyright, Designs, and Patents Act 1988:

The Secretary of State may by order amend Part 1 or this Part for the purpose of preventing or reducing the infringement of copyright by means of the internet, if it appears to the Secretary of State appropriate to do so having regard to technological developments that have occurred or are likely to occur.

Lower down, the bill does add that:

Before making any order under this section the Secretary of State must consult such persons who the Secretary of State thinks likely to be affected by the order, or who represent any of those persons, as the Secretary of State thinks fit.

Does that say he (usually) has to consult the public? I don't think so; until very recently it was widely held that the only people affected by copyright law were creators and rights holders - these days rarely the same people even though rights holders like, for public consumption, to pretend otherwise (come contract time, it's a whole different story). We would say that everyone now has a stake in copyright law, given the enormously expanded access to the means to create and distribute all sorts of media, but it isn't at all clear that the Secretary of State would agree or what means would be available to force him to do so. What we do know is that the copyright policies being pushed in this bill come directly from the rights holders.

Stephen Timms, talking to the Guardian, attempted to defend this provision this way:

The way that this clause is formed there would be a clear requirement for full public consultation [before any change] followed by a vote in favour by both houses of Parliament.

This is, put politely, disingenuous: this government has, especially lately - see also ID cards - a terrible record of flatly ignoring what public consultations are telling them, even when the testimony submitted in response to such consultations comes from internationally recognized experts.

Timms' comments are a very bad joke to anyone who's followed the consultations on this particular bill's provisions on file-sharing and copyright, given that everyone from Gowers to Dutch economists is finding that loosening copyright restrictions has society-wide benefits, while Finland has made 1Mb broadband access a legal right and even France's courts see Internet access as a fundamental human right (especially ironic given that France was the first place three strikes actually made it into law).

In creating the Digital Economy bill, not only did this government ignore consultation testimony from everyone but rights holders, it even changed its own consultation mid-stream, bringing back such pernicious provisions as three-strikes-and-you're-disconnected even after agreeing they were gone. This government is, in fact, a perfect advertisement for the principle that laws that are enacted should be reviewed with an eye toward what their effect will be should a government hostile to its citizenry come to power.

Here is some relevant outrage from an appropriately native British lawyer specializing in Net issues, Lilian Edwards:

So clearly every time things happen fast and the law might struggle to keep up with them, in future, well we should just junk ordinary democratic safeguards before anyone notices, and bow instead to the partisan interests who pay lobbyists the most to shout the loudest?

Tell me to "go home if you don't like it here" because I wasn't born in the UK if you want to, but she's a native. And it's the natives who feel betrayed that you've got to watch out for.


November 13, 2009

Cookie cutters

Sometimes laws sneak up on you while you're looking the other way. One of the best examples was the American Telecommunications Act of 1996: we were so busy obsessing about the freedom of speech-suppressing Communications Decency Act amendment that we failed to pay attention to the implications of the bill itself, which allowed the regional Baby Bells to enter the long distance market and changed a number of other rules regarding competition.

We now have a shiny, new example: we have spent so much time and electrons over the nasty three-strikes-and-you're-offline provisions that we, along with almost everyone else, utterly failed to notice that the package contains a cookie-killing provision last seen menacing online advertisers in 2001 (our very second net.wars).

The gist: Web sites cannot place cookies on users' computers unless said users have agreed to receive them, except when the cookies are strictly necessary - as, for example, when you select something to buy and then head for the shopping cart to check out.
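For the technically curious, here is what that carve-out might look like if you had to encode it. This is a minimal sketch only; the purpose labels and the classification logic are my own illustrative assumptions, not categories from the directive's text:

```python
# A hedged sketch of the "strictly necessary" carve-out as described above.
# The purpose labels below are hypothetical illustrations, not official
# categories from the EU telecoms package.

STRICTLY_NECESSARY = {"session", "shopping-cart", "login"}

def consent_required(purpose: str) -> bool:
    """True if, under the rule as described, prior opt-in consent is needed."""
    return purpose not in STRICTLY_NECESSARY

# The shopping-cart cookie - the rule's own example - needs no prior consent:
print(consent_required("shopping-cart"))  # False
# A behavioural-advertising tracking cookie would need opt-in:
print(consent_required("advertising"))    # True
```

The hard part, of course, is that nothing in a real HTTP transaction tells the browser which purpose a cookie actually serves; that ambiguity is exactly what would force sites to ask the user, every time.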

As the Out-Law blog points out, this proposal - now to become law unless the whole package is thrown out - is absurd. We said so in 2001 - and made the stupid assumption that because nothing more had been heard about it, the idea had been nixed by an outbreak of sanity at the EU level.

Apparently not. Apparently MEPs and others at EU level spend no more time on the Web than they did eight years ago. Apparently none of them have any idea what such a proposal would mean. Well, I've turned off cookies in my browser, and I know: without cookies, browsing the Web is as non-functional as a psychic being tested by James Randi.

But it's worse than that. Imagine browsing with every site asking you to opt in every - pop-up - time - pop-up - it - pop-up - wants - pop-up - to - pop-up - send - pop-up - you - a - cookie - pop-up. Now imagine the same thing, only you're blind and using the screen reader JAWS.

This soon-to-be-law is not just absurd, it's evil.

Here are some of the likely consequences.

As already noted, it will make Web use nearly impossible for the blind and visually impaired.

It will, because such is the human response to barriers, direct ever more traffic toward those sites - aggregators, ecommerce, Web bulletin boards, and social networks - that, like Facebook, can write a single privacy policy for the entire service to which users consent when they join (and later at scattered intervals when the policy changes) that includes consent to accepting cookies.

According to Out-Law, the law will trap everyone who uses Google Analytics, visitor counters, and the like. I assume it will also kill AdSense at a stroke: how many small DIY Web site owners would have any idea how to implement an opt-in form? Both econsultancy.com and BigMouthMedia think affiliate networks generally will bear the brunt of this legislation. BigMouthMedia goes on to note a couple of efforts - HTTP ETags and Flash cookies - intended to give affiliate networks more reliable tracking that may also fall afoul of the legislation. These, as those sources note, are difficult or impossible for users to delete.

It will presumably also disproportionately catch EU businesses compared to non-EU sites. Most users probably won't understand why particular sites are so annoying; they will simply shift to sites that aren't annoying. The net effect will be to divert Web browsing to sites outside the EU - surely the exact opposite of what MEPs would like to see happen.

And, I suppose, inevitably, someone will write plug-ins for the popular browsers that can be set to respond automatically to cookie opt-in requests and that include provisions for users to include or exclude specific sites. Whether that will offer sites a safe harbour remains to be seen.

The people it will hurt most, of course, are the sites - like newspapers and other publications - that depend on online advertising to stay afloat. It's hard to understand how the publishers missed it; but one presumes they, too, were distracted by the need to defend music and video from evil pirates.

The sad thing is that the goal behind this masterfully stupid piece of legislation is a reasonably noble one: to protect Internet users from monitoring and behavioural targeting to which they have not consented. But regulating cookies is precisely the wrong way to achieve this goal, not just because it disables Web browsing but because technology is continuing to evolve. The EU would do better to regulate by specifying allowable actions and consequences rather than specifying technology. Cookies are not inherently evil; what matters is how they're used.

Eight years ago, when the cookie proposals first surfaced, they, logically enough, formed part of a consumer privacy bill. That they're now part of the telecoms package suggests they've been banging around inside Parliament looking for something to attach themselves to ever since.

I probably exaggerate slightly, since Out-Law also notes that in fact the EU did pass a law regarding cookies that required sites to offer visitors a way to opt out. This law is little-known, largely ignored, and unenforced. At this point the Net's best hope looks to be that the new version is treated the same way.


October 30, 2009

Kill switch

There's an old sort-of joke that goes, "What's the best way to kill the Internet?" The number seven answer, according to Simson Garfinkel, writing for HotWired in 1997: "Buy ten backhoes." Ba-boom.

The US Senate, never ones to leave a good joke alone, came up with a new suggestion: install a kill switch. They published this little gem (as S.773) on April 1. It got a flurry of attention and then was forgotten until the last week or two. (It's interesting to look back at Garfinkel's list of 50 ways to kill the Net and notice that only two are government actions, and neither is installing a "kill switch".)

To be fair, "kill switch" is an emotive phrase for what they have in mind, which is that the president:

may declare a cybersecurity emergency and order the limitation or shutdown of Internet traffic to and from any compromised Federal Government or United States critical infrastructure information system or network

Now, there's a lot of wiggle room in a vague definition like "critical infrastructure system". That could be the Federal government's own servers. Or the electrical grid, the telephone network, the banking system, the water supply, or even, arguably, Google. (It has 64+ percent of US search queries, and if you can't find things, the Internet might as well be dead.) But what this particular desire of the Senate's sounds most like is those confused users who think they can catch a biological virus from their computers.

Still, for the media, calling the Senate's idea a "kill switch" is attention-getting political genius. We don't call the president's power to order the planes out of the sky, as happened on 9/11, a "crash switch", but imagine the outcry against it if we did.

Technically, the idea that there's a single off switch waiting to be implemented somewhere is, of course, ridiculous.

The idea is also administrative silliness: Obama, we hope, is kind of busy. The key to retaining sanity when you're busy is to get other people to do all the things they can without your input. We would hope that the people running the various systems powering the federal government's critical infrastructure could make their own, informed decisions - faster than Obama can - about when they need to take down a compromised server.

Despite wishful thinking, John Gilmore's famous aphorism, "The Net perceives censorship as damage and routes around it", doesn't really apply here. For one thing, even a senator knows - probably - that you can't literally shut down the entire Internet from a single switch sitting in the President's briefcase (presumably next to the nuclear attack button). Much of the Internet is, after all, outside the US; much of it is in private ownership. (Perhaps the Third Amendment could be invoked here?)

For another, Gilmore's comment really didn't apply to individual Internet-linked computer networks; Google's various bits of outages this year ought to prove that it's entirely possible for those to be down without affecting the network at large. No, the point was that if you try to censor the Net its people will stop you by putting up mirror servers and passing the censored information around until everyone has a copy. The British Chiropractic Association (quacklash!) and Trafigura are the latest organizations to find out what Gilmore knew in 1993. He also meant, I suppose, that the Internet protocols were designed for resilience and to keep trying by whatever alternate routes are available if data packets don't get through.

Earlier this week another old Net hand, Web inventor Tim Berners-Lee, gave some rather sage advice to the Web 2.0 conference. One key point: do not build your local laws into the global network. That principle would not, unfortunately, stop the US government from shutting off its own servers (to spite its face?), but it does nix the idea of, say, building the network infrastructure to the specification of any one particular group - the MPAA or the UK government, in defiance of the increasingly annoyed EU. In the same talk, Berners-Lee also noted (according to CNET): "I'm worried about anything large coming in to take control, whether it's large companies or government."

Threats like these were what he set up the W3C to protect against. People talk with reverence of Berners-Lee's role as inventor, but many fewer understand that the really big effort is the 20 years since the aha! moment of creation, during which Berners-Lee has spent his time and energy nurturing the Web and guiding its development. Without that, it could easily have been strangled by competing interests, both corporate and government. As, of course, it still could be, depending on the outcome of the debates over network neutrality rules.

Dozens of decisions like Berners-Lee's were made in creating the Internet. They have not made it impossible to kill - I'm not sure how many backhoes you'd need now, but I bet it's still a surprisingly finite number - but they have made it a resilient and robust network. A largely democratic medium, in fact, unlike TV and radio, at least so far. The Net was born free; the battles continue over whether it should be in chains.



July 3, 2009

What's in an assigned name?

There's a lot I didn't know at the time about the founding of the Internet Corporation for Assigned Names and Numbers, but I do remember the spat that preceded it. Until 1998, the systems for assigning domain names (DNS) and assigning Internet numbers (IANA) were both managed by one guy, Jon Postel, who by all accounts and records was a thoughtful and careful steward and an important contributor to much of the engineering that underpins the Internet even now. Even before he died in October 1998, however, plans were underway to create a successor organization to take over the names and numbers functions.

The first proposal was to turn these bits of management over to the International Telecommunications Union, and a memorandum of understanding was drawn up that many, especially within the ITU, assumed would pass unquestioned. Instead, there was much resentment and many complaints that important stakeholders (consumers, most notably) had been excluded. Eventually, ICANN was created under the auspices of the US Department of Commerce, intended to become independent once it had fulfilled certain criteria. We're still waiting.

As you might expect, the US under Bush II wasn't all that interested in handing off control. The US government had some support in this, in part because many in the US seem to have difficulty accepting that the Internet was not actually built by the US alone. So alongside the US government's normal resistance to relinquishing control was an endemic sense that it would be "giving away" something the US had created.

All that aside, the biggest point of contention was not ICANN's connection to the US government, as desirable as that might be to those outside the US. Nor was it the assignment of numbers, which, since numbers are the way the computers find each other, is actually arguably the most important bit of the whole thing. It wasn't even, or at least not completely, the money (PDF), as staggering as it is that ICANN expects to rake in $61 million in revenue this year as its cut of domain name registrations. No, of course it was the names that are meaningful to people: who should be allowed to have what?

All this background is important because on September 30 the joint project agreement with DoC under which ICANN operates expires, and all these debates are being revisited. Surprisingly little has changed in the arguments about ICANN since 1998. Michael Froomkin argued in 2000 (PDF) that ICANN bypassed democratic control and accountability. Many critics have argued in the intervening years that ICANN needs to be reined in: its mission kept to a narrow focus on the DNS, its structure made transparent and accountable, and the organization kept free not only of US government interference but of other governments' interference as well.

Last month, the Center for Democracy and Technology published its comments to that effect. Last year, and in 2006, former elected ICANN board member Karl Auerbach argued similarly, with much more discussion of ICANN's finances, which he regards as a "tax". Perhaps even more than might have been obvious then: ICANN's new public dashboard has revealed that the company lost $4.6 million on the stock market last year, an amount reporter John Levine equates to the 20-cent fee from 23 million domain name registrations. As Levine asks, if they could afford to lose that amount, then they didn't need the money - so why did they collect it from us? There seems to be no doubt that ICANN can keep growing in size and revenues by creating more top-level domains, especially as it expands into long-mooted non-ASCII internationalized domain names (IDNs).

Arguing about money aside, the fact is that we have not progressed much, if at all, since 1998. We are asking the same questions and having the same arguments. What is the DNS for? Should it be a directory, a handy set of mnemonics, a set of labels, a zoning mechanism, or a free-for-all? Do languages matter? Early discussions included the notion that there would be thousands, even tens of thousands of global top-level domains. Why shouldn't Microsoft, Google, or the Electronic Frontier Foundation operate their own registries? Is managing the core of the Internet an engineering, legal, or regulatory problem? And, latterly, given the success and central role of search engines, do we need the DNS at all?

Personally, I lean toward the view that the DNS has become less important than it was, as many services (Twitter, instant messaging, VOIP) do not require it. Even the Web needs it less than it did. But if what really matters about the DNS is giving people names they can remember, then from the user point of view it matters little how many top-level domains there are. The domain info.microsoft is no less memorable than microsoft.info or microsoft.com.
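To make the point concrete, a small sketch; nothing here is part of any ICANN process, and the names are purely illustrative. It simply shows that a domain name is a sequence of labels resolved right to left from the root, so which label counts as "top-level" is administrative convention rather than engineering:

```python
# Sketch: a domain name is just dot-separated labels, resolved right to left
# from the root zone. The example names are illustrative only.

def resolution_order(name: str) -> list[str]:
    """Return the zones consulted, outermost first, when resolving a name."""
    labels = name.split(".")
    return [".".join(labels[i:]) for i in range(len(labels) - 1, -1, -1)]

print(resolution_order("microsoft.com"))   # ['com', 'microsoft.com']
print(resolution_order("info.microsoft"))  # ['microsoft', 'info.microsoft']
```

Either way the engineering is identical; only which registry operator sits at the top changes, which is why the argument is really about money and control rather than technology.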

What matters is that the Internet continues to function and that anyone can reach any part of it. The unfortunate thing is that none of these discussions have solved the problems we really have. Four years after the secured version of DNS (DNSsec) was developed to counteract security threats such as DNS cache poisoning that had been mooted for many more years than that, it's still barely deployed.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series. Readers are welcome to post here, follow on Twitter, or send email to netwars@skeptic.demon.co.uk.

June 26, 2009

Pass the policy

For quite a few years now, the Canadian law professor Michael Geist has been writing a column on technology law for the Toronto Star. (Brief pause to admire the Star for running such a thing.) This week, he settled down with a simple question: where does copyright policy come from?

The story is that about a month ago the Conference Board of Canada recalled three reports on intellectual property rights after Geist accused the Board of plagiarism and also of accepting funding for the reports from various copyright lobby groups. The source of the copied passages: the International Intellectual Property Alliance. According to its own Web site, the IIPA was "formed in 1984 to represent the US copyright-based industries." It includes: the Association of American Publishers, the Business Software Alliance, the Entertainment Software Association, the Independent Film and Television Alliance, the Motion Picture Association of America, the National Music Publishers' Association, and the Recording Industry Association of America. We know *those* guys, or most of them.

This week, Geist settled down to examine the sources more closely in a lovely bit of spaghetti-detangling. Basically, just two organizations - Canada's equivalents of the MPAA and RIAA - were the source of multiple reports as well as funding for further lobbying organizations. "The net effect," Geist writes, "has been a steady stream of reports that all say basically the same thing, cite to the same sources, make the same recommendations, and often rely on each other to substantiate the manufactured consensus on copyright reform." And of course, these guys don't mean "copyright reform" the way Geist - or the Electronic Frontier Foundation or the Open Rights Group - would. We say reform, we mean liberalize and open up; they say reform, they mean tighten and extend. I'd call their way "business as usual".

What's interesting, of course, is to compare Geist's handy table of who recommended what and to whom to the various proposals that are flying around the UK and Europe at the moment. To wit:

Create an IP Council. The Digital Britain report, launched ten days ago, calls this the "Digital Rights Agency", and there's even an entire separate report (PDF) outlining what it might be good for. It would include industry representatives working in collaboration with government (but would not be a government agency), and it would, among other things, educate the public. Which leads us to...

Create public education awareness programs. Of course, I predicted something like this in 1997 - for 2002.

Create an Intellectual Property Crime Task Force. While I'm not aware of specific British proposals for this, I would note that Britain already has various law enforcement agencies that deal with physical forms of IP counterfeiting, and the Internet Watch Foundation has throughout its history mentioned the possibility of tackling online copyright infringement.

Tougher penalties. The Digital Britain report is relatively polite on this one. It says flatly that for-profit counterfeiters will be pursued under criminal law and calls file-sharing "wrong", but also says that most people would prefer to remain within the law (true) and therefore it intends to encourage the development of legal downloading markets (good). However, it also proposes that ISPs should use methods such as bandwidth throttling to deter persistent file-sharers.

Implement the WIPO Internet treaties and anti-circumvention measures. Already done. Anti-circumvention was a clause in the 2001 European Union Copyright Directive and was enacted in the UK in 2003, with some exceptions for cryptographic research.

Increase funding and resources to tackle IP crime. Well. Where agencies come, funding will doubtless follow.

The Digital Britain report's proposed next steps include passing legislation to enact sanctions such as bandwidth throttling. There's also a consultation on "illicit peer-to-peer filesharing" (deadline September 15); the government's proposals would require ISPs to notify alleged infringers, keep records of how often they've been notified, and allow rightsholders to use this information, anonymized, to decide when to initiate legal action. Approving the code will be Ofcom, for the time being. The consultation document helpfully reviews the state of legislative play in other countries.

It's extremely rare that we get a case where the origins of a particular set of policies can, as Geist has done here, be traced with such clarity and certainty. And it means that advocates of real copyright reform were right to distrust the claims in this area - the figures the industry says represent losses to rightsholders from file-sharing - no matter how neutral the apparent source.

I first heard the term "policy laundering" from Privacy International's Gus Hosein; it's used to describe the way today's unwanted policies are shopped around until their sponsors can find a taker. The game works like this, as Geist shows: you publish reports until a government agency - any government agency - adopts your point of view in an apparently neutral document. Then you cite that to other governments until someone passes the laws you want. Then you promote that legislation to other countries: Be the envy of other major governments.

The Digital Britain report sells these policies as aiding the British intellectual property industry. But that's not where they came from originally. Does anyone really think the MPAA and RIAA have Britain's best interests at heart?

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and links to earlier columns in this series. Readers are welcome to post here, follow on Twitter or send email to netwars@skeptic.demon.co.uk (but please turn off HTML).

April 11, 2009

Statebook of the art

The bad thing about the Open Rights Group's new site, Statebook, is that it looks so perfectly simple to use that the government may decide it's actually a good idea to implement something very like it. And, unfortunately, that same simplicity may also create the illusion in the minds of the untutored who still populate the ranks of civil servants and politicians that the technology works and is perfectly accurate.

For those who shun social networks and all who sail in her: Statebook's interface is an almost identical copy of that of Facebook. True, on Facebook the applications you click on to add are much more clearly pointless wastes of time, like making lists of movies you've liked to share with your friends or playing Lexulous (the reinvention of the game formerly known as Scrabulous until Hasbro got all huffy and had it shut down).

Politicians need to resist the temptation to believe it's as easy as it looks. The interfaces of both the fictional Statebook and the real Facebook look deceptively simple. In fact, although friends tell me how much they like the convenience of being able to share photos with their friends in a convenient single location, and others tell me how much they prefer Facebook's private messaging to email, Facebook is unwieldy and clunky to use, requiring a lot of wait time for pages to load even over a fast broadband connection. Even if it weren't, though, one of the difficulties with systems attempting to put EZ-2-ewes front ends on large and complicated databases is that they deceive users into thinking the underlying tasks are also simple.

A good example would be airline reservations systems. The fact is that underneath the simple searching offered by Expedia or Travelocity lies some extremely complex software; it prices every itinerary rather precisely depending on a host of variables. These include not just the obvious things like the class of cabin, but the time of day, the day of the week, the time of year, the category of flyer, the routing, how far in advance the ticket is being purchased, and the number of available seats left. Only some of this is made explicit; frequent flyers trying to maximize their miles per dollar despair while trying to dig out arcane details like the class of fare.

In his 1988 book The Design of Everyday Things, Donald Norman wrote about the need to avoid confusing the simplicity or complexity of an interface with the characteristics of the underlying tasks. He also writes about the mental models people create as they attempt to understand the controls that operate a given device. His example is a refrigerator with two compartments and two thermostatic controls. An uninformed user naturally assumes each control sets the temperature of one compartment, but in his example one control sets the thermostat and the other directs the proportion of cold air that's sent to each compartment. The user's mental model is wrong and, as a consequence, attempts that user makes to set the temperature will also, most likely, be wrong.

In focusing on the increasing quantity and breadth of data the government is collecting on all of us, we've neglected to think about how this data will be presented to its eventual users. We have warned about the errors that build up in very large databases that are compiled from multiple sources. We have expressed concern about surveillance and about its chilling impact on spontaneous behaviour. And we have pointed out that data is not knowledge; it is very easy to take even accurate data and build a completely false picture of a person's life. Perhaps instead we should be focusing on ensuring that the software used to query these giant databases-in-progress teaches users not to expect too much.

As an everyday example of what I mean, take the automatic line-calling system used in tennis since 2005, Hawk-Eye. Hawk-Eye is not perfectly accurate. Its judgements are based on reconstructions that put together the video images and timing data from four or more high-speed video cameras. The system uses the data to calculate the three-dimensional flight of the ball; it incorporates its knowledge of the laws of physics, its model of the tennis court, and its database of the rules of the game in order to judge whether the ball is in or out. Its official margin for error is 3.6mm.

A study by two researchers at Cardiff University disputed that number. But more relevant here, they pointed out that the animated graphics used to show the reconstructed flight of the ball and the circle indicating where it landed on the court surface are misleading because they look to viewers as though they are authoritative. The two researchers, Harry Collins and Robert Evans, proposed that in the interests of public education the graphic should be redesigned to display the margin for error and the level of confidence.
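Collins and Evans's proposal is easy to picture. Here is a minimal sketch of a line call that reports its uncertainty instead of a bare verdict - the 3.6mm margin is the figure quoted above, but everything else (the function, its inputs, the thresholds) is hypothetical, not Hawk-Eye's actual algorithm:

```python
# Illustrative sketch, not Hawk-Eye's real code: a line call that
# surfaces the system's error margin rather than hiding it.

MARGIN_MM = 3.6  # Hawk-Eye's stated margin for error

def call_with_confidence(distance_from_line_mm: float) -> str:
    """distance_from_line_mm: reconstructed distance of the ball from
    the line; positive = inside the court, negative = outside."""
    if distance_from_line_mm > MARGIN_MM:
        return "IN"
    if distance_from_line_mm < -MARGIN_MM:
        return "OUT"
    # Within the error margin the reconstruction cannot reliably
    # distinguish in from out - exactly what the current graphic hides.
    return f"TOO CLOSE TO CALL (within the {MARGIN_MM}mm margin)"

print(call_with_confidence(10.0))  # IN
print(call_with_confidence(-1.2))  # TOO CLOSE TO CALL (within the 3.6mm margin)
```

A real redesign would show a confidence level alongside the animated bounce, but even this crude three-way answer is more honest than an authoritative-looking circle on the court.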

This would be a good approach for database matches, too, especially since the number of false matches and errors will grow with the size of the databases. A real-life Statebook that doesn't reflect the uncertainty factor of each search, each match, and each interpretation next to every hit would indeed be truly dangerous.
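The scaling point can be made concrete with a back-of-envelope sketch. The per-comparison false-match probability below is purely hypothetical; the point is only that expected coincidental hits grow with the number of records searched:

```python
# Back-of-envelope: expected coincidental (false) matches when one query
# profile is compared against every record in a database. The one-in-a-
# million rate is an illustrative assumption, not a real figure.

def expected_false_matches(db_size: int, p_false_match: float) -> float:
    # Each comparison carries an independent chance of a coincidental
    # match, so the expectation grows linearly with database size.
    return db_size * p_false_match

# A 4-million-record database at a 1-in-a-million rate yields roughly
# four expected coincidental hits per query; double the database and
# the expected false matches double too.
print(expected_false_matches(4_000_000, 1e-6))
print(expected_false_matches(8_000_000, 1e-6))
```

Which is why a hit against a giant database is evidence to be weighed, not an answer - and why the interface should say so.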

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series. Readers are welcome to post here, at net.wars home, at her personal blog, or by email to netwars@skeptic.demon.co.uk (but please turn off HTML).

March 13, 2009

Threat model

It's not about Phorm, it's about snooping. At Wednesday morning's Parliamentary roundtable, "The Internet Threat", the four unhappy representatives I counted from Phorm had a hard time with this. Weren't we there to trash them and not let them reply? What do you mean the conversation isn't all about them?

We were in a committee room many medieval steps up inside the House of Lords. The gathering was convened by Baroness Miller of Chilthorne Domer with the idea of helping Parliamentarians understand the issues raised not only by Phorm but also by the Interception Modernisation Programme, Google, Microsoft, and in fact any outfit that wants to collect huge amounts of our data for purposes that won't be entirely clear until later.

Most of the coverage of this event has focused on the comments of Sir Tim Berners-Lee, the indefatigable creator of the 20-year-old Web (not the Internet, folks!), who said categorically, "I came here to defend the integrity of the Internet as a medium." Using the Internet, he said, "is a fundamental human act, like the act of writing. You have to be able to do it without interference and/or snooping." People use the Internet when they're in crisis; even just a list of URLs you've visited is very revealing of sensitive information.

Other distinguished speakers included Professor Wendy Hall, Nicholas Bohm representing the Foundation for Information Policy Research, the Cambridge security research group's Richard Clayton, the Open Rights Group's new executive director, Jim Killock, and the vastly experienced networking and protocol consultant Robb Topolski.

The key moment, for me, was when one of the MPs the event was intended to educate asked this: "Why now?" Why, in other words, is deep packet inspection suddenly a problem?

The quick answer, as Topolski and Clayton explained, is "Moore's Law." It was not, until a couple-three years ago, possible to make a computer fast enough to sit in the middle of an Internet connection and not only sniff the packets but examine their contents before passing them on. Now it is. Plus, said Clayton, "Storage."

But for Kent Ertugrul, Phorm's managing director, it was all about Phorm. The company had tried to get on the panel and been rejected. His company's technology was being misrepresented. Its system makes it impossible for browsing habits to be tracked back to people. Tim Berners-Lee, of all people, if he understood their system, would appreciate the elegance of what they've actually done.

Berners-Lee was calm, but firm. "I have not at all criticized behavioral advertising," he pointed out. "What I'm saying is a mistake is snooping on the Internet."

Right on.

The Internet, Berners-Lee and Topolski explained, was built according to the single concept that all the processing happens at the ends, and that the middle is just a carrier medium. That design decision has had a number of consequences, most of them good. For example, it's why someone can create the new application of the week and deploy it without getting permission. It's why VOIP traffic flows across the lines of the telephone companies whose revenues it's eating. It is what network neutrality is all about.

Susan Kramer, saying she was "the most untechie person" (and who happens to be my MP), asked if anyone could provide some idea of what lawmakers can actually do. The public, she said, is "frightened about the ability to lose privacy through these mechanisms they don't understand".

Bohm offered the analogy of water fluoridation: it's controversial because we don't expect water flowing into our house to have been tampered with. In any event, he suggested that if the law needs to be made clearer it is in the area of laying down the purposes for which filtering, management, and interference can be done. It should, he said, be "strictly limited to what amounts to matters of the electronic equivalent of public health, and nothing else."

Fluoridation of water is a good analogy for another reason: authorities are transparent about it. You can, if you take the trouble, find out what is in your local water supply. But one of the difficulties about a black-box-in-the-middle is that even if we think we know what it does today - because, say, we trust Richard Clayton's report on how Phorm works (PDF) - there's no guarantee of how the system will change in the future. Just as, although today's government may have only good intentions in installing a black box in every ISP that collects all traffic data, the government of ten years hence may use the system in entirely different ways for which today's trusting administration never planned. Which is why it's not about Phorm and isn't even about behavioural advertising; Phorm was only a single messenger in a bigger problem.

So the point is this: do we want black boxes whose settings we don't know and whose workings we don't understand sitting at the heart of our ISPs' networks examining our traffic? This was the threat Baroness Miller had in mind - a threat *to* the Internet, not the threat *of* the Internet beloved of the more scaremongering members of the press. Answers on a postcard...


Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series. Readers are welcome to post here, at net.wars home, at her personal blog, or by email to netwars@skeptic.demon.co.uk (but please turn off HTML).

February 6, 2009

Forty-five years

This week the EU's legal affairs committee, JURI, may vote - again - on term extension in sound recordings. As of today, copyright is still listed on the agenda.

Opposing term extension was a lot simpler at the national level in the UK; the path from proposal to legislation is well-known, well trodden, and well-watched by the national media. At the EU level, JURI is only one of four committees involved in proposing and amending term extension on behalf of the European Parliament - and then even after the Parliament votes it's the Commission who makes the final decision. The whole thing drags on for something close to forever, which pretty much guarantees that only the most obsessed stay in touch through the whole process. If you had designed a system to ensure apathy except among lobbyists who like good food, you'd have done exactly this.

There are many reasons to oppose term extension, most of which we've covered before. Unfortunately, these seem invisible to some politicians. As William Patry blogs, the harm done by term extension is diffuse and hard to quantify while easily calculable benefits accrue to a small but wealthy and vocal set of players.

What's noticeable is how many independent economic reviews agree with what NGOs like the Electronic Frontier Foundation and the Open Rights Group have said all along.

According to a joint report from several European intellectual property law centers (PDF), the Commission itself estimates that 45 extra years of copyright protection will hand the European music industry between €44 million and €843 million - uncertain by a factor of 20! The same report also notes that term extension will not net performers additional broadcast revenue; rather, the same pot will be spread among a larger pool of musicians, benefiting older musicians at the expense of young incomers. The report also notes that performers don't lose control over their music when the term of copyright ends; they lose it when they sign recording contracts (so true).

Other reports are even less favorable. In 2005, for example, the Dutch Institute for Information Law concluded that copyright in sound recordings has more in common with design rights and patents than with other areas of copyright, and it would be more consistent to reduce the term rather than extend it. More recently, an open letter from Bournemouth University's Centre for Intellectual Property Policy Management questioned exactly where those estimated revenues were going to come from, and pointed out the absurdity of the claim that extension would help performers.

And therein is the nub. Estimates are that the average session musician will benefit from term extension in the amount of €4 to €58 (there's that guess-the-number-within-a-factor-of-20 trick again). JURI's draft opinion puts the number of affected musicians at 7,000 per large EU member state, fewer in the rest. Call it 7,000 in all 27 and give each musician €20; that's €3.78 million, hardly enough for a banker's bonus. We could easily hand that out in cash, if handouts to aging performers are the purpose of the exercise.

Benefiting performers is a lobbyists' red herring that cynically plays on our affection for our favorite music and musicians; what term extension will do, as the Bournemouth letter points out, is benefit recording companies. Of that wackily wide range of estimated revenues in the last paragraph, 90 percent - between €39 million and €758 million - will go to record producers, even according to the EU's own impact assessment (PDF), based on a study carried out by PricewaterhouseCoopers.
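The figures quoted above are worth sanity-checking. This sketch simply makes the column's arithmetic explicit; every input is a number already quoted, and the rounding is mine:

```python
# Sanity-checking the term-extension figures quoted above.

# The Commission's own estimate spans a factor of roughly 20:
low, high = 44, 843  # million euros
spread = high / low
print(round(spread, 1))  # about 19.2

# Back-of-envelope payout: 7,000 session musicians in each of the
# 27 member states, at 20 euros apiece:
payout = 7_000 * 27 * 20
print(payout)  # 3,780,000 - the 3.78 million euros in the text

# 90 percent of the estimated range going to producers:
print(round(low * 0.9, 1), round(high * 0.9, 1))  # about 39.6 and 758.7
```

In other words, the "benefit to performers" is a rounding error on the amount headed to the record companies.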

If you want to help musicians, the first and most important thing you should do is improve the industry's standard contracts and employment practices. We protect workers in other industries from exploitation; why should we make an exception for musicians? No one is saying - not even Courtney Love - that musicians deserve charity. But we could reform UK bankruptcy law so that companies acquiring defunct labels are required to shoulder ongoing royalty payment obligations as well as the exploitable assets of the back catalogue. We could put limits on what kind of clauses a recording company is allowed to impose on first-time recording artists. We could set minimums for what is owed to session musicians. And we could require the return of rights to the performers in the event of a recording's going out of print. Any or all of those things would make far more difference to the average musician's lifetime income than an extra 45 years of copyright.

Current proposals seem to focus on this last idea as a "use it or lose it" clause that somehow makes the rest of term extension all right. Don Foster, the Liberal Democrat MP who is his party's shadow minister for culture, media, and sport, for example, has argued for it repeatedly. But by itself it's not enough of a concession to balance the effect of term extension and the freezing of the public domain.

If you want to try to stop term extension, this is a key moment. Lobby your MEP and the members of the relevant committees. Remind them of the evidence. And remind them that it's not just the record companies and the world's musicians who have an interest in copyright; it's the rest of us, too.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series. Readers are welcome to post here, at net.wars home, at her personal blog, or by email to netwars@skeptic.demon.co.uk (but please turn off HTML).

January 30, 2009

Looking backward

Governments move slowly; technology moves fast. That's not a universal truth - witness Obama's first whirlwind week in office - but in the early days of the Net it was the kind of thing people said smugly when they wanted to claim that cyberspace was impervious to regulation. It worked well enough for, say, setting free strong cryptography over the objections of the State Department and ITAR.

This week had two perfect examples. First: Microsoft noted in its 10-Q that the EU may force it to do something about tying Internet Explorer to Windows - remove it, make it one of several browsers consumers can choose from at setup, or randomly provide different browsers. Still fighting the browser wars? How 1995.

Second: the release of the interim Digital Britain report by the Department for Culture, Media, and Sport. Still proposing Digital Rights Management as a way of protecting rightsholders' interest in content? How 2005.

It probably says something about technology cycles that the DRM of 2005 is currently more quaint and dated than the browser wars of 1995-1998. The advent of cloud computing and Google's release of Chrome last year have reinvigorated the browser "market". After years of apparent stagnation it suddenly matters again that we should have choices and standards to keep the Internet from turning into a series of walled gardens (instead of a series of tubes).

DRM, of course, turns content into a series of walled gardens and causes a load of other problems we've all written about extensively. But the most alarming problem about its inclusion in the government's list of action items is that even the music industry that most wanted it is abandoning it. What year was this written in? Why is a report that isn't even finished proposing to adopt a technological approach that's already a market failure? What's next, a set of taxation rules designed for CompuServe?

The one bit of good, forward-thinking news - which came as a separate announcement from Intellectual Property Minister David Lammy - is that apparently the UK government is ready to abandon the "three strikes" idea for punishing file-sharers - it's too complicated (Yes, Minister rules!) to legislate. And it's sort of icky, arresting teenagers in their bedrooms, even if the EU doesn't see anything wrong with that and the Irish have decided to go ahead with it.

The interim report bundles together issues concerning digital networks (broadband, wireless, infrastructure), digital television and radio, and digital content. It's the latter that's most contentious: the report proposes creating a Rights Agency intended to encourage good use (buying content) and discourage bad use (whatever infringes copyright law). The report seems to turn a blind eye to the many discussions of how copyright law should change. And then there's a bunch of stuff about whether Britain should have a second public service broadcaster to compete "for quality" with the BBC. How all these things cohere is muddy.

For a really scathing review of the interim report, see The Guardian, where Charles Arthur attacks not only the report's inclusion of DRM and a "rights agency" to collaborate on developing it, but its dirt-path approach to broadband speed and its proposed approach to network neutrality (which it calls "net neutrality", should you want to search the report to find out what it says).

The interim report favors allowing the kind of thing Virgin has talked about: making deals with content providers in which they're paid for guaranteed service levels. That turns the problem of who will pay for high-speed fiber into a game of pass-the-parcel. Most likely, consumers will end up paying, whether that money goes to content providers or ISPs. If the BBC pays for the iPlayer, so do we, through the TV license. If ISPs pay, we pay in higher bandwidth charges. If we're going to pay for it anyway, why shouldn't we have the freedom of the Internet in return?

This is especially true because we do not know what's going to come next or how people will use it. When YouTube became the Next Big Thing, oh, say, three or four years ago, it was logical to assume that all subsequent Next Big Things were going to be bandwidth hogs. The next NBT turned out to be Twitter, which is pretty much your diametrical opposite. Now, everything is social media - but if there's one thing we know about the party on the Internet it's that it keeps on moving on.

There's plenty that's left out of this interim report. There's a discussion of spectrum licensing that doesn't encompass newer ideas about spectrum allocation. It talks about finding new business models for rightsholders (without propping up obsolete ones) and the "sea of unlawful activity in which they have to swim", and it mentions ISPs - but it leaves out consumers except as "customers" or illegal copiers. It nods at the notion that almost anyone can be a creator and find distribution, but still persists in talking of customers and rightsholders as if they were never the same people.

No one ever said predicting the future was easy, least of all Niels Bohr, but it does help if you start by noticing the present.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series. Readers are welcome to post here, at net.wars home, at her personal blog, or by email to netwars@skeptic.demon.co.uk (but please turn off HTML).

December 5, 2008

Saving seeds

The 17 judges of the European Court of Human Rights ruled unanimously yesterday that the UK's DNA database, which contains more than 3 million DNA samples, violates Article 8 of the European Convention on Human Rights. The key factor: retaining, indefinitely, the DNA samples of people who have committed no crime.

It's not a complete win for objectors to the database, since the ruling doesn't say the database shouldn't exist, merely that DNA samples should be removed once their owners have been acquitted in court or the charges have been dropped. England, the court said, should copy Scotland, which operates such a policy.

The UK comes in for particular censure, in the form of the note that "any State claiming a pioneer role in the development of new technologies bears special responsibility for striking the right balance..." In other words, before you decide to be the first on your block to use a new technology and show the rest of the world how it's done, you should think about the consequences.

Because it's true: this is the kind of technology that makes surveillance and control-happy governments the envy of other governments. For example: lacking clues to lead them to a serial killer, the Los Angeles Police Department wants to copy Britain and use California's DNA database to search for genetic profiles similar enough to belong to a close relative. The French DNA database, FNAEG, was proposed in 1996, created in 1998 for sex offenders, implemented in 2001, and broadened to other criminal offenses after 9/11 and again in 2003: a perfect example of function creep. But the French DNA database is a fiftieth the size of the UK's, and Austria's, the next on the list, is even smaller.

There are some wonderful statistics about the UK database. DNA samples from more than 4 million people are included on it. Probably 850,000 of them are innocent of any crime. Some 40,000 are children between the ages of 10 and 17. The government (according to the Telegraph) has spent £182 million on it between April 1995 and March 2004. And there have been suggestions that it's too small. When privacy and human rights campaigners pointed out that people of color are disproportionately represented in the database, one of England's most experienced appeals court judges, Lord Justice Sedley, argued that every UK resident and visitor should be included on it. Yes, that's definitely the way to bring the tourists in: demand a DNA sample. Just look how they're flocking to the US to give fingerprints, and how many more flooded in when they upped the number to ten earlier this year. (And how little we're getting for it: in the first two years of the program, fingerprinting 44 million visitors netted 1,000 people with criminal or immigration violations.)

At last week's A Fine Balance conference on privacy-enhancing technologies, there was a lot of discussion of the key technique of data minimization. That is the principle that you should not collect or share more data than is actually needed to do the job. Someone checking whether you have the right to drive, for example, doesn't need to know who you are or where you live; someone checking you have the right to borrow books from the local library needs to know where you live and who you are but not your age or your health records; someone checking you're the right age to enter a bar doesn't need to care if your driver's license has expired.
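The principle is simple enough to sketch. In this toy example (the record, field names, and checks are all hypothetical; real systems like the credential schemes mentioned below use cryptography to enforce this), each verifier gets only the answer to its own question, never the full record:

```python
# Toy illustration of data minimization: the bar asking "over 18?" gets
# a yes/no answer, not the birth date - let alone name, address, or
# licence status. All data here is invented.

from datetime import date

record = {
    "name": "A. Resident",
    "birth_date": date(1990, 5, 1),
    "address": "1 Example Street",
    "licence_expiry": date(2030, 1, 1),
}

def over_18(rec: dict, today: date) -> bool:
    """Answer only the predicate; disclose nothing else."""
    born = rec["birth_date"]
    age = today.year - born.year - ((today.month, today.day) < (born.month, born.day))
    return age >= 18

print(over_18(record, date(2009, 4, 11)))  # True - and that is all the bar learns
```

The cryptographic versions go further - the verifier cannot even link two visits by the same person - but even this plain-software discipline of answering the question asked, and only that question, would be an improvement on handing over whole records.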

This is an idea that's been around a long time - I think I heard my first presentation on it in about 1994 - but whose progress towards a usable product has been agonizingly slow. IBM's PRIME project, which Jan Camenisch presented, and Microsoft's purchase of Credentica (which wasn't shown at the conference) suggest that the mainstream technology products may finally be getting there. If only we can convince politicians that these principles are a necessary adjunct to storing all the data they're collecting.

What makes the DNA database more than just a high-tech fingerprint database is that over time the DNA stored in it will become increasingly revealing of intimate secrets. As Ray Kurzweil kept saying at the Singularity Summit, Moore's Law is hitting DNA sequencing right now; the cost is accordingly plummeting by factors of ten. When the database was set up, it was fair to characterize DNA as a high-tech version of fingerprints or iris scans. Five - or 15, or 25, we can't be sure - years from now, we will have learned far more about interpreting genetic sequences. The coded, unreadable messages we're storing now will be cleartext one day, and anyone allowed to consult the database will be privy to far more intimate information about our bodies, ourselves than we think we're giving them now.

Unfortunately, the people in charge of these things typically think it's not going to affect them. If the "little people" have no privacy, well, so what? It's only when the powers they've granted are turned on them that they begin to get it. If a conservative is a liberal who's been mugged, and a liberal is a conservative whose daughter has needed an abortion, and a civil liberties advocate is a politician who's been arrested...maybe we need to arrest more of them.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series. Readers are welcome to post here, at net.wars home, at her personal blog, or by email to netwars@skeptic.demon.co.uk (but please turn off HTML).

July 4, 2008

The new normal

The (only) good thing about a war is you can tell when it's over.

The problem with the "War on Terror" is that terrorism is always with us, as Liberty's director, Shami Chakrabarti, said yesterday at the Homeland and Border Security 08 conference. "I do think the threat is very serious. But I don't think it can be addressed by a war." Because, "We, the people, will not be able to verify a discernible end."

The idea that "we are at war" has justified so much post-9/11 legislation, from the ID card (in the UK) and Real ID (US) to the continued expansion of police powers.

How long can you live in a state of emergency before emergency becomes the new normal? If there is no end, when do you withdraw the latitude wartime gives a government?

Several of yesterday's speakers talked about preserving "our way of life" while countering the threat with better security. But "our way of life" is a moving target.

For example, Baroness Pauline Neville-Jones, the shadow security minister, talked about the importance of controlling the UK's borders. "Perimeter security is absolutely basic." Her example: you can't go into a building without having your identity checked. But it's not so long ago - within the 18 years I've been living in London - that you could do exactly that, even sometimes in central London. In New York, of course, until 9/11, everything was wide open; these days midtown Manhattan makes you wait in front of barriers while you're photographed, checked, and treated with great suspicion if the person you're visiting doesn't answer the phone.

Only seven years ago, flying did not involve two hours of standing in line. Until next January, tourists will not have to register three days before flying to the US for pre-screening.

It's not clear how much would change with a Conservative government. "There is a very great deal by this government we would continue," said Neville-Jones. But, she said, besides tackling threats, whether motivated (terrorists) or not (floods, earthquakes), "we are also at any given moment in the game of deciding what kind of society we want to have and what values we want to preserve." She wants "sustainable security, predicated on protecting people's freedom and ensuring they have more, not less, control over their lives." And, she said, "While we need protective mechanisms, the surveillance society is not the route down which we should go. It is absolutely fundamental that security and freedom lie together as an objective."

To be sure, Neville-Jones took issue with some of the present government's plans - the Conservatives would not, she said, go ahead with the National Identity Register, and they favour "a more coherent and wide-ranging border security force". The latter would mean bringing together many currently disparate agencies to create a single border strategy. The Conservatives also favour establishing a small "homeland command for the armed forces" within the UK because, "The qualities of the military and the resources they can bring to complex situations are important and useful." At the moment, she said, "We have to make do with whoever happens to be in the country."

OK. So take the four core elements of the national security strategy according to Admiral Lord Alan West, a Parliamentary under-secretary of state at the Home Office: pursue, protect, prepare, and prevent. "Prevent" is the one that all this is about. If we are in wartime, and we know that any measure that's brought in is only temporary, our tolerance for measures that violate the normal principles of democracy is higher.

Are the Olympics wartime? Security is already in the planning stages, although, as Tarique Ghaffur pointed out, the Games are one of several big events in 2012. And some events like sailing and Olympic football will be outside London, as will 600 training camps. Add in the torch relay, and it's national security.

And in that case, we should be watching very closely what gets brought in for the Olympics, because alongside the physical infrastructure that the Games always leave behind - the stadia and transport - may be a security infrastructure that we wouldn't necessarily have chosen for daily life.

As if the proposals in front of us aren't bad enough. Take for example, the clause of the counterterrorism bill (due for its second reading in the Lords next week) that would allow the authorities to detain suspects for up to 42 days without charge. Chakrabarti lamented the debate over this, which has turned into big media politics.

"The big frustration," she said, "is that alternatives created by sensible, proportionate means of early intervention are being ignored." Instead, she suggested, make the data legally collected by surveillance and interception admissible in fair criminal trials. Charge people with precursor terror offenses so they are properly remanded in custody and continue the investigation for the more serious plot. "That is a way of complying with ancient principles that you should know what you are accused of before being banged up, but it gives the police the time and powers they need."

Not being at war gives us the time to think. We should take it.


June 27, 2008

Mistakes were made

This week we got the detail on what went wrong at Her Majesty's Revenue and Customs that led to the loss of those two CDs full of the personal details of 25 million British households last year with the release of the Poynter Review (PDF). We also got a hint of how and whether the future might be different with the publication yesterday of Data Handling: Procedures in Government (PDF), written by Sir Gus O'Donnell and commissioned by the Prime Minister after the HMRC loss. The most obvious message of both reports: government needs to secure data better.

The nicest thing the Poynter review said was that HMRC has already made changes in response to its criticisms. Otherwise, it was pretty much a surgical demonstration of "institutional deficiencies".

The chief points:


- Security was not HMRC's top priority.

- HMRC in fact had the technical ability to send only the selection of data that NAO actually needed, but the staff involved didn't know it.

- There was no designated single point of contact between HMRC and NAO.

- HMRC used insecure methods for data storage and transfer.

- The decision to send the CDs to the NAO was taken by junior staff without consulting senior managers - which under HMRC's own rules they should have done.

- The reason HMRC's junior staff did not consult managers was that they believed (wrongly) that NAO had absolute authority to access any and all information HMRC had.

- The HMRC staffer who dispatched the discs incorrectly believed the TNT Post service was secure and traceable, as required by HMRC policy. A different TNT service that met those requirements was in fact available.

- HMRC policies regarding information security and the release of data were not communicated sufficiently through the organization and were not sufficiently detailed.

- HMRC failed on accountability, governance, information security...you name it.

The real problem, though, isn't any single one of these things. If junior staff had consulted senior staff, it might not have mattered that they didn't know what the policies were. If HMRC had used proper information security and secure methods for data storage (that is, encryption rather than simple password protection), the data on the lost discs would have been unreadable. If they'd understood TNT's services correctly, the discs wouldn't have gotten lost - or would at least have been traceable if they had.

The real problem was the interlocking effect of all these factors. That, as Nassim Nicholas Taleb might say, was the black swan.

For those who haven't read Taleb's The Black Swan: The Impact of the Highly Improbable, the black swan stands for the event that is completely unpredictable - because, like black swans until one was spotted in Australia, no such thing has ever been seen - until it happens. Of course, data loss is pretty much a white swan; we've seen lots of data breaches. The black swan, really, is the perfectly secure system that is still sufficiently open for the people who need to use it.

That challenge is what O'Donnell's report on data handling is about and, as he notes, it's going to get harder rather than easier. He recommends a complete rearrangement of how departments manage information as well as improving the systems within individual departments. He also recommends greater openness about how the government secures data.

"No organisation can guarantee it will never lose data," he writes, "and the Government is no exception." O'Donnell goes on to consider how data should be protected and managed, not whether it should be collected or shared in the first place. That job is being left for yet another report in progress, due soon.

It's good to read that some good is coming out of the HMRC data loss: all departments are, according to the O'Donnell report, reviewing their data practices and beginning the process of cultural change. That can only be a good thing.

But the underlying problem is outside the scope of these reports, and it's this government's fondness for creating giant databases: the National Identity Register, ContactPoint, the DNA database, and so on. If the government really accepted the principle that it is impossible to guarantee complete data security, what would they do? Logically, they ought to start by cancelling the data behemoths on the understanding that it's a bad idea to base public policy on the idea that you can will a black swan into existence.

It would make more sense to create a design for government use of data that assumes there will be data breaches and attempts to limit the adverse consequences for the individuals whose data is lost. If my privacy is compromised alongside 50 million other people's and I am the victim of identity theft, does it help me that the government department that lost the data knows which staff member to blame?

As Agatha Christie said long ago in one of her 80-plus books, "I know to err is human, but human error is nothing compared to what a computer can do if it tries." The man-machine combination is even worse. We should stop trying to breed black swans and instead devise systems that don't create so many white ones.


May 30, 2008

Ten

It's easy to found an organization; it's hard to keep one alive even for as long as ten years. This week, the Foundation for Information Policy Research celebrated its tenth birthday. Ten years is a long time in Internet terms, and even longer when you're trying to get government to pay attention to expertise in a subject as difficult as technology policy.

My notes from the launch contain this quote from FIPR's first director, Caspar Bowden, which shows you just how difficult FIPR's role was going to be: "An educational charity has a responsibility to speak the truth, whether it's pleasant or unpleasant." FIPR was intended to avoid the narrow product focus of corporate laboratory research and retain the traditional freedoms of an academic lab.

My notes also show the following list of topics FIPR intended to research: the regulation of electronic commerce; consumer protection; data protection and privacy; copyright; law enforcement; evidence and archiving; electronic interaction between government, businesses, and individuals; the risks of computer and communications systems; and the extent to which information technologies discriminate against the less advantaged in society. Its first concern was intended to be researching the underpinnings of electronic commerce, including the then recent directive launched for public consultation by the European Commission.

In fact, the biggest issue of FIPR's early years was the crypto wars leading up to and culminating in the passage of the Regulation of Investigatory Powers Act (2000). It's safe to say that RIPA would have been a lot worse without the time and energy Bowden spent listening to Parliamentary debates, decoding consultation papers, and explaining what it all meant to journalists, politicians, civil servants, and anyone else who would listen.

Not that RIPA is a fountain of democratic behavior even as things are. In the last couple of weeks we've seen the perfect example of the kind of function creep that FIPR and Privacy International warned about at the time: the Poole council using the access rules in RIPA to spy on families to determine whether or not they really lived in the right catchment area for the schools their children attend.

That use of the RIPA rules, Bowden said at FIPR's half-day anniversary conference last Wednesday, sets a precedent for accessing traffic data for much lower level purposes than the government originally claimed it was collecting the data for. He went on to call the recent suggestion that the government may be considering a giant database, updated in real time, of the nation's communications data "a truly Orwellian nightmare of data mining, all in one place."

Ross Anderson, FIPR's founding and current chair and a well-known security engineer at Cambridge, noted that the same risks adhere to the NHS database. A clinic that owns its own data will tell police asking for the names of all its patients under 16 to go away. "If," said Anderson, "it had all been in the NHS database and they'd gone in to see the manager of BT, would he have been told to go and jump in the river? The mistake engineers make too much is to think only technology matters."

That point was part of a larger one that Anderson made: that hopes that the giant databases under construction will collapse under their own weight are forlorn. Think of developing Hulk-Hogan databases and the algorithms for mining them as an arms race, just like spam and anti-spam. The same principle that holds that today's cryptography, no matter how strong, will eventually be routinely crackable means that today's overload of data will eventually, long after we can remember anything we actually said or did ourselves, be manageable.

The most interesting question is: what of the next ten years? Nigel Hickson, now with the Department of Business, Enterprise, and Regulatory Reform, gave some hints. On the European and international agenda, he listed the returning dominance of the large telephone companies on the excuse that they need to invest in fiber. We will be hearing about quality of service and network neutrality. Watch Brussels on spectrum rights. Watch for large debates on the liability of ISPs. Digital signatures, another battle of the late 1990s, are also back on the agenda, with draft EU proposals to mandate them for the public sector and other services. RFID, the "Internet for things" and the ubiquitous Internet will spark a new round of privacy arguments.

Most fundamentally, said Anderson, we need to think about what it means to live in a world that is ever more connected through evolving socio-technological systems. Government can help when markets fail; though governments themselves seem to fail most notoriously with large projects.

FIPR started by getting engineers, later engineers and economists, to talk through problems. "The next growth point may be engineers and psychologists," he said. "We have to progressively involve more and more people from more and more backgrounds and discussions."

Probably few people feel that their single vote in any given election really makes a difference. Groups like FIPR, PI, No2ID, and ARCH remind us that even a small number of people can have a significant effect. Happy birthday.




May 23, 2008

The haystack conundrum

Early this week the news broke that the Home Office wants to create a giant database in which will be stored details of all communications sent in Britain. In other words, instead of data retention, in which ISPs, telephone companies, and other service providers would hang onto communications data for a year or seven in case the Home Office wanted it, everything would stream to a Home Office data center in real time. We'll call it data swallowing.

Those with long memories - who seem few and far between in the national media covering this sort of subject - will remember that in about 1999 or 2000 there was a similar rumor. In the resulting outraged media coverage it was more or less thoroughly denied and nothing had been heard of it since, though privacy advocates continued to suspect that somewhere in the back of a drawer the scheme lurked, dormant, like one of those just-add-water Martians you find in the old Bugs Bunny cartoons. And now here it is again in another leak that the suspicious veteran watcher of Yes, Minister might think was an attempt to test public opinion. The fact that it's been mooted before makes it seem so much more likely that they're actually serious.

This proposal is not only expensive, complicated, slow, and controversial/courageous (Yes, Minister's Fab Four deterrents), but risk-laden, badly conceived, disproportionate, and foolish. Such a database will not catch terrorists, because given the volume of data involved trying to use it to spot any one would-be evil-doer will be the rough equivalent of searching for an iron filing in a haystack the size of a planet. It will, however, make it possible for anyone trawling the database to make any given individual's life thoroughly miserable. That's so disproportionate it's a divide-by-zero error.

The risks ought to be obvious: this is a government that can't keep track of the personal details of 25 million households, which fit on a couple of CDs. Devise all the rules and processes you want, the bigger the database the harder it will be to secure. Besides personal information, the giant communications database would include businesses' communication information, much of it likely to be commercially sensitive. It's pretty good going to come up with a proposal that equally offends civil liberties activists and businesses.

In a short summary of the proposed legislation, we find this justification: "Unless the legislation is updated to reflect these changes, the ability of public authorities to carry out their crime prevention and public safety duties and to counter these threats will be undermined."

Sound familiar? It should. It's the exact same justification we heard in the late 1990s for requiring key escrow as part of the nascent Regulation of Investigatory Powers Act. The idea there was that if the use of strong cryptography to protect communications became widespread, law enforcement and security services would be unable to read the content of the messages and phone calls they intercepted. This argument was fiercely rejected at the time, and key escrow was eventually dropped in favor of requiring the subjects of investigation to hand over their keys under specified circumstances.

There is much, much less logic to claiming that police can't do their jobs without real-time copies of all communications. Here we have real analogies: postal mail, which has been with us since 1660. Do we require copies of all letters that pass through the post office to be deposited with the security services? Do we require the Royal Mail's automated sorting equipment to log all address data?

Sanity has never intervened in this government's plans to create more and more tools for surveillance. Take CCTV. Recent studies show that despite the millions of pounds spent on deploying thousands of cameras all over the UK, they don't cut crime, and, more important, the images help solve crime in only 3 percent of cases. But you know the response to this news will not be to remove the cameras or stop adding to their number. No, the thinking will be like the scheme I once heard for selling harmless but ineffective alternative medical treatments, in which the answer to all outcomes is more treatment. (Patient gets better - treatment did it. Patient stays the same - treatment has halted the downward course of the disease. Patient gets worse - treatment came too late.)

This week at Computers, Freedom, and Privacy, I heard about the Electronic Privacy Information Center's work on fusion centers, relatively new US government efforts to mine many commercial and public sources of data. EPIC is trying to establish the role of federal agencies in funding and controlling these centers, but it's hard going.

What do these governments imagine they're going to be able to do with all this data? Is the fantasy that agents will be able to sit in a control room somewhere and survey it all on some kind of giant map on which criminals will pop up in red, ready to be caught? They had data before 9/11 and failed to collate and interpret it.

Iron filing; haystack; lack of a really good magnet.


May 9, 2008

Swings and roundabouts

There was a wonderful cartoon that cycled frequently around computer science departments in the pre-Internet 1970s - I still have my paper copy - that graphically illustrated the process by which IT systems get specified, designed, and built, and showed precisely why and how far they failed the user's inner image of what it was going to be. There is a scan here. The senior analyst wanted to make sure no one could possibly get hurt; the sponsor wanted a pretty design; the programmers, confused by contradictory input, wrote something that didn't work; and the installation was hideously broken.

Translate this into the UK's national ID card. Consumers, Sir James Crosby wrote in March (PDF), want identity assurance. That is, they - or rather, we - want to know that we're dealing with our real bank rather than a fraud. We want to know that the thief rooting through our garbage can't use any details he finds on discarded utility bills to impersonate us, change our address with our bank, clean out our accounts, and take out 23 new credit cards in our name before embarking on a wild spending spree leaving us to foot the bill. And we want to know that if all that ghastliness happens to us we will have an accessible and manageable way to fix it.

We want to swing lazily on the old tire and enjoy the view.

We are the users with the seemingly simple but in reality unobtainable fantasy.

The government, however - the project sponsor - wants the three-tiered design that barely works because of all the additional elements in the design but looks incredibly impressive. ("Be the envy of other major governments," I feel sure the project brochure says.) In the government's view, they are the users and we are the database objects.

Crosby nails this gap when he draws the distinction between ID assurance and ID management:

The expression 'ID management' suggests data sharing and database consolidation, concepts which principally serve the interests of the owner of the database, for example, the Government or the banks. Whereas we think of "ID assurance" as a consumer-led concept, a process that meets an important consumer need without necessarily providing any spin-off benefits to the owner of any database.

This distinction is fundamental. An ID system built primarily to deliver high levels of assurance for consumers and to command their trust has little in common with one inspired mainly by the ambitions of its owner. In the case of the former, consumers will extend use both across the population and in terms of applications such as travel and banking. While almost inevitably the opposite is true for systems principally designed to save costs and to transfer or share data.

As writer and software engineer Ellen Ullman wrote in her book Close to the Machine, databases infect their owners, who may start with good intentions but are ineluctably drawn to surveillance.

So far, the government pushing the ID card seems to believe that it can impose anything it likes and if it means the tree collapses with the user on the swing, well, that's something that can be ironed out later. Crosby, however, points out that for the scheme to achieve any of the government's national security goals it must get mass take-up. "Thus," he writes, "even the achievement of security objectives relies on consumers' active participation."

This week, a similarly damning assessment of the scheme was released by the Independent Scheme Assurance Panel (PDF) (you may find it easier to read this clean translation - scroll down to policywatcher's May 8 posting). The gist: the government is completely incompetent at handling data, and creating massive databases will, as a result, destroy public trust in it and all its systems.

Of course, the government is in a position to compel registration, as it's begun doing with groups who can't argue back, like foreigners, and proposes doing for employees in "sensitive roles or locations, such as airports". But one of the key indicators of how little its scheme has to do with the actual needs and desires of the public is the list of questions it's asking in the current consultation on ID cards, which focus almost entirely on how to get people to love, or at least apply for, the card. To be sure, the consultation document pays lip service to accepting comments on any ID card-related topic, but the consultation is specifically about the "delivery scheme".

This is the kind of consultation where we're damned if we do and damned if we don't. Submit comments on, for example, how best to "encourage" young people to sign up ("Views are invited particularly from young people on the best way of rolling out identity cards to them") - swallowing your distaste at being asked how best to market an unloved policy to vulnerable groups - and when the responses are eventually released the government can claim there are now no objectors to the scheme. Submit comments to the effect that the whole National Identity scheme is poorly conceived and inappropriate, and anything else you say is likely to be ignored on the grounds that they've heard all that and it's irrelevant to the present consultation. Comments are due by June 30.



March 14, 2008

Uninformed consent

Apparently the US Congress is now being scripted by Jon Stewart of the Daily Show. In a twist of perfect irony, the House of Representatives has decided to hold its first closed session in 25 years to debate - surveillance.

But it's obvious why they want closed doors: they want to talk about the AT&T case. To recap: AT&T is being sued for its complicity in the Bush administration's warrantless surveillance of US citizens after its technician Mark Klein blew the whistle by taking documents to the Electronic Frontier Foundation (which a couple of weeks ago gave him a Pioneer Award for his trouble).

Bush has, of course, resisted any effort to peer into the innards of his surveillance program by claiming it's all a state secret, and that's part of the point of this Congressional move: the Democrats have fielded a bill that would give the whole program some more oversight and, significantly, reject the idea of giving telecommunications companies - that is, AT&T - immunity from prosecution for breaking the law by participating in warrantless wiretapping. 'Snot fair that they should deprive us of the fun of watching the horse-trading. It can't, surely, be that they think we'll be upset by watching them slag each other off. In an election year?

But it's been a week for irony, as Wikipedia founder Jimmy Wales has had his sex life exposed when he dumped his girlfriend and been accused of - let's call it sloppiness - in his expense accounts. Worse, he stands accused of trading favorable page edits for cash. There's always been a strong element of Schadenpedia around, but the edit-for-cash thing really goes to the heart of what Wikipedia is supposed to be about.

I suspect that nonetheless Wikipedia will survive it: if the foundation has the sense it seems to have, it will display zero tolerance. But the incident has raised valid questions about how Wikipedia can possibly sustain itself financially going forward. The site is big and has enviable masses of traffic; but it sells no advertising, choosing instead to live on hand-outs and the work of volunteers. The idea, I suppose, is that accepting advertising might taint the site's neutral viewpoint, but donations can do the same thing if they're not properly walled off: just ask the US Congress. It seems to me that an automated advertising system they did not control would be, if anything, safer. And then maybe they could pay some of those volunteers, even though it would be a pity to lose some of the site's best entertainment.

With respect to advertising, it's worth noting that Phorm is under increasing pressure. Earlier this week, we had an opportunity to talk to Kent Ertegrul, CEO of Phorm, who continues to maintain that Phorm's system, because it does not store data, is more protective of privacy than today's cookie-driven Web. This may in fact be true.

Less certain is Ertegrul's belief that the system does not contravene the Regulation of Investigatory Powers Act, which lays down rules about interception. Ertegrul has some support from an informal letter from the Home Office whose reasoning seems to be that if users have consented and have been told how they can opt out, it's legal. Well, we'll see; there's a lot of debate going on about this claim and it will be interesting to hear the Information Commissioner's view. If the Home Office's interpretation is correct, it could open a lot of scope for abusive behavior imposed upon users simply by adding it to the terms of service to which they theoretically consent when they sign up; a UK equivalent of AT&T wanting to assist the government with wholesale warrantless wiretapping would have only to add it to the terms of service.

The real problem is that no one really knows how Phorm's system works. Phorm doesn't retain your IP address, but the ad servers surely have to know it when they're sending you ads. If you opt out but can still opt back in (as Ertegrul said you can), doesn't that mean you still have a cookie on your system and that your data is still passed to Phorm's system, which discards it instead of sending you ads? If that's the case, doesn't that mean you cannot opt out of having your data shared? If that isn't how it works, then how does it work? I thought I understood it after talking to Ertegrul, I really did - and then someone asked me to explain how Phorm's cookie's usefulness persisted between sessions, and I wasn't sure any more. I think the Open Rights Group has it right: Phorm should publish details of how its system works for experts to scrutinize. Until Phorm does that, the misinformation Ertegrul is so upset about will continue. (More disclosure: I am on ORG's Advisory Council.)

But maybe the Home Office is on to something. Bush could solve his whole problem by getting everyone to give consent to being surveilled at the moment they take US citizenship. Surely a newborn baby's footprint is sufficient agreement?

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series. Readers are welcome to post here, at net.wars home, at her personal blog, or by email to netwars@skeptic.demon.co.uk (but please turn off HTML).

March 7, 2008

Techitics

This year, 2008, may go down in history as the year geeks got politics. At etech this week I caught a few disparaging references to hippies' efforts to change politics. Which, you know, seemed kind of unfair, for two reasons. First: the 1960s generation did change an awful lot of things, though not nearly as many as they hoped. Second: a lot of those hippies are geeks now.

But still. Give a geek something that's broken and he'll itch to fix it. And one thing leads to another. Which is why on Wednesday night Lawrence Lessig explained in an hour-long keynote that got a standing ovation how he plans to fix what's wrong with Congress.

No, he's not going to run. Some 4,500 people on Facebook were trying to push him into it, and he thought about it, but preliminary research showed that his chances of beating popular Silicon Valley favorite, Jackie Speier, were approximately zero.

"I wasn't afraid of losing," he said, noting ruefully that in ten years of copyfighting he's gotten good at it. Instead, the problem was that Silicon Valley insiders would have known that no one was going to beat Jackie Speier. But outsiders would have pointed, laughed, and said, "See? The idea of Congressional reform has no legs." And on to business as usual. So, he said, counterproductive to run.

Instead, he's launching Change Congress. "Obama has taught us that it's possible to imagine many people contributing to real change."

The point, he said, will be to provide a "signalling function". Like Creative Commons, Change Congress will give candidates an easy way to show what level of reform they're willing to commit to. The system will start with three options: 1) refuse money from lobbyists and political action committees (private funding groups); 2) ban earmarks (money allocated to special projects in politicians' home states); 3) commit to public financing for campaigns. Candidates can then display the badge generated from those choices on their campaign materials.

From there, said Lessig, layer something like Emily's List on top, to help people identify candidates they're willing to support with monthly donations, thereby subsidizing reform.

Money, he admitted, isn't the entire problem. But, like drinking for an alcoholic, it's the first problem you must solve to be able to tackle any of the others with any hope of success.

In a related but not entirely similar vein, the guys who brought us They Work For You nearly four years ago are back with UNdemocracy, an attempt to provide a signalling function to the United Nations by making it easy to find out how your national representatives are voting in UN meetings. The driving force behind UNdemocracy.com is Liverpool's Julian Todd, who took the UN's URL obscurantism as a personal challenge. Since he doesn't fly, presenting the new service were Tom Loosemore, Stefan Mogdalinski, and Danny O'Brien, who pointed out that when you start looking at the decisions and debates you start to see strange patterns: what do the US and Israel have in common with Palau and Micronesia?

The US Congress and the British Parliament are all, they said, now well accustomed to being televised, and their behaviour has adapted to the cameras. At the UN, "They don't think they're being watched at all, so you see horse trading in a fairly raw form."

The meta-version they believe can be usefully and widely applied: 1) identify broken civic institution; 2) liberate data from said institution. There were three more ingredients, but the slide vanished too quickly. Mogdalinski noted that in the past they have said "Ask forgiveness, not permission", alluding to the fact that most institutions, if approached, behave as though they own the data; he's less inclined to apologise now. After all, isn't it *our* data that's being released in the public interest?

Data isn't everything. But the Net community has come a long way since the early days, when the prevailing attitude was that technological superiority would wash away politics-as-usual by simply making an end run around any laws governments tried to pass. Yes, technology can change the equation a whole lot. For example, once PGP escaped, laws limiting the availability of strong encryption were pretty much doomed to fail (though not without a lot of back-and-forth before it became official). Similarly, in the copyright wars it's clear that copyrighted material will continue to leak out no matter how hard they try to protect it.

But those are pretty limited bits of politics. Technology can't make such an easy end run around laws that keep shrinking the public domain. Nor can it by itself solve policies that deny the reality of global climate change or that, in one of Lessig's examples, backed government recommendations on daily sugar intake off from 10 percent of calories to 25 percent. Or that, in another of his examples, kept then Vice-President Al Gore from succeeding with a seventh part to the 1996 Communications Act deregulating ADSL and cable - because with nothing left to regulate, what would Congressmen do without the funds those lobbyists were sending their way? Hence, the new approach.

"Technology," Lessig said, "doesn't solve any problems. But it is the only tool we have to leverage power to effect change."

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

February 22, 2008

Strikeout

There is a certain kind of mentality that is actually proud of not understanding computers, as if there were something honorable about saying grandly, "Oh, I leave all that to my children."

Outside of computing, only television gets so many people boasting of their ignorance. Do we boast how few books we read? Do we trumpet our ignorance of other practical skills, like balancing a cheque book, cooking, or choosing wine? When someone suggests we get dressed in the morning do we say proudly, "I don't know how"?

There is so much insanity coming out of the British government on the Internet/computing front at the moment that the only possible conclusion is that the government is made up entirely of people who are engaged in a sort of reverse pissing contest with each other: I can compute less than you can, and see? here's a really dumb proposal to prove it.

How else can we explain yesterday's news that the government is determined to proceed with Contactpoint even though the report it commissioned and paid for from Deloitte warns that the risk of storing the personal details of every British child under 16 can only be managed, not eliminated? Lately, it seems that there's news of a major data breach every week. But the present government is like a batch of 20-year-olds who think that mortality can't happen to them.

Or today's news that the Department of Culture, Media, and Sport has launched its proposals for "Creative Britain", and among them is a very clear diktat to ISPs: deal with file-sharing voluntarily or we'll make you do it. By April 2009. This bit of extortion nestles in the middle of a bunch of other stuff about educating schoolchildren about the value of intellectual property. Dare we say: if there were one thing you could possibly do to ensure that kids sneer at IP, it would be to teach them about it in school.

The proposals are vague in the extreme about what kind of regulation the DCMS would accept as sufficient. Despite the leaks of last week, culture secretary Andy Burnham has told the Financial Times that the "three strikes" idea was never in the paper. As outlined by Open Rights Group executive director Becky Hogge in New Statesman, "three strikes" would mean that all Internet users would be tracked by IP address and warned by letter if they are caught uploading copyrighted content. After three letters, they would be disconnected. As Hogge says (disclosure: I am on the ORG advisory board), the punishment will fall equally on innocent bystanders who happen to share the same house. Worse, it turns ISPs into a squad of private police for a historically rapacious industry.

Charles Arthur, writing in yesterday's Guardian, presented the British Phonographic Industry's case about why the three strikes idea isn't necessarily completely awful: it's better than being sued. (These are our choices?) ISPs, of course, hate the idea: this is an industry with nanoscale margins. Who bears the liability if someone is disconnected and starts to complain? What if they sue?

We'll say it again: if the entertainment industries really want to stop file-sharing, they need to negotiate changed business models and create a legitimate market. Many people would be willing to pay a reasonable price to download TV shows and music if they could get in return reliable, fast, advertising-free, DRM-free downloads at or soon after the time of the initial release. The longer the present situation continues the more entrenched the habit of unauthorized file-sharing will become and the harder it will be to divert people to the legitimate market that eventually must be established.

But the key damning bit in Arthur's article (disclosure: he is my editor at the paper) is the BPI's admission that they cannot actually say that ending file-sharing would make sales grow. The best the BPI spokesman could come up with is, "It would send out the message that copyright is to be respected, that creative industries are to be respected and paid for."

Actually, what would really do that is a more balanced copyright law. Right now, the law is so far from what most people expect it to be - or rationally think it should be - that it is breeding contempt for itself. And it is about to get worse: term extension is back on the agenda. The 2006 Gowers Review recommended against it, but on February 14, Irish EU Commissioner Charlie McCreevy (previously: champion of software patents) announced his intention to propose extending performers' copyright in sound recordings from the current 50-year term to 95 years. The plan seems to go something like this: whisk it past the Commission in the next two months. Then the French presidency starts and whee! new law! The UK can then say its hands are tied.

That change makes no difference to British ISPs, however, who are now under the gun to come up with some scheme to keep the government from clomping all over them. Or to the kids who are going to be tracked from cradle to alcopop by unique identity number. Maybe the first target of the government computing literacy programs should be...the government.


Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series. Readers are welcome to post here, at net.wars home, at her personal blog, or by email to netwars@skeptic.demon.co.uk (but please turn off HTML).

February 8, 2008

If you have ID cards, drink alcohol


One of the key identifiers of an addiction is that indulgence in it persists long after all the reasons for doing it have turned from good to bad.

A sobered-up Scottish alcoholic once told me the following exemplar of alcoholic thinking. A professor is lecturing to a class of alcoholics on the evils of drinking. To make his point, he takes two glasses, one filled with water, the other with alcohol. Into each glass he drops a live worm. The worm in the glass of water lives; the worm in the glass of alcohol dies.

"What," the professor asks, "can we learn from this?"

One of the alcoholics raises his hand. "If you have worms, drink alcohol."

In alcoholic thinking, of course, there is no circumstance in which the answer isn't "Drink alcohol."

So, too, with the ID card. The purpose as mooted between 2001 and 2004 was preventing benefit fraud and making life more convenient for UK citizens and residents. The plan promised perfect identification via the combination of a clean database (the National Identity Register) and biometrics (fingerprints and iris scans). The consultation document made a show of suggesting the cheaper alternative of a paper card with minimal data collection, but it was clear what they really wanted: the big, fancy stuff that would make them the envy of other major governments.

Opponents warned of the UK's poor track record with large IT projects, the privacy-invasiveness, and the huge amount such a system was likely to cost. Government estimates, now at £5.4 billion, have been slowly rising to meet Privacy International's original estimate of £6 billion.

By 2006, when the necessary legislation was passed, the government had abandoned the friendly "entitlement card" language and was calling it a national ID card. By then, also, the case had changed: less entitlement, more crime prevention.

It's 2008, and the wheels seem to be coming off. The government's original contention that the population really wanted ID cards has been shredded by the leaked documents of the last few weeks. In these, it's clear that the government knows the only way it will get people to adopt the ID card is by coercion, starting with the groups who are least able to protest by refusal: young people and foreigners.

Almost every element deemed important in the original proposal is now gone - the clean database populated through interviews and careful documentation (now the repurposed Department of Work and Pensions database); the iris scans (discarded); probably the fingerprints (too expensive except for foreigners). The one element that for sure remains is the one the government denied from the start: compulsion.

The government was always open about its intention for non-registration to become increasingly uncomfortable and eventually to make registration compulsory. But if the card is coming at least two years later than they intended, compulsion is ahead of schedule.

Of course, we've always maintained that the key to the project is the database, not the card. It's an indicator of just how much of a mess the project is that the Register, the heart of the system, was first to be scaled back because of its infeasibility. (I mean, really, guys. Interview and background-check the documentation of every one of 60 million people in any sort of reasonable time scale?)

The project is even fading in popularity with the very vendors who want to make money supplying the IT for it. How can you specify a system whose stated goals keep changing?

The late humorist and playwright Jean Kerr (probably now best known for her collection of pieces about raising five boys with her drama critic husband in a wacky old house in Larchmont, NY, Please Don't Eat the Daisies) once wrote a piece about the trials and tribulations of slogging through the out-of-town openings of one of her plays. In these pre-Broadway trial runs, lines get cut and revised; performances get reshaped and tightened. If the play is in trouble, the playwright gets no sleep for weeks. And then, she wrote, one day you look up at the stage, and, yes, the play is much better, and the performances are much better, and the audience seems to be having a good time. And yet - the play you're seeing on the stage isn't the play you had in mind at all.

It's one thing to reach that point in a project and retain enough perspective to be honest about it. It may be bad - but it isn't insane - to say, "Well, this play isn't what I had in mind, but you know, the audience is having a good time, and it will pay me enough to go away and try again."

But if you reach the point where the project you're pushing ahead clearly isn't any more the project you had in mind and sold hard, and yet you continue to pretend to yourself and everyone else that it is - then you have the kind of insanity problem where you're eating worms in order to prove you're not an alcoholic.

The honorable thing for the British government to do now is say, "Well, folks, we were wrong. Our opponents were right: the system we had in mind is too complicated, too expensive, and too unpopular because of its privacy-invasiveness. We will think again." Apparently they're so far gone that eating worms looks more sensible.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series. Readers are welcome to post here, at net.wars home, at her personal blog, or by email to netwars@skeptic.demon.co.uk (but please turn off HTML).

January 18, 2008

Harmony, where is thy sting?

On the Net, John Perry Barlow observed long ago, everything is local and everything is global, but nothing is national. It's one of those pat summations that sometimes is actually right. The EU, in the interests of competing successfully with the very large market that is the US, wants to harmonize the national laws that apply to content online.

They have a point. Today's market practices were created while the intangible products of human ingenuity still had to be fixed in a physical medium. It was logical for the publishers and distributors of said media to carve up the world into national territories. But today anyone trying to, say, put a song in an online store, or create a legal TV download service has to deal with a thicket of national collection societies and licensing authorities.

Where there's a problem there's a consultation document, and so there is in this case: the EU is giving us until February 29 (leap year!) to tell them what we think (PDF).

The biggest flaw in the consultation document is that the authors (who needed a good copy editor) seem to have bought wholesale the 2005 thinking of rightsholders (whom they call "right holders"). Fully a third of the consultation is on digital rights management: should it be interoperable, should there be a dispute resolution process, should SMEs have non-discriminatory access to these systems, should EULAs be easier to read?

Well, sure. But the consultation seems to assume that DRM is a) desirable and b) an endemic practice. We have long argued that it's not desirable; DRM is profoundly anti-consumer. Meanwhile, the industry is clearly fulfilling Naxos founder Klaus Heymann's April 2007 prophecy that DRM would be gone from online music within two years. DRM is far less of an issue now than it was in 2006, when the original consultation was launched. In fact, though, these questions seem to have been written less to aid consumers than to limit the monopoly power of iTunes.

That said, DRM will continue to be embedded in some hardware devices, most especially in the form of HDCP, a form of copy protection being built, invisibly to consumers until it gets in their way, into TV sets and other home video equipment. Unfortunately, because the consultation is focused on "Creative Content Online", such broader uses of DRM aren't included.

However, because of this and because some live streaming services similarly use DRM to prevent consumers from keeping copies of their broadcasts (and probably more will in future as Internet broadcasting becomes more widespread), public interest limitations on how DRM can be used seem like a wise idea. The problem with both DRM and EULAs is that the user has no ability to negotiate terms. The consultation leaves out an important consumer consideration: what should happen to content a consumer pays for and downloads that's protected with DRM if the service that sold it closes down? So far, subscribers lose it all.

The questions regarding multi-territory licensing are far more complicated, and I suspect answers to those depend largely on whether you're someone trying to clear rights for reuse, someone trying to protect your control over your latest blockbuster's markets, or someone trying to make a living as a creative person. The first of those clearly wants to buy one license rather than dozens. The second wants to sell dozens of licenses rather than one (unless it's for a really BIG sum of money). The third, who is probably part of the "Long Tail" mentioned in the question, may be very suspicious of any regime that turns everything he created before 2005 into "back catalogue works" that are subject to a single multi-territory license. Science fiction authors, for example, have long made significant parts of their income by selling their out-of-print back titles for reprint. An old shot in a photographer's long tail may be of no value for 30 years – until suddenly the subject emerges as a Presidential candidate. Any regime that is adopted must be flexible enough to recognize that copyrighted works have values that fluctuate unpredictably over time.

The final set of questions has to do with the law and piracy. Should we all follow France's lead and require ISPs to throw users offline if they're caught file-sharing more than three times? We have said all along that the best antidote to unauthorized copying is to make it easy for people to engage in authorized copying. If you knew, for example, that you could reliably watch the latest episode of The Big Bang Theory (if there ever is one) 24 hours after the US broadcast, would you bother chasing around torrent sites looking for a download that might or might not be complete? Technically, it's nonsense to think that ISPs can reliably distinguish an unauthorized download of copyrighted material from an authorized one; filtering cannot be the answer, no matter how much AT&T wants to kill itself trying. We would also remind the EU of the famed comment of another Old Netizen, John Gilmore: "The Internet perceives censorship as damage, and routes around it."

But of course no consultation can address the real problem, which isn't how to protect copyright online: it's how to encourage creators.

Wendy M. Grossman’s Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series. Readers are welcome to post here, at net.wars home, at her personal blog, or by email to netwars@skeptic.demon.co.uk (but please turn off HTML).

January 4, 2008

If God had meant us to vote...

It seems like a couple of years now that people in the UK have been asking me, "Do you think Hillary or Giuliani is going to win?" Sometimes they mention Obama. But it's like the meme of a few years ago about Arnold Schwarzenegger becoming president: the famous name dominates the coverage beyond all reason.

When the Schwarzenegger thing came up, I tried patiently to explain about the Constitution: to be elected president, you must have been born a US citizen. I assume the Founding Fathers, even without the benefit of having seen George Bernard Shaw's The Apple Cart, were worried that some English king would come over to the US, get himself naturalized, win the president's job in one of those democratic elections, and then push the country back to colonialism.

"They'll amend the Constitution," people said.

Well, not quite. It takes an incredible amount of effort to amend the Constitution: the prospective amendment has to pass both legislative houses by a two-thirds majority, and then three-quarters of the states. Often, there's a time limit of seven years, which is what eventually scuppered the Equal Rights Amendment. (Apparently the fastest-ever passage of an amendment, 107 days, was not getting Prohibition repealed but lowering the voting age to 18 during the Vietnam War.) While there was, apparently, an attempt in 2004 to introduce an amendment allowing foreign-born, naturalized citizens to become president, it's hard for me to believe even Schwarzenegger thinks he has a chance in his lifetime; he's 60 this year. Certainly, it's not a possibility people inside the US seem to take seriously.

That so many people outside the US think of the chief presidential candidates as Hillary, Obama, Giuliani, and Schwarzenegger tells you how little of the US's real politics seeps out to other countries. Fantasy politics all you want, sure: as many British friends as American ones bought into The West Wing's fictional White House.

Very fictional: Josiah Bartlett might have managed to get elected president despite being a Catholic (Kennedy) and having multiple sclerosis, but he'd never be able to overcome the twin disadvantages of being a Nobel Laureate in economics and, above all, being SHORT. Martin Sheen is 5 foot 7. You have to go all the way back to 1900 and William McKinley to find a president that small, and even then that was short by historical standards. In 1988, Michael Dukakis lost the election when moving around the debate podiums to shake hands with George H.W. Bush revealed that he barely came up to Bush's shoulder. Over and out.

Most reports guesstimate Obama at six feet, but Clinton reportedly clocks in at 5 foot 8 and a half – tallish for a woman, maybe, but not for a presidential candidate. Giuliani claims 5 foot 10 (though some observers claim he's shorter). And John Edwards comes in at 5 foot 10.

John Edwards? Things look very different from inside the US. Here, although Clinton, Obama, and Giuliani are still getting most of the headlines, there are plenty of other candidates to pick from even just within the Democratic party, all of whom look more like a US president usually looks: white, male, and middle-aged. Giuliani's best moment may have been when, as New Yorkers gleefully keep saying, his every sentence was summed up by Democratic hopeful Senator Joseph R. Biden as "a noun, a verb, and 9/11".

Yesterday's Iowa caucus – the first contest of the 2008 presidential election – gave us the first real data we've had. And reality started to hit: Giuliani polled 4 percent; the Republican front runner is former Arkansas governor Mike Huckabee, who suddenly came out of nowhere in the last few weeks. Clinton polled 30 percent, which sounds respectable until you find out she came third, narrowly below Edwards. Obama led with 37 percent. Lots more to go there.

Some more notes for the coming weeks:

- Giuliani was more or less hated in New York while he was mayor.

- Clinton, like her husband, was politically hated when she was First Lady, despite her exceptional star-name fundraising ability.

- Huckabee crossed WGA picket lines to appear on Jay Leno's Tonight Show on January 3 (without a deal with striking union writers, Huckabee was the best Leno could do for a guest.) When asked, he said he thought Leno had a deal with the WGA. No. "Oh." Oops.

- Vice-president Dick Cheney has vehemently denied all possibility that he will run. "And if elected I will not serve."

- Lots of press speculation that New York's current mayor, the megawealthy Michael Bloomberg, will enter as an independent.

- No matter who runs, from the primaries onwards the technology of voting is going to be an unholy mess and doubtless, in some districts, a deciding factor.

I can't guess November's nominees, but I don't expect to see Clinton among them unless it's as someone's vice-president (and who's going to want Bill hanging around kibitzing?). Clinton's trailing Edwards, even if it's only the first state, suggests the big show will feature dismal, "safe" choices.

Cue Utah Phillips: "If God had meant us to vote, he'd have given us candidates."

Wendy M. Grossman’s Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series. Readers are welcome to post here, at net.wars home, at her personal blog, or by email to netwars@skeptic.demon.co.uk (but please turn off HTML).

November 23, 2007

Road block

There are many ways for a computer system to fail. This week's disclosure that Her Majesty's Revenue and Customs has played lost-in-the-post with two CDs holding the nation's Child Benefit data is one of the stranger ones. The Child Benefit database includes names, addresses, identifying numbers, and often bank details, on all the UK's 25 million families with a child under 16. The National Audit Office requested a subset for its routine audit; the HMRC sent the entire database off by TNT post.

There are so many things wrong with this picture that it would take a village of late-night talk show hosts to make fun of them all. But the bottom line is this: when the system was developed no one included privacy or security in the specification or thought about the fundamental change in the nature of information when paper-based records are transmogrified into electronic data. The access limitations inherent in physical storage media must be painstakingly recreated in computer systems or they do not exist. The problem with security is it tends to be inconvenient.

With paper records, the more data you provide the more expensive and time-consuming it is. With computer records, the more data you provide the cheaper and quicker it is. The NAO's file of email relating to the incident (PDF) makes this clear. What the NAO wanted (so it could check that the right people got the right benefit payments): national insurance numbers, names, and benefit numbers. What it got: everything. If the discs hadn't gotten lost, we would never have known.

Ironically enough, this week in London also saw at least three conferences on various aspects of managing digital identity: Digital Identity Forum, A Fine Balance, and Identity Matters. All these events featured the kinds of experts the UK government has been ignoring in its mad rush to create and collect more and more data. The workshop on road pricing and transport systems at the second of them, however, was particularly instructive. Led by science advisor Brian Collins, the most notable thing about this workshop is that the 15 or 20 participants couldn't agree on a single aspect of such a system.

Would it run on GPS or GSM/GPRS? Who or what is charged, the car or the driver? Do all roads cost the same or do we use differential pricing to push traffic onto less crowded routes? Most important, is the goal to raise revenue, reduce congestion, protect the environment, or rebalance the cost of motoring so the people who drive the most pay the most? The more purposes the system is intended to serve, the more complicated and expensive it will become, and the less likely it is to answer any of those goals successfully. This point has of course also been made about the National ID card by the same sort of people who have warned about the security issues inherent in large databases such as the Child Benefit database. But it's clearer when you start talking about something as limited as road charging.

For example: if you want to tag the car you would probably choose a dashboard-top box that uses GPS data to track the car's location. It will have to store and communicate location data to some kind of central server, which will use it to create a bill. The data will have to be stored for at least a few billing cycles in case of disputes. Security services and insurers alike would love to have copies. On the other hand, if you want to tag the driver it might be simpler just to tie the whole thing to a mobile phone. The phone networks are already set up to do hand-off between nodes, and tracking the driver might also let you charge passengers, or might let you give full cars a discount.
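The billing step described above can be sketched in a few lines. This is a hypothetical illustration only — the roads, per-kilometre rates, and record format are all invented, and a real scheme would involve far more (disputes, storage, security):

```python
# Hypothetical sketch of road-pricing billing: the dashboard box logs
# (road_type, km) records, and a central server prices each journey
# with differential per-km rates. All rates and road types are invented.

RATES_PENCE_PER_KM = {
    "motorway": 2.0,       # uncrowded route, priced low
    "city_centre": 15.0,   # congested route, priced high
    "default": 5.0,
}

def price_journey(records):
    """records: list of (road_type, km) tuples logged by the in-car box."""
    total = 0.0
    for road_type, km in records:
        rate = RATES_PENCE_PER_KM.get(road_type, RATES_PENCE_PER_KM["default"])
        total += rate * km
    return round(total, 2)  # bill in pence

bill = price_journey([("motorway", 100), ("city_centre", 4)])
print(bill)  # 260.0 pence: 100*2.0 + 4*15.0
```

Note that even this toy version needs the full journey log to compute the bill — which is exactly the data store the privacy argument is about.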

The problem is that the discussion is coming from the wrong angle. We should not be saying, "Here is a clever technological idea. Oh, look, it makes data! What shall we do with it?" We should be defining the problem and considering alternative solutions. The people who drive most already pay most via the fuel pump. If we want people to drive less, maybe we should improve public transport instead. If we're trying to reduce congestion, getting employers to be more flexible about working hours and telecommuting would be cheaper, provide greater returns, and, crucially for this discussion, not create a large database system that can be used to track the population's movements.

(Besides, said one of the workshop's participants: "We live with the congestion and are hugely productive. So why tamper with it?")

It is characteristic of our age that the favored solution is the one that creates the most data and the biggest privacy risk. No one in the cluster of organisations opposing the ID card - No2ID, Privacy International, Foundation for Information Policy Research, or Open Rights Group - wanted an incident like this week's to happen. But it is exactly what they have been warning about: large data stores carry large risks that are poorly understood, and it is not enough for politicians to wave their hands and say we can trust them. Information may want to be free, but data want to leak.

Wendy M. Grossman’s Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series. Readers are welcome to post here, at net.wars home, at her personal blog, or by email to netwars@skeptic.demon.co.uk (but please turn off HTML).

August 10, 2007

Wall of sheep

Last week at Defcon, my IM ID – and just enough of the password to show its captors knew the rest – appeared on the Wall of Sheep. This screen projection of the user IDs, partial passwords, and activities captured by the installed sniffer inevitably runs throughout the conference.

It's not that I forgot the sniffer was there, or didn't know the risk of logging onto an IM client unencrypted over a Wi-Fi hot spot (at a hacker conference!); it's that I had forgotten the client was set to log in automatically whenever it could. Easily done.

It's strange to remember now that once upon a time this crowd – or at least, type of crowd – was considered the last word in electronic evil. In 1995 the capture of Kevin Mitnick made headlines everywhere because he was supposed to be the baddest hacker ever. Yet other than gaining online access and free phone calls, Mitnick is not known to have ever profited from his crimes – he didn't sell copied source code to its owners' competitors, and he didn't rob bank accounts. We would be grateful – really grateful – if Mitnick were the worst thing we had to deal with online now.

Last night, the House of Lords Science and Technology Committee released its report on Personal Internet Security. It makes grim reading even for someone who's just been to Defcon and Black Hat. The various figures the report quotes, assembled after what seems to have been an excellent information-gathering process (meaning they name-check a lot of people I know and would have picked for them to talk to), are pretty depressing. Phishing has cost US banks around $2 billion, and although the UK lags well behind – £33.5 million in bank fraud in 2006 – here, too, it's on the rise. Team Cymru found (PDF) that on IRC channels dedicated to the underground you could buy credit card account information for between $1 (basic information on a US account) and $50 (full information for a UK account); $1,599,335.80 worth of accounts was for sale on a single IRC channel in one day. Those are among the few things that can be accurately measured: the police don't keep figures breaking out crimes committed electronically; there are no good figures on the scale of identity theft (interesting, since this is one of the things the government has claimed the ID card will guard against); and no one's really sure how many personal computers are infected with some form of botnet software – and available for control at four cents each.

The House of Lords recommendations could be summed up as "the government needs to do more". Most of them are unexceptional: fund more research into IT security, keep better statistics. Some measures will be welcomed by a lot of us: make banks responsible for losses resulting from electronic fraud (instead of allowing them to shift the liability onto consumers and merchants); criminalize the sale or purchase of botnet "services" and require notification of data breaches. (Now I know someone is going to want to say, "If you outlaw botnets, only outlaws will have botnets", but honestly, what legitimate uses are there for botnets? The trick is in defining them to include zombie PCs generating spam and exclude PCs intentionally joined to grids folding proteins.)

Streamlined Web-based reporting for "e-crime" could only be a good thing. Since the National High-Tech Crime Unit was folded into the Serious Organised Crime Agency there is no easy way for a member of the public to report online crime. Bringing in a central police e-crime unit would also help. The various kite mark schemes – for secure Internet services and so on – seem harmless but irrelevant.

The more contentious recommendations revolve around the idea that we the people need to be protected, and that it's no longer realistic to lay the burden of Internet security on individual computer users. I've said for years that ISPs should do more to stop spam (or "bad traffic") from exiting their systems; this report agrees with that idea. There will likely be a lot of industry ink spilled over the idea of making hardware and software vendors liable if "negligence can be demonstrated". What does "vendor" mean in the context of the Internet, where people decide to download software on a whim? What does it mean for open source? If I buy a copy of Red Hat Linux with a year's software updates, that company's position as a vendor is clear enough. But if I download Ubuntu and install it myself?

Finally, you have to twitch a bit when you read, "This may well require reduced adherence to the 'end-to-end' principle." That is the principle that holds that the network should carry only traffic, and that services and applications sit at the end points. The Internet's many experiments and innovations are due to that principle.

The report's basic claim is this: criminals are increasingly rampant and increasingly rapacious on the Internet. If this continues, people will catastrophically lose confidence in the Internet. So we must make the Internet safer. Or couldn't we just make people safer by telling them to stop using it? That's what people tell you to do when you're going to Defcon.


July 27, 2007

There ain't no such thing as a free Benidorm

This has been the week for reminders that the border between real life and cyberspace is a permeable blood-brain barrier.

On Wednesday, Linden Lab announced that it was banning gambling in Second Life. The resentment expressed by some SL residents is understandable but naive. We're not at the beginning of the online world any more; Second Life is going through the same reformation to take account of national laws that Usenet and the Web went through before it.

Second, this week MySpace deleted the profiles of 29,000 American users identified as sex offenders. That sounds like a lot, but it's a tiny percentage of MySpace's 180 million profiles. None of them, be it noted, are Canadian.

There's no question that gambling in Second Life spills over into the real world. Linden dollars, the currency used in-world, have active exchange rates, like any other currency, currently running about L$270 to the US dollar. (When I was writing about a virtual technology show, one of my interviewees was horrified that my avatar didn't have any distinctive clothing; she was and is dressed in the free outfit you are issued when you join. He insisted on giving me L$1,000 to take her shopping. I solemnly reported the incident to my commissioning editor, who felt this wasn't sufficiently corrupt to worry about: US$3.75! In-world, however, that could buy her several cars.) Therefore: the fact that the wagering takes place online in a simulated casino with pretty animated decorations changes nothing. There is no meaningful difference between craps on an island in Second Life and poker on an official Web-based betting site. If both sites offer betting on real-life sporting events, there's even less difference.
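For the record, the conversion arithmetic runs like this. The rate floats like any other currency, so the US$3.75 figure reflects the rate on the day; at L$270 to the dollar the same gift converts to roughly US$3.70:

```python
# Linden-dollar conversion at a given L$-per-US$ rate. The L$270 figure
# is from the column; 266.7 is the rate implied by the US$3.75 anecdote.

def linden_to_usd(linden_dollars, rate=270.0):
    """Convert L$ to US$ at a given L$-per-US$ exchange rate."""
    return round(linden_dollars / rate, 2)

print(linden_to_usd(1000))         # 3.7 at L$270/US$
print(linden_to_usd(1000, 266.7))  # 3.75, the implied rate in the anecdote
```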

But the Web site will, these days, have gone through considerable time and expense to set up its business. Gaming, even outside the US, is quite difficult to get into: licenses are hard to get, and without one banks won't touch you. Compared to that, the $3,800 and 12 to 14 hours a day Brighton's Anthony Smith told Information Week he'd invested in building his SL Casino World is risibly small. You have to conclude that there are only two possibilities. Either Smith knew nothing about the gaming business – had he known anything about it, he'd have known that the US has repeatedly cracked down on online gambling over the last ten years, that ultimately US companies would be forced to live within US law, and how hard and how expensive it is to set up an online gambling operation even in Europe. Or he did know all those things and thought he'd found a loophole he could exploit to avoid all the red tape and regulation and build a gaming business on the cheap.

I have no personal interest in gaming; risking real money on the chance draw of a card or throw of dice seems to me a ridiculous waste of the time it took to earn it. But any time a service involves real money – whether it sells an experience (gaming), a professional service, or a retail product – once the money handled reaches a certain amount, governments are going to take an interest. Not only that, but people want them involved; people want protection from rip-off artists.

The MySpace decision, however, is completely different. Child abuse is, rightly, illegal everywhere. Child pornography is, more controversially, illegal just about everywhere. But I am not aware of any laws that ban sex offenders from using Web sites, even if those Web sites are social networks. Of course, in the moral panic following the MySpace announcement, someone is proposing such a law. The MySpace announcement sounds more like corporate fear (the site is now owned by News Corporation) than rational response. There is a legitimate subject for public and legislative debate here: how much do we want to cut convicted sex offenders out of normal social interaction? And a question for scientists: will greater isolation and alienation keep them from reoffending? And, I suppose, a question for database experts: how likely is it that all 29,000 profiles belonged to correctly identified, previously convicted sex offenders? But those questions have not been discussed. Still, this problem, at least with regard to MySpace, may solve itself: if parents become better able to track their kids' MySpace activities, all but the youngest kids will surely abandon it in favour of sites that afford them greater latitude and privacy.

A dozen years ago, John Perry Barlow (in)famously argued that national governments had no place in cyberspace. It was the most hyperbolic demonstration of what I call the "Benidorm syndrome": every summer thousands of holidaymakers descend on Benidorm, in Spain, and behave in outrageous and sometimes lawless ways that they would never dare indulge in at home in the belief that since they are far away from their normal lives there are no consequences. (Rinse and repeat for many other tourist locations worldwide, I'm sure.) It seems to me only logical that existing laws apply to behaviour in cyberspace. What we have to guard against is deforming cyberspace to conform to laws that don't exist.



July 6, 2007

Born digital

Under one of my bookcases there is a box containing 40 or 50 5.25-inch floppy disks next to an old floppy drive of the same size. The disks were created in SuperScripsit in the early 1980s, and reading them requires an emulator that pretends my Core2Duo is a TRS-80 Model III.

If, like me, you have had a computer for any length of time you, too, have stowed somewhere a batch of old files that you save because they are or were important to you but that you're not sure you could actually read, though you keep meaning to plug that old drive in and find out. But the Domesday Book, drafted in 1085, is still perfectly readable. In fact, it's more readable than a 1980s digital Domesday Book that was unreadable only 15 years after its creation because the technology it was stored on was outmoded.

The average life of an electronic document before it becomes obsolete is seven years. And that's if it survives that long. Paper can last centuries – and the National Archives, which holds 900 years of Britain's records, has to think in centuries.

This week, the National Archives announced it was teaming up with Microsoft to ensure that the last decade or two of government archives do not become a black hole in history.

The problem of preserving access to today's digital documents is not newly discovered. Digital preservation and archiving were on the list of topics of interest in 1997, when the Foundation for Information Policy Research was founded. Even before that, NASA had discovered the problem, in connection with the vast amounts of data collected at taxpayer expense by the various space missions. Librarians have known all along that the many format changes of the digital age posed far greater problems than deciphering an unfamiliar language chiseled into a chunk of stone.

But it takes a while for non-technical people to understand how complex a problem it really is. Most people, Natalie Ceeney, chief executive of the National Archives, said on Tuesday, think all you have to do is make back-ups. But for an archivist this isn't true, even for the simple case of, say, a departmental letter written in the early 1980s in WordStar. The National Archives wants to preserve not only the actual text of the letter but its look, feel, and functionality. To do that, you need to be able to open the document in the software in which it was originally created – which means having a machine you can run that software on. Lather, rinse, and repeat for any number of formerly common but now obsolete systems. The National Archives estimates it has 580TB of data in obsolete formats. And more new formats are being invented every day: email, Web pages, instant messages, telephone text messages, databases, ministers' blogs, internal wikis… and as they begin to interact without human intervention that will be a whole new level of complication.

"We knew in the paper world what to keep," Ceeney said. "In the digital world, it's harder to know. But if we tried to keep everything we'd be spending the entire government budget on servers."

So for once Microsoft is looking like a good guy in providing the National Archives with Virtual PC 2007, which (it says here) combines earlier versions of Windows and Office in order to make sure that all government documents that were created using Microsoft products can be opened and read. Naturally, that isn't everything; but it's a good start. Gordon Frazer, Microsoft's UK managing director, promised open formats (or at least, Open XML) for the future. The whole mess is part of a four-year Europe-wide project called Planets.

Digital storage is surprisingly expensive compared to, say, books or film. A study reported by the head of preservation for the Swedish national archives shows that digital can cost up to eight times as much (PDF, see p4) as the same text on paper. But there is a valuable trade-off: the digital version can be easily accessed and searched by far more people. The National Archives' Web site had 66 million downloads in 2006, compared to the 250,000 visitors to its physical premises in Kew.

Listening to this discussion live, you longed to say, "Well, just print it all out, then." But even if you decided to waive the requirements for original look, feel, and functionality, not everything could be printed out anyway. (Plus, the National Archives casually mentions that its current collection of government papers is 175 kilometres long already.) The most obvious case in point is video evidence, now being kept by police in huge amounts – and, in cases of unsolved crimes or people who have been sentenced for serious crimes, for long periods. Can't be printed. But even text-based government documents: when these were created on paper you saved the paper. The documents of the last 20 years were born digital. Paper is no longer the original but the copy. The National Archives is in the business of preserving originals.

Nor, of course, does it work to say, "Let the Internet Archive take care of it": too much of the information is not published on the Web but held in internal government systems, from which it will emerge only decades from now under Britain's 30-year rule. With luck, we'll know before then whether this initiative has succeeded.


January 26, 2007

Vote early, vote often...

It is a truth that ought to be universally acknowledged that the more you know about computer security the less you are in favor of electronic voting. We thought – optimists that we are – that the UK had abandoned the idea after all the reports of glitches from the US and the rather indeterminate results of a couple of small pilots a few years ago. But no: there are plans for further trials for the local elections in May.

It's good news, therefore, that London is to play host to two upcoming events to point out all the reasons why we should be cautious. The first, February 6, is a screening of the HBO movie Hacking Democracy, a sort of documentary thriller. The second, February 8, is a conference bringing together experts from several countries, most prominently Rebecca Mercuri, who was practically the first person to get seriously interested in the security problems surrounding electronic voting. Both events are being sponsored by the Open Rights Group and the Foundation for Information Policy Research, and will be held at University College London. Here is further information and links to reserve seats. Go, if you can. It's free.

Hacking Democracy (a popular download) tells the story of Bev Harris (http://www.blackboxvoting.org) and Andy Stephenson. Harris was minding her own business in Seattle in 2000 when the hanging chad hit the Supreme Court. She began to get interested in researching voting troubles, and then one day found online a copy of the software that runs the voting machines provided by Diebold, one of the two leading manufacturers of such things. (And, by the way, the company whose CEO vowed to deliver Ohio to Bush.) The movie follows this story and beyond, as Harris and Stephenson dumpster-dive, query election officials, and document a steady stream of glitches that all add up to the same point: electronic voting is not secure enough to protect democracy against fraud.

Harris and Stephenson are not, of course, the only people working in this area. Among computer experts such as Mercuri, David Chaum, David Dill, Deirdre Mulligan, Avi Rubin, and Peter Neumann, there's never been any question that there is a giant issue here. Much ink has been spilled over the question of how votes are recorded; less over the technology the voter uses to choose preferences. One faction – primarily but not solely vendors of electronic voting equipment – sees nothing wrong with Direct Recording Electronic machines, which accept voter input all day and then just spit out tallies. The other group argues that you can't trust a computer to keep accurate counts, and that you have to have some way for voters to check that the vote they thought they cast is the vote that was actually recorded. A number of different schemes have been proposed for this, but the idea that's catching on across the US (and was originally promoted by Mercuri) is adding a printer that spits out a printed ballot the voter can see for verification. That way, if an audit is necessary there is a way to actually conduct one. Otherwise all you get is the machine telling you the same number over again, like a kid who has the correct answer to his math homework but mysteriously can't show you how he worked the problem.
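The audit logic the paper trail enables can be sketched like this (all names and counts are invented). Without paper, a "recount" just re-asks the machine for the same number; with voter-verified ballots, an independent hand count can contradict a corrupted tally:

```python
# Sketch of a voter-verified paper audit: compare the machine's reported
# tally against an independent hand count of the printed ballots.
# Candidates and numbers are hypothetical.

from collections import Counter

def hand_recount(paper_ballots):
    """Independently tally the printed ballots the voters verified."""
    return Counter(paper_ballots)

machine_tally = {"Smith": 120, "Jones": 80}        # what the DRE reports
paper_ballots = ["Smith"] * 100 + ["Jones"] * 100  # what voters verified

recount = hand_recount(paper_ballots)
discrepancy = {c: machine_tally[c] - recount[c] for c in machine_tally}
print(discrepancy)  # {'Smith': 20, 'Jones': -20} -> the audit flags the machine
```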

This is where it's difficult to understand the appeal of such systems in the UK. Americans may be incredulous – I was – but a British voter goes to the polls and votes on a small square of paper with a stubby little pencil. Everything is counted by hand. The UK can do this because all elections are very, very simple. There is only one election – local council, Parliament – at a time, and you vote for one of only a few candidates. In the US, where a lemon is the size of an orange, an orange is the size of a grapefruit, and a grapefruit is the size of a soccer ball, elections are complicated and on any given polling day there are a lot of them. The famous California governor's recall that elected Arnold Schwarzenegger, for example, had 135 candidates; even a more average election in a less referendum-happy state than California may have a dozen races, each with six to ten candidates. And you know Americans: they want results NOW. Like staying up for two or three days watching the election returns is a bad thing.

It is of course true that election fraud has existed in all eras; you can "lose" a box of marked paper ballots off the back of a truck, or redraw districts according to political allegiance, or "clean" people off the electoral rolls. But those types of fraud are harder to cover up entirely. A flawed count in an electronic machine run by software the vendor allows no one to inspect just vanishes down George Orwell's memory hole.

What I still can't figure out is why politicians are so enthusiastic about all this. Yes, secure machines with well-designed user interfaces might get rid of the problem of "spoiled" and therefore often uncounted ballots. But they can't really believe – can they? – that fancy voting technology will mean we're more likely to elect them? Can it?


November 24, 2006

The Great Firewall of Britain

We may joke about the "Great Firewall of China", but by the end of 2007 content blocking will be a fact of Internet life in the UK. In June, Vernon Coaker, Parliamentary Under-Secretary for the Home Department, told Parliament, "I have recently set the UK Internet industry a target to ensure that by the end of 2007 all Internet service providers offering broadband Internet connectivity to the UK public prevent their customers from accessing those Web sites." By "those", he means Web sites carrying pornographic images of children.

Coaker went on to say that by the end of 2006 he expects 90 percent of ISPs to have blocked "access to sites abroad", and that, "We believe that working with the industry offers us the best way forward, but we will keep that under review if it looks likely that the targets will not be met."

The two logical next questions: How? And How much?

Like a lot of places, the UK has two major kinds of broadband access: cable and DSL. DSL is predominantly provided by BT, either retail directly to customers or wholesale to smaller ISPs. Since 2004, BT's retail service has been filtered by its Cleanfeed system, which the company reported last February was blocking about 35,000 attempts per day to access child pornography sites. The list of sites to block comes from the Internet Watch Foundation, and is compiled from reports submitted by the public. ISPs pay the IWF £5,000 a year to be supplied with the list – insignificant to a company like BT but not necessarily to a smaller one. But the raw cost of the IWF list is insignificant compared to the cost of reengineering a network to do content blocking.
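The core lookup an ISP must perform on every request can be sketched as below. This is a deliberate simplification — the real Cleanfeed is a more elaborate two-stage design, and the actual IWF list is secret, so the entries here are invented:

```python
# Greatly simplified sketch of blocklist-based filtering: check each
# requested URL against a supplied list. The real Cleanfeed routes
# suspect traffic through a filtering proxy; entries here are invented.

BLOCKLIST = {
    "bad.example.com/abuse",
    "worse.example.net/images",
}

def is_blocked(url):
    """Normalise a requested URL and check it against the supplied list."""
    stripped = url.removeprefix("http://").rstrip("/")
    return stripped in BLOCKLIST

print(is_blocked("http://bad.example.com/abuse"))  # True
print(is_blocked("http://ok.example.org/page"))    # False
```

Even this toy version hints at the engineering problem: the check has to sit somewhere in the traffic path, which is exactly the network reengineering whose cost varies so much from ISP to ISP.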

How much will it cost for the entire industry?

Malcolm Hutty, head of public affairs at LINX, says he can't even begin to come up with a number. BT, he thinks, spent something like £1 million creating and deploying Cleanfeed – half on original research and development, half on deployment. Most of the first half would not now be necessary for an ISP trying to decide how to proceed, since a lot more is known now than in 2003.

Although it might seem logical that Cleanfeed would be available to any DSL provider reselling BT's wholesale product, that's not the case.

"You can be buying all sorts of different products to be able to provide DSL service," he says. A DSL provider might simply rebrand BT's own service – or it might only be paying BT to use the line from your home to the exchange. "You have to be pretty close to the first extreme before BT Cleanfeed can work for you." So adopting Cleanfeed might mean reengineering your entire product.

In the cable business, things are a bit different. There, an operator like ntl or Telewest owns the entire network, including the fibre to each home. If you're a cable company that implemented proxy caching in the days when bandwidth was expensive and caching was fashionable, the technology you built then will make it cheap to do content blocking. According to Hutty, ntl is in this category – but its Telewest and DSL businesses are not.

So the expense to a particular operator varies for all sorts of reasons: the complexity of the network, how it was built, what technologies it's built on. This mandate, therefore, has no information behind it as to how much it might cost, or the impact it might have on an industry that other sectors of government regard as vital for Britain's economic future.

The How question is just as complicated.

Cleanfeed itself is insecure (PDF), as Cambridge researcher Richard Clayton has recently discovered. Cleanfeed was intended to improve on previous blocking technologies by being both accurate and inexpensive. However, Clayton has found that not only can the system be circumvented but it also can be used as an "oracle to efficiently locate illegal websites".

Content blocking is going to be like every other security system: it must be constantly monitored and updated as new information and attacks become known or are developed. You cannot, as Clayton says, "fit and forget".

The other problem in all this is the role of the IWF. It was set up in 1996 as a way for the industry to regulate itself; the meeting where it was proposed came after threats of external regulation. If all ISPs are required to implement content blocking, and all content blocking is based on the IWF's list, the IWF will have considerable power to decide what content should be blocked. So far, the IWF has done a respectable job of sticking to clearly illegal pornography involving children. But its ten years have been marked by occasional suggestions that it should broaden its remit to include hate speech and even copyright infringement. Proposals are circulating now that the organisation should become an independent regulator rather than an industry-owned self-regulator. If IWF is not accountable to the industry it regulates; if it's not governed by Parliamentary legislation; if it's not elected….then we will have handed control of the British Internet over to a small group of people with no accountability and no transparency. That sounds almost Chinese, doesn't it?


June 30, 2006

Technical enough for government work

Wednesday night was a rare moment of irrelevant glamor in my life, when I played on the Guardian team in a quiz challenge grudge match.

In March, Richard Sarson (intriguingly absent, by the way) accused MPs of not knowing which end was up, technically speaking, and BT funded a test. All good fun.

But Sarson had a serious point: MPs are approving billions of pounds of public spending without the technical knowledge to evaluate it. His particular focus was the ID card, which net.wars has written about so often. Who benefits from these very large IT contracts besides, of course, the suppliers and contractors? It must come down to Yes, Minister again: commissioning a huge new IT system gives the Civil Service a lot of new budget and bureaucracy to play with, especially if the ministers don't understand the system. Expanded budgets are expanded power, we know this, and if the system doesn't work right the first time you need an even bigger budget to fix it.

And at that point, the issue collided in my mind with this week's other effort, a discussion of Vernor Vinge's ideas of where our computer-ridden world might be going. Because the strangest thing about the world Vernor Vinge proposes in his new book, Rainbows End, is that all the technology pretty much works as long as no one interferes with it. For example: this is a world filled with localizer sensors and wearable computing; it's almost impossible to get out of view of a network node. People decide to go somewhere and snap! a car rolls up and pops open its doors.

I'm wondering if Vinge has ever tried to catch a cab when it was raining in Manhattan.

There are two keys to this world. First: it is awash in so many computer chips that IPv6 might not have enough addresses (yeah, yeah, I know, no electron left behind and all that). Second: each of these chips has a little blocked-off area called the Secure Hardware Environment (SHE), which is reserved for government regulation. SHE enables all sorts of things: detailed surveillance, audit trails, the blocking of undesirable behavior. One of my favorites among Vinge's ideas here is that the whole system inverts Lawrence Lessig's idea of "code is law" into "law is code". When you make a new law, instead of having to wait five or ten years until all the computers have been replaced so they conform to it, you can just install it as a flash regulatory update. Kind of like Microsoft does now with Windows Genuine Advantage. Or like what I call "idiot stamps" – today's denominationless stamps, intended for people who can never remember how much postage costs.

There are a lot of reasons why we don't want this future, despite the convenience of all those magically arriving cars, and despite the fact that Vinge himself says he thinks frictional costs will mean that SHE doesn't work very well. "But it will be attempted, both by the state and by civil special interest petitioners." For example, he said, take the reaction of a representative he met from a British writers' group who thought it was a nightmare scenario – but loved the bit where microroyalties were automatically and immediately transmitted up the chain. "If we could get that, but not the monstrous rest of it…"

For another, "You really need a significant number of people who are willing to be Amish to the extent that they don't allow embedded microprocessors in their lifestyle." Because, "You're getting into a situation where that becomes a single failure point. If all the microprocessors in London went out, it's hard to imagine anything short of a nuclear attack that would be a deadlier disaster."

Still, one of the things that makes this future so plausible is that you don't have to posit the vast, centralized expenditure of these huge public IT projects. It relies instead on a series of developments coming together. There are examples all around us. Manufacturers and retailers are leaping gleefully onto RFID in everything. More and more desktop and laptop computers are beginning to include the Trusted Platform Module, which is intended to provide better security through blocking all unsigned programs from running but as a by-product could also allow the widescale, hardware-level deployment of DRM. The business of keeping software updated has become so complex that most people are greatly relieved to be able to make it automatic. People and municipalities all over the place are installing wireless Internet for their own use and sharing it. To make Vinge's world, you wait until people have voluntarily bought or installed much of the necessary infrastructure and then do a Project Lite to hook it up to the functions you want.

What governments would love about the automatic regulatory upgrade is the same thing that the Post Office loves about idiot stamps: you can change the laws (or prices) without anyone's really being aware of what you're doing. And there, maybe, finally, is some real value for those huge, failed IT projects: no one in power can pretend they aren't there. Just, you know, God help us if they ever start being successful.

November 9, 2001

Save the cookie

You would think that by this time in the Internet's history we would have reached the point where the politicians making laws would have learned a thing or two about how it works, and would therefore not be proposing (and passing) quite such stupid laws as they used to. Apparently not.

Somehow, tacked onto an otherwise sensible bill aimed at protecting consumer privacy are provisions requiring Web sites to use cookies only on an opt-in basis. Consultation to remove this bit of idiocy closes in mid-November.

The offending bit appears in the second report on the "proposal for a European Parliament and Council directive concerning the processing of personal data and the protection of privacy in the electronic communications sector" (PDF), and is labelled "amendment 26 to article 5, paragraph 2a". What seems to be upsetting the EC is that cookies may enter a user's computer without that user's specific permission.

Well, that's true. On the other hand, it's pretty easy to set any browser to alert you whenever a site wants to send you a cookie - and have fun browsing like that, because you'll be interrupted about every two and a half seconds. Microsoft's Internet Explorer 6 lets you opt out of cookies entirely.

A lot of people are oddly paranoid about cookies, which are, like the Earth in the Hitchhiker's Guide to the Galaxy, mostly harmless. At heart, what cookies do is give Web sites persistent memory. Contrary to what many people think, a connection to a Web site is not continuous; you request a page, and then you request another page, and without cookies the Web site has no way of connecting the two transactions.

Cookies are what make it possible to build up an order in a shopping cart or personalize a site so it remembers your ID and password or knows you're interested in technology news and not farming. These uses do not invade privacy.
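The mechanism is simple enough to sketch with Python's standard http.cookies module. This is only an illustration of the round trip described above; the cookie name and value are invented for the example:

```python
from http.cookies import SimpleCookie

# First response: the site hands the browser a cookie.
issued = SimpleCookie()
issued["session_id"] = "abc123"
print(issued.output())  # Set-Cookie: session_id=abc123

# Next request: the browser sends the value back, and the site can now
# connect two otherwise-independent transactions -- the same trick
# behind shopping carts and remembered logins.
returned = SimpleCookie("session_id=abc123")
print(returned["session_id"].value)  # abc123
```

Nothing here identifies the user to anyone but the site that issued the cookie, which is why the mechanism itself is so innocuous.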

There are, of course, plenty of things you can do with cookies that are not harmless. Take Web bugs. These hidden graphics, usually 1x1 pixels, enable third parties to track what you do on the Web and harvest all sorts of information about you, your computer, and what browser you use. And yet privacy-protecting sites like the Anonymizer depend on cookies.
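To make the web-bug trick concrete, here is a hypothetical sketch (the tracker hostname and parameter names are invented): the "bug" is just an invisible third-party image whose URL smuggles out the page you are reading, tied to you by the third party's own cookie.

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical third-party tracking pixel; hostname and parameters invented.
tracker = "https://ads.example.net/pixel.gif"
params = {"page": "/news/article42", "uid": "user-7f3a"}  # uid echoes the tracker's cookie
bug = '<img src="%s?%s" width="1" height="1">' % (tracker, urlencode(params))

# When the browser fetches the invisible image, the tracking server
# simply reads the query string back out -- one hit logged per page.
src = bug.split('"')[1]
logged = parse_qs(urlparse(src).query)
print(logged)  # {'page': ['/news/article42'], 'uid': ['user-7f3a']}
```

The point is that the abuse lies in what the third party does with those logs, not in the cookie mechanism itself.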

Similarly, the advertising agency DoubleClick has been under severe fire for the way it tracks users from site to site, even though it says that the data are anonymized and the purpose is merely to ensure that the ads you see are targeted to your interests rather than random.

MEPs who want to protect consumer privacy, therefore, should be looking not at the technology itself but at how it is used: attempting to regulate behavior that invades privacy. To be fair, the report mentions all these abuses. The problem is simply that the clause is overbroad and needs some revision. Something along the lines of requiring sites to explain in their privacy policies how they use cookies, plus a prohibition on actually spying on users, would do nicely.

The point is to get at what people do with technology, not outlaw the technology itself.

We've had similar problems in the US, most recently and notably with the Digital Millennium Copyright Act, which also tends to criminalize technology rather than behaviour. This is the crevasse that Dmitry Sklyarov fell into. For those who haven't been following the story, Sklyarov, on behalf of his Russian software company, Elcomsoft, wrote a routine that takes Adobe eBooks and converts them into standard PDFs. Yes, that makes them copiable. But it also makes it possible for people who have bought eBooks to back them up, run them through text-to-speech software (indispensable for the blind), or read them on a laptop or PDA after downloading them onto their desktop machine.

In the world of physical books, we would consider these perfectly reasonable things to do. But in the world of digital media these actions are what rightsholders most fear. Accordingly, the DMCA criminalizes creating and distributing circumvention tools. As opponents of the act pointed out at the time, this could include anything from scissors and a bottle of Tippex to sophisticated encryption-cracking software. The fuss over DeCSS, which strips the CSS encryption from DVDs, is another case in point. While the movie studios argue that DeCSS is wholly intended to enable people to illegally copy DVDs, the original purpose was to let Linux users play the DVDs they'd paid for on their computers, for which no one provides a working commercial software player.

The Internet Advertising Bureau has of course gone all out to save the cookie. It is certainly true, as they say, that an opt-in requirement would impair electronic commerce in Europe, the more so because it would be impossible to impose the same restrictions on non-EU businesses.

If MEPs really want to protect consumer privacy, here's what they should do. First of all, learn something about what they are doing. Second of all, focus on behaviour.