net.wars: June 2011 Archives


June 24, 2011

Bits of the realm

Money is a collective hallucination. Or, more correctly, money is an abstraction that allows us to exchange - for example - writing words for food, heat, or a place to live. Money means the owner of the local grocery store doesn't have to decide how many pounds of flour and Serrano ham 1,000 words are worth, and I don't have to argue copyright terms while paying my mortgage.

But, as I was reading lately in The Coming Collapse of the Dollar and How to Profit From It by James Turk, the owner of GoldMoney, that's all today's currencies are: abstractions. Fiat currencies. The real thing disappeared when the US abandoned the gold standard in 1971. Accordingly, none of the currencies I regularly deal with - pounds, dollars, euros - are backed by anything more than their respective governments' "full faith and credit". Is this like Tinker Bell? If I stop believing, will they cease to exist? Certainly some people think so, and that's why, as James Surowiecki wrote in The New Yorker in 2004, some people believe that gold is the One True Currency.

"I've never bought gold," my father said in the late 1970s. "When it's low, it's too expensive. When it's high, I wish I'd bought it when it was low." Gold was then working its way up to its 1980 high of $850 an ounce. Until 2004 it did nothing but decline. Yesterday, it closed at $1518.

That's if you view the world from the vantage point of the dollar. If gold is your sun and other currencies revolve around it like imaginary moths, nothing's happened. An ounce just buys a lot more dollars now than it did and someday will be tradable for wagonloads of massively devalued fiat currencies. You don't buy gold; you convert your worthless promises into real stored value.

Personally, I've never seen the point of gold. It has relatively few real-world uses. You can't eat it, wear it, or burn it for heat and light. But it does have the useful quality of being a real thing, and when you could swap dollars for gold held in the US government's vault, dollars, too, were real things.

The difficulty with Bitcoins is that they have neither physical reality nor a long history (even if that history is one of increasing abstraction). Using them requires people to make the jump from the national currency they know straight into bits of code backed by a bunch of mathematics they don't understand.
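That "bunch of mathematics" is, at its core, hash-based proof of work: a miner searches for a number (a nonce) that makes a block's SHA-256 hash fall below a target. A toy sketch, purely illustrative - real Bitcoin double-hashes a binary block header against a vastly harder target, and the block data here is invented:

```python
import hashlib

def mine(block_data: str, difficulty_bits: int) -> int:
    """Find a nonce whose SHA-256 hash has `difficulty_bits` leading zero bits.

    Toy illustration of hash-based proof of work; real Bitcoin uses
    double SHA-256 over a binary block header and a far harder target.
    """
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# Finding the nonce takes ~65,000 hashes on average at 16 bits of difficulty;
# verifying it takes one hash. That asymmetry is what makes the coins "hard".
nonce = mine("example block", 16)
print(hashlib.sha256(f"example block{nonce}".encode()).hexdigest())
```

The point of the sketch: anyone can check the result in microseconds, but producing it costs real computation, which is the only "backing" the currency has.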

Alternative currencies have been growing for some time now - probably the first was Ithaca Hours, which are accepted by many downtown merchants in my old home town of Ithaca, NY. What gives Ithaca Hours their value is that you trade them with people you know and can trust to support the local economy. Bitcoins up-end that: you trade them with strangers who can't find out who you are. The big advantage, as Bitcoin Consultancy co-founder Amir Taaki explains on Slashdot, is that their transaction costs are very, very low.

The idea of cryptographic cash is not new, though the peer-to-peer implementation is. Anonymous digital cash was first mooted by David Chaum in the 1980s; his company, DigiCash, began life in 1990 and by 1993 had launched ecash. At the time, it was widely believed that electronic money was an inevitable development. And so it likely is, especially if you believe e-money specialist Dave Birch, who would like nothing more than to see physical cash die a painful death.

But the successful electronic transaction systems are those that build on existing currencies and structures. PayPal, founded in 1998, achieved its success by enabling online use of existing bank accounts and credit cards. M-Pesa and other world-changing mobile phone schemes are enabling safe and instant transactions in the developing world. Meanwhile, DigiCash went bankrupt in 1999, and every other digital cash attempt of the 1990s also failed.

For comparison, ten-year-old GoldMoney's latest report says it's holding $1.9 billion in precious metals and currencies for its customers - still tiny by global standards. The most interesting thing about GoldMoney, however, is not the gold bug aspect but its reinvention of gold as electronic currency: you can pay other GoldMoney customers in electronic shavings of gold (minimum one-tenth of a gram) at a fraction of international banking costs.

"Humans will trade anything," writes Danny O'Brien in his excellent discussion of Bitcoins. Sure: we trade favors, baseball cards, frequent flyer miles, and information. But Birch is not optimistic about Bitcoin's long-term chances, and neither am I, though for different reasons. I believe that people are very conservative about what they will take in trade for the money they've worked hard to earn. Warren Buffett and his mentor, Benjamin Graham, typically offer this advice about investing: don't buy things you don't understand. By that rule, Bitcoins fail. Geeks are falling on them as they would on any exciting new start-up, but I'll guess that most people would rather bet on horses than take Bitcoins. There's a limit to how abstract we like our money to be.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

June 17, 2011

If you build it...

Lawrence Lessig once famously wrote that "Code is law". Today, on the last day of this year's Computers, Freedom, and Privacy, Ross Anderson's talk about the risks of centralized databases suggested a corollary: Architecture is policy. (A great line and all mine, so I thought, until reminded that only last year CFP had an EFF-hosted panel called exactly that.)

You may *say* that you value patient (for example) privacy. And you may believe that your role-based access rules will be sufficient to protect a centralized database of personal health information (for example), but do the math. The NHS's central database, Anderson said, includes data on 50 million people that is accessible by 800,000 people - about the same number as had access to the diplomatic cables that wound up being published by Wikileaks. And we all saw how well that worked. (Perhaps the Wikileaks Unit could be pressed into service as a measure of security risk.)

So if you want privacy-protective systems, you want the person vendors build for - "the man with the checkbook" - to be someone who understands what policies will actually be implemented by your architecture and who will be around the table at the top level of government, where policy is being drafted. When the man with the checkbook is a doctor, you get a very different, much more functional, much more privacy-protective system. When governments recruit and listen to a CIO, you do not get a giant centralized, administratively convenient Wikileaks Unit.

How big is the threat?

Assessing that depends a lot, said Bruce Schneier, on whether you accept the rhetoric of cyberwar (Americans, he noted, are only willing to use the word "war" when there are no actual bodies involved). If we are at war, we are a population to be subdued; if we are in peacetime, we are citizens to protect. The more the rhetoric around cyberwar takes over the headlines, the harder it will be to get privacy protection accepted as an important value. So many other debates unfold differently depending on whether we are rhetorically at war or at peace: attribution and anonymity; the Internet kill switch; built-in and pervasive wiretapping. The decisions we make to defend ourselves in wartime are the same ones that make us more vulnerable in peacetime.

"Privacy is a luxury in wartime."

Instead, "This" - Stuxnet, attacks on Sony and Citibank, state-tolerated (if not state-sponsored) hacking - "is what cyberspace looks like in peacetime." He might have, but didn't, say, "This is the new normal." If on the Internet in 1995 no one knew you were a dog, on the Internet in 2011 no one knows whether your cyberattack was launched by a government-sponsored military operation or a couple of guys in a Senegalese cybercafé.

Why Senegalese? Because earlier, Mouhamadou Lo, a legal advisor from the Computing Agency of Senegal, had explained that cybercrime affects everyone. "Every street has two or three cybercafés," he said. "People stay there morning to evening and send spam around the world." And every day in his own country there are one or two victims. "It shows that cybercrime is worldwide."

And not only crime. The picture of a young Senegalese woman, posted on Facebook, appeared in the press in connection with the Strauss-Kahn affair because it seemed to correspond to a description given of the woman in the case. She did nothing wrong, but there are still consequences back home.

Somehow I doubt the solution to any of this will be found in the trend the ACLU's Jay Stanley and others highlighted towards robot policing. Forget black helicopters and CCTV; what about infrared cameras that capture private moments in the dark and helicopters the size of hummingbirds that "hover and stare"? The mayor of Ogden, Utah, wants blimps over his city, and, as Vernon M. Keenan, director of the Georgia Bureau of Investigation, put it, "Law enforcement does not do a good job of looking at new technologies through the prism of civil liberties."

Imagine, said the ACLU's Jay Stanley: "The chilling prospect of 100 percent enforcement."

Final conference thoughts, in no particular order:

- This is the first year of CFP (and I've been going since 1994) where Europe and the UK are well ahead on considering a number of issues: geotracking (Europe has always been ahead in mobile phones), but also electronic health care records and how to manage liability for online content. "Learn from our mistakes!" pleaded one Dutch speaker (re health records).

- #followfriday: @sfmnemonic; @privacywonk; @ehasbrouck; @CenDemTech; @openrightsgroup; @privacyint; @epic; @cfp11.

- The market in secondary use of health care data is now $2 billion (PricewaterhouseCoopers via Latanya Sweeney).

- Index on Censorship has a more thorough write-up of Bruce Schneier's talk.

- Today was IBM's 100th birthday.

- This year's chairs, Lillie Coney (EPIC) and Jules Polonetsky, did an exceptional job of finding a truly diverse range of speakers. A rarity at technology-related conferences.

- Join the weekly Twitter #privchat, Tuesdays at noon Eastern US time, hosted by the Center for Democracy and Technology.

- Have a good year, everybody! See you at CFP 2012 (and here every Friday until then).


June 16, 2011

The democracy divide (CFP2011 Day 2)

Good news: the Transportation Security Administration audited itself and found it was doing pretty well. At least, so said Kimberly Walton, special counsellor to the administrator for the TSA.

It's always tough when you're the raw meat served up to the Computers, Freedom, and Privacy crowd, and Walton was appropriately complimented for her courage in appearing. But still: we learned little that was new, other than that the TSA wants to move to a system of identifying people who need to be scrutinized more closely.

Like CAPPS-II? asked the ACLU's Daniel Mach. "It was a terrible idea."

No. It's different. Exactly how, Walton couldn't say. Yet.

Americans spent the latter portion of last year protesting the TSA's policies - but little has happened. Why? Arguably, much of it has to do with those protests being online complaints rather than massed ranks of rebellious passengers at airport terminals. And a lot has to do with the fact that FOIA requests and lawsuits move slowly. The ACLU, said Ginger McCall, has been unable to get any answers from the TSA except by lawsuit.

Apparently it's easier to topple a government.

"Instead of the reign of terror, the reign of terrified," said Deborah Hurley (CFP2001 chair) during the panel considering the question of social media's role in the upheavals in Egypt and Tunisia. Those on the ground - Jillian York, Nasser Weddady, Mona Eltahawy - say instead that social media enabled little pockets of protest, sometimes as small as a single individual, to find each other and coalesce like the pooling blobs reforming into the liquid-metal man in Terminator 2. What appeared, to outsiders who weren't paying attention, to be sudden reversals of rulers' fortunes were instead the culmination of years of small rebellions.

The biggest contributor may have been video, providing non-repudiable evidence of human rights abuses. When Tunisia's President Zine al-Abidine Ben Ali blocked video sharing sites, Tunisians turned to Facebook.

"Facebook has a lot of problems with freedom of expression," said York, "but it became the platform of choice because it was accessible, and Tunisia never managed to block it for more than a couple of weeks because when they did there were street protests."

Technology may or may not be neutral, but its context never is. In the US for many years, Section 230 of the Communications Decency Act has granted somewhat greater protection to online speech than to that in traditional media. The EU long ago settled these questions by creating the framework of notice-and-takedown rules and generally refusing to award online speech any special treatment. (You may like to check out EDRI's response to the ecommerce directive (PDF).)

Paul Levy, a lawyer with Public Citizen and organizer of the S230 discussion, didn't like the sound of this. It would be, he argued, too easy for the unhappily criticized to contact site owners and threaten to sue: the heckler's veto can trump any technology, neutral or not.

What, Hurley asked Google's policy director, Bob Boorstin, to close the day, would be the one thing he would do to improve individuals' right to self-determination? Give them more secure mobile devices, he replied. "The future is all about what you hold in your hand." Across town, a little earlier, Senators Franken and Blumenthal introduced the Location Privacy Protection Act of 2011.

Certainly, mobile devices - especially the Speak to Tweet service - gave Africa's dissidents a direct way to get their messages out. But at the same time, the tools dictators use to censor and suppress Internet speech are created almost entirely by US companies.

Said Weddady in some frustration, "Weapons are highly regulated. If you're trading in fighter jets there are very stringent frames of regulations that prevent these things from falling into the wrong hands. What is there for the Internet? Not much." Worse, he said, no one seems to be putting political will behind enforcing the rules that do exist. In the West we argue about filtering as a philosophical issue. Elsewhere, he said, it's life or death. "What am I worth if my ideas remain locked in my head?"


June 15, 2011

Public private lives

A bookshop assistant followed me home the other day, wrote down my street address, took a photograph of my house. Ever since, every morning I find an advertising banner draped over my car windshield that I have to remove before I can drive to work.

That is, of course, a fantasy scenario. But it's an attempt to describe what some of today's Web site practices would look like if transferred into the physical world. That shops do not follow you home is why the analogy between Web tracking and walking on a public street or going into a shop doesn't work. The analogy was raised by Jim Harper, the director of information policy studies at the Cato Institute, on the first day of ACM Computers, Freedom, and Privacy, at his panel on the US's Do Not Track legislation. Casual observers on the street are not watching you in a systematic way; you can visit a shop anonymously, and, depending on its size and the number of staff, you may or may not be recognized the next time you visit.

This is not how the Web works. Web sites can fingerprint your browser by the ecology of add-ins that are peculiar to you and use technologies such as cookies and Flash cookies to track you across the Web and serve up behaviorally targeted ads. The key element - and why this is different from, say, using Gmail, which also analyzes content to post contextual ads - is that all of this is invisible to the consumer. As Harlan Yu, a PhD student in computer science at Princeton, said, advertisers and consumers are in an arms race. How wrong is this?
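The fingerprinting half of that arms race needs no cookie at all: a site simply hashes together the attributes a browser volunteers on every visit. A minimal sketch, with invented attribute values - real trackers combine many more signals (installed fonts, canvas rendering, and so on):

```python
import hashlib

def browser_fingerprint(attributes: dict) -> str:
    """Hash browser-reported attributes into a short, stable identifier.

    Nothing is stored on the user's machine; the same browser presents
    the same attributes on each visit and so hashes to the same ID.
    """
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical values of the kind a browser reveals in headers and JavaScript
visitor = {
    "user_agent": "Mozilla/5.0 (Windows NT 6.1) Firefox/4.0.1",
    "screen": "1280x800x24",
    "timezone": "UTC-5",
    "plugins": "Flash 10.3;QuickTime;Java 1.6",
}
print(browser_fingerprint(visitor))
```

The combination of attributes, individually unremarkable, is what makes the fingerprint near-unique - and, unlike a cookie, there is nothing for the user to delete.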

Clearly, enough consumers find behavioral targeting creepy enough that there is a small but real ecology of ad-blocking technologies - the balking consumer side of the arms race - including everything from Flashblock and Adblock for Mozilla to the do-not-track setting in the latest version of Internet Explorer. (Though there are more reasons to turn off ads than privacy concerns: I block them because anything moving or blinking on a page I'm trying to read is unbearably distracting.)

Harper addressed his warring panellists by asking the legislation's opponents, "Why do you think the Internet should be allowed to prey on the entrails of the hapless consumer?" And of the legislation's sympathizers, "What did the Internet ever do to you that you want to drown it in the bathtub?"

Much of the ensuing, very lively discussion centered on the issue of trade-offs, something that's been discussed here many times: if users all opt out of receiving ads, what will fund free content? Nah, said Ed Felten, on leave from Princeton for a stint at the FTC, what's at stake is behaviorally targeted ads, not *all* ads.

The good news is that although it's the older generation who are most concerned about issues like behavioral targeting, teens have their own privacy concerns. My own belief for years has been that gloomy prognostications that teens do not care about privacy are all wrong. Teens certainly do value their privacy; it's just that their threat model is their parents. To a large extent Danah Boyd provided evidence for this view. Teens, she said, faced with the constant surveillance of well-meaning but intrusive teachers and parents, develop all sorts of strategies to live their private lives in public. One teen deactivates her Facebook profile every morning and reactivates it to use at night, when she knows her parents won't be looking. Another works hard to separate his friends list into groups so he can talk to each in the manner they expect. A third practices a sort of steganography, hiding her meaning in plain sight by encoding it in cultural references she knows her friends will understand but her mother will misinterpret.

Meantime, the FTC is gearing up to come down hard on mobile privacy. Commissioner Edith Ramirez of course favors consumer education, but she noted that the FTC will be taking a hard line with the handful of large companies who act as gatekeepers to the mobile world. Google, which violated Gmail users' privacy by integrating the social networking facility Buzz without first asking consent, will have to submit to privacy audits for the next 20 years. Twitter, whose private messaging was broken into by hackers, will be audited for the next ten years - twice as long as the company has been in existence.

"No company wants to be the subject of an FTC enforcement action," she said. "What happens next is largely in industry's hands." Engineers and developers, she said, should provide voluntary, workable solutions.

Europeans like to think the EU manages privacy somewhat better, but one of the key lessons to emerge from the first panel of the day, a compare-and-contrast discussion of data-sharing between the EU and the US, was that there's greater parity than you might think. What matters, said Edward Hasbrouck, is not data protection but how the use of data affects fundamental rights - such as the right to fly or to transfer money.

In that discussion, the Department of Homeland Security representative, Mary Ellen Callahan, argued that the US is much more protective of privacy than a simple comparison of data protection laws might suggest. (There is a slew of US privacy legislation in progress.) The US operates fewer wiretaps by a factor of thousands, she argued, and is far more transparent.

Ah, yes, said Frank Schmiedel, answering questions to supplement the videotaped appearance of European Commission vice-president Viviane Reding, but if the US is going to persist in its demand that the EU transfer passenger name record, financial, and other data, one of these days, Alice, one of these days...the EU may come knocking, expecting reciprocity. Won't that be fun?


June 14, 2011

Untrusted systems

Why does no one trust patients?

On the TV series House, the eponymous sort-of-hero has a simple answer: "Everybody lies." Because he believes this, and because no one appears able to stop him, he sends his minions to search his patients' homes hoping they will find clues to the obscure ailments he's trying to diagnose.

Today's Health Privacy Summit in Washington, DC, the zeroth day of this year's Computers, Freedom, and Privacy conference, pulled together, in the best Computers, Freedom, and Privacy tradition, speakers from all aspects of health care privacy. Yet many of them agreed on one thing: health data is complex, decisions about health data are complex, and it's demanding too much of patients to expect them to be able to navigate these complex waters. And this is in the US, where to a much larger extent than in Europe the patient is the customer. In the UK, by contrast, the customer is really the GP and the patient has far less direct control. (Just try looking up a specialist in the phone book.)

The reality is, however, as several speakers pointed out, that doctors are not going to surrender control of their data either. Both physicians and patients have an interest in medical records. Patients need to know about their care; doctors need records both for patient care and for billing and administrative purposes. But beyond these two parties are many other interests who would like access to the intimate information doctors and patients originate: insurers, researchers, marketers, governments, epidemiologists. Yet no one really trusts patients to agree to hand over their data; if they did, these decisions would be a lot simpler. But if patients can't trust their doctor's confidentiality, they will avoid seeking health care until they're in a crisis. In some situations - say, cancer - that can end their lives much sooner than is necessary.

The loss of trust, said lawyer Jim Pyles, could bring on an insurance crisis, since the cost of electronic privacy breaches could be infinite, unlike the ability of insurers to insure those breaches. "If you cannot get insurance for these systems you cannot use them."

If this all (except for the insurance concerns) sounds familiar to UK folk, it's not surprising. As Ross Anderson pointed out, greatly to the Americans' surprise, the UK is way ahead on this particular debate. Nationalized medicine meant that discussions began in the UK as long ago as 1992.

One of Anderson's repeated points is that the notion of the electronic patient record has little to do with the day-to-day reality of patient care. Clinicians, particularly in emergency situations, want to look at the patient. As you want them to do: they might have the wrong record, but you know they haven't got the wrong patient.

"The record is not the patient," said Westley Clarke, and he was so right that this statement was repeated by several subsequent speakers.

One thing that apparently hasn't helped much is the Health Insurance Portability and Accountability Act, which one of the breakout sessions considered scrapping. Is HIPAA a failure or, as long-time Canadian privacy activist Stephanie Perrin would prefer, a first step? The distinction is important: if HIPAA is seen as an expensive failure it might be scrapped and not replaced. First steps can be succeeded by further, better steps.

Perhaps the first of those should be another of Perrin's suggestions: a map of where your data goes, much like Barbara Garson's book Money Makes the World Go Around, which followed her bank deposit as it was loaned out across the world. Most of us would like to believe that what we tell our doctors remains cosily tucked away in their files. These days, not so much.

For more detail see Andy Oram's blog.


June 10, 2011

The creepiness factor

"Facebook is creepy," said the person next to me in the pub on Tuesday night.

The woman across from us nodded in agreement and launched into an account of her latest foray onto the service. She had, she said, uploaded a batch of 15 photographs of herself and a friend. The system immediately tagged all of the photographs of the friend correctly. It then grouped the images of her and demanded to know, "Who is this?"

What was interesting about this particular conversation was that these people were not privacy advocates or techies; they were ordinary people just discovering their discomfort level. The sad thing is that Facebook will likely continue to get away with this sort of thing: it will say it's sorry, modify some privacy settings, and people will gradually get used to the convenience of having the system save them the work of tagging photographs.

In launching its facial recognition system, Facebook has done what many would have thought impossible: it has rolled out technology that just a few weeks ago *Google* thought was too creepy for prime time.

Wired UK has a set of instructions for turning tagging off. But underneath, the system will, I imagine, still recognize you. What records are kept of this underlying data and what mining the company may be able to do on them is, of course, not something we're told about.

Facebook has had to rein in new elements of its service so many times now - the Beacon advertising platform, the many revamps to its privacy settings - that the company's behavior is beginning to seem like a marketing strategy rather than a series of bungling missteps. The company can't be entirely privacy-deaf; it numbers among its staff the open rights advocate and former MP Richard Allan. Is it listening to its own people?

If it's a strategy, it's not without antecedents. Google, for example, built its entire business without TV or print ads. Instead, every so often it would launch something so cool that everyone wanted to use it, earning more free coverage than it could ever have afforded to pay for. Is Facebook inverting this strategy by releasing projects it knows will cause widely covered controversy and then reining them back in only as far as the boundary of user complaints? These are smart people, and normally smart people learn from their own mistakes. But Zuckerberg, whose comments on online privacy have approached arrogance, is apparently justified, in that no matter what mistakes the company has made, its user base continues to grow. As long as business success is your metric, until masses of people resign in protest, he's golden. Especially when the IPO moment arrives, expected to be before April 2012.

The creepiness factor has so far done nothing to hurt its IPO prospects - which, in the absence of an actual IPO, seem to be rubbing off on the other social media companies going public. Pandora (net loss last quarter: $6.8 million) has even increased the number of shares on offer.

One thing that seems to be getting lost in the rush to buy shares - LinkedIn popped to over $100 on its first day, and has now settled back to $72 and change (for a price/earnings ratio of 1,076) - is that buying first-day shares isn't what it used to be. Even during the millennial technology bubble, buying shares at the launch of an IPO was approximately like joining a queue at midnight to buy the new Apple whizmo on the first day, even though you know you'll be able to get it cheaper and debugged in a couple of months. Anyone could have gotten much better prices on Amazon shares for some months after that first-day bonanza, for example (and either way, in the long term, you'd have profited handsomely).

Since then, however, a new game has arrived in town: private exchanges, where people who meet a few basic criteria for being able to afford to take risks trade pre-IPO shares. The upshot is that even more of the best deals have already gone by the time a company goes public.

In no case is this clearer than the Groupon IPO, about which hardly anyone has anything good to say. Investors buying in would be the greater fools; a co-founder's past raises questions, and its business model is not sustainable.

Years ago, Roger Clarke predicted that the then brand-new concept of social networks would inevitably become data abusers simply because they had no other viable business model. As powerful as the temptation to do this has been while these companies have been growing, it seems clear the temptation can only become greater when they have public markets and shareholders to answer to. New technologies are going to exacerbate this: performing accurate facial recognition on user-uploaded photographs wasn't possible when the first pictures were being uploaded. What capabilities will these networks be able to deploy in the future to mine and match our data? And how much will they need to do it to keep their profits coming?


June 3, 2011

A forgotten man and a bowl of Japanese goldfish

"I'm the forgotten man," Godfrey (William Powell) explains in the 1936 film My Man Godfrey.

Godfrey was speaking during the Great Depression, when prosperity was just around the corner ("Yes, it's been there a long time," says one of Godfrey's fellow city dump dwellers) but the reality for many people was unemployment, poverty, and a general sense that they had ceased to exist except, perhaps, as curiosities to be collected by the rich in a scavenger hunt. Today the rich in question would record their visit to the city dump in an increasingly drunken stream of Tweets and Facebook postings, and people in Nepal would be viewing photographs and video clips even if Godfrey didn't use a library computer to create his own Facebook page.

The EU's push for a right to be forgotten is a logical outgrowth of today's data protection principles, which revolve around the idea that you have rights over your data even when someone else has paid to collect it. EU law grants the right to inspect and correct the data held about us and to prevent its use in unwanted marketing. The idea that we should also have the right to delete data we ourselves have posted seems simple and fair, especially given the widely reported difficulty of leaving social networks.

But reality is complicated. Godfrey was fictional; take a real case, from Pennsylvania. A radiology trainee, wanting a reality check on whether the radiologist she was shadowing was behaving inappropriately, sought advice from her sister, also a health care worker, before reporting the incident. The sister told a co-worker about the call, who told others, and someone in that widening ripple posted the story on Facebook, from where it was reported back to the student's program director. Result: the not-on-Facebook trainee was expelled on the grounds that she had discussed a confidential issue on a cell phone. Lawsuit.

So many things had to go wrong for that story to rebound and hit that trainee in the ass. No one - except presumably the radiologist under scrutiny - did anything actually wrong, though the incident illustrates the point that information travels further and faster than people think. Preventing this kind of thing is hard. No contract can bar unrelated, third-hand gossipers from posting information that comes their way. There's nothing to invoke libel law. The worst you can say is that the sister was indiscreet and that the program administrator misunderstood and overreacted. But the key point for our purposes here is: which data belongs to whom?

Lilian Edwards has a nice analysis of the conflict between privacy and freedom of expression that is raised by the right to forget. The comments and photographs I post seem to me to belong to me, though they may be about a dozen other people. But on a social network your circle of friends are also stakeholders in what you post; you become part of their library. Howard Rheingold, writing in his 1992 book The Virtual Community, noted the ripped and gaping fabric of conversations on The Well when early member Blair Newman deleted all his messages. Photographs and today's far more pervasive, faster-paced technology make such holes deeper and multi-dimensional. How far do we need to go in granting deletion rights?

The short history of the Net suggests that complete withdrawal is roughly impossible. In the 1980s, Usenet was thought of as an ephemeral medium. People posted in the - they thought - safe assumption that anything they wrote would expire off the world's servers in a couple of weeks. And as long as everyone read live online that was probably true. But along came offline readers and people with large hard disks and Deja News, and Usenet messages written in 1981 with no thought of any future context are a few search terms away.

"It's a mistake to only have this conversation about absolutes," said Google's Alma Whitten at the Big Tent event two weeks ago, arguing that it's impossible to delete every scrap about anyone. Whitten favors a "reasonable effort" approach and a user dashboard so that users can see and control the data being held about them. But we all know the problem with market forces: it is unlikely that any of the large corporations will come up with really effective tools unless forced. For one thing, there is a cultural clash here between the EU and the US, the home of many of these companies. But more important, it's just not in their interests to enable deletion: mining that data is how those companies make a living, and in return we get free stuff.

Finding the right balance between freedom of expression (my right to post about my own life) and privacy, including the right to delete, will require a mix of answers as complex as the questions: technology (such as William Heath's Mydex), community standards, and, yes, law, applied carefully. We don't want to replace Britain's chilling libel laws with a DMCA-like deletion law.
