" /> net.wars: November 2014 Archives


November 28, 2014

Roger Clarke explains it all for you

"You'll hear this term a lot," Roger Clarke said. "Cultural cringe." The initial explanation was long and rambled from the 20th century world wars to the 1986 movie Crocodile Dundee. Meanwhile, we were on a whirlwind tour of Canberra: the old and new Parliament houses, Anzac Memorial Drive@@, the bus station. The hills were alive with kangaroos, despite my extreme disability to see animals from moving vehicles.

Clarke later boiled it down to two shorter versions:

"It's always done better somewhere else."


"The need for external approval."

It turned out to be a phrase no one I met used but everyone seemed to recognize.

The neighbour of one of my hosts along the coast north of Sydney put it this way: "It's a lack of belief in your own history."

To some extent this is true in a lot of places. The US, which has such an apparently similar history, seems to have gotten all of Britain's arrogance about its place in the world: what American feels the need to prove themselves outside their own country? But, says Clarke, late 19th century Australian artists went to Paris for validation while late 20th century Australian intellectuals went to London. Today, American TV sets the standard, which may at least partially explain why the morning news sounded so...familiar. The snow in Buffalo, NY and Britain's raised state of terror alert have both made the morning news in the last two weeks. (On the other hand, wandering the TV dial unearthed French, Chinese, and Indonesian news.)

I doubt Clarke himself, probably Australia's longest-serving and best-known privacy advocate, would claim that his views of the country he's lived in most of his life are typical of Australia's 23.6 million inhabitants. Even so, his thoughts resolve the confusions of last week into some kind of coherence.

"We've been an outpost from the beginning," he says, and pointed to the Australian flag as a slightly fanciful example: Union Jack in the upper left corner. Then the remaining three-quarters, in his view, available for the Japanese flag, the US flag, and whoever comes next. "We started with the mentality that elsewhere was better."

It's a roundabout way of explaining why Australia, as remote as it is, isn't as distinct in terms of policies and culture as you might expect. Foreign affairs minister Julie Bishop looks and sounds like Theresa May's understudy. Australia is in the process of passing new-to-them, familiar-to-us anti-terrorism laws. The National Security Legislation Amendment expands the surveillance powers of the Australian Security and Intelligence Organisation (ASIO, Australia's equivalent of Britain's MI5). The Foreign Fighters bill expands overseas surveillance and prohibits speech "advocating terrorism". The Telecommunications (Interception and Access) Amendment (Data Retention) Bill seeks to require ISPs to retain communications and location data for two years.

"Australia had regarded ASIO as being largely buffoons until quite some years after 2001," Clarke says. ASIS, the Australian Secret Intelligence Service, the equivalent of Britain's MI6, "were more dangerous because they were armed" - but they functioned overseas.

"Somewhere around the mid-2000s they all ended up with crew cuts or no hair, and they sounded more and more like Americans. From then on, they began taking the kinds of measures the US and equally scared UK came up with and dropping them into Australian law."

For Clarke, all of that is explicable as part of the country's psyche. The Anzac troops rushed to help fight Europe's wars (and the memorials are everywhere here). When the country's northern outpost, Darwin, was bombed in World War II, it was American, rather than British, troops that kept the rest of the country from being invaded. "The reliance gene transferred."

And then came the Internet, built by laying US-invented TCP/IP over networking inventions contributed from around the world. Australian computer scientists immediately saw the benefits: connections were up and running as early as 1989, and Australian engineers were involved in all of the Internet bodies. Electronic Frontiers Australia was founded in 1994, only a few years after the US's EFF - and ten years before Britain's Open Rights Group. The 14th Web server in the world was in Australia. For the first ten years, all Australia's international Internet traffic flowed via the US.

Meanwhile, although Australia was involved in the creation of and has acceded to both the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights, it hasn't given them force in domestic legislation even though that's one of the obligations inherent in accession.

"It wouldn't matter if we had something like the Bill of Rights," says Clarke, "but the actual situation in Australian law is that the Constitution embodies about six rights for individuals, none of them human rights." Among the six: the right to vote, the right not to have property seized by the government. "Any human rights that exist in Australian law derive from common law, which in large part derives from 1901 British law." So: no statutory right to freedom of expression or privacy, and no framework via which new laws can be challenged - and it is hard work to prevent the government from succeeding in passing laws that in the US (for example) could be challenged on Constitutional grounds.

"We are such a British country and such an American country, and yet so different."

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

November 21, 2014

The life of Bryan

"Is 'Dead Bryan' a real person?" No one in our group in this Melbourne bar seemed very sure. Freedom of information request! Or DuckDuckGo search.

"Dead Bryan" is the near-affectionate name for the guy in the photographs on Australia's plain cigarette packaging, required by law since 2012. Liberally framed in black health warnings, branding is limited to olive green lettering giving the manufacturer's name. The most eye-catching feature is the twin pictures of a healthy young man and what purports to be the same young man, six weeks later: Bryan, dead at 34. He looks terrible. Besides being dead, I mean.

This deterrent packaging is an example of the troubles a country faces these days when it's the first to adopt a new idea that takes square aim at the interests of corporations and other countries. It's costing Australia millions to defend the plain-packaging law against a challenge brought by Philip Morris under an investment treaty and WTO complaints from five tobacco-producing countries - Cuba, the Dominican Republic, Honduras, Indonesia, and Ukraine - which claim the packaging violates trade and intellectual property law by interfering with their ability to promote their premium tobacco products. Suddenly there is a new twist to intellectual property laws as a public health issue.

If the TTIP and TPP treaties come into being, we can expect many more suits like this one under the treaties' ISDS provisions. These Investor-State Dispute Settlement rules make it easier for companies to seek legal redress against the actions of democratically elected governments. In the plain-packaging case, even leaving the principle aside, the cost of defending the lawsuits is probably dwarfed by the cost of medical care for generations of smokers.

Dead Bryan comes from the "this is your brain on drugs" school of deterrent advertising. "I don't really like looking at him," says one of my companions, noting that at A$26 per pack of 25 he really can't afford to smoke. And yet, a few minutes later he's lighting up. Ingrained addictions are hard to shake.

Aside from the three absurd kangaroos bounding across a Victoria field the other day, much of the landscape I'm encountering is familiar. The morning news this week has featured Julie Bishop, the minister for foreign affairs, who could be auditioning for Theresa May's job as the British Home Secretary. Early in the week Bishop warned the UN of the dangers of the young, modern terrorist. Like the UK authorities, Bishop talks about the dangers of social media's being used to recruit and incite disaffected young people. As in the UK, there is talk of not allowing them back into the country if they leave.

I'm not clear on the logic of such proposals. "We don't want terrorists living in our country" is simple enough. But do you really want to create a class of much angrier, even more disaffected individuals who are now stateless? Sending misfits somewhere else worked when Australia and the US were viewed as empty continents; unless climate change warms up Antarctica by a whole lot, we don't have that option any more. As things are, these we-won't-let-them-travel ideas are reminiscent of the McCarthy era in the US, when people believed to be Communists were denied passports. I knew several on the folk scene who were harmless unless you hate banjo. When we see the same proposals coming from such disparate places and party affiliations it's natural to ask: are Bishop (and her prime minister, "I'm not a tech head" Tony Abbott), Obama, and Theresa May all facing the same problems - or being briefed by the same security agencies and technology vendors?

And yet the Migration Museum in Adelaide shows that the commonality of thought goes some way back. During World War II, when the US was interning Japanese-Americans in camps, Australia was doing the same thing with its German immigrants. One of the profiles the museum publishes notes that after the war, destitute, one such internee begged the government to revoke his naturalized citizenship and send him back to Germany.

The contradictions and subtleties of a newly-met country are always hard to grasp: Australia is famous for loud, hearty matiness - and yet you can be fined $240 for swearing in public in Melbourne. I'm told the current government doesn't believe climate change is a real thing, and yet Australia is the poster child for the effects. The temperature today in Sydney is expected to hit 104 degrees Fahrenheit (40C) - and it's only November.

The best analogy I can come up with is to the landscape: the English brought roses, lawns, and sculptured gardens; the Italians (and probably Greeks) brought olive trees, which actually do fit the climate; the Scots brought gorse, which here is a weed in need of pest control. Similarly, Adelaide's distinctive look is European housing styles built out of local stone and decorated with 150-year-old iron lacework. I suppose it's unreasonable to expect the country's laws to be any different.


November 14, 2014

Half time

I can't remember when I learned that the world was neatly divided up into 24 steps that translated geographically into time zones, so I must have been very young. The rampant spread of international communications and travel probably means most people learn pretty early that time-keeping moves across the globe. The all-day coverage of the turn of the millennium was a great worked example: every hour, CNN would turn to a new group of cities in a new time zone where it was midnight all over again.

It was only this week that I became more forcibly aware of the areas of the world where this neatly stepped, digital-friendly way of looking at things breaks down. To wit: Adelaide, South Australia, where time slips away at a half-hour offset from everyone else. Here, depending on the time of year, it's anything from eight and a half to ten and a half hours ahead of London, 13 and a half to 15 and a half hours ahead of New York. The stray half-hour (plus the two countries' recent asynchronous moves to winter and summer time respectively) makes calculating the time "back home" remarkably more confusing than you'd think it should be.
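The shifting gap can be checked rather than calculated by hand; here is a minimal sketch using Python's zoneinfo module (standard library from 3.9), which looks up each region's daylight-saving rules from the IANA time zone database:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib in Python 3.9+

# The Adelaide-London gap depends on which side of its daylight-saving
# switch each city is on: November (southern summer, northern winter)
# gives the widest gap, July the narrowest.
for when in (datetime(2014, 11, 14), datetime(2014, 7, 14)):
    adelaide = when.replace(tzinfo=ZoneInfo("Australia/Adelaide"))
    london = when.replace(tzinfo=ZoneInfo("Europe/London"))
    gap = adelaide.utcoffset() - london.utcoffset()
    print(when.date(), gap)  # 10:30 in November, 8:30 in July
```

The half-hour fraction never goes away; only the whole-hour part moves as each country switches to and from summer time.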

Wikipedia informs me that there are all sorts of oddities I've never contemplated, some of them embarrassing omissions from my general knowledge. Prime among the latter are India and Sri Lanka, which, I'd managed in 60 years never to notice, also live on a half-hour offset (UTC+5:30). If you're looking at the world in those 24 steps, India spans two time zones, but apparently saw the midway compromise of a half-hour offset as an attractive way to help unify a newly independent country.

More oddities: two zones in the Pacific that lie on a north-south plane have the same time but dates a day apart. For similar reasons, the world actually has 26 hourly time zones, not 24. And then there's New Zealand's Chatham Islands, which lope along on a quarter-hour offset (UTC+12:45).

All of this is the standardized - analogous to digitized - version of time. In the analog world that pre-dated standardized time, which the railways' need for predictable schedules brought into being in the 19th century, time was defined by whatever your nearest sundial said. Noon was when the sun said it was, and time was inextricably linked to the movements of sun, planets, and stars in the physical world.

In 2005, I wrote about efforts by US scientists to eliminate leap seconds. The incongruity was fun: scientists, engineers, the ITU, and national governments were all spending years squabbling over seconds. Now it's decades: the issue has been put off and left unresolved, and there's a new vote in 2015. Yes: it's now decades they've spent squabbling over seconds.

But of course the debate isn't really about seconds: it's about whether to take yet another step in divorcing time as we use, understand, and measure it from its natural origins. The issue is the earth's slowing rotation, which means that standardized time has to be periodically corrected - by adding leap seconds - to bring it back into alignment with the planet. The second itself already took that step in 1967, when its definition changed from 1/86,400 of the mean solar day to the transition frequency of the caesium atom; abolishing leap seconds would cut civil time loose from the earth's rotation entirely. Either decision creates problems for someone: astronomers, GPS, other navigational instruments, and satellite communications are all affected by how time is defined.
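Most everyday software has already made its choice: POSIX-style timekeeping, which Python's datetime module follows, simply pretends leap seconds don't exist. A small sketch, using the real leap second inserted at the end of June 30, 2012:

```python
from datetime import datetime, timezone

# June 30, 2012 actually contained 86,401 SI seconds - a leap second
# was inserted at 23:59:60 UTC - but POSIX-style time, and therefore
# Python's datetime arithmetic, treats every day as exactly 86,400
# seconds long.
start = datetime(2012, 6, 30, tzinfo=timezone.utc)
end = datetime(2012, 7, 1, tzinfo=timezone.utc)
print((end - start).total_seconds())  # 86400.0 - the leap second is invisible
```

The correction, in other words, lives entirely in the hands of the timekeepers; ordinary clock arithmetic never sees it, which is part of why the squabble stays invisible to most of us.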

It's difficult for a non-expert to evaluate the validity of the claims on either side, or which side would suffer more if the proposal it opposes is adopted. On a personal level, there's no need for most of us to care as long as the stuff we use every day keeps working. The average person's uses of time do not require the level of precision that GPS satellites and giant telescopes do. Does either of these arcane proposals really affect when you have lunch?

And yet emotionally, the argument that man has used sun and stars for navigation for tens of thousands of years, and we should not lightly abandon that physical connection has a certain resonance. For one thing, I like the notion that the correct time can be derived with some precision from devices that one person can create with little more than a stick and a piece of string - or machine tools and precision drawings. The ability to derive the basic principles of science the same way our forebears did is important.

We forget physics at our peril. Physics is why the traders in Michael Lewis's Flash Boys sought to shave inches off the fiber connections that connected them to the exchanges. Physics is why the 2014 map of undersea cables looks a lot like the 1891 map of world telegraph lines. One look at either map, and you know precisely where to put the wiretaps (or the backhoes).

When we talk about taking back the Net, physics matters. We can lobby for better law; we can promote different norms; and we can change the values embedded in computer code. About physics, we can't do a damn thing.


November 7, 2014

Private fears in public places

A lot has been written this week about the Samaritans' peculiar decision to release the Radar app - pulled minutes before this column was posted - which monitors the tweets of the people its users follow in case they seem to need help. It's a nice bit of mathematical exposition on the reach of social media to note that it took only a little over 3,000 people activating the app for Radar to be monitoring 1.64 million Twitter accounts.
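A back-of-the-envelope check on that reach, as a minimal sketch (the 500-odd figure is an average implied by the two reported numbers, not a figure the Samaritans published):

```python
# Reported figures: ~3,000 activations covering 1.64 million accounts.
activations = 3_000
monitored_accounts = 1_640_000

# Average unique accounts monitored per activating user. Since follow
# lists overlap, the true average list size can only be higher.
average = monitored_accounts / activations
print(round(average))  # roughly 547 accounts per activation
```

Each person who opted in, in other words, silently put several hundred other people's public words under automated watch.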

A petition asks Twitter to shut the app down, while IT Pro notes a large number of both false positives (tweets incorrectly identified as worrying) and false negatives (tweets the app ought to have spotted). The Guardian says the Information Commissioner is looking into the myriad complaints.

The most peculiar aspect of this is that the Samaritans are not online newcomers. Instead, the organization has a long history online, going back to the mid-1990s, when it opened up an email account and, if I remember correctly, a conference on CIX. Twenty years ago, I remember (perhaps incorrectly) being told that relatively few people used email to access their services, but those that did tended to be the most isolated and desperate.

Legally, the Samaritans may be right (IANAL) to argue that Radar does not break the law. Data protection principles ban collecting data for one stated purpose and reusing it without consent for a different one; but the Samaritans do not own or operate either Twitter or the tweets the app analyzes, which were, of course, publicly posted on a service that everyone knows is searched and analyzed by advertisers looking for appropriate targets. Yet it feels completely wrong that an organization whose only goal is to help people has acted in this way.

The answer, I think, lies in an October 2003 posting by Danny O'Brien in which he discusses the Net's inability to support "public-private" spaces. If you have a private conversation with a friend in a busy pub, the fact that you are in a public space filled with passers-by doesn't lessen your expectation of (mostly) privacy. You're talking to your friend, not to the waitress who takes away your empty glasses or the people eavesdropping at the next table. If you expressed despair in that conversation, you would not be at all pleased if someone came over, interrupted, and said to your friend - not even to you - "Are you worried about him? Do you think he might be suicidal? Do you want us to help?" Even while recognizing (eventually) that the interloper was well-meaning, I imagine anger, harsh words, and a quick departure.

Danny's posting makes this key point: "They are not talking to *you*."

In the physical world, social norms hold that when a private conversation takes place in public even if we can hear it we pretend otherwise. Mobile phones have disrupted this imaginary Cone of Silence by allowing people to dissociate their loud, lengthy, one-sided, public conversations from the others sharing the same space, who may stare at each other amazed at the detail being disclosed. Would The Samaritans station people in coffee shops to monitor these conversations and offer an assessment if the overheard half sounded as though the person on the other end were upset? Highly unlikely.

Which all leads me to suspect that the often-discussed distancing effect of digital media has a role to play here. That's obviously not the only factor - The Samaritans would surely think it wrong to hack into people's email to monitor their state of mind - but it's a contributor. The people leaving messages on social media are not always easy to discern behind the user names and raw text.

The security services have been arguing for some time that data collection (as in mass surveillance) is innocuous if it's not monitored by humans - that automated surveillance is not an invasion of privacy. I've tried in the past to outline why I think this is wrong: that the collection itself does damage. Jay Stanley argues that what matters most is not whether robots or humans do the collecting but the resulting "reverberations" throughout one's life. One such reverberation is chilled speech. Probably most distressed people consider carefully what they can tell their friends without alarming them.

Whether the organization has broken the law misses the point. Lawrence Lessig, among others, has noted three ways humans create social order: law, code, and norms. The Samaritans began with poorly considered code, and have sought to back it up with law. Instead, what matters here - leaving aside the knotty and complex questions of how to treat people with mental disorders - is norms. The Samaritans' response - telling people to "opt out" by taking their tweets private, or to join the organization's whitelist - is analogous to advising you to take your pub conversation home if you don't want to be interrupted. The Samaritans' first clue that it was disconnected from our social norms should have come when it drafted its own press release calling Twitter an important surveillance tool. That's not what most of us mean by "social media".
