
March 5, 2021

Covid's children

I wonder a lot about how the baby downstairs will develop differently because of his September 2020 birth date. In his first five months, the only humans who have been in close contact with him are his two parents, a smattering of doctors and nurses, and a stray neighbor who occasionally takes him for walks. Walks, I might add, in which he never gets out of his stroller but in which he exhibits real talent for staring contests (though less for intelligent conversation). His grandparents he only knows through video calls. His parents think he's grasped that they're real, though not present, people. But it's hard to be sure.

The effects of the pandemic are likely to be clear a lot sooner for the older children and young people whose lives and education have been disrupted over the past year. This week, as part of the LSE Post-Covid World Festival, Sonia Livingstone (for whose project I wrote some book reviews a few years ago) led a panel to discuss those effects.

Few researchers in the UK - Livingstone, along with Andy Phippen, is one of the exceptions, as is, less formally, filmmaker and House of Lords member Beeban Kidron, whose 2013 film InRealLife explores teens' use of the Internet - ever bother to consult children to find out what their online experiences and concerns really are. Instead, the agenda shaped by politicians and policy makers centers on adults' fears, particularly those that can be parlayed into electoral success. The same people who fret that social media is posing entirely new problems today's adults never encountered as children refuse to find out what those problems look like to the people actually experiencing them. Worse, the focus is narrow: protecting children from pornography, grooming, and radicalization is everywhere, but protecting them from data exploitation is barely discussed. In the UK, as Jen Persson, founder of defenddigitalme, keeps reminding us, collecting children's data is endemic in education.

This was why the panel was interesting: all four speakers are involved in projects aimed at understanding and amplifying children's and young people's own concerns. From that experience, all four - Konstantinos Papachristou, the youth lead for the #CovidUnder19 project; Maya Götz, who researches children, youth, and television; Patricio Cuevas-Parra, who is part of a survey of 10,000 children and young people; and Laurie Day - highlighted similar issues of inequality and lack of access - not just to the Internet but also to vaccines and good information.

In all countries, the shift to remote learning has been abrupt, exposing infrastructure issues that were always urgent, but never quite urgent enough to fix. Götz noted that in some Asian countries and Chile she's seeing older technologies being pressed into service to remedy some of this - technologies like broadcast TV and radio; even in the UK, after the first lockdown showed how many low-income families could not afford sufficient data plans, the BBC began broadcasting curriculum-based programming.

"Going back to normal," Day said, "needs a rethink of what support is needed." Yet for some students the move to online learning has been liberating, lightening social and academic pressures and giving space to think about their values and the opportunity to be creative. We don't hear so much about that; British media focus on depression and loss.

By the time the baby downstairs reaches school age, the pandemic will be over, but its footprint will be all over how his education proceeds.

Persson, who focuses on the state's use of data in education, says that one consequence of the pandemic is that Microsoft and Google have entrenched themselves much more deeply into the UK's education infrastructure.

"With or without covid, schools are dependent on them for their core infrastructure now, and that's through platforms joining up their core personal data about students and staff - email addresses, phone numbers, names, organizational data - and joining all that up," she says. Parents are encouraged to link to their children's accounts, and there is, for the children concerned, effectively, "no privacy". The software, she adds, was really designed for business and incompletely adapted for education. For example, while there are controls schools can use for privacy protection, the defaults, as always, are towards open sharing. In her own children's school, which has 2,000 students, the software was set up so every user could see everyone else's email address.

"It's a huge contrast to [the concern about] online harms, child safety, and the protection mantra that we have to watch everything because the world is so unsafe," she says. Partly, this is also a matter of perception: policy makers tend to focus on "stranger danger" and limiting online content rather than ID theft, privacy, and how all this collected data may be used in the future. The European Digital Rights Initiative (EDRi) highlights the similar thinking behind European Commission proposals to require the platforms to scan private communications as part of combating child sexual abuse online.

All this awaits the baby downstairs. The other day, an 18-month-old girl ran up to him, entranced. Her mother pulled her back before she could touch him or the toys tied to his stroller. For now, he, like other pandemic babies, is surrounded by an invisible barrier. We won't know for several decades what the long-term effect will be.


Illustrations: Sonia Livingstone's LSE panel.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

December 13, 2019

Becoming a science writer

As an Association of British Science Writers board member, I occasionally speak to science PhD students and postdocs about science writing. Since the most recent of these excursions was just this week, I thought I'd summarize some of what I've said.

To trained scientists aiming to make the switch: you are starting from a more knowledgeable place than the arts graduates who mostly populate this field. You already know how to investigate and add to a complex field of study, have a body of knowledge from which to reliably evaluate new claims, and know the significant contributors to your field and adjacent ones. What you need to learn are basic journalism skills such as interviewing, identifying stories, pitching them to venues where they might fit, remaining on the right side of libel law, and journalistic ethics and culture. Your new deadlines will seem really short!

Figuring out what kind of help you need is where an organization like the ABSW (and its counterparts in other countries) can help, first by offering opportunities for networking with other science writers, and second by providing training and resources. ABSW maintains, for example, a page that includes some basics and links.

Besides that, if you put "So You Want to Be a Science Writer" into your favorite search engine, you will find many guides from reputable sources such as other science writers' associations and university programs. I particularly like Ivan Oransky's talk for the National Association of Science Writers, because he begins with "my first failures".

Every career path is idiosyncratic enough that no one can copy its specifics. I began my writing career by founding The Skeptic magazine in 1987. Through the skeptics, I met all sorts of people, including one who got me my first writing-related job as a temporary subeditor on a computer magazine. Within weeks, I knew the editors of all the other magazines on its floor, and began writing features for them. In 1991, when I got online and sent my first email, I decided to specialize in the Internet because it was obviously the future of communication. A friend advises that if you find a fast-moving field, there will always be people willing to pay you to explain it to them.

So: I self-published, networked early and often - I joined the ABSW as soon as I was qualified - and luckily landed on a green field at the beginning of a complex and far-reaching social, cultural, political, and technological revolution. Today's early-career science writers will have to work harder to build their own networks than in the early 1990s, when we all met regularly at press conferences and shows - but they have vastly further reach than we had.

I have never had a job, so I can't tell people how to get one. I can, however, observe that if you focus solely on traditional media you will be aiming at a shrinking number of slots. Think more broadly about what science communication is, who does it, and in what context. The kind of journalism that used to be the sole province of newspapers and news magazines now has a home in NGOs, which also hire people who can do solid research, crunch data, and think creatively about new areas for investigation. You should also broaden your idea of "media" and "science communication". Few can be Robin Ince or Richard Wiseman, who combine comedy, magic, and science into sell-out shows, but everyone can find non-traditional contexts in which to communicate science.

At the moment, commercial money is going into podcasts; people are building big followings for niche interests on YouTube and through self-publishing ebooks; and constant tweeters are important communicators, as botanist James Wong proves every day. Edward Hasbrouck, at the National Writers Union, has published solid advice on writing for digital formats: look to build revenue streams. The Internet offers many opportunities, but, as Hasbrouck writes, many are invisible to traditional publishing; as he also writes, traditional employment is just one of writers' many business models.

The big difficulty for trained academics is rethinking how you approach telling a story. Forget the academic structure of: 1) here is what I am going to say; 2) this is what I'm saying; 3) this is my summary of what I just said. Instead, when writing for the general public, put your most important findings first and tell your specific audience why it matters to *them*. Then show why they can have confidence in your claim by explaining your methods and how your findings fit into the rest of the relevant body of scientific knowledge. (Do not use net.wars as your model!)

Over time, you will probably want to branch out into other fields. Do not fear this; you know how to learn a complex field, and if you can learn one you can learn another.

It's inevitable that you will make some mistakes. When it happens, do your best to correct them, learn from how you made them, and avoid making the same one again.

Finally, just a couple of other resources. My favorite book on writing is William Goldman's Adventures in the Screen Trade. He has solid advice for story structure no matter what you're writing. A handout I wrote for a blogging workshop for scientists (PDF) has some (I hope useful) writing tips. Good luck!


Illustrations: Magician James Randi communicates science, Florida 2016.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

September 14, 2018

Hide by default

Last week, defenddigitalme, a group that campaigns for children's data privacy and other digital rights, and Livingstone's group at the London School of Economics assembled a discussion of the Information Commissioner's Office's consultation on age-appropriate design for information society services, which is open for submissions until September 19. The eventual code will be used by the Information Commissioner when she considers regulatory action, may be used as evidence in court, and is intended to guide website design. It must take into account both the child-related provisions of the General Data Protection Regulation and the United Nations Convention on the Rights of the Child.

There are some baseline principles: data minimization, and comprehensible terms and conditions and privacy policies. The latter is a design question: since most adults either can't understand or can't bear to read terms and conditions and privacy policies, what hope is there of making them comprehensible to children? The summer's crop of GDPR notices is not a good sign.

There are other practical questions: when is a child not a child any more? Do age bands make sense when the capabilities of one eight-year-old may be very different from those of another? Capacity might be a better approach - but would we want Instagram making these assessments? Also, while we talk most about the data aggregated by commercial companies, government and schools collect much more, including biometrics.

Most important, what is the threat model? What you implement and how is very different if you're trying to protect children's spaces from ingress by abusers than if you're trying to protect children from commercial data aggregation or content deemed harmful. Lacking a threat model, "freedom", "privacy", and "security" are abstract concepts with no practical meaning.

There is no formal threat model - as the Yes, Minister episode The Challenge (series 3, episode 2) would predict, spelling one out comes too close to setting "failure standards". The lack is particularly dangerous here, because "protecting children" means such different things to different people.

The other significant gap is research. We've commented here before on the stratification of social media demographics: you can practically carbon-date someone by the medium they prefer. This poses a particular problem for academics, in that research from just five years ago is barely relevant. What children know about data collection has markedly changed, and the services du jour have different affordances. Against that, new devices have greater spying capabilities, and, the Norwegian Consumer Council finds (PDF), Silicon Valley pays top-class psychologists to deceive us with dark patterns.

Seeking to fill the research gap are Sonia Livingstone and Mariya Stoilova. In their preliminary work, they are finding that children generally care deeply about their privacy and the data they share, but often have little agency and think primarily in interpersonal terms. The Cambridge Analytica scandal has helped inform them about the corporate aggregation that's taking place, but they may, through familiarity, come to trust people such as their favorite YouTubers and constantly available things like Alexa in ways their adults dislike. The focus on Internet safety has left many thinking that's what privacy means. In real-world safety, younger children are typically more at risk than older ones; online, the situation is often reversed because older children are less supervised, explore further, and take more risks.

The breath of passionate fresh air in all this is Beeban Kidron, an independent - that is, appointed - member of the House of Lords who first came to my attention by saying intelligent and measured things during the post-referendum debate on Brexit. She refuses to accept the idea that oh, well, that's the Internet, there's nothing we can do. However, she *also* genuinely seems to want to find solutions that preserve the Internet's benefits and incorporate the often-overlooked child's right to develop and make mistakes. But she wants services to incorporate the idea of childhood: if all users are equal, then children are treated as adults, a "category error". Why should children have to be resilient against systemic abuse and indifference?

Kidron, who is a filmmaker, began by doing her native form of research: in 2013 she made the full-length documentary InRealLife, which studied a number of teens using the Internet. While the film concludes on a positive note, many of the stories depressingly confirm some parents' worst fears. Even so, it's a fine piece of work because it's clear she was able to gain the trust of even the most alienated of the young people she profiles.

Kidron's 5Rights framework proposes five essential rights children should have: remove, know, safety and support, informed and conscious use, digital literacy. To implement these, she proposes that the industry should reverse its current pattern of defaults, which, as is widely known, 95% of users never change (while 98% never read terms and conditions). Companies know this, and keep resetting the defaults in their favor. Why shouldn't it be "hide by default"?

This approach sparked ideas. A light that tells a child they're being tracked or recorded so they can check who's doing it? Collective redress is essential: what 12-year-old can bring their own court case?

The industry will almost certainly resist. Giving children the transparency and tools with which to protect themselves, resetting the defaults to "hide"...aren't these things adults want, too?


Illustrations: Beeban Kidron (via Wikimedia)

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

December 22, 2012

The personal connection

This week my previously rather blasé view of online education - known in its latest incarnation as MOOCs, for massive open online courses - got a massive shove toward the enthusiastic by the story of 17-year-old Daniel Bergmann's experience in one such course in modern poetry, a ten-week class given by Al Filreis at the University of Pennsylvania and offered by Coursera.

How Daniel, the son of two of my oldest friends, got there is a long story better told on Filreis's blog by Filreis and Daniel himself. The ultra-brief summary is this: Daniel is, as he writes to Filreis (reproduced in that blog entry), "emerging from autism"; he communicates by spelling out words - sentences - paragraphs - on a letterboard app on his iPad. That part of the story was told here in 2010. But the point is this: 36,000 students signed up for the class, and 2,200 finished with a certificate. Daniel was one of them.

The standard knocks on online education have been that:

- students lose most or all of the interaction with each other that really defines the benefit of the college experience;

- the drop-out rate is staggering;

- there are issues surrounding accreditation and credentials;

- it's just not as good as "the real thing";

- there may not be a business model (!);

- but maybe for some people who don't have good access to traditional education and who have defined goals it will work.

To some extent all these things are true, but with caveats. For one thing, what do we mean by "as good"? If we mean that the credential from an online course isn't as likely to land you a high-paying or high-influence job as a diploma from Cornell or Cambridge, that's true of all but some handfuls of the world's institutions of higher learning. If we mean the quality of the connections you make with professors who write the textbooks and students who are tomorrow's stars, that's true of many real-world institutions as well. If we mean the quality of the education - which hardly anyone seems to mean these days - that's less clear. Only a small minority can get into - or afford - the top universities of this world; education you can have has to be better than education you can't.

The drop-out numbers are indeed high, but as The Atlantic points out, we're at the beginning of an experiment. The 160,000 people who signed up for Sebastian Thrun's Udacity course on AI aren't losing their only chance by not completing it; how you spend the four years between 18 and 22 is a zero-sum game, but education in other contexts is not.

In July 1999, when I wrote about the first push driving education online for Scientific American, a reader wrote in accusing me of elitism. She was only a little bit right: I was and am dubious that in the credential-obsessed United States any online education will carry as much clout as the traditional degree from a good school. But the perceived value of that credential lies behind the grotesque inflation of tuition fees. The desire to learn is entirely different, and I cannot argue against anything that will give greater opportunities to exercise that.

At this year's Singularity Summit, Peter Norvig, the director of research at Google, recounted his experience of teaching Udacity's artificial intelligence class with Udacity founder Sebastian Thrun. One of the benefits of MOOCs, he said, is that the scale helps you improve the teaching. They found, for example, a test problem where the good students were not doing well; analysis showed the wording was ambiguous. At the same press conference, Vernor Vinge, the retired mathematics professor and science fiction writer, was impressed: you can do that in traditional education, but it would take 20 years to build up an adequate (though still comparatively tiny) sample size. Norvig also hopes that watching millions of people learn might help inform research in modeling intelligence. There's a certain elegance to this.

Of course, in education you always hope for a meritocracy in which the best minds earn both the best grades and the best attention. But humans being what they are, we know from studies that prejudices apply here as elsewhere. In his recently published book, Oddly Normal, John Schwartz of the New York Times recounts the many difficulties his gay son faced in navigating a childhood in which his personal differences from the norm sentenced him to being viewed as trouble. If on the Internet nobody knows you're a dog, equally, nobody has to know you're a ten-year-old boy wearing pink light-up shoes. Or, as in Daniel's case, a 17-year-old who has struggled with severe autism for nearly all his life and for whom traditional classroom-based education is out of reach physically - but not mentally.

In Filreis's blog entry, Daniel writes, "Your notion that digital learning need not be isolating is very right where I am concerned."

Norvig, from the other side of the teaching effort, similarly said: "We thought it was all about recording flawless videos and then decided it was not that important. We made mistakes and students didn't care. What mattered was the personal connection."

This is the priceless thing that online education has struggled to emulate from successful classrooms. Maybe we're finally getting there.

"Emerging from autism." Such a wonderful and hopeful phrase.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.


September 14, 2012

What did you learn in school today?

One of the more astonishing bits of news this week came from Big Brother Watch: 207 schools across Britain have placed 825 CCTV cameras in toilets or changing rooms. The survey included more than 2,000 schools, so what this is basically saying is that a tenth of the schools surveyed apparently saw nothing wrong in spying on their pupils in these most intimate situations. Overall, the survey found that English, Welsh, and Scottish secondary schools and academies have a total of 106,710 cameras, or an average camera-to-pupil ratio of 1:38. As a computer scientist would say, this is non-trivial.

Some added background: the mid 2000s saw the growth of fingerprinting systems for managing payments in school cafeterias, checking library books in and out, and registering attendance. In 2008, the Leave Them Kids Alone campaign, set up by a concerned parent, estimated that more than 2 million UK kids had been fingerprinted, often without the consent of their parents. The Protection of Freedoms Act 2012 finally requires schools and colleges to get parental consent before collecting children's biometrics. That doesn't stop the practice but at least it establishes that these are serious decisions whose consequences need to be considered.

Meanwhile, Ruth Cousteau, the editor of the Open Rights Group's ORGzine, one of the locations where you can find net.wars every week, sends the story that a Texas school district is requiring pupils to carry RFID-enabled cards at all times while on school grounds. The really interesting element is that the real goal here is primarily and unashamedly financial, imposed on the school by its district: the school gets paid per pupil per day, and if a student isn't in homeroom when the teacher takes attendance, that's a little less money to finance the school in doing its job. The RFID cards enable the school to count the pupils who are present somewhere on the grounds but not in their seats, as if they were laptops in danger of being stolen. In the Wired write-up linked above, the school's principal seems not to see any privacy issues connected to the fact that the school can track kids anywhere on the campus. It's good for safety. And so on.

There is constant debate about what kids should be taught in schools with respect to computers. In these discussions, the focus tends to be on what kids should be directly taught. When I covered Young Rewired State in 2011, one of the things we asked the teams I followed was about the state of computer education in their schools. Their answers: dire. Schools, apparently under the impression that their job was to train the office workforce of the previous decade, were teaching kids how to use word processors, but nothing or very little about how computers work, how to program, or how to build things.

There are signs that this particular problem is beginning to be rectified. Things like the Raspberry Pi and the Arduino, coupled with open source software, are beginning to provide ways to recapture teaching in this area, essential if we are to have a next generation of computer scientists. This is all welcome stuff: teaching kids about computers by supplying them with fundamentally closed devices like iPads and Kindles is the equivalent of teaching kids sports by wheeling in a TV and playing a videotape of last Monday's US Open final between Andy Murray and Novak Djokovic.

But here's the most telling quote from that Wired article: "The kids are used to being monitored."

Yes, they are. And when they are adults, they will also be used to being monitored. I'm not quite paranoid enough to suggest that there's a large conspiracy to "soften up" the next generation (as Terri Dowty used to put it when she was running Action for the Rights of Children), but you can have the effect whether or not you have the intent. All these trends are happening in multiple locations: in the UK, for example, there were experiments in 2007 with school uniforms with embedded RFID chips (that wouldn't work in the US, where school uniforms are a rarity); in the trial, these not only tracked students' movements but pulled up data on academic performance.

These are the lessons we are teaching these kids indirectly. We tell them that putting naked photos on Facebook is a dumb idea and may come back to bite them in the future - but simultaneously we pretend to them that their electronic school records, down to the last, tiniest infraction, pose no similar risk. We tell them that plagiarism is bad and try to teach them about copyright and copying - but real life is meanwhile teaching them that a lot of news is scraped almost directly from press releases and that cheating goes on everywhere from financial markets and sports to scientific research. And although we try to tell them that security is important, we teach them by implication that it's OK to use sensitive personal data such as fingerprints and other biometrics for relatively trivial purposes, even knowing that these data's next outing may be to protect their bank accounts and validate their passports.

We should remember: what we do to them now they will do to us when we are old and feeble, and they're the ones in charge.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

July 14, 2012

The ninth circle of HOPE

Why do technologies fail? And what do we mean by failure?

These questions arise in the first couple of hours of HOPE 9, this year's edition of the hacker conference run biennially by 2600, the hacker quarterly.

Technology failure has a particular meaning in the UK, where large government projects have traditionally wasted large amounts of public money and time. Many failures are more subtle. To take a very simple example: this morning, the elevators failed. It was not a design flaw or loss of functionality: the technology worked perfectly as intended. It was not a usability flaw: what could be simpler than pushing a button? It was not even an accessibility or availability flaw: there were plenty of elevators. What it was, in fact, was a social - or perhaps a contextual - flaw. This group of people who break down complex systems to their finest components to understand them and make them jump through hoops simply failed to notice or read the sign that gave the hours of operation even though it was written in big letters and placed at eye level, just above the call button. This was, after all, well-understood technology that needed no study. And so they stood around in groups, waiting until someone came, pointed out the sign, and chased them away. RTFM, indeed.

But this is what humans do: we make assumptions based on our existing knowledge. To the person with a hammer, everything looks like a nail. To the person with a cup and nowhere to put it, the unfamiliar CD drive looks like a cup holder. To the kids discovering the Hole in the Wall project, a 2000 experiment with installing a connected computer in an Indian slum, the familiar wait-and-wait-some-more hourglass was a drum. Though that last is only a failure if you think it's important that the kids know it's an hourglass; they understood perfectly well the thing that mattered, which is that it was a sign the thing in the wall was doing something and they had to wait.

We also pursue our own interests, sometimes at the expense of what actually matters in a situation. Far Kron, speaking on the last four years of community fabrication, noted that the Global Village Construction project, which is intended to include a full set of the machines necessary to build a civilization, includes nothing to aid more mundane things like fetching fresh water and washing clothes, which are overall a bigger drain on human time. I am tempted to suggest that perhaps the project needs to recruit some more women (who around the world tend to do most of the water fetching and clothes washing), but it may simply be that small, daily chores are things you worry about after you have your village. (Though this is the inverse of how human settlements have historically worked.)

A more intriguing example, cited by Chris Anderson, a former organizer with New York's IndyMedia, in the early panel on Technology to Change Society that inspired this piece, is Twitter. How is one of the most important social networks and messaging platforms in the world a failure?

"If you define success in technical terms you might only *be* successful in technical terms," he said. Twitter, he explained grew out of a number of prior open-source projects the founders were working. "Indymedia saw technology as being in service to goals, but lacks the social goals those projects started with."

Gus Andrews, producer of The Media Show, a YouTube series on digital media literacy, focused on the hidden assumptions creators make. Some believed, for example, that open source software was vital to One Laptop Per Child because being able to fix the software was a crucial benefit for the recipients.

In 2000, Lawrence Lessig argued that "code is law", and that technological design controls how it can be used. Andrews took a different view: "To believe that things are ineluctably coded into technology is to deny free will." Pointing at Everett Rogers' 1995 book, The Diffusion of Innovations, she said, "There are things we know about how technology enacts social change and one of the things we know is that it's not the technology."

Not the technology? You might think that if anyone were going to be technology obsessed it would be the folks at a hacker conference. And certainly the public areas are filled with people fidgeting with radio frequencies, teaching others to solder, and showing off their latest 3D printers and their creations (this year's vogue: printing in brightly colored Lego plastic). But the roots of the hacker movement in general and of 2600 in particular are as much social and educational as they are technological.

Eric Corley, who has styled himself "Emmanuel Goldstein", edits the magazine, and does a weekly radio show for WBAI-FM in New York. At a London hacker conference in 1995, he summed up this ethos for me (and The Independent) by talking about hacking as a form of consumer advocacy. His ideas about keeping the Internet open and free, and about ferreting out information corporations would rather keep hidden, were niche - and to many people scary - then, but mainstream now.

HOPE continues through Sunday.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

June 15, 2012

A license to print money

"It's only a draft," Julian Huppert, the Liberal Democrat MP for Cambridge, said repeatedly yesterday. He was talking about the Draft Communications Data Bill (PDF), which was published on Wednesday. Yesterday, in a room in a Parliamentary turret, Hupper convened a meeting to discuss the draft; in attendance were a variety of Parliamentarians plus experts from civil society groups such as Privacy International, the Open Rights Group, Liberty, and Big Brother Watch. Do we want to be a nation of suspects?

The Home Office characterizes the provisions in the draft bill as vital powers to help catch criminals, save lives, and protect children. Everyone else - the Guardian, ZDNet UK, and dozens more - is calling them the "Snooper's charter".

Huppert's point is important. Like the Defamation Bill before it, publishing a draft means there will be a select committee with 12 members, discussion, comments, evidence taken, a report (by November 30, 2012), and then a rewritten bill. This draft will not be voted on in Parliament. We don't have to convince 650 MPs that the bill is wrong; it's a lot easier to talk to 12 people. This bill, as is, would never pass either House in any case, he suggested.

This is the optimistic view. The cynic might suggest that since it's been clear for something like ten years that the British security services (or perhaps their civil servants) have a recurring wet dream in which their mountain of data is the envy of other governments, they're just trying to see what they can get away with. The comprehensive provisions in the first draft set the bar, softening us up to give away far more in future versions than we otherwise would have. Psychologists call this anchoring, and while probably few outside the security services would regard the wholesale surveillance and monitoring of innocent people as normal, the crucial bit is where you set the initial bar for comparison for future drafts of the legislation. However invasive the next proposals are, it will be easy for us to lose the bearings we came in with and feel that we've successfully beaten back at least some of the intrusiveness.

But Huppert is keeping his eye on the ball: maybe we can not only get the worst stuff out of this bill but make things actually better than they are now; it will amend RIPA. The Independent argues that private companies hold much more data on us overall, but that article misses that this bill intends to grant government access to all of it, at any time, without notice.

The big disappointment in all this, as William Heath said yesterday, is that it marks a return to the old, bad, government IT ways of the past. We were just getting away from giant, failed public IT projects like the late unlamented NHS National Programme for IT and the even more unlamented ID card towards agile, cheap public projects run by smart guys who know what they're doing. And now we're going to spend £1.8 billion of public money over ten years (draft bill, p92) building something no one much wants and that probably won't work? The draft bill claims - on what authority is unclear - that the expenditure will bring in £5 to £6 billion in revenues. From what? Are they planning to sell the data?

Or are they imagining the economic growth implied by the activity that will be necessary to build, install, maintain, and update the black boxes that will be needed by every ISP in order to comply with the law? The security consultant Alec Muffett has laid out the parameters for this SpookBox 5000: certified, tested, tamperproof, made by, say, three trusted British companies. Hundreds of them, legally required, with ongoing maintenance contracts. "A license to print money," he calls them. Nice work if you can get it, of course.

So we're talking - again - about spending huge sums of government money on a project that only a handful of people want and whose objectives could be better achieved by less intrusive means. Give police better training in computer forensics, for example, so they can retrieve the evidence they need from the devices they find when executing a search warrant.

Ultimately, the real enemy is the lack of detail in the draft bill. Using the excuse that the communications environment is changing rapidly and continuously, the notes argue that flexibility is absolutely necessary for Clause 1, the one that grants the government all the actual surveillance power, and so it's been drafted to include pretty much everything, like those contracts that claim copyright in perpetuity in all forms of media that exist now or may hereinafter be invented throughout the universe. This is dangerous because in recent years the use of statutory instruments to bypass Parliamentary debate has skyrocketed. No. Make the defenders of this bill prove every contention; make them show the evidence that makes every extra bit of intrusion necessary.


Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.


May 4, 2012

A matter of degree

What matters about a university degree? Is it the credential, the interaction with peers and professors, the chance to play a little while longer before turning adult, or the stuff you actually learn? Given how much a degree costs, these are pressing questions for the college-bound and their parents.

This is particularly true in the US, where today's tuition fees at Cornell University's College of Arts and Sciences are 14 times what they were when I started there as a freshman in 1971. This week, CNBC highlighted the costs of liberal arts colleges such as California's Pepperdine, where tuition, housing, and meals add up to $54,000 a year. Hah, said friends: it's $56,000 at Haverford, where their son is a sophomore.

These are crazy numbers even if you pursue a "sensible" degree, like engineering, mathematics, or a science. In fact, it's beginning to approach the level after which a top-class private university degree no longer makes the barest economic sense. A Reuters study announced this week found that the difference between a two-year "associate" degree and a four-year BA or BSc over the course of a 30-year career is $500,000 to $600,000 (enough to pay for your child's college degree, maybe). Over a career a college degree adds about $1 million over a high school diploma, depending on the major you pick and the field you go into. An accountant could argue that there's still some room for additional tuition increases - but then, even if that accountant has teenaged kids, his earnings are likely well above average.

Anthony Carnevale, the director of the center that conducted this research, tells Reuters this is a commercialization of education. Yes, of course - but if college costs as much per child as the family home, inevitably commercial considerations will apply even if you don't accept PayPal founder Peter Thiel's argument about a higher education bubble.

All this provides context for this week's announcement that Harvard and MIT are funding a $60 million initiative, edX, to provide online courses for all and sundry. Given that Britain's relatively venerable Open University was set up in 1969 to bring university-level education to a wide range of non-traditional students, remote learning is nothing new. Still, edX is one of a number of new online education initiatives.

Experimentation with using the Internet as a delivery medium for higher education began in the mid 1990s (TXT). The Open University augmented the ability for students to interact with each other by adding online conferencing to its media mix, and many other institutions began offering online degrees. Almost the only dissenting voice at the time was that of David F. Noble, a professor at Canada's York University. In a series of essays written from 1997 to 2001, Digital Diploma Mills, he criticized the commercialization of higher education and the move toward online instruction. Coursework that formerly belonged to professors and teachers, he argued, would now become a product sold by the university itself; copyright ownership would be crucial. By 2001, he was writing about the failure of many of the online ventures to return the additional revenues their institutions had hoped for.

When I wrote about these various concerns in 1999 for Scientific American (TXT), reader email accused me of being an entitled elitist and gleefully threatened me with a wave of highly motivated, previously locked-out students who would sweep the world. The main thing I hoped I highlighted, however, was the comparatively high drop-out rate of online students. This is a pattern that has continued through to today with little change. This seems to me a significant problem for the industry - but explains why MIT and Harvard, like some other recent newcomers, are talking about charging for exams or completion certificates rather than the courses themselves. Education on the shareware model: certainly fairer for students hoping for career advancement and great for people who just want to learn from the best brands. (Not, thankfully, the future envisaged by one of the interviewees in those articles, who feared online education would be dominated by Microsoft and Disney.)

In an economic context, the US's endemic credentialism means it's the certificate that has economic value, not necessarily the learning itself. But across the wider world, it's easy to imagine local authorities taking advantage of the courses that are available and setting their own exams and certification systems. For Harvard and MIT, the courses may also provide a way of spotting far-flung talent to scoop up and educate more traditionally.

Of course, economics are not the only reason to go to college: it may make other kinds of sense. Today's college-educated parents often want their kids to go to college for more complex reasons to do with quality of life, adaptability to a changing future, and the kind of person they would like their kids to be. In my own case, the education I had gave me choices and the confidence that I could learn anything if I needed to. That sort of motivation, sadly, is being priced out of the middle class. Soon it will be open only to the very talented and poor who qualify for scholarships, and the very wealthy who can afford the luxury. No wonder the market sees an opportunity.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.