
December 22, 2012

The personal connection

This week my previously rather blasé view of online education - known in its latest incarnation as MOOCs, for massive open online courses - got a massive shove toward the enthusiastic by the story of 17-year-old Daniel Bergmann's experience in one such course in modern poetry, a ten-week class given by Al Filreis at the University of Pennsylvania and offered by Coursera.

How Daniel, the son of two of my oldest friends, got there is a long story better told on Filreis's blog by Filreis and Daniel himself. The ultra-brief summary is this: Daniel is, as he writes to Filreis (reproduced in that blog entry), "emerging from autism"; he communicates by spelling out words - sentences - paragraphs - on a letterboard app on his iPad. That part of the story was told here in 2010. But the point is this: 36,000 students signed up for the class, and 2,200 finished with a certificate. Daniel was one of them.

The standard knocks on online education have been that:

- students lose most or all of the interaction with each other that really defines the benefit of the college experience;

- the drop-out rate is staggering;

- there are issues surrounding accreditation and credentials;

- it's just not as good as "the real thing";

- there may not be a business model (!);

- but maybe for some people who don't have good access to traditional education and who have defined goals it will work.

To some extent all these things are true, but with caveats. For one thing, what do we mean by "as good"? If we mean that the credential from an online course isn't as likely to land you a high-paying or high-influence job as a diploma from Cornell or Cambridge, that's true of all but a handful of the world's institutions of higher learning. If we mean the quality of the connections you make with professors who write the textbooks and students who are tomorrow's stars, that's true of many real-world institutions as well. If we mean the quality of the education - which hardly anyone seems to mean these days - that's less clear. Only a small minority can get into - or afford - the top universities of this world; education you can have has to be better than education you can't.

The drop-out numbers are indeed high, but as The Atlantic points out, we're at the beginning of an experiment. The 160,000 people who signed up for Sebastian Thrun's Udacity course on AI aren't losing their only chance by not completing it; how you spend the four years between 18 and 22 is a zero-sum game, but education in other contexts is not.

In July 1999, when I wrote about the first push driving education online for Scientific American, a reader wrote in accusing me of elitism. She was only a little bit right: I was and am dubious that in the credential-obsessed United States any online education will carry as much clout as the traditional degree from a good school. But the perceived value of that credential lies behind the grotesque inflation of tuition fees. The desire to learn is entirely different, and I cannot argue against anything that will give greater opportunities to exercise that.

At this year's Singularity Summit, Peter Norvig, the director of research at Google, recounted his experience of teaching Udacity's artificial intelligence class with Udacity founder Sebastian Thrun. One of the benefits of MOOCs, he said, is that the scale helps you improve the teaching. They found, for example, a test problem where the good students were not doing well; analysis showed the wording was ambiguous. At the same press conference, Vernor Vinge, the retired mathematics professor and science fiction writer, was impressed: you can do that in traditional education, but it would take 20 years to build up an adequate (though still comparatively tiny) sample size. Norvig also hopes that watching millions of people learn might help inform research in modeling intelligence. There's a certain elegance to this.
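
To make Norvig's point concrete, here is a minimal sketch of that kind of item analysis - hypothetical data and thresholds, not Udacity's actual pipeline: flag the questions that the otherwise strongest students get wrong unusually often, a hint that the wording, rather than the students, is the problem.

    # Minimal sketch with hypothetical data -- not Udacity's analytics code.
    # Flag quiz questions that the strongest students get wrong unusually often,
    # a hint that the wording may be ambiguous.
    from collections import defaultdict

    def flag_ambiguous_questions(responses, min_strong_rate=0.6):
        """responses: iterable of (student_id, question_id, answered_correctly)."""
        by_student = defaultdict(list)
        by_question = defaultdict(list)
        for student, question, correct in responses:
            by_student[student].append(correct)
            by_question[question].append((student, correct))

        # "Strong" students: overall score in roughly the top quartile.
        scores = {s: sum(v) / len(v) for s, v in by_student.items()}
        cutoff = sorted(scores.values())[int(len(scores) * 0.75)]
        strong = {s for s, sc in scores.items() if sc >= cutoff}

        flagged = []
        for question, answers in by_question.items():
            strong_correct = [c for s, c in answers if s in strong]
            if strong_correct:
                rate = sum(strong_correct) / len(strong_correct)
                if rate < min_strong_rate:  # strong students doing badly here
                    flagged.append((question, rate))
        return sorted(flagged, key=lambda pair: pair[1])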

Of course in education you always hope for a meritocracy in which the best minds earn both the best grades and the best attention. But humans being what they are, we know from studies that prejudices apply here as elsewhere. In his recently published book, Oddly Normal, John Schwartz of the New York Times recounts the many difficulties his gay son faced in navigating a childhood in which his personal differences from the norm sentenced him to being viewed as trouble. If on the Internet nobody knows you're a dog, equally, nobody has to know you're a ten-year-old boy wearing pink light-up shoes. Or, as in Daniel's case, a 17-year-old who has struggled with severe autism for nearly all his life and for whom traditional classroom-based education is out of reach physically - but not mentally.

In Filreis's blog entry, Daniel writes, "Your notion that digital learning need not be isolating is very right where I am concerned."

Norvig, from the other side of the teaching effort, said much the same: "We thought it was all about recording flawless videos and then decided it was not that important. We made mistakes and students didn't care. What mattered was the personal connection."

This is the priceless thing that online education has struggled to emulate from successful classrooms. Maybe we're finally getting there.

"Emerging from autism." Such a wonderful and hopeful phrase.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.


September 14, 2012

What did you learn in school today?

One of the more astonishing bits of news this week came from Big Brother Watch: 207 schools across Britain have placed 825 CCTV cameras in toilets or changing rooms. The survey included more than 2,000 schools, so what this is basically saying is that a tenth of the schools surveyed apparently saw nothing wrong in spying on their pupils in these most intimate situations. Overall, the survey found that English, Welsh, and Scottish secondary schools and academies have a total of 106,710 cameras, an average camera-to-pupil ratio of 1:38. As a computer scientist would say, this is non-trivial.

Some added background: the mid 2000s saw the growth of fingerprinting systems for managing payments in school cafeterias, checking library books in and out, and registering attendance. In 2008, the Leave Them Kids Alone campaign, set up by a concerned parent, estimated that more than 2 million UK kids had been fingerprinted, often without the consent of their parents. The Protection of Freedoms Act 2012 finally requires schools and colleges to get parental consent before collecting children's biometrics. That doesn't stop the practice but at least it establishes that these are serious decisions whose consequences need to be considered.

Meanwhile, Ruth Cousteau, the editor of the Open Rights Group's ORGzine, one of the locations where you can find net.wars every week, sends the story that a Texas school district is requiring pupils to carry RFID-enabled cards at all times while on school grounds. The interesting element is that the goal here is primarily and unashamedly financial, imposed on the school by its district: the school gets paid per pupil per day, and if a student isn't in homeroom when the teacher takes attendance, that's a little less money to finance the school in doing its job. The RFID cards enable the school to count the pupils who are present somewhere on the grounds but not in their seats, as if they were laptops in danger of being stolen. In the Wired write-up linked above, the school's principal seems not to see any privacy issues connected to the fact that the school can track kids anywhere on the campus. It's good for safety. And so on.

There is constant debate about what kids should be taught in schools with respect to computers, and the focus tends to be on what they should be directly taught. When I covered Young Rewired State in 2011, one of the things we asked the teams I followed was about the state of computer education in their schools. Their verdict: dire. Schools, apparently under the impression that their job was to train the office workforce of the previous decade, were teaching kids how to use word processors, but little or nothing about how computers work, how to program, or how to build things.

There are signs that this particular problem is beginning to be rectified. Things like the Raspberry Pi and the Arduino, coupled with open source software, are beginning to provide ways to recapture teaching in this area, which is essential if we are to have a next generation of computer scientists. This is all welcome stuff: teaching kids about computers by supplying them with fundamentally closed devices like iPads and Kindles is the equivalent of teaching kids sports by wheeling in a TV and playing a videotape of last Monday's US Open final between Andy Murray and Novak Djokovic.
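
For a sense of scale, here is roughly what a first lesson on one of those open devices looks like - a minimal sketch using the standard RPi.GPIO library on a Raspberry Pi; the pin number and wiring are assumptions for illustration, not a curriculum:

    # Minimal sketch: blink an LED wired (through a resistor) to GPIO pin 18
    # on a Raspberry Pi. Pin choice and wiring are assumptions for illustration.
    import time
    import RPi.GPIO as GPIO

    GPIO.setmode(GPIO.BCM)        # use Broadcom pin numbering
    GPIO.setup(18, GPIO.OUT)      # treat pin 18 as an output

    try:
        while True:
            GPIO.output(18, GPIO.HIGH)   # LED on
            time.sleep(0.5)
            GPIO.output(18, GPIO.LOW)    # LED off
            time.sleep(0.5)
    finally:
        GPIO.cleanup()            # release the pin when interrupted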

But here's the most telling quote from that Wired article: "The kids are used to being monitored."

Yes, they are. And when they are adults, they will also be used to being monitored. I'm not quite paranoid enough to suggest that there's a large conspiracy to "soften up" the next generation (as Terri Dowty used to put it when she was running Action for the Rights of Children), but you can have the effect whether or not you have the intent. All these trends are happening in multiple locations: in the UK, for example, there were experiments in 2007 with school uniforms that had embedded RFID chips (that wouldn't work in the US, where school uniforms are a rarity); in the trial, these not only tracked students' movements but pulled up data on academic performance.

These are the lessons we are teaching these kids indirectly. We tell them that putting naked photos on Facebook is a dumb idea and may come back to bite them in the future - but simultaneously we pretend to them that their electronic school records, down to the last, tiniest infraction, pose no similar risk. We tell them that plagiarism is bad and try to teach them about copyright and copying - but real life is meanwhile teaching them that a lot of news is scraped almost directly from press releases and that cheating goes on everywhere from financial markets and sports to scientific research. And although we try to tell them that security is important, we teach them by implication that it's OK to use sensitive personal data such as fingerprints and other biometrics for relatively trivial purposes, even knowing that those same biometrics may next be used to protect their bank accounts and validate their passports.

We should remember: what we do to them now they will do to us when we are old and feeble, and they're the ones in charge.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

July 14, 2012

The ninth circle of HOPE

Why do technologies fail? And what do we mean by failure?

These questions arise in the first couple of hours of HOPE 9, this year's edition of the hacker conference run every two years by 2600: The Hacker Quarterly.

Technology failure has a particular meaning in the UK, where large government projects have traditionally wasted large amounts of public money and time. Many failures are more subtle. To take a very simple example: this morning, the elevators failed. It was not a design flaw or loss of functionality: the technology worked perfectly as intended. It was not a usability flaw: what could be simpler than pushing a button? It was not even an accessibility or availability flaw: there were plenty of elevators. What it was, in fact, was a social - or perhaps a contextual - flaw. This group of people who break down complex systems to their finest components to understand them and make them jump through hoops simply failed to notice or read the sign that gave the hours of operation even though it was written in big letters and placed at eye level, just above the call button. This was, after all, well-understood technology that needed no study. And so they stood around in groups, waiting until someone came, pointed out the sign, and chased them away. RTFM, indeed.

But this is what humans do: we make assumptions based on our existing knowledge. To the person with a hammer, everything looks like a nail. To the person with a cup and nowhere to put it, the unfamiliar CD drive looks like a cup holder. To the kids discovering the Hole in the Wall project, a 2000 experiment with installing a connected computer in an Indian slum, the familiar wait-and-wait-some-more hourglass was a drum. Though that last is only a failure if you think it's important that the kids know it's an hourglass; they understood perfectly well the thing that mattered: it was a sign that the thing in the wall was doing something and they had to wait.

We also pursue our own interests, sometimes at the expense of what actually matters in a situation. Far Kron, speaking on the last four years of community fabrication, noted that the Global Village Construction project, which is intended to include a full set of the machines necessary to build a civilization, includes nothing to aid more mundane things like fetching fresh water and washing clothes, which are overall a bigger drain on human time. I am tempted to suggest that perhaps the project needs to recruit some more women (who around the world tend to do most of the water fetching and clothes washing), but it may simply be that small, daily chores are things you worry about after you have your village. (Though this is the inverse of how human settlements have historically worked.)

A more intriguing example, cited by Chris Anderson, a former organizer with New York's IndyMedia, in the early panel on Technology to Change Society that inspired this piece, is Twitter. How is one of the most important social networks and messaging platforms in the world a failure?

"If you define success in technical terms you might only *be* successful in technical terms," he said. Twitter, he explained grew out of a number of prior open-source projects the founders were working. "Indymedia saw technology as being in service to goals, but lacks the social goals those projects started with."

Gus Andrews, producer of The Media Show, a YouTube series on digital media literacy, focused on the hidden assumptions creators make. Those behind One Laptop Per Child, for example, believed that open source software was vital to the project: being able to fix the software, they assumed, was a crucial benefit for the recipients.

In 2000, Lawrence Lessig argued that "code is law", and that technological design controls how a technology can be used. Andrews took a different view: "To believe that things are ineluctably coded into technology is to deny free will." Pointing at Everett Rogers' 1995 book, The Diffusion of Innovations, she said, "There are things we know about how technology enacts social change and one of the things we know is that it's not the technology."

Not the technology? You might think that if anyone were going to be technology-obsessed it would be the folks at a hacker conference. And certainly the public areas are filled with people fiddling with radio frequencies, teaching others to solder, and showing off their latest 3D printers and their creations (this year's vogue: printing in brightly colored Lego plastic). But the roots of the hacker movement in general and of 2600 in particular are as much social and educational as they are technological.

Eric Corley, who has styled himself "Emmanuel Goldstein", edits the magazine and does a weekly radio show for WBAI-FM in New York. At a London hacker conference in 1995, he summed up this ethos for me (and The Independent) by talking about hacking as a form of consumer advocacy. His ideas about keeping the Internet open and free, and about ferreting out information corporations would rather keep hidden, were niche - and to many people scary - then, but mainstream now.

HOPE continues through Sunday.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.

June 15, 2012

A license to print money

"It's only a draft," Julian Huppert, the Liberal Democrat MP for Cambridge, said repeatedly yesterday. He was talking about the Draft Communications Data Bill (PDF), which was published on Wednesday. Yesterday, in a room in a Parliamentary turret, Hupper convened a meeting to discuss the draft; in attendance were a variety of Parliamentarians plus experts from civil society groups such as Privacy International, the Open Rights Group, Liberty, and Big Brother Watch. Do we want to be a nation of suspects?

The Home Office characterizes the provisions in the draft bill as vital powers to help catch criminals, save lives, and protect children. Everyone else - the Guardian, ZDNet UK, and dozens more - is calling them the "Snooper's charter".

Huppert's point is important. Like the Defamation Bill before it, publishing a draft means there will be a select committee with 12 members, discussion, comments, evidence taken, a report (by November 30, 2012), and then a rewritten bill. This draft will not be voted on in Parliament. We don't have to convince 650 MPs that the bill is wrong; it's a lot easier to talk to 12 people. This bill, as is, would never pass either House in any case, he suggested.

This is the optimistic view. The cynic might suggest that since it's been clear for something like ten years that the British security services (or perhaps their civil servants) have a recurring wet dream in which their mountain of data is the envy of other governments, they're just trying to see what they can get away with. The comprehensive provisions in the first draft set the bar, softening us up to give away far more than we would have in future versions. Psychologists call this anchoring, and while probably few outside the security services would regard the wholesale surveillance and monitoring of innocent people as normal, the crucial bit is where you set the initial bar for comparison for future drafts of the legislation. However invasive the next proposals are, it will be easy for us to lose the bearings we came in with and feel that we've successfully beaten back at least some of the intrusiveness.

But Huppert is keeping his eye on the ball: maybe we can not only get the worst stuff out of this bill but make things actually better than they are now, since it will amend RIPA. The Independent argues that private companies hold much more data on us overall, but that article misses the point that this bill intends to grant government access to all of it, at any time, without notice.

The big disappointment in all this, as William Heath said yesterday, is that it marks a return to the old, bad, government IT ways of the past. We were just getting away from giant, failed public IT projects like the late, unlamented NHS National Programme for IT and the even more unlamented ID card, towards agile, cheap public projects run by smart guys who know what they're doing. And now we're going to spend £1.8 billion of public money over ten years (draft bill, p92) building something no one much wants and that probably won't work? The draft bill claims - on what authority is unclear - that the expenditure will bring in £5 to £6 billion in revenues. From what? Are they planning to sell the data?

Or are they imagining the economic growth implied by the activity that will be necessary to build, install, maintain, and update the black boxes every ISP will need in order to comply with the law? The security consultant Alec Muffett has laid out the parameters for this SpookBox 5000: certified, tested, tamperproof, made by, say, three trusted British companies. Hundreds of them, legally required, with ongoing maintenance contracts. "A license to print money," he calls them. Nice work if you can get it, of course.

So we're talking - again - about spending huge sums of government money on a project that only a handful of people want and whose objectives could be better achieved by less intrusive means. Give police better training in computer forensics, for example, so they can retrieve the evidence they need from the devices they find when executing a search warrant.

Ultimately, the real enemy is the lack of detail in the draft bill. Using the excuse that the communications environment is changing rapidly and continuously, the notes argue that flexibility is absolutely necessary for Clause 1, the one that grants the government all the actual surveillance power, and so it's been drafted to include pretty much everything, like those contracts that claim copyright in perpetuity in all forms of media that exist now or may hereinafter be invented throughout the universe. This is dangerous because in recent years the use of statutory instruments to bypass Parliamentary debate has skyrocketed. No. Make the defenders of this bill prove every contention; make them show the evidence that makes every extra bit of intrusion necessary.


Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.


May 4, 2012

A matter of degree

What matters about a university degree? Is it the credential, the interaction with peers and professors, the chance to play a little while longer before turning adult, or the stuff you actually learn? Given how much a degree costs, these are pressing questions for the college-bound and their parents.

This is particularly true in the US, where today's tuition fees at Cornell University's College of Arts and Sciences are 14 times what they were when I started there as a freshman in 1971. This week, CNBC highlighted the costs of liberal arts colleges such as California's Pepperdine, where tuition, housing, and meals add up to $54,000 a year. Hah, said friends: it's $56,000 at Haverford, where their son is a sophomore.

These are crazy numbers even if you pursue a "sensible" degree, like engineering, mathematics, or a science. In fact, it's beginning to approach the level beyond which a top-class private university degree no longer makes the barest economic sense. A Reuters study announced this week found that the difference between a two-year "associate" degree and a four-year BA or BSc over the course of a 30-year career is $500,000 to $600,000 (enough to pay for your child's college degree, maybe). Over a career, a college degree adds about $1 million compared to a high school diploma, depending on the major you pick and the field you go into. An accountant could argue that there's still some room for additional tuition increases - but then, even if that accountant has teenaged kids, his earnings are likely well above average.
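
The accountant's back-of-the-envelope version, using the figures quoted above and ignoring discounting, scholarships, and loan interest - purely illustrative:

    # Back-of-the-envelope arithmetic using the figures quoted above; no
    # discounting, financial aid, or loan interest -- purely illustrative.
    annual_cost = 56_000          # tuition, housing, meals per year (the Haverford figure)
    years = 4
    total_cost = annual_cost * years          # $224,000 for the degree

    earnings_premium = 1_000_000  # rough lifetime premium of a BA over a high school diploma

    print(f"Four-year cost:  ${total_cost:,}")
    print(f"Career premium:  ${earnings_premium:,}")
    print(f"Nominal surplus: ${earnings_premium - total_cost:,}")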

Anthony Carnevale, the director of the center that conducted this research, tells Reuters this is a commercialization of education. Yes, of course - but if college costs as much per child as the family home, commercial considerations will inevitably apply, even if you don't accept PayPal founder Peter Thiel's argument about a higher education bubble.

All this provides context for this week's announcement that Harvard and MIT are funding a $60 million initiative, edX, to provide online courses for all and sundry. Given that Britain's relatively venerable Open University was set up in 1969 to bring university-level education to a wide range of non-traditional students, remote learning is nothing new. Still, edX is one of a number of new online education initiatives.

Experimentation with using the Internet as a delivery medium for higher education began in the mid-1990s (TXT). The Open University augmented the ability for students to interact with each other by adding online conferencing to its media mix, and many other institutions began offering online degrees. Almost the only dissenting voice at the time was that of David F. Noble, a professor at Canada's York University. In a series of essays written from 1997 to 2001, Digital Diploma Mills, he criticized the commercialization of higher education and the move toward online instruction. Coursework that formerly belonged to professors and teachers, he argued, would now become a product sold by the university itself; copyright ownership would be crucial. By 2001, he was writing about the failure of many of the online ventures to return the additional revenues their institutions had hoped for.

When I wrote about these various concerns in 1999 for Scientific American (TXT), reader email accused me of being an entitled elitist and gleefully threatened me with a wave of highly motivated, previously locked-out students who would sweep the world. The main thing I hoped I highlighted, however, was the comparatively high drop-out rate of online students. This is a pattern that has continued, with little change, from the mid-2000s to today. This seems to me a significant problem for the industry - but it explains why MIT and Harvard, like some other recent newcomers, are talking about charging for exams or completion certificates rather than the courses themselves. Education on the shareware model: certainly fairer for students hoping for career advancement and great for people who just want to learn from the best brands. (Not, thankfully, the future envisaged by one of the interviewees in those articles, who feared online education would be dominated by Microsoft and Disney.)

In an economic context, the US's endemic credentialism means it's the certificate that has economic value, not necessarily the learning itself. But across the wider world, it's easy to imagine local authorities taking advantage of the courses that are available and setting their own exams and certification systems. For Harvard and MIT, the courses may also provide a way of spotting far-flung talent to scoop up and educate more traditionally.

Of course, economics is not the only reason to go to college: it may make other kinds of sense. Today's college-educated parents often want their kids to go to college for more complex reasons to do with quality of life, adaptability to a changing future, and the kind of person they would like their kids to be. In my own case, the education I had gave me choices and the confidence that I could learn anything if I needed to. That sort of motivation, sadly, is being priced out of the middle class. Soon it will be open only to the very talented but poor, who qualify for scholarships, and the very wealthy, who can afford the luxury. No wonder the market sees an opportunity.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series.