The 0.06 percent
In line with a growing number of news sites - Popular Science, The Verge, the Daily Dot, Mic - National Public Radio has decided to get rid of its comments. The reasoning, ombudsman Elizabeth Jensen writes, rests on numbers familiar in outline, if not in detail, to anyone who has followed how virtual communities work. NPR's 33 million unique users in July produced 491,000 comments, written by only 19,400 commenters - 0.06% of users. Just 4,300 users accounted for 67% of comments, and over May, June, and July, 47% of comments came from 2,600 users. NPR concluded that the comments system is serving a very small percentage of its users - and since it costs twice what the organization budgeted, and NPR finds demographically broader engagement on Twitter and Facebook, out it goes.
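Those proportions are easy to check; a quick back-of-the-envelope calculation with the figures above confirms that the commenting population does indeed round to 0.06% of users:

```python
# Back-of-the-envelope check of the NPR figures cited above.
unique_users = 33_000_000   # unique users in July
commenters   = 19_400       # users who wrote at least one comment
comments     = 491_000      # total comments in July

share = commenters / unique_users * 100
print(f"{share:.2f}% of users commented")                        # ~0.06%
print(f"{comments / commenters:.1f} comments per commenter")     # ~25.3
```

Put another way: the average commenter posted about 25 times in a month, while the overwhelming majority of readers posted nothing at all.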
I do sympathize; NPR, a fine organization, struggles constantly for funding (like every journalism organization now). And really, aside from purely local papers, which actually serve a definable community, why should reading news mean joining a community? News site comment boards are the product of two dated trends: the circa-2000 belief that community was essential to online business success, and "citizen journalism".
NPR's comment board troubles are same-old: in-fighting, point-scoring across unrelated topics, racism, sexism, general nastiness. Online was ever thus, and the only known solution is heavy, persistent, consistent, human moderation. It's expensive and labor-intensive, as former Guardian moderator Marc Burrows describes. As we found on CompuServe's UK Journalism Forum in 1993, real names aren't a fix, though what I call "Benidorm syndrome" - the belief that online behavior has no real-life consequences - definitely makes things worse. There have been some interesting attempts - most recently, a paper published by Yahoo! (PDF) - to automate identifying hate speech and abusive language, but that's just one aspect of online abuse, as the outline of Twitter's new quality filter shows.
Much public debate on this topic focuses on ad feminam attacks, but Burrows helpfully dissects the many forms of and motives for trolling, from "agenda trolls" up to paid national armies. Burrows also supplies the Guardian's numbers: 100 million monthly users, 73,000 comments a day.
I think that, besides the often-discussed distancing effect of being alone behind a screen and the speed with which postings can pile on, the volatility of online discourse has a cause similar to the reason a driver's conversation with a passenger is less distracting than one conducted over a mobile phone. In a physical space, everyone present helps moderate the trolls, and an authoritative presence can enlist others to the cause. Plus, the initial effort required to attend tends to filter out casually obnoxious participants.
Online, however, travel and effort collapse, and all individuals are reduced to piles of letters. The room has no front, and no one has a microphone or a stage. Community standards and social norms require enormous discipline to maintain. Historical experience shows that smaller communities (such as the WELL) with communication backchannels have the edge in countering these effects. But technological support really helps: when, in 1994, the newsgroup alt.religion.scientology became the first serious battleground between regular posters and a well-resourced attacking external organization, users devised technical barriers. Today's poorly threaded web boards offer no such user-implementable aids. Better-designed technology could enable users to assist moderators, lowering the burden and cost. I'd guess most boards are designed less to foster community than to enable advertising and profiling, and that sites are sold on tools for moderators, not users. By contrast, on Facebook and Twitter users are their own curators.
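The Usenet killfile was the classic example of such a user-implementable aid: each reader's newsreader silently suppressed posts matching patterns of that reader's own choosing, with no moderator involved. A minimal sketch of the idea - the addresses and patterns here are hypothetical, and real newsreaders stored rules in their own formats:

```python
import re

# A reader's personal killfile: suppress posts whose author or subject
# matches any of these patterns. Each user maintains their own list.
KILLFILE = [
    {"field": "author", "pattern": r"spammer@example\.com"},
    {"field": "subject", "pattern": r"(?i)make money fast"},
]

def visible(post: dict) -> bool:
    """Return False if any killfile rule matches the post."""
    return not any(
        re.search(rule["pattern"], post.get(rule["field"], ""))
        for rule in KILLFILE
    )

posts = [
    {"author": "regular@example.org", "subject": "Moderation policy"},
    {"author": "spammer@example.com", "subject": "MAKE MONEY FAST"},
]
print([p["subject"] for p in posts if visible(p)])  # ['Moderation policy']
```

The design point is that filtering happens per reader, at the client: no one's speech is deleted centrally, but no one is forced to see it either - exactly the kind of affordance today's web comment boards rarely expose to users.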
Jensen acknowledges the sadness that when NPR removes the Disqus comment system, the archived comments will go, too. Though Disqus will presumably retain the data, former posters will not be able to visit the community they had formed. Short-term, I see the point. Long-term, I think sites will need to find their own alternatives to handing off engagement to businesses with their own agendas. We could see the rebirth of more general online forums - but these, too, will have to be moderated.
The Atlantic argues that comments actively damage both readers' perception of a site's articles and their engagement with them. My question: what about those readers? Research on virtual communities has consistently found that at least 90% of users lurk, 10% post, and 1% do the bulk of the posting. As online communities have scaled up, the percentages have skewed much further towards lurkers, as NPR's numbers indicate. In the comments on Jensen's piece, you find a number of people who now feel bereft of discourse they'd come to appreciate. Don't lurkers and Facebook refuseniks have rights, too?
Anyone who's tried to comment here in the last few years may see the irony of this complaint. Here's what happened: it took me a while to figure out that the publishing template had been broken by an upgrade to the site's underlying software; all I knew was that the site had begun eating all non-spam comments. The template needs to be replaced... but in the meantime spambot activity made posting new columns difficult and time-consuming, and the good people at Pair.com suggested disabling the comment system to frustrate the bots into going away. I do apologize. Fixing any of it is non-trivial (to me).
Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.