Wednesday, March 14, 2012

"Jane, you ignorant slut."

"Dan, you pompous ass."
Remember the '70s, when phrases like that in an SNL sketch were so outrageous, so over the top, they actually were kinda funny? Today, of course, those old Point/Counterpoint sketches seem more prescient than outrageous, as "serious" news commentators routinely say things like this with the same calm, deliberate, faux-intellectual deadpan Dan Aykroyd and Jane Curtin used back then. And it's not just talk radio where this occurs.

Most major online news outlets and blogs now allow readers to comment on the op-ed pieces they publish. If you've ever read any of those commentary streams, you know they can quickly degrade into name calling and personal attacks, especially on sites that let commenters post under anonymous pseudonyms. The grandiose idea that thousands of readers could contribute something constructive and positive to a thoughtful online conversation has been, to date, one of the Internet's most miserable failures.

Nick Denton, the founder of Gawker Media and owner of a bunch of popular blogs, makes no bones about it. He says, "It's a promise that has so not happened that people don't even have that ambition anymore. The idea of capturing the intelligence of the readership — that's a joke."

Why has this laudable Internet ambition been such an abject failure, and what, if anything, can be done to fix it? I have spent the past few months reading thousands of comments attached to articles on a wide variety of topics. To be sure, the effect is often depressing and sometimes frightening. But I do not think the problem is a lack of intelligent people out there. On the contrary, I see many well-reasoned, well-written posts, some making points with which I agree and some with which I disagree. People still remember how to disagree with one another without ripping each other to shreds with ad hominem attacks. The trouble is that the constructive comments are often submerged in a cesspool of vile language, deliberate disinformation, intellectual immaturity, and troll bait.

I have developed a few tricks that help me leaf through commentary streams quickly, identifying the comments that may have value and skipping the ones that probably should not have been written in the first place. One such trick is based on a quote often attributed to Eleanor Roosevelt:
"Great minds discuss ideas, average minds discuss events, small minds discuss people."
I've found you can quickly scan a post to determine whether the author is going to talk about ideas, events, or people. Posts in the first category are usually interesting, those in the second sometimes interesting, and those in the last almost never. Here are some examples from the commentary stream on the CNN article I linked above.
Ideas
@veggiedude - Siri (and IBM's Watson) has shown how well artificial intelligence can work. The solution is to employ a Siri type moderator to decide whether to post or not post the comments from the users. This way, a machine is in control, and there is no one to get mad at - unless a person 'enjoys' getting mad at a machine.
@maestro406 - Allow a "squelch" option. Everyone has the right to post, but that doesn't mean I have to read every comment from every poster. I should be able to block individual posters on my computer.
Events/Examples
@swohio - And the funny thing is, so many news sites have switched over to making people log in using Facebook in order to leave comments, stating it will help facilitate more mature, reasonable, and engaging discussion/comments. Sad to say, it hasn't at the sites I used to comment on where they've switched over to that system.
@GloriaBTM - The most enjoyable board I've posted on were IMDB around 2005-07, both because of the format, variety of topics, and the way it was moderated. (I'm not sure what it's like now...) We were able to have real conversations about current events, science, religion (and yes, movies), without have to wade through so much muck. Most importantly, posters whose posts were repeatedly deleted for violating T&C had their accounts deleted, fairly quickly. Yes, people could create new accounts, but it did slow the nastiness down. People could also IGNORE certain posters. It was brilliant. You just click ignore and you don't have to see them ever again. I found the format easier to navigate than any other board I've been on. Each thread could collapse (truly collapse, unlike here, where it still takes up space, though the message is blanked). Then you could quickly scroll through and find your conversation and open that thread.
People
@Tzckrl - STOOPID! R u kidding? Palin and Santarum 2012! Anyone who don't think so is a twerp!
@TommiGI - South by Southwest is a joke and Nick Denton is the punchline.
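As an aside, even a very crude automated version of this triage can help. Here is a toy Python sketch of the idea; the keyword lists, the scoring, and the sample comments are all invented for illustration, and nothing this blunt is what any site actually runs, but it is enough to route @Tzckrl's post to the skip pile.

```python
# Toy triage in the spirit of the "ideas / events / people" heuristic.
# The keyword lists and scoring are invented for illustration only.

PEOPLE_WORDS = {"idiot", "stoopid", "twerp", "moron", "loser", "ass"}
IDEA_WORDS = {"solution", "propose", "approach", "option", "allow", "system"}

def triage(comment: str) -> str:
    """Return a rough guess: 'ideas', 'people', or 'unknown'."""
    words = [w.strip(".,!?") for w in comment.lower().split()]
    people_hits = sum(w in PEOPLE_WORDS for w in words)
    idea_hits = sum(w in IDEA_WORDS for w in words)
    if people_hits > idea_hits:
        return "people"      # probably skip
    if idea_hits > 0:
        return "ideas"       # probably worth reading
    return "unknown"

if __name__ == "__main__":
    print(triage("The solution is to allow a squelch option."))              # ideas
    print(triage("R u kidding? Anyone who don't think so is a twerp!"))      # people
```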
Another, sometimes more reliable, trick is to jot down the aliases of commenters who post useful or interesting comments and then scan for other comments by those same aliases. Of course, this is made more difficult if users are allowed to choose arbitrary, anonymous handles each time they log in. As @swohio notes above, the recent tendency of sites to require Facebook logins as a way to enforce authenticity has had mixed results. That approach doesn't seem to rein in the crazies, as you might expect it would, but it does at least give each poster a consistent screen name, so you can more easily spot the ones with a better track record.
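If you keep that list of reliable aliases somewhere, the scan itself is easy to automate. This is only a sketch of my manual habit; the alias list and the comment fields are made up for the example, but it shows the idea: filter first, read second.

```python
# Sketch of the "jot down good aliases, then scan for them" habit.
# The alias list and comment fields are hypothetical.

TRUSTED_ALIASES = {"swohio", "GloriaBTM", "maestro406"}

def worth_reading(comments):
    """Yield only the comments posted by aliases that have been useful before."""
    for comment in comments:
        if comment["author"] in TRUSTED_ALIASES:
            yield comment

stream = [
    {"author": "Tzckrl", "text": "STOOPID! R u kidding?"},
    {"author": "swohio", "text": "Facebook logins haven't helped at the sites I read."},
]
for comment in worth_reading(stream):
    print(comment["author"], "-", comment["text"])
```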

In "Trust Me, I'm @HamsterOfDoom," I talked about how a framework like the Ethosphere can help develop and maintain trust relationships with pseudonymous identities. By maintaining a numerical ranking (a rep, as I call it in the Ethosphere) that reflects the degree to which an alias has contributed constructively to the community in the past, we can more easily filter out the noise while still allowing open participation. The trolls can still troll, but their influence in the group will be limited.
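To make that concrete, here is a minimal sketch of how a rep could drive the display of a comment stream. The rep numbers, field names, and threshold are mine, invented purely for illustration; the Ethosphere's actual mechanics are described in that earlier post.

```python
# Minimal sketch: sort a comment stream by each alias's rep and collapse
# anything below a reader-chosen threshold. Numbers are illustrative only.

reps = {"GloriaBTM": 8.2, "maestro406": 6.5, "Tzckrl": 0.3}

def arrange(comments, threshold=1.0):
    """Return (shown, collapsed): comments sorted by author rep, split at threshold."""
    ranked = sorted(comments, key=lambda c: reps.get(c["author"], 0.0), reverse=True)
    shown = [c for c in ranked if reps.get(c["author"], 0.0) >= threshold]
    collapsed = [c for c in ranked if reps.get(c["author"], 0.0) < threshold]
    return shown, collapsed

stream = [
    {"author": "Tzckrl", "text": "STOOPID! R u kidding?"},
    {"author": "GloriaBTM", "text": "The old IMDB boards handled this well."},
]
shown, collapsed = arrange(stream)
print([c["author"] for c in shown])       # ['GloriaBTM']
print([c["author"] for c in collapsed])   # ['Tzckrl']
```

Note that the troll can still post; with a rep near zero, the comment simply starts out collapsed instead of deleted, so the reader decides whether to dig.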

Another possible approach, proposed by Nick Denton in the linked interview, could have a similar impact.
The answer? Denton said his sites are planning to post some stories that allow only a hand-picked, pre-approved group of people to comment on them. That, he said, would make the comment section an extension of the story and allow people [...] to have their say without fear of being piled onto by others.
The fundamental difficulty with this approach is that someone else (the publisher?) anoints an elite few who may participate in the conversation. That runs counter to the original goal of open participation and violates an unwritten Internet law just as much as requiring authenticated logins would. A better solution is to let the participants themselves choose who should have proportionately greater influence, in a dynamic and ongoing fashion. In "I'm Not Elitist, Just Better Than You," I discuss how and why this is done within the Ethosphere.
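What might "the participants themselves choose" look like in practice? Here is one hedged sketch, not the Ethosphere's actual update rule, in which a reader's endorsement nudges another alias's rep upward in proportion to the endorser's own standing, so influence accrues from the group rather than from a publisher.

```python
# Hypothetical peer-driven rep update: an endorsement raises the target's rep
# in proportion to the endorser's standing, so influence is earned from the
# group rather than granted by a publisher. Names and numbers are made up.

def endorse(reps, endorser, target, step=0.1):
    """Raise target's rep in proportion to the endorser's own rep."""
    reps.setdefault(target, 0.0)
    reps[target] += step * reps.get(endorser, 0.0)
    return reps

reps = {"GloriaBTM": 8.0, "newcomer": 0.5}
endorse(reps, "GloriaBTM", "newcomer")
print(reps["newcomer"])   # 1.3: a nod from a high-rep reader counts for more
```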

Do you think pseudonyms + reputation could be a solution to this problem? Are there other self-policing, self-organizing ways to do this?
