Science, Merit, and the Internet (Part 1 of 2)

In the past six weeks, online science journalism has been rocked by two controversies: in late September, the nearly 150-year-old magazine Popular Science decided to turn off the comment function on its website because, as its editors claimed, "Comments can be bad for science." And a few weeks later, Bora Zivkovic, the blogs editor for Scientific American, resigned over charges that he sexually harassed female science writers. Both of these issues have important implications for the future of science communication on the Internet, and both have deep connections to issues that historians and other members of the interdisciplinary field of Science and Technology Studies (STS) have been examining for a long time. In the next two blog posts, I'll examine the controversies in turn and try to say something about what they mean for science writing when they are viewed together. In these posts, I hope to build on Hank's brilliant four-part analysis of "The Fall of Jonah Lehrer" here on American Science (for example, here), but I will say more about that later.
(Source: Wellcome Images) The image that accompanied Popular Science's announcement that it would no longer allow comments on its online articles.  

In an interview with National Public Radio, Jacob Ward, the editor-in-chief of Popular Science, claimed that his staff shut down the comment function for multiple reasons:

"We had three deciding factors that it came down to. One is the rise of trolls, which is a pretty well-understood term these days - basically, people who come into a comment section of a website to be abusive or unpleasant. Second, we had bumped into on our own site, and then had seen it sort of confirmed in other places - and seen, also, studies about this - we discovered that troll behavior - that being unpleasant, being uncivil, sort of being really fractious in a debate - can cause readers to actually misunderstand things that are scientifically validated.

And third, we decided that it was a matter of resources. There is a way - there are many ways to patrol the comments on one's own site; but if we have a limited number of resources - and everybody does - I'd rather pour that into our primary mission, which is great journalism; putting out the best science journalism we can, rather than just trying to patrol our comments for all time."

Ward's words evoke many issues, several of which connect with work done in STS. For now, I'd like to focus on only two. First, as scholars are increasingly pointing out, governance—and that's how I see Pop Sci's move, as an attempt to govern its virtual space—is often based on implicit or explicit ideas about human nature or psychology. (See Hank on related themes in his discussion of Chris Mooney here.) To give a few oversimplified examples: on the one hand, liberalism (in the classic sense) is founded on the idea that humans are at least semi-rational creatures who can make the right decisions if they have access to good information. On the other hand, conservatism sees humans as basically mean and stupid and badly in need of control. It's interesting, therefore, to see how a certain picture of human psychology was built into Popular Science's explanation of why it was doing what it was doing. In its statement, it cited a New York Times op-ed by Dominique Brossard and Dietram Scheufele, two University of Wisconsin professors of science communication. In that op-ed, Brossard and Scheufele explained that their research found that, "Uncivil comments not only polarized readers, but they often changed a participant's interpretation of the news story itself."

One of my friends said that Popular Science's citation of this research amounted to pseudoscience. I wouldn't go that far, but it's pretty clear that Popular Science took the research out of context and used it for its own ends, through a leap of questionable logic. In the NPR interview, Robert Siegel called Ward out on this:

"SIEGEL: In making this decision, you cited a University of Wisconsin study which you say found that uncivil discourse in the comments after an article affects the way people digest what they just read and it can make a settled fact appear to be contentious, for example. But doesn't the study say actually that negative comments do not persuade, they only edify people who are leaning one way or another?
WARD: The researchers found that the reaction was skewed by the level of fractiousness in the debate, that basically the more uncivil the discourse, the more people's perceptions had changed, their opinions changed over time. Now, you're absolutely right that a portion of readers are going to be able to come through that unscathed and...
SIEGEL: But was the finding of the University of Wisconsin study that readers typically were affected by comments or that readers who were leaning one way or the other leaned farther after reading those comments?"

I think the sad fact is that it is the issue of labor and limited resources, not these social scientific ideas, that drove the decision at Popular Science. In a recent interview on On the Media, Arianna Huffington discussed the Huffington Post's decision to forbid anonymous comments because of trolling. In the interview, she mentioned that Huff Po has 40 full-time comment moderators! But like most journalistic enterprises, Popular Science faces seriously constrained resources, and it doesn't have the budget to moderate its pages properly. Nevertheless, there are increasing conversations about the psychology of the Internet, focusing, for instance, on what explains trolling and whether there is something inherently bad about anonymity in certain contexts. We can imagine a future in which a social psychology of the Internet informs governance (beyond the social psychology that is already informing how Facebook and other corporations design and govern their products), but I think that day is still some way off. Even when it does come, we will have to ask the classic questions: who will get to decide what is 'good' social science? And who will decide how social scientific ideas are translated into governance structures?

This brings me to my second issue: what role does democracy play in all of this? A certain notion of democracy plays an essential role in Internet ideology, including the ideology that argues for the virtue of comment fields. (See Andy Russell's recent post on this ideology on American Science here.) We see some reflections on democracy, or at least its near relatives, in the NPR interview with the Popular Science editor:
"WARD: In some cases, [negative comments] can skew people's understanding of an otherwise established scientific principle and . . . lend this air of debate to places where science has really come to a place of consensus.
SIEGEL: And you seem to have bumped into the problem here that science, at its heart, is not a populist enterprise. There's expertise. There are people who stage experiments and prove things and that's different from someone else having an opinion on the same subject."

Wow! There's so much going on in this brief exchange. First of all, there is science, which has "come to a place of consensus" (whose consensus?), a consensus that Popular Science is duty bound to communicate to the public (that's the "popular" part). Then there is expertise and people "proving" things and the difference between mere "opinion" and something else (science, one assumes). It's interesting, for example, that Siegel used the word "populist" and not its close relative "democracy," because, as historians have long known, there is a long-running anti-populist tradition in the United States. Yet Internet gurus are always preaching the wisdom of crowds(!), which seems to suggest that, despite the Internet's early use by academics, the technology has come into fundamental tension with at least some notion of science.

In the end, the important point is that you can have fellow-feeling for Popular Science's quandary and come out in the exact opposite position. In an email, my buddy and Stevens colleague, the science journalist John Horgan, wrote, "I sympathize with Pop Sci's decision to shut down comments. I write about a lot of hot-button issues, like drones, human genetics, psychiatric drugs, inequality, climate change, and so I get a lot of hostile, crazy comments. Sometimes I'm tempted to censor the especially offensive ones or just delete them all, as Pop Sci did. But I'm a free speech absolutist, and I figure that if I express strong opinions on a topic, others should have their say. Also, as naive as it sounds, my hope is that over the long run reason will prevail over the forces of darkness."

I am going to save my analysis of what has gone wrong in Popular Science's decision, and of what science writers need to do differently, for the next blog post. For now, I would like to end with a question. The image above is the one that Popular Science published with its post. The image seems to say, "Look, this is science; this is what we do; we bring it to you." (The .jpg is named "nano," and we all know how much we love to hear about nano.)

But what if Popular Science had used a different image to envision its mission? What if it used this one?

Or even this one?


Why does everything have to be a virtual social network nowadays? Outside of the virtual realm, you would read an article and discuss it with your friends who, in the interest of staying your friends, wouldn't troll you into changing your opinion on whatever subject you're discussing. Any sort of anonymity kills this notion. Regardless, there are enough outlets for discussion on the Internet, and decentralizing them among the countless websites providing any sort of academic discourse only really appeals to idealists. I assume that whatever study Ward was referencing left one important facet unexplored, and that is the general psychology of trolls. I like to think that an affinity for chaos and discord is grouped with a host of very base human qualities, including laziness. The average troll will troll out of boredom. If the opportunity for havoc arises, the troll will grasp it. However, I very much doubt the average troll would seek out a dedicated forum for discussion just to sow destructive ideals. As it is, commenting on an article is easy. Any reader can finish reading and comment within the same minute. Thus, centralizing discussion on a site away from the content would thin out the lazy members of the herd, reducing the need for moderation and potentially hindering lazy trolls. Now this idea is not original, nor has it been shown to succeed in the long run (see Reddit), but it's one humble student's suggestion that seems to have some logic behind it.

