Mon March 11, 2013
The 'Nasty Effect': How Comments Color Comprehension
Originally published on Tue March 12, 2013 4:56 pm
At its best, the Web is a place for unlimited exchange of ideas. But Web-savvy news junkies have known for a long time that reader feedback can often turn nasty. Now a study in the Journal of Computer-Mediated Communication suggests that rude comments on articles can even change the way we interpret the news.
"It's a little bit like the Wild West. The trolls are winning," says Dominique Brossard, co-author of the study on the so-called nasty effect. Those trolls she's referring to are commenters who make contributions designed to divert online conversations.
Researchers at the University of Wisconsin-Madison and Virginia's George Mason University worked with a science writer to construct a balanced news story on the pros and cons of nanotechnology. More than 1,000 subjects reviewed the blog post, framed as coming from a Canadian newspaper, which discussed both the antibacterial benefits of nanosilver particles and their water-contamination risks.
Half saw the story with polite comments, and the other half saw rude comments like, "If you don't see the benefits of using nanotechnology in these products, you're an idiot."
"Basically what we saw," Brossard says, "is people that were exposed to the polite comments didn't change their views really about the issue covering the story, versus the people that did see the rude comments became polarized — they became more against the technology that was covered in the story."
Brossard says they chose the nanotechnology topic so that readers would have to make sense of a complicated, unfamiliar issue. She says communication research shows that people use mental shortcuts to make sense of things they don't understand.
"We need to have an anchor to make sense of this," she says. "And it seems that rudeness and incivility is used as a mental shortcut to make sense of those complicated issues."
Brossard says there's no quick fix for this issue. While she thinks it's important to foster conversation through comments sections, every media organization has to figure out where to draw the line when comments get out of control.
"You don't want to be censoring opinions, but you don't want to allow neither points that are out of topic and that are offensive to the other people that are discussing," she says.
Some sites remove offensive comments, some have moderators to regulate the conversations, and others turn off commenting once a certain number of comments is reached. Brossard says it's important for people involved in journalism and online communication to realize the influence that comments can have and to formulate appropriate policies.
"I think what we need to define now on the Web, what is a good conversation? What are the things that are allowed socially? Also, as an audience, what do we let happen there?"
All good things to keep in mind before you post a comment below.
NEAL CONAN, HOST:
What was once a democratic forum for opinion and information has been reduced to a feeding frenzy. No, not Congress. We're talking about the comments section. Web-savvy news junkies have known for a long time that reader feedback can often turn nasty. Now, a new study finds that just reading that uncivil discourse can change the way we interpret the news. The study comes from this month's Journal of Computer-Mediated Communication by five researchers from the University of Wisconsin-Madison and George Mason University.
So if you write for the web, do you keep your commenters in mind as you write? 800-989-8255. Email us: firstname.lastname@example.org. You can also join the conversation at our website. That's at npr.org, click on TALK OF THE NATION. Dominique Brossard is a professor in the Department of Life Sciences Communication at the University of Wisconsin-Madison, co-author of the first study looking at the so-called nasty effect. She joins us from WPR in Madison. Nice to have you with us today.
DOMINIQUE BROSSARD: Thanks for having me.
CONAN: And did it come as a surprise to find that opinion on an article changed if someone wrote "that's the stupidest thing I've ever heard of" at the end of it?
BROSSARD: Well, there's been some concern about incivility and, you know, the fact that people tend to be rude in the online environment for quite a long time. But as far as the effect of the comments themselves being rude in changing the perception of the issue covered in the story, yes, we investigated this from an open-minded perspective, and we were surprised to see that what we used as rude comments - which, as one of the news reporters that covered our study said, was quite tame compared to the online environment, you know, like name-calling such as idiot or things like that - were enough to actually change, you know, the perception of the story. So what we used as uncivil comments were actually quite tame. So if those have an effect, then the question is what happens when people are much more outrageous in their way of behaving online.
CONAN: So you tested for this by writing a spurious article and then people read it without comments and people read it with nasty comments.
BROSSARD: No, not exactly, actually. What we did is that we wrote a balanced news story about something complicated, an emerging technology, nanosilver, that can have some risks and some benefits. And we wrote this with a science writer to make sure that, you know, we would not be pushing the risks or the benefits related to the issue more. And then we had more than 1,000 Americans, a representative sample, look at this story. Half of the sample saw the story plus comments that were actually quite polite. And then the other half of the sample saw the same story, but with comments that, you know, included name-calling. So the same comments, but with name-calling and, you know, rudeness overall in the comments.
So basically what we saw is people that were exposed to the polite comments didn't change their views really about the issue covered in the story, versus the people that did see the rude comments became polarized - they became more against the technology that was covered in the story. And again, remember, like, everybody saw the same story. So it was really the fact that you had that name-calling, that, you know, that really out-of-bounds type of behavior, that made a difference.
CONAN: As you said, the nasty comments you included were not all that nasty as far as the Web is concerned, and nanosilver, I have to say, is not as controversial a subject as, say, abortion or gun rights or for that matter, climate change.
BROSSARD: Yes, exactly, and that's a good point. We did choose nanosilver. We are interested in emerging technologies. How can people make sense of complicated issues that have multiple dimensions, that are not just about, oh, I'm against or for this because my values tell me that that's not OK, right? But if you think of complex issues such as, let's say, foreign policy or, you know, the economy or, as you just mentioned, climate change, that are complicated - you know, they have different dimensions: a social dimension, technical dimension, legal dimension, ethical dimension. So how do people make sense of this kind of complicated issue? And we know in communication research that people - all of us - use mental shortcuts to make sense of things that are complicated, right, because we cannot know everything about everything.
So we need to have an anchor to make sense of this. And it seems that rudeness and incivility is used as a mental shortcut to make sense of those complicated issues. So as you say, Neal, and it's a good point, this is likely to happen for other issues such as climate change, for example, that are also very complicated and have different dimensions.
CONAN: So how do we deal with this? What do we do?
BROSSARD: You know, I wish I could answer that question. I think what we do is, like, we all put our heads together around the issue and try to find a way - all of us, you know, that are involved in journalism, communication, online communication, and that actually believe that it's important, as you do in your show by having callers call in and be part of the conversation. It's important to have that conversation. But how do we foster a constructive conversation? And I think what we need to define now on the, you know, like, on the Web is: what is a good conversation? What are the things that are allowed socially? Also, as an audience, what do we let happen there?
Because what happens right now is, you know, as I said elsewhere, it seems to be like the Wild West. The trolls are winning. Now people, you know, they are used to seeing people being very outrageous in those comments, and then they won't comment because they don't want to be, you know, included in this type of discussion. So I think we need to really move towards: how can we moderate those comments in a way that's good for everybody?
CONAN: So we want to hear from those of you involved in online communications. How do you deal with flamers or, well, do you anticipate their comments and adjust your articles accordingly? Give us a call: 800-989-8255. Email us: email@example.com. We'll start with Brett, and Brett's on the line with us from Phoenix.
BRETT: Yeah, hi. Very interesting. I have been commenting mostly at the New York Times on many of the stories that I read, especially in the opinion section. And for example, I just read Paul Krugman's very good article, I believe, in the last day. And I find that it adds layers of depth to the overall opinion piece or article that I don't get in the article itself in many cases. And I've been reading these comments for a number of years.
I would like to say one thing, though. I kind of disagree with the person on the show saying that we have to, quote, foster or moderate. I think this is a reality of today's online world, and I don't see how it's any different than going to a town meeting where you might have a few hecklers and people in the audience. But still, the conversation goes on and information can be exchanged. So I'll take my answer off the air.
CONAN: Just a comment, Brett. I think that the New York Times and other places, too, will have people remove comments that are considered offensive.
BRETT: Right. And I don't think they have to be heavily edited or moderated. I do want to hear divergent points of view, and I appreciate that.
CONAN: OK. That's a different point and, I think, a good one. Dominique Brossard?
BROSSARD: Yes, indeed. And I hope, you know, I won't be misunderstood - it's really not my intention to suggest that we need to heavily edit comments, and I do believe, indeed, that you can have excellent, you know, comments that follow an article and bring a lot to the conversation. As a matter of fact, a colleague of mine and I had an opinion editorial in the New York Times last Sunday, and, you know, the readers brought amazingly good points that really added value to our piece.
Unfortunately, it's not always the case. And to give you another example, our research was covered in the Journal Sentinel here in Milwaukee, and we got some comments that were totally disconnected from the topic of the article and were, you know, borderline offensive. So every newspaper, every media organization has, you know, different ways to deal with this. And the question is - and I agree with your caller - where do you draw the line? You don't want to be censoring opinions, but neither do you want to allow points that are, you know, off topic and offensive to the other people that are discussing.
CONAN: Let's get Mike on the line. Mike's on the line with us from Cape Cod.
MIKE: Good afternoon.
CONAN: Good afternoon.
MIKE: I'm an editor of a website about Earth science. And one of the things that we do is we try to edit out comments, delete comments that don't advance the conversation. I don't necessarily mean ones that are disagreeing with the point of view or, in the case of science, you'll get people who are denying the science. We don't necessarily even weed those out. But the things that are nasty, that are uninformative, that are just sort of someone's making it up, we just go out of our way and nip those comments in the bud. They're not advancing the conversation. And just because you have a First Amendment right to have freedom of speech does not mean you have a First Amendment right to express that speech on my website.
CONAN: I understand that, too, but doesn't this take up a fair amount of time?
MIKE: It can. That's what the delete button is for.
CONAN: I see. OK. But there are other journals and other publications that don't have editors like you to say, well, I don't know - maybe this one, maybe not that one.
MIKE: Yes. I mean, that's the problem. But I think it's an obligation. If you're going to open up the conversation online, you need to be prepared to moderate it and to stay engaged with it. A lot of - where a lot of sites go wrong, whether it's at federal agencies or at newspapers, is that they open this up and then they let the free-for-all happen, and then they wonder why it goes wrong. If you're going to have - if you can't manage the conversation, you shouldn't open the conversation. Just don't take comments.
CONAN: Just as a question of technique, do you respond to, you know, commenters' questions? I mean, for example, if they ask a factual question, would you say "the answer is 476, Ed" or something like that?
MIKE: Yes, we will do it. We've been known to make corrections or make adjustments or additions to a story and note that, that readers pointed that out.
CONAN: So there is a hint that there is an intelligence there that is moderating the conversation.
MIKE: Yes, absolutely. And we owe that to the readers. It's not easy. It's not - it does take time, but it - that's our responsibility, if you're going to have a conversation. That makes it two-way. Otherwise, you're just talking at people.
CONAN: Mike, thanks very much for the call, appreciate it.
MIKE: Thank you.
CONAN: We're talking with Dominique Brossard, a professor in the Department of Life Sciences Communication at the University of Wisconsin-Madison and one of the authors of a new study in this month's Journal of Computer-Mediated Communication on the online nasty effect. You're listening to TALK OF THE NATION from NPR News.
And Howard's on the line, Howard calling from San Antonio.
HOWARD: Yes, hi. I just was going to comment and say that I have a site, and it's a soccer site. It's americanizesoccer.com. And with a name like that, you can imagine that there's a lot of people who disagree and are very upset that here I am stating some of the rules in soccer may need to change and - for Americans to like the game better. So I know ahead of time that people are not going to agree with what I'm saying.
But what I try to do, because I've had so many people comment - the trolls that you talk about, people out there who just make horrible comments just for the heck of it - is try to be as clear and concise as possible. I try to make common sense. And in the end, people are going to disagree, but at least I can feel good about the fact that I've written the article to try to explain exactly why Americanized soccer is something that we should look to for the future of the sport.
CONAN: Well, I don't - I will not agree or disagree about Americanized soccer, but I will appreciate your difficulties in writing about something that people, as you suggest, feel so passionately about. So I don't envy your editing tasks.
HOWARD: Exactly. And thank you for allowing me to comment.
CONAN: Thanks very much, Howard. And let's see if we go next to - this is...
BROSSARD: Neal, if I can say something.
CONAN: Go ahead, please.
BROSSARD: I think it's important to keep in mind - I think it was great to hear that your callers, you know, wanted to establish a fruitful conversation with their readership. But I think the point that you brought up earlier, which is related to the volume, is extremely important. I mean, how can you deal with that? Our opinion editorial in the New York Times, you know, had 400 comments after two hours. So they just stopped allowing, you know, those comments to be posted.
So the problem is really a question of human power, or how can you deal with this? And I think that's where we are right now. And you do have different companies that are trying to develop different ways of having automated ways of moderating the comments, you know, but this is complicated. You need to have intelligent algorithms that let you, you know, still keep the essence of the conversation and not be too harsh in eliminating the ones that do not feed the flow, for example.
CONAN: Here's a tweet from Jessie Hudgens: "I try to engage with trolls. Often, gentle pushback can lead to concessions and convert rants into fruitful debate." And again, that's the moderation question - but what if you got 400 of them? That's the point that Dominique Brossard just made. Matisse is on the line, calling from San Francisco.
MATISSE: Hi, there. So I have about 25 years of experience in online discussion forums because I started using and eventually became an employee of and then continue to use a system called The WELL, which was one of the early online discussion systems. And my comments or observation is that a technique that is not a silver bullet but has tremendous value is to not allow anonymous commentary.
So, for example, if people must use their real name and it is linked to, say, their zip code, if it's a national or international forum, and backed by the use of a credit card - not that they have to pay in order to make a comment, but that it's tied to a real, verified identity - it helps a lot. Obviously, NPR uses a much higher-cost technique of screening every caller before they're allowed on the air. And you can't do that if you're getting 400 in two hours. But if you restrict comments to people who are publicly identified by their real name and do not allow anonymous...
CONAN: And I don't mean to cut you off, Matisse, I just wanted to give Dominique Brossard a chance to respond. Is the cloak of anonymity a problem here?
BROSSARD: Well, actually, it does help to some extent to have, you know, a way to link the commenter to some kind of identity. But it's not foolproof either. For example, that news story I was commenting on, the one that generated a lot of nasty comments about our own, you know, study - you had to actually log in with your Facebook account to comment there. And still, we got a fair number - we got more than 170 comments, and a number of them were very good ones. You know, when I say good, it's like either they disagree or agree, but in, you know, constructive terms. And we still got the nasty ones. So let's say that didn't prevent the trolls from invading our space.
CONAN: And, Matisse, thanks very much. As you suggest, not a silver bullet, but perhaps helpful. Our guest was Dominique Brossard, co-author of a new study on the Web's nasty effect. She joined us from Wisconsin Public Radio in Madison, and we thank her very much for her time. Tomorrow: doctors on pot prescriptions. Join us for that. I'm Neal Conan, it's the TALK OF THE NATION from NPR News. Transcript provided by NPR, Copyright NPR.