nettime's_elderly_janitor on Sat, 14 Mar 2020 20:37:24 +0100 (CET)



<nettime> NYT > opinion > Newitz > We Forgot About the Most Important Job on the Internet


< https://www.nytimes.com/2020/03/13/opinion/sunday/online-comment-moderation.html >

Opinion

We Forgot About the Most Important Job on the Internet

Content moderators are essential gatekeepers, but also our
greeters, paramedics, law enforcers, teachers and curators.

Annalee Newitz

March 13, 2020

Most of the internet is made of comments. Some are like the
old-fashioned ones you can see accompanying this article
online, but others take the form of memes, gamer
live-streams, or even breaking news from people on the
ground at a disaster scene.

And yet, the more ubiquitous comments are, the more
tech companies treat them like the detritus of the internet
-- little more than raw data to be mined and analyzed for
political candidates or marketers, or mechanically sorted by
algorithms for posting or rejecting.

It doesn't have to be this way. In all these efforts to
process comments, we've lost sight of one of the most
crucial jobs created by the internet economy: the
moderator.

We need to put human moderators back at the center of our
social media, where they belong. But to do it, we'll need to
acknowledge what moderators have already done, and what the
job actually involves.


"Moderator" became a tech job in the early 2000s, right
around the time when people started joking, "Never read the
comments," because they were so unbearable. Companies hired
moderators to prevent abuse, report illegal content to law
enforcement, ban commenters who broke the rules and
generally keep the peace.

But the gig was more than that. Jessamyn West, a librarian
who was a moderator for 10 years at MetaFilter, said the job
is like what Catskill entertainers of the mid-20th century
called a tummler, "the person in the room who isn't quite
the M.C. but walks around and makes sure you're doing OK."
Tummlers were basically professional minglers at shows and
social gatherings. If you were feeling shy, they'd even help
you strike up a conversation with other vacationers at the
resort.

Then, as the number of commenters soared, behemoth platforms
like Facebook and YouTube had a tough time scaling up the
tummler model. They also needed a new kind of moderator, one
who was more like a paramedic than a social director.

These moderators are the people who review abuse complaints,
usually on posts that have been flagged by users. Like
paramedics in real life, they see a lot of things they wish
they could unsee. Sarah T. Roberts, an information studies
professor at the University of California, Los Angeles, has
interviewed moderators who report spending days at a time
looking at videos of animal torture, child abuse and worse.
In her recent book, "Behind the Screen," she found that
moderators suffer traumas that are very similar to those
felt by rescue workers at a disaster scene.

To cope, some companies have tried to replace human
moderators with algorithms. The results have been mixed at
best. Some of the most high-profile failures were at
Facebook, where algorithms censored archaeological images
showing a 30,000-year-old nude figurine, while allowing live
video of suicides to circulate widely. Facebook promised
last year to hire thousands of human moderators -- and, in
some cases, to provide them with trauma therapy.

Those are good first steps for disaster-response moderation,
but we also need to revive what Ms. West called the tummler
part of the job. It's a tough gig, but it can be done.
Especially if companies admit that there is no
one-size-fits-all solution for moderation.

This is why human moderators are so valuable: they can
understand what's important to the community they're
moderating. On the Reddit forum r/science, for example,
moderators will delete posts that aren't based on
peer-reviewed scientific research. And on the fan-fiction
forum An Archive of Our Own, where many people prefer to
post stories under pseudonyms, members can be banned for
revealing the legal name of another member.

A well-trained moderator enforces these rules not just to
delete abuse, but also to build up a unique community. At
AO3, for example, there is a class of moderator called a
"tag wrangler," whose job is to make sure stories are
labeled properly for users who don't want "Iron Man" fic
mixed in with "Iron Giant" fic. Or "Iron Chef"! The forum is
also recruiting bilingual moderators who can answer
questions and post items of interest for its growing
community on Weibo, China's most popular microblogging site.

Monique Judge, an editor at the black news site The Root,
told me that she and her colleagues are inundated with
racist comments. But instead of banning the commenters, or
deleting their words, The Root lets them stand. "We let
those stay so that people can see how ignorant they are,"
she said. "I feel like those comments are just our reality
as black journalists. No matter what we talk about, people
will say, 'Don't discuss this because you're black.'"

Ms. Judge's point is that context matters. Racist comments
mean one thing in The Root's community, where black
perspectives are centered, and quite another on Twitter,
where they are not.

Moderators aren't the only ones responsible, though. They
are effective only if they have the support of their
employers. Anil Dash, a social critic and podcaster who runs
the app development community Glitch, once argued, in an
essay that has become a classic among moderators, that if a
website's comment section is full of jerks, "It's your
fault."

Now, he finds that the problem is a conflict of interest
between moderators, who want to enforce the rules, and
executives, who want content from famous or controversial
people at any cost. "That's why Twitter hasn't banned
Donald Trump from its platform," he said, even though the
president has repeatedly broken Twitter's rules against
posting violent threats. Moderation works only if the rules
apply to everybody on a platform.

There's something just as important as banning people, Mr.
Dash notes. "When you define rules of suspension, you have
to make rules of reinstatement, too," he said. When Twitter
and other platforms ban people, they should also tell them
what they need to do to become commenters in good standing
again. Perhaps they have to delete the tweet or comment that
broke the rules, or apologize. The idea is a bit like
restorative justice for the internet, where offenders are
given a clear pathway back into their community if they
choose.

Moderators are gatekeepers, but they're also the welcoming
committee. As well as the paramedics, the law enforcers, the
teachers and the curators. And, sometimes, they're friends.
Unfortunately, no single person can be all of these things.

We need to expand the ranks of moderators and acknowledge
that the job has many subspecialties. But most of all, we
can't forget why we needed moderators in the first place:
They're our tummlers, helping us have a good time.



Annalee Newitz (@Annaleen) is a contributing opinion writer
and the author, most recently, of "The Future of Another
Timeline."

#  distributed via <nettime>: no commercial use without permission
#  <nettime>  is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: http://mx.kein.org/mailman/listinfo/nettime-l
#  archive: http://www.nettime.org contact: nettime@kein.org
#  @nettime_bot tweets mail w/ sender unless #ANON is in Subject: