Identity, Anonymity, and Controlling Trolls

Multiple personalities

Flame wars and jihadist rants and generally worthless behavior in the comments: that’s the problem I’m trying to solve here.

And I’m trying to do it while preserving anonymity. Internet conversation can get nasty when the participants are anonymous, which has led to proposals of tying all online identities to “real” identities. This is the wrong solution to the troll problem, because it destroys privacy in a serious way. I want to build discussion systems that allow anonymous comments, yet remain orderly, civil, and enlightening. I think this can be done with filtering systems based on reputation.

Reputation is a thing that sticks to an identity. Historically most people had only one identity, closely tied to their physical presence. But now, online, every one of us has multiple identities: think of how many user names and logins you have. There’s some consolidation going on, in the increasing acceptance of Google, Twitter, and Facebook logins across the web, and this is mostly a good thing. But I don’t think we want to aim for a world where each person has only one online identity. Multiple identities are good and useful.

Multiple identities are closely related to anonymity. Anonymity doesn’t mean having no identity; it means not being able to tie one of my identities to the others. I want to be very careful about who gets to tie the different parts of me together. I’m going to give two arguments for this, which I’ll call the “does your mother know” and “totalitarian state” arguments. They’re both really important, and I’d be really sad if we lost anonymity in either case. And after I’ve convinced you that we need anonymity, I’ll talk about how we get people to behave even if they don’t leave a name.

Keeping the different facets of ourselves apart is the essence of privacy. We’ve always been different people in different contexts, but this was only possible because we could expect that word of what we did with our friends last night would not get back to our mother. This expectation depends upon the ability to separate our actions in different contexts: your mom or your boss knows that someone in the community is going on a bender/having kinky sex/voting Republican, but not that it’s you. The ability to have different identities in different contexts is intricately tied to privacy, and in my mind no different from setting a post to “friends only” or denying the details of your personal life. Although the boundaries around what is “personal” are surely changing, if you really think we’re heading toward a world where everybody knows everything about everyone, you’re mad. For one thing, secrets are immensely valuable to the business world.

And then there’s China. I live right next door to the most invasive regime in the world. The Chinese government, and certain other governments such as South Korea’s, are trying very hard to tie online and corporeal identities together by instituting real-name policies. This makes enforcement of legal and social norms easier. Which is great until you disagree. Every damn blog comment everywhere is traceable to you. Every Wikipedia edit. Everything. China is trying as hard as it can to make opposing speech literally impossible. This is not theoretical. As of last week, you can’t send dirty words through SMS.

When the digital panopticon is a real possibility, I think that the ability to speak without censure is vital to the balance of power in all sectors. Anonymity is important to a very wide range of interests, as the diversity of the Tor project shows us. Tor is a tool and a network for anonymity online, and it is sponsored by everyone from rights activist groups to the US Department of Defense to journalists and spies. Anonymity is very, very useful, and is deeply tied to the human right of privacy.

Right, but… how do we get sociopaths to play nice in the comments section if they can say anything they want without repercussions?

The general answer is that we encourage social behavior online in exactly the way we encourage it offline: social norms and peer pressure. We can build social tools into our online systems, just like we already do. A simple example is the “flag this” link on many commenting systems. Let’s teach people to click it when they mean “this is a useless post by troll.” Collaborative moderation systems — such as “rate this post” features of all kinds — work similarly.

Collaborative moderation is a really big, important topic, and I’ll write more about it later. There are voting systems of all kinds, and the details matter: compare Slashdot, Digg, and Reddit. But all of these systems rate comments, not users, and I think this makes them weaker than they could be at suppressing trolls and spam. Identities matter, because identities have reputations.

Reputation is an expectation about how an identity will behave. It is built up over time. Crucially, a throw-away “anonymous” identity doesn’t have it. That’s why systems based on reputation in various forms work to produce social behavior. There are “currency” systems like StackOverflow’s karma, where one user can give another credit for answering a question. There are voting systems, such as the Huffington Post’s “I’m a fan of (comment poster)”, which are designed to identify trustworthy users. Even Twitter Lists are a form of reputation system, where one user can choose to continuously rebroadcast someone else’s tweets.
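To make the “rate users, not comments” idea concrete, here’s a minimal sketch in Python. The class and names are invented for illustration, not any real site’s API; the point is only that votes on a comment accrue to the identity that posted it, so reputation outlives any single comment, and a fresh throw-away identity starts with none.

```python
from collections import defaultdict

class KarmaLedger:
    """Reputation that sticks to identities: votes cast on a comment
    are credited to the identity that posted it, not to the comment."""

    def __init__(self):
        self.karma = defaultdict(int)   # identity -> accumulated karma
        self.author = {}                # comment_id -> posting identity

    def post(self, comment_id, identity):
        self.author[comment_id] = identity

    def vote(self, comment_id, delta):
        # The comment is just the occasion; the identity keeps the score.
        self.karma[self.author[comment_id]] += delta

ledger = KarmaLedger()
ledger.post("c1", "alice")
ledger.vote("c1", +1)
ledger.vote("c1", +1)
ledger.post("c2", "alice")
ledger.vote("c2", -1)
print(ledger.karma["alice"])  # 1: alice's record spans both comments
```

A per-comment rating system would throw the score away when the thread dies; here it follows “alice” into her next conversation.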

And in the context of online discussion, you use reputation to direct attention.

That’s what filtering is: directing attention. And this is how you deal with trolls without restricting freedom of speech: you build collaborative filters based on reputation. Reputation is powerful precisely because it predicts behavior. New or “anonymous” identities would have no reputation and thus command little attention (at least until they said a few interesting things) while repeat offenders would sink to the bottom. Trolls would still exist, but they simply wouldn’t be heard.
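As a sketch of what such a filter might look like (the names, scores, and default weight are all illustrative assumptions, not a real system): comments are ordered by the poster’s reputation, so unknown identities land near the bottom rather than being silenced, and established offenders sink below them.

```python
def rank_comments(comments, reputation, default_rep=0.1):
    """Order comments for display by the poster's reputation.

    comments:   list of (comment_id, identity) tuples
    reputation: dict mapping identity -> score built from past behavior
    Unknown ("anonymous") identities get a small default score, so they
    are shown last among non-offenders, not censored outright.
    """
    return sorted(comments,
                  key=lambda c: reputation.get(c[1], default_rep),
                  reverse=True)

reputation = {"longtime_regular": 42.0, "known_troll": -5.0}
comments = [("c1", "known_troll"),
            ("c2", "throwaway_9831"),   # no track record yet
            ("c3", "longtime_regular")]
print(rank_comments(comments, reputation))
# regular first, fresh throwaway in the middle, troll at the bottom
```

Nothing is deleted and no speech is restricted; attention is simply allocated by track record, and the throwaway can climb by saying a few interesting things.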

NB, none of this requires tying online identities to corporeal people. Rather than being frightened of anonymity and multiple identities, I think we need to embrace them. We need to trust that we can evolve the right mixes of software and norms so that collaboration overwhelms vandalism, just as Wikipedia did. This field is mostly unexplored. We need to learn how identity relates to trust and reputation and action. And we need to think of social software as architecture, a space that shapes and channels the behavior of the people in it.

Simply trying to make it impossible to do anything bad will destroy much that is great about the internet. And it lacks imagination.

4 thoughts on “Identity, Anonymity, and Controlling Trolls”

  1. Neat trick if you can pull it off, but I suspect that you’ll have similar issues to post-rating systems--that is, that groupthink will still stifle dissenting views by rating unwelcome commenters such that they never get seen or whatever the punishment is for not agreeing.

    Basically, you’re not addressing the issue of the norms of the site being posted on. For example, your crack here …bender/having kinky sex/voting Republican… implies a certain point of view, and dissenting voices will not be welcome, and will get suppressed. (Maybe you’re not like this, but I know from experience that people like bitch phd squelch comments from people whose opinions they don’t like.) You say nothing about this echo chamber issue.

    Perhaps the whole thing is unsolvable, as anonymity absolutely will enable trolls looking for ‘teh lulz’ and all that stuff. From what I can see so far on sites that I frequent, the trolls (if they’re not actual sociopaths) generally drop a comment or two, and if no one rises to their bait, they soon stop commenting. This is probably the best you can hope for.

  2. You are quite right that every community will have and enforce its own standards of relevance and acceptability. The crucial point is to ask what “free speech” means. I think it means the ability to publish something, which is not the same thing as demanding that it receive attention or validation from the audience.

    “Dropping a comment or two” and then leaving is exactly what I’m hoping for. I hope to make trolling unpopular by building systems that tend to make comments by unknown identities irrelevant, until they’ve established a track record.

  3. Well, let me just start by saying it’s always good to read anything which someone is passionate about, regardless of meeting a successful objective; but I just want to add some additional perspective to the mix.
    What is it that differentiates an anonymous poster from a troll, or even someone with a numerologically based name? We can surely agree that it is the content of their post/argument which ‘should’ define it as such.
    It is therefore society which labels the Troll as much as the Troll labels himself. The more credence we give to such individualistic tendencies the more prolific they will become.

    To summarise: it’s better we’re all aware of how we absorb information and what importance we attach to it. The idea of reputation is sensible as an ideal, but unfortunately, just when you think you can predict an element’s behaviour, odd things start to happen. Our minds are all geared up for logic, but anger makes us slaves to our ego.

    Peace out!
    (Oh, and I’m really not down with the idea of anonymity. We need to take some responsibility for ourselves and take pride in our lives.)
