APPROACHES TO MANAGING DEVIANT BEHAVIOR
			    IN VIRTUAL COMMUNITIES

				       
		    Amy Bruckman (organizer and panelist)
       MIT Media Lab.  20 Ames St., MIT E15-315a, Cambridge, MA  02139.
			  E-mail:  asb@media.mit.edu
				       
			   Pavel Curtis (panelist)
	  Xerox PARC.  3333 Coyote Hill Road, Palo Alto, CA  94304.
			E-mail:  pavel@PARC.xerox.com
				       
			   Cliff Figallo (panelist)
Former Director of The WELL.  33 Roque Moraes Ct. #7, Mill Valley, CA  94941.
			     E-mail:  fig@eff.org
				       
			  Brenda Laurel (moderator)
	Interval Research.  1801 Page Mill Road, Palo Alto, CA  94304.
			 E-mail:  laurel@interval.com



ABSTRACT
It is an unfortunate fact of life that where there are multi-user computer
systems, there will be antisocial behavior.  On bulletin board systems (BBSs),
there are those who persist in being obscene, harassing, and libelous.  In
virtual worlds such as MUDs, there are problems of theft, vandalism, and
virtual rape.

Behavior is "deviant" if it is not in accordance with community standards.
How are such standards developed?  Should standards be established by system
administrators and accepted as a condition of participation, or should they be
developed by community members?  Once a particular person's behavior is deemed
unacceptable, what steps should be taken?  Should such steps be taken by
individuals, using "filters" or "kill files" on BBSs and "gagging" or
"ignoring" on MUDs?  Or should the administrators take action, banning an
individual from the system or censoring their postings?  What is the
appropriate balance between centralized and decentralized solutions?  (See
Figure 1; omitted in plain text version.)

Gags and filters are computational solutions to deviant behavior.  Are there
appropriate social solutions?  How effective are approaches like feedback from
peers, community forums, and heart-to-heart chats with sympathetic system
administrators?  Are different approaches effective with communities of
different sizes?  What is the appropriate balance between social and
technological solutions?
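
To make the computational end of this spectrum concrete, the sketch below
shows one way an individual-level "gag" or kill-file filter might work.  It is
only an illustrative sketch in Python; the names and structure are assumptions
made for this discussion, not code drawn from any actual MUD or BBS.

# Illustrative sketch only: a per-user "gag" list of the kind discussed above,
# maintained by the individual user rather than by system administrators.
class MessageFilter:
    """Suppress messages from senders this user has chosen to gag."""

    def __init__(self):
        self.gagged = set()

    def gag(self, sender):
        self.gagged.add(sender)

    def ungag(self, sender):
        self.gagged.discard(sender)

    def deliver(self, sender, text):
        # Drop the message locally; the sender and other users are unaffected.
        if sender in self.gagged:
            return None
        return "%s says: %s" % (sender, text)

# Example use:
f = MessageFilter()
f.gag("noisy_user")
assert f.deliver("noisy_user", "hello?") is None   # suppressed locally
print(f.deliver("friend", "hi there"))             # delivered normally

The point of the sketch is that the decision, and its effects, stay entirely
with the individual user; centralized remedies such as account cancellation
require no such mechanism but affect the whole community.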

KEYWORDS: Community, standards, behavior, social versus technological
approaches, virtual communities, MUDs, Bulletin Board Systems (BBSs).


[Figure omitted in plain text version.]

Figure 1:  Approaches to Deviant Behavior: Two Continuums

POSITION STATEMENTS

BRENDA LAUREL, INTERVAL RESEARCH (MODERATOR)
BACKGROUND
Brenda Laurel is a researcher and writer whose work focuses on human-computer
interaction and cultural aspects of technology.  She is a Member of the
Research Staff at Interval Research Corporation in Palo Alto, California.  She
is the editor of The Art of Human-Computer Interface Design [Addison-Wesley
1990] and the author of Computers as Theatre [Addison-Wesley 1991; 2nd edition
1993].

POSITION
In rural Nova Scotia, some say, one small community deals with socially
unacceptable behavior in a novel way.  They put a live lobster on the
offender's back.  It can only be removed with the help of others.  The
technique is said to promote intensive individual learning and a high degree
of social conformity.

It is likely that virtual communities will be at least as culturally,
demographically, ethically, and politically diverse as actual communities.
What are some potential means for virtual communities to deal with
"antisocial" behavior?  How effective are they?  What are the tradeoffs
involved in various "solutions": for example, how do they affect the character
of a community and the rights of individuals?  In terms of both problems and
solutions, how are
virtual and actual communities different, and how are they the same?

AMY BRUCKMAN, MIT MEDIA LAB
BACKGROUND
Amy Bruckman is a doctoral candidate at the Media Lab at MIT, where she
founded MediaMOO, a text-based virtual reality environment or "MUD" designed
to be a professional community for media researchers.  Amy received her
master's degree from the Media Lab's Interactive Cinema Group in 1991.  For
her dissertation, she is creating a MUD for kids called MOOSE Crossing.  MOOSE
Crossing is designed to provide an authentic context for kids to learn
reading, writing, and programming.

POSITION
In computer-based communities, it is tempting to throw technological solutions
at social problems.  Someone programmed virtual guns?  Delete them.  Got an
obnoxious user?  Cancel their account.  I will argue that social solutions are
often more effective and also help to reinforce a sense of community.

I have had success with a psychoanalytic approach to dealing with problem
users.  Someone who is causing trouble probably wants attention.  A
heart-to-heart chat with a sympathetic system administrator can often solve the
problem.

Technological interventions are rarely more than a band-aid for social
problems.  However, social solutions require time, effort, and leadership.
Being able to take the time to engage each problem user in a dialogue is a
luxury afforded by a small community.  Larger communities
necessarily become bureaucracies; in a real sense, they cease to be
communities at all.  I will propose a model of clusters of small, affiliated
communities and sub-communities as a structure for preventing and managing
social problems.

PAVEL CURTIS, XEROX PARC
BACKGROUND
Pavel Curtis has been a member of the research community at the Xerox Palo
Alto Research Center since 1983, during which time he has worked on
programming environments and on other projects mostly related to the design
and implementation of programming languages.  His current work centers on the
Social Virtual Reality project, investigating the implementation,
applications, and implications of systems that allow multiple simultaneous
users to communicate and interact in pseudo-physical surroundings.  He is the
founder and chief administrator of LambdaMOO, one of the most popular
recreational social virtual realities on the Internet.

POSITION
For behavior to be deemed "deviant," it must by definition deviate from some
accepted norm.  Who defines the norms in any given society and how are those
norms communicated to newcomers?  In LambdaMOO (an online community on the
Internet), there are a number of mechanisms through which the community as a
whole can decide upon "the rules" and communicate those decisions to all.
I'll discuss the origins and evolution of some of those mechanisms.

Hand-in-hand with establishing behavioral norms, societies decide how to cope
with members or visitors who violate those norms.  I will argue that we should
distinguish two broad categories of deviants and craft separate policies for
dealing with them.  Finally, I will describe some of the coping mechanisms
suggested by LambdaMOO users, including some that have been implemented and
applied in practice.

CLIFF FIGALLO, THE WELL
BACKGROUND
Cliff Figallo was Managing Director of the Whole Earth 'Lectronic Link (the
WELL) during six of its first seven formative years, and has worked as Online
Communications Coordinator for the Electronic Frontier Foundation.  He is
currently consulting in the field of online communications.  Before becoming
involved in online community activities, Cliff spent twelve years living and
working in an intentional "real life" community called the Farm.

POSITION
On the WELL, system managers sought to nurture a community where free speech
was the norm and where all users felt safe to express themselves.  System
managers took care not to publicly exercise power in ways that might inhibit
open group interaction.  By encouraging the formation of core groups of users
who shared their desire for minimal social disruption, management not only
relieved itself of the need to intervene as the authority in minor cases of
disruption, but it also gained the socializing influence of a dispersed
citizenry actively supporting community standards of behavior and passing them
on to new arrivals.

Online system managers are easy targets for challengers of authority.  If peer
pressure can be relied on to quell minor disruptive incidents, management can
be more effective as a court of last resort for more incorrigible violators of
social norms.  Management can also be creative in its treatment of disruptive
but non-malevolent users.  Expulsion from the system is, like capital
punishment in Real Life, the most extreme option and, in these new media,
there may well be technical remedies where social ones are lacking.



Copyright 1994, Association for Computing Machinery.
To be presented at CHI '94 in Boston, MA, in April 1994.



