The Net and extremists:
What is the responsibility of U.S. Web publishers and Internet Access Providers? Should they monitor online content or should they uphold the ideals of free speech and possibly risk legal liability for fraud or defamation? How is censorship defined? Such are the controversial questions being debated in our new world. So far, there are definite opinions, but no simple solutions. Where do we go from here? (3,200 words, including a sidebar on the Canadian experience)
Should Web publishers and Internet Access Providers (IAPs) facilitate unfettered access to the Net in a completely uncritical manner, even for the most extreme propaganda? What about demonstrably false claims? Do Web publishers or IAPs risk legal liability for fraud or defamation if they do nothing? Is it irresponsible to do nothing and might inaction provoke greater regulatory control by government? Or does this entire discussion smack of censorship?
The Internet community clearly must wrestle with some complex issues. The Net's powerful communications revolution has gone a long way towards democratizing publishing. For $30 a month -- even free in some places -- access providers deliver e-mail services and a couple of megabytes of space for Web pages. A modest investment beyond the basics and you can serve up content from your own systems via a network service provider. Printing and postage costs can be eliminated. You are no longer left to stuff leaflets under windshield wipers in mall parking lots.
Growth of the extreme right on the Net
"I don't think the Internet has made our job any easier," said Angie
Lowry, director of the Klan Watch Division of the Southern Poverty Law
Center in Montgomery, AL, which tracks the Klan and white supremacists,
Christian Patriots, militias, and hate-crime activity.
"About three years ago we subscribed to all the principal publications -- and tracked about 100 groups. We read and analyzed them, digested what they thought and taught, and what activities they promoted. With the Internet we now have about 300 Web sites to monitor," Lowry said.
There are about five Usenet newsgroups oriented almost exclusively towards white supremacists, but over the last few years individual activists and spokespersons for larger groups have targeted other newsgroups where they hoped to find receptive participants. Spokespersons learned to weave their themes into messages relevant to newsgroups on government, constitutional issues, politics, and gun control. By fanning out to a broader range of groups, activists hope to gain greater influence and recruiting leverage, Lowry said.
This is exactly the strategy advocated by Milton John Kleim Jr., a self-described founder of the Aryan News Agency and the former chief Internet propagandist for the white supremacist National Alliance. In June of this year, Kleim disassociated himself from the movement, and he is now speaking out via the Net about his two years of focused attention on Usenet propaganda.
In the interest of balance, it is important to note that the constellation of extreme right-wing groups is complex -- both overlapping and distinct. These groups are not all the same and their Internet-based advocacy may raise different questions for access providers, or raise no questions at all.
There are Christian Patriot groups that are not racist or dangerous, Lowry said, and many of the neo-Nazi groups do not associate with Klan groups. Various militias may identify with Christian Patriot secessionists or white supremacists, or may reject both and organize themselves primarily around their opposition to gun control and land use regulations.
Tough cases
Here are a few hypothetical cases, drawn from actual events, that may cause concern for the Internet community.
What happens when a reader follows a site's bogus tax advice and the government seizes that reader's assets for non-payment? Has the IAP facilitated a fraud? Even if not legally responsible, should general ethical concerns compel the IAP to pull the plug on the site or compel the Webmaster to add prominent disclaimers?
What if a site defames an individual or promotes hatred of a group? Is the IAP contributing to hate speech or defamation? Does an IAP have any responsibility to act when complaints are leveled against the content?
And if a reader injures or kills someone in a violent act promoted by a site, is the IAP liable for civil damages? Should social responsibility compel the IAP to pull the plug on the site once it becomes aware of the content?
These are tough issues for Web publishers and access providers, requiring them to think through their views on legal liability, social responsibility, and censorship. Unfortunately, case law offers little concrete direction today, and public policy is equally unclear.
Legal liability
The question for U.S. Web publishers and IAPs is intermediary liability, said David Post, associate professor at Georgetown Law Center in Washington, DC, and co-founder of the Cyberspace Law Institute.
"Unfortunately, there is no general answer. Each area of the law defines differently the liability of everyone standing in between a wrong doer and the victim," Post said.
Most of the attention in legal circles is now focused on copyright issues, rather than fraud, defamation, or issues like sexual harassment. The reason for this is that intermediaries can be held strictly liable for copyright infringement. In plain words, the access provider can be liable even if it has no knowledge of the infringement. The nature of the Net makes that a scary thought.
By contrast, Post said, if an access provider has no knowledge that one of its clients is fraudulently offering bad tax advice, inciting violence or hate crimes, or defaming an individual or group, it would be quite a stretch for U.S. courts to hold IAPs liable for facilitating the act.
"Absolutely," said Eugene Volokh, a former programmer and a professor at the UCLA Law School. "It seems fairly clear that IAPs are not liable for defamation unless they know or have reason to know about the defamatory material, and the "have reason to know" does not create a duty to inquire," he said.
Holding an access provider somehow liable for fraud is also very unlikely, Volokh said.
"First, mere false political statements -- even lies -- don't qualify as `fraud' as such. Second, it seems to me that an IAP would not be liable even for commercial fraud by his users unless he's somehow in cahoots with them. Even if I'm wrong as to the latter point, again he'd have to know or have reason to know of the commercial fraud," Volokh said.
The situation becomes more difficult, said Georgetown's Post, when someone contacts the IAP, specifically informs it that certain pages carry fraudulent, libelous, or obscene material, or child pornography -- whatever happens to be the sensitive issue of the day -- and asks the IAP to take action to disable those pages.
"What do service providers do then? That is a much harder question because now the IAP has information about a specific site and an allegation that there is something wrongful. There are only a half dozen cases, and the [U.S.] law has not worked this out," Post said. (See sidebar, The Canadian experience for a comparision to Canadian law.)
Stanton McCandlish, program director for the Electronic Frontier Foundation (EFF), agrees that this is very unsettled legal territory.
"Even the few bits of case law that are there are conflicting in different districts and circuits. Until legislation, or a really clear Supreme Court case settles it all, it's an open question," McCandlish said. "Webmasters and BBS operators, sysadmins and moderators all need to be aware that they are pioneers, and this isn't the Oklahoma! musical. There are real dangers in this outback."
The biggest problem from the legal perspective is that the law does not yet know how to classify the Internet business community and Internet-related activities.
Legislation has not been able to keep up with the Net. Case law is sparse, and even the Internet community has differing views at different times over different issues.
Common carrier status
Much of the Internet community wants to be an innocent conduit for communication -- what has been known as a common carrier. But under existing law there is a big gotcha. The price of being free of legal liability has been heavy regulation by the government.
"Everyone is searching for a box -- a preexisting legal category -- to fit the Internet. On the one hand the common carried box looks nice to service providers because it limits liability, but historically that has come with a high price. Common carriers cannot discriminate. They have to let everyone on. Well what if you have a teen chat room and then decide to kick someone off the system because they refuse to behave appropriately? Common carriers are not supposed to do that. Common carriers are suppose to let everyone use the system, and they are not supposed to care what they do while using the system. That puts Internet service providers in a tough spot," Post said.
The EFF has encouraged the Internet community to push for legislation creating a new type of common carrier status -- one that protects the access provider but does not come with the regulatory baggage.
The whole issue of whether Internet access providers should be treated as common carriers is still a confusing debate, said Beryl Howell, senior counsel to Senator Patrick Leahy (D-Vermont) on the Senate Judiciary Committee. When it comes to copyright liability, the Internet community wants to be a common carrier so that it is exempt from liability. But when it came to the Communications Assistance for Law Enforcement Act, the digital telephony law passed last year, the Internet community wanted to be classified as information providers, not as common carriers.
The U.S. Internet community wants to have it both ways, Howell said, but that is a difficult policy to sell. There are lots of early drafts floating around Congress, and Howell said there's little doubt that liability issues, especially on copyrights, will be revisited in the next session.
The good news, in Howell's opinion, is that as the Internet captures people's attention, progress is also being made in educating members of Congress and their staffs about the complicated nature of the Net.
Social responsibility
In the last year or so, the Simon Wiesenthal Center (SWC) has upset a good portion of the Internet community with its call for access providers, and Web publishers in particular, to assume greater responsibility for the content published on the Net. Not responsibility in the legal sense -- at least not in the U.S. -- but greater social responsibility.
Rather than promoting the common carrier model, SWC's Rabbi Abraham Cooper argues that a distinction should be made between types of Internet communication, based largely on the level of interactivity.
"I think the best argument for unencumbered speech is when you have a give and take of ideas -- a true discussion -- chat rooms, discussion groups via Usenet, or e-mail listservs," Cooper said.
That argument breaks down when it comes to many extremist Web sites, Cooper said. He cites SWC's experience exposing minority students to the Net and watching their reactions when they see some of the racist advocacy sites. Many, he claims, provide no reasonable mechanism for discussion or debate. There is no opportunity to combat bad speech with more speech, and no assurance that people who see the former also have the opportunity to see the latter.
Cooper contends such Web sites are not discussions but something more like advertisements. And just as newspapers and TV stations will not run every advertisement an extremist group is willing to buy, so too should access providers be more discriminating. The Internet community ought to see itself as a publisher, set basic ethical and truth standards, and hold its clients to them.
"I don't think the traditional responses to hate speech computes, especially when it comes to the World Wide Web," Cooper said. "I think we need to say to the Internet community that gave us this great technology, `you guys have arrived. You are now gatekeepers of communications like network and cable TV, and newspapers. You may not technically be publishers, but that is what the public perceives you to be.' They need to do better than simply saying, `it is not our job,'" Cooper said.
"I'm not saying there should be legal liability. I'm looking for a way to cultivate social responsibility," he said.
The EFF's McCandlish has kind words for the role the SWC has played in combating hate groups, but he is not complimentary when it comes to Cooper's views on the Internet.
"When it comes to the Net, the [SWC] has no idea what they are talking about or doing," McCandlish said. "Cooper's position, when generalized outside the Jew versus Nazi flame war, essentially says the National Enquirer must be censored out of publication because no one knows where to find dissenting opinions about whether or not 12 U.S. Senators really are space aliens," he said.
McCandlish may or may not be correctly characterizing SWC's prior positions, or what it is advocating in European democracies where governments tend to reserve broader rights to regulate public speech. In the research for this story, however, Cooper was adamant that he advocated voluntary self-regulation based upon providers' respect for editorial standards and social responsibility.
McCandlish held up the Nizkor Project, based in British Columbia, Canada, as an effective example of an anti-Nazi advocacy project that is a strong advocate of free speech.
Nizkor spokesperson Kenneth McVay said that access providers should be free to accept or reject clients as they choose, based upon their own use policies. And while advocating common carrier status for access providers, McVay still wants the right to refuse service on a case-by-case basis. The key for McVay is simply to keep the government out of the whole issue.
EFF's McCandlish urges the Internet community to walk a fine line when it comes to reports of potentially illegal content or activities on a Web site. He urges providers to "stay as far away as possible from monitoring and content control", but to look into reports of illegal activities or child porn. Once a complaint makes a provider aware of illegal -- not simply offensive -- material, courts may consider that "knowledge of illegal activity." That said, McCandlish urges access providers not to jump the gun and censor stuff just because someone complains.
"It is a juggling act. Welcome to the Wild West," McCandlish said.
McVay of Nizkor does take issue with Cooper's assertion that the nature of the Web precludes an effective debate.
"If that is [Cooper's] conclusion, he isn't spending much time dealing with Holocaust deniers on the Net," said McVay. "I can cite examples where Web sites have been forced to remove outright lies, through the simple vehicle of exposing those lies permanently, and doing so in a highly visible public forum," he said.
We clearly have a long way to go in defining this nexus of rights and responsibilities, and in deciding whether asking the Internet community to pull the plug on hate, fraud, or falsehood is worth the risk of libertarians crying "censorship."
The Canadian experience
The Internet community in Canada is also wrestling with legal requirements, social responsibility, and concerns over censorship. One fundamental difference between the U.S. and Canada is that Canada has a constitutional rider permitting limitations on speech when that speech threatens society. Faced with concern over more government regulation, Internet providers are voluntarily banding together to agree upon a complaint-driven process and to define ethical standards to guide how access providers respond.
Canadian hate-speech laws, according to Saul Littman, Canadian director of the Simon Wiesenthal Center based in Toronto, say that public speech may not foment hatred of identifiable groups -- those based upon race, religion, sex, sexual orientation, etc. Free speech is protected by exempting one-on-one conversation, and by establishing a threshold that must be crossed. Littman said hate speech must be "harmful, prevalent, and repeated."
The Canadian Human Rights Commission (HRC), which cannot impose criminal penalties, is empowered to issue cease orders. If an offending party does not comply, then the HRC can haul them into court and cite them for contempt.
These regulations are now being applied to Ernst Zundel, a well-known neo-Nazi, for his Web publishing activities. Even though his material now runs on Web sites in California, Littman said, Zundel lives in Canada, his material reaches Canada, and so the HRC feels it can take action.
Littman also said the HRC is considering taking action against Internet providers if they refuse to remove offending material running on their sites, once they have been warned.
All that aside, Littman, like Cooper at the Wiesenthal Center in the U.S., argues that the Internet community has both the right and the responsibility to apply some standards to the material it publishes, much like a newspaper.
Nizkor Project spokesperson Kenneth McVay said hate-speech laws have "already imposed de facto censorship on the Canadian side of the Net," but because they simply force sites like Zundel's offshore, they are a "complete waste of time and effort."
McVay calls this approach "the ostrich syndrome" of denial rather than squarely facing problems in our communities.
Against this backdrop the Canadian Association of Internet Providers (CAIP) formed in March 1996. This October, the voluntary industry group met to put the finishing touches on the first draft of a code of conduct, said Ken H. Fockler, president of Toronto-based CA*net and of CAIP. Since both membership and adherence to the code are voluntary, the effort is somewhat analogous to the Good Housekeeping seal of approval.
"Voluntary standards are a lot better than what we are seeing happening in the U.K. where police are handing out lists of sites and newsgroups that access providers are not supposed to carry," Fockler said. "The U.K. government hasn't prosecuted yet, but that's the threat."
Fockler said he believes the association's efforts are earning some good will with Industry Canada, the regulatory body that works most closely with Canadian Internet firms. Everyone agrees this is a first step, he said.
"I think after the membership gets more comfortable with the code of conduct, we will discuss making acceptance of the code mandatory for CAIP members," Fockler said.
Margo Langford, general counsel of Toronto-based iSTAR and a representative on the CAIP code of conduct drafting committee, said adopting a complaint-driven strategy was a good compromise.
"No access provider has the resources to proactively monitor content. And I would not want to be a common carrier because there is too much regulatory baggage. If anything, we want to steer regulatory policy in the direction where "best efforts" is good enough for indemnity," Langford said.
About the author
Barry D. Bowen is an industry analyst and writer with the Bowen Group Inc., based in Bellingham, WA.
Reach Barry at barry.bowen@sunworld.com.