W3C on PICS
http://www.collegehill.com/ilp-news/reagle.html
The ILPN discusses PICS with Joseph Reagle of the W3
Consortium
August 18th 1997
Joseph Reagle Jr. joined the World Wide Web Consortium
(W3C) in October of 1996 to focus on policy issues
related to the development of global technologies and
their relationship to social and legal structures.
Specifically, he focuses on how to promote "good"
engineering when applied to a multifaceted and often
contentious policy environment; one result of this
activity is the W3C
Statement on Policy. Mr. Reagle has also been working
on filtering, digital signature, intellectual property
rights management, and privacy capabilities on the Web.
He has also been an active contributor to the
development of the Platform for Privacy Preferences
(P3) project at the W3C. P3 will enable computer users
to be informed and to control the collection, use and
disclosure of their personal information on the Web.
ILPN: How does PICS work?
The Platform for Internet Content Selection (PICS) is
an infrastructure for associating labels (metadata)
with Internet resources. It was originally designed to
help parents and teachers control what children access
on the Internet, but it also facilitates other uses for
labels, including code signing, privacy, agents, and
intellectual property rights management.
PICS allows organizations to easily define content
rating systems and enables users to selectively block
(or seek) information. The standard is not a rating
system (like MPAA or RSACi), but an encoding method for
the ratings of those systems. Those encoded ratings can
then be distributed with documents, or through third
party label bureaus.
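For illustration, a label in the PICS-1.1 syntax might be
embedded in a page's HTML head roughly as in the Python
sketch below; the rating service URL, category letters, and
values are hypothetical placeholders rather than any real
rating system.

    # Sketch only: what a PICS-1.1 label embedded in an HTML <head>
    # might look like. The rating service URL and the category letters
    # (n, s, v, l) are illustrative placeholders, not a real service.
    label = ('(PICS-1.1 "http://ratings.example.org/v1.html" l '
             'by "webmaster@example.org" '
             'for "http://www.example.org/page.html" '
             'r (n 0 s 0 v 1 l 2))')

    # The label travels with the document, here as a META tag; it could
    # just as well be fetched separately from a third-party label bureau.
    meta_tag = "<META http-equiv=\"PICS-Label\" content='%s'>" % label
    print(meta_tag)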
[Related Links]
* The World Wide Web Consortium PICS Site
* Family Friendly Internet...The White House Internet Content Filtering Plan
* Australian Anti-PICS Site
* Ratings Now, Censorship Tomorrow...from SALON
* ACLU Press Release on the July White House Summit
* Foucault in Cyberspace: Surveillance, Sovereignty and Hard-Wired Censors...an absolute must-read article from James Boyle, a law professor at Washington College of Law, American University
ILPN: Can you summarize the origin of PICS as well as the
coalition behind its development? What role is W3C playing
in the ongoing development and promotion of PICS?
During 1995, a number of activities occurred that were
related to concerns about children accessing potentially
inappropriate Web content:
1. The Senate Judiciary Committee heard testimony
   regarding the "Protection of Children From Computer
   Pornography Act of 1995" (S. 892)
2. The Information Highway Parental Empowerment Group
   (IHPEG), a coalition of three companies (Microsoft
   Corporation, Netscape Communications, and Progressive
   Networks), was formed to develop standards for
   empowering parents to screen inappropriate network
   content.
3. A number of standards for content labeling were
   proposed, including Borenstein and New's Internet Draft
   "KidCode" (June 1995), Alex Stewart's Voluntary
   Internet Self Rating, and Peter Wayner's NetRate.
4. A number of services and products for blocking
inappropriate content were announced, including
Cyber Patrol, CyberSitter, Internet Filter,
NetNanny, SafeSurf, SurfWatch, and WebTrack.
By August, the standards activity was consolidated
under the auspices of the World Wide Web Consortium
(W3C) when the W3C, IHPEG, and twenty other
organizations agreed to merge their efforts and
resources to develop a standard for content selection.
The intent of the PICS project was to demonstrate that
it was possible and better for individuals and families
on the Internet to have control over the
information they receive, rather than creating a
national framework for censorship.
Today, the W3C believes PICS-based technology can
fulfill the requirements of mediating access to
potentially offensive or illegal content. The next big
step is educating the users on how to use those
technologies. For the future, we are working on the
Resource Description Framework as the basis for a
richer metadata infrastructure. Applications such as
our P3 privacy project will use it to enable sites to
make privacy statements.
ILPN: Can you describe the difference between
'labeling,' 'filtering,' and 'blocking,' and why this set
of distinctions might be important?
Paul Resnick's PICS Options FAQ has a very good answer
to this question and some of the others that you ask.
To summarize, labels are statements. They have the
capability to describe a Web page, or to make any
arbitrary assertion. Obviously, people can use such
information to block, or select what they want to see.
To generalize, one can use metadata to "rate" a Web
page with respect to some rating system. Given ratings,
a user applies a filter (her preferences about the
ratings) to determine which pages are most appropriate;
some action is associated with the result of the
filtering. The common result is the blocking or
selection of a page, but the user could also be
presented with a list of sites sorted according to her
preferences.
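As a rough sketch of that rate/filter/act sequence, the
following compares a page's ratings against a user's
preference ceilings and then either blocks or selects the
page, or ranks a list of pages; the category names and
thresholds are invented for illustration.

    # Minimal sketch of rating-based filtering; category names and
    # thresholds are made up, not taken from any real rating system.

    def acceptable(ratings, preferences):
        """Allow a page only if every category the user cares about is
        rated at or below that user's chosen ceiling."""
        return all(ratings.get(cat, 0) <= ceiling
                   for cat, ceiling in preferences.items())

    preferences = {"violence": 1, "language": 2}     # the user's filter
    pages = {
        "http://example.org/news":  {"violence": 0, "language": 1},
        "http://example.org/forum": {"violence": 3, "language": 4},
    }

    # One possible action: block or select each page outright.
    for url, ratings in pages.items():
        action = "select" if acceptable(ratings, preferences) else "block"
        print(action, url)

    # Another: present a list ranked by total rating in the categories
    # the user cares about (lower is a closer fit), without blocking.
    ranked = sorted(pages,
                    key=lambda u: sum(pages[u].get(c, 0) for c in preferences))
    print(ranked)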
ILPN: What do you make of the opposition by EFF, the
American Library Association and the Electronic Privacy
Information Center to PICS? Do you see PICS as a
bulwark of free speech, or simply as the lesser of two
evils (the other being government regulation)?
I would characterize the response from each of those
organizations differently, and of course my response is
based on my own understanding of their positions:
1. The ALA does not oppose PICS or filtering in
general. I believe they acknowledge its usefulness
as a means of parental empowerment, but do not
feel it is appropriate for installation on every
computer by default in their libraries. I respect
this point of view while acknowledging that
libraries may have requirements placed upon them
by their constituencies or by the law with respect
to illegal materials; it is up to the libraries,
their constituencies, and governments to
appropriately resolve these concerns.
2. I personally like the direction the EFF took in
working on "Public Interest Principles for Online
Filtration, Ratings and Labelling Systems" and
hope to see such efforts continue. The W3C feels
that metadata is necessary to the Web. Hence, I
think it is somewhat naive to criticize the
capability to support metadata. While I do not
agree with every position in the EFF document, I
liked it because I think it is more constructive
to discuss how that metadata infrastructure can be
best used (or how to prevent abuse) rather than
trying to hobble the Web.
3. I, and my colleagues at the W3C, encourage
rigorous discussion on the use of filtering
technologies and how they affect individuals'
rights. I do not buy the slippery slope argument
that all technology which governments could use to
do "bad things" must not be developed.
I do not see PICS in grand terms; PICS is an
application of metadata, as I explain elsewhere. To
respond to the latter question, I do prefer the
capability to exclude unwanted speech over the
suppression of it at its source. Also, metadata itself
is speech -- having the capability to laud, critique
and criticize others is fundamental to a robust
society.
ILPN: How likely is it that PICS-based software will be
mandated by governments at the level of ISPs?
Unknown. Also, governments could theoretically do a
number of things such as:
* create rating systems
* determine filtering criteria
* require the use of filtering technology in servers
or in clients
* require the use of certain rating systems, etc.
They can accomplish this by legislative action, by
interpreting existing statutes, by promoting
self-regulatory structures, or by providing incentives
to comply with the policies by attaching liability, or
removing it, to the players involved. Even with a
specific question in hand, it would be a difficult task
to predict the path of any nation.
ILPN: Are you concerned with the potential abuse of
PICS by governments and/or employers?
Yes. I personally would protest or subvert my
employer's or government's efforts in applying
mandatory filters against my will; I do use filters to
select content I am interested in and to get rid of
spam and bozos. Regardless of my personal opinion, the
W3C does not have the competency to tell other
organizations what their policies should be. We can
tell them about the technology and consequences of its
use, but what they do with it is their choice
obviously.
ILPN: What about the possibility that a third-party
labeling organization will obtain too much power?
In terms of an independent third party? Let the market
decide. If there are monopolistic concerns, a nation
may wish to apply anti-trust laws. If it is a political
entity, I hope it has some mechanism for being held
accountable to its constituency.
ILPN: We tend to think of PICS in terms of excluding
materials deemed offensive. What are more 'positive'
possible applications of PICS?
PICS is merely one application of "metadata." Metadata
means "data about data" and we are working very hard on
this with our Resource Description Framework (RDF).
This is a fundamental computer science concept and is
essential to the future of the Web. Any time you wish
to make a statement or an assertion, to rely upon a
trusted opinion, it is "metadata." Our Platform for
Privacy Preferences (P3) is another application of
metadata. We wish to enable sites to make statements
about their privacy practices so users are informed and
can make choices about how they wish to interact with
sites. A useful feature of metadata is that it can be
machine-readable, so agents can act on behalf of the
user, freeing the user to concentrate on higher order
content and interactions. Hence, when I configure my
agent, I should be able to search for Web sites with the
type of content I like, those which have privacy
practices that I like or that are referred to me by
trusted third parties, and those that support the
payment capabilities that I possess.
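A toy sketch of that idea: treat each metadata statement as
a (resource, property, value) assertion and let a simple
agent keep only the sites whose statements satisfy the
user's preferences. The property names and values below are
invented placeholders.

    # Toy sketch: metadata as machine-readable (resource, property,
    # value) assertions, with an agent selecting sites that match a
    # user's preferences. Names and values are invented placeholders.

    assertions = [
        ("http://shop.example.org", "privacy-practice", "no-resale"),
        ("http://shop.example.org", "payment",          "card"),
        ("http://news.example.org", "privacy-practice", "resale"),
        ("http://news.example.org", "payment",          "invoice"),
    ]

    # What the user wants: acceptable values for each property.
    wanted = {"privacy-practice": {"no-resale"},
              "payment":          {"card", "cash"}}

    def matches(site):
        # Collect the site's stated values, then require an overlap with
        # the user's acceptable values for every property the user lists.
        stated = {}
        for resource, prop, value in assertions:
            if resource == site:
                stated.setdefault(prop, set()).add(value)
        return all(stated.get(prop, set()) & values
                   for prop, values in wanted.items())

    sites = {resource for resource, _, _ in assertions}
    print([s for s in sites if matches(s)])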
ILPN: PICS seems to transform the web from an arena in
which anything goes, and in which each individual must
define his or her own participation, to an arena in
which various 'cultures' can establish their own,
separately designed comfort zones. Was this an
intention of the PICS developers, or is it simply an
unintended consequence of an effort to protect
children?
I don't know if this was an original, explicit
intention of PICS, but it soon became apparent that
this is what PICS was about: allowing people to create
their own cultural boundaries on the Web. I look at a
lot of what we do at the W3C as not only providing the
basic infrastructure for exchanging hyperlinked
documents, but also providing the capability to have
more sophisticated interactions with other users and
agents on the Web -- homegrown cultures and societies.
"Real world" entities may see these tools as ways of
extending their own social structure onto the Web, and
this is actually what a lot of the PICS debate is about
in my opinion. We'll see how successful governments can
be.
In the meantime, the W3C does want to mitigate the
possible fragmentation of the Web from either: 1)
people dropping off it altogether and creating their
own, or 2) people tearing it apart by fighting over whose
cultural norms should prevail. I'll quote from the W3C
Policy Statement on this point:
...This architecture must allow local
policies to co-exist without cultural
fragmentation or domination...
http://www.w3.org/Policy/statement.html
ILPN: Is W3C promoting the development of PICS into
proxy server products?
Yes. The PICS Options FAQ states that filter processing
can be centralized at a proxy server while still
permitting individuals to choose the filtering rules. I
will qualify this by saying that it has never been the
intent of the W3C to create technology for governments
to use as a means of centralized control. Not that
governments couldn't do such a thing (and there are
other ways for them to do it if they wanted to), but
that isn't our intent in working on this technology.
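A minimal sketch of that arrangement, assuming the proxy
can identify the requesting user and consult that user's
own rules; the per-user rule table and page-rating lookup
below stand in for whatever a real proxy would use (for
instance, a label bureau).

    # Sketch of proxy-side filtering that still honours per-user rules.
    # The per-user rule table and the page-rating lookup are
    # placeholders for whatever a real proxy would consult.

    user_rules = {
        "alice": {"violence": 0},      # Alice blocks any rated violence
        "bob":   {"violence": 4},      # Bob allows almost everything
    }

    page_ratings = {
        "http://example.org/report": {"violence": 2},
    }

    def proxy_fetch(user, url):
        ratings = page_ratings.get(url, {})
        rules = user_rules.get(user, {})
        if all(ratings.get(c, 0) <= limit for c, limit in rules.items()):
            return "... page content for %s ..." % url   # pass through
        return "Blocked by your own filtering rules"

    print(proxy_fetch("alice", "http://example.org/report"))  # blocked
    print(proxy_fetch("bob",   "http://example.org/report"))  # allowed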
ILPN: What kind of legislation and court cases do
you expect to see in the future regarding PICS?
I expect to see continued activity in:
1. drawing the line between "illegal" and
"inappropriate" content
2. determining what obligations services have in
restricting illegal material
3. developing self-regulatory structures for limiting
children's access to "inappropriate" material
(promoting a "family friendly Internet").
ILPN: What should people maintaining web sites be doing
regarding PICS and the various rating systems? Should
people be rating their sites now? What might the
consequences be for neglecting to rate one's site? Is
it important for people to keep track of how their web
sites are rated, and if so, how can they do this?
Technologies, such as Microsystems Software's
CyberLabeler, are being developed which follow the
recommendations of the PICS specifications and make
labeling sites much easier. Content creators that want
to label should continue to demand such technologies,
and to demand that they be integrated into Web
development applications. I expect that in the near
future, many sites will be generated dynamically from
databases; those databases will be indexed and
structured by metadata, at which point labels and
ratings will be integral to the creation of dynamic,
customized sites. Search engines may also begin relying
upon the useful information found in metadata to return
more appropriate, on-target information to
users.
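Loosely sketched, that could look like the following: the
database record that supplies a page's content also
supplies the rating written into its label at generation
time. The field names, rating service URL, and categories
are hypothetical.

    # Loose sketch: a dynamically generated page carries a label built
    # from the same database record that supplies its content. Field
    # names, the rating service URL, and categories are hypothetical.

    record = {
        "title":   "Sample article",
        "body":    "...",
        "ratings": {"n": 0, "s": 0, "v": 1, "l": 0},
    }

    def render(record):
        rating_str = " ".join("%s %d" % (c, v)
                              for c, v in sorted(record["ratings"].items()))
        label = ('(PICS-1.1 "http://ratings.example.org/v1.html" l '
                 'r (%s))' % rating_str)
        return ("<html><head>\n"
                "<META http-equiv=\"PICS-Label\" content='%s'>\n"
                "<title>%s</title></head>\n"
                "<body>%s</body></html>"
                % (label, record["title"], record["body"]))

    print(render(record))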
The consequence of not rating your site might be that
if you have potentially illegal or offensive material
you may draw regulatory attention upon yourself. Also,
by not labeling you may be overlooked by those using
filtering and content selection technologies. Most
adult sites are more than happy to label and use
filtering services; they want to attract those looking
for the services they offer while avoiding the
difficulties associated with angry parents.
ILPN: Thank you for your time.
My pleasure.
Copyright 1997. All Rights Reserved