WWW Journal, Issue 3

WORK IN PROGRESS

People & Projects at W3C


Jim Miller

Promoting the PICS Fix

His days start with early morning breakfast meetings. He fields more than a hundred email messages a day, plus a fistful of phone messages. And when he's not touring around Europe promoting his latest project, he's in Washington, D.C., doing the same.

Jim Miller is in high demand.

A W3C research scientist, Miller is one of the core players in the development of PICS, the Platform for Internet Content Selection. Today he's working to further its success, winning support for the PICS rating system from the Internet community as well as private companies.

PICS offers a bright alternative to government regulation: an independent rating system that allows people with different concerns and tolerance levels to set their own standards. By including "labels" in Internet documents, authors and even private companies (or "labeling bureaus") can "rate" a Web page or site depending on the content. Labeling filters then do the rest of the work for the reader.
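
In practice, a PICS label is a short piece of structured text that names the rating service, the document being rated, and scores in whatever categories that service defines. As a rough sketch--the rating service URL, category names, and dates below are invented for illustration, and the syntax follows the PICS 1.1 drafts--a label embedded in a page's HTML might look like this:

    <META http-equiv="PICS-Label" content='
      (PICS-1.1 "http://www.example-ratings.org/v1.html"
       labels on "1996.05.01T08:15-0500"
       for "http://www.example.com/page.html"
       ratings (violence 0 sex 0 language 2))'>

The same label can instead travel in an HTTP header or be served separately by a labeling bureau, which is what lets third parties rate pages they did not write.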

In an ironic twist, Miller says PICS actually came about because of the Communications Decency Act originally passed by the U.S. Senate last year. "The Communications Decency Act itself is a legislative response to an important issue," he says. "Basically, there was no reasonable means of regulating the Internet short of censorship. At the time, it may have been the best that could have been done.

"I think PICS is a better approach, but it wasn't available at the time. It's fair to say the legislature caused PICS to be developed." The original idea blossomed in August and was officially released early this year. Now, he says, "We have the technology."

The technology was almost the easy part. Now comes the time for promoting PICS, getting other countries involved in using the system, and enticing private for-profit companies to become "labeling bureaus" on a non-profit basis. This spring, Miller spent two weeks touring Europe to introduce the press, government officials, and industry leaders to PICS.

"It's still going to take a little bit of time. All major providers will have support for it by the end of the year. Our hope is that by the end of the year there will also be bureaus available for people to do third-party rating," says Miller. "There's more and more support coming for the system in Europe and elsewhere."

The PICS system is especially well adapted for global use, where countries with different concerns can use it to filter material. A country like Germany even has conflicting standards within its borders, as some states are more tightly controlled than others.

"Each of the countries has a different set of issues they're worried about . . . and a different relationship with the government." While the U.K. is concerned mainly about the advocacy of violence on the Net, countries like France and Germany are worried about neo-Nazi literature online. Not only is this material offensive to most residents--it's also illegal to distribute.

You might think someone this involved with computers and technology has been programming software since birth. But Miller actually received his first degree from MIT in what he calls "metallurgy"--materials science and engineering. While still a student, he worked part-time at Bolt, Beranek, and Newman. "I knew I wanted to be in computer science. I was working at a computer science company while I was in college, but I didn't think a computer science degree would be the best route."

After getting his master's in engineering management in Alaska, he returned to MIT in 1981 to work in the Artificial Intelligence Lab. "I actually went there on the condition that I not be a graduate student." However, he enjoyed teaching so much while he was there, he decided to complete the credentials required to teach college and received his Ph.D. in 1986. He then spent several years apiece teaching at Brandeis University and working at Digital Equipment and the Software Research Institute. Just one year ago, he returned to MIT once again--this time to head up the "Technology and Society" project.

"My interest has always been to make the computer a useful tool. I try to design systems that are easy to use and fulfill a useful need," says Miller. This often requires maintaining a delicate balance between those on the creative, conceptual side and those on the technical end who must actually put the idea into practice.

"It's a listening role," he says. "I work with people who are primarily interested in what the user will see--and those primarily interested in how hard it is to program. I absorb the input from both sides and I make the decision. If there's an argument, it comes back to me."

Besides playing the middleman at work, Miller also has some not-so-hidden talents. He speaks French fluently (something that came in handy on his last visit to Paris during a surprise presentation in French). He plays classical flute and sings opera around the house with Barbara, his wife of 17 years. And if you happen to be at Sanders Theatre in Cambridge next Christmas, be sure to catch him singing and dancing with the Revels (he's a tenor).

Miller's already got several new projects on his plate. He recently attended a meeting in Washington, D.C., on the Joint Electronic Payments Initiative (JEPI) to develop a protocol for payment negotiation. "That's the stuff that happens after you shop and before you purchase. We're working on a protocol that lets you do that on the Web." He's also working on the "Digital Signature Initiative," which will embed code within documents to verify their source--much like a signature at the bottom of a page, especially for public documents--giving the reader confidence in their authenticity. And next semester, he hopes to return to teaching "Structure and Interpretation of Computer Programs" at MIT, a course he helped develop.

Beyond that, Miller expects to continue doing exactly what he's doing now, working on W3C projects from a variety of angles, adding to his collection of adapter cables for his portable computer--one set for each country.

"I'd like to continue doing these types of projects," he says. "It draws on both technical and people skills. I just love it."

- by Kimberly Amaral


Dave Raggett

The Working Draft

This document describes the career of Dr. David Raggett, who has shown an uncanny ability to be there at the beginning of crucial Internet developments, including the World Wide Web, HTML, and virtual reality. Normally employed by Hewlett Packard Labs in Bristol, England, he has been a Visiting Scientist with the W3C in Cambridge, Massachusetts, since May 1995. He lives nearby with his wife and two children. Other functions include his position as co-chair of the IETF working group for HTTP, which he set up in December 1994. His current projects cover authentication and micropayments, HTML style sheets, Java, and fonts. He is referred to as "the father of HTML" by VRML innovator Mark Pesce.

Introduction

Dave Raggett, 40, has been a major player on the World Wide Web development scene since its earliest days. Hypertext and the Web seem a natural form of expression for the tall, slender Briton, whose thought patterns and ideas move quickly, as complex and interlinked as the Web itself. Although he has not received the media attention accorded others involved with the inception of the Web, Dave Raggett doesn't seem to mind. He's too busy working on what's next.

Associations

With a degree in physics and a doctorate in astrophysics from the University of Oxford, Raggett has been immersed in hypertext development since the late 1980s. He started with a project at Hewlett Packard Labs in Bristol, England, that combined hypertext and expert systems to allow salespeople to easily put together quotes for workstations, including pictures of custom configurations. Since then, he has pursued his ideas on the distributed maintenance of such a knowledge system, working with individuals and companies from around the world to bring the World Wide Web to fruition.

A Walk Through Raggett's Involvement with the Web

Raggett has been heavily involved in developing standards for the Web, including authoring the HTML+, HTML 3.0, and HTML 3.2 specifications. These specifications gave browser authors common feature sets to support, helping to move the Web into the big time. "I get a kick out of seeing URLs everywhere," he admits when asked about his reaction to the increasing growth and ubiquity of the Web.

Currently a visiting scientist with the World Wide Web Consortium (W3C), Raggett is pushing the boundaries of the Web even more. Easy-going, friendly, and with credentials of gold, Raggett is a natural choice to be a W3C facilitator. The job entails working with leaders in academia and industry to help design and define the future of the World Wide Web. Through these negotiations, Raggett aspires to develop HTML to the point where it is a competent format for publishing on the Web as well as printing on paper, for a wide range of applications.

Previous Work

Raggett's involvement with the Web began humbly enough. In 1991, Raggett sent off a proposal to the alt.hypertext newsgroup in which he suggested a new "skunkworks" project--engineering slang for an underground and underfunded, but often highly efficient, "just get the job done" kind of project. The goal was to invent a simple global hypertext system analogous to Microsoft Windows Help, but one that worked across the Internet. Previous systems had required a degree of compilation; unhappy with this, he wanted something that was directly interpreted. Some of the responses to his proposal mentioned work that Tim Berners-Lee was doing at CERN, including a simplified form of SGML. Through the rest of 1991 and 1992, Raggett teamed up with Berners-Lee and other members of the www-talk mailing list on developing the Web's formative technologies, including X-based browsers, as well as refining Berners-Lee's simple original version of HTTP.

In March of 1993, Lynx 2.0a was released, followed in April by the release of NCSA Mosaic for X 1.0--with ports to the Apple Macintosh and Microsoft Windows available by August. Raggett regarded the efforts made by the NCSA team in porting Mosaic--releasing easily runnable binaries for a variety of platforms--as "brilliant." With easily usable Web browsers beginning to find a market with the masses, Raggett began to collaborate with Marc Andreessen and Eric Bina, both with NCSA at the time, on defining the basic tags to be used with forms. The results grew to become HTML+, later proposed as an Internet Draft.

With no formal planning, Raggett continued working on the ad hoc development of HTML, including meeting with members of the SGML and hypertext communities to bring the Text Encoding Initiative and the Web together.

Definitions

At the First International World Wide Web Conference in May 1994, Dave Raggett presented the HTML 2.0 specification, which was a "sanitized version" of HTML as it stood at the time. It incorporated the model for forms, nested lists, the <IMG> tag, and the <HEAD> and <BODY> containers.
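
For readers who came to the Web after those features were taken for granted, a minimal document in the HTML 2.0 mold--the file names and the form's ACTION URL here are placeholders--would look something like this:

    <HTML>
    <HEAD>
    <TITLE>Reader Feedback</TITLE>
    </HEAD>
    <BODY>
    <H1>Reader Feedback</H1>
    <IMG SRC="logo.gif" ALT="W3J logo">
    <UL>
    <LI>Articles
      <OL>
      <LI>People and Projects
      <LI>Technical Reports
      </OL>
    <LI>Letters
    </UL>
    <FORM METHOD="POST" ACTION="/cgi-bin/feedback">
    Your name: <INPUT NAME="name" SIZE="30"><P>
    <TEXTAREA NAME="comments" ROWS="4" COLS="40"></TEXTAREA><P>
    <INPUT TYPE="SUBMIT" VALUE="Send">
    </FORM>
    </BODY>
    </HTML>

Everything here--the nested lists, the inline image with its ALT text, the form--was pinned down by that 2.0 specification.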

Clarifications to the HTML spec progressed, including work on tables, math, and style sheets. Raggett incorporated much of it into the specification for HTML 3.0, which he released as an Internet Draft in March 1995. Unfortunately, it was rejected by the IETF, and fragmentation of HTML and what came to be called "tag abuse" continued.

In the Fall of 1995, Raggett was able to facilitate a meeting with representatives of Netscape, Microsoft, Sun, Spyglass, and Pathfinder, along with Tim Berners-Lee and Dan Connolly, in an effort to come together on standardizing HTML. Agreements reached at the meeting, in addition to contributions by the HTML Working Group, have led to the recent release of HTML 3.2, which represents a baseline level of HTML for browser vendors to support.

Raggett is also involved in the next version of HTML, dubbed Cougar, which is slated to include extended support for math, the first phase of extensions to the form tags, captions for figures, style sheets, and frames support based on an extension to style sheets.

On the HTTP end, Raggett pushed to take Tim Berners-Lee's original version of HTTP--a simple "give me this file . . . here it is" protocol, with the browser left to guess the file format--to the next level. Along with Berners-Lee and Dan Connolly, who was adapting MIME for use with HTTP, Raggett helped rationalize and formalize a specification for HTTP. Co-chairing the IETF working group for HTTP, Raggett has collaborated with many other contributors in designing and testing HTTPng, an improved version of the protocol that supports more efficient connections to servers and the multiplexing of multiple messages over a single connection.
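
The difference shows up plainly on the wire. In the original protocol the entire request was a single line and the reply was the bare document, with no indication of its type; the formalized HTTP adds headers on both sides, including a MIME content type, so the browser no longer has to guess. A rough sketch, with illustrative file names and sizes:

    Original HTTP, circa 1991:

        GET /hypertext/WWW/TheProject.html

        <HTML> ...the raw document, nothing else...

    HTTP/1.0:

        GET /Overview.html HTTP/1.0
        Accept: text/html

        HTTP/1.0 200 OK
        Content-Type: text/html
        Content-Length: 3104

        <HTML> ...the document...

HTTPng pushes further, keeping one connection open and interleaving several such exchanges over it rather than paying for a fresh connection per request.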

The VRML Element

Ever the Renaissance man, Raggett has interests that extend beyond hypertext into virtual reality, and he is one of the pioneers in the movement to bring virtual reality on-line. Virtual Reality on the Web took flight during a critical Birds-of-a-Feather workshop on virtual reality and the Web that Raggett ran with Tim Berners-Lee at the first World Wide Web Conference in Geneva in May 1994. At the Internet Society conference, held in Prague the next month, Dave Raggett presented a paper entitled "Extending WWW to Support Platform Independent Virtual Reality." It was during this period that Raggett coined the acronym VRML, for Virtual Reality Markup Language (which was subsequently changed from Markup to Modeling).

Raggett originally envisioned VRML as object oriented, based on the concepts of indoor and outdoor scenes. He also saw the need for a scripting language that would be independent of VRML. His original research into virtual reality was done in the context of video teleconferencing, combining model-based coding techniques with a virtual reality model of a scene, like cameras mapping points on a face. Raggett feels that VRML has not yet lived up to the ideas he set out in his paper--"something more than pixels and polygons"--and would like to make it more scalable. He notes, "After HTML, making cyberspace 'real' is going to be a big avenue."

Activity List

These days, Dave Raggett is helping to map the future of the Web with other members of the Consortium. Following up on his work creating the Arena reference browser, he is developing a Web browser written in Java, for use in demonstrating extensions and improvements to HTML.

Raggett is amazed that the Web has continued to grow along an exponential curve, though he points out that in some ways it has moved quite slowly because there are a lot of ideas that have yet to be implemented. Calling the Web a "bit of a hacker's phenomenon," he would have liked to have seen the growth of the Web better harness the power of skunkworks projects and public domain software. And as he watches the Web grow increasingly complex, Dave Raggett hopes that those of us involved in this growth will remember that "simplicity is very important."

- by David Belson


Sally Khudairi

<TITLE>Webmaster</TITLE>

Problem: What do you do with the original Web site--W3C--once it has grown to proportions no one wants to guess at, has more contributors than the United Way, and needs to be more scrupulous than the United Nations?

Solution: Hire a new webmaster.

Enter Sally Khudairi "with the mop in one hand, the iron and starch in the other . . ."

Whoa, what have we here? A webmaster or a cleaning woman for the Augean stables? Or both?

How about both and then some? How about someone trained as an architect and visual designer (Northeastern, Boston Architectural Center, Harvard, and Tulane), someone who teethed on one of the original Apples brought home by her parents?

How about someone who brings a diverse set of skills and bounces from place to place solving problems, imposing ordered process on apparent chaos, and along the way adding to an already broad understanding of computers, design, and communications?

That's what W3C has in its new webmaster. When you look at where she's been, what she's done, and whom she's done it for, you can't help but be impressed. Her clients include Ziff Davis Interactive, Yahoo! Computing, Lycos, Houghton Mifflin Company, SkyMedia, Central Artery/Tunnel Project, Automobiles Citroen, PowerEgypt, and Coopers & Lybrand.

And by the way--it is "webmaster" not "webmistress."

"I love the term--it implies you're a master at your profession."

Now if someone could only define that profession!

"I've had a bizarre background," she says, "flipping back and forth between design and project management." And it's all been heavily flavored with computing, of course.

Khudairi is aware of the task in front of her, and while she's respectful, she doesn't seem to be so much awed as just plain excited. "I've been sucked into the Web and I can't get out--I don't want to get out."

She's a designer who knows that design is "a way of life" and that Web design is "not a matter of putting lines on paper--you have to be able to understand it as more than a two-dimensional entity."

She also knows that anyone tackling this job has to have many dimensions to her life. She's not a computer geek, but she is comfortable with geek-speak. And while she may not pry under the hood herself very often (no, she doesn't know CGI scripting or Java), she knows what is under the hood and she understands the constraints the technology puts on designers.

Khudairi is not one of the original people who looked over the shoulder of Tim Berners-Lee at CERN six years ago and said, "Hey, great idea, Tim." But she has been with the Web long enough to see it grow from a tool to something that looks more like an entertainment medium. In other words, she appears to have a respect for the past without being too heavily invested in it.

So what is she going to do with the W3C site?

The conventional wisdom, of course, is for the site designer to first determine the audience and what its wants and needs are. But with W3C that doesn't do much good. The audience is incredibly broad and diverse--"more than 256 colors," quips Khudairi. "They are developers and technologists, marketing people, corporate leaders wanting to learn more about what we do, or people who have just heard about the Web and assume this is the place to start surfing."

But if you can't really zero in on an audience, then what?

In a word: structure. Create a structure that will help people find their way around. If the Web is the information superhighway, one thing you notice about the W3C site right away is there's no map. As you prowl around, you can't tell at any given moment whether you're on a dirt lane, a state highway, or the Interstate. The only signs point to other roads that are rarely identifiable in terms of their relationship to a whole, or a subset, or they point to "home." And while the home page has lots of links embedded in short pieces of text, it really doesn't help much in grasping the site's overall content and design.

Working with colleagues at W3C, Khudairi is now trying to sift through the site and determine a logical way to structure it that will be useful for a diverse audience.

So will this be a team effort?

"Absolutely," she declares. "What's interesting is that you can really tell when a site is maintained and administered by one person because it's very flat . . . ideas come from everywhere. Innovation is in front of you and you just have to grab it and go."

How about some hints of what to expect on a future W3C site? Khudairi obliges.

Structure

Three major divisions: User Interface, Technology and Society, and Architecture.

"Structure needs to exist," Khudairi says, "but it has to be a structure you can work in more than one direction."

Images

"Users have become a lot more sophisticated and they're expecting more," Khudairi says. "Yes, we'll incorporate new graphics, our organization will change to reflect the changes in the Web."

Does that mean dancing bears?

"I'd love to have Shockwave or Java on a home page that I've designed," Khudairi says. "But does everyone have access to Shockwave? Access is the issue. You don't have Shockwave or Java on your site just for the sake that it's there."

Access to All Browsers?

Hardly. "You can't please everybody all the time, you just can't." Having said that, Khudairi knows that she has to reconcile this reality with another one that's just as real and demanding--the point of W3C is to not exclude anyone.

Shorter Pages?

"We've got a lot of long pages on the site; people are scrolling and scrolling. Personally I'd like to see our pages a little shorter, structurally I understand why they're not."

In other words, there will continue to be a lot of contributors building the site in their own way. But she does see a way to impose some order on these contributions beyond the overall structure.

Style Sheets!

"One thing that I am salivating over is Cascading Style Sheets," she says. "We will definitely have style sheets incorporated into our site."

Not sure what style sheets are? Check out the latest thinking about them on the W3C site. You should find it easier to discover everything you need to know about style sheets and other things Web.
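
As a small taste of the idea, a style sheet is a file of rules that any number of pages can share, so a presentation decision made once applies across a whole site. A sketch, with purely illustrative colors and fonts:

    BODY   { font-family: helvetica, sans-serif;
             color: black;
             background: white }
    H1, H2 { color: #005A9C }
    A:link { color: blue }

A page then points at the sheet with a single line in its <HEAD>, along the lines of <LINK REL="stylesheet" TYPE="text/css" HREF="w3c.css">, instead of carrying its formatting in the markup itself.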

From chaos to structure while the whole world watches. . . . Hey, all Hercules had to do to clean those stables was divert a couple of rivers!

- by Greg Stone

