
Unix Expo
Remarks by Bill Gates
October 9, 1996

MONICA VILA: Good morning. Welcome to day two of Unix Expo Plus, the Internet Plus Intranet show. Yesterday we reflected on the rapid change of technology and how the dividing lines between various aspects of IT have evolved and become so blurred. I was reflecting this morning that featuring Microsoft at a Unix show is an event that probably wouldn't have occurred just a few years ago. Yet it's very relevant today, given Microsoft's position in enterprise computing.

It is rare we get an opportunity to listen first-hand to someone who's impacted our industry so profoundly for such a long time. Please join me in welcoming Mr. Bill Gates, president and CEO of Microsoft Corporation.

MR. GATES: Good morning. I hope I'm not out of place here. I'm curious, before I start out, how many people here have ever used a Windows-based computer? (Laughter.) All right, all of you.

Well, the reason I'm here at this Unix show is to talk about some of the very exciting developments taking place on both the hardware and software side that are allowing systems to work together more than ever before, and, in fact, allowing some of the benefits of the whole PC world to spill over into the workstation and Unix server world.

If we go way back in time, Microsoft was actually the first one to go to AT&T and beg to get a nice high-volume commercial license for Unix. And for many, many years we were the highest volume licensee, not only for our own Xenix products, but Siemens with theirs, Santa Cruz with theirs, and dozens and dozens of sub-licensees.

I have to admit, it was fairly difficult to work with AT&T back then. They simply didn't understand what they had. They didn't understand how to manage the asset, either in terms of promoting it properly or in terms of making sure that there wasn't fragmentation in how different implementations were put together. And so that vacuum in leadership created a bit of a dilemma for everybody who was involved in Unix.

Well, Microsoft stepped back and looked at that situation and said that the best thing for us might be to start from scratch: build a new system that had a lot of the great things about Unix, a lot of the great things about Windows, and that would also be a file-sharing server with the same kind of performance that, up until that point, had been unique to Novell's NetWare.

And through Windows NT, you can see it throughout the design. In a weak sense, it is a form of Unix. So many of the design decisions were influenced by that environment. And that's no accident. I mean, we knew that Unix interoperability would be very important, and we knew that the largest body of programmers that we'd want to draw on in building Windows NT applications would certainly come from the Unix base.

Well, today Windows NT has achieved very high volume, and I think it's fair to say that it's both complementary and competitive to all the different flavors of Unix that are out there -- Solaris, HP-UX, AIX. Excuse me if I've forgotten your favorite flavor as I go through those.

Well, the PC has become a major phenomenon in the world of computing. In terms of dollars spent, in terms of units, it really is sort of the anchor that drives many of the other things that go on. And this is based on the feedback loop that you get when you get lots and lots of volume: volume drives software and hardware innovation and breadth, and that in turn drives volume.

And so today the PC market continues to grow, both in terms of business use and home use, on a worldwide basis. There are more than 500 vendors worldwide making PC hardware. And part of the dream of having an operating system that's independent from hardware is to be able to have customers get up any day they want and choose a different hardware vendor, choose the most portable machine, the fastest machine, the machine where you get the best service in a particular area, and to be able to do that without any change in software, without recompiling, without having to have the source code, without having to learn a new interface, without having to stay away from the most advanced elements of the system because they're different; simply to make that change in a seamless way so that the users involved don't even know that that's happened.

And that's really driven PC economics. That's what's made it possible to have so many hardware companies making contributions without creating an incredible overhead for software developers and for the customers who'd have to move things around if this adaptation wasn't extremely seamless.

As we have introduced processor independence into the Windows space, it's a tiny bit more complicated than that. When you have the processor independence, you either need to use on-the-fly compilation or interpretive techniques or go back and do recompilation. And so now as people are picking different processors, they have to do that as well.

Now, originally the PC was in a very well-defined niche. It was the low end of computing. And so what people were doing on the servers, and the competition with minicomputers, AS/400s and mainframes, was completely separate. But as Moore's law has driven the power of high-volume chips up to higher and higher levels, now the highest-volume servers in the world, over a million units a year, are based on PC hardware technology.

And again, we get the same sort of success loop that we saw with the PC. Here it's a different class of applications. Here it's vertical applications, business applications, database, electronic mail, and increasingly Internet-based applications that do dynamic page generation up on the server.

And here, again, we have a large number of vendors who've come in and gotten involved, even vendors that you think of as traditionally Unix vendors -- HP, DEC, Siemens, Fujitsu, virtually across the board -- not Sun, but most of the others. (Laughter.) But in any case, very good volume dynamics and very broad participation.

Now, there's two elements here. One is taking PC hardware and running Unix, various flavors of Unix, on PC hardware. And then the other part of that is taking PC hardware and running Windows NT on it. And a little bit later I'll show you some figures that show both of those are very, very significant phenomena.

One thing to keep in mind is that the pace of hardware improvement will continue to be extremely rapid. Moore's law -- there's no problem with that rate of improvement, looking forward at least a decade. And so chips like the Pentium Pro or Alpha or PowerPC will actually see even more rapid innovation in the floating-point arena. Some of this is catch-up to other things that have gone on, but in terms of integer performance there is still plenty of room to run the clock speed up to a higher level and to get the memory subsystems up to higher bandwidth. At one point, as we were running the clock up, it looked like we wouldn't be processor-limited, we'd be memory-subsystem-limited. But now, with some great innovations on that side, neither part looks like it'll be a bottleneck.

Advanced graphics continues to benefit from rich LSI integration. And at SIGGRAPH this year, there were a lot of great papers talking about bringing the most advanced techniques down onto PCs. One of those was an approach that Microsoft described, called Talisman, where you can actually get away from the traditional frame buffer. The traditional frame buffer is becoming a serious bottleneck in the graphics pipeline. And if you can do the decompression and a sort of sprite-based image assembly on the fly, then you can get away from memory bandwidth limitations in graphics and have far, far better fidelity without having to go way, way up in the price spectrum. And so we are seeing some very interesting developments there.

Some other hardware advances here. Of course, the drop in the price of memory is very important, and that is now pushing systems to the point where the 4-gigabyte address space appears to be a serious limitation for a lot of applications in certain flavors of Unix. And next year Windows NT will move up to take advantage of that 64-bit address space.
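
The 4-gigabyte figure is just pointer width: a 32-bit address can name at most

$$2^{32}\ \text{bytes} = 4{,}294{,}967{,}296\ \text{bytes} \approx 4\ \text{GB},$$

while a 64-bit address space raises that ceiling to $2^{64} \approx 1.8 \times 10^{19}$ bytes, roughly 16 exabytes.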

I think flat-screen technology is an important element driving computer use forward. Today it's kind of a dilemma to choose whether to read things on the screen or read them on paper. Take the trade journals or the Wall Street Journal. Some days I find myself doing it one way, and some days the other way. And it's not because of any limitation in the software. It's simply that the screen isn't large enough. The resolution isn't high enough to really be a perfect substitute for paper.

Now, of course, the electronic medium has other things that make up for that. It's up-to-date. It's easy to forward articles along to other people. It's easy to search the material there. And so it's really a dilemma that will eventually be resolved totally in favor of reading off the screen as the size and resolution move up.

Well, the biggest thing going on not only in the Unix world but certainly in the PC world is the Internet phenomenon. And this is a very exciting thing. You know, when you talk about a success loop where volume drives more people to get involved, which drives further volume, the Internet is the best example ever. More and more content is published all the time, and it's driving more people to sign up every day.

It's kind of interesting that there's still a lot of missing elements -- figuring out how to make money from content publishing, some of the security elements, avoiding people having to type in these long URLs, making it easy to work off-line. There are lots and lots of things to be improved. But because of the size of this phenomenon, it's almost as though those weaknesses have become strengths. For each one of those, there's a dozen companies -- many of them new start-up companies -- that are diving in to improve those things.

And so despite what some people have said about running into bandwidth limitations or the Internet coming down, I have no doubt that this will move forward and will really redefine not only the world of computing but the whole way that people communicate. The impact of that in terms of how business is done or how people learn, or even how they entertain themselves, is going to be very, very dramatic.

And so everyone doing operating system work, everyone doing development tools work, has got to reset their thinking based on having the Internet as a primary platform. In some ways this is a great simplification. You know, for so many years most Unix systems ran on TCP/IP. And yet the world had lots and lots of different protocols. Well, today nobody can argue with TCP/IP being what everybody should use. That doesn't mean they'll switch overnight, but slowly but surely they'll migrate not only to IP but, over time, to IP version 6, which can easily connect together not only all the computers we have today, but all that we're likely to have over any reasonable period of time.
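
The arithmetic behind that claim: today's IP (version 4) uses 32-bit addresses, so there are at most

$$2^{32} \approx 4.3 \times 10^{9}$$

distinct addresses, while IP version 6 widens the address to 128 bits,

$$2^{128} \approx 3.4 \times 10^{38},$$

which is effectively inexhaustible for any foreseeable number of machines.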

If there's anything that'll hold the Internet back, it's the tough problem of getting bandwidth out to people at home. In universities and businesses, getting that bandwidth is reasonably straightforward. But when you're dealing with today's phone infrastructure, there's a limit to how much data you can push across that line. It looks like we'll be able to go up from 28.8 K baud to about double, 57.6, with some very clever techniques that Rockwell and U.S. Robotics will be applying there.

Now, that's pretty darn nice. I mean, certainly compared to 2400 baud or 4800 baud that was the norm just five or six years ago, this really works. And so when you have still images, they come down in less than five seconds. In order to move up to have motion video, though, we've got to have much higher speed. And that's why there's so much talk about not only ISDN, but ADSL and PC cable modems, which is the solution that the cable industry wants to provide.
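
A rough worked example, assuming a typical compressed still image of about 30 kilobytes:

$$\frac{30 \times 1024 \times 8\ \text{bits}}{28{,}800\ \text{bits/s}} \approx 8.5\ \text{s}, \qquad \frac{245{,}760\ \text{bits}}{57{,}600\ \text{bits/s}} \approx 4.3\ \text{s},$$

which is consistent with the "less than five seconds" figure at the doubled rate.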

Microsoft is doing everything it can to make sure the software pieces are there to take advantage of these higher bandwidth types. But to be realistic, we'd have to say that three years from now, even in the U.S., where the adoption rate will be higher than in any other country, it's very unlikely that more than 30 percent of homes with a personal computer would be hooked up. So we're going to have a period where, as you're doing authoring, you'll have to think about both worlds. You have to think about people who have very high-speed connections -- universities, businesses, and the few that get those new lines -- and people who are still dialing in primarily at 28.8 or something very close to that.

Eventually we'll get speeds even greater than ISDN or ADSL provide. We'll get up to a level where we can do a high-quality video feed, MPEG 2 or perhaps by then an MPEG 4 feed. And that's really the original vision of interactive TV. It's kind of interesting how this has all turned out. Three years ago people would have said that the phone companies and the cable companies were going to come in and just build broad-band networks and that the TV would be the primary device, with each of these companies picking their own networking architectures to put it together.

Well, now it's clear that the networking standards are all those driven by the Internet -- the Internet Engineering Task Force and the W3C being the primary bodies there. But that whole wealth of standards is what will be used not only on wide-area networks but on the special networks that cable companies and phone companies will set up. And that's wonderful, because it means everything will be connected together.

But what we're on now is an evolutionary path that takes the PC and mid-band data rates and uses that as a stepping stone to get to that broad-band vision that will eventually have all of those nice things -- TV shows whenever you want, but more importantly, the kind of practical things that are being done on the Internet today.

And one of the fascinating things about the Internet is that it's not really one single application that causes somebody to think of the Internet as something they're going to go to every day. It's all the different things that are out there. I have a lot of friends who swore they'd never buy a computer of any kind. But nowadays, I can get them at least tempted by taking subjects they care about and going out on the Internet, browsing around and showing them that this really is a tool that even they will find of great utility and that it's just going to get better and better.

Electronic markets will be a major phenomenon. And we're working with a lot of companies, particularly Visa and MasterCard, to make sure there are clear standards there. There's no reason why, in the electronic world, security shouldn't be even better than it is in the physical world. And the physical world is not perfect. Credit cards are stolen; they are misused. But as a percentage of losses by credit card companies, that's very small. The fraud loss is quite small compared to the bad debt problem they have. And so here, with the ability to use cryptography so that a merchant never actually sees the credit card number, we ought to be able to do better.
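
To make the idea concrete, here is a minimal sketch in Python, using modern tools rather than the actual SET protocol of 1996: the buyer encrypts the card number to the payment gateway's public key, so the merchant can relay the ciphertext but never see the number. It assumes the third-party "cryptography" package, and all names are illustrative.

```python
# Principle only: encrypt the card number so only the gateway can read it.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

gateway_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Buyer's side: encrypt with the gateway's public key.
ciphertext = gateway_key.public_key().encrypt(b"4111111111111111", oaep)

# The merchant forwards `ciphertext` with the order; only the gateway
# holds the private key, so only the gateway can recover the number.
card_number = gateway_key.decrypt(ciphertext, oaep)
assert card_number == b"4111111111111111"
```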

One thing that's holding us back a tiny bit here, and it's a real shame, is that the U.S. government is making it fairly difficult for companies like Microsoft who want to use strong cryptographic techniques. You're all probably aware they have export restrictions that kind of force us to divide our product line up, and it makes it, for a global network like the Internet, a real problem to have a strong standard there. And that's why Microsoft, along with other companies, has really gotten out in front and lobbied for these restrictions to go away. In the last few weeks they were improved a tiny bit, so now they'll let us go out with 56-bit encryption. But even this present so-called compromise is not nearly good enough to enable the scenarios that are important.
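
For a sense of what the key length means: a 56-bit key space has

$$2^{56} \approx 7.2 \times 10^{16}$$

possible keys, and every additional bit doubles the work of a brute-force search, so the 128-bit keys used in strong products would cost an attacker $2^{72}$ (about $4.7 \times 10^{21}$) times as much effort.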

People today talk a lot about the Intranet being the big driver of the next couple of years. And I think the Intranet is very, very important. It really speaks to the vision of information at your fingertips and making it a lot easier to navigate through information rather than simply remembering a hierarchical path name. After all, a lot of the things people say the Intranet is bringing, you could do with file sharing. You could have a directory that you put files into, have everybody go up there and find those documents and those spreadsheets. But in order to get there, they have to go through a lot of very opaque steps.

And with the Intranet, for any scenario, whether it's sales analysis or payroll or whatever, you can simply have a home page that describes what's available, says who to send e-mail to if you're at all confused, and you simply follow links. And as you follow those links, some of the pages you navigate through, of course, will be HTML pages that you primarily read. But some of them will be productivity documents -- a spreadsheet that you can pivot and recalculate and try things out on; a Word document that you can edit and annotate -- because in this internal scenario, a knowledge worker's job is not simply to read what's out there; it's also to get involved in changing the information. And so it ties into the productivity software that people have today.

One of the beautiful things about Intranets is they can be used to distribute software and support information. And so the Intranet is a great solution to many of those challenges that relate to PC cost of ownership. The idea of getting rid of installation, or of sending down to a machine only as much software as can appropriately fit on that machine -- you don't have to throw out your PCs to do that. In fact, that's working today, and it's something that PC customers are very enthusiastic about.

One of the exciting things we're announcing today is that our commitment to the Internet and to building a state-of-the-art browser extends not only to Windows 95 and Windows NT, but also to 16-bit Windows, the Macintosh and Unix. And so, working with some partners, we've brought Internet Explorer 3.0 -- that's our latest, with all the ActiveX control capabilities -- to several Unix platforms. We're in the alpha stage right now, and we'll have this in beta before the end of the year.

One other key point to make is the placement strategy we have for our browser technology. It's the same policy on all platforms: the Internet Explorer 3.0 browser that you see here will be free and available in a great number of places, including as a free download from our Web site.

And the reason we do that -- it's not purely a magnanimous thing on our part. (Laughter.) We're doing that to promote the ActiveX technology, and by having the browser out there very, very broadly, we're able to go to Web authors and say, "Look, this browser is easy for people to get, it's got a dramatically growing share, and therefore go ahead and not only take advantage of the things that are common between us and Netscape, but take advantage of the things we do uniquely." For example, HTML 3.2 and the recent styles capability that the W3C has put together -- that's a unique thing that we'd like people to take advantage of -- and, of course, the ActiveX controls as well.

Well, let me give you a little market data on where Windows NT is. When we wrote Windows NT, it was a huge investment for us -- starting from scratch and building a new operating system. That's done very, very rarely. Most operating systems that are out there have antecedents that go way, way back. And there are a lot of benefits to starting from scratch: you can get rid of some of the baggage of the past, and you can take some of the great ideas that have come out of universities and make sure you've got something that's going to last for a long, long time.

And so the development cycle for Windows NT was over three years. We had Dave Cutler, who'd worked on VMS and a number of other systems, as the head of that development effort. We were very pleased when it came out. But when a new system comes out, the bootstrap is still pretty tough. You've got to get the tools over there. You've got to get the hardware drivers. You've got to get the applications. You have to get customers, who are actually very conservative, to come on board and encourage other customers. And so you've got to have patience. You've got to be willing to persevere. And this has been true of any systems technology Microsoft has brought forward, whether it was a new version of MS-DOS, or Windows, which was ridiculed for many years, or even Windows NT in its early days. You've just got to keep pushing and pushing until you get enough momentum that the product enters the success loop that I described earlier.

Windows NT probably passed that critical stage about a year ago, and so the sales today are very large, even by the standards of, say, Windows 95, which is the best-selling operating system. The sales are more than double those of all Unix servers combined -- here I'm just talking about the server -- and quite a bit more than NetWare 4.x. But even more important is to look at the leading indicator. The leading indicator for an operating system is always: what are software developers doing? Are they adopting it as their new platform, and are they doing more than just porting their applications to that platform -- doing the neat work that takes advantage of it?

And so we have lots of server applications being moved over. We do a regular review on a monthly basis where I go through and look at any applications that are still on Sun or AS/400 or anywhere else that we don't have on Windows NT, and we talk about the status of that developer: what kind of technical support, evangelism, or marketing help do we need to provide to make sure that we have a superset -- that we have every application in those various software catalogues? And we've come a long way. We have the vast majority of the applications that ever ran on Unix, and, much more than ports, versions that use the environment very, very well.

Now, why does this work? Why are people willing to do this? Well, it's all about scale economics. When you sell an operating system by the millions, you can afford to sell it for a few hundred dollars. When there's a high volume of Windows NT machines out there for people to sell into, they can sell their applications at lower prices and still be able to invest more in their R&D. So this scale economics is very important -- even more important to the software world than it is to the hardware world.
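
A simple model captures this. If a product costs a fixed amount $F$ to develop and $m$ per copy to deliver, the break-even price over $N$ units is

$$p = \frac{F}{N} + m,$$

so, with purely illustrative numbers (not anyone's actual costs), $F = \$50\text{M}$ spread over $N = 10{,}000$ units forces a price near \$5,000 a copy, while $N = 1{,}000{,}000$ units brings it under \$100.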

I mentioned that Windows NT and Unix have a lot in common. In fact, one way to think of it is to take the intersection of your four favorite versions of Unix -- what they have in common in terms of capabilities, approaches and APIs -- and ask, "How does that intersection compare with Windows NT?" Well, you'll find that there's a lot in that intersection that is in Windows NT, including the hardware neutrality, the ability to port very easily, scalable multi-processor support and other features. There's the kind of robustness and richness that you think of in a Unix-type system.

A few things are obviously unique -- running Windows applications, having the same API that's totally uniform. That's a very important point. When somebody licenses Windows NT from Microsoft, we do not let them delete APIs or add new APIs, and the reason for that is we have to be able to go to software developers and say, "It is the same. You don't have to retest and redevelop for each of those platforms out there. Don't worry about that because that's Microsoft's job -- to create that common virtual layer." And that is a different business approach that's been played out here and really has determined a lot of the difference between the systems.

We're also very tuned for certain scenarios. If you look at our performance for almost anything that counts -- for file sharing, for printing, for being a Web server -- the speed is just dramatically higher on a Windows NT-based server. Now, that's partly because we have an R&D budget, funded by that very high volume, that lets us put a lot more effort into the system to refine it and make sure all those pieces are there.

And the final point is that it's the same user interface that people have on the desktop. One of the things we see more and more is people who develop code and aren't sure whether it'll run on the server or the desktop. They like it running on the server so that at night they can schedule something and just run it there very reliably, but they'd also like to be able to take that something, put it on a portable, go to a customer site and do that kind of query and analysis. And so not having to think through a different interface and a different API for the desktop and server scenarios is very advantageous. And over time, I think, these systems will have code migrate back and forth very automatically, making sure it runs on the platform where it can be most efficient.

There's a pretty dramatic contrast between hardware pricing in the PC space and in the non-PC space. I pick my favorite company here, Sun, as a point of comparison. You can do this yourself for any particular configuration that you want to put together: look at the price of memory, the price of disks, the price of applications and the price of the operating systems. Again, this is just volume economics at play. And it's not just Microsoft. These are people like Compaq, HP, DEC, and many, many others that have the efficiencies, primarily driven by volume, that allow this to happen.

Now, the workstation market over the last few years has been growing pretty rapidly, but the people who do market research have had a hard time defining what a workstation is. It used to be very, very simple: if it was non-Windows, non-Intel architecture and ran some type of Unix, it was a workstation, and everything else was a PC, and the two were quite different. But people have recognized, with this power build-up, that it doesn't make sense to define it that way. And so, although different market researchers draw the line in a different place, some percentage of the very high-end PCs are considered workstations. And so we see the market grow here, and it's fairly dramatic in the sense that the traditional vendors have seen reasonable growth -- 10 to 15 percent unit growth -- whereas in the Intel/NT space, you see very, very dramatic growth.

And you might think, "Well, how come the workstation side is showing this tiny little growth for all these vendors when the sales of somebody like Sun have gone up a lot?" Well, the key element is that their average price per server is dramatically higher today than ever before, and so that's the market where the sales increases for them are very dramatic. But, again, it's not unit-driven; it's driven by dollars-per-unit being very, very high. So you take that market and break it down: 37 percent of the units are Intel, and 11 percent are Intel Unix. So you see both of those as a very major part of what's going on.

Now, as we recognize that large Unix space that's out there, there are a lot of things we've done to make these systems fit together. First is this idea of taking a Windows application and running it on Unix. And we have three partnerships that fit into this: WISE, the Windows Interface Source Environment, where we work together with partners to make sure they've got the very latest Windows API technology. Bristol and Mainsoft also provide source and binary compatibility, and again that's a close relationship where it's not just some old version of Windows, it's the very latest.

Finally, we have Insignia and Locus, which will let you do remote execution. They give you the ability to sit at a Unix workstation, connect up to a Windows box, and see that application executing, almost as if it were on a Windows machine. You have a little bit of latency there that makes the difference noticeable, but it still gives you access to that broad set of applications.

We also have a number of partnerships that relate to making sure that Windows infrastructure is available on Unix. And our view is that something like object plumbing -- it's kind of crazy for people to compete over that. There should be very few of those. Ideally one -- maybe the world won't achieve that; so, fine, we'll have perhaps two of these standards that are very popular, and then we can make sure there is good interoperability between those.

To further this goal, we recently took the ActiveX technologies, including the object plumbing, which was called DCOM (Distributed COM), and put that under the control of the Open Group. And so not just Microsoft, but literally hundreds of companies who have an interest can control the direction of that technology and understand that it really is there not just as a Windows-driven initiative. We have Software AG and Digital, who are porting that technology to make it available to software developers, and we have AT&T, who does a wide variety of Windows NT services over on the Unix platform.

Another thing that's important to us is Unix compatibility coming back over to Windows. And so here again there are a number of partners. One, called Softway Systems, has created what they call "OpenNT." This is software that they put on top of Windows NT that gives not only POSIX compliance but full XPG4 compliance. And that's a definition that X/Open, which is now part of the Open Group, has had for many years. And so the Open Group can take that product and brand it as Unix, because the Unix brand is under the control of the Open Group. So, in a very formal sense, we can say that, when packaged that way, Windows NT actually bears the Unix trademark.

There are a couple of other important tools here. NuTCRACKER, from DataFocus, has been used by a lot of people to port things over. Portage, from Consensys, is used in the same way. And all the popular utilities that people are used to in the Unix environment are available; in some cases they can just be downloaded from the Net for free and run on top of Windows NT. There are a lot of pieces that have been pulled together there.

I was putting together this slide on standards and couldn't come up with an acronym, so I decided I couldn't list all the standards. But I think it's amazing how healthy the standards process is in this industry, particularly with respect to the Internet. It's exciting to see quality-of-service guarantees, multicast, address extensions -- so many of the great extensions that are really required to fulfill the vision of the Internet -- being addressed on a very timely basis by these standards committees. So you can never paint a black-and-white picture. Standards committees actually can do very innovative work, and that work helps to grow the industry. As a high-volume participant, anything that grows the industry is very, very important to Microsoft.

So in our products, particularly in Windows NT, we are leveraging an unbelievable number of standards. Many of those come from the Internet space, but some -- SNMP, X.400, X.500, Unicode -- are very important standards that come from other areas.

We've been involved in creating a lot of standards. When we see a vacuum, we're willing to step in and work with partners to come up with something new. Desktop management; the DCE remote procedure call capability. Database access -- our ODBC interface -- is now used very, very broadly. And that was to address the idea that when you write front-end code that calls a relational database, you shouldn't have to be married to the back end. If you want to switch from Oracle to Informix to DB2 to SQL Server, that shouldn't affect your front-end code. So we have tools like Access where we wanted that independence. Working with the SQL Access Group, we put together a standard there. Our licensing API, the ActiveX API -- many examples there.
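
The point of that independence is easy to see in code. A minimal sketch in Python, assuming the third-party pyodbc module and ODBC data sources named "OracleProd" and "SQLServerTest" that are already configured (both names are made up):

```python
# The front-end query logic is written once against ODBC; only the
# data-source name says which back end answers it.
import pyodbc

def top_customers(dsn: str):
    conn = pyodbc.connect(f"DSN={dsn}")  # Oracle, Informix, DB2, SQL Server...
    cursor = conn.cursor()
    cursor.execute("SELECT name, total FROM customers ORDER BY total DESC")
    rows = cursor.fetchmany(10)
    conn.close()
    return rows

# Switching back ends does not touch the query code:
# top_customers("OracleProd")
# top_customers("SQLServerTest")
```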

We are adopting all the standards that are out there for mail. We think mail interoperability is as important as database interoperability. We even have products, like SNA Server, that exist entirely for interoperability with the existing environment.

Now, the term "open" is the term used a lot by different people, and it means different things to different people. Windows is not open in the sense that it is not free. You know, we pay our people salaries and fund more than $2 billion a year in research work, because the way we put it together and test it. The user interface -- we build on that. That's a product that we license out to hardware manufacturers, typically getting something like 2 percent of the overall price of the system. It's actually a bit less than that for the higher-end systems.

Now, what goes into that is over $100 million a year spent on evangelism, over $500 million spent on customer support -- 24-hour-a-day, seven-day-a-week support on a very global basis. Another thing we fund there is some very advanced research which we think is key to defining the next level of computing.

The big rage now is the Internet. Well, in two or three years -- it's hard to project the exact time frame -- it will be the Internet together with new input techniques -- with speech input, visual input, and getting away from having the keyboard and URL be the only way that you're able to navigate that information space. Well, how will we move on to that next level of naturalness? Well, it's going to take some pretty big investments in order to do that. And that's a very substantial percentage of the research that goes on at Microsoft.

The key thing here is that all the APIs we create in Windows are published out to people, and we make sure that developers are aware of them. We've been able to use the Web and CD distribution to get this information out literally to tens of thousands of people on an ongoing basis. If you sign up for the Microsoft Developer Network, you will get a lot of CDs, because we even send you, for example, every version of Windows NT localized into over 30 different languages.

Openness, the way we define it, means low prices for customers and choices for customers. Choice here includes the different processors we support. Every time we've come out with Windows NT, we've had MIPS, Alpha, x86 and PowerPC. We would have been glad to add SPARC to that list. (Laughter.) From a technology point of view it's very, very straightforward. But, as I said earlier, Windows NT is just too, too inexpensive to fit into the SPARC strategy. (Laughter.) There's a lot of choice of devices, development tools, really across the board. People choose all those different things.

Let me talk a little bit about performance and where that's going. It's been a big issue, because customers are not only taking older applications down from larger systems; they're building new applications that are very, very demanding. And so here I'm showing SQL transaction performance -- these are actually TPC-C benchmarks -- and over the last three years there's been a factor-of-10 improvement in that benchmark speed. Now, that factor of 10 comes from many different things. It comes from faster processors. It comes from tuning the software. It comes from people doing better multi-processor systems. As we'll see, the particular high benchmark there is on a four-processor system.

Here we see it on a comparative basis, where we've taken the performance of other hardware platforms and their best benchmarks. And the numbers you see on the columns there are the number of processors involved. So actually, in this case, Windows NT has the fewest processors of any of those, and yet achieved the highest benchmark results. And of all the benchmarks in the world, these are among the best, because they are audited results. These are not something where somebody gets to mess around with the definition, as is so typical in the world of benchmarks. And you see the total hardware costs and the cost per transaction.

Now, even 6,000 transactions per minute is not good enough for some of the things that people want to do. And so what we're going to have to do over the next several years is go up another factor of 10. We're going to have to go up to systems that can do literally a billion transactions a day. And we definitely think that's possible.
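
For scale, a billion transactions a day works out to a sustained rate of

$$\frac{10^9\ \text{transactions}}{86{,}400\ \text{s/day}} \approx 11{,}600\ \text{transactions per second} \approx 694{,}000\ \text{per minute}.$$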

Again, many elements are coming into this. One very important element is, again, faster processors; more processors in SMP-type configurations; better software tuning; and finally clustering. Clustering is really the last sort of mainframe architectural approach that hasn't come down and become a mainstream feature in the PC space. There are a few vendors that have done some things there, but we're just now putting the standard APIs into Windows NT to make that easy.

Today we have many customers that have hundreds of gigabytes in these systems -- over 5,000 concurrent users. And, as I say, we're going to go up over a terabyte and over a billion transactions a day using these systems.

One approach that's very important in clustering is to be able to do automatic partitioning, so-called shared-nothing clustering. It really is a very powerful approach.
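
As a toy illustration of the idea, here is a hash-partitioning sketch in Python; the node names and keys are made up:

```python
# Shared-nothing partitioning: each node owns a disjoint slice of the
# data, chosen by hashing the key, so nodes never contend for a shared
# disk or shared memory, and capacity scales by adding nodes.
import hashlib

NODES = ["node-a", "node-b", "node-c", "node-d"]

def owner(key: str) -> str:
    """Route a key to the single node responsible for it."""
    digest = hashlib.md5(key.encode()).digest()
    return NODES[digest[0] % len(NODES)]

partitions = {node: [] for node in NODES}
for account in ["alice", "bob", "carol", "dave", "erin"]:
    partitions[owner(account)].append(account)

print(partitions)  # every account lives on exactly one node
```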

Well, a lot of the applications people are building today are Internet applications. That kind of dynamic page generation really gives you the flexibility not only to reach out within a company, but to reach out to anyone who's got a computer with a browser. And so we think that is definitely the realization of the vision of client-server.
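
A minimal sketch of what dynamic page generation means, in modern Python rather than the server products of 1996; the inventory data is invented:

```python
# The server builds the HTML on each request from live data, instead of
# serving a static file.
from http.server import BaseHTTPRequestHandler, HTTPServer

INVENTORY = {"widgets": 142, "gadgets": 7}  # stands in for a database query

class PageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        rows = "".join(f"<tr><td>{item}</td><td>{count}</td></tr>"
                       for item, count in INVENTORY.items())
        body = f"<html><body><table>{rows}</table></body></html>".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), PageHandler).serve_forever()
```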

Now, to make this happen, we've got to build a lot of richness into the operating system. For example, take transaction management. It's been very difficult for people to write applications that run across multiple servers, particularly when you've got data you want to commit and make sure that it's all there and done properly. That's a huge challenge. And SAP, whom we've talked to, says that over half of the application code they write is actually doing systems-management-type things that properly belong in the operating system. And so we're building not only the Web server into Windows NT Server, but we're also building in a rich transaction manager, code-named Viper. And it does adhere to a transaction standard called XA that allows coordination across different types of machines.
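
The coordination problem here is the classic two-phase commit. A toy sketch of the protocol in Python -- the class names are invented, and this illustrates the idea behind XA-style coordination, not the XA API itself:

```python
class Participant:
    """One resource manager (say, a database) enlisted in a transaction."""
    def __init__(self, name: str):
        self.name = name
        self.staged = None

    def prepare(self, update) -> bool:
        # Phase 1: stage the update and vote on whether commit is safe.
        self.staged = update
        return True  # a real resource manager could vote no here

    def commit(self):
        print(f"{self.name}: committed {self.staged}")

    def rollback(self):
        print(f"{self.name}: rolled back {self.staged}")
        self.staged = None

def run_transaction(participants, update):
    # Phase 1: every participant must prepare successfully.
    votes = [p.prepare(update) for p in participants]
    # Phase 2: commit everywhere only if all voted yes; otherwise roll back.
    if all(votes):
        for p in participants:
            p.commit()
    else:
        for p in participants:
            p.rollback()

run_transaction([Participant("orders-db"), Participant("billing-db")],
                {"order": 1001, "amount": 250})
```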

Every part of our strategy now is optimized to provide Internet building blocks. Microsoft Office -- which has a major release coming out early next year, Office 97 -- is designed to read and write Web pages and to integrate the Office document types into the Web. We have a new tool there, FrontPage, that is particularly designed for Web authoring. And in the back-end products, like SQL Server, this dynamic page generation is the key feature that many new applications are using.

Also, in the case of our programming languages, the target is to let people put that code inside Web pages, whether it's on the server or on the client. Java is our latest programming tool, and we've got a Java compiler with the highest benchmark speeds and great debugging. Java, as you know, is a wonderful language, and everybody should have it in their portfolio. We'll continue to push forward with Visual Basic and with C. And I'm sure that many other languages -- Cobol, Powersoft, Delphi -- will also continue to be important. So our basic approach is to make sure our system can work with every single one of those languages.

Just to wrap up, what is the great opportunity here? Well, I think Windows and Unix systems can work together very, very well. You can build solutions out of that combination. I also think that people who've been involved with Unix -- in a sense, they were always right: It is systems with that kind of power, that kind of approach, that are going to be on the desktop and on servers in dominant numbers. It would have been hard to predict the flavor or the basic way that would happen, but all of that expertise applies very, very well to the state-of-the-art systems that people are building.

As you use that expertise, there's never been so much opportunity to help a company be more competitive -- not only with the internal sharing, but by bringing their customers in very easily, not just for information, but even for transactions -- by working across the Internet.

And as we think about all this and the latest standards battles, I think it's important to step back and remember that certainly within the next decade -- and I'm optimistic enough to say within the next five years -- the whole way that we interact with these machines will change. You know, some day we'll look back and say, "Oh, this was the period of time when computers couldn't listen, they couldn't talk, they couldn't see, they couldn't learn. What was it they did? You know, file management, task management? How were people able to get by with those machines?" And so there's nothing but opportunity out there and a lot of innovation to come.

Thank you.