Richard Stallman

MG: Were there offenses that could get someone excommunicated?
RS: Yes, sort of. If some visitor acted obnoxiously, like crashing the machine, all the other visitors would be angry, because they didn't want the machine to go down. They were having fun using the machine. More important, we had found a way of educating people away from being crackers. We found a way of giving them a role and a stake in the continuation of our society.
MG: You said "crackers"?
RS: Crackers, yes. Crackers are people who make a practice of breaking computer security, or crashing computers. But the point is that we didn't have any security to break. Although actually, there was one thing that had a little bit of security. There was a special magic command that you had to type in order to be able to deposit in the core image of the running kernel -- the lowest-level program on the system, the one that keeps the fences up between the other programs that are running.
On this system, since it was meant for hackers, the top-level command interpreter was actually also the debugger. So any time you were running a program, the debugger was right there in case anything went wrong; you could start debugging instantly. Anyway, you could also debug the actual running kernel, and you could deposit in it. You could patch the code of the running kernel, while it was running, to test out a bug fix, and I did that many times. Of course, you have to be very careful when you're doing that, or you'll crash the system.

Because of that, there was actually a password on the ability to do it. I say a password because in some sense that's its role, but it didn't look like one. It was actually a magical command you had to type before your debugger would allow you to deposit in the running kernel. And this command was a bit funny, because you would type one thing, and it would echo as a different thing. So if somebody was watching your screen and saw you type this command, they'd be misled about what they should type. Of course, anyone who was smart enough to read the source code of this program could see the code that did this and know exactly what to do. But anyone who was that much on the ball was okay. And by the way, this wasn't put in to stop anybody malicious. It was put in to stop a person who worked at the lab but was overconfident and kept on thinking that he knew how to do this and kept on getting it wrong! [LAUGHS]
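[The actual mechanism lived inside the debugger on the MIT PDP-10, and its real command is not given here. Purely as an illustration of the trick Stallman describes -- echoing decoy characters so an onlooker learns the wrong string -- here is a minimal, hypothetical Unix sketch in Python; the MAGIC and DECOY strings are invented for the example:]

    # Hypothetical sketch of the misleading-echo idea. This is NOT the
    # ITS/DDT code (that was PDP-10 assembly); it is Unix-only and uses
    # the standard termios/tty modules.
    import sys
    import termios
    import tty

    MAGIC = "patchme"   # invented secret; the real command is not public here
    DECOY = "xyzzy"     # invented text an onlooker would see echoed instead

    def read_magic_command():
        """Read keystrokes with echo off, printing decoy characters instead."""
        fd = sys.stdin.fileno()
        old = termios.tcgetattr(fd)
        try:
            tty.setcbreak(fd)           # no line buffering, no automatic echo
            typed = []
            while True:
                ch = sys.stdin.read(1)
                if ch in ("\r", "\n"):  # Enter finishes the command
                    sys.stdout.write("\n")
                    break
                typed.append(ch)
                # Echo the *wrong* character, cycling through the decoy,
                # so a shoulder-surfer sees a string that won't work.
                sys.stdout.write(DECOY[(len(typed) - 1) % len(DECOY)])
                sys.stdout.flush()
            return "".join(typed)
        finally:
            termios.tcsetattr(fd, termios.TCSADRAIN, old)

    if __name__ == "__main__":
        if read_magic_command() == MAGIC:
            print("deposit into running kernel enabled")  # what the debugger would now permit
        else:
            print("deposit refused")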
MG: He'd try to debug the kernel while it was running, and then he'd crash the system?
RS: Right. In fact, they actually named the feature after him internally!
MG: Now, all this time, all through the four years before you graduated, you're doing a full course-load at school? And then you're writing code at night?
RS: Well, we'd go over to MIT typically Friday afternoon, in time to have a dinner of Chinese food, and I would stay there until Saturday evening, have another nice dinner, and then go back to Harvard. Typically, this is what kept me alive, because the food in Harvard's dorms was so horrible. About half the meals I couldn't find anything I wanted to eat, so that left me with about one meal a day.
MG: But between Harvard and the AI Lab, this is all-consuming. This is your life, right?
RS: It was. That and, starting with my junior year, folk dancing, which I really loved.
MG: Then you graduate and go to the AI Lab full time?
RS: Well, actually I went to MIT first as a graduate student in Physics.
MG: Oh, for how long did that last?
RS: A year. What I noticed was that my enthusiasm for Physics was decreasing, and I believe the reason was that in programming I could do something. I could produce something that was new and useful and that I could feel proud of, and in Physics I hadn't seen how to do that, and I never figured out how. I wished that I could, but I never saw how. So what happened was, I had a knee injury, and it stopped me from dancing, and basically broke my heart. I not only couldn't do the dancing that I loved, but I couldn't meet any women. The only way I met women was by dancing with them, and going but not being able to dance didn't let me meet anyone I didn't already know. This happened in June, and all summer I went there anyway, and I was reasonably cheerful, because at the time I assumed, "Well, it will get better; in a few months it will be better; I can live with this." Then it sank in on me that it was not getting any better. In September I basically fell apart, because I realized that I wasn't going to get better, and I had nothing. That's when I dropped out of graduate school and just started working at the AI Lab.
MG: You went there right around the time Watergate happened.
RS: I cared a lot about Watergate. In fact, I often wore a button which was inspired by Watergate, which said, "Impeach God." I compared what Nixon was telling us with the spiel that, according to Christianity, God gives us, and they match up point by point. "I have a secret plan to end the War in Vietnam, or end injustice and suffering in the world. For heavenly security reasons, I can't let you mere mortals understand the details of my plan. So you'll just have to take it on faith that what I'm doing is right and obey me, because after all, I am entirely good. I told you so myself, and you have to believe everything I say. And besides, I see the big picture, and I am so much wiser than you. So you should just obey implicitly. And if you don't obey me, that means you're evil, so I'll put you on my Enemies List, and the IRS will audit you every year for all eternity." I figured why stop with the small fry, let's go after Mister Big. No matter how powerful a tyrant is, they all deserve to have their power taken away.
I wasn't involved in politics in the sense of joining organizations and working on campaigns, but I thought about politics a great deal, as did just about everyone else. The people at the AI Lab were mostly not apolitical; they were mostly vaguely left-wing, and they were embarrassed to be getting funds from the Defense Department. Peculiarly, that did not bother me, because I thought what we were doing was more important than where the funds came from. Besides, just because I was against the war in Vietnam and against some U.S. interventions in Latin America, that didn't mean I was in favor of unilateral disarmament. I was against the policy that had led to the use of our military, but I didn't start thinking of the United States as the enemy, the way many people did.
MG: Where did you see your work at the AI lab going? What was it leading to?
RS: I had no specific idea in mind of what it was going to lead to. I just felt that computers were an exciting thing and that they had to be good for people somehow, and I wanted to advance what we could do.
MG: There was no vision yet of personal computing?
RS: Oh yes, there was a vision like that. There were people at Xerox, for example, who were working on trying to develop computers that would be gradually cheaper and more accessible and easier for people to use. I wasn't tremendously interested in that, though.
MG: Were you at all into things like "Saturday Night Live"?
RS: I did like "Saturday Night Live" a little bit. And this was also the time when I had started to learn about various kinds of world music. Because from folk dancing I had learned about Eastern European music and Turkish music and some Arabic music.
MG: Are there any people your age in the lab, or are you mostly surrounded by people older than you?
RS: People were still coming into the lab until around 1980. There were still mostly MIT students, occasionally Harvard students, showing up and becoming great hackers. It was a continuing vital culture. I was not the last person to join it.
MG: But some of the key people began drifting away...
RS: Well, people always came and went; in any culture people are always coming and going. People die, people are born. It was only when Symbolics hired away most of the hackers that there was a systematic change.
MG: Symbolics represents the commercialization of the culture, right?
RS: It was. Right. The result was that the lab was sort of empty of hackers.
MG: A new machine comes into the MIT lab, right? And all of the software that you guys had written all these years doesn't run on it? It comes with its own?
RS: Yes. Well, we could have run our system, but because of the dearth of hackers, the people who ran the lab decided we wouldn't be able to keep it going, and chose to use Digital's operating system instead. That meant that only a few of our programs were still of any relevance -- the few that had been ported to that system and were used by people elsewhere as well.
MG: Steven Levy wrote about this in Hackers. One of the MIT hackers formed a company called LMI to write a new operating system, and a group run by a more business-oriented type started a competitor called Symbolics. There was no one left at MIT to improve the non-commercial version of the software -- and Symbolics stepped into the breach with a proprietary OS. Did you play a part in all that?
RS: I was the last hacker. I was the only hacker left at the AI Lab. And then what happened? An enemy, a company formed by a betrayal, essentially gave us an ultimatum. They said, "From now on, if you want to use any of our code, any improvements we're making, you've got to abandon the MIT version of the system and essentially join our camp." They forced us to choose sides between them and LMI. Until then I had been neutral, and I had been saying, "I am not on the side of either company." But once one of these companies attacked the AI Lab, I couldn't be neutral any more, so I joined the war on the side of the other company. What else does a neutral do when one of the sides in a war invades it?
I started maintaining the LISP machine system for MIT, in competition with Symbolics. Now, of course, anything I wrote was available to Symbolics, just as it was available to MIT. But in fact, what I was doing was, for the most part, the same job Symbolics had already done, so it was of no benefit to them. It was only of benefit to the company they intended to be the victim of their ultimatum against us. It was like Belgium: the Germans attack Belgium, and the Belgian army fights on the side of the English and the French.
MG: You were saying, "I don't care that this is proprietary to you guys; I'm going to make it open"?
RS: I wasn't using their code. I was writing my own code to do the same job.
MG: I see. But your code was open and their code was proprietary?
RS: My code was available to LMI. But it was not free software, because MIT had licensed the LISP Machine System to those two companies, and thus had made it non-free software. So I was fighting to defeat this invasion, and at that I did more or less succeed, even though I ultimately lost the battle. Essentially I thought of it as a rear-guard action, because LMI at the time was too small to do its own system development. If Symbolics had wiped out the MIT version, made it obsolete in no time, they would have knocked LMI out of business, and they would have profited by their invasion. My goal was to deny them any benefit from their aggression, and I did that by keeping the MIT version viable.
MIT had done the first bad thing by making the software proprietary, but at the time I wasn't thinking so much in those terms. I was thinking that Symbolics had killed the AI Lab's hacker culture, which was my home! They had done this as part of a deliberate strategy to profit, and I was going to deny them that profit. And I ultimately did, more or less, because I kept this going long enough for LMI to hire hackers and pick up the maintenance, and LMI was successful for a few years...
MG: Before we go to what happens to you, what happens to these two companies?
RS: Well, they both went bankrupt a few years later. The day of LISP machines passed. It became feasible to make whole CPUs on a chip and have them be reasonably powerful.
MG: Is this all completely separate from the world of personal computers?
RS: Yeah. Completely separate.
