Completing the Circuit
Her research on integrated circuits advanced the Internet
age by years. Now she finds herself revisiting her earliest, groundbreaking work
in computers, which she long kept secret because, back then, she lived as a man
ANN ARBOR, MICH.--The conversation at Lynn Conway's kitchen table moves
seamlessly from computer architecture to Indian transgender cults, from the practical
anthropology of technical revolutions to the risks of motorbike racing. (A hand injury
two years ago sidelined Conway, but her partner, Charlie, still competes in the over-40
category.) A 14-pound brindled tomcat climbs across the counter, the table, Conway
and me as we talk.
More than 30 years ago, when she was in her late 20s, Conway worked on
a secret supercomputer project at IBM. She invented a way for a single central processing
unit, or CPU, to perform multiple operations simultaneously without interfering with
itself, a feat unique among computers of its time. In her late 30s and early 40s, at the Xerox
Palo Alto Research Center, she helped to develop the techniques for integrated-circuit
design that touched off the VLSI (very large scale integration) explosion of the
1980s, a design and manufacturing approach that boosted the number of transistors
on a chip from thousands to millions. The chips that brought Sun Microsystems, Silicon
Graphics and other companies to prominence saw first silicon under her tutelage.
By the end of that decade, computer architects used VLSI to design computers with
multiple-issue and out-of-order execution capabilities like those Conway had conceived.
After her VLSI work, Conway went on to spur a similar revolution in artificial
intelligence and put in a stint at the U.S. Department of Defense overseeing plans
for high-performance computing. She later served as an associate dean at the University
of Michigan, where she is now professor emerita of electrical engineering and computer
science. Until two years ago, she also kept a secret that had contributed to the
long-standing obscurity of her early work at IBM.
Born male, Conway lived most of her early life as a man. She married and
fathered two children. When she finally underwent surgery to become a woman, IBM
fired her, and local child-welfare authorities barred her from contact with her family.
She was able to rebuild some early personal relationships only decades later.
In retrospect, she traces both her career choice and a significant part
of her success to her experience as a transsexual woman, trying to figure out what
worked in a world that wasn't really equipped to deal with her. "Think of my
life as an Amateur Scientist experiment," she says. "I'm still collecting the data."
Conway recalls having known from early childhood that she wasn't a boy,
but her experimentation only started in earnest at the Massachusetts Institute of
Technology, where she enrolled in 1955 as a physics major. She read up on endocrinology
and learned to treat herself with black-market estrogen. She even cultivated a second,
feminine identity, until a well-meaning physician convinced her that she could only
become an unacceptable freak that way. She dropped out of school soon after.
Researchers estimate that a mismatch between gender identity and physical
sex affects anywhere from one in 30,000 to one in 1,000 people (typically, genetic
males suffer at a rate about three times that of genetic females). Although "gender
dysphoria" is listed as a psychological condition--and candidates for surgery
must undergo extensive evaluation and counseling--there is evidence that the condition
is a result of missed hormonal signals during embryonic development. In the U.S.
today about 2,500 males a year undergo surgery to bring their bodies in line with
their gender identity. The precise number of transsexual women and men is not known;
the vast majority do not advertise their medical status.
In the early 1960s, when Conway resumed her studies after several years
of working as an electronics technician, a mere handful of people had undergone sex-reassignment
surgery, and the stigma associated with transgender behavior was enormous. So she
continued to live as a man. Enrolled at Columbia University, she was perfectly placed
to learn computer science. She also studied anthropology, trying to understand as
much as she could about her personal predicament. She read ethnographic accounts
of cultures throughout the world where some males lived as women.
Conway hoped to quickly parlay a master's degree in electrical engineering
into a high-paying job that would enable her to save enough money for surgery. But
an involvement with a female co-worker led to pregnancy and marriage and postponed
any thoughts of transition indefinitely. With a job now more crucial than ever,
Conway landed an offer from Herb Schorr, an IBM researcher who also taught at Columbia,
to work on "Project Y," later to be known as the Advanced Computer System.
The ACS was a go-for-broke project to wrest back the performance laurels
the company had lost to upstart Control Data Corp. (IBM chief Thomas J. Watson wrote
a blistering memo at the time, demanding to know how a company of 34 people, "including
the janitor," could outdo his thousands of engineers.) The outstanding problem
in computer design (then as now) was to maximize the amount of work a CPU could perform
in a single clock cycle. Pipelining (the division of a complex operation, such as
multiplication, into a series of steps) allowed one completed result to appear per
tick even when operations took several clock cycles to complete, but it introduced
complex dependencies. The input needed for one operation might be the result of another
that had not yet finished, or the output of an operation might overwrite data that
were still being used by another part of the pipeline. Control Data had introduced
"scoreboarding" circuitry to stall conflicting operations while allowing
others to proceed, but the goal of one result per cycle still seemed unattainable.
That was the state of the art in 1965, when IBM researcher John Cocke
rhetorically asked the rest of the ACS staff, "Why can't we execute more than
one instruction per cycle?" During the next few months, inspiration struck the
young Conway in the form of an idea for a circuit that would combine information
about CPU resources currently in use and those needed by upcoming instructions, tagging
those instructions that could be executed without causing conflicts.
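The idea can be sketched in software. The following is an illustrative reconstruction, not Conway's actual circuit: given a window of decoded instructions, tag each one as issuable only if its registers do not conflict with in-flight writes or with instructions tagged earlier in the same cycle. (In the hardware, all of these checks ran in parallel; this loop serializes them for clarity.)

```python
# Illustrative sketch (not the ACS design itself): tag which instructions
# in a decode window can issue together without register conflicts.

def issuable(window, busy):
    """window: list of (dest, srcs) register tuples for decoded instructions.
    busy: set of registers still being written by in-flight operations.
    Returns a parallel list of booleans tagging conflict-free instructions."""
    tags = []
    written = set(busy)   # registers with pending writes
    read = set()          # registers read by instructions issued this cycle
    for dest, srcs in window:
        raw = any(s in written for s in srcs)  # read-after-write hazard
        waw = dest in written                  # write-after-write hazard
        war = dest in read                     # write-after-read hazard
        ok = not (raw or waw or war)
        tags.append(ok)
        if ok:
            written.add(dest)
            read.update(srcs)
    return tags

# Two independent adds can issue together; the third must wait for r3.
window = [("r3", ("r1", "r2")),
          ("r6", ("r4", "r5")),
          ("r7", ("r3", "r6"))]
print(issuable(window, busy=set()))  # [True, True, False]
```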
"It required a lot of transistors, but it was very fast because all
the checking could be done in parallel," she recounts. So Schorr and the other
senior team members decided to redesign the ACS around this so-called multiple-instruction
issue. Conway programmed a software simulator that became the de facto blueprint
for the ACS-1, bridging conceptual barriers among logic designers, hardware engineers
and programmers. If it had come to fruition, the machine would have been able to
execute a peak of 500 million operations per second, comfortably faster than the
Cray-1, which stunned the computing world when it was announced in 1976.
Instead, by 1968, internal politics and serious doubts about the feasibility
of building such advanced hardware had scuttled the ACS project. Using existing integrated
circuits, the CPU would have required more than 6,000 chips connected by hair-thin
wires. After the project died, only a few hints of its ideas came to the outside
world; years later credit for inventing multiple-instruction-issue CPUs would go
to designers with no formal connection to IBM.
Meanwhile Conway's personal life had been tumultuous as well. Suicidal
feelings led her to conclude that living as a man was impossible, and so she began
the physical transition and had the surgery. Although her immediate supervisors tried
to keep her on, IBM upper management decided that she had to go. Executives were
in such a hurry that they did not even ask her to return her collection of ACS technical
papers. (When contacted for this story to clarify the narratives of Conway and her
former colleagues, a representative of IBM's board of directors declined to comment.)
The unexpected firing destroyed what confidence Conway's family and friends
had had in her. Sudden poverty put her former wife and two children in the hands
of child-welfare officials, who threatened Conway with arrest if she had any further
contact with the family other than paying child support. She had to rebuild her career
without reference to her work at IBM. Job offers evaporated, Conway recalls, every
time she told potential employers about her medical history. Finally, she got a job
as a contract programmer; it was the beginning of what she now describes as "deep stealth."
In 1973 came a crucial break: an opening at Xerox's fledgling Palo Alto
Research Center (PARC). The freewheeling environment entranced her (even though she
consistently wore skirts and suits in contrast to the standard dress of T-shirts
and sandals). Without strong academic credentials or an aggressive personality, she
sometimes found it hard to gain respect for her ideas in the rapid-fire give-and-take
during meetings at PARC. Jeanie Treichel, now at Sun, says that Conway would seldom
answer her phone directly, preferring to call back once she had marshaled all the facts she needed.
A new manager, Bert Sutherland, introduced Conway to Carver Mead, a semiconductor
researcher at the California Institute of Technology. Sutherland had hired Mead as
a consultant to "stir up the pot and make trouble," he says. Mead's work
on fundamental limits to transistor size made it clear that engineers would eventually
be able to put millions of transistors on a single chip--say, for example, an entire computer.
Conway and Mead distilled hundreds of pages of semiconductor arcana--the
"design rules" that governed how to draw patterns for metal wires, impurity-doped
silicon and insulating silicon oxide--down to a few dozen lowest-common-denominator
rules. They also winnowed the enormous range of circuit-design styles to a single
basic methodology. Instead of half a dozen ways to draw an adder circuit or a shift
register, their disciples would start by learning just one.
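A key simplification was expressing the rules in a single scalable unit, lambda, so one rule set could serve many fabrication processes. A minimal sketch of the approach, with illustrative rule values loosely patterned on the Mead-Conway lambda rules rather than any complete, exact rule set:

```python
# Sketch of scalable, lambda-based design-rule checking in the Mead-Conway
# style. Rule values are illustrative, not a complete or exact rule set.

LAMBDA_RULES = {               # minimum widths/spacings in units of lambda
    ("metal", "width"): 3,
    ("metal", "spacing"): 3,
    ("poly", "width"): 2,
    ("poly", "spacing"): 2,
    ("diffusion", "width"): 2,
}

def check(layer, kind, microns, lam):
    """Return True if a drawn dimension (in microns) meets the rule
    for a process whose lambda is `lam` microns."""
    return microns >= LAMBDA_RULES[(layer, kind)] * lam

# The same rules apply at different process scales: a coarse process
# with lambda = 2.5 microns and a finer one with lambda = 1.0 micron.
print(check("metal", "width", 8.0, lam=2.5))   # True: 8.0 >= 7.5
print(check("metal", "width", 2.5, lam=1.0))   # False: 2.5 < 3.0
```

Because every rule scales with lambda, a layout drawn once could be refabricated on a smaller process simply by shrinking lambda, one reason the methodology outlived any particular fabrication line.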
But even more than developing a new design method, Conway created ways
to disseminate her ideas. To make VLSI design appear legitimate, she and her colleagues
wrote a textbook of the kind the more established disciplines used--and composed,
printed and bound it using the networked computers and laser printers that other
PARC researchers had only recently developed. She test-drove the book in front of
30 students and 10 professors when she taught a course at M.I.T. in the fall of 1978.
Guy Steele, now a computer language researcher at Sun, remembers her as "one
of the five or six best professors I've ever had."
The course had a special attraction: PARC, Caltech and Hewlett-Packard
arranged to fabricate all the class-project circuits on a chip so that they could
be tested and displayed. In a couple of years, more than 100 universities were running
courses and getting back working chips, as the Defense Advanced Research Projects
Agency (DARPA) established MOSIS (Metal-Oxide Semiconductor Implementation Service)
to meet the demand Conway and Mead had created. Researchers shared software to design
and test their brainchildren using the primitive workstations of the day. Yard-wide
color plots of chip designs--and eventually the chips themselves--were proudly displayed
in hallways and on doors.
The notion of creating such artifacts was very deliberate. Conway's anthropological
studies had convinced her that such "clan badges" would foster instant
recognition among clan members and spur interest among potential adherents, where
a good idea alone would not. She often cited Eugen Weber's classic Peasants into
Frenchmen when describing how the VLSI community had come together. In the role
that railroads had played carrying cultural goods in the 19th century, Conway cast
the Arpanet, predecessor to today's Internet. Stanford president John Hennessy (whose
MIPS chip was an early beneficiary of MOSIS) estimates that the explosion of designers
and design tools, along with ready access to chip foundries, accelerated the development
of VLSI--and the entire computer and Internet revolution that grew from it--by as
much as five years.
Conway won strong loyalty among the people who worked with her. Former
MOSIS program director Paul Losleben was in near awe of her ability to draw from
people ideas they didn't know they had. As a manager, says Mark Stefik, an artificial-intelligence
researcher who worked closely with her at PARC, she had a knack for "getting
people to ask the right questions." In the early 1980s Conway and Stefik applied
the VLSI clan-building methods to artificial intelligence: with buttons, contests
and oversize prints, they popularized tools for representing knowledge in computerized
form as the AI boom took hold.
Although her outsider status played well in universities that previously
had no access to semiconductor research, it also drew heavy opposition. Many established
integrated-circuit engineers derided Mead and Conway's work, saying that it was too
simplistic and inefficient. At one Defense Department meeting, researchers affiliated
with the competing Very High Speed Integrated Circuits program "were laughing
openly" at Mead's presentation, Losleben recalls, and "not even behind
his back." And although Conway's collaborative management style inspired those
around her, her success drew fire from those competing for similar turf. Her former
assistant, Mary Hausladen, recollected how a rival lab manager, who had always claimed
nothing would come of the VLSI work, now spread rumors that Conway was "really
a man." "But no one cared," emphasizes Hausladen, now at ImageX.com,
an Internet printing company. (Stefik recounts Conway telling him that she had dared
the manager in question to go public with his accusation--such as it was--and that
he had backed down.) Her immediate supervisors knew her history, and many others
interviewed for this story claim that they had had their suspicions, but all added
that they considered it irrelevant to her accomplishments.
Shortly thereafter Conway was recruited to work for DARPA, managing the
so-called Strategic Computing Initiative that was to be the Pentagon's response to
Japan's ambitious "Fifth-Generation Computer" project. But her plainspoken
style and penchant for end runs around bureaucratic hurdles did not mesh well with
a hierarchical, military organization. "It was terrible to behold," Losleben
remarks. "Like watching a friend run full-tilt into a brick wall."
Conway moved to the University of Michigan, where she could foment further
unrest--pursuing studies on tools for research collaboration and helping to revamp
the school of engineering--and spend some time having a life. She took up canoeing,
kayaking and motorbiking and found her partner, Charlie. She worked to build the
university's Media Union, a working laboratory for digital libraries, classrooms
and work spaces.
In 1998, as Conway retired, she found herself back at the beginning of
her career. Mark Smotherman, a computer scientist at Clemson University, began unearthing
the history of the ACS-1 and its influence on later machines. Bill Wulf, now president
of the National Academy of Engineering, called the machine "a stunning revelation."
Conway's own archives, which had traveled with her from house to house for 30 years,
became a potential treasure trove.
She attended a reunion of ACS engineers, organized by Smotherman, that
included Cocke, Schorr and others and weighed her options. At last she decided that
setting the record straight about her early invention outweighed maintaining her
"deep stealth" status and began publicizing her ACS work.
Today she has taken on the challenge of being known as a transsexual woman
with her characteristic verve. Ironically, she says, the more seamlessly transgendered
people fit into their new lives, the less visible they are as role models for young
people confronting the same conflicts. So her Web site, lynnconway.com, is now a
significant resource on medical, legal and social issues for transsexual women, who
regularly face discrimination, threats and violence. She also serves on a university
committee examining transgender policies.
If not for IBM's corporate transphobia, she probably would have remained
a computer architect all her career and never initiated the VLSI revolution, Conway
reflects. When I comment on how much the world has gained from her trials, she retorts:
"But that doesn't do anything for me," reminding me of her lost family
and friends, the life she might have had. In the past 30 years gender transitions
have become much smoother. And for the current generation, Conway hopes--and plans--that
what caused her so much pain could be seen as one more correctable medical problem,
to be mostly forgotten as soon as the surgical scars heal. Few people who know Conway
would bet against her ability to help pull off this revolution as well.