April 26, 1812: Birth of Krupp, the ‘Cannon King’

1812: Alfred Krupp is born in Essen, Germany.

While the Krupp Konzern has always manufactured machinery and machine components from high-grade cast steel, the company is best known for producing perhaps the finest artillery ever seen in warfare.

It was Alfred Krupp, a staunch patriot at a time when Germany was struggling toward unification, who set the tone. The artillery produced by the Konzern was excellent, both in terms of its component quality and its accuracy, earning Alfred the sobriquet der Kanonenkönig, or “Cannon King.”

By the time of Krupp’s death in 1887, roughly half of the company’s business was tied up in armaments production, and the firm employed 20,000 workers, making it the largest industrial company on earth.

It was Krupp’s breech-loading cannon that annihilated the French army at Sedan in the Franco-Prussian War, Krupp’s “Kaiser Wilhelm” gun that shelled Paris from an incredible 75 miles away in 1918, and the versatile Krupp-made 88mm gun that, during World War II, was equally effective as an artillery piece, a tank killer and an anti-aircraft weapon.

But Alfred was as loyal to his business as he was to his country, so Krupp artillery pieces found their way into armies the world over. As William Manchester observed in his exhaustive saga of the munitions family, The Arms of Krupp, plenty of German soldiers were killed by Allied shells bearing the imprint of Krupp of Essen.

After a complete collapse at the end of World War II, Krupp rebounded during the German post-war “economic miracle” and continued in the steel business. The firm merged with its old rival, Thyssen, in 1999 to form ThyssenKrupp.

Source: The Arms of Krupp, by William Manchester; ThyssenKrupp.com

Photo: Alfred and Bertha Krupp with son Friedrich Alfred. Courtesy ThyssenKrupp

This article first appeared on Wired.com April 26, 2007.

See Also:

April 25, 1953: Riddle of DNA’s Architecture Finally Solved

1953: James Watson and Francis Crick present their research in Nature, describing the architecture of the double helix, which forms the molecular structure of DNA.

Although by then scientists understood that deoxyribonucleic acid was most likely the molecule of life, absolute certainty eluded them, because key components were still missing. Chiefly, they didn’t really know what the DNA molecule looked like.

Many, among them Linus Pauling, were actively engaged in DNA research and a number of structural theories were advanced, all of them wrong in varying degrees. When Watson and Crick finally solved the puzzle, the key was provided by an X-ray diffraction photograph of a DNA molecule — the so-called “photograph 51” — taken by another researcher, Rosalind Franklin.

Franklin’s photograph revealed a fuzzy X in the center, confirming that the molecule had a helical structure, allowing it to carry genetic code and pass genetic material through generations. Combining this with their other research, Watson and Crick concluded that the DNA molecule was a double helix and not a triple, as the prevailing wisdom held.

For their work, Watson and Crick shared the 1962 Nobel Prize in Physiology or Medicine with another DNA researcher, Maurice Wilkins.

Rosalind Franklin received … nothing. She’d died of cancer in 1958, at age 37, and the Nobel Prize is not awarded posthumously.

Source: Nobelprize.org

Photo: The structure of a section of DNA.

This article first appeared on Wired.com April 25, 2007.

See Also:

April 22, 1970: Tell Your Mother You Love Her

1970: Mother Earth finally gets her due with the celebration of the first Earth Day. Historians regard this as the public launch of the modern environmental movement.

In the unslakable thirst for power, wealth and self-aggrandizement, mankind has for centuries shamelessly plundered the planet’s resources. While it’s reasonable to use some of the bounties of the Earth to ensure the survival and progress of the species, the growing ability (not to mention both need and greed) to extract more and more has exacted a heavy toll.

The knowledge that the Earth’s resources are not infinite provided the impetus for conservationists to try to raise awareness among the broader public.

Earth Day, which was founded by Sen. Gaylord Nelson (D-Wisconsin), not only heralded the beginning of the environmental movement but became one of its enduring symbols. Its celebration every April affords the chance to reflect on both the successes and failures of the movement.

Nelson, who at the age of 14 led a campaign to plant trees along the roads in his native Clear Lake, Wisconsin, emerged as a major voice for conservation when he joined the U.S. Senate in 1962. He found a sympathetic ear in President John F. Kennedy, and the seeds for Earth Day — and the environmental movement in general — can rightly be said to have been planted during JFK’s administration.

Nelson found his inspiration for Earth Day in the university teach-ins popular during the Vietnam War. If students could be organized to help stop an unpopular war, then why not use similar tactics to galvanize an environmental movement?

In fact, people everywhere were waking up to the importance of protecting their planet. The months before the first Earth Day saw a profusion of grassroots efforts aimed either at specific causes or raising environmental awareness in general.

Under the guidance of Earth Day national coordinator Denis Hayes, thousands of volunteers — mostly college-age or younger — organized rallies and events from the “redwood forest to the Gulf Stream waters.” When Earth Day arrived, 20 million Americans took part.

During an address at the University of Wisconsin, Nelson summed up the purpose of Earth Day this way: “Our goal is an environment of decency, quality and mutual respect for all other human creatures and for all living creatures…. The battle to restore a proper relationship between man and his environment, between man and other living creatures will require a long, sustained, political, moral, ethical and financial commitment, far beyond any effort made before.”

The senator practiced what he preached. Virtually every significant piece of environmental legislation of the period has Nelson’s fingerprints all over it: from the preservation of the Appalachian Trail to the Clean Air Act to the Clean Water Act.

After losing his re-election bid in 1980, the same election that swept Ronald Reagan and the neoconservatives into power nationwide, Nelson moved on to the Wilderness Society, where he served as a counselor. He died in 2005.

Over the years, Nelson’s message never wavered: Safeguarding the environment is the single most pressing problem facing humanity.

A lot of people believe him today. Plenty of others, alas, still don’t.

Source: Various

Image: The official national poster for the first Earth Day in 1970 warned of pollution and congestion, and left the scheduling of events up to local groups all over the country.

This article first appeared on Wired.com April 22, 2009.

See Also:

April 21, 1994: Our Solar System Is Not Alone

1994: News breaks that astronomer Alex Wolszczan has confirmed that planets are orbiting pulsar PSR B1257+12. His research appears in Science the next day.

The confirmation kicked off an explosion in extrasolar planet hunting. Astronomers have now found more than 500 planets around other suns, and are adding more all the time.

The groundbreaking discovery came on the heels of a disaster: Wolszczan’s telescope broke. He was working in early 1990 at the Arecibo Observatory in Puerto Rico (famous for its roles in films like Contact and GoldenEye), when the 1,000-foot-wide radio telescope had to be shut down for repairs. Scientists couldn’t aim the telescope’s receiver at particular parts of the sky for about a month. But they could still look straight up and see what was there.

Wolszczan took the opportunity to scan the sky for pulsars: the dense, spinning corpses of stars that died as supernovae. As they rotate, they sweep the sky with a beam of radio energy, so from Earth they appear to wink on and off, or “pulse.” Normally the pulses are so regular, you could use them to set the most accurate atomic clock on Earth.

Not so with PSR B1257+12. This wonky cosmic clock kept unreliable time, alternately speeding up and slowing down. Wolszczan immediately suspected the presence of planets. The gravitational tug of a planet would nudge the pulsar back and forth, changing — by a few milliseconds — the time its radiation takes to reach Earth.
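The scale of that tug can be estimated with a back-of-the-envelope calculation. The planet and pulsar orbit their common center of mass, so the pulsar itself traces a tiny orbit of radius roughly (planet mass / pulsar mass) × the planet's orbital radius, and the pulses are delayed by the light-travel time across that radius. This is only a sketch with rounded textbook values, not Wolszczan's published fit:

```python
# Rough timing-delay estimate for a pulsar planet (rounded values,
# not the actual PSR B1257+12 analysis).
M_SUN = 1.989e30      # kg
M_EARTH = 5.972e24    # kg
AU = 1.496e11         # m
C = 2.998e8           # m/s, speed of light

m_pulsar = 1.4 * M_SUN    # typical neutron-star mass
m_planet = 4.3 * M_EARTH  # most massive of the three planets found
a = 0.36 * AU             # its approximate orbital radius

# The pulsar circles the center of mass at a much smaller radius.
a_pulsar = (m_planet / m_pulsar) * a

# Maximum extra light-travel time as the pulsar swings toward and
# away from Earth (edge-on orbit assumed for simplicity).
delay = a_pulsar / C
print(f"timing delay ~ {delay * 1e3:.1f} ms")
```

Even a planet of a few Earth masses shifts the pulse arrival times by about a millisecond, which a good pulsar clock makes easily detectable.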

Finding a planet around another star was a revolutionary discovery in itself, but finding one around a pulsar was even weirder. “You couldn’t imagine a worse environment to put a planet around,” astronomer Dale Frail of the National Radio Astronomy Observatory said in a phone interview. Pulsars are essentially rubble from the cataclysmic explosion of an old, massive star — an explosion that would have incinerated any planets the old star might have harbored.

Wolszczan now thinks the first star had a companion, and ate it. The two stars danced around their common center of mass for a few millenniums, until the larger one exploded. Most supernova explosions begin inside the star, but slightly off-center, sending it careening through space in its death throes. Wolszczan’s pulsar either rammed right into its neighbor, or came close enough to rip it apart gravitationally.

“It was like stealing part of the star and leaving the scene of the crime very quickly,” Wolszczan said. The stolen stellar mass formed a disk around the cooling pulsar, which eventually coalesced into planets.

Cold, dark and constantly bombarded with radiation, pulsar planets are not friendly places for life. But the implications for finding planets around normal stars were huge. “If even in this hostile environment you can form rocky bodies in orbit, by golly, Earths must be pretty common,” said Alan Boss of the Carnegie Institution of Washington, one of the first theorists to consider how extrasolar planets might form.

Of course, the pulsar’s funny behavior could also have been explained by an error in measuring its position. Arecibo is great for large surveys, but it’s too big to pinpoint exactly where a star is located. To be certain, Wolszczan asked Frail to use the Very Large Array, a series of 27 radio telescopes in New Mexico (itself famous as a film location for 2010 and Independence Day, among others), to calculate the pulsar’s position as accurately as possible.

While they crunched the numbers, they were almost scooped. A team of astronomers led by British astronomer Andrew Lyne announced in July 1991 that they had found a planet around a pulsar. The astronomical community was agog, the media buzzed, and Wolszczan calmly continued to process his data.

“I decided, all right, he did it, I’ll do my story, we’ll see what happens,” he said. “It was too exciting to get frustrated and throw it away.”

His efforts paid off in September 1991. “I sat down in front of my computer and ran the model for the data, and got the answer that was very astonishing,” he said. “Beyond any doubt there were planets.”

In a dramatic turn of events, Wolszczan and Lyne were asked to give back-to-back speeches at the American Astronomical Society meeting in January 1992.

Lyne went first and shocked the thousand assembled astronomers by admitting that he’d goofed. He made exactly the sort of positioning error Wolszczan had contacted Frail to avoid. Rather than detecting the motion of an extrasolar planet, Lyne had detected the motion of the Earth.

“Everyone sucked in their breath at the same time,” Frail recalled. “There was this moving gasp through the audience. And then Alex had to stand up there and give his talk.”

It took another two years to confirm that the planets were really there. Ultimately, Wolszczan found three of them, one with a mass of 4.3 Earths, one of 3.9 Earths, and one just twice the mass of the moon, the least-massive extrasolar planet found to date. If they were in our solar system, they would all fit within the orbit of Mercury.

“Then all hell broke loose,” Wolszczan said. “Now it’s a blooming field.” With hundreds of planet-hunting astronomers and telescopes on Earth and in space, we’re closer than ever to finding worlds like ours.

Source: Various

Image: Artist’s conception depicts the pulsar planet system discovered by Alex Wolszczan. Radiation would probably cause the planets’ night skies to light up with auroras similar to our northern lights. One such aurora is illustrated on the planet at the bottom of the picture. (NASA/Jet Propulsion Laboratory–Caltech)

This article first appeared on Wired.com April 21, 2009.

See Also:

April 20, 1940: Electron Microscope Crosses the Atlantic; Zworykin Crosses the Delaware

1940: Vladimir Zworykin, better known as a co-inventor of television, demonstrates the first electron microscope in the United States. Once again, the Russian émigré improves but does not, strictly speaking, invent an important electronic apparatus.

Zworykin came to the United States in 1919 and worked for Westinghouse for a decade. While there, he developed and patented the iconoscope and kinescope, which used an electronic system to create and reproduce television images.

Westinghouse decided not to pursue the new technology, and Zworykin moved to RCA. Besides helping advance TV to a commercial medium, he worked on text readers, electric eyes, missile guidance systems and, later, computerized weather prediction.

The goal of creating an electron microscope was to achieve far greater magnifications than those possible with conventional, optical ’scopes. The concept involved using a magnetic coil or electric field to focus electrons to a single point.

Bombard a tiny object with electrons, and you can create a large image with the focused beam. In fact, you can use a combination of these lenses to increase magnification, just as an optical microscope does.

Ernst Ruska made this discovery at Berlin Technical University in the late 1920s. He and Max Knoll built the world’s first electron microscope in 1931. The instrument achieved a magnification of only 400x, not as good as an optical microscope, but it was proof of concept.

Two years later, Ruska built an electron microscope with resolution that bettered its optical counterparts. By 1938, University of Toronto researchers had built their own model, and the German firm Siemens produced a commercial model in 1939 based on Ruska’s work.

Zworykin and his team developed their electron microscope at RCA’s research labs in Camden, New Jersey, in 1939. The device they demonstrated across the river in Philadelphia on April 20 of the following year measured 10 feet high and weighed half a ton. It achieved a magnification of 100,000x.

That was more than proof of concept. It was a fulfillment.

Zworykin shares credit for the television with Philo T. Farnsworth and John Logie Baird. His various efforts earned him the Edison Medal from the American Institute of Electrical Engineers, the National Medal of Science from the National Academy of Sciences and scores of other awards from associations and institutions around the world.

But science’s highest honor eluded him. The 1986 Nobel Prize in Physics went to Ruska, “for his fundamental work in electron optics, and for the design of the first electron microscope,” and to Swiss IBM researchers Gerd Binnig and Heinrich Rohrer for developing the related technology of the scanning tunneling microscope in the early 1980s.

Sic transit gloria mundi.

Source: Various

Photo: Vladimir Zworykin (seated) and James Hillier demonstrate an early electron microscope. (Bettmann/Corbis)

This article first appeared on Wired.com April 20, 2009.

April 19, 1971: Soviets Put First Space Station Into Orbit

1971: Salyut 1, the first operational space station, is launched.

As they often were during the space race, the Soviets were out in front of NASA in concept and launch. But just as often, they were bedeviled by technical glitches and failures, and so it was with Salyut 1.

Beaten to the moon by the Americans, the Soviet space program turned its attention to the deployment of a working space station, which had been on the drawing boards since 1964. Salyut 1 was essentially a lash-up, its components assembled from spacecraft originally designed for other purposes.

The April launch went smoothly and Salyut 1 entered orbit, but it was all downhill after that. The crew of Soyuz 10, intended to be the first cosmonauts to take occupancy of Salyut 1, couldn’t enter the space station because of a docking mechanism problem.

The crew of Soyuz 11 spent three weeks aboard Salyut 1, only to be killed on the return trip to Earth when air escaped from their craft.

Finally, it was curtains for Salyut 1, which fired its rockets for the last time Oct. 11, 1971, to begin its planned re-entry into Earth’s atmosphere and disintegration over the Pacific Ocean.

Source: PBS.org, Wikipedia

Photo: Salyut 1/NASA

This article first appeared on Wired.com April 19, 2007.

See Also:

April 18, 1906: Mother Nature 1, San Francisco 0

1906: San Francisco is destroyed by an earthquake so powerful that it is felt from Coos Bay, Oregon, to Los Angeles, and as far east as central Nevada.

What became known as the San Francisco earthquake and fire struck at 5:12 a.m., when the San Andreas Fault gave way, tearing the earth wide open from Humboldt County, near the Oregon border, to San Benito County, a hundred miles southeast of San Francisco. The epicenter was on the fault line just offshore from the San Francisco–San Mateo county line.

The earthquake had a magnitude measuring anywhere from 7.8 to 8.3 — a precise method of measuring seismic activity did not exist in 1906 — but it was enormous by any standard. There have been larger earthquakes recorded in California, but none so near a major population center. And damage was widespread all along the fault line. The town of Santa Rosa, 50 miles north of the Golden Gate, was flattened. Stanford University, in what was later to be named Silicon Valley, suffered severe damage.

But turn-of-the-century San Francisco was, by far, the most populous and important city in California — the cultural and financial hub of the entire West Coast, in fact — and almost all the attention was focused on the carnage there.

Even without the fire that followed, the damage was severe. The earthquake kept shaking for a full minute. By the time it subsided, a number of buildings in town had collapsed. Brick buildings with foundations of unreinforced masonry, especially those standing on land fill, proved especially vulnerable.

But the quake also ruptured gas and water mains, causing fires to break out and leaving the fire department with no water to fight them. San Francisco, then as now a tightly compact city with a lot of wooden structures, burned well.

The result was a conflagration lasting nearly four days. To stop the great fire, mansions lining the broad thoroughfare of Van Ness Avenue were dynamited by Army engineers to create a firebreak by robbing the flames of something to burn. By the time it was over, the heart of San Francisco lay in ruins. In all, 508 city blocks had burned to the ground.

Anywhere from 3,000 to 5,000 people were killed, most of them as a result of the earthquake itself, making this one of the biggest natural disasters in U.S. history.

A new city emerged quickly (a little too quickly, some historians would say) on the ashes of the old. Also emerging was a new emphasis on seismological studies and new regulations regarding building construction. In order to guarantee a water supply in the event of another major fire, San Francisco constructed a network of reservoirs, underground cisterns, fireboats and sea-water pumps.

San Francisco today also has some of the toughest building codes on earth — and yet remains vulnerable to both earthquake and fire. In San Francisco, it’s not a question of whether the next big one is coming, only of when.

Source: Bancroft Library

Photo: San Francisco’s Mission District burns in the aftermath of the 1906 San Francisco earthquake. (H.D. Chadwick/National Archives and Records Administration/Wikipedia)

This article first appeared on Wired.com April 18, 2008.

April 15, 1726: Apple Doesn’t Fall Far From Physicist

1726: Isaac Newton tells a biographer the story of how an apple falling in his garden prompted him to develop his law of universal gravitation. It will become an enduring origin story in the annals of science, and it may even be true.

Newton was apparently fond of telling the tale, but written sources do not reveal a specific date for the fabled fruit-fall. We do know that on this day in 1726, William Stukeley talked with Newton in the London borough of Kensington, and Newton told him how, many years before, the idea had occurred to him.

As recounted in Stukeley’s Memoirs of Sir Isaac Newton’s Life:

It was occasioned by the fall of an apple, as he sat in contemplative mood. Why should that apple always descend perpendicularly to the ground, thought he to himself. Why should it not go sideways or upwards, but constantly to the earth’s centre.

See Also:

Newton (like Ben Franklin and his kite) may have indulged in some self-mythologizing here. Surely, the puzzle was not that things fell down rather than sideways. Isn’t that what the concepts “fall” and “down” are about?

Newton’s breakthrough was not that things fell down, but that the force that made them fall extended upward infinitely (reduced by the square of the distance), that the force exists between any two masses, and that the same force that makes an apple fall holds the moon and planets in their courses.

John Conduitt, Newton’s assistant at the Royal Mint (and also his nephew-in-law), tells the story this way:

In the year [1666] he retired again from Cambridge on account of the plague to his mother in Lincolnshire & whilst he was musing in a garden it came into his thought that the same power of gravity (which made an apple fall from the tree to the ground) was not limited to a certain distance from the earth but must extend much farther than was usually thought — Why not as high as the Moon said he to himself & if so that must influence her motion & perhaps retain her in her orbit, whereupon he fell a calculating what would be the effect of that supposition but being absent from books & taking the common estimate in use among Geographers & our sea men before Norwood had measured the earth, that 60 English miles were contained in one degree of latitude his computation did not agree with his Theory & inclined him then to entertain a notion that together with the power of gravity there might be a mixture of that force which the moon would have if it was carried along in a vortex, but when the Tract of Picard of the measure of the earth came out shewing that a degree was about 69½ English miles, He began his calculation a new & found it perfectly agreeable to his Theory.

A much finer tale: It shows one of the great minds of the millennium entertaining proper scientific doubt about his hypothesis, before better measurement and better data ultimately provide confirmation.
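Conduitt's account can be checked with modern numbers. If gravity falls off as the inverse square of distance from Earth's center, then at the moon's distance of roughly 60 Earth radii it should be weakened by a factor of 60² = 3,600, and that prediction should match the centripetal acceleration of the moon's actual orbit. A quick sketch using rounded present-day values, not Newton's own figures:

```python
import math

g = 9.81              # m/s^2, gravity at Earth's surface
r_moon = 3.844e8      # m, mean Earth-moon distance (~60 Earth radii)
T = 27.32 * 86400     # s, sidereal month

# Inverse-square prediction: surface gravity diluted by 60^2
predicted = g / 60**2

# Centripetal acceleration of the moon's orbit: 4*pi^2*r / T^2
observed = 4 * math.pi**2 * r_moon / T**2

print(f"predicted {predicted:.5f} m/s^2, observed {observed:.5f} m/s^2")
```

Both come out near 0.00272 m/s², agreeing to within a fraction of a percent, which is exactly the agreement Newton found once Picard's better measurement of the Earth was in hand.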

Voltaire also wrote of the event in 1727, the year Newton died: “Sir Isaac Newton walking in his gardens, had the first thought of his system of gravitation, upon seeing an apple falling from a tree.”

Note that no one, from Newton on down (so to speak) claims the apple bopped him on the bean. Makes a good cartoon, sure, but such an event, if it happened, might have set the guy speculating instead on why — and how — pain hurts.

Source: Various

Image: Isaac Newton was 83 when he told a biographer the tale of observing an apple fall at age 23. He’s 46 in this 1689 painting by Godfrey Kneller.

This article first appeared on Wired.com April 15, 2009.

See Also:

April 14, 1945: Tweaky Toilet Costs Skipper His Sub

1945: A malfunctioning high-tech toilet forces a German U-boat to the surface off the coast of Scotland, where it is promptly attacked by a British aircraft. The boat is scuttled as the crew abandons ship.

U-1206, sailing out of Kristiansand, Norway, as part of the 11th Flotilla, was cruising at a depth of roughly 200 feet when the commander, Kapitänleutnant Karl-Adolf Schlitt, decided to answer the call of nature. The submarine was a late-war Type VIIC, commissioned in March 1944. It carried a new type of toilet designed for use at greater depths.

Like a lot of new technology, the toilet was just a little buggy. Schlitt had trouble operating it. When he called an engineer for help, the man opened the wrong valve, allowing seawater to enter the boat.

When the water reached the batteries located beneath the toilet, the boat began filling with chlorine gas, forcing Schlitt to order U-1206 surfaced. Unfortunately for the Germans, the boat was only 10 miles off the Scottish coast, and it was quickly spotted by the British.

The crew was still blowing clean air into their U-boat when an aircraft appeared and attacked, killing four men on deck and damaging the boat so badly that it was unable to dive. Schlitt, seeing the game was up, gave the order to abandon and scuttle.

It was an ignominious end to Schlitt’s only combat patrol of the war as a commander — although, less than a month later, most of his U-boat comrades had joined him in captivity, as World War II came to an end in Europe.

As for U-1206, its wreck lay undisturbed until the mid-1970s, when workers laying an underwater oil pipeline came across the hulk sprawled on the seabed at 230 feet.

The Type VIIC was the workhorse of Germany’s U-boat fleet. The first VIIC, U-69, was commissioned in 1940, and 568 were built by various shipyards during the war, making it the most widely produced combat submarine in history.

Only one Type VIIC boat still exists. The U-995 is on permanent display as a museum in Laboe, outside Kiel, Germany. The U-505 (at Chicago’s Museum of Science and Industry) and U-534 (on display near Liverpool, England), are larger Type IXCs.

Source: Uboat.net

Photo: The last Type VII U-boat in existence, U-995, is now a museum at Laboe, Germany. It’s the same model as the sub that was sunk because of a malfunctioning toilet.

This article first appeared on Wired.com April 14, 2009.

See Also:

April 13, 1953: Bond Starts Shaking Things Up, Stirring His Fans

1953: British publishing house Jonathan Cape publishes Ian Fleming’s first novel, Casino Royale, introducing the world to literature’s most famous spy: James Bond, 007.

The son of a member of Parliament and grandson of a Scottish financier, Fleming grew up in a wealthy London family. Educated at Eton and prestigious military schools, Fleming worked as a journalist and junior editor for Reuters and was stationed in pre–World War II Moscow.

See Also:

Fleming then returned to London to work as a stockbroker. But global conflict loomed, and the director of naval intelligence, Rear Adm. John Godfrey, recruited Fleming to serve as his personal assistant. During his intelligence career, Fleming would rise to the rank of commander, planning operations for an elite team of British commandos, the 30 Assault Unit.

Though his desk-bound duties laid the foundation for his espionage fiction, they kept Fleming out of the field. When he turned to writing after the war, he poured that frustration into his fictional alter ego, making sure Bond was always in on the action. Searching for a moniker that seemed British without sounding too dramatic, Fleming borrowed his MI6 hero’s name from James Bond, the ornithologist who wrote Birds of the West Indies.

It was not the first spy novel, but Casino Royale would elevate the espionage genre into the elite levels of popular culture. Arthur Conan Doyle, William Le Queux and Joseph Conrad were only some of the accomplished authors who took a shot at spy fiction in previous decades. But, Fleming was the first to combine style and sexiness with the dangerous world of espionage.

[Spoiler alert: Plot summary follows.]

Casino Royale sends Bond to France on an assignment to confront master gambler Le Chiffre. A Soviet agent, Le Chiffre has embezzled Moscow’s money to start a failed chain of brothels, and he needs to win a high-stakes baccarat game to repay his Russian bosses. If Bond can defeat Le Chiffre at the tables, his superiors in London hope Smert’ Shpionam (SMERSH, the KGB’s revenge division, literally “Death to Spies”) will kill Le Chiffre.

Aided by the beautiful Vesper Lynd (secretly a Soviet agent herself), Bond beats the villain at cards. But Le Chiffre captures 007, torturing and nearly castrating him before a SMERSH assassin finally kills Le Chiffre. Bond recovers with Lynd and plans to quit Her Majesty’s Secret Service to live happily ever after with her. That’s a big change for the all-business literary Bond, who has no time for women when the novel introduces him.

When Lynd thinks SMERSH has targeted her for assassination, the double agent commits suicide — leaving Bond a heartfelt love note to explain her betrayal. His romantic illusions shattered, Bond returns to duty with MI6. His short report into London is also one of the great closing lines of any novel: “The bitch is dead now.”

[Spoilers end here.]

Compared to later Bond novels and especially to the films, Casino Royale is not gadget-centric.

After the book’s successful British launch, the American paperback house Popular Library retitled Casino Royale as You Asked For It for U.S. readers. Neither the novel nor Fleming’s sequels caught on stateside until President John F. Kennedy included From Russia with Love (Fleming’s sixth novel) on his list of favorite books.

The book inspired two movies (a comedy in 1967 and 2006’s reboot with Daniel Craig) and a bleak television production in 1954.

Fleming would go on to pen 12 Bond novels and nine short stories overall, in addition to the children’s classic, Chitty Chitty Bang Bang. After witnessing the early stages of Bond’s emergence into movies, he died of a heart attack in 1964 at the age of 56.

2008 marked Fleming’s 100th birthday, which was celebrated with a series of special events throughout Britain.

Source: Ian Fleming: The Man Behind James Bond, by Andrew Lycett

Photo: James Bond’s creator, author Ian Fleming, had a bit of a flair himself. (Howell Conant/Bob Adelman Books)

This article first appeared on Wired.com April 13, 2009.

See Also: