This Day In Tech Events That Shaped the Wired World

Jan. 5, 1943: George Washington Carver Bites the Dust He Enriched

George Washington Carver

1943: George Washington Carver dies, leaving a legacy of a revived and diversified Southern agriculture and hundreds of new and improved food products. Think of him whenever you’re enjoying peanut butter.

Carver was born into slavery in Missouri sometime in the first half of the 1860s: The exact date is unknown. His father was killed in an accident before George was born. Slaveholder Moses Carver sent his slaves to Arkansas during the Civil War, and though George’s mother was never heard from again, the boy was returned to the Missouri plantation at the end of the war.

He was no longer a slave, but he was frail and not hardy enough for field work. Instead he helped with household chores and gardening. Carver developed considerable knowledge of local vegetation and gained a reputation as a “plant doctor” who could nurse sickly plants back to health.

Carver left the plantation and set out on his own at about age 10. He supported himself as a household worker, cook, laundryman and farmhand. He picked up a high school education in Minneapolis, Kansas, and went on to study at Simpson College in Iowa.

An art teacher there recognized Carver’s skill with plants and encouraged him to enroll at the Iowa State College of Agriculture and Mechanic Arts (now Iowa State University). He was the first black student to enroll there, and he excelled in classes and extracurricular life.

Carver received a bachelor’s degree in 1894 and was invited to join the faculty (again the first black) as assistant botanist for the College Experiment Station. He published work on plant diseases and fungi, gaining national recognition and a master’s degree.

Booker T. Washington invited Carver to join the faculty of the Tuskegee Normal and Industrial Institute in Alabama in 1896, and Carver became its director of agricultural research. Carver sought to revitalize Southern agriculture through research, education and diversification.

The continuous cultivation of single crops (cotton in some places, tobacco elsewhere) had seriously depleted soils throughout the South. Carver recommended planting peanuts, soybeans and other legumes, because these could restore nitrogen to the exhausted and eroded soil. He also advised Alabama farmers to plant sweet potatoes.

The first farmers to follow these suggestions got good results, but couldn’t find the market to make their new crops profitable. So Carver set about finding new uses for the crops.

Besides peanut butter, he developed 325 derivative products from peanuts, including cheese, milk, coffee, flour, ink, dyes, plastics, wood stains, soap, linoleum, medicinal oils and cosmetics. He also came up with 118 sweet-potato products, including flour, vinegar, molasses, ink, synthetic rubber and postage-stamp glue. Add to that another hundred or so products from another dozen plant sources.

Carver carried the Iowa State Extension idea to the South, encouraging the teaching of new techniques to local farmers in their own communities. The boll weevil pest was destroying the South’s cotton economy, and Carver’s contributions came just in time to save — and transform — the region’s agriculture.

He stayed at Tuskegee for the rest of his life, turning down job offers from other colleges and from industry magnates Henry Ford and Thomas Edison. Praised by many white leaders for his scientific contributions and his cooperative attitude, he was criticized by some black leaders (called Negro or colored in those days) for what they felt was excessive deference or even subservience.

By the time of Carver’s death, peanuts had risen from insignificance to one of the six leading crops in the nation — and second in the South. Among many other honors, Carver is enshrined in the National Inventors Hall of Fame and the Hall of Fame for Great Americans.

Source: Iowa State University, Encyclopaedia Britannica

Photo courtesy Library of Congress

See Also:

Jan. 4: Braille, Pitman Birthdays Celebrate New Ways to Write

Louis Braille and Isaac Pitman

1809: Louis Braille is born. He’ll devise a tactile alphabet for the blind.

1813: Isaac Pitman is born. He’ll devise a shorthand alphabet for quickly writing what people are saying.

It’s probably more important that the two men were white, male Europeans of the same era than that they coincidentally shared the same birthday. But it’s an intriguing coincidence nonetheless.

- - -

Braille was born in Coupvray, France. An accident when he was playing with an awl blinded the boy at age 4. He attended the National Institute for the Blind in Paris, where he was taught how to read raised letters shaped like the visual Roman alphabet. It was difficult and confusing: What’s easy for the eye to distinguish (or not, because you have to mind your p’s and q’s, not to mention b’s and d’s) is not necessarily easy for the fingertips.

Soldier Charles Barbier de la Serre visited the school in 1821 to demonstrate a code he’d invented at Napoleon’s behest. It allowed soldiers to silently pass signals at night without using lights to betray their positions to the enemy.

The Night Writing code was a six-by-six grid representing the alphabet and some frequent sequences of two or three letters. The coordinates in the grid were designated by a row number from 1 to 6 and a column number from 1 to 6. These coordinates were communicated by two side-by-side columns of raised dots, each holding from one to six dots.

If that sounds complicated to you, you’re not alone. Braille realized the potential, however, and condensed the entire system into what became the braille cell: two short columns of dots for each letter or symbol.

Each column had three positions, and each position was binary: The absence of a dot there was a signal in the same way the presence of a dot was. Each column thus had 2³, or 8, possible permutations. And each two-column symbol had 2⁶, or 64, possibilities — does this sound familiar in any way? — enough to quickly represent letters, accents, numerals and punctuation.
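
For the programmatically inclined, here’s a minimal Python sketch of that arithmetic — ours, not Braille’s, obviously. Treat each of the six dot positions as one bit and you get exactly 64 possible cells:

```python
from itertools import product

# Treat each of the six dot positions in a braille cell as one bit:
# dots 1-3 down the left column, dots 4-6 down the right column.
# 2**3 = 8 patterns per column, 2**6 = 64 patterns per cell.
all_cells = list(product((0, 1), repeat=6))
print(len(all_cells))         # 64, including the all-blank cell

# In standard braille the letter 'a' raises only dot 1.
letter_a = (1, 0, 0, 0, 0, 0)
print(letter_a in all_cells)  # True
```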

The system allowed for rapid reading, because the fingertip could read all six bits at once, and could scan quickly to the next character, without having to move just to decode a single character. The first braille book was published in 1827. The system of braille writing revolutionized instruction for the blind and spread to other alphabets around the world.

Louis Braille died in 1852. His alphabet lives on. In an age of sound recordings and electronic-voice readers, braille is used somewhat less than it used to be. But blind and visually impaired people who can read braille are more successful economically than those who are not braille-literate.

Continue Reading “Jan. 4: Braille, Pitman Birthdays Celebrate New Ways to Write” »

Dec. 31, 1999: Horror or Hype? Y2K Arrives and World Trembles

John Koskinen

1999: The world braces for chaos as midnight approaches. Will computer systems crash when the calendar switches over to 2000?

Although the answer turned out to be “no,” and the so-called Y2K crisis never materialized, the potential for disaster seemed real enough in the days and weeks leading up to the final day of the 1900s. Fears within the computer industry, and the media frenzy they produced, certainly helped fan the flames.

The problem, as some saw it, was that older computers still being used for critical functions might break down when the date switched from 99 to 00: Programs written to store only the last two digits of any given year wouldn’t recognize the logic of a century change.

As far as these computers were concerned, it would be 1900, not 2000. How much data might be lost as the result of this 100-year miscalculation was the great, unanswered question.
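
A toy example makes the failure mode concrete. This is a sketch of the kind of shortcut many legacy programs relied on, not code from any actual system:

```python
def is_later(year_a, year_b):
    """Naive comparison of two-digit years, as many legacy programs stored them."""
    return year_a > year_b

print(is_later(99, 98))  # True: 1999 correctly sorts after 1998
print(is_later(0, 99))   # False: "00" (meant as 2000) sorts before "99" (1999)
```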

Y2K fears were real enough to make governments around the world take remedial action before the event, which had the unintended benefit of actually strengthening the existing computer infrastructure. Systems were upgraded or, when they couldn’t be replaced, were given additional backup. Billions of dollars were spent fixing the original source code in older computers.

If the threat was real — and there are still plenty of people around who say it was — then the precautions paid off. If Y2K was a form of mass paranoia — and plenty of people believe that, too — then a lot of money was wasted.

As for the midnight switchover itself, 1999 passed into history with barely a whimper. A few glitches were reported here and there, but nothing catastrophic occurred. The industry would be in crisis soon enough, but as Jan. 1, 2000, dawned, nobody saw that one coming yet.

Source: CNN

Photo: John Koskinen, chairman of the President’s Council on Y2K Conversion, warns reporters on Dec. 31, 1999, that it’s too soon to ring in the new year with a celebration, as the full impact of the Year 2000 Bug is still uncertain.
Heesoon Yim/AP

This article first appeared on Wired.com Dec. 31, 2007.

See Also:

Dec. 30, 1924: Hubble Reveals We Are Not Alone

Andromeda Galaxy

1924: Astronomer Edwin Hubble announces that the spiral nebula Andromeda is actually a galaxy and that the Milky Way is just one of many galaxies in the universe.

Before Copernicus and Galileo, humans thought our world was the center of creation. Then (except for a few notable stragglers) we learned that the sun and planets did not revolve around the Earth, and we discovered that our sun — though the center of our solar system and vitally important to us — was not the center of the universe or even a major star in our galaxy.

But we still grandiosely thought our own dear Milky Way contained all or most of the stars in existence. We were about to be knocked off our egotistical little pedestal once again.

Edwin Hubble was born in Missouri in 1889 and moved to Chicago in 1898. In high school, he broke the state record in the high jump, and went on to play basketball for the University of Chicago. He won a Rhodes scholarship and studied law at Oxford. He earned a Ph.D. in astronomy, but practiced law in Kentucky. After serving in World War I and rising to the rank of major, he got bored with law and returned to astronomy.

He trained the powerful new 100-inch telescope at Mount Wilson in Southern California on spiral nebulae. These fuzzy patches of light in the sky were generally thought to be clouds of gas or dust within our galaxy, which was presumed to include everything in the universe except the Magellanic Clouds. Some nebulae seemed to contain a few stars, but nothing like the multitudes of the Milky Way.

Hubble not only found a number of stars in Andromeda, he found Cepheid variable stars. These stars vary from bright to dim, and a very smart Harvard computer (as the observatory’s human number-crunchers were then called) named Henrietta Leavitt had discovered in 1912 that you could measure distance with them. Given the brightness of the star and its period — the length of time it takes to go from bright to dim and back again — you could determine how far away it is.
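
Here’s a rough Python sketch of the method. The period-luminosity coefficients below are an approximate modern calibration used purely for illustration, not Leavitt’s or Hubble’s actual numbers; the second step is the standard distance-modulus formula.

```python
import math

def cepheid_distance_parsecs(period_days, apparent_magnitude):
    """Estimate the distance to a classical Cepheid.

    Uses an illustrative period-luminosity relation,
    M = -2.43 * (log10(P) - 1) - 4.05 (approximate modern calibration),
    plus the distance modulus m - M = 5*log10(d) - 5.
    """
    absolute_magnitude = -2.43 * (math.log10(period_days) - 1) - 4.05
    return 10 ** ((apparent_magnitude - absolute_magnitude + 5) / 5)

# A hypothetical Cepheid with a 30-day period and apparent magnitude 18.5:
distance_pc = cepheid_distance_parsecs(30.0, 18.5)
print(f"{distance_pc * 3.26:,.0f} light-years")  # 1 parsec is about 3.26 light-years
```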

Hubble used Leavitt’s formula to calculate that Andromeda was approximately 860,000 light years away. That’s more than eight times the distance to the farthest stars in the Milky Way. This conclusively proved that the nebulae are separate star systems and that our galaxy is not the universe.

Cosmic though it was, the news did not make the front page of The New York Times. The paper did notice the following Feb. 25 that Hubble and a public health researcher split a $1,000 prize ($12,000 in today’s money) from the American Association for the Advancement of Science.

Hubble went on to discover a couple of dozen more galaxies. Before the 1920s were over, he added another astronomical achievement to his reputation: By analyzing the Doppler effect on the spectroscopic signals of receding galaxies, he established that their redshift was proportional to their distance.
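
In modern terms that proportionality is Hubble’s law, v = H₀ × d. A quick sketch, assuming a present-day value of the Hubble constant of roughly 70 km/s per megaparsec (Hubble’s own 1929 estimate was several times larger):

```python
H0 = 70.0  # km/s per megaparsec -- an approximate modern value, for illustration only

def recession_velocity(distance_mpc):
    """Hubble's law: recession velocity is proportional to distance, v = H0 * d."""
    return H0 * distance_mpc

def distance_from_velocity(velocity_km_s):
    """Invert the relation to estimate distance from a measured recession velocity."""
    return velocity_km_s / H0

print(recession_velocity(10.0))        # ~700 km/s for a galaxy 10 megaparsecs away
print(distance_from_velocity(7000.0))  # ~100 megaparsecs for a 7,000 km/s galaxy
```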

When the 200-inch Mount Palomar telescope was completed in January 1949, Hubble was honored to be the first astronomer to use it. He died in 1953. NASA named its space telescope after him.

Source: Various

Photo: Edwin Hubble’s 1920s observations of Andromeda (seen here in ultraviolet) expanded our notions of the size and nature of a universe that is itself expanding.
Galaxy Evolution Explorer image courtesy NASA.

This article first appeared on Wired.com Dec. 30, 2008.

See Also:

Dec. 29, 1766: He Put the Mac in Mackintosh

Charles Macintosh

1766: Charles Macintosh, who has no connection whatsoever to the computer of the same name, is born in Glasgow, Scotland. He will be remembered in tech annals as the inventor of rubberized, waterproof clothing. He’s remembered more generally for the raincoat that bears his name.

Macintosh, the son of a well-known dyemaker, developed an early interest in chemistry and science. By age 20 he was already running a plant producing ammonium chloride and Prussian blue dye. Around this time, he introduced some new techniques for dyeing cloth.

In partnership with a certain Charles Tennant, Macintosh developed a dry bleaching powder that proved popular, making a fortune for both men. The powder remained the primary agent for bleaching cloth and paper into the 1920s.

At the same time, though, Macintosh was experimenting with the idea of waterproofing fabric, using waste byproducts from the dye process. One byproduct he worked with was coal tar, which, when distilled, produced naphtha.

Macintosh found that naphtha — a volatile, oily liquid created in the distillation of the aforementioned coal tar, as well as petroleum — could be used to waterproof fabrics. In 1823, he patented what was the first truly waterproof fabric, supple enough to be used in clothing. He produced the desired results by joining two sheets of fabric with India rubber dissolved in naphtha.

When this concoction of his was later used to make a flexible, waterproof raincoat, the garment quickly became known as the mackintosh. (The extraneous “k” has never been explained.) The coat came into widespread use, both by the British army and by the general public.

Which is not to say it was all smooth sailing for Macintosh’s process. The fabric was vulnerable to changes in the weather, becoming stiffer in the cold and stickier in the heat. It was not especially good with wool, either, because that fabric’s natural oil caused the rubber cement to deteriorate.

Nevertheless, the waterproofing process was essentially sound and was improved and refined over time. It was considered effective enough to be used in outfitting an Arctic expedition led by 19th-century explorer Sir John Franklin.

Although he enjoyed his greatest success and lasting fame for his waterproofing process, Macintosh was no one-trick pony. In his capacity as a chemist, he helped devise a hot-blast process for producing high-quality cast iron.

Source: Today in Science

Illustration: Charles Macintosh enjoys the warm, dry indoors.
Painting by J. Graham Gilbert, RSA, engraved by Edward Burton.

This article first appeared on Wired.com Dec. 29, 2008.

See Also:

Dec. 28, 1879: Tay Bridge Collapses, Alas

Tay Bridge

1879: An iron railway bridge over Scotland’s River Tay collapses in a severe storm as a passenger train rolls across. The train plunges into the roiling river, killing everyone on board.

The lattice-girder bridge, designed by highly regarded railway engineer Sir Thomas Bouch, crossed the Firth of Tay between Dundee and Leuchars. It was built on the cheap, which turned out to be a hallmark (and a selling point) of Bouch’s work. The North British Railway, which commissioned the 2-mile-long bridge, was hewing to a tight budget, and Bouch was considered a master of the form.

Since buying prefabricated sections from established foundries was out of the question, the resourceful Bouch used iron produced in his own hastily constructed foundry. The quality was poor and the casting uneven. Additionally, Bouch didn’t bother calculating wind loads, even after altering his original design to include girders longer than 200 feet.

The underlying shoddiness, however, was lost in the sheer magnitude of the project. Former U.S. President Ulysses S. Grant visited Scotland to admire this feat of engineering.

The bridge opened June 1, 1878, to great fanfare: Queen Victoria took a ride across the mammoth structure not long after opening day, and Bouch was knighted for his efforts. The euphoria didn’t last long.

Even by Caledonian standards, the storm that hit the night of Dec. 28, 1879, was a rough one. The winds were so severe that the high tower at Kilchurn Castle on Loch Awe also fell, and hundreds of Scots had the roofs torn off their homes.

As the train thundered across the Tay, the high girders at the span’s midpoint gave way, sending the six-car train and the entire central section of the bridge crashing into the river. Based on the passenger manifest, it was determined that 75 people were on board. Fewer than 50 bodies were recovered, and there were no survivors. Well, there was one: The locomotive was later fished from the Tay and, after a refitting, returned to service. It became known to the railwaymen, ghoulishly and unofficially, as “The Diver.”

The official inquiry into the disaster destroyed Bouch’s professional reputation, concluding that the bridge was “badly designed, badly built and badly maintained, and that its downfall was due to inherent defects in the structure, which must sooner or later have brought it down.”

The final verdict: “For these defects … Sir Thomas Bouch is, in our opinion, mainly to blame. For the faults of design he is entirely responsible.” Disgraced, Bouch retired to the spa town of Moffat and died within the year.

A second bridge was built across the Tay and opened in 1887, just to the west of the original. That one is still standing.

There was a final indignity to bear, however. William Topaz McGonagall, a Dundee bard occasionally described as the worst poet in British history, was inspired to pen “The Tay Bridge Disaster.” Yikes!

Source: Various

Photo: The new Tay Rail Bridge replaced the poorly built original. It stands next to the piers of the old bridge.
Danger Mike/Flickr

See Also:

Dec. 24, 1968: Christmas Eve Greetings From Lunar Orbit

nasa_earthrise_1968_630px

1968: The crew of Apollo 8 delivers a live, televised Christmas Eve broadcast after becoming the first humans to orbit another space body.

Frank Borman, Jim Lovell and William Anders made their now-celebrated broadcast after entering lunar orbit on Christmas Eve, which might help explain the heavy religious content of the message. After announcing the arrival of lunar sunrise, each astronaut read from the Book of Genesis.

How this went down at the Baikonur Cosmodrome in the Soviet Union is unknown, but it stands in stark contrast to the alleged message sent back to Earth several years earlier by cosmonaut Yuri Gagarin, the first man in space.

“I don’t see any God up here,” Gagarin reportedly said from his vantage point aboard Vostok I, although the accuracy of that statement has been challenged over the years. True or not, the reactions were poles apart and did nothing to diminish the God-fearing-West–vs.–godless-commies propaganda campaign very prevalent in the United States at the time.

The crew of Apollo 8 didn’t claim to see God, either, but they were clearly impressed by His handiwork. “The vast loneliness is awe-inspiring, and it makes you realize just what you have back there on Earth,” Lovell said during another broadcast. (There were six broadcasts from the crew in all.)

But admiring the vastness of space was not Apollo 8’s primary mission. This was a pivotal step on the way to the ultimate goal of landing a man on the moon, which was achieved less than a year later. During a flight lasting six days and including 10 orbits of the moon, the Apollo 8 astronauts photographed the lunar surface in detail, both the near and far side, and tested equipment that would be used by Apollo 11’s crew for the eventual approach and landing.

The Apollo 8 command module is on display at Chicago’s Museum of Science and Industry.

Speaking of famous Christmas eve broadcasts, it’s worth remembering that Reginald Fessenden made what is generally recognized as the first public voice-over-radio broadcast on Dec. 24, 1906. Fessenden, a Canadian inventor, was in the midst of promoting his alternator-transmitter to potential buyers of his patent rights, among them representatives of American Telephone & Telegraph.

Like the Apollo 8 crew’s, Fessenden’s broadcast was of a pious nature. There was a reading from Luke, Chapter 2, and Fessenden himself played “O Holy Night” on the violin.

Being broadcast over radio waves meant Fessenden’s program was available to anyone with a receiver who was within range of his transmitter in Brant Rock Station, Massachusetts. In 1906, that audience was severely limited, consisting mostly of shipboard radio operators at sea off the New England coast.

Source: Various

Photo: In lunar orbit at Christmas 1968, Apollo 8 sent back the first televised view of an Earthrise.

Photo, video courtesy NASA

This article first appeared on Wired.com Dec. 24, 2008.

See Also:

Dec. 23, 1947: Transistor Opens Door to Digital Future

transistor_inventors_hr

1947: John Bardeen and Walter Brattain, with support from colleague William Shockley, demonstrate the transistor at Bell Laboratories in Murray Hill, New Jersey.

It’s been called the most important invention of the 20th century. The transistor is a semiconductor device that can amplify or switch electrical signals; that first working device was a point-contact transistor. It was developed to replace vacuum tubes.

Vacuum tubes were bulky, unreliable and consumed too much power. So AT&T’s research-and-development arm, Bell Labs, started a project to find an alternative.

For nearly a decade before the first transistor was developed, Shockley, a physicist at Bell Labs, worked on the theory of such a device. But Shockley couldn’t build a working model. His first semiconductor amplifier had a “small cylinder coated thinly with silicon, mounted close to a small, metal plate.”

So Shockley asked his colleagues, Bardeen and Brattain, to step in. One of the problems they noticed with Shockley’s first attempt was condensation on the silicon. So they submerged it in water and suggested the initial prototype have a metal point “that would be pushed into the silicon surrounded by distilled water.”  At last there was amplification — but disappointingly, at a trivial level.

Following more experiments, germanium replaced silicon, which increased amplification by about 300 times.

A few more modifications later, Brattain had a gold metal point extended into the germanium. That resulted in better ability to modulate amplification at all frequencies.

The final design of a point-contact transistor had two gold contacts lightly touching a germanium crystal that was on a metal plate connected to a voltage source. Also known as the “little plastic triangle,” it became the first working solid-state amplifier.

Bardeen and Brattain demonstrated the transistor device to Bell Lab officials Dec. 23, 1947. Shockley was reported to have called it “a magnificent Christmas present.” But Shockley himself was not present when it happened and was said to be bitter over losing out on that day.

He had his revenge, though. Shockley continued to work on the idea and refine it.  In early 1948, he came up with the bipolar or junction transistor, a superior device that took over from the point-contact type.

Bell Labs publicly announced the first transistor at a press conference in New York on June 30, 1948.

The transistor went on to replace bulky vacuum tubes and mechanical relays. The invention revolutionized the world of electronics and became the basic building block upon which all modern computer technology rests.

Shockley, Bardeen and Brattain shared the 1956 Nobel Prize in Physics for the transistor, but the trio never worked together again beyond the first few months following the transistor’s creation.

Shockley left Bell Labs and founded Shockley Semiconductor in Mountain View, California — one of the early high-tech companies in what would later become Silicon Valley.

Brattain remained a fellow at Bell Labs. Bardeen became a professor at the University of Illinois in 1951, and he shared a second Nobel Prize in Physics in 1972, for the first successful explanation of superconductivity.

Source: Various

Photo: William Shockley, John Bardeen and Walter Brattain work at Bell Labs in the late 1940s.
Courtesy Alcatel-Lucent/Bell Labs

See Also:

Dec. 22, 1882: Looking at Christmas in a New Light

Christmas Tree

1882: An inventive New Yorker finds a brilliant application for electric lights and becomes the first person to use them as Christmas tree decorations.

Edward H. Johnson, who toiled for Thomas Edison’s Illumination Company and later became a company vice president, used 80 small red, white and blue electric bulbs, strung together along a single power cord, to light the Christmas tree in his New York home. Some sources credit Edison himself with being the first to use electric lights as Christmas decorations, when he strung them around his laboratory in 1880.

Sticking them on the tree was Johnson’s idea, though. It was a mere three years after Edison had demonstrated that light bulbs were practical at all.

The idea of replacing the Christmas tree’s traditional wax candles — which had been around since the mid-17th century — with electric lights didn’t, umm, catch fire right away. Although the stringed lights enjoyed a vogue with the wealthy and were being mass-produced as early as 1890, they didn’t become popular in humbler homes until a couple of decades into the 20th century.

A general distrust of using electricity for indoor lighting, still widespread in the late 19th century, kept the popularity of Christmas lights low. They were most commonly seen ringing the seasonal display windows of big-city department stores.

In 1895, President Grover Cleveland (a New York stater himself) supposedly ordered the family’s White House tree festooned with multicolored electric lights. If he did, it barely moved the needle on the popularity scale. Even so, General Electric began selling Christmas-light kits in 1903.

Another New Yorker is generally credited with popularizing indoor electric Christmas lights. According to the story, Albert Sadacca, whose family sold ornamental novelties, became a believer in 1917 after reading the account of a bad fire caused by a candlelit tree bursting into flames.

Whether or not that’s the reason, Sadacca began selling colored Christmas lights through the family business. By then, the public’s distrust of electricity had diminished. So the timing was right, and sales took off.

With his brothers, Sadacca later started a company devoted solely to the manufacture of electric Christmas lights. He succeeded in roping a few competitors into a trade association, which proceeded to dominate the Christmas-light industry into the 1960s.

Source: Various

This article first appeared on Wired.com Dec. 22, 2008.

Photo: The 2009 White House Christmas tree in the Blue Room is adorned with LED lights.
J. Scott Applewhite/AP

See Also:

Dec. 21, 1898: The Curies Discover Radium

Pierre and Marie Curie

1898: Radium is discovered by the husband-and-wife team of Pierre and Marie Curie.

Sorbonne-bred physicist Pierre Curie had been noodling with crystals and magnetism since the early 1880s. He was a professor at the School of Physics in Paris when one of his students, Marie Sklodowska, caught his eye. They wed in 1895, and theirs was both a happy marriage and a fruitful professional collaboration.

A colleague of the Curies, Henri Becquerel, paved the way for their groundbreaking research with his discovery of spontaneous radioactivity in 1896. With the rest of the scientific community going gaga over Wilhelm Roentgen’s recent discovery of X-rays, Becquerel’s presentation to the Academy of Sciences aroused little interest. However, Marie Curie, casting about for a potential doctoral thesis, took note.

Dragging Pierre away from his crystals, Marie got the ball rolling on what would be the central pillar of their life’s work.

Following on Becquerel, the Curies succeeded in isolating element 84, polonium (named for Poland, the country of Marie’s birth), and then element 88, radium. It was Marie, in particular, who devised a method for separating radium from its radioactive residues, making possible the closer study of its therapeutic properties. This would remain a lifelong interest of hers.

The Curies and Becquerel shared the 1903 Nobel Prize for Physics for their associated research involving what the Nobel committee referred to as the “radiation phenomena.” Thus, Marie Curie became the first woman recipient of a Nobel Prize, nosing out Bertha von Suttner (Peace) by two years.

As for Pierre, his satisfaction over winning a Nobel was short-lived. He was killed in an accident on a Paris street in 1906. Marie continued with their work, taking over her husband’s position as professor of general physics on the Faculty of Sciences, then becoming director of the Radium Institute’s Curie Laboratory at the University of Paris in 1914.

Marie Curie received a second Nobel Prize in 1911, this time for Chemistry. She spent the rest of her life in science, much of it promoting the healing properties of radium. In 1929, five years before her death, Curie founded a radiation laboratory in her native Warsaw.

One of the Curie daughters, Irene, later became a Nobel recipient as well, also in collaboration with her scientist-husband.

Curium, element 96, is named in honor of Pierre and Marie Curie. Francium, element 87, is named for France, site of the Curie Institute where it was discovered.

The curie is the international unit of measurement for radioactivity. Although originally defined as the radioactivity of 1 gram of pure radium, it is now specified as 3.7 × 10¹⁰ atomic disintegrations per second, or 37 gigabecquerels.
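
For the unit-conversion-minded, the arithmetic is simple. A one-function Python sketch of the definition above (function and variable names are ours, for illustration):

```python
CURIE_IN_BECQUERELS = 3.7e10  # 1 Ci = 3.7 x 10^10 disintegrations per second

def curies_to_becquerels(activity_curies):
    """Convert an activity in curies to becquerels (one disintegration per second)."""
    return activity_curies * CURIE_IN_BECQUERELS

print(curies_to_becquerels(1.0))    # 3.7e10 Bq, i.e. 37 gigabecquerels
print(curies_to_becquerels(0.001))  # one millicurie = 3.7e7 Bq
```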

Source: Nobel Foundation

Photo: Associated Press

See Also: