Archive for the ‘19th century’ Category

Aug. 15, 1877: ‘Hello. Can You Hear Me Now?’

1877: Thomas Edison suggests using the word hello as a telephone greeting. The idea catches on.

Edison invented a lot of things, for sure, but one thing he didn’t invent was the telephone. The brass ring for that one goes to Alexander Graham Bell, although Elisha Gray filed a patent caveat for a similar device the same day. But they never called it Ma Gray, did they?

Edison’s contribution to the “improvement in telegraphy” was giving us the salutation now used the world over, in one form or another. Bell’s famous first words spoken over what we now call the telephone — “Mr. Watson, come here. I want to see you.” — were delivered without any greeting at all.

When he did weigh in on the subject, Bell proposed using “ahoy, ahoy,” the age-old seafarer’s hail. And, in fact, ahoy was the first greeting used, until Edison suggested hello.

At the time, the phone was conceived of as a business machine that would connect two offices with a permanently open line. Some people toyed with the idea of an alarm bell at each end to alert one office that the other office wanted to speak. On Aug. 15, 1877, Edison wrote to a friend who was setting up a phone system in Pittsburgh: “I don’t think we shall need a call bell as Hello! can be heard 10 to 20 feet away. What do you think?”

Contrary to some accounts, Edison did not coin the word. Halloo and variants had been used for ages to urge on hunting hounds and to shout to people at a distance. Edison was tinkering with a prototype phonograph in 1877 and used a shouted halloo! for testing. Early gramophones and telephones alike had pretty low signal-to-noise ratios.

Hello itself turns up in a number of places prior to 1877, including Mark Twain’s travelogue, Roughing It, published four years before Bell called Mr. Watson. Earlier references to the word also exist, one dating back to at least 1826.

In any case, hello caught on quickly and entered the dictionary in 1883. And when was the last time you had to look up that spelling?

Source: Various

Photo: Thomas Edison in 1876.

This article first appeared on Wired.com Aug. 15, 2008.

Aug. 8, 1876: Run This Off on the Mimeo

1876: Thomas Edison receives a patent for the mimeograph. It will dominate the world of small-press-run publication for a century.

Before the inkjet printer, before the laser printer, before the dot-matrix printer, before the photocopier, there came the mimeograph machine. Mimeos were everywhere — in schools, offices and the military.

If you needed just a few copies of a document, you used carbon paper. If you needed thousands (and had the time and the budget), you could send it to a print shop for typesetting and publication.

But if you needed something in between, say 30 copies for a classroom handout (or test!) or 500 or 1,000 for a church bulletin or incendiary revolutionary poster, you had the mimeograph.

Before the light bulb lit up in his laboratory, before he pioneered the power station, before he recorded “Mary had a little lamb” on the first practical phonograph, before he made motion pictures work and then made motion pictures, before 1,000 or so other inventions and improvements great and small, Edison invented the mimeograph.

Those of us who are old enough to remember the mimeo can probably conjure up the smell of its ink — especially ink for the Ditto machine, or spirit duplicator, which handled the smaller press runs. Those who actually used to “run things off” on the machines probably remember the look and feel of their sometimes-delicate stencils.

Those who are younger may not even know how the word is pronounced. It’s MIM-EE-oh-graf, not MYME-oh-graf or MEEM-oh-graf. Ask your parents.

The process is simple: Cut a stencil, push ink through the holes onto paper, and repeat. The business model is also simple: Sell the machine, sell the stencils, sell the ink — maybe even sell the paper, but there might be competition there.

Edison’s 1876 patent covered a flatbed duplicating press and an electric pen for cutting stencils. Chicago inventor Albert Blake Dick improved the stencils while experimenting with wax paper and merged his efforts with Edison’s. The A.B. Dick Co. released the Model 0 Flatbed Duplicator in 1887. It sold for $12 (about $284 in today’s money).

If you didn’t want to use the electric pen, you could try cutting a stencil with one of those newfangled typewriters. But hand-drawn stencils persisted well into the 20th century for sentence diagrams, scientific illustrations and mathematical formulas that were beyond the scope of the typewriter keyboard.

Later models replaced Edison’s original flatbed press and hand roller for the ink with a rotating cylinder and an automatic feed from the ink reservoir. Deluxe models included an electric motor. You could also get cheaper ones that you had to crank by hand.

The A.B. Dick Co. believes almost every U.S. military personnel order of World War II was run off on one of its machines. And so central is the mimeograph to the history of 20th-century education that the Columbia University Teachers College planned a special library exhibit on the mimeograph in 2008.

We saw that on the web, not on a mimeographed flyer.

Source: Various

Photo: Employees of Vermont’s state purchasing department use mimeograph machines in 1959. (Courtesy Vermont State Archives Photograph Collection)

This article first appeared on Wired.com on 08/08/08. How’s that for duplicating?

Aug. 2, 1873: San Francisco’s First Cable Car Conquers Nob Hill

1873: Andrew Hallidie tests the first cable car in San Francisco.

Hallidie is said to have conceived his idea in 1869 while watching a team of horses being whipped as they struggled to pull a car up wet cobblestones on Nob Hill. They slipped and were dragged to their deaths.

It so happened that Hallidie’s father held the British patent for wire-rope cable, and when the son came to the Gold Rush fields he put it to use hauling ore-laden cars from mines. So it wasn’t too much of a stretch for him to envision horseless cable cars carrying passengers up the steep slopes of San Francisco’s hills.

He formed the Clay Street Hill Railroad and was awarded a contract to build the city’s first cable car line up Nob Hill. Fourteen months later, on Aug. 2, 1873, the first cable car made its way up Clay Street. It was an unqualified success. Regular passenger service began a month later, and cable cars have been operating in San Francisco ever since.

A number of cable car lines and companies sprang up in the wake of Hallidie’s success. At its high-water mark, prior to the great earthquake and fire of 1906, 53 miles of cable car track stretched to virtually every corner of town.

A vast underground pulley system moves a cable at a steady 9 mph, pulling the cable cars along the tracks. The operator, or gripman, uses a lever to grip the cable moving beneath the street.

In the late 1940s and ’50s, there was a move to dismantle San Francisco’s cable car system, backed largely by politicians who some say were in the pockets of the oil and tire companies that had a vested interest in seeing buses replace the cable cars. The system would probably have vanished if not for the efforts of one San Franciscan in particular, Friedel Klussmann, who founded the Citizens’ Committee to Save the Cable Cars and battled City Hall every step of the way.

When the issue finally made it to the ballot, San Franciscans voted overwhelmingly to keep their cable cars. Today, three lines — Powell-Hyde, Powell-Mason and California Street — continue to operate in the City by the Bay.

Source: San Francisco Municipal Railway

Photo: The Clay Street Hill Railroad, which started public service in 1873, was the first cable car company in San Francisco. (Courtesy Cable Car Museum)

This article first appeared on Wired.com Aug. 2, 2007.

July 28, 1858: Press Down Firmly, You’re in Our Files Now

1858: A British colonial magistrate in India starts using fingerprints as a means of identifying people. It’s the first known official use of the technique in modern times.

Like many innovations, this one wasn’t completely new. Ancient Babylonian clay tablets recording business transactions were sometimes “sealed” with fingerprints. Officials in ancient Rome may have solved one murder by matching the culprit’s hand to a bloody handprint.

China’s Tang Dynasty (618 to 907 A.D.) used fingerprints as a form of identification. A thumbprint was a legal signature for documents in Japan around the same time. A medieval Persian official noticed that the fingerprints on government documents were unique to the individual.

British physician Nehemiah Grew lectured in 1684 on the ridge patterns on fingerprints. Italian doctor Marcello Malpighi wrote about the same subject just two years later. An 1823 doctoral dissertation by Johannes Purkinje at the University of Breslau classified fingerprints into nine types.

Purkinje studied the ridges, spirals and loops with a microscope, another first in fingerprint study. But neither he nor Malpighi commented on fingerprints’ potential use for identification.

William James Herschel served as a magistrate at Nuddea, India. At his request, local businessman Rajyadhar Konai made a handprint on the back of a contract July 28, 1858. Herschel wasn’t initially trying to use the system for personal identification. He merely wanted to “frighten [Konai] out of all thought of repudiating his signature.”

Herschel liked the idea and made it a regular requirement for Indians executing documents. He soon moved from using palm prints to just taking impressions of the right index and middle fingers. And he began to notice that no two prints were identical, and that prints didn’t change as an individual grew older. His 1877 request to use fingerprints to identify inmates at a Bengali prison was denied, but the concept was moving from civil law into criminal law.

It would be 1892 before Argentine police official Juan Vucetich clearly established modern fingerprinting as a way to solve crimes and prove guilt. Vucetich was born in … 1858.

Source: Various

Photo: Jim Merithew/Wired.com

This article first appeared on Wired.com July 28, 2008.

July 18, 1876: Royal Commissioners Wrinkle Their Noses

1876: The British government appoints a Royal Commission on Noxious Vapours to look into the growing problem of industrial air pollution. Its report two years later would bring better regulation but warn against impeding economic growth.

England had been trying to do something about air quality for centuries. King Edward I in 1306 prohibited burning sea coal in London, because of all the smoke it caused. By act of Parliament, anyone who sold and burned the outlawed coal could be punished by torture or hanging. Richard II and Henry V issued further regulations and restrictions in the following centuries.

The Industrial Revolution worsened things, with factories putting out a toxic soup of new pollutants. The 1853 Smoke Nuisance Abatement (Metropolis) Act provided for an inspector to work with the metropolitan police to reduce “nuisance from the smoke of furnaces in the metropolis and from steam vessels above London Bridge.” A similar act four years later applied to Scotland.

A new process for manufacturing alkali (sodium carbonate, used in making glass and other products) was releasing huge volumes of the byproduct hydrochloric acid into the air. That led to a deluge of lawsuits and a loud public outcry, and to the passage of the Alkali Act in 1863. The act required a minimum 95 percent capture of the acid and set a dilution standard for what was emitted: no more than 0.2 grains of HCl per cubic foot.
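
For readers who don’t think in Victorian units, the limit is easy to restate in metric terms. Here is a minimal sketch in Python, using the standard conversion factors for the grain and the cubic foot; the metric figure is my own arithmetic, not a number from the act:

    # Convert the Alkali Act emission limit of 0.2 grains of HCl
    # per cubic foot of exhaust into modern metric units.
    GRAIN_IN_MG = 64.79891        # milligrams per grain
    CUBIC_FOOT_IN_M3 = 0.0283168  # cubic metres per cubic foot

    limit_mg_per_m3 = 0.2 * GRAIN_IN_MG / CUBIC_FOOT_IN_M3
    print(round(limit_mg_per_m3))  # roughly 458 mg of HCl per cubic metre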

Chief inspector Robert Angus Smith and four assistant inspectors worked with manufacturers to show them how to transform what would be pollution into marketable byproducts. The Alkali Act was extended and amended in 1874 to require manufacturers to use the “best practicable means” of controlling the acid vapors.

Still, things were so bad by 1876 that the Conservative government of Prime Minister Benjamin Disraeli appointed the Royal Commission on Noxious Vapours. The commissioners visited industrial areas around England, inspecting “alkali works, cement works, chemical manure works, coke ovens, copper works of all descriptions, glass, lead and metal works, potteries and salt works.”

The commission asked 14,000 questions of 196 witnesses, including “manufacturers, landowners, farmers, clergymen, occupiers of houses, lands and gardens, land-agents, scientific witnesses, medical persons, local officers” and the Alkali Act inspectors.

Witnesses complained of damage to trees, crops, vegetation and human health. They said the noxious industrial gases were carried far and wide by the wind and caused coughing, difficulty breathing and nausea. The alkali manufacturers gave the commission a statement rebutting the allegations.

The commission made 10 recommendations in August 1878. New legislation increased the frequency of inspections and made the inspectors’ reports public records. The commission concluded that “it is not a question of a few manufactories, but of industries all over the country, which in relation to man are causing pollution of the air in degrees sufficient to make them common-law nuisances.” So, the Alkali Acts were extended to include the production of sulfuric acid, chemical fertilizer works and coke ovens.

But the witnesses who had argued that noxious vapors were inevitable if the nation was to prosper had their effect. The commission noted that regulation was only practical if it did not involve “ruinous expenditure.” And courts remained reluctant to shut down polluters if the result would destroy the industry of a town.

London’s killer smog of December 1952 claimed as many as 12,000 lives. Britain passed its Clean Air Act in 1956. The United States passed a weak Clean Air Act in 1963 and strengthened it in 1970.

Source: Various

Image: As in many other British cities in the 19th and 20th centuries, Manchester’s mills polluted the skies. Engraving by Edward Goodall (1795-1870), from an 1852 painting by William Wyld (1806-1889).

This article first appeared on Wired.com July 18, 2008.

July 14, 1850: What a Cool Idea, Dr. Gorrie

1850: Florida physician John Gorrie uses his mechanical ice-maker to astonish the guests at a party. It’s the first public U.S. demonstration of ice made by refrigeration.

William Cullen had demonstrated the principle of artificial refrigeration in a University of Glasgow laboratory in 1748, by allowing ethyl ether to boil into a vacuum. American Oliver Evans designed in 1805 — but never built — a refrigeration machine that used vapor instead of liquid. Jacob Perkins used Evans’ concept for an experimental volatile-liquid, closed-cycle compressor in 1834.

Nonetheless, mid-century cooling in the tropics and subtropics — and in the temperate summer — relied on natural ice blocks carved from frozen lakes and rivers in the North, kept in shaded sheds and cellars under layers of sawdust for insulation, and often delivered at great expense by specially fitted ice ships.

Gorrie was born in the tropics, on the Caribbean island of Nevis. He received his medical education in New York state before settling in the Florida cotton-shipping port of Apalachicola. There, he served at various times as mayor, justice of the peace, postmaster and bank president, besides carrying on his medical practice.

It would be another half-century before the causes of the killer diseases malaria and yellow fever were discovered, but Dr. Gorrie knew they relied on heat and moisture to propagate. He urged the draining of swamps and the enforcement of hygiene in the town’s food market.

Gorrie also sought to improve the survival rate of his feverish patients by cooling them down. He suspended pans of ice water high in their sickrooms, so the cooled, heavy air would flow downward.

But ice was expensive in the Florida summer and often completely unavailable. Gorrie wanted to make ice mechanically. He wrote:

If the air were highly compressed, it would heat up by the energy of compression. If this compressed air were run through metal pipes cooled with water, and if this air cooled to the water temperature was expanded down to atmospheric pressure again, very low temperatures could be obtained, even low enough to freeze water in pans in a refrigerator box.
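
Gorrie’s description is essentially the air-cycle refrigeration principle: compress air (it heats up), dump that heat into cooling water, then let the air expand back to atmospheric pressure and it comes out far colder than it went in. As a rough illustration, here is a minimal back-of-the-envelope sketch of the ideal, reversible-adiabatic case; the 4-to-1 pressure ratio and the 27-degree cooling water are assumptions chosen for illustration, not figures from Gorrie’s patent, and a leaky 19th-century machine would fall well short of these numbers.

    # Idealized estimate of the compressed-air cooling cycle Gorrie describes.
    # Assumes dry air behaves as an ideal gas and that compression and expansion
    # are reversible and adiabatic; all numbers here are illustrative only.
    GAMMA = 1.4  # heat-capacity ratio for dry air

    def adiabatic_temp(t_kelvin, pressure_ratio):
        # Temperature after an adiabatic pressure change by the given ratio.
        return t_kelvin * pressure_ratio ** ((GAMMA - 1) / GAMMA)

    ambient = 300.0  # about 27 C: the assumed cooling-water temperature
    ratio = 4.0      # assumed compression to 4 atmospheres

    heated = adiabatic_temp(ambient, ratio)       # air heats as it is compressed
    chilled = adiabatic_temp(ambient, 1 / ratio)  # water-cooled air expanded back to 1 atm

    print(round(heated - 273.15))   # roughly 173 C after compression
    print(round(chilled - 273.15))  # roughly -71 C after expansion, cold enough to freeze water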

Gorrie began tinkering with compressor-coolers and had a working model by the mid-1840s. The power source was irrelevant to his invention: It could be driven by wind, water, steam or the brute force of an animal.

He applied for patents in 1848 and had a prototype built in Ohio by the Cincinnati Iron Works. It was described in Scientific American the following year, but Gorrie still had to attract venture capital to fight the existing ice-block industry.

He arranged a dramatic demonstration of his machine for a social, rather than medical, occasion. It was a muggy July in Florida. Ice from the North had been exhausted. Gorrie attended an afternoon reception given by the French consul to honor Bastille Day.

The doctor first complained about drinking warm wine in hot weather, then suddenly announced, “On Bastille Day, France gave her citizens what they wanted. [Consul] Rosan gives his guests what they want, cool wines! Even if it demands a miracle!”

Then he signaled for waiters to enter with bottles of sparkling wine on trays of ice. It was a sensation: mechanically made ice in the sweltering Florida summer. Smithsonian magazine dubbed that party the “chilly reception.”

Gorrie received a British patent a month later and U.S. patent 8,080 on May 6, 1851, but he failed at business. His business partner died, and Gorrie’s inefficient, leaky machines were mocked in the press by the ice-shipping establishment. He died in poverty and ill health in 1855, still in his early 50s. It would take Frenchman Ferdinand P.E. Carre’s closed, ammonia-absorption system (patented in 1860) to make way for practical, widespread mechanical refrigeration.

Florida has honored Gorrie by placing his statue in the National Statuary Hall collection in the U.S. Capitol. (The other Florida statue is Confederate Gen. Edmund Kirby Smith.)

So, have a happy Bastille Day (or joyeuse Fête Nationale), chill out and lift a cold one to the father of refrigeration. You can use the very words spoken more than a century-and-a-half ago: “Let us drink to the man who made the ice: Dr. Gorrie.”

Source: Smithsonian magazine, John Gorrie State Museum

Diagram: U.S. Patent 8,080; May 6, 1851.

This article first appeared on Wired.com July 14, 2008.

July 6, 1885: Rabies Vaccine Saves Boy — and Pasteur


1885: Louis Pasteur successfully tests his rabies vaccine on a human subject.

Pasteur, a French chemist and biologist, began closely studying bacteria while investigating the cause of souring in milk and other beverages. This led him to develop the process of pasteurization, in which a liquid is heated to just below boiling and then cooled, killing the bacteria that cause the souring.

Pasteur moved on into a more thorough study of bacteria, enabling him to prove that these microscopic organisms occurred naturally in the environment and did not simply appear spontaneously, as was then generally believed.

As the director of scientific studies at the Ecole Normale in Paris, Pasteur pursued his germ theory, which posited that germs attack the body from the outside. He was proved right again, and his work led to vaccines being developed for many germ-borne diseases, including anthrax, cholera and tuberculosis. It also led to further work on rabies, which was much more prevalent in Pasteur’s time than it is today.

He developed his rabies vaccine by growing the virus in rabbits, then drying the affected nerve tissue to weaken the virus.

On July 6, 1885, the vaccine was administered to Joseph Meister, a 9-year-old boy who had been attacked by a rabid dog. The boy survived and avoided contracting rabies, which would have almost certainly proved fatal.

Good thing it worked: Pasteur was not a licensed physician and could have been prosecuted had the vaccine failed. The legalities were forgotten, and Pasteur instead became a national hero.

Source: Various

This article first appeared on Wired.com July 6, 2007.

Painting: Portrait of Louis Pasteur by Albert Edelfelt

July 1, 1858: Darwin and Wallace Shift the Paradigm


1858: The Linnaean Society of London listens to the reading of a composite paper on how natural selection accounts for the evolution and variety of species. The authors are Charles Darwin and Alfred Russel Wallace. Modern biology is born.

Scientists of the time knew that evolution occurred. The fossil record showed evidence of life forms that no longer existed. The question was, how did it occur?

Darwin had been working on his theory since 1837, soon after his epic voyage on the HMS Beagle. The hypermethodical naturalist wanted not only to classify the prodigious variation he had observed, but also to explain how it came to be.

He felt he would need to publish extensive documentation of natural selection to overcome popular resistance to so radical a notion. So he planned a comprehensive, multivolume work to convince scientists and the world.

Darwin was still working on his magnum opus when in June 1858 he received a letter from an English naturalist working in the Malay Archipelago. Alfred Russel Wallace was young and brash. When he conceived of natural selection, he didn’t plan a 10-volume lifework. He just dashed off a quick paper on the subject and mailed it to the author of The Voyage of the Beagle, asking him to refer it for publication if it seemed good enough.

Darwin was crestfallen. Was he about to lose credit for two decades of work? Wallace had suggested that Darwin forward the paper to Scottish geologist Charles Lyell. Along with English botanist Joseph Hooker, Lyell was one of the small handful of people to whom Darwin had shown early drafts of his own work on natural selection.

Darwin wrote to Lyell and Hooker, and they arranged for a joint paper to be read at the forthcoming meeting of the Linnaean Society of London. (Founded in 1788 and named for Carl Linnaeus, the Swedish scientist who devised the binomial system of taxonomy, it is the world’s oldest active biological society.)

Neither Darwin nor Wallace attended the meeting. Wallace was still in the Malay Archipelago. Darwin was at home with his wife, mourning the death of their 19-month-old son just three days earlier.

The secretary of the society read the 18-page paper, comprising four parts:

  1. Lyell and Hooker’s own letter of introduction explaining the extraordinary circumstances;
  2. An excerpt from Darwin’s unpublished draft, part of a chapter titled, “On the Variation of Organic Beings in a State of Nature; on the Natural Means of Selection; on the Comparison of Domestic Races and True Species”;
  3. An abstract of Darwin’s 1857 letter on the subject to Harvard University botanist Asa Gray;
  4. Wallace’s manuscript, “On the Tendency of Varieties to Depart Indefinitely From the Original Type.”

The paper and the meeting did not cause an immediate sensation. Other papers were read the same day. The society had routine business to transact. The meeting was long. But the paper was accepted for publication in the society’s Proceedings later that year.

Was this a remarkable case of simultaneous discovery? Not quite. It was more like simultaneous announcement. What is remarkable is that both Darwin and Wallace credited their central insight to reading Thomas Malthus’ An Essay on the Principle of Population, first published in 1798.

Darwin read Malthus in 1838 and immediately realized how it applied to his own work. Wallace had read it around 1846, but first saw its import for explaining evolution while he lay recovering from fever in the Malay Archipelago a dozen years later.

Malthus observed that population was held in check because not every individual would survive to reproduce. As Wallace wrote, “It suddenly flashed upon me … in every generation the inferior would inevitably be killed off and the superior would remain — that is, the fittest would survive.”

June 29, 1888: Handel Oratorio Becomes First Musical Recording

1888: The earliest known musical recording is made. The piece, Georg Friedrich Handel’s Israel in Egypt, is recorded on a paraffin cylinder.

Israel in Egypt, assigned the catalog number HWV 54, is an oratorio, a form in which Handel excelled. Like his more famous Messiah, Israel in Egypt is composed using biblical passages, mainly from Exodus and the Psalms.

Unlike the Messiah, however, it didn’t enjoy much of a reception when it premiered in 1739. As a result, Handel shortened the work and inserted a few Italian arias to lighten the mood a bit.

Nevertheless, it was selected by Col. George Gouraud, Thomas Edison’s foreign sales agent, for the first musical recording. Gouraud made his recording in London’s Crystal Palace, using Edison’s yellow paraffin cylinder — candle wax, essentially.

Source: Stanford University, National Park Service

Image: Georg Friedrich Handel was a German-born British baroque composer.

This article first appeared on Wired.com June 29, 2007.

June 27, 1898: Down to the Sea in Ships, and Then Some

1898: Joshua Slocum completes a solo voyage lasting nearly three years, becoming the first sailor to circumnavigate alone.

Slocum, born within sight of Nova Scotia’s Bay of Fundy in 1844, ran away from home at 14 and signed on a fishing schooner as cabin boy to begin a lifetime at sea. He later crossed the Atlantic and became an ordinary seaman on the Tangier, a British merchantman. By 18, he had received his papers from the Board of Trade qualifying him as a second mate.

Landing in California, Slocum received his first command there and spent 13 years sailing out of San Francisco, taking square-rigged ships to Japan, China, Australia and the Spice Islands (the Moluccas of present-day Indonesia), as well as engaging in the coastwise lumber trade.

Several ships, two wives and two sons later — his first wife died in Argentina — Joshua Slocum found himself back on the East Coast, in possession of a rotting old oyster sloop called the Spray. He would make history with this boat.

He spent the next few years restoring the Spray and rigging her for solo sailing. In 1895, at age 51, Slocum set out to be the first sailor ever to make a solo circumnavigation. The 37-foot Spray left Boston in April 1895 with her original sloop rig, but difficulties in the Strait of Magellan would cause Slocum to re-rig her as a yawl for the remainder of the voyage.

One peculiarity of Slocum’s sailing was his decision to eschew the chronometer — in favor of using a sextant and the ancient method of dead reckoning — for fixing his longitudinal position at sea.

It was an eventful passage. Chased by pirates, feted by island kings and almost drowned a couple of times in storms, Slocum sailed 46,000 miles, staying for weeks and sometimes months at various stops along the way. His longest time at sea without making landfall was 72 days in the Pacific.

In addition to his seafaring skill, Slocum was an accomplished writer. His account of the voyage, Sailing Alone Around the World, is considered a classic of adventure literature. He begins his story thus:

I had resolved on a voyage around the world, and as the wind on the morning of April 24, 1895, was fair, at noon I weighed anchor, set sail and filled away from Boston, where the Spray had been moored snugly all winter. The 12 o’clock whistles were blowing just as the sloop shot ahead under full sail.

A short board was made up the harbor on the port tack, then coming about she stood to seaward, with her boom well off to port, and swung past the ferries with lively heels. A photographer on the outer pier of East Boston got a picture of her as she swept by, her flag at the peak throwing its folds clear.

A thrilling pulse beat high in me. My step was light on deck in the crisp air. I felt there could be no turning back, and that I was engaging in an adventure the meaning of which I thoroughly understood.

Kind of makes you want to dump your stupid computer and run off to sea, doesn’t it?

Sailing Alone earned Slocum a lot of money, enabling him to buy his first home on land — though characteristically offshore — on Martha’s Vineyard in 1902.

Although sales of the book remained brisk during the first several years of the 20th century, they were waning by 1908. Slocum was suddenly hurting for money and decided to sail south this time, to the Orinoco River in Venezuela, with the idea of gathering material for another book. Luck was not with him on this voyage, however, and the Spray, while still seaworthy, was not what she had been a decade earlier.

Slocum set sail for the West Indies in November 1909 and was never heard from again. He wasn’t officially declared dead until 1924.

A World War II Liberty ship, SS Joshua Slocum, was named for the doughty mariner.

Source: Various

Image courtesy Joshua Slocum Society International

This article first appeared on Wired.com June 27, 2008.