Archive for the ‘Inventions’ Category

April 21, 1987: Feds OK Patents for New Life Forms

1987: The U.S. Patent and Trademark Office announces it will begin accepting patent applications for animals.

A year later, Harvard University was awarded the first such patent — the Oncomouse, a mouse researchers produced to be especially susceptible to getting cancer.

More than two decades later, the government has issued about 800 animal patents — on everything from cats, cattle, chimps, dogs, fish and horses to sheep. They are used in almost every field, from cosmetics to medicine. Some even blend humans and animals.

There are pigs with human blood, and rabbit eggs fused with DNA to help crippled mice walk.

The American Anti-Vivisection Society, which staunchly abhors animal patenting, estimates that up to 50 million animals are used in genetic engineering experiments annually in the United States alone — all in a bid to create what the group calls “unnatural new animals.”

The government has rejected some of the most controversial proposed patents, including the “humanzee,” a half man, half chimp. It was denied in 2005 because it was too human.

The patent office’s 1987 decision said the government “now considers non-naturally occurring nonhuman multicellular living organisms, including animals, to be patentable subject matter.”

The move, which has provoked tense ethical debate, came seven years after the Supreme Court first recognized as intellectual property genetically modified bacteria that could eat crude oil (.pdf). That decision also paved the way for the patenting of human genes — which hit a legal crossroads last month: A federal judge invalidated a human gene patent, raising doubts about the validity of 2,000 other gene patents.

One of the most controversial approved animal patents centered on the University of Texas’ treatment of beagles. The university won a patent in 2002 on infecting the dogs with mold if they lived through weeks of daily radiation doses. After a public outcry, the university disclaimed the patent in 2004.

Another controversial patent covered rabbits whose eyes were glued open so researchers could test corneal medications. Four years after the government issued the patents to Biochemical and Pharmacological Laboratories of Japan, it rescinded them in 2009 on grounds of “obviousness.”

Even the methods for cloning animals are patentable, such as the process that produced Dolly, the first mammal cloned from an adult cell, in 1996. Ian Wilmut, who with a team of others cloned the sheep in Scotland, said the protocol would work with humans.

Human cloning would help us multitask in a wired world. How to explain you’ve been cloned on your Facebook profile is another story.

Source: Various

Screenshot: DuPont’s website touts the Oncomouse.

See Also:

April 15, 1452: It’s the Renaissance, Man!

1452: Leonardo da Vinci, one of the greatest multitalented artists in history, is born in the Tuscan hill town of Vinci. Painter, sculptor, anatomist, architect, engineer, geologist: The labels don’t even begin to describe him.

Da Vinci’s influence is so diverse that few haven’t heard of him or his work — the paintings Mona Lisa and The Last Supper, the drawing of Vitruvian Man, and the incredible sketches of machines imagined long before they would become possible.

An illegitimate child, da Vinci started an apprenticeship at age 15 in Florence with the renowned Italian painter Andrea di Cione, aka Verrocchio. Da Vinci contributed to pieces produced by Verrocchio’s workshop, including painting an angel in the masterpiece “Baptism of Christ.” But by 1477, he had left to strike out on his own.

For the next 17 years, da Vinci spent his time with the Duke of Milan, not just painting and sculpting but designing buildings and machinery. It was to be one of the most amazing periods of the amazingly prolific artist’s life. Da Vinci’s work covered four major categories: painting, architecture, mechanics and human anatomy.

In art, the Mona Lisa is his greatest triumph. Da Vinci worked on La Gioconda, or La Joconde, as the painting is also known, for about four years, took a break, and then spent another three years finishing it. The 31-by-21-inch painting is among the world’s most recognizable pieces of art.

Leonardo’s other artworks have also inspired today’s pop culture: Dan Brown’s Da Vinci Code tries to interpret the symbolism in the Last Supper.

Da Vinci’s artistic side existed alongside a sharp technical mind. Some of his most fascinating works are his drawings in engineering and anatomy. Da Vinci conceptualized and drew the idea for a flying machine that’s similar to a modern helicopter. His sketches show the concept of a parachute and a lightweight hang glider.

His design notes even contain the idea for a robot — a humanoid machine that could open and close its anatomically correct jaw and carry out motions such as sitting up and moving its arms and neck. Computer models by Florence’s Institute and Museum of the History of Science show the robot was technically feasible, a result of da Vinci’s extensive knowledge of human anatomy.

What’s also fascinating is his design of weapons. Da Vinci sketched what we can recognize today as a tank or armored vehicle, a scythed chariot and a multibarreled gun. None of these weapons seems ever to have been built, but the designs certainly had potential.

The manuscripts from his work are now valued in the millions, with Bill Gates reportedly paying about $30 million in 1994 for the Codex Leicester — a record on linen paper of da Vinci’s thoughts spanning a wide range of topics.

Da Vinci died May 2, 1519, in Cloux, France.

Source: Various

Image: Leonardo da Vinci’s Vitruvian Man

See Also:

April 9, 1860: Phonautogram Records Sound, But Doesn’t Reproduce It

1860: Seventeen years before Thomas Edison records what was long thought to be the first sound ever captured in a fixed medium, Parisian typesetter and inventor Édouard-Léon Scott de Martinville’s phonautograph records sound visually onto paper.

Neither Scott nor any of his contemporaries devised a way to turn these visual phonautogram recordings back into sound, and they lay silent for a century-and-a-half.

The phonautograph was a complicated, rudimentary, completely mechanical apparatus. It contained all of the basic principles of modern recording, just without electricity.

A thin membrane on one end of a plaster-of-paris barrel vibrated a stylus. The stylus pressed against sheets of paper that had been blackened by oil-lamp smoke and attached to a rotating cylinder. The sound waves thus left a mark on the rotating black paper.

Scott had indeed recorded sound. He just didn’t have any means to play it back. He viewed his recording as an automatic dictation device whose patterns people would eventually be able to read with their eyes. It would be 1877 before Thomas Edison made and played back his famous recording of “Mary Had a Little Lamb,” thus successfully reproducing sound.

A group of American audio historians discovered a Scott phonautogram in a Paris archive in 2008. Researchers at Lawrence Berkeley National Laboratory directed an optical “virtual stylus” at high-resolution scans of the sooty original recording to re-create the sound.
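The virtual-stylus idea, in which each column of the scan is one instant in time and the vertical position of the trace is the waveform, can be sketched roughly as follows (a toy illustration of the principle, not the Berkeley lab’s actual pipeline; the input format is invented):

```python
# Toy "virtual stylus": recover audio samples from a scanned trace.
# Each column of pixels is one instant; the marked row is the stylus
# position at that instant.

def trace_to_samples(columns):
    """columns: list of pixel columns (lists of 0/1, 1 = trace mark).
    Returns samples normalized to the range [-1.0, 1.0]."""
    positions = []
    for col in columns:
        marked = [row for row, px in enumerate(col) if px]
        # mean marked row = stylus position in this column (None if no mark)
        positions.append(sum(marked) / len(marked) if marked else None)
    # bridge gaps in the trace by holding the previous position
    filled, last = [], 0.0
    for p in positions:
        last = p if p is not None else last
        filled.append(last)
    mid = (max(filled) + min(filled)) / 2
    span = (max(filled) - min(filled)) / 2 or 1.0
    return [(p - mid) / span for p in filled]

# Three columns tracing middle, down, up:
samples = trace_to_samples([[0, 1, 0], [0, 0, 1], [1, 0, 0]])  # [0.0, 1.0, -1.0]
```

The real restoration had to cope with uneven cylinder speed and a wandering trace, but the principle is the same: read the drawing back as numbers.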

They were able to listen to the ghostly echo of music recorded April 9, 1860: a scratchy 10-second snippet of the folk song, “Au Clair de la Lune.”

Samuel Brylawski, former head of the recorded-sound division of the Library of Congress, called the phonautogram “a historic find — the earliest known recording of sound.”

You can hear it now from the comfort of your own browser, thanks in part to lapsed copyright.

Take that, Tom Edison!

Source: Various

Image: This 1859 depiction of a phonautograph shows how a plaster-of-paris barrel (left) gathers sound into a membrane that vibrates a sharp implement to leave a visual record of the wave forms on black paper mounted on a rotating drum. The drum has to be rotated by a hand crank, like a pencil sharpener. Note the four different positions to cradle the barrel and change the angle of the sound collector.
WikiCommons

See Also:

April 8, 1879: The Milkman Cometh … With Glass Bottles

1879: Milk is sold in glass bottles for the first time in the United States. It’s a clear improvement in hygiene and convenience.

Until that time, people bought milk as a bulk item, with the seller dispensing milk out of a keg or bucket into whatever jugs, pails or other containers the customers brought. That practice left a lot to be desired on the cleanliness front. Some dairies tried offering milk in fruit jars, perhaps because customers had started bringing the resealable containers to them to be filled.

Echo Farms Dairy introduced the first purpose-made milk bottles in New York City, delivering the milk from Litchfield, Connecticut. Other dealers initially feared the expense of breakage, and some customers didn’t like the drugstore look of the containers.

But the new method of delivery eventually caught on. By the first decade of the 20th century, some cities were legally requiring that milk be delivered in glass bottles.

Early bottles had many designs, including models with stoppers on wire loops, like the bottles still used today by some European and specialty breweries. Many had the name of the dairy embossed on the glass.

Because milk has a short shelf life, consumers used it quickly and returned the empty bottles when they went to the market or when milkmen delivered fresh milk to their doors. The typical milk bottle made 22.5 round trips in the early 1900s before getting broken, lost or diverted by consumers to other purposes.

The loss of bottles — as well as the expense of returning them to the bottling plant, washing and sterilizing them — contributed to the eventual abandonment of the glass bottle. Producers and consumers were also concerned about the health implications of transporting fresh milk in the same trucks right next to empty, unwashed bottles.

Worse yet, unscrupulous milkmen would split a fresh quart into two empty (and not-yet washed) pint bottles to fill a customer’s order, or reverse the process and combine two pints into an empty quart.

All this led to the development of single-use containers. The earliest wax containers appeared in the 1890s. Shapes ranged from simple boxes to cylinders to cones to truncated pyramids, even ones that imitated the shape of a typical round glass bottle.

What finally prevailed, in the 1940s, was a rectangular column design with a small, round pull-up cap on a flat top. The cartons were lightweight and compact, wasting little space in milk trucks.

Flat-top boxes were replaced in the 1950s by square cartons with “gable tops” that opened out into a spout for easy pouring. This design had actually been patented in 1915.

Milk in glass bottles is a specialty or niche market these days, and home milk delivery is pretty much a thing of the past. The wax gable-top design and the more-recent plastic bottles account for nearly all the retail milk sold in the United States, and you gotta go to the store to get ‘em.

Source: Doug & Linda’s Dairy Antique Site

Image: An oil-field worker drinks a bottle of milk.
Russell Lee/Library of Congress

See Also:

April 6, 1903: Edgerton Born, Father of High-Speed Photography

1903: Harold Edgerton is born. The electrical engineer and photographer will change the way we see the world: fast.

Edgerton invented stop-action, high-speed photography, helping push the stroboscope from obscure laboratory instrument to household item. He used the technique to make a body of work that’s revered both for its scientific advancement and its aesthetic qualities.

Edgerton was using stroboscopes in the late 1920s to study synchronous motors for his Master of Science thesis at the Massachusetts Institute of Technology. The stroboscope emitted short, repeating bursts of light. Edgerton thought to aim it at everyday objects, like a milk drop.

He then took the technology one step further and started building flash tubes, with the help of Kenneth J. Germeshausen and Herbert E. Grier. The first was filled with vaporized mercury, though later models used xenon gas.

(Edgerton, Germeshausen and Grier founded EG&G, a technology and management firm now part of URS Corp.)

Following the path of Eadweard Muybridge a half-century earlier, Edgerton photographed a variety of previously unseen details of athletes, animals and inanimate objects.

The duration of Edgerton’s flash was extremely short, about a millionth of a second. His most famous photos include bullets penetrating an apple and a playing card, and a football being kicked.
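Some quick arithmetic shows why a flash that brief freezes motion that no mechanical shutter could (the bullet speed below is an assumed ballpark figure, not one measured in Edgerton’s setup):

```python
# Motion blur is just speed times exposure time.
def motion_blur_mm(speed_m_per_s, exposure_s):
    return speed_m_per_s * exposure_s * 1000.0  # metres -> millimetres

# A rifle bullet at roughly 900 m/s:
flash_blur = motion_blur_mm(900.0, 1e-6)    # ~0.9 mm with a 1-microsecond flash
shutter_blur = motion_blur_mm(900.0, 1 / 60)  # ~15,000 mm with a 1/60 s shutter
```

At under a millimetre of blur the bullet appears frozen; at fifteen metres of blur, it simply isn’t in the frame at all.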

Edgerton’s research also led him to develop side-scan sonar. His underwater stroboscope technology helped his friend Jacques Cousteau discover the wreck of the ocean liner Britannic.

His work found its way into many forms of media. A 1940 documentary film about Edgerton’s wizardry, Quicker’n a Wink, won the Oscar for best short subject. His photography is featured in the collections of art museums the world over.

Edgerton also designed a flash technique for aerial night reconnaissance for the Army. Using his underwater stroboscope technology, he led a search for the Loch Ness monster in 1976, but to no avail.

Edgerton continued to teach at MIT for more than 40 years and was a professor emeritus of electrical measurements there until his death in 1990.

Source: Various

Photo: Cutting the Card Quickly!, 1964, Minneapolis Institute of Arts, Gift of the Harold and Esther Edgerton Family Foundation

See Also:

March 23, 1857: Mr. Otis Gives You a Lift

1857: Attention shoppers: The first commercial elevator goes safely up and down in a New York City department store. Like air conditioning and public transportation, elevators are supposed to make the working life a little easier. Maybe they do. But there’s no doubt they provide the condition necessary to fill cities with skyscrapers.

You may, after all this time, still not know what to do with yourself during the inexplicably long seconds you share in vertical captivity with strangers and other people you’d rather not acknowledge. But this dilemma almost certainly did not concern Elisha Graves Otis in 1853 when he founded Otis Elevator, the company that would dominate the elevator business for more than a century and a half — and counting.

The secret of Otis’ success wasn’t so much that he could make a platform go up and down, which (patent trolls note) isn’t really much of an engineering achievement. There were already steam and hydraulic elevators in use here and there for a couple of years before Otis stepped up. No: Otis’ achievement was that he convinced people he could make an elevator that would go not only up, but also down without going into a free fall.

Otis set up business in Yonkers, New York, an emerging industry town about 15 miles north of Times Square. He sold only three elevators in 1853 — for $300 each — and none in the first few months of the following year. So the entrepreneur decided to make a dramatic demonstration at the New York Crystal Palace, a grand exhibition hall built for the 1853 World’s Fair.

The company recounts this milestone in its history:

Perched on a hoisting platform high above the crowd at New York’s Crystal Palace, a pragmatic mechanic shocked the crowd when he dramatically cut the only rope suspending the platform on which he was standing. The platform dropped a few inches, but then came to a stop. His revolutionary new safety brake had worked, stopping the platform from crashing to the ground. “All safe, gentlemen!” the man proclaimed.

Otis’ demonstration had the desired effect. He sold seven elevators that year, and 15 the next. When Otis died only seven years later, his company, by then run by his sons, was well on its way. By 1873 there were 2,000 Otis elevators in use, and the company had expanded to Europe and Russia. In rapid succession it won the commissions for the Eiffel Tower, the Empire State Building, the Flatiron Building and the original Woolworth Building — in its day, the world’s tallest. In 1967, Otis Elevator installed all 255 elevators and 71 escalators in the World Trade Center.

But the very first commercial installation was on March 23, 1857, at a five-story department store at Broadway and Broome Street in what is now New York City’s SoHo district.

The elevator’s wide adoption had a dramatic effect on how we work and live. Before, most buildings were built only a few stories high, since climbing stairs is a tiring, high-impact activity. With elevators, the sky became the limit. Offices, and later homes, on higher floors commanded the highest prices, for the view and the respite from street noise. The world-famous New York City skyline? Impossible without the elevator.

Elevators also created new jobs and helped empower the United States’ most oppressed citizens. You may not see them much anymore, but there were once tens of thousands of elevator operators, most of whom were black. Indeed, the first elevator operator’s union was formed in 1917 by none other than legendary labor organizer and civil rights leader A. Philip Randolph — an elevator operator who went on to create the game-changing Brotherhood of Sleeping Car Porters.

In their earliest days, the job of elevator operator required the skill and touch of a barista: An operator ran the lift with a sliding lever that raised, lowered and stopped the car. Later, elevators became fully automated, with buttons anyone could push and electronics that knew where each floor was. Now there are few manual elevators still in operation — but their age and safety records are testaments to Otis’ early work.

One thing hasn’t changed: Riding in elevators may be the most boring few seconds of daily life, next to waiting for the microwave to “ding.” And for the tiny fraction of our lives we spend in them, being there presents an inordinate number of etiquette challenges. And, for the most part, we don’t even have elevator music to distract us anymore.

One thing you almost certainly don’t have to worry about, though, is the risk of serious injury or death. Out of millions of elevators in the world, only 20 to 30 elevator-related deaths are reported every year. Those fatalities tend to happen when someone steps into an elevator shaft when the car isn’t there, or from the extreme(ly stupid) sport of elevator surfing — not because an elevator hurtles out of control.

So it does seem that they are as safe as Otis knew they were when he cut the cord on himself in 1854.

Source: Various

See Also:

March 17, 1953: The Black Box Is Born

1953: After several high-profile crashes of de Havilland Comet airliners go unsolved, Australian researcher David Warren invents a device to record cockpit noise and instruments during flight.

During the first half of aviation’s history, crashes rarely came with any answers. Even if an eyewitness saw an airplane crash, little was known of the cause or what pilots might have been aware of before the crash.

In the early 1950s, the world’s first jet-powered airliner, the de Havilland Comet, crashed several times. Warren, a researcher at the Aeronautical Research Laboratories in Melbourne, Australia, believed that if the pilots’ voices could be recorded, along with instrument readings, the information could help determine the cause of a crash — and help prevent future ones. He called his device a “Flight Memory Unit.”

By 1957, the first prototypes of the device were produced. Early versions could record up to four hours of voice and instrument data on a steel foil. Warren believed the device would be popular and help solve the mysteries behind aviation crashes, but the device was initially rejected by the Australian aviation community for privacy issues.

Eventually, British officials accepted the idea of a flight data recorder, and Warren began producing FDRs in crash- and fire-proof containers and selling them to airlines around the world. After a 1960 crash in Queensland whose cause could not be determined, the Australian government required that all commercial airplanes carry a recorder, making Australia the first country to mandate the devices.

Early recorders logged basic flight conditions such as heading, altitude, airspeed, vertical accelerations and time. Today’s FDRs can record many more parameters including throttle and flight-control positions. Analyzing so many parameters allows investigators to recreate most of the pilot-controlled activity in the moments leading up to a crash. In recent years, digital reproductions of flights using FDR data have been valuable in recreating accidents and analyzing both the problems leading to the crash and the pilots’ response.

Modern FDRs, aka “black boxes,” are actually bright orange. They must survive several tests, including fire and piercing, as well as the pressure of submersion 20,000 feet below the ocean surface. Perhaps most impressive is their ability to withstand a 3,400-g crash-impact test. To aid recovery, they emit a locator-beacon signal for up to 30 days.

While early designs recorded the information onto a steel foil, modern FDRs use solid-state memory that can be downloaded almost instantly. This data can also be checked during routine maintenance inspections to monitor the performance of aircraft.

Future improvements to flight recorders include the possibility of transmitting flight data in real time to ground stations, which would eliminate the need to physically find the flight data recorder. Interest in this kind of in-flight transmission of data gained momentum after Air France flight 447 disappeared over the Atlantic in 2009 and a flight data recorder could not be found.

Source: Various

Photo: Officials transfer the TWA Flight 800 flight data recorder from saltwater into freshwater on July 25, 1996, at the Coast Guard station in East Moriches, New York.
Associated Press/US Coast Guard

See Also:

March 12, 1790: Batteries Now Included

1790: John Frederic Daniell, a 19th-century scientific and academic heavyweight, inventor of the first practical electric battery and all-around geek, is born in London, England.

Daniell, the son of a prominent London attorney, attended prestigious schools in Europe where he excelled in science — especially when it came to performing experiments and building instruments.

After he graduated, he ran a sugar-refining plant where he developed several technical upgrades to the refining process.

The scientific community noticed Daniell after he published several chemistry papers, and he landed a plush job with the newly anointed Continental Gas Company, a British firm dedicated to developing the burgeoning natural-gas lighting industry. His new appointment with Continental would be the 21st-century equivalent of a top job at Tesla Motors.

Daniell developed gas lighting for various French and German cities and figured out a way to generate gas from resin and turpentine.

Years later, he invented the dew-point hygrometer (for gauging humidity) and a pyrometer (a device for measuring superhot temperatures). He wrote a series of papers for the Horticultural Society explaining the importance of humidity regulation in greenhouses.

But he’s best known for his contribution to portable electric power and storage. Building on what he learned about chemistry and electricity during his overseas studies and work with Continental, Daniell conceived and built the precursor to the modern-day battery in 1836.

Performing electrical experiments had previously involved mucking around with lightning (dangerous) or using a primitive battery (weak) developed by Italian physicist Alessandro Volta back in 1800. Volta’s battery, also known as a voltaic pile, consisted of stacked metal discs arranged to conduct an electric current. The battery’s juice depleted quickly and was of little practical use.

Daniell’s battery, known as the Daniell cell, lasted much longer. It used copper and zinc electrodes submerged in solutions of copper sulphate and zinc sulphate. Zinc oxidized at the anode, releasing electrons, while copper ions were reduced to metal at the copper cathode. This produced a continuous flow of electric current.
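In modern terms, the cell’s chemistry can be summarized with the standard textbook half-reactions and electrode potentials (standard-condition values, not measurements from Daniell’s own apparatus):

```latex
\begin{aligned}
\text{anode (oxidation):}\quad & \mathrm{Zn \rightarrow Zn^{2+} + 2e^{-}} \\
\text{cathode (reduction):}\quad & \mathrm{Cu^{2+} + 2e^{-} \rightarrow Cu} \\
E^{\circ}_{\text{cell}} &= E^{\circ}_{\mathrm{Cu^{2+}/Cu}} - E^{\circ}_{\mathrm{Zn^{2+}/Zn}}
 = 0.34\ \mathrm{V} - (-0.76\ \mathrm{V}) \approx 1.10\ \mathrm{V}
\end{aligned}
```

The roughly 1.1 volts per cell held steady far longer than a voltaic pile’s output, which is what made the design practical.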

His breakthrough was key to advancing communications technology, in particular, the telegraph.

Daniell later became a high-ranking official at the Royal Society. He worked in academia until his sudden and rather mysterious death in 1845 when he was 55.

Sources: Various

See Also:

March 8, 1955: The Mother of All Operating Systems

1955: Computer pioneer Doug Ross demonstrates the Director tape for MIT’s Whirlwind machine. It’s a new idea: a permanent set of instructions on how the computer should operate.

Six years in the making, MIT’s Whirlwind computer was the first digital computer that could display real-time text and graphics on a video terminal, which was then just a large oscilloscope screen. Whirlwind used 4,500 vacuum tubes to process data.

The Whirlwind occupied 3,300 square feet and was the fastest digital computer of its time. It also pioneered a number of new technologies, including magnetic core memory for RAM.

Another one of its contributions was Director, a set of programming instructions on paper tape that is regarded as the predecessor of operating systems in computers. The Director was designed to issue commands to the 4-year-old Whirlwind machine.

The idea was to eliminate the need for manual intervention (.pdf) in reading the tapes for different problems during a computing session.

The Director tape communicated with the computer through a separate input reader, so different tapes containing different problems could be recognized and processed appropriately. A Director tape made it possible to complete a run at the push of a single button.

Programmers John Frankovich and Frank Helwig wrote the first Director tape program. The software concept was to connect a Flexowriter — a mechanical, heavy-duty tape reader — to a newer, faster photoelectric tape reader.

This allowed the team to feed the spliced-together paper tapes directly to Whirlwind, without having a separate human operator.

Lead programmer Doug Ross finally demonstrated it in 1955.

The Director tape was also probably the first example of a Job Control Language–driven operating system. JCL is a scripting language used on mainframe operating systems to instruct them how to run a batch job or start a subsystem.
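The core idea, a control stream that names jobs to run in sequence with no operator in the loop, can be sketched in a few lines of Python (the command verb and job names here are invented for illustration):

```python
# Toy "director": read a control stream, dispatch each named job in order.
JOBS = {
    "MATRIX_INVERT": lambda: "inverted",
    "TRAJECTORY": lambda: "computed",
}

def run_director(tape):
    """tape: control stream with one 'RUN <jobname>' per line."""
    results = []
    for line in tape.strip().splitlines():
        verb, name = line.split()
        if verb == "RUN":
            # run the named job and log its result, no human in the loop
            results.append((name, JOBS[name]()))
    return results

log = run_director("RUN MATRIX_INVERT\nRUN TRAJECTORY")
```

The point is the separation: the jobs themselves stay unchanged, while a small interpreter reads the control stream and sequences them, which is exactly the role JCL later played on mainframes.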

The Whirlwind is credited with leading to the development of SAGE, or Semi-Automatic Ground Environment, the air-defense system used by the U.S. Air Force. It’s also said to have influenced most of the computers of the 1960s.

Source: Wikipedia, MIT Computer Science and Artificial Intelligence Laboratory

Photo: Stephen Dodd, Jay Forrester, Robert Everett and Ramona Ferenz test Whirlwind in 1950.
Courtesy Mitre Corp.

See Also:

March 4, 1877: The Microphone Sounds Much Better

1877: Emile Berliner files a patent caveat for a new kind of microphone. It assures the future of the telephone, but not fame for Berliner.

Alexander Graham Bell had already invented his telephone, but without Berliner’s carbon-disk or carbon-button microphone, telephones would have sounded terrible for decades. And they may not have been capable of surmounting such great distances, hindering one of humanity’s most important advances.

Like most of today’s microphones, early designs turned compressions in the air, otherwise known as sound, into electrical signals. But the results didn’t sound good by any account, and the device lacked practicality for widespread use. Bell’s microphone, for instance, involved suspending a diaphragm above a pool of electrified liquid.

Berliner’s patent application improved on the existing design by adding a layer of carbon particles in between two contacts, one of which acted as a diaphragm for catching sound waves. Movements of the diaphragm created varying pressure on the carbon particles, allowing more or less electricity to pass between the contacts.
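A toy model of that principle: treat the carbon layer as a resistor whose value drops as sound pressure compresses the granules, so the current through the circuit tracks the sound wave (the component values are illustrative, not taken from Berliner’s design):

```python
import math

V = 6.0     # battery voltage
R0 = 100.0  # resting resistance of the carbon layer, ohms
k = 0.1     # fractional resistance change per unit of pressure (kept small)

def mic_current(pressure):
    # compression (positive pressure) lowers resistance, raising current
    return V / (R0 * (1.0 - k * pressure))

# A pure tone in yields a roughly proportional current ripple out:
tone = [math.sin(2 * math.pi * t / 20) for t in range(40)]
signal = [mic_current(p) - mic_current(0.0) for p in tone]
```

Because the battery supplies the energy and the sound only modulates the resistance, the electrical output can carry more power than the sound wave itself, which is why the carbon microphone doubled as an amplifier.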

This process converted sound waves into electricity more accurately than any other microphone could at the time. It became commonplace in telephones, and even radio, until the appearance of the condenser microphone in the mid-1920s.

Although Berliner’s microphones still sounded hissy, they proved critical not only for encoding speech into electricity, but for amplifying the signal in the wire every so often to compensate for electrical resistance. Without that amplification, Bell’s telephone would have remained a mere curiosity, rather than transforming the world.

In those pre–vacuum-tube, pre-transistor days, Berliner’s carbon-disk microphones were coupled to little speakers to amplify the signal mechanically over long distances. Presumably, you’d be able to eavesdrop on a conversation simply by standing near one of these mechanical repeaters.

Bell paid $50,000 for Berliner’s microphone patent (about $1.1 million in today’s money) and began manufacturing telephones using the technology in 1878. But controversy dogged the patent, which was eventually thrown out, much to Berliner’s dismay. The U.S. Supreme Court ruled in 1892 that Thomas Edison, and not Berliner, invented the carbon microphone.

In truth, neither can claim total credit.

As Bell executive W. Van Benthuysen told The New York Times (.pdf) in December 1891, the idea of transmitting speech by varying the current between two contacts as they are affected by sound waves was common knowledge in some circles, having appeared in published works as early as 1854 — well before either Berliner or Edison (who filed a similar patent) claimed credit for the idea in 1877.

“It was known long before the date of Bell’s [formerly Berliner's] patent that the resistance of a circuit is varied without being broken by variations in the intimacy of contact or amount of pressure between the electrodes in contact, from one to the other of which a current is passing,” said Van Benthuysen, adding that France’s Count Du Moncel had written extensively on the topic over two decades earlier.

Nonetheless, Berliner reputedly went to his grave in 1929 convinced that Edison had stolen his idea. He did, however, receive ample credit for another crucial invention: the lateral-cut disc record, whose design is still prized by hipsters and purists alike. Until then, everybody had been using Edison’s phonograph cylinders, which took up much more space and were difficult to duplicate.

Berliner’s records were used in toys from 1888 until 1894, when his company began selling records using a logo of a dog cocking its ear towards a record player. Modified versions of the “His Master’s Voice” logo have been used by record companies around the world, including RCA in the United States. It now forms the retail entertainment chain HMV’s logo.

The saga of the carbon-button microphone comes as a reminder that while history feels the need to assign great ideas to individual people, their origins are often murky and collaborative in nature, and owe no small part to people ripping each other off.

We couldn’t find any accounts of Emile Berliner’s first words transmitted by microphone, but they were probably not “Ich bin ein Berliner” — as nice as that would have been.

Source: Various

Photo: Emile Berliner sits with a couple of microphones in 1927.
Courtesy Library of Congress
Diagram courtesy TutorVista

See Also:

  • Smoke and Lasers Could Disrupt Microphone Market
  • Video: Microphone Prototype Uses Smoke, Lasers to Record Audio
  • New Microphone Uses Smoke — and Lasers!
  • March 4, 1887: Start Your Engine
  • March 4, 1890: Bridge Tech Takes a Great Leap Forth
  • March 4, 1962: Nuclear Age Comes to Antarctica
  • Aug. 14, 1877: Internal Combustion's Stroke of Genius
  • Aug. 15, 1877: 'Hello. Can You Hear Me Now?'