Today, pencils are everywhere, from schools to golf courses to any art studio. What seemed like a simple invention is now a billion-dollar global industry. But who invented this household staple?
Before there were pencils, the preferred writing instrument was the stylus, which had been around since the ancient Romans. Some were made of thin pieces of metal that left light marks on a paper-like material called papyrus. Other styluses—which stuck around all the way until the 16th century—were made of lead, which proved to be a harbinger of writing instruments to come.
More modern pencils arose thanks to a bit of luck and some creativity. In 1564, a tree fell down in England and unearthed a large deposit of graphite, an incredibly valuable mineral. Unlike lead, graphite could leave dark gray, almost metallic marks on paper. Though graphite is a form of carbon, many at the time believed it to be a kind of lead.
According to NPR, a Swiss naturalist named Conrad Gessner created the first depiction of a pencil in 1565. His drawing portrayed graphite inside wood. That illustration became popular throughout Europe, but it wouldn’t be until the 1700s that pencils as we know them started taking shape.
The late 18th century saw further pencil improvements. France, at war with Britain and cut off from trade in English graphite, became desperate for its own source. As a substitute, engineer Nicolas-Jacques Conté created the “Crayons Conté”: he mixed cheap graphite powder with wet clay, sculpted the mixture into rods, and baked them.
Since Conté, many have improved on the pencil, including Henry David Thoreau. Thoreau created pencils that didn’t smudge as much, as well as the numbering system that notes the firmness of the graphite. The rise of factories and machines also helped make the writing instrument more popular. Eberhard Faber opened the first pencil factory in the U.S. in 1861, about 100 years after his family opened their first one in Germany. Hyman Lipman attached the first eraser to a pencil in 1858. He received a patent, but the courts later invalidated it, since he hadn’t invented either the pencil or the eraser, merely combined two existing items.
Multiple inventors deserve credit for the technology, which had its origins in the 19th century.
The way people watch television has changed dramatically since the medium first burst onto the scene in the 1940s and ‘50s and forever transformed American life. Decade after decade, TV technology has steadily advanced: Color arrived in the 1960s, followed by cable in the ‘70s, VCRs in the ‘80s and high-definition in the late ‘90s. In the 21st century, viewers are just as likely to watch shows on cell phones, laptops and tablets as on a TV set. Amazingly, however, all these technological changes were essentially just improvements on a basic system that has worked since the late 1930s—with roots reaching even further back than that.
Early TV Technology: Mechanical Spinning Discs
No single inventor deserves credit for the television. The idea was floating around long before the technology existed to make it happen, and many scientists and engineers made contributions that built on each other to eventually produce what we know as TV today.
Television’s origins can be traced to the 1830s and ‘40s, when Samuel F.B. Morse developed the telegraph, the system of sending messages (translated into beeping sounds) along wires. Another important step forward came in 1876 in the form of Alexander Graham Bell’s telephone, which allowed the human voice to travel through wires over long distances.
Both Bell and Thomas Edison speculated about the possibility of telephone-like devices that could transmit images as well as sounds. But it was a German researcher who took the next important step toward developing the technology that made television possible. In 1884, Paul Nipkow came up with a system of sending images through wires via spinning discs. He called it the electric telescope, but it was essentially an early form of mechanical television.
TV Goes Electronic With Cathode Ray Tubes
In the early 1900s, both Russian physicist Boris Rosing and Scottish engineer Alan Archibald Campbell-Swinton worked independently to improve on Nipkow’s system by replacing the spinning discs with cathode ray tubes, a technology developed earlier by German physicist Karl Braun. Campbell-Swinton’s system, which placed cathode ray tubes inside the camera that sent a picture, as well as inside the receiver, was essentially the earliest all-electronic television system.
Russian-born engineer Vladimir Zworykin had worked as Rosing’s assistant before both of them emigrated following the Russian Revolution. In 1923, Zworykin was employed at the Pittsburgh-based manufacturing company Westinghouse when he applied for his first television patent, for the “Iconoscope,” which used cathode ray tubes to transmit images.
In 1929, Zworykin demonstrated his all-electronic television system at a convention of radio engineers. In the audience was David Sarnoff, an executive at Radio Corporation of America (RCA), the nation’s biggest communications company at the time. Born into a poor Jewish family in Minsk, Russia, Sarnoff had come to New York City as a child and began his career as a telegraph operator. He was actually on duty on the night of the Titanic disaster; although he likely didn’t—as he later claimed—coordinate distress messages sent to nearby ships, he did help disseminate the names of the survivors.
Utah Inventor Battles Giant Corporation
Sarnoff was among the earliest to see that television, like radio, had enormous potential as a medium for entertainment as well as communication. Named president of RCA in 1930, he hired Zworykin to develop and improve television technology for the company. Meanwhile, an American inventor named Philo Farnsworth had been working on his own television system. Farnsworth, who grew up on a farm in Utah, reportedly came up with his big idea—a vacuum tube that could dissect images into lines, transmit those lines and turn them back into images—while still a teenager in chemistry class.
In 1927, at the age of 21, Farnsworth completed the prototype of the first working fully electronic TV system, based on this “image dissector.” He soon found himself embroiled in a long legal battle with RCA, which claimed Zworykin’s 1923 patent took priority over Farnsworth’s inventions. The U.S. Patent Office ruled in favor of Farnsworth in 1934 (helped in part by an old high school teacher, who had kept a key drawing by the young inventor), and Sarnoff was eventually forced to pay Farnsworth $1 million in licensing fees. Though viewed by many historians as the true father of television, Farnsworth never earned much more from his invention, and was dogged by patent appeal lawsuits from RCA. He later moved on to other fields of research, including nuclear fission, and died in debt in 1971.
Sarnoff, with his marketing might, introduced the public to television in a big way at the World’s Fair in New York City in 1939. Under the umbrella of RCA’s broadcasting division, the National Broadcasting Company (NBC), Sarnoff broadcast the fair’s opening ceremonies, including a speech by President Franklin D. Roosevelt.
By the 1950s, television had truly entered the mainstream, with more than half of all American homes owning TV sets by 1955. As the number of consumers expanded, new stations were created and more programs broadcast, and by the end of that decade TV had replaced radio as the main source of home entertainment in the United States. During the 1960 presidential election, the young, handsome John F. Kennedy had a noticeable advantage over his less telegenic opponent, Richard M. Nixon, in televised debates, and his victory that fall would bring home for many Americans the transformative impact of the medium.
As the late Paul Harvey used to say, “Now you know the rest of the story.”
For decades, the chicken nugget has been a staple of fast food menus and grocery store frozen food aisles. But contrary to popular belief, these fried bits of poultry deliciousness weren’t invented by McDonald’s. Like so many other dishes before it, parts of the nugget’s origin story are contested, but most sources agree that it all began with Robert C. Baker, a poultry and food science professor at Cornell University.
In the 1960s, Baker was attempting to find new ways to make chicken exciting again for Americans. During World War II, the U.S. government had created a rationing system, similar to the one that was being used in the UK. The list of rationed items was extensive and included foods like beef, pork, sugar, oil, and canned or preserved meats. While cheese and cream would later be added to the list of rationed items, milk, eggs, and poultry were not—which made chicken dishes a popular dinner choice for many households during the war.
After the war, according to Slate, demand for poultry saw a steep decline because chickens were typically sold as whole birds, which could be inconvenient for families. Some butchers were willing to cut their chickens into smaller pieces to make them easier to cook. But Baker, who had a reputation as a food innovator (he was the man behind frozen French toast and chicken hot dogs, and was working on ways to increase the value of chickens once they could no longer lay eggs), was interested in simplifying the process completely.
First, Baker created what was known as a chicken stick: ground-up chicken that was breaded in an egg batter, then frozen. He realized he could solve some of the issues food scientists were facing by removing the skin and by making a batter that could be fried even after it was frozen. He sent his chicken sticks to local grocery stores, where they were an instant hit, with some stores selling up to 200 boxes per week.
But Baker felt there were ways his process could be improved upon and refined, and he was happy to let others give it a try. Instead of patenting his chicken sticks, Baker published his entire process for creating them in Agricultural Economics Research and had copies sent to poultry companies and food scientists throughout the United States.
“Robert C. Baker was both a product of changes going on in the poultry world and a driver of those changes,” anthropologist Steve Striffler, author of Chicken: The Dangerous Transformation of America’s Favorite Food, said. “Industry leaders quickly realized that real profit would not so much come from producing more chicken, but by doing more to chicken. Hence, further processing.”
According to History.com, Baker’s creation could not have come at a better time. Beginning in the 1970s, scientists and the U.S. government began suggesting that Americans cut down on their consumption of red meat, as too much of it could lead to such health problems as high cholesterol. Chicken was promoted as a healthy alternative.
While there’s no direct source to confirm that McDonald’s Corporation founder Ray Kroc read Baker’s original process report, it was clear that the fast food magnate wanted to capitalize on this chicken push. Kroc wanted to create a chicken product that would offer convenience—”a boneless piece of chicken, [sold] almost like French fries,” explained McDonald’s chairman Fred Turner.
While McDonald’s executive chef René Arend played around with some potential recipes, Kroc enlisted the help of meat supplier Keystone Foods to come up with a way to automate the chicken-chopping process. He also reached out to Gorton’s, the company known for its fish sticks—which had helped McDonald’s when the company was developing the Filet-O-Fish—to create a batter for these bite-sized pieces of chicken. In 1981, Mickey D’s officially introduced their Chicken McNuggets, which turned out to be one of the most successful new product launches in fast food history. Even today, McNuggets still account for about 10 percent of the restaurant’s sales.
Baker, meanwhile, went on to found Cornell University’s Institute for Food Science and Marketing in 1970 and serve as its very first director. He also helped invent a chicken deboning machine. Though he received no monetary benefit from the success of the chicken nuggets he helped to create, his place in poultry history did lead to him becoming known as the “George Washington Carver of Chicken.”
The telescope is one of humankind’s most important inventions. The simple device that made faraway things look near gave observers a new perspective. When curious men pointed the spyglass toward the sky, our view of Earth and our place in the universe changed forever.
But the identity of the ingenious mind who invented the telescope remains a mystery. It was probably inevitable that, as glassmaking and lens-grinding techniques improved in the late 1500s, someone would hold up two lenses and discover what they could do.
The first person to apply for a patent for a telescope was Dutch eyeglass maker Hans Lippershey (or Lipperhey). In 1608, Lippershey laid claim to a device that could magnify objects three times. His telescope had a concave eyepiece aligned with a convex objective lens. One story goes that he got the idea for his design after observing two children in his shop holding up two lenses that made a distant weather vane appear close. Others claimed at the time that he stole the design from another eyeglass maker, Zacharias Jansen.
Jansen and Lippershey lived in the same town and both worked on making optical instruments. Scholars generally argue, however, that there is no real evidence that Lippershey’s design was stolen rather than developed independently. Lippershey therefore gets the credit for the telescope, because of the patent application, while Jansen is credited with inventing the compound microscope. Both appear to have contributed to the development of both instruments.
Adding to the confusion, yet another Dutchman, Jacob Metius, applied for a patent for a telescope a few weeks after Lippershey. The government of the Netherlands turned down both applications because of the counterclaims. Also, officials said the device was easy to reproduce, making it difficult to patent. In the end, Metius got a small reward, but the government paid Lippershey a handsome fee to make copies of his telescope.
In 1609, Galileo Galilei heard about the “Dutch perspective glasses” and within days had designed one of his own — without ever seeing one. He made some improvements — his could magnify objects 20 times — and presented his device to the Venetian Senate. The Senate, in turn, set him up for life as a lecturer at the University of Padua and doubled his salary, according to Stillman Drake in his book “Galileo at Work: His Scientific Biography” (Courier Dover Publications, 2003).
Galileo was the first to point a telescope skyward. He was able to make out mountains and craters on the moon, as well as a ribbon of diffuse light arching across the sky — the Milky Way. He also discovered the rings of Saturn, sunspots and four of Jupiter’s moons.
Thomas Harriot, a British ethnographer and mathematician, also used a spyglass to observe the moon. Harriot became famous for his travels to the early settlements in Virginia to detail resources there. His August 1609 drawings of the moon predate Galileo’s, but were never published.
The more Galileo looked, the more he was convinced of the sun-centered Copernican model of the planets. Galileo wrote a book, “Dialogue Concerning the Two Chief World Systems, Ptolemaic and Copernican,” and dedicated it to Pope Urban VIII. But his ideas were considered heretical, and Galileo was called to appear before the Inquisition in Rome in 1633. He struck a plea bargain and was sentenced to house arrest, where he continued to work and write until his death in 1642.
Elsewhere in Europe, scientists began improving the telescope. Johannes Kepler studied the optics and designed a telescope with two convex lenses, which made the images appear upside down. Working from Kepler’s writings, Isaac Newton reasoned it was better to make a telescope out of mirrors rather than lenses and built a reflecting telescope in 1668. Centuries later the reflecting telescope would dominate astronomy.
Exploring the cosmos
The largest refracting telescope (one that uses lenses to gather and focus light) opened at Yerkes Observatory in Williams Bay, Wisconsin, in 1897. But the 40-inch (1 meter) glass lens at Yerkes was soon made obsolete by larger mirrors. The Hooker 100-inch (2.5 m) reflecting telescope opened in 1917 at Mount Wilson Observatory near Pasadena, Calif. It was there that the astronomer Edwin Hubble determined that the Andromeda Nebula was indeed (as some astronomers had argued) a galaxy far, far away (2.5 million light-years) from the Milky Way.
With the development of radio, scientists could start to study not just light, but other electromagnetic radiation from space. An American engineer named Karl Jansky was the first to detect radio emission from space, in 1931, when he found a source of radio interference coming from the center of the Milky Way. Radio telescopes have since mapped the shapes of galaxies and detected the cosmic microwave background radiation, which confirmed a prediction of the Big Bang theory.
Here are some of the more famous telescopes:
Hubble Space Telescope
This telescope launched in 1990. Some of Hubble’s major contributions include determining the age of the universe with more precision, finding more moons near Pluto, doing observations of galaxies in the young universe, monitoring space weather on the outer planets, and even observing exoplanets — a situation not anticipated for the telescope as the first major exoplanet discoveries didn’t happen until the mid-1990s.
A flaw in its mirror was fixed by a space shuttle crew during a 1993 servicing mission. Hubble underwent five servicing missions by shuttle crews in all, the last in 2009. It remains in good health to this day and is expected to overlap some observations with the James Webb Space Telescope. Hubble is one of four “Great Observatories” launched by NASA in the 1990s and 2000s; the others, the Spitzer Space Telescope, the Compton Gamma Ray Observatory and the Chandra X-ray Observatory, made many discoveries of their own.
James Webb Space Telescope
This is the successor to Hubble. Its launch date has been delayed several times over the years, with the latest estimate being 2020. Unlike Hubble, this telescope will be parked far from Earth, out of reach of repair crews. Its science will address four major themes: the universe’s first light, the assembly of the first galaxies, the birth of stars, and the origins of life (including the study of exoplanets).
Kepler Space Telescope
This planet-hunting machine has found more than 4,000 potential planets since first launching in 2009. Initially it focused on a patch of sky in the constellation Cygnus, but in 2013 problems with its pointing system forced a new mission in which Kepler observes different regions of the sky in turn. One of Kepler’s major contributions is finding more super-Earths and rocky planets, which are harder to spot near bright stars.
Atacama Large Millimeter/submillimeter Array (ALMA)
This telescope in Chile has 66 receivers and its specialty is looking through the dust in young planetary systems (or through dusty stars and galaxies) to see how cosmic objects are formed. It was fully operational as of 2013. ALMA is unique in its sensitivity because it has so many receivers available. Some of its results include the clearest-ever image of the star Betelgeuse, and precise measurements of black hole masses.
Arecibo Observatory
This observatory has been operating since 1963, and is famous for many radio astronomy studies. The Puerto Rican telescope is also known for the Arecibo Message, which was directed at the globular cluster M13 in 1974. The observatory was damaged during a 2017 hurricane that devastated Puerto Rico. In popular culture, Arecibo was also the location of the climax of the 1995 James Bond film “Goldeneye,” and it appeared in the 1997 movie “Contact.”
Karl G. Jansky Very Large Array
This is a set of 27 telescopes located in the New Mexico desert. Construction began on the VLA in 1973. Some of the VLA’s major discoveries include finding ice on Mercury, peering into the dusty center of the Milky Way, and looking at the formation of black holes. The telescope array also was prominently featured in the 1997 movie “Contact” as the site where a purported extraterrestrial signal arrived.
W.M. Keck Observatory
The twin telescopes at the W.M. Keck Observatory in Hawaii are the largest optical and infrared telescopes available. The telescopes started their work in 1993 and 1996. Some of their major discoveries include finding the first exoplanet “transiting” across its parent star, and learning about star movements in the nearby Andromeda Galaxy.
Palomar Observatory
The Palomar Observatory, located in San Diego County, Calif., began work in 1949. The telescope is best known for discovering the small worlds Quaoar, Sedna and Eris in the Kuiper Belt, but its work also includes discovering supernovas (star explosions), tracking asteroids and looking at gamma-ray bursts.
Ancient umbrellas or parasols were first designed to provide shade from the sun
The basic umbrella was invented more than 4,000 years ago. There is evidence of umbrellas in the ancient art and artifacts of Egypt, Assyria, Greece, and China.
These ancient umbrellas or parasols were first designed to provide shade from the sun. The Chinese were the first to waterproof their umbrellas for use as rain protection. They waxed and lacquered their paper parasols in order to use them for rain.
Origins of the Term Umbrella
The word “umbrella” comes from the Latin root word “umbra,” meaning shade or shadow. Starting in the 16th century, the umbrella became popular in the Western world, especially in the rainy climates of northern Europe. At first, it was considered an accessory suitable only for women. Then the English traveler and writer Jonas Hanway (1712-86), whose journeys had taken him as far as Persia, carried and used an umbrella publicly in England for 30 years, popularizing umbrella use among men. English gentlemen often referred to their umbrellas as a “Hanway.”
James Smith and Sons
The first all-umbrella shop, “James Smith and Sons,” opened in 1830 and is still located at 53 New Oxford Street in London, England.
The early European umbrellas were made of wood or whalebone and covered with alpaca or oiled canvas. The artisans made the curved handles for the umbrellas out of hardwoods like ebony and were well paid for their efforts.
English Steels Company
In 1852, Samuel Fox invented the steel ribbed umbrella design. Fox also founded the “English Steels Company” and claimed to have invented the steel ribbed umbrella as a way of using up stocks of farthingale stays, the steel stays used in women’s corsets.
The next major technical innovation in umbrella manufacture, the compact collapsible umbrella, arrived some three-quarters of a century later.
In 1928, Hans Haupt invented the pocket umbrella. Around the same time in Vienna, Slawa Horowitz, a student studying sculpture, developed a prototype for an improved compact foldable umbrella, for which she received a patent in September 1929. Her umbrella, called “Flirt,” was made by an Austrian company. In Germany, small foldable umbrellas were made by the company “Knirps,” whose name became a synonym in the German language for small foldable umbrellas in general.
In 1969, Bradford E Phillips, the owner of Totes Incorporated of Loveland, Ohio obtained a patent for his “working folding umbrella.”
Another fun fact: Umbrellas have also been crafted into hats as early as 1880 and at least as recently as 1987.
Golf umbrellas, one of the largest sizes in common use, are typically around 62 inches across but can range anywhere from 60 to 70 inches.
Umbrellas are now a consumer product with a large global market. As of 2008, most umbrellas worldwide are made in China. The city of Shangyu alone had more than 1,000 umbrella factories. In the U.S., about 33 million umbrellas, worth $348 million, are sold each year.
As of 2008, the U.S. Patent Office registered 3,000 active patents on umbrella-related inventions.
The most common origin story for the potato chip involves Moon’s Lake House, a popular restaurant in the resort town of Saratoga Springs, N.Y. But even there, at least five different men and women have been credited as its creator. What’s more, food historians suggest the chip probably wasn’t invented in Saratoga—and possibly not in the U.S. at all.
The Saratoga Story
The most popular potato chip legend goes like this: One day in 1853, the shipping and railroad baron Cornelius Vanderbilt was dining at Moon’s Lake House. Disappointed by the fried potatoes he’d been served, he sent them back to the kitchen, asking for more thinly sliced ones. George Crum, a famed chef of Native American and Black heritage, took umbrage at the request and, in an “I’ll show him!” mood, sliced some potatoes as thin as he could, fried them to a crisp and served them to Vanderbilt. To Crum’s surprise, Vanderbilt loved them, and the potato chip was born.
This version of events eventually became so well-established that, in 1976, American Heritage magazine would dub Crum, also known as George Speck, the “Edison of Grease.”
Unfortunately, there are several problems with the Crum story. For one, if there was a disgruntled diner, it almost certainly wasn’t Vanderbilt. “There is no truth to the tale,” historian T.J. Stiles concluded in his Pulitzer prize-winning biography The First Tycoon: The Epic Life of Cornelius Vanderbilt.
For another, Crum’s supposed role in inventing the potato chip seems to have gone largely unrecognized in his lifetime, even though he was widely known across the U.S. and celebrated for his brook trout, lake bass, woodcock and partridge, among other dishes—making him perhaps the first celebrity chef in America. In 1889, a writer in the New York Herald called him “the best cook in the country,” with nary a word about potatoes. Most of his obituaries, in 1914, don’t mention the potato chip at all, and those that do simply say that he was “said to have” invented it.
Three years later, an obituary for Catherine Adkins Wicks, age 103, maintained that she, in fact, “was said to be the originator of the potato chip.” Wicks, who was Crum’s sister, worked alongside him in the kitchen and was familiarly known as Aunt Kate or Aunt Katie. In one variation of the disgruntled diner story, it is she, not Crum, who carved potatoes paper-thin in a moment of pique. In another telling, she accidentally dropped a thin slice into a boiling pot of fat while peeling potatoes, retrieved it with a fork, and had her eureka moment.
Crum and Wicks weren’t the only posthumous claimants to the title. In his 1907 obituaries, Hiram S. Thomas was widely credited as “the inventor of Saratoga chips.” A prominent Black hotelier, referred to in one obituary as “next to Booker T. Washington” as one of the most well-known African Americans in the region, Thomas ran Moon’s Lake House for about a decade. However, that was in the 1890s, some 40 years after the Crum and/or Wicks discovery—and a good decade after the chips had become commercially available far beyond Saratoga.
Still another notable to receive credit in her obituaries was Emeline Jones. Renowned as a cook to the rich, famous and powerful in New York City and Washington, D.C., Jones, who was also Black, had worked briefly at Moon’s Lake House under Hiram S. Thomas earlier in her career. So, while it’s possible she learned to make potato chips there, she seems unlikely to have been present at the creation.
A more recent theory, apparently first advanced by Stiles, is that the Lake House’s potato chips actually precede even Crum and Wicks. Another New York Herald article, this one from 1849, notes the “fame of ‘Eliza, the cook,’ for crisping potatoes,” adding that “scores of people visit the lake and carry away specimens of the vegetable, as prepared by her, as curiosities.” Regrettably, Eliza’s last name, and anything else about her, seem lost to history.
Whether or not anybody at Saratoga Springs actually invented potato chips, the town certainly did a lot to popularize them. For years, they were known as Saratoga chips, and they are still sold under that name today.
At first, Saratoga chips were a gourmet delicacy served at fine hotels and restaurants. Diners at the Cadillac Hotel in Detroit could enjoy them with chicken salad in aspic. Passengers aboard the luxury liner R.M.S. Berengaria nibbled theirs alongside roast pheasant. Wealthy families whose cooks had mastered the art of chip making could buy a sterling silver Saratoga chip server at Tiffany for dishing them out with elegance.
Usually handmade and often served in wax paper bags, freshly fried snack chips tended to have a short shelf life, making them a hyperlocal, highly fragmented business proposition. It wasn’t until the 1930s that two companies, Lay’s and Fritos—the latter of which made their chips from corn, not potato—began their rise to becoming national brands mass-producing and distributing the popular snack foods. In time, chips became a universal treat, with potato chips alone becoming a $10 billion industry in the U.S.
Was a British Doctor the Real Inventor?
But if potato chips weren’t born in Saratoga, where did they come from? Food historians suggest they go back to at least 1817, when an English doctor named William Kitchiner came out with the first edition of his pioneering cookbook, The Cook’s Oracle, published in both British and American editions. One recipe, “potatoes fried in slices,” sounds remarkably like today’s potato chip. Later revisions referred to the dish as “potatoes fried in slices or shavings.”
While Kitchiner may have been the first to publish a potato chip recipe, that doesn’t mean he invented it. In fact, given the ubiquity of the potato—long a staple throughout the world—it seems likely that the potato chip has been invented and reinvented by countless cooks, possibly for centuries.
For serious snackers, the question of who actually invented the chip may be beside the point, anyway. The important thing is that somebody did.