Why the Candy Bar Market Exploded After World War I

H/T History.com.

A look at the candy bar through history.

By the end of the 1920s, more than 40,000 different candy bars were being made in the U.S.
 

Candy bars may seem quintessentially American, but they have origins in the World War I chocolate rations given to European soldiers. The American military followed suit, helping its doughboys develop a sweet tooth they would bring home after the war. Throughout the 1920s, thousands of small, regional confectioners emerged to meet the demand, creating a candy boom brimming with catchily named bars based on popular expressions, pop culture icons and even dance crazes. (Hello, Charleston Chew.) The goal of the most ambitious new sweets makers? To take a bite out of a candy business dominated by Hershey’s, the planet’s biggest chocolate maker.

The Military History of Chocolate

While the history of chocolate consumption stretches back 4,000 years to ancient cultures in what is today Mexico and Central America, the U.S. story of chocolate has strong military associations.

In the earliest decades of the United States, candy was quickly recognized not just as a sweet treat, but as a valuable way to fuel troops. During the Revolutionary War, chocolate, a favorite treat of George Washington, became part of his soldiers’ rations. It was prized for its combined kick of caffeine and sugar; it even served as occasional payment to American troops in lieu of money. Candy also played a role in the Civil War, used as “a provision with quick energy and lots of sugar,” says Steve Almond, author of Candyfreak: A Journey through the Chocolate Underbelly of America.

While the first chocolate bar was created by Joseph Fry in Great Britain in 1847, and Cadbury began selling individual boxes of chocolate candies there as early as 1868, it would take the outbreak of war on a global scale for the chocolate candy bar to really take off.

World War I: The Candy Bar Is Born

Two soldiers of the 351st Field Artillery which returned on the ‘Louisville’ receive candy from the Salvation Army women that welcome every troopship arriving in port, 1919.

War Department/Buyenlarge/Getty Images

In World War I, the British military gave soldiers chocolate to boost morale and energy. The Mayor of York sent a tin of hometown confectioner Rowntree’s chocolates to residents in uniform, and in 1915, every U.K. soldier abroad received a “King George Chocolate Tin.”

Not to be outdone, the American Army Quartermaster Corps solicited donations of 20-pound blocks of chocolate from confectioners back home, which they then cut down and wrapped by hand. When U.S. GIs returned from the war with an insatiable appetite for chocolate, they arrived back just before the onset of Prohibition—when Americans actively sought alternatives to alcohol to boost their energy and mood, from soda to ice cream to candy. By the end of the 1920s, more than 40,000 different candy bars were being made in the U.S., says Susan Benjamin, candy historian and author of Sweet as Sin: The Unwrapped Story of How Candy Became America’s Favorite Pleasure.

Chocolate Candy Bar Marketing: It’s All in the Name

During the candy bar boom, nearly every major city had a set of confectioners cranking out as many types of candy bars as they could, filling them with everything from nougat, marshmallow and nuts to fruits and dehydrated vegetables. (Yes, really.) Because a lack of widespread refrigeration and transportation issues remained a barrier to national distribution, regional brands dominated each market, creating bars with names that appealed to local pride. The Charleston Chew took its name from a local dance craze. The 18th Amendment Bar was born in Chicago during Prohibition. “It was the birth of modern marketing. Since most bars used the same six or seven ingredients, people were furiously trying to figure out how to differentiate their brand,” says Almond.

Candy companies often named their popular bars after pop culture icons: “Charles Lindbergh begat both The Lindy and Winning Lindy. Clara Bow begat the It bar. Dick Tracy had his own bar. So did Amos ’N’ Andy and Little Orphan Annie and Betsy Ross,” Almond says.

Baby Ruth candy advertisement, 1927.

Transcendental Graphics/Getty Images

One chocolate baron recognized for his marketing genius was Otto Schnering of the Curtiss Candy Company. He had the audacity to fudge the name of the Baby Ruth bar—claiming he named it for President Grover Cleveland’s daughter, while conveniently cashing in on the popularity of baseball legend Babe Ruth. (The slugger later tried to get in on the candy business himself with something called “Ruth’s Home Run Candy,” but Schnering boldly sued him for copyright infringement—and won.) A master of the marketing stunt, Schnering chartered a barnstorming biplane in 1923 to do aerial tricks over U.S. cities like Chicago and Pittsburgh while dropping payloads of Baby Ruth bars, complete with tiny parachutes. Later, Schnering arranged to have his Butterfinger candy bar featured in the 1934 Shirley Temple movie Baby Takes a Bow, pioneering the art of product placement with one of the biggest stars of the era.

Later, during the Great Depression, as Americans had less disposable income for treats, some makers shifted their marketing campaigns to position candy bars as a cheap meal replacement option. “Candy bars became essentially fast food, especially [later] in the Depression era, when people needed quick energy and cheap calories,” says Almond. “Candy bars like ‘Chicken Dinner’ or ‘Club Sandwich’ sent the message that if you don’t have time or money for a full meal, candy bars were a quick and affordable way to eat.”

Candy Bar Consolidation

The Depression slowed the chocolate gold rush considerably, as scarcity and high prices for raw candy ingredients like sugar helped drive many independent, regional confectioners out of business—or into buyouts with larger manufacturers. Hershey’s, for example, cut a deal to help support the popular, but financially foundering, Reese’s Cups. With the outbreak of World War II, shortages became even more pronounced, and the military commissioned chocolate rations from America’s biggest producers, spelling an end to the regional candy bar boom.

In 1937, the U.S. Army asked the Hershey Company to create the “D-Ration bar.” It had to weigh just four ounces, provide a burst of energy, not melt in high temperatures and “taste a little better than a boiled potato” to prevent soldiers from eating it too quickly. The resulting product was not known for its taste. Hershey’s launched a more palatable product for the Pacific Theater, the “Tropical bar,” in 1943. By the close of World War II, Hershey’s had manufactured more than 3 billion ration bars.

After World War II, improvements in manufacturing, transportation and refrigeration further challenged hyper-regional confectioners. Large companies bought out smaller ones and provided national distribution. Today, most American chocolate bars are manufactured by the “big three”: Hershey’s, Mars and Nestle, though individual bars like Baby Ruth, Butterfinger and PayDay harken back to the heyday of the American candy bar.

 

Why Ice Cream Soared in Popularity During Prohibition

H/T History.com.

This is a story of Prohibition that does not include speakeasies and bootlegged alcohol.

No beer? No problem. Better refrigeration, together with innovations in making and selling frozen treats, helped steer people toward this ‘refreshing and palatable food.’
 

When the Volstead Act took effect in 1920, prohibiting the manufacture and sale of alcoholic beverages in the United States, the law devastated the alcohol industry. But it helped give the nascent ice cream business a sweet boost.

Between 1919 and 1929, federal tax revenues from distilled spirits plummeted from $365 million to less than $13 million, according to the U.S. Treasury Department. The few breweries that survived to the end of Prohibition in 1933 did so by pivoting—producing everything from ceramics and farm equipment to American cheese, candy and malt syrup. Iconic breweries such as Anheuser-Busch and Yuengling turned, in part, to ice cream production.

“As men sought alternatives to having a drink at the local saloon, many ate ice cream more often,” wrote Anne Cooper Funderburg, the author of Chocolate, Strawberry, and Vanilla: A History of American Ice Cream, driving an estimated 40 percent growth in consumption between 1920 and 1929. A song from a Pacific Ice Cream Manufacturers Convention in 1920 declared, “Gone are the days when Father was a souse,” and that now, instead of beer, he brings home a brick of ice cream.

The Anti-Saloon League, the most powerful lobby for Prohibition, eagerly supported the dairy industry, trying to take credit for the growth of the ice cream market. “It is believed that this large increase in ice cream consumption was due in a large degree to the fact that men with a craving for stimulants turned readily to this refreshing and palatable food,” reported the organization’s yearbook in 1921. “The more ice cream that is used, the better it is for the consumers and the producers of milk.” 

Other factors driving the ice cream boom included the expansion of soda fountains, improved methods of refrigeration and innovations in ice cream production. The latter two, in particular, helped bring frozen desserts to a national market, with competitive development of new single-serve products like the chocolate-covered ice cream bar, the Popsicle and the ice cream-filled Dixie Cup.

Soda Fountains

A barman serving a soda to customers, c. 1920.

George Rinhart/Corbis/Getty Images

As sugary drinks became America’s favorite alcohol replacement in the 1920s, companies like Coca-Cola grew into behemoths and soda fountains replaced saloons as the place where people gathered to socialize in public. In 1922, The New York Times estimated the U.S. had more than 100,000 soda fountains—most of which were situated in drugstores—with $1 billion in sales. But ice cream played a key supporting role to soda, as fountain mixologists, not unlike saloon bartenders, concocted drinks that mixed the two—like the Coke float. Some got more creative: Proprietors of a soda fountain in Aspen, Colorado made the Aspen Crud, a cocktail of ice cream laced with bourbon, taking advantage of laws that enabled drugstores to sell alcohol for medicinal purposes.

Mass Producing Ice Cream

A man selling frozen treats from a street stall, c. 1922.

Hulton Archive/Getty Images

The surge in ice cream’s popularity during Prohibition coincided with the development of more efficient means of refrigeration at both soda fountains and private homes, as well as less labor-intensive methods to make ice cream. Countertop freezers allowed busy soda fountain operators to store large quantities of ice cream, but making it with a manual crank could be an arduous task. In 1926, Clarence Vogt, an inventor from Louisville, Kentucky, made it possible to mass produce ice cream on an industrial scale when he created the first commercially successful continuous process freezer. Vogt’s machine, which allowed ingredients to be poured in at one end and finished ice cream to come out the other, led to “true mass marketing” of ice cream and extended the boom that had begun with the onset of Prohibition, according to Diana Rosen, the author of The Ice Cream Lover’s Companion.

Put a Stick in It

Through the 1920s, several entrepreneurs were developing their own innovations to help make standardized frozen treats that were less messy and more portable, allowing them to be sold at volume in amusement parks, boardwalks and other large public venues. In January 1922, an Iowa schoolteacher named Christian Nelson patented the Eskimo Pie, a single-serve bar of vanilla ice cream covered in a thin chocolate shell. Harry Burt, a confectioner in Youngstown, Ohio, became the first to put a stick in such chocolate-covered ice cream treats with the Good Humor bar, patenting his process and machinery in 1923. Burt revolutionized the industry by launching a fleet of refrigerated trucks driven by snappily white-clad “Good Humor men” dispensing bars, cones and cups directly to neighborhoods around the country.

In 1923, Dixie Cups hit the market, offering a 2.5-ounce serving of two ice cream flavors in disposable paper cups. That year, the A&P grocery store chain put ice cream cabinets in 1,500 of its stores, making it possible for consumers to buy their favorite frozen treat at the food market.

And in 1924, Frank Epperson’s Popsicle Corporation filed for a patent on the process to create flavored ice treats on a stick, something Epperson had famously invented almost two decades earlier as a young man after leaving a syrupy drink out in the cold overnight with a stirring stick in it. That same year, the burgeoning company reported sales of 6.5 million units. Ultimately, frozen treats on a stick became the subject of numerous contentious patent-infringement lawsuits. After Epperson’s majority investor made a deal with Good Humor, the Popsicle inventor divested himself of his company, sold his patent and left the frozen treat business.

The Ice Cream Boom Fades

By the late 1920s and early ’30s, the ice cream industry was hit by a double whammy: the Great Depression and the repeal of Prohibition. After that, “World War II, with its quotas on milk and sugar, further dampened ice cream enthusiasm,” wrote Gail Damerow, the author of Ice Cream! The Whole Scoop. The industry surged again after the war, yet it was the spark of Prohibition and the temporary dismantling of one industry that helped elevate another to unprecedented heights—an impact that forever changed the U.S. ice cream business.

9 Groundbreaking Inventions by Women

H/T History.com.

I learned a lot reading this story.

Women inventors are behind a wide range of key innovations, from Kevlar to dishwashers to better life rafts.
 

Female inventors have played a large role in U.S. history, but haven’t always received credit for their work. Besides the fact that their contributions have sometimes been downplayed or overlooked, women—particularly women of color—have historically had fewer resources to apply for U.S. patents and market their inventions.

Not all of the female inventors on this list received attention for their work in their lifetime, or were able to market their inventions. But all of them contributed innovations that helped advance technology in their respective fields.

1. Life Raft

In the early 1880s, when a new wave of European immigrants were sailing to the United States, a Philadelphia inventor named Maria E. Beasley designed an improved life raft. Unlike the flat life rafts of the 1870s, Beasley’s raft had guard rails to help keep people inside during emergencies when they had to abandon ship.

Beasley patented her first life raft design in 1880 in both the United States and Great Britain, and received a second U.S. patent for an updated version of the raft in 1882. In addition to the life raft, she also invented a foot warmer, a steam generator and a barrel-hooping machine, receiving a total of 15 U.S. patents and at least two in Great Britain during her life.

2. Fold-Out Bed

In 1885, a Chicago inventor and furniture store owner named Sarah E. Goode received a patent for her “Cabinet-Bed.” The new piece of furniture was a desk that folded out into a bed, allowing the user to save space in a tiny apartment.

Goode’s invention predated the 20th century’s pull-down Murphy beds and pull-out sofas. With her Cabinet-Bed, Goode—who was born into slavery and won her freedom after the Civil War—became one of the first Black women to patent an invention with the U.S. Patent and Trademark Office.

3. Dishwasher

Josephine G. Cochran was a wealthy socialite in Shelbyville, Illinois, when she got the idea to invent a dishwasher. Cochran employed servants to perform housework in her mansion, but started washing her fine china herself when she discovered some of the servants had accidentally chipped pieces of it. Cochran found her brief exposure to housework unpleasant, and resolved to build a machine that could wash the dishes for her.

The result was the first commercially successful dishwasher, which Cochran patented in 1886. Previous attempts at dishwashers had used scrubbers, but Cochran’s design was more effective because it used water pressure to clean the dishes. With her patent secure, she founded Cochran’s Crescent Washing Machine Company. Because the machine was too expensive for most households, Cochran sold most of her dishwashers to hotels and restaurants.

4. Car Heater

The first person to patent an automobile heater was Margaret A. Wilcox, an engineer in Chicago. Wilcox’s 1893 design used heat from the car’s engine to keep drivers and passengers warm during trips. Later engineers improved upon the idea by making the heat easier to regulate.

Wilcox’s other inventions included a combined clothes-and-dishwasher, which didn’t catch on in the same way.

5. Feeding Tube

Bessie Virginia Blount, also known as Bessie Blount Griffin, was an American nurse, physical therapist, inventor, handwriting expert and possibly the first Black woman to train at Scotland Yard’s Document Division. In the 1940s, she worked with World War II veterans in New York City’s Bronx Hospital (now part of BronxCare Health System), where she taught veterans with amputations to read and write with their teeth and feet. It was during this work that Blount invented a device that her patients could use to feed themselves.

Blount’s invention involved a tube that delivered food to a person’s mouth whenever he or she bit down on it. She patented part of the design in 1948, then signed the rights to the invention over to the French government in 1951 on the advice of a religious leader (the U.S. government hadn’t shown much interest in the device).

Her invention paved the way for modern feeding tubes, which can be inserted into a person’s nose or stomach if the user can’t ingest food orally. After patenting the feeding tube, Blount continued to invent and went on to become a forensic handwriting analyst.

6. Kevlar

Stephanie L. Kwolek was a chemist who created synthetic fibers while working at DuPont’s Pioneering Research Laboratory in Wilmington, Delaware. The most famous one she created was Kevlar—a strong, lightweight and heat-resistant synthetic fiber.

Kwolek patented the process for making Kevlar in 1966. Kevlar is used in bulletproof vests and other protective equipment, and has also become a substitute for asbestos since the 1970s, when companies began to scale back on using the cancer-causing material.

7. Home Security System

Marie Van Brittan Brown was a Black nurse and inventor in New York City who, together with her husband, Albert Brown, patented the first home security system in 1969. Brown got the idea for the security system because she and her husband—an electronics technician—both worked long hours, and she often found herself coming home to their apartment alone late at night.

The system that Brown invented involved a sliding camera that could capture images through four different peepholes in her door, TV monitors to display the camera images and two-way microphones that allowed her to talk with anyone outside her door. There was also a remote to unlock the door from a distance and a button to alert police or security. This system paved the way for modern security systems, and has been cited in at least 32 patent applications that came after it.

8. Cataract Treatment

Patricia E. Bath was the first Black American to complete a residency in ophthalmology and the first Black female doctor to patent a medical device in the United States. The device she invented was the Laserphaco Probe, which removed cataracts—cloudy blemishes in the eye that can lead to vision loss.

Bath’s new way of removing cataracts was faster, more accurate and less invasive than previous methods. She earned her first U.S. patent related to the procedure in 1988, and received four other U.S. patents related to her cataract-removal innovations during her lifetime, in addition to patents in Japan, Canada and Europe. She died in 2019 at the age of 76.

9. Stem Cell Isolation

While working in Palo Alto in 1991, Asian American scientist Ann Tsukamoto was part of the team that patented the first method of isolating blood-forming stem cells. Tsukamoto holds a total of 12 U.S. patents for her stem cell research, which has helped with the development of cancer treatments.

Women in WWII Took on These Dangerous Military Jobs

H/T History.com.

Women in World War II.

Looking beyond traditional nursing or clerical roles, some women served as snipers, bomber pilots and more.
 

Women served on both sides of World War II, in official military roles that came closer to combat than ever before. The Soviet Union, in particular, mobilized its women: Upward of 800,000 would enlist in the Red Army during the war, with more than half of these serving in front-line units. British forces included many women alongside men in vital anti-aircraft units. And Nazi Germany followed suit later in the conflict, when its flagging fortunes required the nation’s full mobilization.

Of the four major powers in the conflict, only the United States resisted sending any women into combat. Still, thousands of American women did join the military in various capacities during World War II, upending generations of traditional gender roles and longstanding assumptions about female capability and courage.

Soviet Union: Bombers and Snipers

Soviet women served as scouts, anti-aircraft gunners, tank drivers and partisan fighters, but the two most dangerous—and celebrated—roles they played were as pilots and snipers.

In the fall of 1941, with invading German forces threatening Moscow, Marina Raskova (known as the “Russian Amelia Earhart”) convinced Joseph Stalin to authorize three regiments of female pilots. The most famous was the 588th Night Bomber Regiment, whose pilots hit so many of their targets that the Germans started calling them the Nachthexen, or “night witches.” Using rickety plywood planes, the women of the 588th flew more than 30,000 missions and dropped more than 23,000 tons of bombs on the Nazis; 30 of them were killed and 24 received the Hero of the Soviet Union medal, the nation’s highest award for valor.

Though nearly 2,500 Soviet women were trained as snipers, many others took on the role without formal training. Assigned to infantry battalions, female snipers were tasked with targeting German frontline officers and picking them off as they advanced. One sniper, Lyudmila Pavlichenko (aka “Lady Death”), killed a confirmed 309 Germans, including 36 enemy snipers, in less than a year of service with the Red Army’s 25th Rifle Division. Wounded four separate times, she was taken out of combat by late 1942; the Soviet government sent her to the United States, where she toured the country with Eleanor Roosevelt. She was 25 years old.

Great Britain: The ‘Ack Ack Girls’

British A.T.S. on searchlight battery, January 19, 1943.

Mirrorpix/Getty Images

In mid-1941, when the British military began using women from the Auxiliary Territorial Service (ATS) in anti-aircraft units, they made it clear that the purpose was to free up more men to fight; women were still barred from taking combat roles. The Blitz had just ended, but Germany’s Luftwaffe still ran bombing raids over London and across Britain throughout the conflict. ATS women (popularly known as Ack Ack Girls) served in mixed Royal Artillery batteries with men. Though they became skilled in spotting, or locating enemy aircraft, setting the range and bearing on the anti-aircraft guns and handling the ammunition, women were prohibited from actually pulling the trigger. As one gunnery assistant put it: “We did the same duties as the men. When they stood on guard all night they had their rifle, when we stood on guard we had a broom handle.”

Many ATS members were assigned to searchlight units, positioned around each gun complex to spotlight incoming German bombers so the gunners could take aim. Searchlights also lit the skies for returning British bombing crews and scanned the seas for approaching German vessels, among other vital tasks. Formed in October 1942 on the orders of Major-General Sir Frederick Pile, the 93rd Searchlight Regiment was Britain’s first all-female army regiment. It operated some 72 searchlights outside of London, each staffed by a dozen women (plus one man to turn on the generator and fire a machine gun, if necessary).

By war’s end, more than 74,000 British women were serving in anti-aircraft units. Overall, 389 ATS women were killed or wounded during the conflict. As Pile later wrote, “The girls lived like men, fought their lights like men and, alas, some of them died like men.”

Germany: Anti-Aircraft Units

Hanna Reitsch shakes hands with Adolf Hitler, c. 1941

Heinrich Hoffmann/ullstein bild/Getty Images

While Adolf Hitler initially insisted that women remain at home during the war and focus on their roles as wives and mothers, Germany’s increasingly desperate need for resources would lead more than 450,000 women to join auxiliary military forces.

In July 1943, German war production minister Albert Speer convinced Hitler to authorize women to serve in searchlight and anti-aircraft units with the Luftwaffe, and as many as 100,000 German women would serve in this capacity by the end of the war. As with the ATS, they were fully trained in operating anti-aircraft guns, but were barred from firing them. According to an order Hitler issued in late 1944, no German woman was to be trained in the use of weapons. Nazi propaganda warned women in the auxiliary forces not to become flintenweiber, or “gun women,” a derisive term for Soviet women fighters. In the last, desperate months of the conflict, Hitler gave in and created an experimental women’s infantry battalion, but the war ended before it could be raised.

A total of 39 women would receive Germany’s Iron Cross for duty near the front, but nearly all of them were nurses. Among those who weren’t were Hitler’s test pilots Hanna Reitsch and Melitta Schiller-Stauffenberg, two of some 60 female pilots used to ferry German military aircraft in order to free up male pilots for active duty.

United States: WACs and WASPs

Jane Tedeschi next to one of the aircraft she flew during WWII with the Women Airforce Service Pilots (WASPs), a group that performed aviation services stateside, covering for the male pilots deployed to the WWII battlefront. She was one of about 1,100 female pilots who moved planes and towed target aircraft for live-fire drills.

Andy Cross/Denver Post/Getty Images

Much has been made of the way American women served on the homefront, powering the factories that enabled the United States to become “the arsenal of democracy.” As in past conflicts, tens of thousands of American women also served courageously as nurses, with more than 1,600 members of the U.S. Army Nurse Corps alone earning medals, citations and commendations. But many other women served the U.S. war effort in an active—and often dangerous—capacity. Though the United States did not send any women into combat during World War II, the conflict did see the nation take steps toward integrating women into the military in a new way.

After heated debate in Congress over the inversion of traditional gender roles implied by women’s enlistment in the armed forces, the Army became the first to enlist women, creating the Women’s Auxiliary Army Corps (WAAC) in May 1942. In July 1943, thanks to the efforts of director Oveta Culp Hobby, the WAAC was converted to regular army status as the Women’s Army Corps (WAC).

Influenced by the performance of female soldiers in Europe, Army Chief of Staff George C. Marshall authorized some in the WAAC to be trained on anti-aircraft batteries and searchlight units, like their British and German counterparts. But by mid-1943, he called off the experiment, fearing public outcry and Congressional opposition to the idea of women in combat roles. More than 150,000 women served in the WAC during the war, with thousands sent to the European and Pacific theaters. None saw combat, but their brave service would lead to greater acceptance of the idea of women in the military.

American women also took to the skies during World War II, as the U.S. Army Air Forces (predecessor of the Air Force) began training women to fly military aircraft in order to free male pilots for combat duty. In the Women Airforce Service Pilots (WASP) program, women flew B-26 and B-29 bombers and other heavy planes between factories and military bases around the country; tested new and repaired planes; and towed targets for gunners in the air and on the ground to practice shooting, using live ammunition.

By December 1944, when Congress mandated the closure of the elite program (more than 25,000 women applied during the war, but only 1,100 would end up serving), 38 WASP pilots had lost their lives due to plane crashes or other accidents in the line of duty. Program records were classified, and all official traces of the program disappeared until the late 1970s, when President Jimmy Carter finally granted the pioneering female aviators the status of U.S. military veterans. 

 

Razors

H/T SoftSchools.com.

A look back through the sands of time at razors.

For as long as humans have walked the Earth, they have desired to tame their unwieldy manes. In prehistoric times, early razors were fashioned from sharpened shark teeth and clamshells to trim back long facial and head hair. As human society developed, so did the razor. By the 4th millennium BC, razors made of gold and copper were being crafted and buried in Egyptian tombs. Further ancient developments carried the razor all the way to Rome by the 6th century BC.

Development and use of the razor continued through the following centuries. These advancements allowed for the introduction of the first modern straight razor in Sheffield, England, during the 18th century. Made from steel, this razor was only reluctantly accepted throughout Europe over the next 100 years. With the development of the Sheffield steel razor, wealthy members of society would often have their servants shave them.

Daily shaving, however, was not common at this time. The introduction of gas masks during World War I spurred the modern practice of shaving every day: soldiers needed to keep their faces clean-shaven to ensure their gas masks fit securely and snugly. This practice continued after the war, and is still followed by many today.

Until the 1950s, straight razors were the common form of razor used by barbershops and ordinary men. However, an invention by King C. Gillette changed this: a safety razor that used replaceable blades. Thanks to an extremely effective marketing campaign, which portrayed straight razors as ineffective, this new razor design was a huge success.

Today, modern hand-held razors are based on Gillette’s original design, though new designs such as the electric razor have been developed to compete with this gold standard. As razors have been with humans since the dawn of time, their use and advancement will likely continue for many centuries to come.

 

A Brief History of the Invention of the Home Security Alarm

H/T Smithsonian Magazine.

A hardworking nurse envisioned a new way to know who was at the door

Left, a portion of the patent plan designed by Marie Van Brittan Brown and her husband Albert, right. (Marie Van Brittan Brown and Albert L. Brown, courtesy U.S. Patent and Trademark Office; New York Times / Redux)
 
 

Marie Van Brittan Brown, an African American nurse living in Jamaica, Queens in the 1960s, was working odd shifts, as was her husband, Albert, an electronics technician. When she arrived home late, she sometimes felt afraid. Serious crimes in Queens jumped nearly 32 percent from 1960 to 1965, and police were slow to respond to emergency calls. Marie wanted to feel safer at home.

 

Enlisting her husband’s electrical expertise, Marie conceived a contraption that could be affixed to the front door. It would offer four peepholes, and through these, a motorized video camera on the inside could view visitors of different heights as the occupant toggled the camera up and down. The camera was connected to a television monitor inside. A microphone on the outside of the door and a speaker inside allowed an occupant to interrogate a visitor, while an alarm could alert police via radio. Closed-circuit television, invented during World War II for military use, was not widespread in the 1960s, and the Browns proposed using the technology to create the first modern home security system.

They filed a patent for their device in 1966, citing Marie as lead inventor. It was approved three years later. “The equipment is not in production,” the New York Times reported, “but the Browns hope to interest manufacturers and home builders.”

The Browns’ 1969 patent plan for an elaborate home security system suggests safety and relaxation can go hand in hand. (Marie Van Brittan Brown and Albert L. Brown, courtesy U.S. Patent and Trademark Office)

That never happened, presumably because the Browns’ system was ahead of its time. “The cost of installing it would be pretty high,” says Robert McCrie, an emergency management expert at John Jay College of Criminal Justice in Manhattan.

Marie’s invention, though it didn’t benefit them financially, would earn the Browns a measure of recognition in the tech world: The predecessor of today’s home security systems, it has been cited in 35 U.S. patents. Companies first offered CCTV to residential consumers around 2005, but Marie never saw her vision realized; she died in Queens in 1999, at the age of 76.

As the tech has become cheaper and smarter, home security has grown into a $4.8 billion business in North America and is expected to triple by 2024.

A Brief History of Peanut Butter

H/T Smithsonian Magazine.

The bizarre sanitarium staple that became a spreadable obsession

Veteran food critic Florence Fabricant has called peanut butter “the pâté of childhood.” (Dan Saelinger)
 

North Americans weren’t the first to grind peanuts—the Inca beat us to it by a few hundred years—but peanut butter reappeared in the modern world because of an American, the doctor, nutritionist and cereal pioneer John Harvey Kellogg, who filed a patent for a proto-peanut butter in 1895. Kellogg’s “food compound” involved boiling nuts and grinding them into an easily digestible paste for patients at the Battle Creek Sanitarium, a spa for all kinds of ailments. The original patent didn’t specify what type of nut to use, and Kellogg experimented with almonds as well as peanuts, which had the virtue of being cheaper. While modern peanut butter enthusiasts would likely find Kellogg’s compound bland, Kellogg called it “the most delicious nut butter you ever tasted in your life.”

 

A Seventh-Day Adventist, Kellogg endorsed a plant-based diet and promoted peanut butter as a healthy alternative to meat, which he saw as a digestive irritant and, worse, a sinful sexual stimulant. His efforts and his elite clientele, which included Amelia Earhart, Sojourner Truth and Henry Ford, helped establish peanut butter as a delicacy. As early as 1896, Good Housekeeping encouraged women to make their own with a meat grinder, and suggested pairing the spread with bread. “The active brains of American inventors have found new economic uses for the peanut,” the Chicago Tribune rhapsodized in July 1897.

 

“It’s the Great Depression that makes the PB&J the core of childhood food,” food historian Andrew F. Smith has said. (Buyenlarge / Getty Images)

Before the end of the century, Joseph Lambert, an employee at Kellogg’s sanitarium who may have been the first person to make the doctor’s peanut butter, had invented machinery to roast and grind peanuts on a larger scale. He launched the Lambert Food Company, selling nut butter and the mills to make it, seeding countless other peanut butter businesses. As manufacturing scaled up, prices came down. A 1908 ad for the Delaware-based Loeber’s peanut butter—since discontinued—claimed that just 10 cents’ worth of peanuts contained six times the energy of a porterhouse steak. Technological innovations would continue to transform the product into a staple, something Yanks couldn’t do without and many a foreigner considered appalling.

By World War I, U.S. consumers—whether convinced by Kellogg’s nutty nutrition advice or not—turned to peanuts as a result of meat rationing. Government pamphlets promoted “meatless Mondays,” with peanuts high on the menu. Americans “soon may be eating peanut bread, spread with peanut butter, and using peanut oil for our salad,” the Daily Missourian reported in 1917, citing “the exigencies of war.”

The nation’s food scientists are nothing if not ingenious, and peanut butter posed a slippery problem that cried out for a solution. Manufacturers sold tubs of peanut butter to local grocers, and advised them to stir frequently with a wooden paddle, according to Andrew Smith, a food historian. Without regular effort, the oil would separate out and spoil. Then, in 1921, a Californian named Joseph Rosefield filed a patent for applying a chemical process called partial hydrogenation to peanut butter, a method by which the main naturally occurring oil in peanut butter, which is liquid at room temperature, is converted into an oil that’s solid or semisolid at room temperature and thus remains blended; the practice had been used to make substitutes for butter and lard, like Crisco, but Rosefield was the first to apply it to peanut butter. This more stable spread could be shipped across the country, stocked in warehouses and left on shelves, clearing the way for the national brands we all know today. The only invention that did more than hydrogenation to cement peanut butter in the hearts (and mouths) of America’s youth was sliced bread—introduced by a St. Louis baker in the late 1920s—which made it easy for kids to construct their own PB&Js. (In this century, the average American kid eats some 1,500 peanut butter and jelly sandwiches before graduating from high school.)

Rosefield went on to found Skippy, which debuted crunchy peanut butter and wide-mouth jars in the 1930s. In World War II, tins of (hydrogenated) Skippy were shipped with service members overseas, while the return of meat rationing at home again led civilians to peanut butter. Even today, when American expats are looking for a peanut butter fix, they often seek out military bases: They’re guaranteed to stock it.

But while peanut butter’s popularity abroad is growing—in 2020, peanut butter sales in the United Kingdom overtook sales of the Brits’ beloved jam—enjoying the spread is still largely an American quirk. “People say to me all the time, ‘When did you know that you had fully become an American?’” Ana Navarro, a Nicaraguan-born political commentator, told NPR in 2017. “And I say, ‘The day I realized I loved peanut butter.’”

Though the United States lags behind China and India in peanut harvest, Americans still eat far more of the spread than the people in any other country: It’s a gooey taste of nostalgia, for childhood and for American history. “What’s more sacred than peanut butter?” Iowa Senator Tom Harkin asked in 2009, after a salmonella outbreak was traced back to tainted jars. By 2020, when Skippy and Jif released their latest peanut butter innovation—squeezable tubes—nearly 90 percent of American households reported consuming peanut butter.

The ubiquity of this aromatic spread has even figured in the nation’s response to Covid-19. As evidence emerged last spring that many Covid patients were losing their sense of smell and taste, Yale University’s Dana Small, a psychologist and neuroscientist, devised a smell test to identify asymptomatic carriers. In a small, three-month study of health care workers in New Haven, everyone who reported a severe loss of smell using the peanut butter test later tested positive. “What food do most people in the U.S. have in their cupboards that provides a strong, familiar odor?” Small asks. “That’s what led us to peanut butter.”

Sustainable

George Washington Carver’s research was about more than peanuts
By Emily Moon

 

 

Carver in his laboratory, circa 1935. (Hulton Archive / Getty Images)

No American is more closely associated with peanuts than George Washington Carver, who developed hundreds of uses for them, from Worcestershire sauce to shaving cream to paper. But our insatiable curiosity for peanuts, scholars say, has obscured Carver’s greatest agricultural achievement: helping black farmers prosper, free of the tyranny of cotton.

Born enslaved in Missouri around 1864 and trained in Iowa as a botanist, Carver took over the agriculture department at the Tuskegee Institute, in Alabama, in 1896. His hope was to aid black farmers, most of whom were cotton sharecroppers trapped in perpetual debt to white plantation owners. “I came here solely for the benefit of my people,” he wrote to colleagues on his arrival.

He found that cotton had stripped the region’s soil of its nutrients, and yet landowners were prohibiting black farmers from planting food crops. So Carver began experimenting with plants like peanuts and sweet potatoes, which could replenish the nitrogen that cotton leached and, grown discreetly, could also help farmers feed their families. In classes and at conferences and county fairs, Carver showed often packed crowds how to raise these crops.

Since his death in 1943, many of the practices Carver advocated—organic fertilizer, reusing food waste, crop rotation—have become crucial to the sustainable agriculture movement. Mark Hersey, a historian at Mississippi State University, says Carver’s most prescient innovation was a truly holistic approach to farming.

“Well before there was an environmental justice movement, black environmental thinkers connected land exploitation and racial exploitation,” says Hersey. A true accounting of American conservation, he says, would put Carver at the forefront.

Soldiers’ Rations Through History: From Live Hogs to Indestructible MREs

H/T History.com.

Feeding the troops through the years.

I have never tasted them, but I have heard MREs described as “Meals Rarely Edible.”

While ancient Roman armies largely hunted their rations during war campaigns, modern soldiers now have access to pizza that can last as long as three years.
 

As the saying goes, an army marches on its stomach, relying on good and plentiful food to fuel its ability to fight. 

For contemporary U.S. armed forces in combat, that usually means Meals, Ready-to-Eat, or MREs. The U.S. military switched to MREs in the early 1980s, replacing the much-derided canned rations that had sustained troops from WWII through most of the Vietnam War. In September 2018, specially engineered pizza that can last three years was added to 24 available MRE options, as part of a larger strategy to improve morale (and avoid something called “menu fatigue”).

Throughout history, feeding troops has been a challenge for all of the greatest fighting forces, from the Roman legions to the hordes of Genghis Khan to Napoleon’s chasseurs. Here’s how they did it:

Roman soldier hunting a wild boar.

PHAS/UIG/Getty Images

The Roman Legions

Roman armies hunted everything that was available, archaeological remains of wild animals show, says Thomas R. Martin, a professor of Classics at the College of the Holy Cross. From the limited evidence of what the administration in Rome provided the soldiers, he adds, the most important source of calories was carbohydrates: barley or wheat. One source says soldiers were given one pound of meat daily. “For an army you have to kill 120 sheep a day just for the meat ration. Or 60 hogs,” says Martin.

Whatever the exact amount, it would not be enough to sustain a Roman soldier, who was “a mule more than anything else,” says Martin. They carried very heavy gear, on bad roads, and that’s when they were not expending calories fighting. With their food they were given wine—a diluted version of what we’re used to—or something closer to vinegar that would help reduce bacteria in their drinking water. For their supply of fat, Roman troops, unsurprisingly, looked to olive oil.

Crusaders

During the Crusades, the average Christian soldier in a siege would have some dried meat and grain to make things like porridge. But this was food they would have brought with them, supplemented with fruits and vegetables or cheese purchased locally. During the First Crusade, soldiers would have provided their own food stores, which they would have bought by mortgaging their property or selling possessions. Later, during crusades like those of the early 13th century called by Pope Innocent III, deals were made with the Venetian fleet and merchants to keep soldiers supplied.

During battles, “if crusaders got to the Muslim camp they would stop fighting and start eating. And it would cost them the battle. It happened twice at the siege of Acre,” says John Hosler, associate professor of military history at the U.S. Army Command & General Staff College, a medievalist military expert and author of The Siege of Acre, 1189-1191. At one point in the Third Crusade, an observer noted several kitchens in the sultan Saladin’s camp, with up to nine cauldrons each. Those cauldrons were substantial—Hosler points out you could fit four cows’ heads in each. The Christian invaders had nothing comparable. 

Mongolian cooking.

Christophel Fine Art/UIG/Getty Images

Genghis Khan’s Mongol Warriors

The Mongol diet “was not gourmet,” says Morris Rossabi, a historian and author of The Mongols and Global History. In the early 13th century, when Genghis Khan was conquering swaths of Asia (mostly in the territory we’d now call China), his horde wasn’t able to carry much. Warriors were supplied by their own households, and as territories were conquered, the Mongols came in contact with foodstuffs like wine. (Their homegrown brand of liquor was fermented mare’s milk called airag, or kumis.) 

The Mongolian lands were not particularly arable, nor did the Mongols stay in one place for a long time, so fruits and vegetables weren’t staples. The Mongols brought their herds of cows and sheep with them on campaigns. When herds were unavailable, the horsemen would hunt (dogs, marmots and rabbits) or subsist on dried milk curd, cured meat and both fresh and fermented mare’s milk.

Janissaries gathering, including a head cook and water bearer.

DeAgostini/Getty Images

The Ottoman Empire: The Janissaries

At the height of its power, in the late 17th century, the Ottoman Empire was a massive horseshoe around the Mediterranean, including huge swaths of North Africa, the Middle East, modern-day Turkey and Eastern Europe. The Janissaries, elite foot soldiers and bodyguards to the empire’s sultan, are considered Europe’s first modern standing army. 

Janissaries ate well, according to research by Virginia H. Aksan, professor emerita at McMaster University and a leading scholar of the Ottoman Empire. The soldiers were fueled, she writes, with “fresh baked bread, biscuit when bread was unavailable; a daily meat ration (lamb and mutton) of approximately 200 grams; honey, coffee, rice, bulgur and barley for the horses.”

Above all, the biscuit appears to have held primacy in sustaining the soldiers. One observer noted 105 ovens in Istanbul that were solely dedicated to biscuit-baking for the military. Another wrote angrily about biscuit bakers hoarding excess flour for profit and replacing it with dirt, resulting in the death of many soldiers.

Facsimile of a declaration about the scarcity of food during the American Revolution.

Universal History Archive/UIG/Getty Images

Continental Troops in the American Revolution

George Washington—along with his quartermaster and commissary general—had major problems feeding the Continental army. Congress lacked taxing authority and thereby lacked the funds to purchase supplies. It was a problem compounded by transportation and other supply issues. The result, according to Joseph Glatthaar, a professor of history at University of North Carolina-Chapel Hill, was that soldiers would often go days without a ration. “You’d get a little flour and maybe some meat and often the meat is pretty bad,” he says. 

In 1775, Congress determined a uniform ration that included one pound of beef (or three-quarters of a pound of pork or one pound of salted fish), and one pound of flour or bread per day; three pounds of peas or beans per week, one pint of milk per day, one pint of rice per week, one quart of spruce beer or cider per day, and a little molasses. (Later vinegar was added.) Because army leaders were rarely able to deliver, soldiers would beg from civilians and supplement with whatever animals they could find. Congress pressured Washington to seize food—paid for with low value paper currency (effectively an IOU)—but General Washington worried the practice would alienate the colonials.

Napoleon eating his soldiers’ bread.

Christophel Fine Art/UIG/Getty Images

Napoleon’s Army

“On campaign, Napoleon’s soldiers spent most of their time desperately hungry,” says Charles Esdaile, professor of history at University of Liverpool. When all was going to plan, French rations included 24 ounces of bread, a half-pound of meat, an ounce of rice or two ounces of dried beans or peas or lentils, a quart of wine, a gill (roughly a quarter pint) of brandy and a half gill of vinegar. (French measurements are slightly different, so these amounts are approximate.) When bread was unavailable, rough little doughboys would be made from flour, salt and water, baked in the fire, or mixed with stew.

What helped sustain French troops was that European agriculture had switched toward crops like potatoes and corn, which one can eat almost right out of the ground. “French loaves come in long sticks; baguettes,” says Esdaile. “The story is that the baguette was developed so that French soldiers could carry their bread in the legs of their trousers.”

The officers’ mess of Company D, 93d New York Infantry during the Civil War, circa 1863. 

Timothy H. O’Sullivan/Buyenlarge/Getty Images

The Civil War: Union Troops

The Union Army in the American Civil War had a standard ration: roughly three-quarters of a pound of meat, a pound of flour or cornmeal, some kind of vegetable and vinegar and molasses. “If you received the standard ration, it would be substantial,” says Glatthaar. “Over time that did not become practical; they began issuing hardtack biscuits called salt cakes, as well as salted meat and dehydrated vegetables.” These were made with flour and water and then dried so they’d last longer.

During campaigns, especially as the Union soldiers moved South, seasonal fruits and vegetables, like apples and sweet potatoes, could be pillaged from orchards and farms. Additionally, soldiers would receive care packages from home, as the Union postal system was fairly reliable throughout the war. As for water, both the Union and Confederate armies could easily rely on lakes and streams, as water sources were rarely contaminated.

Soldiers unpacking boxes of tobacco, chewing-gum, chocolate, tooth powder and other rations.

LAPI/Roger Viollet/Getty Images

World War II: The G.I.

For U.S. troops, there were two major types of rations during World War II: the C-Ration (for combat troops) and the K-Ration (less bulky and initially developed for airborne regiments and messengers). “A version of the C-Ration had six containers in one crate, and what’s in a C-ration is going to vary,” says Glatthaar. “You’re going to have a main course—like franks and beans—some cigarettes, some canned fruit, some chewing gum, chocolate bars, some instant coffee, some toilet paper. There’s some processed cheese and some biscuits, but really they’re crackers. And you also get a matchbook.”

Rations, designed to provide three meals—and approximately 3,600 calories—each day, were almost universally unpopular. Later, soldiers would get powdered drinks like lemonade and bouillon, and eventually sweetened cocoa. K-Rations would have three “meals”: a breakfast, lunch and dinner with four ounces of meat and/or eggs, cheese spread, “biscuits,” candy, gum, salt tablets and a sugary drink. There were also cigarettes, a wooden spoon and toilet paper.

MRE from 1981.

Digital Commonwealth/CC BY 3.0

Vietnam: From MCI to MRE

From 1958 to 1981, U.S. combat rations were known as the Meal, Combat, Individual, or MCI; they were eventually replaced by the Meal, Ready-to-Eat (MRE). In Vietnam, these were distributed to combat soldiers in a cardboard box, which contained 1,200 calories through a can of meat (like ham and lima beans, or turkey loaf), a can of “bread” which could be crackers or hardtack or cookies, and a can of dessert, like applesauce, sliced peaches or pound cake.

A full ration could be bulky, so troops often disassembled it, taking what they needed on patrol by placing the cans into socks which they could tie to their packs. In his book, Vietnam: An Epic Tragedy 1945-1975, Max Hastings explains how meals were cooked by punching holes in a ration tin and using a C4 explosive to heat it. Hastings also writes about the pills that troops consumed daily, including a malaria tablet, salt pills that could be sucked on, as well as Lomotil tablets, taken four times per day to control diarrhea brought on by the Halazone troops used to purify their water.

How Canned Food Revolutionized The Way We Eat

H/T History.com.

A small look at the history of canned food.

In the 18th century, the French government hosted a competition to find a better means of food preservation. The winning invention changed how the world ate.
 

From pickling and salting to smoking and drying, humans have been finding ways to make food last longer since prehistoric times. But by the 18th century, an efficient—and truly effective—means of preservation remained elusive. 

In 1795, the French government decided to do something about it. That year, the country was fighting battles in Italy, the Netherlands, Germany and the Caribbean, highlighting the need for a stable source of food for far-flung soldiers and seamen. France’s leaders decided to offer a 12,000-franc prize through the Society for the Encouragement of Industry for a breakthrough in the preservation of food.

Nicolas Appert, a young chef from the region of Champagne, was determined to win. Appert, who had worked as a chef for the French nobility, dove into the study of food preservation. He eventually came up with a radical innovation: food packed in champagne bottles, sealed airtight with an oddly effective mixture of cheese and lime. Appert’s discovery built on earlier imperfect techniques, which either removed air or preserved food by heat but hadn’t managed to do both.

Running a bustling lab and factory, Appert soon progressed from champagne bottles to wide-necked glass containers. In 1803 his preserved foods (which came to include vegetables, fruit, meat, dairy and fish) were sent out for sea trials with the French navy. By 1804, his factory had begun to experiment with meat packed in tin cans, which he soldered shut and then observed for months for signs of swelling. Those that didn’t swell were deemed safe for sale and long-term storage.

In 1806 the legendary gastronomist Grimod de la Reynière wrote glowingly of Appert, noting that his canned fresh peas were “green, tender and more flavorful than those eaten at the height of the season.” Three years later, Appert was officially awarded the government’s prize, with the stipulation that he publish his method. He did so in 1810, as The Art of Preserving, for Several Years, all Animal and Vegetable Substances.

Nicolas Appert (1750-1841).

Boyer/Roger Viollet/Getty Images

Appert’s process (which was quickly built upon by canners across the English Channel) was all the more amazing because it predated Louis Pasteur’s discoveries of germ growth and sterilization by more than 50 years. Canned food also predated, by around 30 years, the can opener itself. The first metal canisters were made of tin-plated steel or even cast iron, with heavy lids that had to be chiseled open or stabbed through with soldiers’ bayonets.

After winning the prize, Appert spent many more years working to improve his method amidst the chaos of post-Napoleonic France. His factories remained innovative but unprofitable, and he died a poor man in 1841 and was buried in a common grave. By then variants of his process were used to can foods ranging from New York oysters and Nantes sardines to Italian fruit and Pennsylvania tomatoes.

The availability of canned food played a crucial role in the 19th century, feeding the enormous armies of the Crimean War, the U.S. Civil War and the Franco-Prussian War, and offering explorers and colonialists a taste of home in unfamiliar lands. Following the global depression of 1873, U.S. exports of canned foods boomed, led by the Campbell, Heinz and Borden companies. In 1904, the Max Ams Machine Company of New York patented the double-seam process used in most modern food cans. Today a double-seam machine can safely seal more than 2,000 cans a minute—a long way indeed from Appert’s pea-packed bottles.

Q-tips – History of Q-tips

H/T SoftSchools.com.

I have wondered how Q-tips came to be.

Found stored in bathrooms throughout the country, the Q-tip is a widely used tool for applying cosmetics, cleaning ears, and assisting in numerous other applications. Although it came from humble beginnings, the Q-tip has spread into every grocery store and household in the nation.

The first Q-tip was invented in 1920 by the Polish-born American Leo Gerstenzang. He noticed that his wife was covering a toothpick with cotton in order to clean her baby’s ears. Obviously, due to the toothpick’s pointed end, this was extremely dangerous for the child, with even one wrong move resulting in a serious ear wound. Therefore, Leo decided to create a much safer cotton swab for the same purpose.

Although the design was simple, the development of the cotton swab took serious experimentation. First, Leo wanted to ensure the wood wouldn’t splinter in the baby’s ears. Next, he needed to guarantee the cotton would remain on the swab and wouldn’t leave residual particles within the ear. Finally, though, he found the right formula for the cotton swab’s structure. Now, all he needed was a name.

He chose Baby Gays as the name of the first cotton swab. Although this name would be considered quite unusual by today’s standards, Baby Gays were a huge hit! By 1926 he added “Q-Tips” in front of the old “Baby Gays” title to create the first “Q-Tips Baby Gays.” The “Q” stood for quality. Eventually, however, the “Baby Gays” portion of the name was dropped, leaving the modern title of “Q-Tip.”

Today, Q-tips are extremely common and used for a number of purposes in and outside of a bathroom setting. Since their invention in the 1920s, Q-tips have undergone various advancements, including ditching the wood for paper. One of the biggest changes, however, concerns how they are used: although the Q-tip was originally created for cleaning the ears, most doctors now recommend that one never be stuck into the ear canal. Despite this, the Q-tip’s broad range of uses will allow it to remain a staple product found in bathrooms throughout the country.