A Brief History of Pickles

H/T Mental Floss.

I do not know anyone who does not like pickles.

Is there an alternate timeline where America is known as the United States of the Pickle-Dealer? It seems unlikely, but there’s an element of truth to this half-sour hypothetical. Amerigo Vespucci didn’t discover the Americas, contrary to what the map-makers who named the continents believed, but his given name did end up lending itself to the so-called “new world.” And Ralph Waldo Emerson once called Vespucci “the pickle-dealer at Seville,” a derisive label that may have stretched the truth a bit, but pointed towards a very real part of the itinerant Italian’s biography.

Before traveling to the New World himself, Vespucci worked as a ship chandler—someone who sold supplies to seafaring merchants and explorers. These supplies included foods like meat, fish, and vegetables that had been pickled, which meant they would stay preserved beneath a ship’s deck for months. Without pickling, expeditions had to rely on dried foods and ingredients with naturally long shelf lives for sustenance. Much of the time, this limited diet wasn’t enough to provide crewmembers the nutrition they needed for the journey ahead. This made pickle sellers like Vespucci indispensable during the golden age of exploration. Vespucci even supplied Christopher Columbus’s later voyages across the Atlantic with his briny goods. So while he wasn’t the world’s most important explorer, Vespucci’s pickles may have changed history by preventing untold bouts of scurvy.

And pickles weren’t just enjoyed by 15th century sailors. From ancient Mesopotamia to New York deli counters, they’ve played a vital role in the global culinary scene. But where do pickles come from? How did the cucumber become the standard-issue pickling vegetable in the States? And what exactly is a pickle, anyway?

WHAT PICKLES A PICKLE

The verb “to pickle” means to preserve something in a solution. That solution is often vinegar, which is, at its most basic, made of water and acetic acid. Most bacteria can’t flourish in highly acidic environments, so submerging a perishable food in vinegar helps create a sort of natural force field against the microbes that cause spoilage.

Another common pickling solution is brine, a.k.a. salty water. The brining method also relies on acid’s preserving properties, but the acid isn’t added by the pickle maker. It’s introduced by bacteria via a process called fermentation: Lactobacillus bacteria consume carbohydrates and excrete lactic acid, so if you leave a jar of vegetables in saltwater, those bacteria will eventually turn the briny solution into an acidic one.

Vegetables soaked in microbe excrement may sound unappetizing, but these bacteria and the acid they produce are perfectly safe to eat. They’re even beneficial. Lactic acid protects pickles from other, harmful organisms, while Lactobacillus bacteria can boost the health of your gut’s microbiome.

CUCUMBERS IN A PICKLE

Pickles of all kinds were a hit with the ancient world. It’s thought that the Ancient Mesopotamians were the first to enjoy some pickled dishes, and Herodotus noted the Ancient Egyptians ate fish preserved with brine. Columella proclaimed that “the use of vinegar and hard brine is very necessary, they say, for the making of preserves.”

But when did cucumbers enter the briny equation? While loads of websites and books talk about ancient Mediterranean peoples enjoying pickled cucumbers, according to a 2012 paper in the Annals of Botany, it’s actually unclear when cucumbers arrived in the Mediterranean region. There are definitely early accounts that use words that people have translated as cucumber, but according to the paper, the texts in question are describing something more akin to snake melons. The evidence suggests it’s not until the medieval era that Europeans were able to enjoy a cucumber pickle with their sandwich, as cukes made their way to the West via two independent paths: “Overland from Persia into eastern and northern Europe,” before the Islamic conquests, and a later diffusion into western and southern Europe, which the paper’s authors peg to a primarily “maritime route from Persia or the Indian subcontinent into Andalusia” in the southern part of present-day Spain.

As the centuries progressed, pickles continued to win famous fans. Queen Elizabeth I reportedly enjoyed them, and William Shakespeare liked them enough to reference them numerous times in his work. He even helped build a new idiom around the word when he had The Tempest’s King Alonso ask the court jester Trinculo, “how camest thou in this pickle?” Merriam-Webster speculates that the Bard may have been playing off a Dutch expression that translates to something like “sit in the pickle,” though given Trinculo’s penchant for imbibing, the line may have also been a reference to the jester’s preferred method of preservation. In any case, being “in a pickle” is now widely understood to describe any difficult situation (and—as The Sandlot‘s Benny “the Jet” Rodriguez taught us—has a specific, related meaning in baseball, used when a runner is caught between two bases and is at risk of being tagged out).

Scottish doctor James Lind discussed how pickles could fight scurvy, noting how the “Dutch sailors are much less liable to the scurvy than the English, owing to this pickled vegetable carried out to sea.” The pickled vegetable in question was cabbage. And Captain James Cook was such a proponent of what he called Sour Krout that he gave his officers as much as they wanted, knowing that the crew would eat it as soon as they saw the officers liked it.

But not everyone was a fan. John Harvey Kellogg, who as we’ve previously discussed was deeply concerned about eating food with any known flavor, felt pickles were one of the “stimulating foods” that needed to be avoided.

THE BIG DILL WITH PICKLE BRINES

For most of pickling history, people have added spices and aromatics to their pickle brines. Ingredients like garlic, mustard seeds, cinnamon, and cloves all add flavor to pickles, but that’s not the only purpose they serve. These spices all have antimicrobial properties, which could partially explain why they were added to pickle recipes in the first place.

Dill, perhaps the ingredient most closely associated with pickles today, is also antimicrobial. The herb has been found in Ancient Egyptian tombs, but it was hugely popular in Ancient Rome, where it spread alongside the Empire itself. Eventually, it found its way into Eastern European cuisine—and into pickling solutions. Pickles were already an important part of Eastern European diets: They provided a refreshing and nutritious contrast to the heavy, often-bland foods that were available in colder months, and it was customary for families to pickle barrels full of vegetables in the fall so they would have enough to last them through winter. Dill became a common ingredient in pickle brines.

When large numbers of Ashkenazi Jews immigrated from Eastern Europe to New York in the 19th and 20th centuries, they brought their pickle-making traditions with them. A classic kosher pickle is made with cucumbers fermented in a salt brine and flavored with garlic, dill, and spices. There are two main types of kosher pickles: crisp, bright-green half-sour pickles and the duller green full-sours. The only difference between the two varieties is that half-sours have a shorter fermentation time. (“Kosher pickles,” by the way, aren’t necessarily kosher. Early kosher pickles may have been made in accordance with Jewish law, but today the word is used to describe any pickles made in the traditional New York style.)

Initially, Jewish pickle-makers sold their products out of pushcarts to their immigrant neighbors. When Jewish-owned delis began popping up around New York City, pickles were a natural addition to plates of fatty lunch meat. And today, no matter where in the country you are, dill pickles and sandwiches are a common pairing.

THE ORIGINS OF BREAD AND BUTTER PICKLES

Some people prefer bread and butter pickles, which are made by adding something sweet, like brown sugar or sugar syrup, to the pickling brine, and which generally omit the garlic that gives kosher pickles their distinctive flavor. But where does the name “bread and butter” come from?

It turns out it’s a bit hard to pin down the origin of the unusual pickle name. Some say it’s a holdover from the Great Depression, when families would eat simple sandwiches of bread, butter, and pickles. People may have done that, but if you’re looking for a written record, it seems that one of the first known uses of the term came when Omar and Cora Fanning filed to trademark the logo of their product, “Fanning’s Bread and Butter Pickles,” back in 1923. GFA Brands, which at one point owned the company that came to be known as Mrs. Fanning’s, suggested that the “bread and butter” label came from a bartering system the Fannings once used. In this version of the story, the Fannings traded their delicious pickled cukes for groceries, including bread and butter.

THE PICKLE GOES MAINSTREAM

As pickles became more popular, American food companies hopped on the pickle-wagon. Heinz started selling them in the 1800s, and at the 1893 World’s Fair, H.J. Heinz lured visitors to his out-of-the-way booth by giving away free pickle pins. The promotion was so successful that the company featured a pickle in its logo for more than a century.

Heinz was the business to beat in the pickle industry until the 1970s. That’s when Vlasic launched an ad campaign featuring a cartoon stork who delivered pickles instead of babies. The advertising approach worked—it played on the belief that women crave pickles when they’re pregnant. At one point, Vlasic even adopted the slogan “the pickle pregnant women crave.”

And that’s only the tip of the strange spear of this pickle marketing story. A 1973 newspaper article describes an ad in which a husband tells his wife, “Sweetie, it’s time for your 4 o’clock pickle.” Big no comment from me on that one. Even the stork angle was part of a bizarre extended Vlasic universe mythology wherein life had been good for storks during the baby boom in the United States. Once the boom ended, the stork had to find a new job, and wound up delivering Vlasic pickles.

PICKLES FROM AROUND THE GLOBE

It’s not just cucumbers that get pickled—there are many notable pickles from around the world. In Korea, the pickle of choice is kimchi. Like pickle, the word kimchi describes both a process and a food. Kimchied vegetables are traditionally salted, covered in a mixture of garlic, ginger, chili peppers, and fish sauce, and pickled in lactic acid via fermentation. Traditionally kimchi is made with cabbage, but any number of vegetables—including carrots, cucumbers, and radishes—can be made into kimchi. The food is an integral part of Korean cuisine, and can be served with almost any meal. Some families even own dedicated kimchi fridges for storing their mixtures in the ideal environment for fermentation.

But kimchi isn’t the only fermented cabbage out there. Sauerkraut is a staple of many European cuisines. It’s cabbage that’s been preserved through lacto-fermentation, but unlike kimchi, it doesn’t contain any seafood or bold spices. The name means “sour cabbage” in German, but the condiment might not have originated in Europe at all. Food historian Joyce Toomre suggests it originated in China, and according to legend, laborers building the Great Wall first made it by pickling shredded cabbage in rice wine. The dish allegedly traveled west by way of the Mongolian army in the 13th century.

A jar of pickled eggs used to be a common sight in English pubs and American dive bars. Preserved eggs and booze may seem like an odd pairing, but it actually makes perfect sense from a nutritional standpoint. Eggs are high in cysteine, an amino acid that your body uses to help keep your liver happy. That means bar patrons might have reached for a pickled egg to go with their ale for the same reason you crave a bacon, egg, and cheese sandwich when you’re hungover.

Another common non-vegetable pickle is pickled herring. In Poland and parts of Scandinavia, eating the preserved fish at the stroke of midnight on New Year’s is thought to boost your good fortune in the year ahead. With the success all things pickled have had around the world, we can buy it.

A Brief History of Sunglasses

H/T Sunglass Museum.

A very interesting look back at sunglasses.

The right pair of shades can make or break an outfit. But just who do we have to thank for this sartorial — yet practical — invention?

Primitive sunglasses were worn by the Inuit all the way back in prehistoric times, but these were merely walrus ivory with slits in them — good for helping with snow blindness but not particularly fashionable (unless you were a prehistoric Inuit). See our version, the Hitomi sunglass.

Image: Anavik at Banks Peninsula, Bathurst Inlet, Northwest Territories (Nunavut), May 18, 1916, Photo by Rudolph Martin Anderson, Canadian, 1876–1961, Canadian Museum of Civilization, 39026.

Legend has it that the emperor Nero watched gladiator fights wearing emerald lenses, but many historians regard this claim as iffy.

The Chinese made a slight improvement over the Inuit model in the 12th century, when they used smoky quartz for lenses, but the specs were used for concealing judges’ facial expressions rather than for style or protection from the sun. In the mid-1700s, a London optician began experimenting with green lenses to help with certain vision problems — and, indeed, green is the best color for protecting your peepers from the sun’s rays. Emerald-tinted specs remained quite the rage for some time, as evidenced by several mentions in the works of Nathaniel Hawthorne and Edgar Allan Poe.

It wasn’t until the 20th century that modern sunglasses as we know them were invented. In 1929, Sam Foster began selling the first mass-produced shades, which soon became a hot fashion item on the Atlantic City boardwalk. A few years later, Bausch & Lomb got in on the act when the company began making sunglasses for American military aviators, a design that has changed little since General Douglas MacArthur sparked a new trend when he wore a pair to the movies.

In the decades since, sunglasses have enjoyed various degrees of popularity and more than a few design upgrades. Perhaps the most important technological improvement has been polarized lenses, introduced in the 1930s, which help to further reduce glare and also reduce the risk of eye damage due to UV light.

And there you have it — a little conversation fodder for your next dinner party.

A Brief History of Coloring Books

H/T Culture to Color.com.

An amazing story.

Despite their recent burst of popularity, the first coloring books go back several centuries. Here is a short history of coloring books.

Although people in the early 1600s were apparently fond of coloring in the illustrations that were included along with a popular volume of poetry, the real precursors to modern coloring books were typically created to teach affluent people how to paint. After all, painting was often considered a skill that educated aristocrats needed to acquire. The illustrations in these books were made using woodcuts or copper plates, which made them expensive to produce. As a result, painting books for children were few and far between.

However, advances in technology eventually changed that. Lithography made it cheaper and easier to produce quality illustrations. The lack of modern copyright laws also helped speed up the process by allowing publishers to “borrow” from one another without running into any serious legal problems, and they did so quite freely during those years. In addition, the early 1900s was when parents started being encouraged to provide their children with creative outlets.


The Little Folks’ Painting Book was published by the McLoughlin Brothers in 1879. It featured the works of the popular artist Kate Greenaway, who may or may not have supported their efforts. Even so, the company went on to produce a number of similar publications in the decades that followed, before they were bought out by Milton Bradley.

Coloring books were traditionally filled in with watercolors or other paints. However, crayons started becoming popular around the 1930s and eventually became the coloring implement of choice for youngsters. After all, crayons were and are a lot less messy than paints, so it was even easier for busy parents to clean up after their young artists were finished with their work.


From the beginning, coloring books have featured popular comic book characters, and they have frequently been used to advertise popular films. Botanical illustrations, classic cars, and scenic spaces have also been commonly seen in adult coloring (and painting) books. However, creative entrepreneurs throughout history have also used coloring books to promote products and services that may not have even appealed to the children who were typically filling in the pages.

During the 1960s, some coloring books started taking a political turn. A particularly popular one featured the Kennedy family. Others published around the same time were far more controversial and presumably intended for adult audiences. These included one that portrayed policemen as pigs and another that focused on female anatomy as part of its creator’s feminist agenda. The political trend in coloring books has continued to this day with modern ones focusing on events like the election of President Obama.

From the very beginning of their existence, coloring books have been used for educational purposes. Learners of all ages have benefited from their inclusion in various classroom settings. As it turns out, coloring books are particularly helpful as a non-verbal means of communication, making them ideal for children who can’t speak well and those who don’t speak the same language as their teachers. Coloring books are also useful for illustrating complex concepts for older learners, particularly in the math, science, and technology fields.


In recent years, psychiatrists have also recognized that coloring books have therapeutic uses. This has undoubtedly led to the growing popularity of adult coloring books. Of course, this new trend is not without its detractors, but it shows no signs of slowing down, much less stopping, anytime soon.

A Brief History of the Chocolate Chip

H/T Mental Floss.

Happy Chocolate Chip Day.

Celebrating National Chocolate Chip Day—not a federal holiday, at least not yet—should be easy enough for all the classic dessert lovers and cookie aficionados out there. Just grab a bag of some chocolate morsels, whip them into some delectable cookie dough, and have at it. But have you ever wondered where exactly the chocolate chip came from? Who invented it? Who decided it was best for baking? Should we be calling it a “chip” or a “morsel”? We’ve got all those answers—and more!—in our brief history of the chocolate chip.

THE TOLL HOUSE MYTH

 


Chances are, you’ve made (or at least eaten) a Nestle Toll House chocolate chip cookie at some point in your life. The baking bits purveyor has long stamped their “Nestle Toll House Chocolate Chip Cookie” recipe on the back of their various morsel packages (and yes, all Nestle packages refer to them as “morsels,” not “chips,” but we’ll get to that later), so it’s no surprise that most people associate the famous cookie with Nestle.

They’ve even got a whole story to go along with the kinda-sorta myth of the Toll House cookie. The traditional tale holds that Toll House Inn owner Ruth Wakefield invented the cookie when she ran out of baker’s chocolate, a necessary ingredient for her popular Butter Drop Do cookies (which she often paired with ice cream—these cookies were never meant to be the main event), and tried to substitute some chopped up semi-sweet chocolate instead. The chocolate was originally in the form of a Nestle bar that was a gift from Andrew Nestle himself—talk about an unlikely origin story! The semi-sweet chunks didn’t melt like baker’s chocolate, however, and though they kept their general shape (you know, chunky), they softened up for maximum tastiness. (There’s a whole other story that imagines that Wakefield ran out of nuts for a recipe, replacing them with the chocolate chunks.)

The recipe was such a hit (it first popped up in Wakefield’s Tried and True cookbook in 1938, and it even appeared on Betty Crocker’s radio show, thanks to its massive popularity) that Wakefield eventually struck a deal with Nestle: They would feature her recipe on the back of every bar of semi-sweet chocolate the company sold, and she’d get a lifetime supply of their chocolate. 

THE FAMOUS RECIPE

 


Sounds great, right? Well, even if the story wasn’t exactly true (more on that later), it did spawn a classic recipe that’s still the gold standard of chocolate chip cookie recipes, even though it’s been slightly tweaked over the years. You can find the original recipe here. Try it!

THE REAL ORIGIN

 


The problem with the classic Toll House myth is that it doesn’t mention that Wakefield was an experienced and trained cook—one not likely to simply run out of things, let accidents happen in her kitchen, or randomly try something out just to see if it would end up with a tasty result. As author Carolyn Wyman posits in her Great American Chocolate Chip Cookie Book, Wakefield most likely knew exactly what she was doing, and while that doesn’t dilute how delicious the final product ended up being, it does make its mythic origin story seem just a smidge less magical. 

Even less magical? The truth about the deal Wakefield struck with Nestle. While Wakefield did indeed get free chocolate for the rest of her life and the company paid her to work as a consultant, she was reportedly due a single dollar for her recipe and the good “Toll House” name—a dollar she never got.

CHIPS VERSUS MORSELS

 


Although we call the cookies that bear them “chocolate chip,” the proper name for said chips is actually “morsels”—at least if you’re Nestle.

The moniker “chip” appears to have first popped up in the late nineteenth century, as part of an English tea biscuit recipe for “Chocolate Chips.” These chips, however, referred to the biscuits’ shape—they were cut out of the pan into small strips that the recipe called “chips.” Interestingly, the recipe did call for actual chocolate—but of the melted variety, not the morsel.

In 1892, the “chip” title was first applied to candy, as a Kaufmanns candy ad from the time boasted of their supply of “chocolate chips.” A year later, another candy store advertised their own chocolate chip candies. Not so fast, though, because it doesn’t seem like those chips had much to do with morsels as we know them; an 1897 court case involving the use of the trademarked name “Trowbridge Chocolate Chips” described the chips in question as “thin oblong pieces of molasses candy coated with chocolate.” This thin candy business continued into the 1930s, when Wakefield’s recipe hit the world.

Wakefield’s first published chocolate chip cookie recipe was actually called “Toll House Chocolate Crunch Cookies.” When Nestle began publicizing the recipe, they simply became “Toll House Cookies.” Since no one had bothered to invent pre-made chunks, morsels, or chips at that time, Wakefield’s recipe graced the back of semi-sweet bars, which all included an individual cutter to chunk up the bars for cookie-making. The famous cookies finally got the “chip” moniker sometime in 1940, thanks to various newspaper articles and recipes touting the cookies and their popularity. By 1941, “chocolate chip cookies” was considered the standard name for the sweet treat.

Also in 1940, Nestle finally unveiled morsels for sale, and ads from the time tout the availability of both bars and morsels. Since then, Nestle has shared its famous chocolate chip recipe, all while selling its most important ingredient as “morsels” (other brands, like Hershey’s and Ghirardelli, call them “chips”).

THE FAMOUS IMITATORS

 

Although Nestlé’s morsels and Wakefield’s recipe pioneered the great chocolate chip cookie trail, they weren’t the only ones—there were plenty of imitators. In the ’50s, both Nestle and Pillsbury rolled out premade cookie dough for purchase. In 1963, Chips Ahoy hit shelves, thanks to Nabisco. By the time the ’70s rolled around, entire stores—like Famous Amos, Mrs. Fields, and David’s Cookies—were dedicated to selling cookies, chocolate chip included. What do they all have in common? That necessary chip. Er, morsel.

Happy Chocolate Chip Day!

A Brief History of Chocolate

H/T Mental Floss.

A sweet history.

In 2017, two members of a Russian crime syndicate in the United States were charged with the transport and sale of 10,000 pounds of “stolen chocolate confections.” The indictment didn’t mention whether the thieves took a few bites for themselves, but if they did have a sweet tooth they’d hardly be alone: Napoleon Bonaparte was a fan of chocolate, which was said to be his drink of choice when working late. Thomas Jefferson fell in love with it while serving as minister to France, and proclaimed that it might soon be more popular than tea or coffee. And though she probably never said “let them eat cake,” Marie Antoinette was known to enjoy hot chocolate, which was served at the Palace of Versailles.

Chocolate’s worldwide popularity streak has lasted centuries, but it wasn’t always the sweet, easily accessible treat we know today. So what is chocolate, and how did it transform from sacred beverage to sweet snack?

GETTING CHOCOLATE FROM CACAO

Every chocolate product starts with the cacao tree. The plants were originally native to the Americas, but today they’re grown worldwide, primarily in tropical regions. The fruits of cacao trees are called pods; one pod is roughly the size of a football, and it contains around 40 almond-sized cacao beans—which are actually seeds.

When fermented and roasted, cacao beans develop a rich, complex flavor. They’re the key to making chocolate taste chocolatey. The word cacao, by the way, usually refers to the plant and its seeds before they’re processed, while chocolate describes products made from processed cacao beans. And if you’re wondering what the difference between cacao and cocoa is, there really isn’t one. Both versions of the plant name are technically correct, but in modern usage, cacao is increasingly applied to things closer to the plant while cocoa is used for the more processed stages.

There’s some debate over who first decided to turn raw cacao beans into processed chocolate. One long-standing theory posits that humans were first drawn to the pulp of the cacao pod, which they used to make an alcoholic beverage. The oldest evidence we have for the consumption of cacao products comes from 5000 years ago in what is now Ecuador.

At some point, chocolate migrated north: Evidence of cacao residue has been found in vessels from the Olmec people, in what is now southern Mexico. It’s still unclear if this cacao was the result of beer-like fermented beverages made from cacao pods or some kind of chocolate that would be more recognizable to us today.

According to art and hieroglyphs from Central America and southern Mexico, chocolate was a big part of Maya culture. It didn’t look or taste anything like a Hershey’s bar, though. Back then, chocolate was sipped rather than eaten, and to make these chocolate drinks, the Maya harvested beans from cacao pods and fermented them.

Fermentation is basically controlled rot. Microorganisms like yeast and bacteria break down the organic substances in food, changing the taste on a biochemical level without making the food go bad. Fermentation also generates heat, and when a pile of cacao beans ferments, it can exceed 120 degrees Fahrenheit. This heat is essential in developing chocolate’s signature flavor and aroma. It unlocks flavor compounds we associate with chocolate and activates enzymes that mellow the cacao beans’ natural bitterness. It’s also what kills the germ, or embryo, in the middle of a bean that would cause it to sprout, and dissolves any leftover pulp from the cacao pod surrounding the beans.

After they’re fermented for several days, cacao beans are dried, roasted, shelled, and ground into a paste called chocolate liquor. Roasting is an important step. It creates new flavor compounds and concentrates other flavors that were already there. It also burns off acetic acid, a natural byproduct of fermentation that can give chocolate an undesirable, vinegary flavor.

CHOCOLATE BEVERAGES AND CACAO CURRENCY

These early steps in the chocolate-making process haven’t changed much over the centuries. The main difference in the Maya preparation came after the beans were processed. Instead of using the ground cacao beans to make candy or desserts, they mixed the paste with water, cornmeal, and spices like chili peppers to make a thick, savory beverage. By pouring the liquid from one container to another a few times, they were able to give it a frothy head, which was a big part of the drink’s appeal.

Chocolate was especially popular among elite members of society. It was enjoyed by Maya rulers, and cacao beans and chocolate paraphernalia have been found in royal tombs. Priests drank chocolate and used it in religious ceremonies. Cacao was considered a gift from the gods, and it was featured in Maya weddings, where the bride and groom would exchange sips of the beverage to seal their union. After important transactions were agreed to, the two parties would share a drink of chocolate to make it official.

The Aztecs, who dominated central Mexico from around 1300 to 1521, were just as enamored with chocolate. They used cacao beans as currency. One bean was worth a tamale, while 100 beans were enough to get you a quality female turkey.

Chocolate played a role in Aztec religious ceremonies, too. In their book The True History of Chocolate, Sophie and Michael Coe mention a Spanish chronicler who wrote that sacrifice victims who weren’t in the mood to participate in the ritual dances leading up to their deaths were given chocolate—mixed with the blood from previous human sacrifices—to boost their spirits.

According to Aztec legend, the emperor Montezuma II (who, incidentally, is increasingly referred to as Moctezuma in English because it more closely resembles the original Aztec) was rumored to have drunk a gallon of chocolate a day, but he didn’t just like it for the taste. Chocolate was believed to be an aphrodisiac, and he purportedly binged the drink to fuel his affairs.

Chocolate never lost its romantic reputation, but the scientific evidence for its amorous abilities is actually pretty limited. It contains the compounds tryptophan and phenylethylamine, and tryptophan does help the body make serotonin, which is associated with feelings of happiness and well-being. Phenylethylamine releases dopamine, otherwise known as the “feel-good” neurotransmitter. Tryptophan and phenylethylamine may qualify as aphrodisiacs, but there probably aren’t enough of them in cacao beans to produce any noticeable effects.

CHOCOLATE’S EUROPEAN DEBUT

The word chocolate originated in Mesoamerica. Like the Aztecs and Maya, the Pipil people of what is today El Salvador brewed drinks from cacao beans, and they called these beverages chocola-tl. It’s thought that when the first Spaniards to visit the region heard the word, they basically kept it. The name still persists today, largely unchanged from its original language.

A number of European explorers, from Christopher Columbus to Hernan Cortes, have been credited with bringing chocolate back home after traveling to the Americas. But the first chocolate to land in Europe may not have come from a famous explorer at all. Some historians say Spanish missionaries were instrumental in getting cacao across the Atlantic. Upon returning from an overseas trip, Catholic friars presented a group of Maya dignitaries to the court of Prince Philip in 1544. The Maya brought with them gifts from the New World, including chocolate. This offering marks the first recorded evidence of chocolate in Spain.

Soon enough, chocolate spread to the rest of Europe, where it underwent its next big transformation. The drink was too bitter for European palates, so people started adding more sweeteners to the mix. Different countries added their own spices—the Spanish liked cinnamon and vanilla in their chocolate, while the French flavored their chocolate with cloves.

In Europe, as in Mesoamerica, chocolate was mostly enjoyed by the upper classes. In 17th century Britain, a pound of chocolate cost 15 shillings, which was about 10 days’ worth of wages for a skilled tradesman. In 1657, London opened its first chocolate house, a place where men could gather to gamble, do business, and discuss politics over a nice cup of cocoa.

CADBURY VERSUS FRY

Chocolate was already a global success story by the 19th century, but it might never have become the nearly ubiquitous treat we know today if it wasn’t for a Dutch chemist named Coenraad Johannes van Houten. In 1828, he discovered that by removing some of the fat, or cocoa butter, from chocolate liquor and treating it with alkaline salt, he could turn the ingredient into a new kind of powder. Alkaline substances are basically the opposite of acidic substances; adding the alkaline salts to chocolate created a product that had a more mellow, earthier taste. If you see natural cocoa powder and Dutch-process cocoa powder next to each other at the grocery store, know that the natural stuff will generally be more acidic than van Houten’s “Dutch” version.

Dutch cocoa powder was easier to mix with water than ground-up beans, but the invention had implications far beyond that. Van Houten’s work eventually helped give us the first modern chocolate bars. The British candy maker J.S. Fry & Sons created solid chocolate in 1847 after mixing melted cocoa butter back into cocoa powder and letting it harden. If you’re not familiar with J.S. Fry & Sons, you’ve likely heard of Cadbury, which pioneered the heart-shaped chocolate box in the 1860s.

In the 1900s, the two companies worked together to import South American cacao beans to England, but the Cadburys eventually made a series of deals with farmers to cut their partner-rivals out of the supply chain. This led to some good old-fashioned Chocolate Beef: In his book, Chocolate: A Bittersweet Saga of Dark and Light, Mort Rosenblum tells the story of Cecil Fry’s funeral at Westminster Abbey. When Fry’s widow saw the patriarch of the Cadbury family file into the ceremony late, she apparently rose to her feet and shouted, “Get out, Devil.”

FROM NESTLÉ TO HERSHEY

Swiss chemist Henri Nestlé created a powdered milk product in the mid-19th century, which a countryman by the name of Daniel Peter decided to add to chocolate. This was the debut of a new product called milk chocolate.

Today, the FDA defines milk chocolate as having at least 10 percent chocolate liquor and 12 percent milk solids. These standards are far from universal; in Europe, milk chocolate must contain at least 25 percent dry cocoa solids and 14 percent dry milk solids. (When it comes to white chocolate, on the other hand, the only product derived from cacao beans is cocoa butter. There’s some debate over whether it should be considered chocolate at all.)

The company many Americans associate with chocolate today didn’t arrive on the scene until fairly recently. Milton Hershey got his start in the candy business selling caramels, not chocolate bars. The entrepreneur fell in love with chocolate at the 1893 World’s Fair. He was so impressed by Germany’s chocolate production display that he bought their machinery when the exposition was over and started making chocolate professionally the next year. An early slogan for Hershey’s was “Our Milk Chocolates are highly nutritious, easily digested, and far more sustaining than meat.”

In 1900, Milton sold his caramel business for $1 million and fully devoted himself to the Hershey Chocolate Company. The company got so big that Milton Hershey built an entire town for his employees to live in. Now, people can visit Hershey, Pennsylvania, to enjoy candy-themed rides at Hersheypark, see how chocolate is made at Hershey’s Chocolate World, or take a bath in real chocolate at the Hotel Hershey.

PLEASE GIVE US S’MORE

The differences in cocoa content might lead some international readers to turn their noses up at a Hershey’s bar, but try one in a s’more and then thank the U.S. of A. and the Girl Scouts of America, who published what is debatably the first known recipe for “Some Mores” in the 1927 guidebook “Tramping and Trailing with the Girl Scouts.” And be thankful that it’s not worse: Back in 2007, a group of lobbyists sought to change the FDA’s definition of chocolate to allow for the removal of cocoa butter entirely, in exchange for more affordable, accessible alternatives like vegetable oils.

It seems this effort failed, so you can rest assured: The next time a pair of former-Soviet-bloc gangsters steal a few tons of chocolate here in the United States, cocoa butter will be part of the haul.

A Brief History of Beer

H/T Mental Floss.

While I have drunk a beer or two, my poison of choice was vodka or rum.

 

In 1814, Meux’s Horse Shoe Brewery was the victim of some very bad luck—or maybe just poor engineering. At the time, massive storage vats were en vogue in London’s breweries, and when one of these large vats burst at The Horse Shoe Brewery, it led to over a hundred thousand gallons of beer flooding the surrounding area in a veritable brew-nami. The surge led to the collapse of two nearby buildings and the loss of eight lives. Over the years, rumors even popped up that unconscientious ale-lovers had flocked to the scene of the accident to consume the runaway beverage.

Contemporary accounts suggest there’s not much substance to those rumors, but it’s easy enough to see why people would believe them. People really like beer. Behind water and tea, it’s thought to be the third most widely consumed drink on Earth. It helped shape civilization as we know it, and we’re not just talking about those commercials where the guys say “whasssup?!”

Perhaps it shouldn’t be surprising that the story of the world’s most-widely consumed intoxicating beverage is shrouded in legend and half-truths. Was beer brewed to make water potable? When did hops enter the picture? And what do the Budweiser Frogs have in common with Captain Jack Sparrow? The short answers, respectively, are “no,” “the 9th century, at the latest,” and “Gore Verbinski.” But the long version is more fun.

THE ANCIENT ORIGINS OF BEER

Ancient Sumerians were cultivating grains thousands of years ago, and there’s some debate about what they were doing with the grains they grew. According to one theory, the grains were used to make beer before bread ever entered the picture. The discovery of ancient tools potentially designed for beer brewing supports this. That would mean beer was part of the origin of agriculture, which is arguably what allowed humans to build civilizations, which led to the development of new technologies, which made it possible to brew even more beer.

In 2018, archaeologists from Stanford announced they had evidence that people in what is now Israel were brewing something like beer around 12,000 years ago, which they noted predates “the appearance of domesticated cereals by several millennia in the Near East.” The archaeologists speculated that it was likely a thin gruel possibly consumed for religious purposes.

Beer, by the way, is any fermented, alcoholic beverage made with cereal grains such as wheat or barley. There was a time when beer and ale were considered two different beverages, with beer defined by the presence of hops, but we’re going to proceed with more modern usage, where the two words are basically interchangeable. We’ll get to hops soon enough (about 11,000 years).

Early beer was likely made by crushing up grains, heating them gradually in water, then possibly baking them, and steeping them again. This process encourages fermentation. Grains contain starches, and heating up grains helps break these starches down into their simple sugar components. Fermentation happens when yeast microbes consume these sugars and convert them into alcohol, flavor compounds, and carbon dioxide. It’s sometimes said that Louis Pasteur discovered yeast in the mid-1800s, but that’s a little misleading. Sure, single-celled organisms like yeast are invisible to the naked eye, but when millions or billions congregate in the beer-making process they can be seen and manipulated.

In Beer in the Middle Ages and the Renaissance, Richard Unger points to several indications that brewers began to intuit the vital role of yeast hundreds of years before Pasteur’s time. Rather than relying on wild yeast in the air, a 15th-century brewer in Munich received permission to use a specific source of yeast from the bottom of his brew. In 16th-century Norwich, England, brewers recognized the value of skimming off excess yeast for use in bread-making and further brewing. They actually donated some of that yeast to charities. Even without Pasteur’s experiments defining the biological processes that result in living yeast turning glucose into ethanol—what he dubbed alcoholic fermentation—people evidently knew that fermentation made beer bubbly, flavorful, alcoholic, and generally much more fun to drink than plain barley water.

BEER: THE BEVERAGE OF THE GODS

Beer was among the Sumerians’ most influential contributions to the world, right behind written language and a formal number system. And the Sumerians knew they had come up with something big. They even had a goddess of beer and brewing named Ninkasi. Around 1800 BCE, a hymn was written for Ninkasi that doubled as a beer recipe. Because it was written as a song, the recipe was easy for the average beer drinker to memorize, even if they didn’t know how to read. It’s also the oldest beer recipe ever discovered. Here’s an excerpt:

“Ninkasi, You are the one who handles the dough with a big shovel….you are the one who waters the malt set on the ground. … You are the one who soaks the malt in a jar, the waves rise, the waves fall…You are the one who spreads the cooked mash on large reed mats, coolness overcomes.”

If the Sumerians penned a similarly poetic cure for hangovers, it hasn’t been discovered.

The Ancient Egyptians were also fanatical about their beer. They believed that beer brewing knowledge was a gift from the god Osiris, and they incorporated the drink into their religious ceremonies. It infiltrated other parts of Egyptian culture as well. Beer was so common that the laborers who built the pyramids of Giza were given daily rations amounting to about 10 pints of the stuff. It was also served at celebrations, where over-imbibing was not only accepted, but encouraged. As far as etiquette was concerned, leaving a party when you could still walk straight was the Egyptian equivalent of not finishing your meal.

HOPS TO IT

Adding unusual flavors to beer is not a new phenomenon. Before the first hipster microbrewery opened, ancient beer makers were using ingredients like carrots, bog myrtle, hemp, and cheese to make their concoctions. But one component that’s found in virtually all beer today took a while to enter the picture. That would be hops, the ingredient that gives beer its bitter, floral taste. Though it’s more noticeable in IPAs, the vast majority of beers depend on hops to balance out their sweetness. And hops, by the way, isn’t the name of the plant; it’s the name of the flower or “cone” that comes from the plant. The plant itself is called Humulus lupulus, which means “climbing wolf” in Latin.

During the Middle Ages, Catholic monks supported themselves by selling homemade goods like cheese, mustard, and in some cases, beer. These monks were likely the first people to make beer with hops. In the mid-800s, Adalard of Corbie, an abbot from the monastery of Corvey in Germany (and a cousin of Charlemagne’s), referred to the use of hops in brewing. A few hundred years after that first written reference, German abbess and eventual Catholic saint Hildegard of Bingen wrote, in her book Physica Sacra, that hops “make the soul of a man sad and weigh down his inner organs.”

According to beer scholar William Bostwick, her description was actually so scathing that it helped launch a beer war between Catholics and Protestants. Partly inspired by Hildegard, Catholics ditched hops and fully embraced gruit, which was the mixture of herbs and aromatics used to flavor most early beers. This made hoppy beer anti-Catholic, so naturally Protestants claimed it as their own. Martin Luther himself was even a proponent of the beverage. During the Reformation in the 16th century, the rise of Protestantism helped boost hops’ profile in Europe. And hops had another advantage in the beer wars: The ingredient contains beta acids that delay spoilage and act as a natural preservative. Monks weren’t aware of this property when they first added hops to beer, and when it did become clear centuries later, that was the final nail in gruit’s coffin.

THE BEER VS. WATER MYTH

Beer was a popular drink of the lower classes from ancient times through the Middle Ages, but there’s some confusion as to why. You may have heard that peasants drank beer every day because it was more sanitary than the water they had access to, and it makes a certain amount of sense. Brewing generally involves boiling the unfermented beer, or wort; this would theoretically kill off pathogens. Once fermentation takes place, the alcohol itself would presumably provide further disinfection.

While it’s hard to say that beer was never looked at as a healthier alternative to water, the theory doesn’t stand up to much scrutiny as it relates to the Middle Ages. The truth is that clean water wasn’t that hard to come by, even in poorer communities. People could get free water from wells and streams, and some places like London even had cisterns by the 13th century. A more likely explanation for beer’s popularity with poor and working class people is that, beyond its intoxicating effects, it was viewed as a cheap source of nutrition. If you were a worker in the Middle Ages, an afternoon pint provided a measure of hydration and quick calories at the same time.

THE WITCHY HISTORY OF WOMEN IN BEER BREWING

Today the craft brewing industry skews heavily male, but women have likely been making beer for thousands of years. Female brewers during the 16th and 17th centuries may even have given rise to some of the iconic imagery around witches. From the cauldrons they brewed in to the pointy hats they wore (perhaps to attract customers) to the cats they kept to deal with grain-loving rodents, some writers have drawn a line connecting alewife entrepreneurs and what would eventually become witchy iconography.

While documented accusations of witchcraft leveled at brewsters are hard to come by, there does seem to be overlap in the accusations made against both groups (duplicity, for example)—perhaps as a way to drive women out of a field that was quickly becoming dominated by men.

BEER IN THE INDUSTRIAL AGE

For good and ill, beer-making took some big steps forward during the Industrial Revolution. Emerging technologies like steam power and refrigeration led to a tastier, more consistent, and easier-to-brew beverage.

Industrialization and globalization also paved the way for the widespread consolidation of the modern beer industry. When Anheuser-Busch InBev acquired SABMiller for more than $100 billion in 2016, the resulting conglomerate contained over 500 beverage brands [PDF] and accounted for over a quarter of global beer market sales, according to market research firm Euromonitor International.

Many of the “craft beers” you know may very well belong to a conglomerate like this. Because these giant beer-makers are also giant beer-distributors, with a big say in which brands end up on store shelves, critics accuse them of reducing competition from independent producers and potentially stifling consumer choice. InBev would presumably argue that their scale and history in the industry allows them to operate more efficiently [PDF]. It’s interesting to note that you could buy a bottle of Budweiser or a bottle of Goose Island Bourbon County Stout and be supporting the same bottom line. Speaking of Budweiser …

THE BUDWEISER FROGS AND A RETURN TO THE PAST

During 1995’s Super Bowl, a commercial featuring three talking amphibians—Bud, Weis, and Er—aired, and for some reason, America fell in love. That spot was directed by Gore Verbinski, who would go on to direct the American remake of The Ring and the first three Pirates of the Caribbean films. The frogs were brought to life by artists at Stan Winston’s studio, the same legendary company that helped create the dinosaurs in Jurassic Park and The Terminator’s titular cyborg. That’s five combined Academy Awards and billions in box office success, all in the service of selling beer.

Mass-produced beer made from hops, grains, and yeast is standard today, but the brews of the past haven’t disappeared completely. Resurrecting ancient beer recipes has become a popular pastime among home brewers. Even some commercial breweries have joined the trend. New Belgium makes gruit ale, and Dogfish Head collaborated with a molecular archeologist to recreate a beer based on residue collected from what may have been King Midas’s tomb. But the truth is, no matter what beer you reach for, you’ll be drinking something that connects you to the very beginnings of civilization.

A Brief History of the Rain Boot

H/T TreeHugger.com.

Until I read this article, I did not know the history of rain boots.

April showers, indeed! Here in Southern Florida, rain boots have become standard attire these days, and from the looks of my weather app, the same goes for many other places too. It’s hard to believe that there was once a time when rain boots didn’t exist, when people walked out in wet, muddy weather in their regular shoes. It wasn’t even that long ago! Herein, a brief history of the practical, yet ever-stylish, rain boot.

Rain boots first made their debut on the feet of Arthur Wellesley in Britain in the early 19th century. Also known as the Duke of Wellington, the military man (like many others of his day) used to wear Hessian boots. Hessian boots, standard issue in the military, were made out of leather, had a semi-pointed toe, reached up to the knee and had a tassel on the top. (Think Mr. Darcy in “Pride and Prejudice”). Thinking he could improve on them, Wellesley commissioned his personal shoemaker to make a variation just for him. He asked him to do away with the trim around the calf, shorten the heel and cut the boot closer around the leg. The result, known as Wellingtons, quickly took hold among the British aristocracy, and the name wellies endures to this day.

 

The original Wellington boots were fashioned out of leather, but in the mid-19th century, a man named Hiram Hutchinson bought the patent for vulcanization of natural rubber for footwear from Charles Goodyear (who had invented the vulcanization process) and began manufacturing rubber Wellingtons. The introduction of the rubber Wellington was met with much approval, especially among farmers, who could now work all day and still have clean, dry feet.

The Wellington became even more popular after World War I and World War II. Soldiers often spent long hours in flooded European trenches, and the rubber boots allowed their feet to stay warm and dry. By the end of World War II, men, women, and children were all wearing the rain boot. Hunter Boot, the company commissioned to make boots for the British Army in both wars, continues to sell their signature boots today.

 

Rain boots are still called wellies in England, but around the world they are referred to as billy boots, gummies, gumboots and, of course, rain boots. In South Africa, where they are called gumboots, miners wore the boots and used them to help communicate with each other when talking wasn’t permitted. The miners even created gumboot dances (whose variations have become popular entertainment today) to keep themselves from getting bored.

Wellies in all styles (sixmilliondollardan/Flickr)

The Wellington’s lower manufacturing cost made it the standard footwear for a variety of professions – often reinforced with a steel toe to prevent injury. Used in factories, meat packing plants, farms, clean rooms for delicate electronics, even fast-food environments, rubber boots are just practical – and stylish.

 

Whereas most rain boots could only be found in a few colors (olive green, yellow, black) 50 years ago, they are manufactured in all colors (and patterns) of the rainbow today. And even though they’re quite practical for muddy, rainy spring weat

A Brief History of the Invention of the Home Security Alarm

H/T Smithsonian Magazine.

A hardworking nurse envisioned a new way to know who was at the door

Left, a portion of the patent plan designed by Marie Van Brittan Brown and her husband Albert, right. (Marie Van Brittan Brown and Albert L. Brown, courtesy U.S. Patent and Trademark Office; New York Times / Redux)
 
 

Marie Van Brittan Brown, an African American nurse living in Jamaica, Queens in the 1960s, was working odd shifts, as was her husband, Albert, an electronics technician. When she arrived home late, she sometimes felt afraid. Serious crimes in Queens jumped nearly 32 percent from 1960 to 1965, and police were slow to respond to emergency calls. Marie wanted to feel safer at home.

 

Enlisting her husband’s electrical expertise, Marie conceived a contraption that could be affixed to the front door. It would offer four peepholes, and through these, a motorized video camera on the inside could view visitors of different heights as the occupant toggled the camera up and down. The camera was connected to a television monitor inside. A microphone on the outside of the door and a speaker inside allowed an occupant to interrogate a visitor, while an alarm could alert police via radio. Closed-circuit television, invented during World War II for military use, was not widespread in the 1960s, and the Browns proposed using the technology to create the first modern home security system.

They filed a patent for their device in 1966, citing Marie as lead inventor. It was approved three years later. “The equipment is not in production,” the New York Times reported, “but the Browns hope to interest manufacturers and home builders.”

The Browns’ 1969 patent plan for an elaborate home security system suggests safety and relaxation can go hand in hand. (Marie Van Brittan Brown and Albert L. Brown, courtesy U.S. Patent and Trademark Office)

That never happened, presumably because the Browns’ system was ahead of its time. “The cost of installing it would be pretty high,” says Robert McCrie, an emergency management expert at John Jay College of Criminal Justice in Manhattan.

Marie’s invention, though it didn’t benefit them financially, would earn the Browns a measure of recognition in the tech world: The predecessor of today’s home security systems, it has been cited in 35 U.S. patents. Companies first offered CCTV to residential consumers around 2005, but Marie never saw her vision realized; she died in Queens in 1999, at the age of 76.

As the tech has become cheaper and smarter, home security has grown into a $4.8 billion business in North America and is expected to triple by 2024.

A Brief History of Peanut Butter

H/T Smithsonian Magazine.

The bizarre sanitarium staple that became a spreadable obsession

Veteran food critic Florence Fabricant has called peanut butter “the pâté of childhood.” (Dan Saelinger)
 

North Americans weren’t the first to grind peanuts—the Inca beat us to it by a few hundred years—but peanut butter reappeared in the modern world because of an American, the doctor, nutritionist and cereal pioneer John Harvey Kellogg, who filed a patent for a proto-peanut butter in 1895. Kellogg’s “food compound” involved boiling nuts and grinding them into an easily digestible paste for patients at the Battle Creek Sanitarium, a spa for all kinds of ailments. The original patent didn’t specify what type of nut to use, and Kellogg experimented with almonds as well as peanuts, which had the virtue of being cheaper. While modern peanut butter enthusiasts would likely find Kellogg’s compound bland, Kellogg called it “the most delicious nut butter you ever tasted in your life.”

A Seventh-Day Adventist, Kellogg endorsed a plant-based diet and promoted peanut butter as a healthy alternative to meat, which he saw as a digestive irritant and, worse, a sinful sexual stimulant. His efforts and his elite clientele, which included Amelia Earhart, Sojourner Truth and Henry Ford, helped establish peanut butter as a delicacy. As early as 1896, Good Housekeeping encouraged women to make their own with a meat grinder, and suggested pairing the spread with bread. “The active brains of American inventors have found new economic uses for the peanut,” the Chicago Tribune rhapsodized in July 1897.

A vintage peanut butter advertisement
“It’s the Great Depression that makes the PB&J the core of childhood food,” food historian Andrew F. Smith has said. (Buyenlarge / Getty Images)

Before the end of the century, Joseph Lambert, an employee at Kellogg’s sanitarium who may have been the first person to make the doctor’s peanut butter, had invented machinery to roast and grind peanuts on a larger scale. He launched the Lambert Food Company, selling nut butter and the mills to make it, seeding countless other peanut butter businesses. As manufacturing scaled up, prices came down. A 1908 ad for the Delaware-based Loeber’s peanut butter—since discontinued—claimed that just 10 cents’ worth of peanuts contained six times the energy of a porterhouse steak. Technological innovations would continue to transform the product into a staple, something Yanks couldn’t do without and many a foreigner considered appalling.

By World War I, U.S. consumers—whether convinced by Kellogg’s nutty nutrition advice or not—turned to peanuts as a result of meat rationing. Government pamphlets promoted “meatless Mondays,” with peanuts high on the menu. Americans “soon may be eating peanut bread, spread with peanut butter, and using peanut oil for our salad,” the Daily Missourian reported in 1917, citing “the exigencies of war.”

The nation’s food scientists are nothing if not ingenious, and peanut butter posed a slippery problem that cried out for a solution. Manufacturers sold tubs of peanut butter to local grocers, and advised them to stir frequently with a wooden paddle, according to Andrew Smith, a food historian. Without regular effort, the oil would separate out and spoil. Then, in 1921, a Californian named Joseph Rosefield filed a patent for applying a chemical process called partial hydrogenation to peanut butter. The process converts the main naturally occurring oil in peanut butter, which is liquid at room temperature, into an oil that’s solid or semisolid at room temperature and thus remains blended. Hydrogenation had already been used to make substitutes for butter and lard, like Crisco, but Rosefield was the first to apply it to peanut butter. This more stable spread could be shipped across the country, stocked in warehouses and left on shelves, clearing the way for the national brands we all know today. The only invention that did more than hydrogenation to cement peanut butter in the hearts (and mouths) of America’s youth was sliced bread—introduced by a St. Louis baker in the late 1920s—which made it easy for kids to construct their own PB&Js. (In this century, the average American kid eats some 1,500 peanut butter and jelly sandwiches before graduating from high school.)

Rosefield went on to found Skippy, which debuted crunchy peanut butter and wide-mouth jars in the 1930s. In World War II, tins of (hydrogenated) Skippy were shipped with service members overseas, while the return of meat rationing at home again led civilians to peanut butter. Even today, when American expats are looking for a peanut butter fix, they often seek out military bases: They’re guaranteed to stock it.

But while peanut butter’s popularity abroad is growing—in 2020, peanut butter sales in the United Kingdom overtook sales of the Brits’ beloved jam—enjoying the spread is still largely an American quirk. “People say to me all the time, ‘When did you know that you had fully become an American?’” Ana Navarro, a Nicaraguan-born political commentator, told NPR in 2017. “And I say, ‘The day I realized I loved peanut butter.’”

Though the United States lags behind China and India in peanut harvest, Americans still eat far more of the spread than the people in any other country: It’s a gooey taste of nostalgia, for childhood and for American history. “What’s more sacred than peanut butter?” Iowa Senator Tom Harkin asked in 2009, after a salmonella outbreak was traced back to tainted jars. By 2020, when Skippy and Jif released their latest peanut butter innovation—squeezable tubes—nearly 90 percent of American households reported consuming peanut butter.

The ubiquity of this aromatic spread has even figured in the nation’s response to Covid-19. As evidence emerged in the spring of 2020 that many Covid patients were losing their sense of smell and taste, Yale University’s Dana Small, a psychologist and neuroscientist, devised a smell test to identify asymptomatic carriers. In a small, three-month study of health care workers in New Haven, everyone who reported a severe loss of smell using the peanut butter test later tested positive. “What food do most people in the U.S. have in their cupboards that provides a strong, familiar odor?” Small asks. “That’s what led us to peanut butter.”

Sustainable

George Washington Carver’s research was about more than peanuts
By Emily Moon

George Washington Carver in his laboratory.
Carver in his laboratory, circa 1935. (Hulton Archive / Getty Images)

No American is more closely associated with peanuts than George Washington Carver, who developed hundreds of uses for them, from Worcestershire sauce to shaving cream to paper. But our insatiable fascination with peanuts, scholars say, has obscured Carver’s greatest agricultural achievement: helping black farmers prosper, free of the tyranny of cotton.

Born enslaved in Missouri around 1864 and trained in Iowa as a botanist, Carver took over the agriculture department at the Tuskegee Institute, in Alabama, in 1896. His hope was to aid black farmers, most of whom were cotton sharecroppers trapped in perpetual debt to white plantation owners. “I came here solely for the benefit of my people,” he wrote to colleagues on his arrival.

He found that cotton had stripped the region’s soil of its nutrients, and yet landowners were prohibiting black farmers from planting food crops. So Carver began experimenting with plants like peanuts and sweet potatoes, which could replenish the nitrogen that cotton leached and, grown discreetly, could also help farmers feed their families. In classes and at conferences and county fairs, Carver showed often packed crowds how to raise these crops.

Since his death in 1943, many of the practices Carver advocated—organic fertilizer, reusing food waste, crop rotation—have become crucial to the sustainable agriculture movement. Mark Hersey, a historian at Mississippi State University, says Carver’s most prescient innovation was a truly holistic approach to farming.

“Well before there was an environmental justice movement, black environmental thinkers connected land exploitation and racial exploitation,” says Hersey. A true accounting of American conservation, he says, would put Carver at the forefront.

A Brief History of the Refrigerator

H/T MarthaStewart.com.

From ice boxes to space-saving marvels, here’s how the refrigerator became the modern appliance we know and love.

CREDIT: GETTY / GEORGE MARKS

Can you imagine a kitchen without a refrigerator? It’s hard to believe, but the way that home cooks keep their groceries cool is relatively new. If you’re wondering who invented the concept of refrigeration, it’s hard to say: people in China were harvesting and storing ice as early as 1000 B.C., and many societies (including the Greeks, Romans, and Hebrews) stored snow in insulated materials to keep foods cool, according to the International Journal of Refrigeration. In the 18th century, Europeans often collected ice in the winter and salted large pieces before storing it deep underground, a practice that, according to a Colonial Williamsburg Foundation report, would help ice keep for months. Before the advent of the refrigerator, people spent a lot of time preserving food by canning, smoking, drying, or salting.

It wasn’t until the early 1860s that Americans were introduced to the icebox, an early precursor of the refrigerator. Tim Buszka, a senior associate product marketing manager with the Whirlpool Corporation, says the icebox became more commonplace for middle- and upper-class families in the 1890s. “There’s not one real clear inventor of the modern refrigerator,” Buszka says. “It was mostly auto companies who created early refrigerator models; Frigidaire was owned by General Motors way back when.”

The earliest models of the refrigerator really just had one feature to them: a chunk of ice. According to archival records from the Smithsonian National Museum of American History, the icebox was an insulated cabinet with a compartment containing ice that kept perishable foods cool. Fresh ice had to be inserted into the icebox every week or so.

ice box illustration
CREDIT: GETTY: FOTOTECA STORICA NAZIONALE / CONTRIBUTOR

When the first home refrigerator was introduced in the early 1910s, Buszka says it was a luxury for even the wealthiest Americans. “Back then, the cold box itself would exist on the first floor in the kitchen and you had a supplemental unit in the basement,” Buszka says, explaining that the first air compressors were extremely loud.

 

It wasn’t until the early 1920s that companies like Whirlpool introduced the earliest forms of the single-unit refrigerator, featuring a brand-new technology: evaporative cooling. It was a self-contained unit, and “wasn’t cheap at the time, but didn’t require the same amount of installation and maintenance as earlier models,” Buszka explained.

early refrigerator
CREDIT: SCIENCE & SOCIETY PICTURE LIBRARY / GETTY

According to Pacific Standard magazine, only eight percent of American residences had a refrigerator in the early 1930s, but by the early 1940s, almost 45 percent of American homes had ditched ice boxes and installed a refrigerator.

The Whirlpool models produced en masse in the early 1930s had top freezers. Design features that we know and love now, like wood trim handles and bottom-drawer freezers, came along later. Beginning in the 1950s, Whirlpool kicked off the trend of designing refrigerators and other appliances in vivid colors, including signature hues like “harvest gold” and “avocado green.” And in the 1970s, design features like side-by-side doors were introduced.

Woman using an old refrigerator

CREDIT: DAILY HERALD ARCHIVE / GETTY

“Between the 1930s and 1970s, the evolution of refrigerator design focused on configuration, evolution, and organization,” Buszka says. “But the development we’re most proud of is energy-efficient refrigerators during the 1980s. People think of the refrigerator as being very power-dependent, but in reality, it can run on as little power as one incandescent light bulb.”

In the 21st century, new refrigerator models are anchored in cutting-edge technology and have features that are increasingly customizable. And Whirlpool is looking to revolutionize modern refrigerators by returning to their 1920s roots. “Old ice boxes were basically the first four-door refrigerators, where each door had a specific function beyond supporting cooling,” Buszka says. “In some markets, we’ve launched [four-door] refrigerators that have quick grab zones for kids who have to stand on a chair to access the fridge.”

More innovative functions currently in development include “total coverage cooling,” a design feature that pipes cold air to every shelf in the fridge, so you can finally place milk in the back without worrying it’ll freeze. Now that’s truly far out.