A Brief History of Beer

H/T Mental Floss.

While I have drunk a beer or two, my poison of choice was vodka or rum.


In 1814, Meux’s Horse Shoe Brewery was the victim of some very bad luck—or maybe just poor engineering. At the time, massive storage vats were en vogue in London’s breweries, and when one of these large vats burst at The Horse Shoe Brewery, it led to over a hundred thousand gallons of beer flooding the surrounding area in a veritable brew-nami. The surge led to the collapse of two nearby buildings and the loss of eight lives. Over the years, rumors even popped up that unconscientious ale-lovers had flocked to the scene of the accident to consume the runaway beverage.

Contemporary accounts suggest there’s not much substance to those rumors, but it’s easy enough to see why people would believe them. People really like beer. Behind water and tea, it’s thought to be the third most widely consumed drink on Earth. It helped shape civilization as we know it, and we’re not just talking about those commercials where the guys say “whasssup?!”

Perhaps it shouldn’t be surprising that the story of the world’s most-widely consumed intoxicating beverage is shrouded in legend and half-truths. Was beer brewed to make water potable? When did hops enter the picture? And what do the Budweiser Frogs have in common with Captain Jack Sparrow? The short answers, respectively, are “no,” “the 9th century, at the latest,” and “Gore Verbinski.” But the long version is more fun.


Ancient Sumerians were cultivating grains thousands of years ago, and there’s some debate about what they were doing with the grains they grew. According to one theory, the grains were used to make beer before bread ever entered the picture. The discovery of ancient tools potentially designed for beer brewing supports this. That would mean beer was part of the origin of agriculture, which is arguably what allowed humans to build civilizations, which led to the development of new technologies, which made it possible to brew even more beer.

In 2018, archaeologists from Stanford announced they had evidence that people in what is now Israel were brewing something like beer around 12,000 years ago, which they noted predates “the appearance of domesticated cereals by several millennia in the Near East.” The archaeologists speculated that it was likely a thin gruel possibly consumed for religious purposes.

Beer, by the way, is any fermented, alcoholic beverage made with cereal grains such as wheat or barley. There was a time when beer and ale were considered two different beverages, with beer defined by the presence of hops, but we’re going to proceed with more modern usage, where the two words are basically interchangeable. We’ll get to hops soon enough (about 11,000 years).

Early beer was likely made by crushing up grains, heating them gradually in water, then possibly baking them, and steeping them again. This process encourages fermentation. Grains contain starches, and heating up grains helps break these starches down into their simple sugar components. Fermentation happens when yeast microbes consume these sugars and convert them into alcohol, flavor compounds, and carbon dioxide. It’s sometimes said that Louis Pasteur discovered yeast in the mid-1800s, but that’s a little misleading. Sure, single-celled organisms like yeast are invisible to the naked eye, but when millions or billions congregate in the beer-making process they can be seen and manipulated.

In Beer in the Middle Ages and the Renaissance, Richard Unger points to several indications that brewers began to intuit the vital role of yeast hundreds of years before Pasteur’s time. Rather than relying on wild yeast in the air, a 15th-century brewer in Munich received permission to use a specific source of yeast from the bottom of his brew. In 16th-century Norwich, England, brewers recognized the value of skimming off excess yeast for use in bread-making and further brewing; they even donated some of that yeast to charities. Even without Pasteur’s experiments defining the biological processes that result in living yeast turning glucose into ethanol—what he dubbed alcoholic fermentation—people evidently knew that fermentation made beer bubbly, flavorful, alcoholic, and generally much more fun to drink than plain barley water.


Beer was among the Sumerians’ most influential contributions to the world, right behind written language and a formal number system. And the Sumerians knew they had come up with something big. They even had a goddess of beer and brewing named Ninkasi. Around 1800 BCE, a hymn was written for Ninkasi that doubled as a beer recipe. Because the recipe was set in song, it was easy for the average beer drinker to memorize even if they didn’t know how to read. It’s also the oldest beer recipe ever discovered. Here’s an excerpt:

“Ninkasi, You are the one who handles the dough with a big shovel….you are the one who waters the malt set on the ground. … You are the one who soaks the malt in a jar, the waves rise, the waves fall…You are the one who spreads the cooked mash on large reed mats, coolness overcomes.”

If the Sumerians penned a similarly poetic cure for hangovers, it hasn’t been discovered.

The Ancient Egyptians were also fanatical about their beer. They believed that beer brewing knowledge was a gift from the god Osiris, and they incorporated the drink into their religious ceremonies. It infiltrated other parts of Egyptian culture as well. Beer was so common that the laborers who built the pyramids of Giza were given daily rations amounting to about 10 pints of the stuff. It was also served at celebrations, where over-imbibing was not only accepted, but encouraged. As far as etiquette was concerned, leaving a party when you could still walk straight was the Egyptian equivalent of not finishing your meal.


Adding unusual flavors to beer is not a new phenomenon. Before the first hipster microbrewery opened, ancient beer makers were using ingredients like carrots, bog myrtle, hemp, and cheese to make their concoctions. But one component that’s found in virtually all beer today took a while to enter the picture. That would be hops, the ingredient that gives beer its bitter, floral taste. Though it’s more noticeable in IPAs, the vast majority of beers depend on hops to balance out their sweetness. And hops, by the way, isn’t the name of the plant; it’s the name of the flower or “cone” that comes from the plant. The plant itself is called Humulus lupulus, which means “climbing wolf” in Latin.

During the Middle Ages, Catholic monks supported themselves by selling homemade goods like cheese, mustard, and in some cases, beer. These monks were likely the first people to make beer with hops. In the mid-800s, Adalard of Corbie, an abbot associated with the monastery of Corvey in Germany (and a cousin of Charlemagne’s), referred to the use of hops in brewing. A few hundred years after that first written reference, German abbess and eventual Catholic saint Hildegard of Bingen wrote, in her book Physica, that hops “make the soul of a man sad and weigh down his inner organs.”

According to beer scholar William Bostwick, her description was actually so scathing that it helped launch a beer war between Catholics and Protestants. Partly inspired by Hildegard, Catholics ditched hops and fully embraced gruit, which was the mixture of herbs and aromatics used to flavor most early beers. This made hoppy beer anti-Catholic, so naturally Protestants claimed it as their own. Martin Luther himself was even a proponent of the beverage. During the Reformation in the 16th century, the rise of Protestantism helped boost hops’ profile in Europe. And hops had another advantage in the beer wars: The ingredient contains beta acids that delay spoilage and act as a natural preservative. Monks weren’t aware of this property when they first added hops to beer, and when it did become clear centuries later, that was the final nail in gruit’s coffin.


Beer was a popular drink of the lower classes from ancient times through the Middle Ages, but there’s some confusion as to why. You may have heard that peasants drank beer every day because it was more sanitary than the water they had access to, and it makes a certain amount of sense. Brewing generally involves boiling the unfermented beer, or wort; this would theoretically kill off pathogens. Once fermentation takes place, the alcohol itself would presumably provide further disinfection.

While it’s hard to say that beer was never looked at as a healthier alternative to water, the theory doesn’t stand up to much scrutiny as it relates to the Middle Ages. The truth is that clean water wasn’t that hard to come by, even in poorer communities. People could get free water from wells and streams, and some places like London even had cisterns by the 13th century. A more likely explanation for beer’s popularity with poor and working class people is that, beyond its intoxicating effects, it was viewed as a cheap source of nutrition. If you were a worker in the Middle Ages, an afternoon pint provided a measure of hydration and quick calories at the same time.


Today the craft brewing industry skews heavily male, but women have likely been making beer for thousands of years. Female brewers during the 16th and 17th centuries may even have given rise to some of the iconic imagery around witches. From the cauldrons they brewed in to the pointy hats they wore (perhaps to attract customers) to the cats they kept to deal with grain-loving rodents, some writers have drawn a line connecting alewife entrepreneurs and what would eventually become witchy iconography.

While it’s difficult to find documented accusations of witchcraft leveled at brewsters, there does seem to be overlap in accusations of duplicity, perhaps a way to drive women out of a field that was quickly becoming dominated by men.


For good and ill, beer-making took some big steps forward during the Industrial Revolution. Emerging technologies like steam power and refrigeration led to a tastier, more consistent, and easier-to-brew beverage.

Industrialization and globalization also paved the way for the widespread consolidation of the modern beer industry. When Anheuser-Busch InBev acquired SABMiller for more than $100 billion in 2016, the resulting conglomerate contained over 500 beverage brands [PDF] and accounted for over a quarter of global beer market sales, according to market research firm Euromonitor International.

Many of the “craft beers” you know may very well belong to a conglomerate like this. Because these giant beer-makers are also giant beer-distributors, with a big say in which brands end up on store shelves, critics accuse them of reducing competition from independent producers and potentially stifling consumer choice. InBev would presumably argue that their scale and history in the industry allows them to operate more efficiently [PDF]. It’s interesting to note that you could buy a bottle of Budweiser or a bottle of Goose Island Bourbon County Stout and be supporting the same bottom line. Speaking of Budweiser …


During 1995’s Super Bowl, a commercial featuring three talking amphibians—Bud, Weis, and Er—aired, and for some reason, America fell in love. That spot was directed by Gore Verbinski, who would go on to direct the American remake of The Ring and the first three Pirates of the Caribbean films. The frogs were brought to life by artists at Stan Winston’s studio, the same legendary company that helped create the dinosaurs in Jurassic Park and The Terminator’s titular cyborg. That’s five combined Academy Awards and billions in box office success, all in the service of selling beer.

Mass-produced beer made from hops, grains, and yeast is standard today, but the brews of the past haven’t disappeared completely. Resurrecting ancient beer recipes has become a popular pastime among home brewers. Even some commercial breweries have joined the trend. New Belgium makes gruit ale, and Dogfish Head collaborated with a molecular archeologist to recreate a beer based on residue collected from what may have been King Midas’s tomb. But the truth is, no matter what beer you reach for, you’ll be drinking something that connects you to the very beginnings of civilization.

A Brief History of the Rain Boot

H/T TreeHugger.com.

Until I read this article I did not know the history of rain boots.

April showers, indeed! Here in Southern Florida, rain boots have become standard attire these days and, from the looks of my weather app, for many other places too. It’s hard to believe that there was once a time when rain boots didn’t exist, when people walked out in wet, muddy weather in their regular shoes. It wasn’t even that long ago! Herein, a brief history of the practical, yet ever-stylish, rain boot.

Rain boots first made their debut on the feet of Arthur Wellesley in Britain in the early 19th century. Also known as the Duke of Wellington, the military man (like many others of his day) used to wear Hessian boots. Hessian boots, standard issue in the military, were made out of leather, had a semi-pointed toe, reached up to the knee and had a tassel on the top. (Think Mr. Darcy in “Pride and Prejudice”). Thinking he could improve on them, Wellesley commissioned his personal shoemaker to make a variation just for him. He asked him to do away with the trim around the calf, shorten the heel and cut the boot closer around the leg. The result, known as Wellingtons, quickly took hold among the British aristocracy, and the name wellies endures to this day.


The original Wellington boots were fashioned out of leather, but in the mid-19th century, a man named Hiram Hutchinson bought the patent for vulcanization of natural rubber for footwear from Charles Goodyear (the inventor of the vulcanization process) and began manufacturing rubber Wellingtons. The introduction of the rubber Wellington was met with much approval, especially among farmers, who could now work all day and still have clean, dry feet.

The Wellington became even more popular after World War I and World War II. Soldiers often spent long hours in flooded European trenches, and the rubber boots allowed their feet to stay warm and dry. By the end of World War II, men, women, and children were all wearing the rain boot. Hunter Boot, the company commissioned to make boots for the British Army in both wars, continues to sell their signature boots today.


Rain boots are still called wellies in England, but around the world are referred to as billy boots, gummies, gumboots and, of course, rain boots. In South Africa, where they are called gumboots, miners used the boots to communicate with each other when talking wasn’t permitted. The miners even created gumboot dances (whose variations have become popular entertainment today) to keep themselves from getting bored.

Wellies in all styles

The Wellington’s low manufacturing cost made it the standard footwear for a variety of professions – often reinforced with a steel toe to prevent injury. Used in factories, meat packing plants, farms, clean rooms for delicate electronics, even fast-food environments, rubber boots are just practical – and stylish.


Whereas most rain boots could only be found in a few colors (olive green, yellow, black) 50 years ago, they are manufactured in all colors (and patterns) of the rainbow today. And even though they’re quite practical for muddy, rainy spring weather, they’ve become a fashion statement in their own right.

A Brief History of the Invention of the Home Security Alarm

H/T Smithsonian Magazine.

A hardworking nurse envisioned a new way to know who was at the door

Left, a portion of the patent plan designed by Marie Van Brittan Brown and her husband Albert, right. (Marie Van Brittan Brown and Albert L. Brown, courtesy U.S. Patent and Trademark Office; New York Times / Redux)

Marie Van Brittan Brown, an African American nurse living in Jamaica, Queens in the 1960s, was working odd shifts, as was her husband, Albert, an electronics technician. When she arrived home late, she sometimes felt afraid. Serious crimes in Queens jumped nearly 32 percent from 1960 to 1965, and police were slow to respond to emergency calls. Marie wanted to feel safer at home.


Enlisting her husband’s electrical expertise, Marie conceived a contraption that could be affixed to the front door. It would offer four peepholes, and through these, a motorized video camera on the inside could view visitors of different heights as the occupant toggled the camera up and down. The camera was connected to a television monitor inside. A microphone on the outside of the door and a speaker inside allowed an occupant to interrogate a visitor, while an alarm could alert police via radio. Closed-circuit television, invented during World War II for military use, was not widespread in the 1960s, and the Browns proposed using the technology to create the first modern home security system.

They filed a patent for their device in 1966, citing Marie as lead inventor. It was approved three years later. “The equipment is not in production,” the New York Times reported, “but the Browns hope to interest manufacturers and home builders.”

The Browns’ 1969 patent plan for an elaborate home security system suggests safety and relaxation can go hand in hand. (Marie Van Brittan Brown and Albert L. Brown, courtesy U.S. Patent and Trademark Office)

That never happened, presumably because the Browns’ system was ahead of its time. “The cost of installing it would be pretty high,” says Robert McCrie, an emergency management expert at John Jay College of Criminal Justice in Manhattan.

Marie’s invention, though it didn’t benefit them financially, would earn the Browns a measure of recognition in the tech world: The predecessor of today’s home security systems, it has been cited in 35 U.S. patents. Companies first offered CCTV to residential consumers around 2005, but Marie never saw her vision realized; she died in Queens in 1999, at the age of 76.

As the tech has become cheaper and smarter, home security has grown into a $4.8 billion business in North America and is expected to triple by 2024.

A Brief History of Peanut Butter

H/T Smithsonian Magazine.

The bizarre sanitarium staple that became a spreadable obsession

Jars of peanut butter
Veteran food critic Florence Fabricant has called peanut butter “the pâté of childhood.” (Dan Saelinger)

North Americans weren’t the first to grind peanuts—the Inca beat us to it by a few hundred years—but peanut butter reappeared in the modern world because of an American, the doctor, nutritionist and cereal pioneer John Harvey Kellogg, who filed a patent for a proto-peanut butter in 1895. Kellogg’s “food compound” involved boiling nuts and grinding them into an easily digestible paste for patients at the Battle Creek Sanitarium, a spa for all kinds of ailments. The original patent didn’t specify what type of nut to use, and Kellogg experimented with almonds as well as peanuts, which had the virtue of being cheaper. While modern peanut butter enthusiasts would likely find Kellogg’s compound bland, Kellogg called it “the most delicious nut butter you ever tasted in your life.”


A Seventh-Day Adventist, Kellogg endorsed a plant-based diet and promoted peanut butter as a healthy alternative to meat, which he saw as a digestive irritant and, worse, a sinful sexual stimulant. His efforts and his elite clientele, which included Amelia Earhart, Sojourner Truth and Henry Ford, helped establish peanut butter as a delicacy. As early as 1896, Good Housekeeping encouraged women to make their own with a meat grinder, and suggested pairing the spread with bread. “The active brains of American inventors have found new economic uses for the peanut,” the Chicago Tribune rhapsodized in July 1897.


A vintage peanut butter advertisement
“It’s the Great Depression that makes the PB&J the core of childhood food,” food historian Andrew F. Smith has said. (Buyenlarge / Getty Images)

Before the end of the century, Joseph Lambert, an employee at Kellogg’s sanitarium who may have been the first person to make the doctor’s peanut butter, had invented machinery to roast and grind peanuts on a larger scale. He launched the Lambert Food Company, selling nut butter and the mills to make it, seeding countless other peanut butter businesses. As manufacturing scaled up, prices came down. A 1908 ad for the Delaware-based Loeber’s peanut butter—since discontinued—claimed that just 10 cents’ worth of peanuts contained six times the energy of a porterhouse steak. Technological innovations would continue to transform the product into a staple, something Yanks couldn’t do without and many a foreigner considered appalling.

By World War I, U.S. consumers—whether convinced by Kellogg’s nutty nutrition advice or not—turned to peanuts as a result of meat rationing. Government pamphlets promoted “meatless Mondays,” with peanuts high on the menu. Americans “soon may be eating peanut bread, spread with peanut butter, and using peanut oil for our salad,” the Daily Missourian reported in 1917, citing “the exigencies of war.”

The nation’s food scientists are nothing if not ingenious, and peanut butter posed a slippery problem that cried out for a solution. Manufacturers sold tubs of peanut butter to local grocers, and advised them to stir frequently with a wooden paddle, according to Andrew Smith, a food historian. Without regular effort, the oil would separate out and spoil. Then, in 1921, a Californian named Joseph Rosefield filed a patent for applying a chemical process called partial hydrogenation to peanut butter. In this method, the main naturally occurring oil in peanut butter, which is liquid at room temperature, is converted into an oil that’s solid or semisolid at room temperature and thus remains blended. The process had been used to make substitutes for butter and lard, like Crisco, but Rosefield was the first to apply it to peanut butter. This more stable spread could be shipped across the country, stocked in warehouses and left on shelves, clearing the way for the national brands we all know today. The only invention that did more than hydrogenation to cement peanut butter in the hearts (and mouths) of America’s youth was sliced bread—introduced by a St. Louis baker in the late 1920s—which made it easy for kids to construct their own PB&Js. (In this century, the average American kid eats some 1,500 peanut butter and jelly sandwiches before graduating from high school.)

Rosefield went on to found Skippy, which debuted crunchy peanut butter and wide-mouth jars in the 1930s. In World War II, tins of (hydrogenated) Skippy were shipped with service members overseas, while the return of meat rationing at home again led civilians to peanut butter. Even today, when American expats are looking for a peanut butter fix, they often seek out military bases: They’re guaranteed to stock it.

But while peanut butter’s popularity abroad is growing—in 2020, peanut butter sales in the United Kingdom overtook sales of the Brits’ beloved jam—enjoying the spread is still largely an American quirk. “People say to me all the time, ‘When did you know that you had fully become an American?’” Ana Navarro, a Nicaraguan-born political commentator, told NPR in 2017. “And I say, ‘The day I realized I loved peanut butter.’”

Though the United States lags behind China and India in peanut harvest, Americans still eat far more of the spread than the people in any other country: It’s a gooey taste of nostalgia, for childhood and for American history. “What’s more sacred than peanut butter?” Iowa Senator Tom Harkin asked in 2009, after a salmonella outbreak was traced back to tainted jars. By 2020, when Skippy and Jif released their latest peanut butter innovation—squeezable tubes—nearly 90 percent of American households reported consuming peanut butter.

The ubiquity of this aromatic spread has even figured in the nation’s response to Covid-19. As evidence emerged last spring that many Covid patients were losing their sense of smell and taste, Yale University’s Dana Small, a psychologist and neuroscientist, devised a smell test to identify asymptomatic carriers. In a small, three-month study of health care workers in New Haven, everyone who reported a severe loss of smell using the peanut butter test later tested positive. “What food do most people in the U.S. have in their cupboards that provides a strong, familiar odor?” Small asks. “That’s what led us to peanut butter.”


George Washington Carver’s research was about more than peanuts
By Emily Moon



Carver in his laboratory, circa 1935. (Hulton Archive / Getty Images)

No American is more closely associated with peanuts than George Washington Carver, who developed hundreds of uses for them, from Worcestershire sauce to shaving cream to paper. But our insatiable curiosity for peanuts, scholars say, has obscured Carver’s greatest agricultural achievement: helping black farmers prosper, free of the tyranny of cotton.

Born enslaved in Missouri around 1864 and trained in Iowa as a botanist, Carver took over the agriculture department at the Tuskegee Institute, in Alabama, in 1896. His hope was to aid black farmers, most of whom were cotton sharecroppers trapped in perpetual debt to white plantation owners. “I came here solely for the benefit of my people,” he wrote to colleagues on his arrival.

He found that cotton had stripped the region’s soil of its nutrients, and yet landowners were prohibiting black farmers from planting food crops. So Carver began experimenting with plants like peanuts and sweet potatoes, which could replenish the nitrogen that cotton leached and, grown discreetly, could also help farmers feed their families. In classes and at conferences and county fairs, Carver showed often packed crowds how to raise these crops.

Since his death in 1943, many of the practices Carver advocated—organic fertilizer, reusing food waste, crop rotation—have become crucial to the sustainable agriculture movement. Mark Hersey, a historian at Mississippi State University, says Carver’s most prescient innovation was a truly holistic approach to farming.

“Well before there was an environmental justice movement, black environmental thinkers connected land exploitation and racial exploitation,” says Hersey. A true accounting of American conservation, he says, would put Carver at the forefront.

A Brief History of the Refrigerator

H/T MarthaStewart.com.

From ice boxes to space-saving marvels, here’s how the refrigerator became the modern appliance we know and love.


Can you imagine a kitchen without a refrigerator? It’s hard to believe, but the way that home cooks keep their groceries cool is relatively new. If you’re wondering who invented the concept of refrigeration, it’s hard to say: people first began freezing water in China around 1000 B.C., and many societies (including the Greeks, Romans, and Hebrews) stored snow in insulated materials to keep foods cool, according to the International Journal of Refrigeration. In the 18th century, Europeans often collected ice in the winter and salted large pieces before storing it deep underground, and a Colonial Williamsburg Foundation report says the practice would help ice keep for months. Before the advent of the refrigerator, people spent a lot of time preserving food by canning, smoking, drying, or salting.

It wasn’t until the early 1860s that Americans were introduced to the icebox, an early precursor of the refrigerator. Tim Buszka, a senior associate product marketing manager with the Whirlpool Corporation, says the icebox became more commonplace for middle- and upper-class families in the 1890s. “There’s not one real clear inventor of the modern refrigerator,” Buszka says. “It was mostly auto companies who created early refrigerator models; Frigidaire was owned by General Motors way back when.”

The earliest models of the refrigerator really just had one feature: a chunk of ice. According to archival records from the Smithsonian National Museum of American History, the icebox was an insulated cabinet with a compartment containing ice that kept perishable foods cool. Fresh ice had to be inserted every week or so.

ice box illustration

When the first home refrigerator was introduced in the early 1910s, Buszka says it was a luxury for even the wealthiest Americans. “Back then, the cold box itself would exist on the first floor in the kitchen and you had a supplemental unit in the basement,” Buszka says, explaining that the first air compressors were extremely loud.


It wasn’t until the early 1920s that companies like Whirlpool introduced the earliest forms of the single-unit refrigerator, featuring a brand-new technology: evaporative cooling. It was a self-contained unit, and “wasn’t cheap at the time, but didn’t require the same amount of installation and maintenance of earlier models,” Buszka explained.

early refrigerator

According to Pacific Standard magazine, only eight percent of American residences had a refrigerator in the early 1930s, but by the early 1940s almost 45 percent of American homes had ditched iceboxes and installed a refrigerator.

The Whirlpool models produced en masse in the early 1930s had top freezers. Design features that we know and love now, like wood trim handles and bottom-drawer freezers came along later. Beginning in the 1950s, Whirlpool kicked off the trend of designing refrigerators and other appliances in vivid colors, including signature hues like “harvest gold” and “avocado green.” And in the 1970s, design features like side-by-side doors were introduced.

Woman using an old refrigerator


“Between the 1930s and 1970s, the evolution of refrigerator design focused on configuration, evolution, and organization,” Buszka says. “But the development we’re most proud of is energy-efficient refrigerators during the 1980s. People think of the refrigerator as being very power-dependent, but in reality, it can run on as little power as one incandescent light bulb.”

In the 21st century, all refrigerator models are anchored in cutting-edge technology and have features that are increasingly customizable. And Whirlpool is looking to revolutionize modern refrigerators by returning to their 1920s roots. “Old ice boxes were basically the first four-door refrigerators, where each door had a specific function beyond supporting cooling,” Buszka says. “In some markets, we’ve launched [four-door] refrigerators that have quick grab zones for kids who have to stand on a chair to access the fridge.”

More innovative functions currently in development include “total coverage cooling,” a design feature that pipes cold air to each and every shelf in the fridge, meaning you can finally place milk in the back without worrying it’ll freeze. Now that’s truly far out.


A Brief History of the TV Dinner

H/T Smithsonian Magazine.

I remember as a kid I thought they were great.

Thanksgiving’s most unexpected legacy is heating up again

Vintage Swanson TV dinner packaging
A new form of entertainment and a wandering trainload of frozen turkey triggered a convenience food boom. (Evan Angelastro)

In 1925, the Brooklyn-born entrepreneur Clarence Birdseye invented a machine for freezing packaged fish that would revolutionize the storage and preparation of food. Maxson Food Systems of Long Island used Birdseye’s technology, the double-belt freezer, to sell the first complete frozen dinners to airlines in 1945, but plans to offer those meals in supermarkets were canceled after the death of the company’s founder, William L. Maxson. Ultimately, it was the Swanson company that transformed how Americans ate dinner (and lunch)—and it all came about, the story goes, because of Thanksgiving turkey.


According to the most widely accepted account, a Swanson salesman named Gerry Thomas conceived the company’s frozen dinners in late 1953 when he saw that the company had 260 tons of frozen turkey left over after Thanksgiving, sitting in ten refrigerated railroad cars. (The train’s refrigeration worked only when the cars were moving, so Swanson had the trains travel back and forth between its Nebraska headquarters and the East Coast “until panicked executives could figure out what to do,” according to Adweek.) Thomas had the idea to add other holiday staples such as cornbread stuffing and sweet potatoes, and to serve them alongside the bird in frozen, partitioned aluminum trays designed to be heated in the oven. Betty Cronin, Swanson’s bacteriologist, helped the meals succeed with her research into how to heat the meat and vegetables at the same time while killing food-borne germs.


A vintage Swanson TV dinner advertisement
“Eating off a tray in the dusk before a TV set is an abomination,” the columnist Frederick C. Othman wrote in 1957. (Advertising Archive / Everett Collection)


The Swanson company has offered different accounts of this history. Cronin has said that Gilbert and Clarke Swanson, sons of company founder Carl Swanson, came up with the idea for the frozen-meal-on-a-tray, and Clarke Swanson’s heirs, in turn, have disputed Thomas’ claim that he invented it. Whoever provided the spark, this new American convenience was a commercial triumph. In 1954, the first full year of production, Swanson sold ten million trays. Banquet Foods and Morton Frozen Foods soon brought out their own offerings, winning over more and more middle-class households across the country.

Whereas Maxson had called its frozen airline meals “Strato-Plates,” Swanson introduced America to its “TV dinner” (Thomas claims to have invented the name) at a time when the concept was guaranteed to be lucrative: As millions of white women entered the workforce in the early 1950s, Mom was no longer always at home to cook elaborate meals—but now the question of what to eat for dinner had a prepared answer. Some men wrote angry letters to the Swanson company complaining about the loss of home-cooked meals. For many families, though, TV dinners were just the ticket. Pop them in the oven, and 25 minutes later, you could have a full supper while enjoying the new national pastime: television.

In 1950, only 9 percent of U.S. households had television sets—but by 1955, the number had risen to more than 64 percent, and by 1960, to more than 87 percent. Swanson took full advantage of this trend, with TV advertisements that depicted elegant, modern women serving these novel meals to their families, or enjoying one themselves. “The best fried chicken I know comes with a TV dinner,” Barbra Streisand told the New Yorker in 1962.

By the 1970s, competition among the frozen food giants spurred some menu innovation, including such questionable options as Swanson’s take on a “Polynesian Style Dinner,” which doesn’t resemble any meal you will see in Polynesia. Tastemakers, of course, sniffed, like the New York Times food critic who observed in 1977 that TV dinner consumers had no taste. But perhaps that was never the main draw. “In what other way can I get…a single serving of turkey, a portion of dressing…and the potatoes, vegetable and dessert…[for] something like 69 cents?” a Shrewsbury, New Jersey, newspaper quoted one reader as saying. TV dinners had found another niche audience in dieters, who were glad for the built-in portion control.

The next big breakthrough came in 1986, with the Campbell Soup Company’s invention of microwave-safe trays, which cut meal preparation to mere minutes. Yet the ultimate convenience food was now too convenient for some diners, as one columnist lamented: “Progress is wonderful, but I will still miss those steaming, crinkly aluminum TV trays.”

With restaurants closed during Covid-19, Americans are again snapping up frozen meals, spending nearly 50 percent more on them in April 2020 over April 2019, says the American Frozen Food Institute. Specialty stores like Williams Sonoma now stock gourmet TV dinners. Ipsa Provisions, a high-end frozen-food company launched this past February in New York, specializes in “artisanal frozen dishes for a civilized meal any night of the week”—a slogan right out of the 1950s. Restaurants from Detroit to Colorado Springs to Los Angeles are offering frozen versions of their dishes for carryout, a practice that some experts predict will continue beyond the pandemic. To many Americans, the TV dinner tastes like nostalgia; to others, it still tastes like the future.