The History of the Piñata

H/T Back Then History.

I learned a lot about the Piñata.

It Originated in China

Today, the piñata is a staple at many celebrations and plays a particularly central role in Mexican fiestas. You may think of it as just a simple object, but it has a surprisingly fascinating history! The piñata is thought to have originated over 700 years ago in Asia. Specifically, the Chinese fashioned paper-covered animals (including cows, oxen, and buffalo) to celebrate the New Year, decorating them with colorful harnesses and other trappings. They then filled the figures with seeds and struck them with sticks until the seeds spilled out. Afterwards, the remains were burned; the ashes were thought to bring good luck in the coming year. Marco Polo is thought to have discovered this Chinese practice and introduced it to the Western world.

It Became Part of Lenten Traditions in Europe

In the 14th century, the piñata entered Europe and was quickly adapted to the Christian season of Lent. The first Sunday of Lent was known as “Piñata Sunday” – the name comes from the Italian word pignatta, meaning “fragile pot,” because early European piñatas resembled clay pots. When the custom spread from Italy to Spain, the first Sunday in Lent there became known as the “Dance of the Piñata.” The Spanish fiesta featured a clay container called la olla (the Spanish word for pot); originally, it was not decorated, but over time decorations like tinsel, ribbon, and fringed paper were added.

Indigenous Peoples Had Their Own Version

When Spanish missionaries travelled to the Americas, they used the piñata to attract crowds and attention at their ceremonies. However, the indigenous peoples already had a similar tradition of their own: to celebrate the birthday of Huitzilopochtli, the Aztec god of war, Aztec priests placed a clay pot on a pole in the temple at the end of each year. The clay pot was decorated with feathers and filled with small treasures. When broken with a stick, the treasures would fall at the god’s feet as an offering. The Mayans also played a sport in which a player’s eyes were covered and they had to try to hit a hanging clay pot.

Missionaries Gave It Religious Meaning

The missionaries transformed these indigenous traditions for the purpose of religious instruction. They covered the traditional pot in colored paper so that it appeared different (and perhaps even frightening) to the local peoples. The original piñata featured seven points, which the missionaries used to symbolize the Seven Deadly Sins of Christianity: envy, sloth, gluttony, greed, lust, wrath, and pride. (There is also a traditional ten-pointed piñata, which missionaries said symbolized the sins that come from breaking the Ten Commandments.) The stick used to break the piñata was said to represent love, which destroyed sin and temptation, thus imparting a religious lesson; some people also say the piñata itself was meant to represent Satan. The treats (usually candies and fruits) that fell from the broken piñata were said to represent God’s forgiveness of sins and a new beginning. Another interpretation holds that the fruits represented temptations and earthly pleasures, while yet another holds that sharing the fruits and candies represented a reward for keeping the faith – a share in divine blessings and gifts.

Today, the piñata has lost most of its religious meaning. Instead of being used as a tool to teach the Christian catechism, it is simply a fun pastime at celebrations. It’s especially popular at Mexican fiestas and is used to mark special holidays, such as Christmas and Cinco de Mayo. It’s also popular at children’s parties, and many commercially available piñatas are made in the likeness of beloved children’s characters.


The History of the Necktie

H/T Back Then History.

I learned quite a lot about the history of ties.

It’s Been Around for a Long Time

Neckties are a staple fashion accessory in the 21st century. But when and how did ties become an integral part of formal dress for men? The history of the necktie goes back a lot further than you might think! Many of the terracotta sculptures of Chinese soldiers from the 3rd century B.C. feature a carefully tied neck cloth. Since the item is only shown on select soldiers, historians think the cloth was an honorary badge used to denote exemplary performance. In ancient Rome, a band of linen called a “sudarium” was worn around the neck (or sometimes tied around the waist) of most men. Emperor Trajan’s soldiers are depicted wearing them on Trajan’s Column, leading historians to believe that the sudarium had its roots in military attire. In ancient Egypt, cloths adorned with precious stones were worn around the necks of some pharaohs, and certain members of tribes in Oceania wore neck adornments as well.

It Has Its Roots in Military Dress

While neck cloths and adornments were worn in ancient cultures across the world, what we think of as the modern necktie didn’t come into existence until the 17th century. During the Thirty Years War, French King Louis XIII hired Croatian mercenaries to aid his cause. The Croatian mercenaries all wore colorful neckerchiefs as an official part of their uniforms. The front-line soldiers’ neckerchiefs were made from common materials, while the officers wore neckerchiefs made from muslin or silk. The neckerchiefs were knotted around the soldiers’ necks and used to hold up their capes; the ends of the cloths were either arranged in a bow, finished with a tuft or tassel, or left hanging loosely. King Louis XIII liked the look of these neckerchiefs so much that he required the item be worn during all royal events. The French first called the accessory a “croate,” after the Croatian soldiers, but the name was soon corrupted to “la cravate” or “cravatte” – cravat in English. The cravat caught on in England after Charles II reclaimed the throne, and there are reports of German soldiers adopting the Croatian mercenaries’ neckties as a style accessory as well. Over the course of the next century, the cravat caught on in Germany, France, and throughout the English colonies.

It Was a Gentlemen’s Accessory

The cravat quickly became a style accessory for gentlemen and was associated with power, wealth, and elegance. A group of young British gentlemen who called themselves the Macaronis helped popularize the cravat in the 1760s. In the early 1800s, Napoleon wore one during the Battle of Waterloo. And in Regency England, a young dandy named Beau Brummell helped to solidify the importance of how a cravat was tied; the young fashion icon wore a complicated cravat knot that appeared effortless but was actually incredibly intricate and took hours to get correct. However, his style was financially accessible to both middle-class and upper-class gentlemen, so it caught on quickly. As the use of precise cravat knots spread, the notion arose that a well-tied knot was the mark of a true gentleman; in fact, we still consider sharp dress the mark of a man of taste today! (Incidentally, Brummell also started the trend of wearing black evening wear, another trend that still endures today.) Soon there were so many complicated ways to tie a cravat that in 1818, a pamphlet called Neckclothitania was published to help men learn about all the different options. Some of the popular knots covered in the instructional booklet included: The Mathematical, The American, The Irish, and The Mail Coach. Interestingly, the booklet was the first printed material to use the word “tie” rather than “cravat” to refer to a neckcloth. By 1840, the term “tie” had almost entirely replaced the word “cravat” in popular usage.

Until the 1860s, the tie was a handmade product fashioned from high-quality materials such as muslin, white lace, or linen. It also had to be laundered and pressed often by servants and was (generally) only worn by gentlemen. However, during the late 1800s, men became increasingly aware of how they looked while out in public, and the tie provided a way to take pride in their appearance. More and more men in the middle class began to dress above their current station as part of their aspirations toward upward mobility. The invention of the sewing machine and industrialization made ties even more accessible, since they were no longer handmade items but were instead mass produced.

Its Famous Knot Was Invented in the 1800s

Increased availability coupled with its continued popularity among London’s most influential young gentlemen ensured the tie’s enduring place in men’s fashion. In fact, the Four-in-Hand Knot, which remains one of the most well-known tie knots today, was actually created all the way back in the late 1800s! Since the knot resembled the way a driver would tie his reins to direct a carriage pulled by four horses, the name was a natural fit. There’s also a theory that the knot is named after London’s famous gentlemen’s carriage-driving club, the Four in Hand, which helped popularize the new knot. In addition to the Four-in-Hand knot, another neck accessory, the ascot, also became popular around this same time (it is named for the Royal Ascot horse race). The well-known bow tie also emerged as a fashion favorite during this period, although it was first invented at the beginning of the 18th century.

It Was Modernized in the 1920s

In the 1920s, a New York tie maker named Jesse Langsdorf came up with a new way of cutting the fabric when creating a tie. He cut the fabric on an angle (on the bias) and then sewed it in three segments. Unlike older versions of the necktie, Langsdorf’s innovation allowed a tie to spring back to its original shape after being worn. This made tie wearing even more accessible for men from all walks of life. And indeed, the popularity of Langsdorf’s necktie endured; it is the version that men still wear today!

It Gained Variety in the 20th Century

After Langsdorf’s innovation, the basic construction of the tie did not change, but the accessory still underwent many different style periods throughout the 1900s. In 1936, the Duke of Windsor famously created the Windsor knot, which is still worn by many men today. Ties in bright colors and patterns were introduced after the end of World War II and remain popular today as well. In the 1950s, the skinny tie was introduced to the public. Legend has it that the style was invented because tie makers were running short on fabric, but whether or not this explanation is true, it’s certain that the skinnier style complemented the well-tailored suits favored by men in the 1950s. In the 1970s, extra-large ties called kipper ties became popular, as they complemented the dominant style of men’s suits at the time.

It Has an Enduring Legacy

The tie remains as popular as ever in the 21st century. Men commonly wear them to job interviews, at the office, and for important events. Just like in the 19th century, it is thought that wearing a tie may help men from all walks of life convey a sense of professionalism and power. But its modern popularity also has to do with self-expression. As society moved through the close of the 19th century and into the early 20th century, men’s fashion became less extravagant, leaving the tie as one of the only elements of fashion through which men could express their personality. In the 21st century, the tie continues to function as a way for men to express themselves through their sartorial choices. Today, ties are available in a variety of widths, lengths, materials, colors, and patterns, so men can choose whatever combination best complements their style and personality.


The History of Shortbread Cookies

H/T Back Then History.

Shortbread has an amazing history.

They Have Medieval Origins

Shortbread cookies are a beloved staple at many family gatherings, including holidays and weddings. But you may be surprised to learn that there’s quite the history behind the buttery treats you love so much! Shortbread cookies are thought to have evolved from a food called medieval biscuit bread, a type of twice-baked biscuit made from leftover bread dough that usually contained some added spices and/or sugar. This biscuit bread, sometimes called rusk, was popular in many European countries during the Middle Ages. Over time, the Scottish began to use butter in place of yeast, and the dry, hard biscuit bread evolved into the crumbly, buttery shortbread cookies we know and love today!

They’ve Been Around for Centuries

A Scottish woman named Mrs. McLintock is credited with writing the first shortbread recipe that appeared in print. It was included in a cookbook in 1736. However, shortbread had become a staple at Scottish family events long before that – some experts estimate that shortbread was being consumed in Scotland as early as the 12th century. Over time, the proportions of one part sugar, two parts butter, and three parts flour became the standard shortbread recipe and many bakers still use this same basic recipe today.
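The 1:2:3 proportions work by weight, so scaling a batch up or down is simple arithmetic. As a purely illustrative sketch (the function name and gram amounts are my own, not from any historical recipe), here is how the ratio translates into ingredient quantities:

```python
# Classic shortbread ratio by weight:
# 1 part sugar, 2 parts butter, 3 parts flour.
RATIO = {"sugar": 1, "butter": 2, "flour": 3}

def shortbread_amounts(flour_grams):
    """Scale the 1:2:3 ratio from a desired amount of flour (in grams)."""
    part = flour_grams / RATIO["flour"]  # grams in one "part"
    return {name: part * parts for name, parts in RATIO.items()}

# For example, 300 g of flour calls for 200 g of butter and 100 g of sugar.
print(shortbread_amounts(300))
```

Any starting ingredient could anchor the calculation the same way; flour is used here simply because it is the largest part.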

They Were Once Reserved as a Celebratory Treat

Because shortbread cookies rely on ingredients that would have been expensive for most Scottish families, they were reserved for special occasions such as weddings, Christmas, and celebrating the New Year. In Shetland, it was traditional to break shortbread over a new bride’s head on the threshold of her new home, and all across Scotland, shortbread is still offered to the “first footers” when celebrating Hogmanay, the Scottish New Year’s Eve. After the Scottish started making shortbread, it made its way to the rest of the United Kingdom and ultimately all across the commonwealth territories. That’s why today, many families have a tradition of serving shortbread cookies at holiday parties and important life events.

They Were Famously a Royal Favorite

Early shortbread recipes were somewhat different from the shortbread cookies we know and love today. So how did the change take place? Many give the credit to Mary, Queen of Scots. The Queen spent much of her childhood in France and her French tastes influenced the food served in her court. Her chefs created a version of shortbread called “petticoat tails” that the Queen famously loved; the cookies were shaped like pizza slices and included caraway seeds for flavor as well as plenty of butter. The Queen’s preferred shortbread recipe is likely to have been more decadent than earlier versions and may have helped give rise to the buttery and sweet shortbread cookies that we know and love today. The Queen’s fondness for these shortbread cookies also helped to further popularize the Scottish treats and elevated them from their peasant roots to a more elegant food worthy of royalty.

There Are a Variety of Shapes & Styles

Shortbread generally comes in one of three shapes. The shortbread cookies most favored by Mary, Queen of Scots are called “petticoat tails” – they are made by dividing a large circle of dough into segments like a pizza. Shortbread shaped into thick rectangles (sometimes called “fingers”) is a common type still seen today. Individual circles called “shortbread rounds” are another shape that you may be familiar with. Sometimes, these round shortbreads are baked with a traditional design on top that’s reminiscent of the decorations used on the ancient Yule bannock – a round cake linked to sun worship in pre-Christian times. In addition to the different shapes, there are also many regional varieties of shortbread. Some popular regional variations include shortbread made with coriander and caraway seeds; shortbread made with almonds and orange peel; and ginger shortbread. And of course, bakers today often experiment with unique ingredients and flavors to create gourmet versions of these simple treats.

They’re Cleverly Named

Ever wondered why shortbread is called, well, shortbread? It’s actually quite clever! In baking, the term “short” indicates a crisp, crumbly texture. Shortbread is known for this type of texture, so the name makes sense. But there’s also a second layer of meaning. The term “shortening” is sometimes used more generally in baking to mean any type of fat in the recipe; in the case of shortbread, the fat source is butter, and it’s essential for achieving shortbread’s famous crumbly texture. Since fat is such a key ingredient in shortbread, the cookie’s name is thought to be a reference to the fat (or “shortening”) called for in the recipe. It’s also interesting to note that shortening does indeed function as its name implies – it interferes with the formation of gluten strands in pastry dough, making them shorter. These shortened gluten strands result in a pastry with the tender, crumbly texture characteristic of the shortbread cookies we know and love today!

The History of Industrial Masking Tape

H/T Science Center.

A look back at masking tape.

1920s car painting using masking tape


In the hustle and bustle on the plant floor, it’s easy to overlook small details like masking tape. But don’t be deceived.

Behind that humble roll of 3M tape is almost a century of R&D inspired by the people who use it.

Whether you’re baking a coat of paint, bundling cords, or labeling a box, 3M Industrial Masking Tape has been thoughtfully formulated to go on easy, hold on tight, and remove cleanly.

It’s a tradition that has been proudly championed for almost a hundred years, starting with the invention of masking tape in 1925 by 3M’s Richard G. Drew.

William L. McKnight marked the production of 3M’s two-billionth roll of tape in 1957. McKnight, left, is pictured with Herbert Buetow, 3M president, and Richard G. Drew, inventor of 3M’s Scotch brand transparent tape and masking tape.

Masking tape: created to customize your ride, 1920s style

If you’d like to appreciate how far 3M Industrial Masking Tape has come, it makes sense to start at the beginning: in 1925.

The Roaring Twenties were a time of prosperity and big changes, including a shift that made automobiles affordable, everyday necessities for the average North American. With the rise in popularity of car ownership came the desire to jazz up one’s ride with a custom paint job.

However, for those working in paint shops, perfecting the pinstripe without proper tape was a feat. Sometimes the adhesives stuck so firmly that trying to remove them ruined the paint job.

One morning, 3M employee Richard Drew was on a sales call at an auto shop, showing samples of a waterproof sandpaper, when he heard an outburst of profanity from the shop floor, sparked by a ruined paint job.

Drew pledged that he would create a tape that would solve the problem.

After two years of experimenting with vegetable oils, chicle, linseed, various resins, glue, glycerin, and treated crepe paper, he created Scotch® Brand Masking Tape. It adhered strongly and still came off easily, without damaging paint.

Customer-driven innovation continues to push masking tape technology forward.

The world has changed a lot since the 1920s. In today’s manufacturing landscape, you’re working with high demands, complex procedures, and advanced materials – and time is money.

3M continues to provide industrial masking tapes that are up to the task.

From high temperature paint baking and pipe coating to marking and temporary holding, there’s a 3M Industrial Masking Tape for your application. And with our line of easy-to-choose Masking Made Simple products, you can select the right tape for your job faster than ever.

The History of the Thermometer

H/T ThoughtCo.

A look at thermometers through the years.

Thermometers measure temperature by using materials that change in some way when they are heated or cooled. In a mercury or alcohol thermometer, the liquid expands as it is heated and contracts when it is cooled, so the length of the liquid column is longer or shorter depending on the temperature. Modern thermometers are calibrated in standard temperature units such as Fahrenheit (used in the United States), Celsius (used in Canada and most other countries), or Kelvin (used mostly by scientists).

The Thermoscope

Galileo thermometer.

Before there was the thermometer, there was the earlier and closely related thermoscope, best described as a thermometer without a scale. A thermoscope only showed the differences in temperatures; for example, it could show something was getting hotter. However, the thermoscope did not measure all the data that a thermometer could, such as an exact temperature in degrees.

Early History

Galileo Galilei (1564-1642), wood engraving, published in 1864.

Several people invented a version of the thermoscope at around the same time. In 1593, Galileo Galilei invented a rudimentary water thermoscope, which for the first time allowed temperature variations to be measured. Today, Galileo’s invention is called the Galileo thermometer, even though by definition it was really a thermoscope. It was a container filled with bulbs of varying mass, each with a temperature marking. The buoyancy of the liquid changes with temperature; some of the bulbs sink while others float, and the lowest floating bulb indicates the temperature.


In 1612, the Italian inventor Santorio Santorio became the first inventor to put a numerical scale on his thermoscope. It was perhaps the first crude clinical thermometer, as it was designed to be placed in a patient’s mouth for temperature taking.

Neither Galileo’s nor Santorio’s instruments were very accurate.

In 1654, the first enclosed liquid-in-a-glass thermometer was invented by the Grand Duke of Tuscany, Ferdinand II. The Duke used alcohol as his liquid. However, it was still inaccurate and did not use a standardized scale.

Fahrenheit Scale: Daniel Gabriel Fahrenheit

An old style mercury thermometer, which isn’t safe if it breaks, and could be hard to read anyway.

What can be considered the first modern thermometer, the mercury thermometer with a standardized scale, was invented by Daniel Gabriel Fahrenheit in 1714.

Daniel Gabriel Fahrenheit was the German physicist who invented the alcohol thermometer in 1709 and the mercury thermometer in 1714. In 1724, he introduced the standard temperature scale that bears his name—Fahrenheit scale—that was used to record changes in temperature in an accurate fashion.

The Fahrenheit scale divided the interval between the freezing and boiling points of water into 180 degrees: 32 degrees was the freezing point of water and 212 degrees was its boiling point. Zero degrees was based on the temperature of an equal mixture of water, ice, and salt. Fahrenheit also anchored his scale to the temperature of the human body; originally, human body temperature was 100 degrees on the Fahrenheit scale, but it has since been adjusted to 98.6 degrees.


Centigrade Scale: Anders Celsius

Anders Celsius portrait in full color.

The Celsius temperature scale is also referred to as the “centigrade” scale. Centigrade means “consisting of or divided into 100 degrees.” In 1742, the Celsius scale was invented by the Swedish astronomer Anders Celsius. The Celsius scale has 100 degrees between the freezing point (0 degrees) and the boiling point (100 degrees) of pure water at sea-level air pressure. The term “Celsius” was adopted in 1948 by an international conference on weights and measures.


Kelvin Scale: Lord Kelvin

Frost-covered statue of Lord Kelvin.

Lord Kelvin took the whole process one step further with his invention of the Kelvin scale in 1848. The Kelvin scale measures the ultimate extremes of hot and cold. Kelvin developed the idea of absolute temperature, building on the second law of thermodynamics, and developed the dynamical theory of heat.


In the 19th century, scientists were researching what the lowest possible temperature was. The Kelvin scale uses the same size of unit as the Celsius scale, but it starts at absolute zero, the coldest temperature theoretically possible, at which molecular motion essentially stops. Absolute zero is 0 kelvin, which is equal to minus 273.15 degrees Celsius.
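The three scales described above are related by simple linear formulas: Fahrenheit and Celsius degrees differ in size by a 9/5 ratio (180 Fahrenheit degrees span the same interval as 100 Celsius degrees), while Kelvin simply shifts the Celsius scale so that it starts at absolute zero. A minimal Python sketch (the helper function names are my own, chosen for illustration):

```python
def celsius_to_fahrenheit(c):
    # 100 Celsius degrees span the same interval as 180 Fahrenheit degrees
    # (a 9/5 ratio), and water freezes at 0 degrees C = 32 degrees F.
    return c * 9 / 5 + 32

def celsius_to_kelvin(c):
    # Kelvin uses Celsius-sized units but starts at absolute zero (-273.15 C).
    return c + 273.15

print(celsius_to_fahrenheit(100))   # boiling point of water: 212.0
print(celsius_to_kelvin(-273.15))   # absolute zero: 0.0
```

Checking the fixed points of each scale against these formulas (freezing at 0 °C = 32 °F, boiling at 100 °C = 212 °F) is a quick way to confirm a conversion is written correctly.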


When a thermometer was used to measure the temperature of a liquid or of air, the thermometer was kept in the liquid or air while the reading was being taken. Obviously, when you take the temperature of the human body, you can’t do the same thing. The mercury thermometer was therefore adapted so it could be taken out of the body and read afterwards. The clinical or medical thermometer was modified with a narrow constriction in its tube; this constriction creates a break in the mercury column that keeps the reading in place after you remove the thermometer from the patient. That is why you shake a mercury medical thermometer before (or after) you use it – shaking reconnects the mercury and returns the thermometer to room temperature.


Mouth Thermometers

Woman with thermometer in her mouth.

In 1612, the Italian inventor Santorio Santorio invented the mouth thermometer, perhaps the first crude clinical thermometer. However, it was bulky and inaccurate, and it took too long to get a reading.


The first doctors to routinely take the temperature of their patients were Hermann Boerhaave (1668–1738); Gerard L.B. Van Swieten (1700–1772), founder of the Viennese School of Medicine; and Anton De Haen (1704–1776). These doctors found temperature correlated to the progress of an illness. However, few of their contemporaries agreed, and the thermometer was not widely used.


First Practical Medical Thermometer

Modern digital thermometers all descend from the first medical thermometer invented by Sir Thomas Allbutt.

English physician Sir Thomas Allbutt (1836–1925) invented the first practical medical thermometer used for taking the temperature of a person in 1867. It was portable, 6 inches in length, and able to record a patient’s temperature in 5 minutes.


Ear Thermometer

Mother taking young boy’s temperature with ear thermometer.

Theodor Hannes Benzinger, a pioneering biothermodynamics scientist and flight surgeon with the Luftwaffe during World War II, invented the ear thermometer. David Phillips invented the infrared ear thermometer in 1984, the same year that Dr. Jacob Fraden, CEO of Advanced Monitors Corporation, invented the popular Thermoscan Human Ear Thermometer.


The History of Toilet Paper

H/T Backthennews.

People went temporarily insane in March 2020 hoarding toilet paper.

Early Solutions

Before toilet paper was invented, humans used a variety of natural materials to clean themselves. Depending on climate, resources, and hierarchical customs, different cultures used different items. Some of the more widespread options included stones, animal furs, leaves, moss, and sponges. Many cultures also simply sluiced themselves clean with water or snow. Using cloth was relatively rare, since at the time all cloth would have been handmade, making it an expensive item – only the richest of citizens would have used cloth for such a lowly purpose. The Romans used sponges on a stick in their latrines; however, it is unclear whether these devices were meant for cleaning the person or simply functioned as a toilet-cleaning brush. The Chinese used spatulas made from bamboo and other woods to clean up after themselves; various examples have been found in the latrines at Xuanquanzhi, a former Han Dynasty military base located along the Silk Road in China.

The Introduction of Paper

The Chinese invented paper in the second century B.C. However, it wasn’t until the 6th century that paper started to be used for personal cleaning. The practice caught on quickly, and by around the 14th century, toilet paper was being manufactured in large quantities. In the Western world, paper became available sometime around the 15th century, but commercially available toilet paper didn’t originate until much later. Up through the 1700s in America, people commonly used corn cobs, discarded magazines, and newspapers as makeshift toilet paper. The Sears Catalog was especially well-known for this use. In fact, nailing magazines and catalogs to the outhouse wall was so common that in 1919 the Farmer’s Almanac began pre-drilling a hole in its publication for easy hanging, as a nod to this practice.

The Invention of Toilet Paper

Modern, commercially available toilet paper originated in America in 1857 when Joseph Gayetty created a brand new product, which he called “Medicated Paper, for the Water-Closet.” He claimed the product helped to prevent hemorrhoids. It was first sold in packages of 500 sheets for 50 cents. Then in 1890, Clarence and E. Irvin Scott successfully popularized toilet paper on a roll. However, it was an uphill battle to get Americans to accept toilet paper, mostly due to cultural embarrassment. It wasn’t until the end of the 19th century, when homes with indoor plumbing gained popularity, that the product really hit its stride. Why? Toilet paper met the newfound need for a disposable solution that could be flushed away without damaging indoor plumbing systems.

Softness and Other Improvements

For a while, toilet paper was advertised as a medicinal product. It wasn’t until the 1930s that softness became a factor. In fact, it took until the third decade of the 20th century for toilet paper to finally be manufactured as “splinter free” – yikes! Credit for the innovation of softer toilet paper belongs to the Hoberg Paper Company of Wisconsin, which rolled out its softer toilet paper in 1928 and used elegant, ladylike packaging to appeal to skittish Americans. Due to this “charming” packaging, the brand came to be known as Charmin, and it remains one of the most popular toilet paper brands on the market today.

Shortages & Hoarding

We all remember the famous toilet paper shortage that started in March 2020 in response to coronavirus-fueled panic buying. But did you know that the 2020 toilet paper shortage was not America’s first? In December of 1973, Johnny Carson made a joke about a toilet paper shortage during his opening monologue on The Tonight Show. While it was meant to be humorous, Americans panicked and ran out to purchase as much toilet paper as they could. This reaction demonstrates just how integral toilet paper had become to Americans in the 20th century. And that still holds true today. In fact, America spends over $6 billion a year on toilet paper – more than any other nation – and the average American uses 50 pounds of the product each year!

The History of Hot Chocolate


There is an interesting hot chocolate recipe at the end of this story.

Plus How Our Chocolate Expert Makes His

If I were to ask you to describe the physical characteristics of chocolate, chances are you might think of a dark, shiny and brittle bar that slowly melts in the mouth. Perhaps you might immediately associate its rich flavor baked into a brownie or concealed within a creamy bonbon. You wouldn’t be wrong, of course, as chocolate has found its way into countless applications — a sweet shape shifter that pairs perfectly with our favorite flavors. That hasn’t always been the case.

For much of its history, chocolate wasn’t something we would eat out of hand or find in a dessert recipe.

The modern chocolate bar didn’t emerge until the mid-1800s, when technology and inventiveness converged. When Casparus van Houten developed the cocoa butter press in the 1820s, he was originally after the pressed solids — the cocoa butter (the fat that makes up over 50 percent of a cocoa bean) was merely a by-product. It would be many years before a chocolate maker (most likely the Fry family in England) would come up with the idea to add some of that extra cocoa butter back into ground cocoa beans and sugar.

An Aztec Woman Pouring Chocolate

At this point, chocolate began to resemble what we think of today, and its texture and flavor would evolve further as the industrial revolution continued in the decades to follow. Before that breakthrough? When one mentioned chocolate, they were really referring to a beverage. We can trace the history of chocolate back thousands of years to the Olmec, Mayan and Aztec cultures of present-day Mexico and Central America.

These early chocolate makers cultivated the cacao tree, ultimately rendering the seeds of its fruit (the bean) into a drink. What these cultures enjoyed, however, bore little resemblance to a package of Swiss Miss. For starters, it wasn’t served hot and was most likely unsweetened; instead, it was made with water, flavored with spices and flowers, then made frothy by repeatedly pouring it from one vessel into another.

The beans themselves were of great value and a significant staple crop, though most historians suggest that chocolate was enjoyed only by a few and was not necessarily part of the average person’s diet; it was used primarily for medicinal and ceremonial purposes. Most culinary applications – even savory mole – appeared much later. After the Spanish conquered the birthplace of chocolate in the 1500s, the drink would undergo further changes as it made its way to European drinkers.

The first to adapt the Aztec beverage were likely the missionaries tasked with “converting” the indigenous people. By the time chocolate took hold back in Spain, it would evolve into something recognizable today — served warm, sweetened and whipped to a froth using a wooden molinillo. It remained, however, a treat for nobility, as it slowly spread throughout Europe.

This growing taste for chocolate, which would become a beverage on par with tea or coffee, also led to its cultivation in European colonies in tropical zones throughout the world. For two centuries, its popularity surged but remained something not to eat, but to drink. 

When van Houten sought to remove cocoa butter from the preparation, his goal was to make a lighter beverage, with much of its fat removed — what many at the time referred to as digestible cocoa. Soon after, digestible cocoa became increasingly accessible to a wider audience, taken in the morning or in the afternoon as a pick-me-up.

Huylers Bean-to-Cup Chocolate Trade Card

Chocolate would also be touted for various health benefits and considered a gentler alternative to its cousin, coffee. As chocolate culture progressed, it did of course find its way into bar form (and then confections and baked goods) in the mid-1800s. By the turn of the 20th century, cocoa and chocolate were firmly embedded into our daily regimen.

My own research into chocolate history has led to some interesting discoveries: colorful Victorian-era cocoa tins decorated with imagery of cacao pods, and even references to “bean-to-cup,” foreshadowing the “bean-to-bar” term we now use more than a hundred years later. All of this digging into chocolate’s history has renewed my own interest in its drinkable form, and I’ve been studying both ancient recipes and its more familiar adaptations.

As the weather turns, I can’t think of a better way to warm up than with a frothy cup of hot chocolate, while quietly considering the complex journey this magical bean has made over the centuries. Below is my favorite modern recipe, inspired by Mexican-style chocolate prepared today: deep in chocolate flavor, with subtle accents of unrefined sugar, warm spices, and a touch of heat from dried smoked chile.

Hot Chocolate

Yield: 8 servings


  • 1 quart (950 grams) whole milk
  • ¼ cup (60 grams) heavy cream
  • 1 cup (200 grams) grated panela, piloncillo, or light brown sugar
  • ½ teaspoon (2 grams) salt
  • 2 sticks whole cinnamon
  • 2 pieces whole star anise
  • ½ teaspoon (2 grams) powdered chipotle morita (or to taste)
  • 1 vanilla bean, split and scraped
  • 7 ounces (200 grams) dark chocolate, roughly chopped


  1. Combine the milk, cream, sugar, salt, spices, and the vanilla bean in a medium saucepan and bring to a boil over medium heat. Reduce the heat to low and hold at a bare simmer, stirring occasionally, for five minutes.
  2. Whisk in the chopped chocolate and continue to simmer an additional five minutes. Remove the vanilla bean and the whole spices. Blend well with an immersion blender to create a froth and serve immediately.
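For a different number of servings, the gram quantities above scale linearly (the whole spices and vanilla bean can simply be adjusted by count and taste). As a minimal sketch, this hypothetical helper scales the recipe's base amounts, which are taken from the ingredient list above:

```python
# Hypothetical helper to scale the hot chocolate recipe's gram
# quantities to a different number of servings. Base amounts are
# the recipe's listed quantities for 8 servings; whole spices and
# the vanilla bean are left out since they scale by count, not grams.

BASE_SERVINGS = 8

BASE_GRAMS = {
    "whole milk": 950,
    "heavy cream": 60,
    "panela (grated)": 200,
    "salt": 2,
    "chipotle morita powder": 2,
    "dark chocolate": 200,
}

def scale_recipe(servings: int) -> dict:
    """Return gram amounts scaled linearly to the requested servings."""
    factor = servings / BASE_SERVINGS
    return {name: round(grams * factor, 1) for name, grams in BASE_GRAMS.items()}

# Example: half-batch for 4 servings.
print(scale_recipe(4))
```

The function names and ingredient keys here are illustrative only; the point is simply that a half batch uses 475 grams of milk and 100 grams of chocolate, and so on.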

The History of Hardware Tools


Have you ever wondered who invented a particular tool you are using?

Hopefully this story will answer your questions about tools.

Who Invented Wrenches, Gauges and Saws?


The History of Neon Lights


A look back through the sands of time at neon lights.

The 1920s through the 1960s are considered the Golden Age of Neon in America. During this time, businesses across the country lit up their storefront windows with the glow of brightly colored neon signs. A new technology at the time, neon lights were considered magical and mesmerizing. So, where did these glowing signs come from and how did they get so popular?

Initial Creation

Neon lights were created by a French engineer named Georges Claude. He demonstrated his invention for the first time at the Paris Motor Show in 1910. Claude had previously created a method for liquefying air in 1902, and neon was a by-product of the air liquefaction process he had developed. With the large quantities of neon gas produced at his air liquefaction business, he was able to create his neon lights.

Claude’s design was based on the Geissler tube, an electrified glass tube containing a rarefied gas, invented in 1855 by Heinrich Geissler, a German glassblower and physicist. To make his famous neon lights, Claude filled a glass tube with rarefied neon gas. Then he passed an electrical current through the gas, causing it to glow and emit a reddish-orange light.

Introduction of Different Colors

Because neon gas was the first type of gas that Claude used in his lights, the term “neon lights” was established. However, not all neon signs use neon gas. In fact, as his lights gained popularity, Claude began using different rarefied gases and fluorescent coatings in his tubes in order to create different colors. Some of the gases he used included:

  • Hydrogen to produce red light
  • Helium to produce yellow light
  • Carbon dioxide to produce white light
  • Mercury to produce blue light

Glass Bending

But how did these straight glass tubes of gas become the twisty, glowing signs we are all so familiar with? Once the tubes were filled with the appropriate gas or gases, the ends had to be sealed off with metal electrodes. Then the glass would be heated just enough to make it flexible. Handheld tools were then used to carefully bend the tubes into the desired shape. Usually, neon signs would spell out business names or basic logos. Many of them also proclaimed generic messages like, “Open!” or “Vacancy!”

Popularity in Paris and America

The first product to advertise using a neon sign was Cinzano in Paris in 1913. Then in 1919, the Paris Opera acquired its own neon sign. From there, the neon trend took off and Paris was aglow with neon lights. In 1923, neon lights were introduced to the United States. Claude’s company, Claude Neon, sold the first two neon signs in America to a Packard car dealership in Los Angeles. Soon after, neon signs exploded in popularity. Americans embraced neon lights in a way that no other country did. New York City’s Times Square was especially well known for using the new technology. Although neon lights were expensive, businesses considered them a novelty that caught consumers’ attention and helped them stay competitive. People would stop and stare at the glowing light, which was often referred to as “liquid fire.”

A Lingering Fascination

While neon lights eventually gave way to more efficient LED lighting, they still carry with them a certain fascination. Today, only about 18% of signs are neon lights, while 40% now use LEDs. However, neon lights are still visible in some storefronts and remain especially popular among businesses wishing to evoke nostalgia for the 1940s and 1950s in America.

The History of the Frisbee


Until now I never knew where the name Frisbee came from.

An Idea Is Born

Fred Morrison and his wife Lu often made a game of tossing upside-down cake pans between them on the beach. A stranger saw them playing and offered Fred a quarter for the cake pan. At the time, cake pans only cost five cents, so Morrison realized that there was a profit to be made. He joined up with a business partner, Warren Franscioni, and began to work on creating a plastic version.

Flying Discs & UFOs

Since UFOs were a hot topic at the time, Morrison modelled his first plastic disc after a flying saucer. By 1948, Morrison had created his first product, which he called the “Flyin’ Saucer.” By the 1950s, he had reworked the design again and renamed his toy the “Pluto Platter.” He chose space-themed names in hopes of capitalizing on the UFO craze happening after the 1947 sightings in Roswell, New Mexico.



Wham-O Buyout

A strong salesman, Morrison went around to fairs and shows demonstrating his flying discs to attendees, but it wasn’t until the late 1950s that the toy really took off. In 1955, the founders of the Wham-O toy company, Arthur “Spud” Melin and Richard Knerr, saw Morrison’s flying disc. In 1957, they purchased the rights to Morrison’s toy. Wham-O changed the name to “Frisbee” and began selling the rebranded toy in 1958. Sales skyrocketed, topping 100 million units before Mattel eventually bought out Wham-O.

What’s in a Name?

Where did Wham-O get the name Frisbee? It comes from an unlikely source! College students in New England used to toss around pie pans from the Frisbie Baking Company in Bridgeport, Connecticut – similar to the way Morrison and his wife used to toss a cake pan back and forth on the beach. At the time, the company stamped all their pie pans with the phrase, “Frisbie’s Pies.” Melin and Knerr had heard college students using the term “Frisbie” to refer to the pie pans and realized they could use a similar term for their flying discs. They changed the spelling to “Frisbee,” and the toy we know and love today was born!