A Brief History of the Rubber Band

H/T Gizmodo.com.

The rubber band has a very interesting history.

Cheap, reliable, and strong, the rubber band is one of the world’s most ubiquitous products. It holds papers together, keeps long hair from falling in a face, acts as a reminder around a wrist, is a playful weapon in a pinch, and provides a way to easily castrate baby male livestock… While rubber itself has been around for centuries, rubber bands were officially patented less than two centuries ago. Here now is a brief history of the humble, yet incredibly useful, rubber band.

It has only recently been discovered that Mesoamerican peoples (including the Aztecs, Olmecs, and Mayans) were making rubber (though they didn’t call it that) three thousand years ago. By mixing the milky-white sap known as latex from the indigenous Hevea brasiliensis trees (later called Para rubber trees) with juice from morning glory vines, they could create a solid that was surprisingly sturdy. These civilizations used this ancient rubber for a variety of purposes, from sandals to balls to jewelry. In fact, while Charles Goodyear is generally credited with the invention of vulcanized rubber (a more durable, non-sticky rubber compound created by adding sulfur and heat), it seems the Aztecs were already varying the proportions of latex and morning glory juice to produce rubbers of differing strengths.

When Spanish explorers arrived in South America in the 16th century, they discovered for themselves the many uses of this elastic, malleable sap. When the French explorer Charles de la Condamine “discovered” it in the 1740s, he called it “caoutchouc,” a French variation on the South American word for latex. In attempting to figure out what exactly it was, Condamine came to a wrong conclusion: he thought it was a condensed resinous oil. The name “rubber” was only attached to this latex material when, in 1770, the famed British chemist Joseph Priestley (who also discovered oxygen) noted that the material rubbed pencil marks right off paper, thereby inventing the eraser and giving the “rubbing material” a name. By the end of the 18th century, the material was forever known as “rubber.”

In 1819, Englishman Thomas Hancock was in the stagecoach business with his brothers when he began looking for better ways to keep his customers dry while traveling. He turned to rubber to develop elastic and waterproof suspenders, gloves, shoes, and socks. He was so enamored with the material that he began to mass produce it, but he soon realized he was generating massive amounts of wasted rubber in the process. So Hancock developed his “Pickling machine” (later called a masticator) to rip the leftover rubber into shreds. He then mashed the malleable rubber together, creating a new solid mass, and put it into molds to form whatever he wanted. Among his first designs were bands made out of rubber, though he never marketed or sold them, not realizing the practicality of rubber bands. Plus, vulcanization hadn’t been discovered yet (which we will discuss in a moment), so the bands would soften considerably on hot days and harden on cold days. In short, these rubber bands simply weren’t yet practical for many of the uses they would later serve. Hancock didn’t patent his machine or the shreds of rubber it produced, instead hoping to keep the manufacturing process completely secret. This would end up being a rather large mistake.

In 1833, while in jail for failure to pay debts, Charles Goodyear began experimenting with India rubber. Within a few years, and after he got out of jail, Goodyear discovered his vulcanization process. Teaming with chemist Nathaniel Hayward, who had been experimenting with mixing rubber with sulfur, Goodyear developed a process of combining rubber with a certain amount of sulfur and heating it to a certain point; the resulting material became hard, elastic, non-sticky, and relatively strong. A few years later, in 1844, he had perfected his process and was taking out patents in America for this process of vulcanization of rubber. He then traveled to England to patent his process overseas, but ran into a fairly large problem – Thomas Hancock had already patented a nearly identical process in 1843.

There seem to be conflicting reports on whether Hancock developed the vulcanization process independently of Goodyear or whether, as many claim, he had acquired a sample of Goodyear’s vulcanized rubber and developed a slight variation on the process. Either way, Hancock’s patent stopped Goodyear from being able to patent his process in England. The ensuing patent battle dragged on for about a decade, with Goodyear eventually coming to England and watching in person as a judge proclaimed that, even if Hancock had acquired a sample prior to developing his own process for this type of rubber, as seems to have been the case, there was no way he could have figured out how to reproduce it simply by examining it. However, famed English inventor Alexander Parkes claimed that Hancock had once told him that running a series of experiments on the samples from Goodyear had allowed him to deduce Goodyear’s, at the time, unpatented vulcanization process.

But in the end, in the 1850s the courts sided with Hancock and granted him the patent, rather than Goodyear, quite literally costing Goodyear a fortune; had they decided otherwise, Goodyear would have been entitled to significant royalties from Thomas Hancock and fellow rubber pioneer Stephen Moulton.

Though he had a right to be bitter over the ruling, Goodyear chose to look at it thusly, “In reflecting upon the past, as relates to these branches of industry, the writer is not disposed to repine, and say that he has planted, and others have gathered the fruits. The advantages of a career in life should not be estimated exclusively by the standard of dollars and cents, as is too often done. Man has just cause for regret when he sows and no one reaps.”

Goodyear, though eventually receiving the credit he deserved, died in 1860 shortly after collapsing upon learning of his daughter’s death, leaving his family approximately two hundred thousand dollars in debt (about $5 million today).

The patent dispute with Goodyear also had a profound, ultimately negative, effect on Hancock. While he was entangled in the time-consuming mess for years, others began to reap the benefits of Hancock not having patented his masticator process or the seemingly useless bands it created. Specifically, in 1845, Stephen Perry, working for Messrs. Perry and Co., Rubber Manufacturers of London, filed a patent for “Improvements in Springs to be applied to Girths, Belts, and Bandages, and Improvements in the Manufacture of Elastic Bands.” He had discovered a use for those rubber bands – holding papers together. In the patent itself, Perry distances himself and his invention from the ongoing vulcanized rubber dispute by saying,

“We make no claim to the preparation of the india rubber herein mentioned, our invention consisting of springs of such preparation of india rubber applied to the articles herein mentioned, and also of the peculiar forms of elastic bands made from such manufacture of india rubber.”

While the rubber band was invented and patented in the 19th century, at this point it was mostly used in factories and warehouses, rather than in the common household. That changed thanks to William Spencer of Alliance, Ohio. The story goes, according to the Cincinnati Examiner, that in 1923, Spencer noticed the pages of the Akron Beacon Journal, his local newspaper, were constantly being blown across his and his neighbors’ lawns. So he came up with a solution. As an employee of the Pennsylvania Railroad, he knew where to acquire spare rubber pieces and discarded inner tubes: the Goodyear Rubber Company, also located in Akron. He cut these pieces into circular strips and began to wrap the newspapers with these bands. They worked so well that the Akron Beacon Journal bought Spencer’s rubber bands to do the job themselves. He then proceeded to sell his rubber bands to office supply, paper goods, and twine stores across the region, all the while continuing to work at the Pennsylvania Railroad (for more than a decade more) while he built up his business.

Spencer also opened the first rubber band factory in Alliance and then, in 1944, a second one in Hot Springs, Arkansas. In 1957, he designed and patented the Alliance rubber band, which ultimately set the world rubber band standard. Today, Alliance Rubber is the number one rubber band manufacturer in the world, churning out more than 14 million pounds of rubber bands per year.

So, next time you are shooting a friend with this little elastic device, you can thank the Mayans, Charles de la Condamine, Thomas Hancock, Charles Goodyear, and William Spencer for the simple, yet amazingly useful rubber band.

A Brief History of the Sewing Needle

H/T sfneedleworkanddesign.org

I did not know that needles dated back 60,000 years.



The needles we use here at SNAD are made of steel, copper, and a thin layer of gold or silver to avoid rust or corrosion. The modern embroidery needle, made of combinations of different metals, however, is just the most recent in a long (and we mean really long) history of needle development.

The oldest needle we know of dates back around 60,000 years: a human-made needle of animal bone (most likely bird) found in South Africa. Other needles made of bone and ivory have been discovered in Slovenia, Russia, and Liaoning, China, dating to between 45,000 and 30,000 years ago. The first needle with an eyelet dates to around 25,000 years ago.

Although these artifacts originated in varying climates and cultures, they point to a time when modern humans were diverging from their evolutionary ancestors. Armenian copper needles, for example, which date to around 7,000 BC, mark the harnessing of metal, a major development in human technology. Early sewing needles, on the other hand, were crucial to the survival of the human species, helping early humans construct more fitted clothing of animal furs and skins to protect themselves from the elements during the most recent ice age.

The use of needles in the arts, which evolved from the more practical need to sew, has a more contested beginning. The earliest known example of embroidery was found in Russia, dating to around 30,000 years ago. However, it is widely accepted that embroidery first developed in South/Central Asia and the Middle East. Written records from China during the Warring States Period, around 220 BC, describe the practice of ‘making decorations with a needle’, or iuhua/zhahu, as an ancient tradition. The earliest existing example of Chinese silk embroidery comes from a tomb in Mashan in Hubei Province, dating to around the 4th century BC, though physical evidence of embroidery in China dates back centuries earlier.

Nowadays, we can order needles online from companies all over the world, specializing in styles or particular needle content (our steel, copper, silver and gold needles are highly regulated in composition). Our access to the tools that fit our precise needs allows us to focus on our craft, and continue to improve the products we make.


A Brief History of Gummy Bears

H/T Bonappetit.

Gummy Bears: either you like them or you hate them.

From actual bears to every kid’s favorite Halloween get, the sweet, chewy history of the gummy bear.

Whether you call them “gummy” or “gummi,” whether you prefer bears or worms, whether your loyalty lies with Haribo or Black Forest, there’s no denying that the gelatinous, rainbow-colored candies most of us first came to know and love simply as “gummy bears” are one of the world’s most popular confections. Sure, chocolate bars (and the many variations thereof) remain the top-selling treats across the globe, but how many cocoa-based snacks have inspired a hit animated TV series in the 1980s (Disney’s The Adventures of the Gummi Bears) and a song with over 45 million hits on YouTube (“The Gummy Bear Song”), or played a pivotal role in the plot of an award-winning Broadway musical about a transgender East German rock singer (Hedwig and the Angry Inch)?

Indeed, “gummies” (for lack of a better all-encompassing term defining the vast array of available adaptations on the original bear) surely have one of the most devoted followings of any candy in history; to know a gummy lover is to recognize both the gleam of greedy, fiendish glee that will appear in his or her eyes whenever some new form of gummy is discovered (“Oh my god, this store has gummy Smurfs!”) and the inner peace that can only be gained from a generous portion of an old favorite. (“There, there–eat your bag of Gold-Bears and you’ll feel much better.”) Yes, it’s a rabid and ever-expanding fan base; in fact, according to Haribo, if you laid all the Gold-Bears produced in a year head to toe, they would form a jiggly, tooth-decaying ring around the earth four times.

And to think it all started with a poor German factory worker, a bag of sugar, and a dream.

“Dancing Bears” and Wartime Economy

In 1920, Hans Riegel of Bonn, Germany, became frustrated with his dead-end job as a confectionary worker and started his own sweets company, making hard, colorless candies using a copper kettle and marble slab in his kitchen. His bicycle-riding wife was the sole delivery person. The name of his new business combined the first two letters of his first name, last name, and hometown: Hans Riegel of Bonn = Haribo.

The hard candies sold fairly well at local street fairs, but not as well as Riegel had hoped. Then, after a couple of years, Riegel hit upon what would prove to be a genius idea: He produced a line of soft, gelatin-based, fruit-flavored treats in the shape of dancing bears (then a popular diversion at festivals in Europe). But while Riegel is often credited as the inventor of gummy candy, he actually just improved upon an already successful, centuries-old, formula.

“Gummy candies descend from Turkish delight and even Japanese rice candy,” says candy historian Beth Kimmerle, author of Candy: A Sweet History. “But both of those are typically made with rice or corn starch versus gelatin.”

And when your kids plead that you ought to let them eat gummy candies instead of the rest of their pickled vegetables because both are a kind of nutrition, well, they’ve actually got a point, historically speaking.

“Cooking sugar along with fruit has long been a way to preserve or store summer’s harvest,” Kimmerle says. “So technically gummy candies are also cousins of jams and jellies.”

As for what was happening on the gummy-candy timeline, when Riegel went into business, “one of the more popular pre-gummy-bear gummies at the time would have been wine gums,” Kimmerle says.

Gelatin-based chews originating in Great Britain in 1909, wine gums (which contain no alcohol, despite their name), like generic gumdrops, Jujubes (1920) and Chuckles (1921), predate Riegel’s dancing bears. However, starch-based Jujubes and pectin-based Chuckles lacked the precisely satisfying chewy texture of Riegel’s sweet creatures, and none of these candies offered the same brand of zoo-animal whimsy. As one might expect, the Tanzbären (“dancing bears”) were an instant hit with local tots; by the start of World War II, the future candy superpower had over 400 employees producing ten tons of candy each day.

However, also as one might expect, Haribo took quite a hit during the war: Hans Riegel died in 1945, and his two sons, Hans Jr. and Paul, were taken as prisoners of the Allied forces. By the time Paul and Hans were released, there were only about 30 employees working at the company.

But the sons didn’t let that discourage them from rebuilding their late father’s empire: Within five years’ time, Haribo had 1,000 workers in its employ, with Paul overseeing production and Hans Jr. at the helm as CEO, focusing on sales and marketing (the slogan “Kids and grown-ups love it so, the happy world of Haribo!” was his brainchild).


Trolli’s Gummy Worms

“Rubber Bears” and World Domination

Popular as Haribo’s fun, fruity teddies had become in the years following the war, the bears themselves were still in the process of becoming the iconic animals we scarf down by the handful today. At first, the bears were taller and slimmer, looking more like, well, actual critters you might see in the wild (or dancing at a festival). It wasn’t until 1960, when Hans and Paul began mass marketing the bears for a broader European market, that Haribo started producing the squatter, smushier, ostensibly more kid-friendly Gummibärchen (“little gummy bears”). In 1975, Haribo trademarked the term “Goldbären” globally. (The name literally means “gold bears.”)

And just in time, too. Thanks to German-language teachers in U.S. high schools dispensing gummy bears in classrooms so their students could sample foreign cuisines, and American servicemen bringing gummy souvenirs from overseas for their families, the demand for Gold-Bears in this country was growing. Naturally, professional sugar pushers looking to create a similar cash cow (or bear, as it were) had started making their own versions of Haribo’s best-selling item: The American Jelly Belly Company (previously The Herman Goelitz Company) came out with a gummy bear in 1981, the same year Trolli launched gummy worms. In 1982, Haribo, which had been selling Gold-Bears through U.S. distributors, astutely decided it was time to open up its first American office and staked its claim in Baltimore (the branch is still in operation today).

So began the now decades-long debate over which was the superior Gold-Bear: German or American? (Not to mention the many debates over the merits of Gold-Bears versus Black Forest, Heide, Jelly Belly, and the countless other competitors who would crop up over the years.) Many insist to this day that the German version is better, with more “real fruit” taste, a chewier consistency, and one extra type of bear (apple! The rest are raspberry, orange, lemon, pineapple, and strawberry, in case you were wondering). Of course, there are also those rare gummy fans who prefer Trolli’s mouth-puckering, neon-colored worms or even the more subdued, somewhat unidentifiable flavors of Black Forest, for example. Perhaps this wide range of public opinion and appetite is the reason why no legal disputes over the origin or image of the gummy bear have been recorded, save one between Haribo and the chocolate company Lindt over the latter’s lookalike gold-foil-wrapped chocolate bear (Haribo won in 2012).

Still, as the confection’s creator (with a closely guarded secret recipe), Haribo has secured its status as one of the leading gummy manufacturers in the world – currently, they produce 100 million Gold-Bears every single day.

Haribo Gummy Worms

“I believe Haribo’s popularity has to do with their texture and flavors,” Kimmerle says. “In my opinion, Haribo gummy candies are more firm and the flavors more…sophisticated. Haribo’s bears pack more nuanced flavor, and the bite is better; it makes them more satisfying. The others offer sort of indistinguishable flavors and are mostly just sweet. Shape isn’t everything. Any company can make a starch-molded gummy bear, but Haribo’s has bold flavor and offers a good chew.”

Which is not to say that Haribo stopped at bears. Among the top ten most popular gummy products sold by the company are Happy Cola gummy cola bottles, gummy frogs, gummy raspberries, and gummy peaches (not quite as popular? Haribo’s “A… Mit Ohren,” or “Bums With Ears”). Hans Riegel Jr. ran the company until his death in 2013, when two nephews took over, ensuring that Haribo would remain a family-run business (Hans Riegel Jr. famously turned down a huge buyout offer from Warren Buffett in 2008, claiming, “Money was never my motivation. I don’t even know when I made my first million.”).

Gummies Take Hollywood

Even Gold-Bear purists, however, have to marvel at the humble gummy’s place in pop culture. From gummy fast food (pizzas and hamburgers and French fries and even sushi) to gummy body parts to vegan, gelatin-free sour gummy worms to gummy vitamins, it seems there are fewer things that haven’t been gummified than have. And those are just the edible results of our nearly century-long obsession with gummy treats. They’ve made an almost equally sizable mark on entertainment, furnishings, jewelry, clothes, toys…the list is endless. According to rumor, the aforementioned Disney series The Adventures of the Gummi Bears came about because former CEO Michael Eisner’s young son had an affinity for the treats; in Hedwig, the title character’s first encounter with gummy bears from the U.S. (sweeter, less complicated) symbolizes the American Dream for a young man trapped behind the Berlin Wall. Then there’s the sly, almost subliminal plug for gummy bears hidden in the classic 1986 flick Ferris Bueller’s Day Off.

“My sister-in-law, Polly Noonan, is the actress who offers Principal Rooney a gummy bear at the end of the movie,” Kimmerle says. “Her line (‘Gummy Bear? They’ve been in my pocket. They’re real warm and soft.’) sealed the cultural fate of the bears for most Gen Xers. They’ve only become more iconic since.”

But when it comes down to it, it’s not really about the movies, the cartoons, or even the taste or chew, Kimmerle says. It’s that gummy bears (of all brands) are just so adorable: “I do think the bears have become so significant because they are anthropomorphic. They are so easy to personify and, well, love. No other candy is a cute, mini creature quite like a gummy bear.”

To be sure, gummy bears are the only candy charming enough to propel a tune as absurd as “The Gummy Bear Song” to chart-topping status (sample lyrics: “Oh, I’m a yummy, tummy, funny, lucky gummy bear/I’m a jelly bear, cuz I’m a gummy bear”).

But perhaps it’s actually the theme song from The Adventures of the Gummi Bears that says it best:

Gummi Bears
Bouncing here and there and everywhere
High adventure that’s beyond compare
They are the Gummi Bears!



A Brief History of Swimming Goggles

H/T Simply Swim.com.

I did not know the history of swim goggles went back that far in history.

Swimming goggles have been the swimmer’s blessing and curse for as long as we can remember. Swimming without them can mean sore eyes; swimming with them can mean clouded vision, constant stops to adjust the headband, and goggle eyes when finished (or maybe that’s just me). They seem like such a modern invention in all their plastic glory, so how modern exactly are they?


It seems unfitting to describe the first goggles as ‘primitive’ when they represent the ingenuity of humanity in problem solving. Finding something both waterproof and transparent in the Middle Ages was a challenge. The first known goggles used for swimming and diving were invented in Persia in the 14th century. Originally made from polished tortoise shells (hence the transparency) and used by pearl divers, they remained popular for two centuries in the Middle East. There is evidence to suggest that they were imported into Europe, but they did not gain popularity there.


Polynesian skin divers developed wooden goggles with deep frames that used trapped air to maintain visibility. These goggles were limited in that they could only be used in a downwards position to protect the eyes from sea water (otherwise the trapped air escaped). When European explorers brought glass to Polynesia it was incorporated into the design. However, the resulting goggles were not fully waterproof, and they were of no value competitively as the lenses were not secure and fell out when turning or diving.


By the early twentieth century the production of goggles moved on in leaps and bounds, in part due to the use of goggles in other industries and the need for improvements to them there. This is best exemplified by the use of goggles in swimming the English Channel from 1911 onwards. Thomas ‘Bill’ Burgess is credited with being the first person to use goggles to cross the Channel, and whilst this is strictly true, he did not actually wear swimming goggles. Instead he used motorcycle goggles; these worked well as he was swimming breaststroke, but they were not fully waterproof and are indicative of how goggles were evolving. Rather than being designed for use in the water, they were designed for pilots or drivers, and the technology was then imported (sometimes unsatisfactorily) for swimmers. In 1916 a patent was granted for the production of swimming goggles, but there is no evidence that they were ever manufactured. As such, when Gertrude Ederle made her Channel crossing in 1926 – the first person ever to do so using front crawl – she too used motorcycle goggles, but sealed them with paraffin to ensure they were watertight. In spite of the continued popularity of swimming and the increasing use of front crawl, waterproof goggles designed for swimming were still a long way off.


There were no advancements in swimming goggles during the 1930s. Nevertheless, in 1940 the American magazine Popular Science published instructions showing how to produce a pair of wooden goggles. Maybe unsurprisingly, these didn’t catch on. During the 1950s, open water swimmers including Florence Chadwick used rubber goggles with double lenses. These goggles were large and a bit ungainly, but they protected the swimmer from saltwater and improved visibility in the sea.


The sixties brought us sex, drugs, rock and roll, and swimming goggles. Individual swimmers started to create their own goggles from cups and elastic during the decade, but the times they were a-changin’, and manufacturers were becoming aware of a gap in the market. First advertised in Slimming World Magazine in 1968, these early manufactured goggles were marketed as an aid for swimming training. Available in only one size (and an uncomfortable one at that) and disqualified from use in competitions, they did not bring goggles to a mass market. But that would soon change; the need was there even if there wasn’t yet a product that fully met it. 1969 saw the revolutionising of swimming goggles as Tony Godfrey began manufacturing the ‘Godfrey Goggle’. Having tested several plastics, Godfrey settled on polycarbonate for his designs, as it is light, thin, very hard wearing and shatter resistant. It had not previously been used in sportswear, but following Godfrey’s foresight it is now widely used.


In 1972, Scotland’s David Wilkie became the first Olympian to compete wearing both goggles and a swimming hat. It was a courageous decision that could have backfired; instead he achieved a personal best and a bronze medal, and with his glory came public desire for swimming goggles. Godfrey Goggles were pirated, copied and recreated in a race to meet public demand, and other goggle manufacturers were accused of ‘borrowing’ Godfrey’s work. Since then goggles have become standard equipment for all swimmers, with improvements to hydrodynamics, UV protection and anti-fog coatings, amongst other things, helping swimmers to go faster for longer in the pool.


It is staggering to think how quickly the production of goggles has increased and how advanced designs and materials are becoming. From the early days, when foam-backed oval cylinders of plastic were all that was on offer, there is now a dizzying selection of shapes, colours and styles, all developed for different uses and different swimmers. In the last 40 years, improvements have taken goggles from a funny-looking swimming accessory to the second necessity in swimming after the swimsuit. It will be interesting to see where research and design lead us next. Maybe in 40 years’ time we too will be dumbfounded by something simple that changes the way we swim, just as goggles have.

A Brief History of the Cheez-It

H/T Smithsonian Magazine.

I did not know the Cheez-It cracker had been around for 100 years.

America’s iconic orange cracker turns 100 this year

Cheez-It’s 11-month shelf life is impressive, but so is the company’s history. (Kristoffer Tripplaar/Alamy)

Dayton’s historic Edgemont neighborhood is cocooned inside a crook in the Great Miami River, a winding waterway that snakes through the heart of southwest Ohio. Two miles from downtown, with its air of industry, the community hearkens back to a time when Dayton was hailed “The City of A Thousand Factories.”


In the early 20th century, inside a bygone factory on the corner of Concord and Cincinnati Streets, the Green & Green cracker company cooked up its Edgemont product line, a collection of grahams, crackers and gingersnaps that were shipped across the region. But of the company’s four Edgemont products, only one, a flaky one-by-one-inch cheese cracker, would revolutionize snack time. On May 23, 1921, when Green & Green decided to trademark the tasty treat’s unique name, the Cheez-It was born.

“In 1921, Cheez-It didn’t mean anything, so Green & Green marketed the cracker as a ‘baked rarebit,’ ” says Brady Kress, president & CEO of Dayton’s Carillon Historical Park, a nationally recognized open-air museum centered on the city’s history of innovation. (Inside Carillon Brewing Company, a fully operating 1850s brewery at the park, costumed interpreters still bake crackers over an open hearth.) “People were familiar with rarebit, a sort of melted cheddar beer cheese spread over toast. Cheez-It offered the same great taste, only baked down into a cracker that will last.”

Cheez-It’s 11-month shelf life is impressive, but so is the company’s history. This month, America’s iconic orange cracker turns 100. But the Cheez-It story stretches even further back than that.

The popular online food marketplace Goldbelly offered a limited-edition Cheez-Itennial Cake for a few days this week to celebrate the anniversary. (Kellogg)

In 1841, Dr. William W. Wolf moved to Dayton to practice homeopathy, a branch of alternative medicine that believes in the healing power of food. Hailed as Dayton’s “Cracker King,” Wolf concocted the Wolf Cracker, a curious hard-butter snack made for medicinal purposes.

“In the 19th century, crackers were linked to Christian physiology and sectarian medical practitioners,” says Lisa Haushofer, a senior research associate at the University of Zurich’s Institute for Biomedical Ethics and History of Medicine. “Christian physiologists like Sylvester Graham, of Graham Cracker fame, were concerned about a modern diet that contained too many stimulating substances.” (In addition to being a cracker evangelist, Graham was also a pro-temperance Presbyterian minister who preached a vegetarian diet). Wolf echoed Graham’s concerns that food was far too rousing (though Graham also dubiously believed his crackers could cure licentiousness), so he launched the Wolf Cracker Bakery to churn out his wholesome snacks.

“They believed there was too much nourishment per food unit in modern bread, too much excitement,” says Haushofer. “So they recommended grain products made from coarse flour, which, they believed, contained a more natural ratio of nourishing and non-nourishing parts. Crackers were considered health food.”

According to Haushofer, homeopaths at the time were also concerned about digestibility, and since they believed heating food aided digestion, baked Wolf Crackers were just what the doctor ordered. But Wolf’s patients weren’t the only ones after his crackers. What started as a medical remedy soon became a sought-after treat.

In the 1870s, while living on the barren plains of North Dakota, Dayton natives J.W. and Weston Green often longed for a taste of home. “In those days food supplies were both expensive and scarce in that region,” wrote the Dayton Journal Herald in its October 31, 1907, edition, “and the father and son regularly sent back to their old home city, Dayton[,] for those necessities that could not be obtained there. ‘Invariably,’ Mr. Green says, ‘we would include in that order a good supply of … the ‘Wolfe Cracker’ [sic].”

J.W. Green never forgot the savory, buttery, nut-like flavor of Wolf Crackers. In 1897, when Wolf died, Green purchased the Wolf Bakery Company, then enlisted his son, Weston Green, to join him in business. The Greens renamed the enterprise Green & Green Company, and while Wolf’s recipe remained the same, they rebranded the doctor’s famous treat as the “Dayton Cracker.”

By the turn of the 20th century, Dayton held more patents, per capita, than any U.S. city; surrounded by this innovative environment, Green & Green flourished, expanding its operations to nearby Springfield and Lima, and delivering baked goods across southwest Ohio. But soon, the company’s crackers became more than a regional concern. During World War I, Green & Green fired up its ovens for the war effort.

“All our facilities but one little oven that can’t be used for Hard Bread will be speeded up to keep two car loads a day going by express,” read a Green & Green ad in the Dayton Daily News’s July 14, 1918, edition … “that OUR BOYS at the front may have their Fighting Bread.”

Though far less tasty than the Dayton Cracker, Dayton’s Fighting Bread sustained countless soldiers during the Great War. Typically made from salt, flour and water, Hard Bread—also known as hardtack, teeth dullers or jawbreakers—was often soaked in water before being served. If stored improperly, weevils and maggots made Hard Bread their home, prompting soldiers to dub the wartime ration “worm castles.”

“We are mighty glad and proud to be a cog in the big machine that will win the war,” read Green & Green’s ad. However, Doughboys weren’t the only ones helping win the war. “P.S. We could still use a few more women in the packing of Hard Bread.”


After World War I, Green & Green Company sidelined Hard Bread in favor of more flavorful fare. By Armistice Day, the Dayton Cracker (still made with Wolf’s original recipe) had been baked in Dayton for nearly 80 years. But while the hard butter cracker was a local treasure, customers yearned for a delicate, flakier treat. Soon, Green & Green launched its Edgemont line, and in 1921, unveiled the “baked rarebit,” known as the Cheez-It.

“Welsh Rarebit, at its most basic form, is essentially a cheese sauce spread on toast,” says Rachael Spears, a living history specialist at Dayton’s Carillon Historical Park. “Some 19th-century English recipes specifically call for cheddar cheese. To this day, Cheez-It still advertises 100 percent real cheese, which draws a connection to its rarebit roots.”

But in 1921, Americans needed more than a novel snack. Following the Great War, the global economy dipped, and American wallets were increasingly thin. “Rarebit is a lesson in frugality,” says Kress. “It’s a nutritious dish that doesn’t cost a lot of money. When it’s baked down into a Cheez-It, it becomes a tasty treat. And just like hardtack, if you store it correctly, it will stay for a very long time. You don’t run the risk of it growing weevils.”

On May 23, 1921, when Green & Green decided to trademark the tasty treat’s unique name, the Cheez-It was born.

In 1915, one pound of Green & Green crackers sold for 10 cents, roughly $2.65 in 2021 dollars. “When Uncle Sam picked men for his army overseas,” read a June 1920 Green & Green ad, “he also picked foods that would keep those picked men robust and healthy—fit for the strenuous duties ahead of them. Just as the crackers for our soldiers kept sweet and fresh in tins, so Edgemont Crackers … keep crisp and creamy in the Family Tin. Ask mother to keep a tin in her pantry.”

Cheez-Its kept Americans fed during the post-war recession, throughout the Roaring Twenties, and at the onset of the Great Depression. But by 1932, Green & Green packed up its last Family Tin and sold the business to Kansas City’s Loose-Wiles Biscuit Company.

In 1947, the Loose-Wiles Biscuit Company became the Sunshine Biscuit Company; in 1996, Keebler acquired Sunshine; and in 2001, Kellogg acquired Keebler.


In this photo from the 1930s, workers at the Sunshine Biscuit Co. in Dayton fill Cheez-It boxes. (From the collections of Dayton History)

“The Cheez-It name has accompanied the baked cracker since its creation in 1921,” says Jeff Delonis, senior director of marketing for Cheez-It. “The original Cheez-It packaging was green and white. In the 1930s, red was introduced into the brand logo, and by the 1940s, the box included the iconic red and yellow-orange colors that remain today. The general shape and look of the cracker has largely stayed the same.”

Cheez-Its may still look the same, but the cracker’s production has soared. Once baked on the corner of Concord and Cincinnati Streets in Dayton’s Edgemont neighborhood, then shipped to regional grocers, Cheez-It sold more than 400 million packages in the U.S. alone last year.

“It’s super fun to think about all the cities around the country that were producing foods for local audiences,” says Kress. “Every city had them. Here’s an idea that came out of Dayton, Ohio.”

But “baked rarebit,” once the common term for an obscure cracker, has since faded, replaced by the now-ubiquitous name: Cheez-It.

“When you bake a cracker, you roll the dough out thin, kind of like a pie crust,” says Spears. “But at the heart, it’s like a thin, crispy biscuit. When you bite into a Cheez-It, you get those nice layers. Those are the layers that form if you cook it a bit.”

Like the Cheez-It itself, we need only bite into the snack’s history to uncover countless compelling layers.

“Cheez-It is a survivor from a bygone time,” says Kress.



A Brief History of Ketchup and Mustard

H/T  Mental Floss.

Around 300 BCE, people in China were experimenting with making pungent pastes out of fermented fish guts. A few centuries later, the Roman naturalist Pliny shared a method to treat scorpion stings using the ground-up seeds of a common plant. These are the unlikely origin stories of ketchup and mustard, two condiments that people in the United States spend over $1 billion on annually. How did two condiments with thousands of years of history between them become associated with hot dogs and hamburgers?


Mustard has been around for a while—in fact, the plant the condiment comes from may have been among the first crops ever cultivated.

There are multiple species of mustard—most are members of the Brassica or Sinapis genera—and the plant (which is closely related to broccoli and cabbage) and its seeds first appear in the archaeological record in China around 6800 years ago. Before they became a condiment, the seeds harvested from the plant were used as a spice and a medicine; Indian and Sumerian texts from around 2000 BCE mention them in this context.

The paste-like form of mustard showed up roughly 2500 years ago. The Greeks and Romans blended ground-up mustard seeds with unfermented grape juice, or must, to make a smooth mixture. The first version of this concoction wasn’t necessarily food—it may have been used more for its medicinal properties, and not completely without reason: Mustard seeds are rich in compounds called glucosinolates, and when these particles get broken down, they produce isothiocyanates, powerful antioxidants that fight inflammation and give mustard its nose-tingling kick.

The Greeks and Romans applied mustard’s medicinal properties to almost every ailment imaginable—Hippocrates even praised its ability to soothe aches and pains. Many of mustard’s historical uses don’t hold up to modern science—for instance, it’s not a cure for epilepsy, as the Romans once believed—but it’s still used as a holistic treatment for arthritis, back pain, and even sore throats.

While experimenting with mustard as medicine, the Greeks and Romans discovered that pulverized mustard seeds were pretty tasty. In the first century CE, Roman agriculture writer Lucius Junius Moderatus Columella published the first recorded recipe for mustard as a condiment in his tome De Re Rustica. It called for an acid and ground mustard seeds—the same basic formula that’s used to make mustard today.


Meanwhile, the evolution of another popular condiment was underway halfway across the world.

Ketchup first appeared in China around 300 BCE. In the Amoy dialect of Chinese, kôe-chiap means “the brine of pickled fish,” according to the Oxford English Dictionary. Nineteenth century ethnologist Terrien de Lacouperie thought the word might have come from a Chinese community living outside of China. In any case, the name is pretty much the only thing that version of ketchup had in common with the bottle of red stuff in your fridge. It was actually much more like garum, a Mediterranean fish sauce that was once wildly popular in Ancient Roman cuisine. (Modern versions of garum can actually be found today in high-end restaurants like Denmark’s Noma.) Some have even suggested that Asian fish sauce is a descendant of garum.

The Chinese fish sauce known as ketchup was likely made by fermenting ingredients like fish entrails, soybeans, and meat byproducts. Fermentation creates byproducts that can be of great interest to human beings. One such byproduct is the ethanol that gives us beer and wine through alcohol fermentation. Another is monosodium glutamate, also known as MSG. A lot of theories fly around about MSG, but it’s worth pointing out that glutamates appear naturally in all sorts of foods, from tomatoes to beef to parmesan cheese. Our own bodies produce glutamates. And MSG can give foods a savory, hard-to-define flavor called umami.

The fish paste that was created by fermentation possessed this umami, and was used to add a salty, savory depth of flavor to a variety of dishes. And because fermentation can breed so-called “good” microorganisms while inhibiting the growth of the bad bacteria that cause foods to rot, this version of ketchup could be stored on ships for months without spoiling, an important factor at a time when trade routes could take months to traverse.

As ketchup spread to different parts of the globe, it went through a few transformations. Trade routes carried it to Indonesia and the Philippines, and it was likely around this part of the world that British traders discovered and fell in love with the funky seasoning. And as soon as ketchup landed in Great Britain in the early 1700s, Western cooks found ways to make it their own. One of the first English recipes for ketchup, published in Eliza Smith’s 1727 book The Compleat Housewife, calls for anchovies, shallots, ginger, cloves, and horseradish.

Some recipes used oysters as the seafood component, while others cut the fish out of the fish sauce completely. Popular bases for ketchup around this time included peaches, plums, celery seed, mushrooms, nuts, lemon, and beer. Like their predecessor, these sauces were often salty, flavorful, and had a long shelf-life, but beyond that, they could vary greatly. The word ketchup evolved into a catch-all term for any spiced condiment served with a meal—”spiced” referring to ingredients like cinnamon or nutmeg rather than heat level. Walnut is said to have been Jane Austen’s preferred ketchup variety.


Mustard received its own makeover when it was imported to different parts of Europe. The Romans invaded the land now known as France in the 1st century BCE, and the mustard seeds they brought with them thrived in the region’s fertile soil. Locals, including the monks living in the French countryside, loved the new condiment, and by the 9th century, monasteries had turned mustard production into a major source of income.

Mustard found its way into less humble settings as well. Pope John XXII was said to be such a fan that he appointed a Grand Moutardier du Pape, or “Grand Mustard-Maker to the Pope.” John XXII was one of the Avignon popes, who lived in what is now France rather than Rome, and he created the mustard-making position specially for his unemployed nephew who lived in Dijon, which was already the mustard capital of France by the 14th century.

Even French royalty developed a taste for mustard. King Louis XI made it an essential part of his diet, going so far as to travel with a personal pot of the sauce so he’d never have to eat a meal without it.


There are many types of mustard—yellow, spicy brown, English, Chinese, and German, to name a few. But to some condiment connoisseurs, mustard is still synonymous with the creamy Dijon variety that first took hold of France centuries ago.

In 1634, it was declared that true French mustard could only be made in Dijon. The recipe was an important part of French cuisine, but as one innovator proved, there was still room left for improvement.

Dijon native Jean Naigeon tinkered with the formula in 1752, swapping the traditional vinegar with verjuice, or the sour juice of unripened grapes. The simple change gave dijon the smooth taste and creamy texture that’s associated with the product today. Most modern dijon uses white wine or wine vinegar to imitate that original verjuice flavor. And most of it isn’t made in Dijon. Unlike champagne or Parmigiano-Reggiano, which must come from the regions that lend their names to the products, dijon no longer enjoys “protected designation of origin” status.

The dijon you’re most likely to find in your local supermarket is probably Grey Poupon. In 1866, inventor Maurice Grey teamed up with financier Auguste Poupon to revolutionize the mustard world. Grey’s automated mustard-making machine brought the artisan product into the Industrial Age. Today, most Grey Poupon mustard is made in American factories.


While mustard was flourishing, ketchup was still figuring out how it would leave its mark on the white T-shirt of history. And after arriving in America by way of British colonization, the sauce joined forces with the ingredient that would define it for decades to come: the tomato.

The British had experimented with turning nearly everything they could find into ketchup, but tomatoes were the exception—at least in part because the New World fruit was believed, by some, to be poisonous when it was first introduced to Europe by explorers in the 16th century. It’s possible that some wealthy English people did get sick from eating tomatoes, though not for the reasons they suspected. If they were eating off lead and pewter plates, the acid from the tomatoes may have leached lead into their food, thus giving them a case of lead poisoning they might have mistaken for tomato poisoning. A lot of food historians doubt how much influence this could have had on public perception, though, arguing that lead poisoning takes too long to develop to get connected to any single dish. Instead, it could just be that tomatoes looked like plants that Europeans knew were poisonous, and so were branded with guilt by association. The bottom line is, the reasons are contested, but by the late 16th century, you can definitely find anti-tomato texts in English.

This misconception about the risks of tomatoes may have persisted among English Americans if it weren’t for the efforts of some passionate tomato advocates. One of these crusaders was Philadelphia scientist and horticulturist James Mease. He referred to tomatoes as “love apples,” and in 1812, he published the first known recipe for tomato ketchup.

Sadly, the name love apples didn’t stick, but tomato ketchup did. People with fears about tomatoes felt safer eating them in processed form. And ketchup may have gotten an assist from a bit of old-fashioned quackery. Dr. John Cook Bennett touted tomatoes as a cure for maladies ranging from diarrhea to indigestion. He published his own recipes for tomato ketchup, and eventually the product was being sold in pill form as patent medicine, helping to sway public perception about the benefits of tomatoes.

In reality, though, early tomato ketchup was actually less safe than tomatoes from the vine. The first commercial products were poorly preserved, resulting in jars that were teeming with bacteria—and not the good kind. Some manufacturers cut corners by pumping the condiment with dangerous levels of artificial preservatives. Coal tar was also added to ketchup to give it its red color.

It was the Heinz company that was largely responsible for elevating ketchup from potential botulism-in-a-bottle to staple condiment.


Pennsylvania entrepreneur Henry J. Heinz got his start in the condiment business in 1869 by making and selling his mother’s horseradish recipe. Seven years later, he saw an opportunity to bring some much-needed quality to the ketchup market. The first bottles of Heinz ketchup hit stores in 1876, and in the years that followed, they would do several things to set themselves apart from the competition.

For starters, Heinz got rid of the coal tar. Instead, he blended distilled vinegar with ripe, fresh tomatoes. His formula was shelf-stable and it tasted good, but that alone may not have been enough to make Heinz a household name. Arguably the biggest change he made was packaging his products in clear, glass bottles. Before that, ketchup had been sold in brown bottles to hide its poor quality. With Heinz, customers knew exactly what they were getting.

The Heinz ketchup bottle is one of the most iconic pieces of food packaging ever created, and it’s likely shaped your perception of the product. This extends even to the spelling of the word. If you write C-A-T-S-U-P you may get funny looks, but it’s a perfectly valid old spelling for the word, and for years was actually the preferred spelling in America. Heinz labeled his condiment ketchup with a K as another way to differentiate it from its catsup with a C counterparts. Today Heinz’s version is widely regarded as the correct spelling.


Mustard also arrived in America shortly after the first European settlers did, but All-American yellow mustard didn’t appear until much later—at the World’s Fair in St. Louis in 1904, when the R.T. French Company debuted its new “cream salad mustard.”

Distracted fairgoers may have overlooked the product if it weren’t for a special new ingredient. Mustard is naturally brown or beige, but brothers George and Francis French added turmeric to their mustard to give it a neon yellow look.

For a canvas to showcase their condiment, the Frenches chose the hot dog—a dish that was fairly new to Americans at the time. The R.T. French Company’s cream salad mustard, or French’s yellow mustard, is still a classic hot dog topping more than a century later.

Ketchup and mustard have no doubt secured their positions as culinary heavyweights. Surprisingly, though, neither product is the top-selling condiment in the U.S. That distinction belongs to ranch dressing, which is a $1 billion industry as of 2019.



A Brief History of Pepper Mills & Grinders

H/T PepperMate.com.

PepperMate’s Brief History of Pepper Mills

Pepper mills, sometimes referred to as pepper grinders, are a common kitchen accessory designed to grind peppercorns into a fine powder used to season foods. Many of us probably have a pepper mill sitting in our kitchen right now!

History of the Pepper Mill

The pepper grinder was invented by Peugeot of France in 1842. Earlier versions of pepper mills were based on a mortar-and-pestle design; the grinder allowed for a far less labor-intensive way to crack peppercorns. Peugeot’s grinder was constructed of metal, and the individual grooves inside the casing were virtually indestructible.

Pepper Mill Construction

Peppermills can be made from a variety of materials, including steel, zinc alloy, ceramic, or even acrylic. Stainless steel models are durable and crack-resistant, making them an excellent choice for a device that requires a fair amount of continuous pressure; stainless steel is the number one choice for professional chefs and home chefs alike. Zinc alloy is also a popular choice because of its ability to resist corrosion; it is a composite made up of chrome plating, zinc, and assorted metals. Ceramic peppermills are popular because a chef can use them to grind multiple things: salt, pepper, and even coffee. Another popular choice is acrylic. It is durable and cost-effective, and while not as aesthetically pleasing as stainless steel or even ceramic, it gets the job done!

See some of our World Famous Pepper Grinders.

Electric Peppermills

Peppermills can also be electric. An electric motor, powered by batteries or a plug-in power source, grinds the peppercorns, completely eliminating the need for manual operation. Electric grinders grind peppercorns much faster than manual models. A drawback of the electric model is that the high amount of friction generates some heat, which can affect the taste and performance of the peppercorns.

Benefits of Freshly Ground Pepper versus store bought pepper

Pepper is almost always better when it is freshly ground. As soon as peppercorns are ground up they begin to lose some of their flavor and intensity. Within a period of about three months, pepper shows a marked difference in quality. Something to consider is that while the pepper you buy from a store may have been placed on the shelf within the last week, the chances are that the peppercorns used in the manufacturing process were harvested many months before they were actually ground and packaged. This fact, added to the time it takes to process the peppercorns, means that the pepper you sprinkle into that pot of chili has been slowly degrading over a period of months. Professional chefs will almost always choose freshly ground pepper over any sort of pre-ground pepper.

Measuring Freshly Ground Pepper

When you look through a cookbook, you will often see that many common recipes call for freshly ground black pepper. Unfortunately, they do not always specify the exact amount. You may be told simply to “sprinkle” some freshly ground pepper or to “generously season” a piece of meat with pepper. What exactly does that mean?

Most home cooks would agree that when it comes to pepper, they are not pulling out a measuring cup or measuring spoons to determine the amount of pepper to use in a recipe. Using salt and pepper in a recipe is one of those things that most people just kind of leave up to chance. But you may be depriving yourself and the folks you are feeding by not putting enough seasoning into your dishes, or on the other hand, adding too much. Measurements like “sprinkle” don’t exactly help! The taste test doesn’t always work either. If you are cooking with raw eggs or meat, it’s not a good idea to taste your recipe before it is fully cooked.

A really easy way to measure the amount of ground pepper in a recipe is to count the number of rotations used. Try grinding out one or two rotations into a bowl and measuring the output. For example, if five turns of the grinder equals one teaspoon, you will know that’s the amount you are adding. You can, in turn, experiment with your recipes so that you know exactly how spicy your casserole should be and then add the pepper accordingly.
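That calibration step is simple enough to sketch in a few lines of code. The snippet below is a hypothetical helper, not anything from PepperMate: the `turns_per_teaspoon` figure is whatever your own mill measured, and the five-turns-per-teaspoon default is just the example from the paragraph above.

```python
import math

def turns_needed(teaspoons: float, turns_per_teaspoon: float = 5.0) -> int:
    """Convert a recipe's pepper amount into grinder rotations.

    turns_per_teaspoon is your one-time calibration: grind into a bowl,
    measure the output, and count the turns (here, 5 turns = 1 tsp).
    Rounds up, since you can't grind a fraction of a rotation.
    """
    return math.ceil(teaspoons * turns_per_teaspoon)

# A recipe calling for half a teaspoon of freshly ground pepper:
print(turns_needed(0.5))   # 3 turns (2.5 rounded up)
print(turns_needed(1.0))   # 5 turns
```

Once you have done the calibration once for your mill, the same ratio works for any recipe.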

Types of Peppercorns

Pepper is served at nearly every table on the planet. It may surprise you to know, however, that there are a wide variety of peppercorns out there, each one with its own distinctive flavor.

Black Peppercorns – are the most recognizable. They are actually a dried berry and happen to be the most flavorful and aromatic. The berries are harvested just before they are ripe and are traditionally laid in the sun to dry out. When the dried hull is cracked, the flavor released is strong, which is why chefs prefer this pepper above most others.

Tellicherry peppers – are another popular type of peppercorn, and the oldest known source for what we call “black pepper.” The name comes from the region of India where it is harvested, and it has a complex flavor. It is darker than most other peppercorns and was actually used as a form of currency in ancient times!

Green peppercorns – are another popular option for chefs and home cooks alike. Green peppercorn berries are picked well before they are ripe, and then the berry is freeze-dried in most cases. The texture of these peppercorns is smooth and the taste is much milder than black peppercorns. Green peppercorns have a tart flavor that disappears quickly after the hull is cracked, so green pepper is best served freshly ground to preserve the flavor.

White pepper – is not as well known in the United States as it is in Europe. White pepper is derived from fully matured berries with the hull removed. The remaining berry is sun-dried, taking on its distinctive pale color as it dries. White pepper is sold either whole or ground and is one of the main ingredients in fish sauces and creamy soups.

A Brief History of Washing Machines

H/T ThoughtCo.com.

A look at some of the highlights of the washing machine.

Laundry May Not Be Fun, but the History Is Fascinating