The History of Root Beer

H/T ThoughtCo.com.

Love Root Beer? Thank Charles Hires.

The History of Bagels in America

H/T myjewishlearningcenter.com.

I first had bagels and lox from a mom-and-pop Jewish deli in Miami Beach.

A brief history of your favorite Sunday morning nosh.

We know you love bagels (and we do too), so we thought it was high time to really explore the history of bagels in America, and why they are so Jewish.

So let’s start from the beginning: A bagel is round, has a hole (no, it’s not a donut), and is made from a yeasted dough which is boiled and then baked in a very hot oven. It can be covered with sesame seeds, poppy seeds, all the seeds, or nothing at all. Some people even claim blueberry bagels are a legitimate flavor, but obviously, those people are wrong.

The bagel arrived in the United States with Jewish immigrants from Poland in the late 19th century. Long before it was schmeared with cream cheese and topped with lox, capers, tomatoes, and thinly sliced red onions, it was sold on the streets of New York City’s Lower East Side stacked up on poles or hung from strings — that’s why they have a hole — for people to buy and enjoy on the street. It was simple, comforting peasant food.

The Yiddish word for bagel is actually beigel, and it is also theorized that the bagel is a descendant of the German pretzel, another yeasted bread that is boiled then baked. The boiling-and-baking process actually means that bagels stay fresher longer, which, for poor Jews, was really important.

As Jews immigrated from Europe to North America, many settled in Toronto and Montreal, Canada, cities that developed their own styles of bagels distinct from the New York style. Meanwhile in New York City, there were so many bagel makers that Local 338, a bagel makers trade union, was created in 1915.

We can thank the invention of cream cheese in the 1930s, Lender’s Bagels, and 1950s housewives for marrying the bagel with cream cheese and lox, a combination that Family Circle Magazine first suggested serving as an appetizer at cocktail parties:

Split these tender little triumphs in halves and then quarters. Spread with sweet butter and place a small slice of smoked salmon on each. For variations, spread with cream cheese, anchovies or red caviar. (They’re also delicious served as breakfast rolls.)

Eventually the bagel, cream cheese, and lox became the quintessential Sunday morning staple we know today. One of the things we love most about bagels is that they are an iconic New York-ish, Jewish mash-up food that tells an immigrant story through one simple dish.


The History of Life Jackets

H/T Backthenhistory.

A brief history of life jackets.

Life jackets are a life-saving essential for various water-related activities. But do you know the history behind these vital objects? Flotation devices have been around since ancient times – a cave drawing from 870 BC shows Assyrian soldiers swimming while holding inflated animal skins. However, life jackets as we know them today weren’t developed until the 1850s. Around this time, iron boats were replacing traditional wooden boats. In times past, if a wooden ship wrecked, there would be plenty of floating debris for sailors to grab onto. However, this wasn’t the case with iron boats, and therefore drowning deaths began increasing. Enter the life jacket. Early life jackets allowed sailors to float using a vest filled with buoyant balsa wood or cork blocks. Captain Ward, an inspector for the Royal National Lifeboat Institution in England, famously designed a cork vest. However, there were no international regulations requiring their use on ships until the first International Convention for the Safety of Life at Sea (SOLAS) was held in 1913 – one year after the Titanic sank. The convention proposed a set of recommendations with regard to “life saving appliances,” including the still-in-place requirement that there be a life jacket for every passenger on board.

World War II brought further improvements to the life jacket in response to losses at the Battle of Britain. Dr. Edgar Pask, a physician at the RAF Institute of Aviation Medicine, pioneered the use of the Mae West, a hybrid inflatable/inherent buoyancy jacket, among UK pilots. (Why Mae West? The inflatable chest on the life jacket was reminiscent of the actress’ famous curves.) German scientists working for the Luftwaffe responded to losses by developing dependably inflatable life jackets. While the materials have changed over the years – moving from cork and balsa to kapok, and from rubber to polyurethane-coated nylon – the basic technology and purpose of the life jacket have remained the same since its inception. These life-saving devices are now easily accessible, very affordable, and extremely reliable – all you have to do is wear one.

The History of Salt Water Taffy

H/T bulkcandystore.com.

I like saltwater taffy.

Most everyone loves salt water taffy, a classic chewy, sweet treat that makes everybody smile. But did you ever wonder when and how this iconic candy was first made? We know we have, and that’s why we’re here to bring you the sugary sweet history of Salt Water Taffy!

 

Salt Water Taffy is a variety of soft taffy originally produced and marketed in Atlantic City, New Jersey, beginning in the late 19th century. The most popular explanation of the name involves a candy-store owner, David Bradley, whose shop was flooded during a major storm in 1883. His entire stock of taffy was soaked with salty Atlantic Ocean water. Shortly afterward, a young girl came into his shop and asked if he had any taffy for sale. Mr. Bradley jokingly offered her some “salt water taffy.” After sampling a piece, the girl purchased the candy and proudly walked down to the beach to show her friends. Bradley’s mother was in the back of the store and overheard the whole conversation. She loved the name “salt water taffy,” and that’s what it was called from then on.

Making Salt Water Taffy

Salt Water Taffy Pulling Machine

Taffy was first cooked in copper kettles over open coal fires, cooled on marble slabs, and pulled from a large hook. The “Taffy Pull” was a household enjoyment on Saturday nights as well as an Atlantic City enterprise. The process of pulling taffy adds air to the corn syrup and sugar mix: the puller first stretched the taffy to about a five-foot length, then looped it over itself on the hook, trapping air between the two lengths of taffy. This aeration helped keep the taffy soft. The pulled taffy was shaped by hand-rolling it on marble or wooden tables. It was then cut into two-inch lengths with scissors and, finally, wrapped in a pre-cut piece of wax paper with a twist at both ends. All of this was done by hand and usually within sight of boardwalk strollers.

Whatever the origins, Joseph Fralinger really popularized the candy by boxing it and selling it as an Atlantic City souvenir in about 1886. Fralinger’s first major competitor was candy maker Enoch James, who refined the recipe, making it less sticky and easier to unwrap. James also cut the candy into bite-sized pieces, and is credited with mechanizing the “pulling” process. Both competitors’ stores still operate on the Atlantic City boardwalk.

Today’s taffy is cooked in large stainless steel or copper kettles and then vacuum cooked a second time. The pulling and packaging is now done with machines. This produces much more taffy at greater speeds.

Salt water taffy is still sold on the boardwalks in Atlantic City, in the nearby island town of Ocean City, and in other tourist beachfront areas throughout the United States and Canada. It is also, of course, sold online at BulkCandy Store.

Ingredients

Salt water taffy is composed of sugar, cornstarch, corn syrup, glycerine, water, butter, salt, flavor, and food coloring. Contrary to popular belief, the taffy contains no actual sea water. However, it does contain both salt and water.

Where can I get saltwater taffy?

Salt water taffy is available all over the U.S., most commonly sold at dessert shops, ice cream shops, and, of course, candy shops! At BulkCandy Store, we have a gigantic variety of flavors, including Peach, Licorice, Grape, Vanilla, Tutti Frutti, and Key Lime with Coconut Swirl, just to name a few. And good news! Our entire salt water taffy collection is Kosher! So don’t wait – try a piece of this delicious, chewy oldie-but-goodie treat today, and find out for yourself why these candies are still around after more than 100 years!

You can find out loads about ALL candy and their sweet histories at our History of Candy Tour, where we take you back in time to the ancient Egyptians all the way to what candy is today. So if you’re crazy about candy, (come on, who isn’t?) definitely come on in to our retail store and experience the evolution of your favorite treats!

The History of Credit Cards

H/T thebalance.com.

A look back at credit cards of the past.

How Ancient Promises of Payment Became Modern Digital Transactions

An illustrated timeline of credit cards shows their development from paper to metal to plastic.

The Balance / Joshua Seong

If you’ve paid for a latte or plane ticket with one of those shiny, new metal credit cards, here’s something you might not know: Some of the very first credit cards were also made of metal. However, those early cards were clunky and not widely accepted. Today you can make quick payments with a credit card almost anywhere, and not think twice—that’s part of its modern design. But as with most things we take for granted, there’s a long history behind those cards you carry.

Let’s walk through the history of credit cards to better appreciate this convenient, and even rewarding, form of payment.

Early Forms of Credit

People have engaged in credit-like transactions for thousands of years. For example, merchants would give farmers seeds on the understanding that repayment would come after the harvest.

One of the earliest written examples of a credit system can be found in the Code of Hammurabi, a set of laws named after the ruler of Babylon who reigned from 1792 to 1750 B.C. This early credit system established rules for lending and paying back money, and for how interest could be charged, too.

 

Jump forward to the late 1800s, when consumers and merchants exchanged goods on credit, using what were called credit coins and papers as temporary currency. This started among small merchants, but the idea of credit payments quickly spread to other industries.

Around 1885, loyal customers of hotels and department stores received what can be considered early paper store credit cards. The credit lines were typically only for one location, but sometimes were accepted by competing merchants, too. 

Metal Money: Coins, Cards, and Charga-Plates

In 1914, Western Union gave metal plates to select customers that allowed them to defer payment until a later date. Oil companies followed suit in the next decade by creating similar courtesy cards that could finance gas and repair services at their stations.

Next came the Charga-Plate, a metal card developed as early as 1928 that fit in wallets, was embossed with the cardholder’s information, almost like a military dog tag, and had paper on the back for the cardholder’s signature. The embossing helped sales clerks quickly make imprints of the details for processing. These cards were issued from the 1930s through the 1950s, primarily by larger merchants for use in their store networks.

The First Bank Card: Charg-It

The next credit card milestone came in 1946, when the first bank card system, called “Charg-It,” was introduced by Brooklyn, New York, banker John Biggins. The Charg-It model worked much like modern credit cards: A customer would use the card to pay a retailer, and the issuing bank would reimburse the retailer and then seek payment from the customer.

 

At this point, Charg-It cards only worked at stores located very close to the card’s issuing bank. These early credit cards were not national payment tools yet. 

Diners Club Card Is Created

In 1949, a man named Frank McNamara was dining at Major’s Cabin Grill in New York City and realized his wallet was sitting at home. He resolved the situation, but it was something he never wanted to happen again. His experience, dubbed “The First Supper” by Diners Club, inspired McNamara and his business partner Ralph Schneider to release the first cardboard Diners Club Card in 1950. It was a charge card intended for consumers who wanted to pay back their travel and entertainment purchases later. It was the first card to be accepted by multiple merchants outside a single geographic area.

 

The Diners Club Card exploded in popularity and by 1951, only a year after launch, Diners Club had more than 42,000 members, and card acceptance spread throughout major U.S. cities.

More Card Issuers and Networks Form

Following the success of Diners Club, other banks and financial players moved to get in on the action.

American Express

American Express started its own credit card program in 1958. Like the original Diners Club Card, it was first a charge card intended to fund travel and entertainment expenses, and bills were due in full at the end of each month. In 1959, American Express introduced the first card made of plastic. The company then launched its corporate credit card program for commercial customers in 1966.

BankAmericard

In 1958, Bank of America introduced the first true general-purpose credit card, BankAmericard, which was most similar to the credit cards we use today. It was initially made of paper but soon became plastic. It had a $300 spending limit, and cardholders could carry balances month to month for a fee. It could be accepted by any merchant willing to take it.

 

Until this point, banking and financial services in the U.S. were largely conducted locally, not nationally. To better compete in the growing credit card industry, in 1966 Bank of America began licensing its cards to other banks, expanding its reach around the nation. To strengthen the network, in 1970 Bank of America joined a group of banks to form National BankAmericard, Inc., which was renamed Visa in 1976.

Master Charge

In 1966, a small group of East Coast banks formed the Interbank Card Association (ICA) to compete with the California-based BankAmericard. ICA’s answer to the BankAmericard was a card program called “Master Charge.” The organization began revolutionizing the payment authorization process and in 1973 established a central computer network that connected merchants with card-issuing banks. In 1979, Master Charge was renamed MasterCard.

Discover

The card issuer and network now recognized as Discover was started by Dean Witter Financial Services Group, Inc., a subsidiary of Sears, Roebuck and Co., in the mid-1980s. Early Discover card purchases were made by Sears employees at stores in Atlanta and San Diego in 1985 to test the system. The Discover credit card then launched publicly via a national TV commercial during Super Bowl XX. Decades later, in 2008, Discover acquired Diners Club International to expand its card reach globally.

Invention of the Magnetic Stripe

You know that black stripe on the back of your cards? It was put on a plastic card by IBM engineer Forrest Parry in the early 1960s. Parry’s magnetized tape first held details for CIA identity cards and became a simple and inexpensive way to store account information for payment cards and point-of-sale terminals, too.

 

Until the introduction of the magnetic stripe (also known as “mag stripes”), credit card transactions were more physical than digital, so this was an historic step forward. Payment transactions could be computerized instead of dependent on manual processing. 

 

Magnetic stripes were adopted as a U.S. standard for payment cards in 1969 and as an international standard two years later.

Early Industry Regulations

While the credit card industry rapidly expanded in the 1960s, some fundamental issues still needed to be addressed. For example, card issuers had different ways of calculating interest rates with little consistency or transparency. Fraudulent charges were a problem and women typically couldn’t qualify for a card without a male co-signer. Card terms and conditions? They didn’t really exist. 

 

Lawmakers stepped in starting in 1968 by passing the Truth in Lending Act, which would eventually be part of a larger Consumer Credit Protection Act. The Truth in Lending Act standardized how banks and card issuers calculated annual percentage rates (APRs).

 

More laws were passed in the 1970s and became the groundwork for regulations that help protect credit card holders today.

 

Credit Card Laws of the 1970s

The Fair Credit Reporting Act of 1970: This law helps ensure the information gathered by credit reporting agencies is fair and accurate.

The Fair Credit Billing Act of 1974: Curbs abusive billing practices and permits consumers to dispute billing errors by following a set of guidelines.

The Equal Credit Opportunity Act of 1974: Lenders must make credit available to all credit-worthy applicants and cannot discriminate based on gender, race, marital status, national origin, or religion.

The Fair Debt Collection Practices Act of 1977: Debt collection agencies are banned from practicing predatory debt collection, such as using threats or harassment.

Rewards Programs Gain Popularity

In 1984, Diners Club introduced its “Club Rewards” program and in 1987 Citibank established a credit card reward program with American Airlines, allowing customers to earn free or reduced airfare by using their card. 

 

Throughout the 1990s, reward programs gained momentum and card issuers began enticing customers with sign-up bonuses, cash back perks, and co-branded deals, which made credit cards even more popular than before. For example, American Express first launched its Membership Rewards program in 1991 (then called Membership Miles), and it became the largest card-based rewards program in the world by 2001.

 

New Technologies: Mini, Mobile, and Contactless Payments

After the turn of the century, credit cards kept evolving, especially the technology behind them. 

 

Starting in 2002 with Bank of America, a new “mini card” fad began, as some issuers rolled out keychain-sized versions of traditional cards. The Discover 2GO credit card was a kidney-shaped card that fit into a keychain case and made Time’s Top 10 Everything 2002 list.

 

MasterCard’s tiny Side Card was released in 2003 and also incorporated new technology that allowed cardholders to simply hover the card over contactless payment terminals and, just like that, the transaction was complete. More recently, wearables, such as watches, wristbands, and even rings, have entered the contactless credit card payment space, too.

 

Mobile wallets emerged in 2008, shortly after the dawn of smartphones, when Apple opened its App Store. In May 2011, Google Wallet led the way for apps that stored payment card information for use in place of a physical card.

 

With little bank and retailer participation at first, Google Wallet and competitors such as CurrentC and Softcard struggled to earn consumer adoption. Apple Pay launched in October 2014 with 220,000 merchants ready to accept wallet payments.

 

The CARD Act of 2009: Additional Regulations

The Credit Card Accountability Responsibility and Disclosure Act of 2009, also known as the CARD Act, was signed into law on May 22, 2009, by President Barack Obama and represented a sweeping attempt to crack down further on harmful card issuer practices.

 

The CARD Act has reduced credit card costs to consumers by more than $100 billion over the past decade, which is one of its more significant impacts. The law, which is enforced by the Consumer Financial Protection Bureau (CFPB), offers several consumer protections:

 
  • Cost savings: Limits surprise interest rate increases, caps late fees, and requires more consistent billing practices.
  • Statement clarifications: Requires that credit card statements clearly state penalty disclosures such as due dates, late fees, and penalty APRs, and note how long it will take consumers to pay down their balances by making only minimum payments. 
  • Limits young adult marketing: Prohibits issuers from luring potential applicants with enticing freebies on or near college campuses. It also tightened applicant age restrictions. 
 

Following the CARD Act, the Dodd-Frank Wall Street Reform and Consumer Protection Act was signed into law on July 21, 2010, further ensuring that consumers aren’t overcharged for the use of credit cards. The law also tightened card access following the Great Recession, when many consumers were drowning in credit card debt.

 

Security Concerns and Solutions

Do you remember the infamous Target data breach? A December 2013 announcement confirmed that more than 40 million credit and debit account numbers had been stolen from Target’s payment database, and it was just one of many credit card security breaches to make headlines in a short period of time.

 

In addition to data hackers, card skimmers have also taken advantage of credit card payment technology. By copying the card information stored in the magnetic stripes of credit cards, skimmers can replicate cards and quickly rack up all sorts of fraudulent charges. Self-serve gas pumps and ATMs have been the most vulnerable to these security attacks, so much so that the U.S. Secret Service has cracked down on gas pump skimmers.

 

While cardholders faced these mounting security issues, the U.S. began adopting EMV payment technology to encrypt payment information and combat counterfeit credit card fraud. The process started in 2011 and the official nationwide shift occurred October 1, 2015. 

 

EMV payment technology uses an encrypted smart chip instead of a magnetic stripe to hold account data and complete payments. Today nearly all credit cards sport silver EMV chips and consumers are adjusting to a new payment process at store registers: inserting cards instead of swiping them. 

 

Magnetic stripes are still on the backs of most credit cards just in case a retailer can’t accept chip cards, but the goal is for the U.S. to migrate away from magnetic stripe payments entirely to better secure payments at registers, gas pumps and ATMs. 

 

Credit Cards Today

There’s a more diverse selection of credit cards in the U.S. than ever before, as issuers offer cards with everything from travel rewards that entice big spenders to secured cards that help others build credit.

 

While the idea of credit cards isn’t going away, the physical cards might soon become just another part of history. In addition to an increased adoption of mobile wallets, industry predictions point to biometric payments—the use of selfies, fingerprints, and retina scans to verify the account holder—as the next big step for credit card payments.

 

We can already unlock our phones just by looking at them, after all. Maybe soon instead of reaching for our credit cards to pay for our lattes, we will reach to remove our sunglasses.

 

The History of Swedish Fish

H/T Back Then History.

I like these fish.

They Really Are from Sweden

Swedish Fish are a beloved candy with an iconic status in America. They were introduced to the United States in the 1950s. And yes, they are actually Swedish! They were developed and then introduced to the US market by a Swedish confectionery company called Malaco. But their original manufacturer isn’t the only thing that’s Swedish about these candies! On every gummy, the word “Swedish” is stamped on the side. (The next time you buy a box, take a look and see!) It’s also purposeful that the shape of the candy is a fish, because fishing is a major part of the cultural and economic landscape in Sweden. The introduction of these classic candies to the American market in the 1950s was highly successful, and they quickly gained popularity among consumers.

They’re Not Actually Gummies

They are sticky and sweet, but in candy terms, Swedish Fish aren’t actually gummies. Gummy candies, such as gummy bears, have a distinctly rubbery texture that isn’t too sticky. Swedish Fish fall into a confectionery category called wine gums – these candies are chewier, stickier, and less rubbery than gummy bears. Swedish Fish are actually one of the few wine gum candies popular in the US; wine gums are much more popular in the UK and in other parts of the world.

The Swedish Version Is a Little Different

Swedish Fish do exist in Sweden, but the candies are a little different from their American counterparts. For starters, they’re called pastellfiskar, which translates to “pale-colored fish.” Instead of the word “Swedish” printed on the side like in the US, the Swedish candies have the name “Malaco” printed on them. Also, some of the Swedish Fish candies available in Sweden are darker in color than the well-known American red hue. There’s even a special flavor available in Sweden called salmiak, or black salted licorice. It is flavored with ammonium chloride and remains popular in Sweden as well as in other Nordic countries, but it never caught on in America.

Image Credit: https://www.brucescandy.com

They Have a Unique Taste

There’s a lot of debate over the flavor of Swedish Fish. A few different companies have been responsible for manufacturing the popular candy over the years, but none of them have ever issued a statement on what exactly the flavor is meant to be. However, based on the gummies’ red color, many people think they’re meant to taste like cherry, strawberry, raspberry, or even fruit punch. Others think Swedish Fish are meant to taste like lingonberry, a berry native to Scandinavia. While the mysterious red flavor is by far the most popular variety of Swedish Fish, the candies also come in assorted flavors and colors in the US. In fact, the assorted flavors box of Swedish Fish was a cult favorite in the 1960s and 1970s! Today you can get Swedish Fish in yellow, green, orange, and even a white, tropical-flavored variety, but sadly, the purple flavor was discontinued in the 2000s.

They’ve Become a Classic American Candy

Since their introduction to America in the 1950s, Swedish Fish have become a vital part of the American candy landscape and remain popular to this day. They’re a staple of movie theater concession stands and are often given out on Halloween as well. They’re enjoyed by people of all ages and from all walks of life. In fact, since they don’t include gelatin, most varieties of Swedish Fish are vegetarian and vegan-friendly (but just to be sure, always check the packaging!). In 2009, it was estimated that 7,000 metric tons of Swedish Fish were churned out annually – that’s how popular these beloved classic candies are!

 

The History of Apple Pie

H/T Back Then History.com.

I was under the impression apple pie originated in America.

Apples Aren’t Native to America

You may be surprised to learn that neither the apple nor apple pie is actually native to America. In fact, apples are native to Asia. (The only apple variety native to North America is the crab apple.) The sweet yet tart apples that we are familiar with today first spread from Asia to Europe. Later, European colonists brought apples to North America. Specifically, the early colonists of Jamestown are thought to have brought European apple tree cuttings and seeds with them on their journey, thus introducing the apple to America. In the early days of colonization, European settlers primarily used their apples for making cider, which was preferred over water and easier to make than beer. Tree planting was helpful for maintaining a land claim in colonial America, and apple trees were often chosen for this purpose since they were also popular for cider-making. As a result, by the 1800s, Americans were growing over 14,000 different varieties of apples! It’s clear that apples were popular, but they weren’t associated with Americana until John Chapman, who is perhaps better known as Johnny Appleseed, made the apple part of American folklore.

Apple Pie Isn’t an American Invention

Like apples, apple pie isn’t nearly as American as you might think. In fact, apple pie originated in Europe, and it was developed with the help of multiple culinary influences, including cuisine from Britain, France, the Netherlands, and the Ottoman Empire. The dish is also a lot older than you may think. A recipe for apple pie appears in The Forme of Cury, a British cookbook dating all the way back to 1390 (and later published by the antiquarian Samuel Pegge)! Early British pies were often on the savory side, but sweeter pies with apples and other fruits were often made as well. However, these early British versions of apple pie often did not include crust due to the high price of ingredients. It was the 15th-century Dutch who first created the lattice-style pastry we are familiar with today. Dutch lattice-style pies caught on quickly, and a mere century later, they could be found all across Europe.

Apple Pie Was Introduced to America by European Colonists

Apple pie is thought to have been brought to America by European colonists – particularly the British, the Dutch, and the Swedish. The first written mention of apple pie in America dates to 1697, according to Allan Metcalf’s America in So Many Words: Words That Have Shaped America. The first two apple pie recipes published in America were included in the new country’s first cookbook, American Cookery, which was published in 1796. Since it was an easy and affordable dish to make, apple pie quickly became part of the American culinary repertoire.

Image Credit: https://www.kroger.com

Apple Pie Became a Symbol of America in the Early 1900s

While apple pie had been popular in America since its arrival, it didn’t become associated with our cultural identity until the early 20th century. News, advertising, and the two World Wars catapulted the humble apple pie into the national consciousness as a symbol of American patriotism and nationalism. It started when various publications began positioning apple pie as uniquely American around the turn of the century. In 1902, an editorial in The New York Times stated that pie had become “the American symbol for prosperity” and that “pie is the food of the heroic.” In 1926, that same publication declared, “The Tourist Apple Pie Hunt Is Ended: American Army Abroad Has Failed Again to Find in Europe ‘the Kind They Make at Home,’” thus positioning the dish as a distinctly American phenomenon (which is, of course, untrue). The phrase “as American as apple pie” began to crop up around this time as well. In 1924, a Gettysburg Times advertisement promoted “New Lestz Suits that are as American as apple pie.” And in 1928 an article in The New York Times described the homemaking abilities of First Lady Lou Henry Hoover as being “as American as apple pie or corn pone” (the “corn pone” was quickly dropped from the now-famous saying). Clearly, apple pie had become a symbol for America and certain American ideals like motherly love, purity, wholesomeness, the comfort of home, and even agrarian times gone by. It also came to be associated with patriotism and nationalism. This is especially apparent in the lingo of World War II soldiers, who would often say that they were fighting for “mom and apple pie.”

Apple Pie Remains a Classic American Dessert Today

Today, apple pie remains a beloved dessert in America. It’s often served at American holidays like Fourth of July celebrations and Thanksgiving dinners. Many families even have their own unique recipes for the dish that are passed down from generation to generation. While apple pie did not originate in America, over the years it has been fully integrated into American cuisine and is now considered a classic American dish. And indeed, much like America itself, apple pie is a melting pot of many different cultural and culinary traditions. Of course, it’s also just plain old delicious to eat!

The History of the Piñata

H/T Back Then History.

I learned a lot about the Piñata.

It Originated in China

Today, the piñata is a staple at many celebrations and plays a particularly central role in Mexican fiestas. You may think of it as just a simple object, but it has a surprisingly fascinating history! The piñata is thought to have originated over 700 years ago in Asia. Specifically, the Chinese used to fashion paper-covered animals (including cows, oxen, and buffalos) to celebrate the New Year, decorating them with colorful harnesses and other trappings. Then, they filled the figures with seeds and knocked them with sticks until the seeds spilled out. Afterwards, the remains were burned; the ashes were thought to bring good luck in the coming year. It is thought that Marco Polo discovered this Chinese practice and introduced it to the Western world.

It Became Part of Lenten Traditions in Europe

In the 14th century, the piñata entered Europe and was quickly adapted to the Christian season of Lent. The first Sunday of Lent was known as “Piñata Sunday” – the name comes from the Italian word pignatta, meaning “fragile pot,” because early European piñatas resembled clay pots. When the custom spread from Italy to Spain, the first Sunday in Lent there became known as the “Dance of the Piñata.” The Spanish fiesta featured a clay container called la olla (the Spanish word for pot); originally, it was not decorated, but over time decorations like tinsel, ribbon, and fringed paper were added.

Indigenous Peoples Had Their Own Version

When Spanish missionaries travelled to the Americas, they used the piñata to attract crowds and attention at their ceremonies. However, the indigenous peoples already had their own tradition that was similar; to celebrate the birthday of Huitzilopochtli, the Aztec god of war, Aztec priests put a clay pot on a pole in the temple at the end of each year. The clay pot was decorated with feathers and filled with small treasures. When broken with a stick, the treasures would fall at the god’s feet as an offering. The Mayans also played a sport where a player’s eyes would be covered and they would have to try to hit a hanging clay pot.

Image Credit: https://borrachavegas.com

Missionaries Gave It Religious Meaning

The missionaries transformed these indigenous traditions for the purpose of religious instruction. They covered the traditional pot in colored paper so that it appeared different (and perhaps even scary) to the local peoples. The original piñata features seven points; the missionaries used these to symbolize the Seven Deadly Sins in Christianity: envy, sloth, gluttony, greed, lust, anger/wrath, and pride. (There is also a traditional ten-pointed piñata, which missionaries said symbolized the sins that come from breaking the Ten Commandments.) The missionaries said that the stick used to break the piñata represented love. The stick, representing love, destroyed the piñata, which represented sins and temptation, thus imparting a religious lesson. Some people also say the piñata was meant to represent Satan. The treats (usually candies and fruits) that fell out of the broken piñata were said to represent God’s forgiveness of sins and a new beginning. Another interpretation holds that the fruits represented temptations and earthly pleasures, while yet another holds that the sharing of the fruits and candies represented a reward for keeping the faith – a share in divine blessings and gifts.

Today, the piñata has lost most of its religious meaning. Instead of being used as a tool to teach the Christian catechism, it is simply a fun pastime at celebrations. It’s especially popular at Mexican fiestas and is used to mark special holidays, such as Christmas and Cinco de Mayo. It’s also popular at children’s parties, and many commercially available piñatas are made in the likeness of beloved children’s characters.

 

The History of the Necktie

H/T Back Then History.

I learned quite a lot about the history of ties.

It’s Been Around for a Long Time

Neckties are a staple fashion accessory in the 21st century. But when and how did ties become an integral part of formal dress for men? The history of the necktie goes back a lot further than you might think! Many of the terracotta sculptures of Chinese soldiers from the 3rd century B.C. feature a carefully tied neck cloth. Since the item is only shown on select soldiers, historians think that the cloth was an honorary badge used to denote exemplary performance. In ancient Rome, a band of linen called a “sudarium” was worn around the neck (or sometimes tied around the waist) of most men. The soldiers of Emperor Trajan, a celebrated military commander, are depicted wearing them on a commemorative pillar, leading historians to believe that the sudarium had its roots in military attire. In ancient Egypt, cloths adorned with precious stones were worn around the necks of some Pharaohs. Certain members of tribes in Oceania wore neck adornments as well.

It Has Its Roots in Military Dress

While neck cloths and adornments were worn in ancient cultures across the world, what we think of as the modern necktie didn’t come into existence until the 17th century. During the Thirty Years’ War, French King Louis XIII hired Croatian mercenaries to aid his cause. The Croatian mercenaries all wore colorful neckerchiefs as an official part of their uniforms. The front-line soldiers’ neckerchiefs were made from common materials, while the officers wore neckerchiefs made from muslin or silk. The neckerchiefs were knotted around the soldiers’ necks and used to hold up their capes; the ends of the cloths were either arranged in a bow, finished with a tuft or tassel, or left hanging loosely. King Louis XIII liked the look of these neckerchiefs so much that he required the item be worn during all royal events. The French first called the accessory a “croate,” after the Croatian soldiers, but it was soon corrupted to “la cravate” or “cravatte” – cravat in English. The cravat caught on in England after Charles II reclaimed the throne, and there are reports of German soldiers adopting the Croatian mercenaries’ neckties as a style accessory as well. Over the course of the next century, the cravat caught on in Germany, France, and throughout the English colonies.

It Was a Gentlemen’s Accessory

The cravat quickly became a style accessory for gentlemen and was associated with power, wealth, and elegance. A group of young British gentlemen who called themselves the Macaronis helped popularize the cravat in the 1760s. In the early 1800s, Napoleon wore one during the Battle of Waterloo. And in Regency England, a young dandy named Beau Brummell helped to solidify the importance of how a cravat was tied; the young fashion icon wore a complicated cravat knot that appeared effortless but was actually incredibly intricate and took hours to get correct. However, his style was financially accessible to both middle-class and upper-class gentlemen, so it caught on quickly. As the use of precise cravat knots spread, the notion that a well-tied knot was the mark of a true gentleman arose; in fact, we still consider sharp dress the mark of a man of taste today! (Incidentally, Brummell also started the trend of wearing black evening wear, a fashion that still endures today.) Soon there were so many complicated ways to tie a cravat that in 1818, a pamphlet called Neckclothitania was published to help men learn about all the different options. Some of the popular knots covered in the instructional booklet included: The Mathematical, The American, The Irish, and The Mail Coach. Interestingly, the booklet was the first printed material to use the word “tie” to refer to a neckcloth as opposed to “cravat.” By 1840, the term “tie” had almost entirely replaced the word “cravat” in popular usage.

Image Source: https://narwhalcompany.com

Until the 1860s, the tie was a handmade product fashioned from high-quality materials such as muslin, white lace, or linen. It also had to be laundered and pressed often by servants and was (generally) only worn by gentlemen. However, during the late 1800s, men became increasingly aware of how they looked while out in public and the tie provided a way to take pride in their appearance. More and more men in the middle class began to dress above their current station as part of their upwardly mobile goals. The invention of the sewing machine and industrialization made ties even more accessible, since they were no longer handmade items but were instead mass produced.

Its Famous Knot Was Invented in the 1800s

Increased availability coupled with its continued popularity among London’s most influential young gentlemen ensured the tie’s enduring place in men’s fashion. In fact, the Four-in-Hand knot, which remains one of the most well-known tie knots today, was actually created all the way back in the late 1800s! Since the knot resembled the way a driver would tie his reins to direct a carriage pulled by four horses, the name was a natural fit. There’s also a theory that the knot is named after London’s famous gentlemen’s carriage-driving club, the Four in Hand, which helped popularize the new knot. In addition to the Four-in-Hand knot, another neck accessory, the ascot, also became popular around this same time (it is named for the Royal Ascot horse race). The well-known bow tie also emerged as a fashion favorite during this period, although it was first invented at the beginning of the 18th century.

It Was Modernized in the 1920s

In the 1920s, a New York tie maker named Jesse Langsdorf came up with a new way of cutting the fabric when creating a tie. He cut the fabric on an angle and then sewed it in three segments. Unlike older versions of the necktie, Langsdorf’s innovation allowed a tie to spring back to its original shape after being worn. This made tie wearing even more accessible for men from all walks of life. And indeed, the popularity of Langsdorf’s necktie endured; it is the version that men still wear today!

It Gained Variety in the 20th Century

After Langsdorf’s innovation, the basic construction of the tie did not change, but the accessory still underwent many different style periods throughout the 1900s. In 1936, the Duke of Windsor famously created the Windsor knot, which is still worn by many men today. Ties in bright colors and patterns were introduced after the end of World War II and remain popular today as well. In the 1950s, the skinny tie was introduced to the public. Legend has it that the style was invented because tie makers were running short on fabric, but whether or not this explanation is true, it’s certain that the skinnier style complemented the well-tailored suits favored by men in the 1950s. In the 1970s, extra-large ties called kipper ties became popular, as they complemented the dominant style of men’s suits at the time.

It Has an Enduring Legacy

The tie remains as popular as ever in the 21st century. Men commonly wear ties to job interviews, at the office, and for important events. Just like in the 19th century, it is thought that wearing a tie may help men from all walks of life convey a sense of professionalism and power. But its modern popularity also has to do with self-expression. As society moved through the close of the 19th century and into the early 20th century, men’s fashion became less extravagant, leaving the tie as one of the only elements of fashion through which men could express their personality. In the 21st century, the tie continues to function as a way for men to express themselves through their sartorial choices. Today, ties are available in a variety of widths, lengths, materials, colors, and patterns, so men can choose whatever combination best complements their style and personality.

 

The History of Shortbread Cookies

H/T Back Then History.

Shortbread has an amazing history.

They Have Medieval Origins

Shortbread cookies are a beloved staple at many family gatherings, including holidays and weddings. But you may be surprised to learn that there’s quite the history behind the buttery treats you love so much! Shortbread cookies are thought to have evolved from a food called medieval biscuit bread. Medieval biscuit bread was a type of twice-baked biscuit made from leftover bread dough and it usually contained some added spices and/or sugar. Medieval biscuit bread, sometimes called rusk, was popular in many European countries during the Middle Ages. Over time, the Scottish began to use butter in place of yeast, and the dry, hard biscuit bread evolved into the crumbly, buttery shortbread cookies we know and love today!

They’ve Been Around for Centuries

A Scottish woman named Mrs. McLintock is credited with writing the first shortbread recipe that appeared in print. It was included in a cookbook in 1736. However, shortbread had become a staple at Scottish family events long before that – some experts estimate that shortbread was being consumed in Scotland as early as the 12th century. Over time, the proportions of one part sugar, two parts butter, and three parts flour became the standard shortbread recipe and many bakers still use this same basic recipe today.

They Were Once Reserved as a Celebratory Treat

Because shortbread cookies rely on ingredients that were expensive for most Scottish families, they were reserved for special occasions such as weddings, Christmas, and celebrating the New Year. In Shetland, it was traditional to break shortbread over a new bride’s head on the threshold of her new home, and all across Scotland, shortbread is still offered to the “first footers” when celebrating Hogmanay, the Scottish New Year’s Eve. After the Scots started making shortbread, it made its way to the rest of the United Kingdom and ultimately all across the Commonwealth territories. That’s why today, many families have a tradition of serving shortbread cookies at holiday parties and important life events.

They Were Famously a Royal Favorite

Early shortbread recipes were somewhat different from the shortbread cookies we know and love today. So how did the change take place? Many give the credit to Mary, Queen of Scots. The Queen spent much of her childhood in France and her French tastes influenced the food served in her court. Her chefs created a version of shortbread called “petticoat tails” that the Queen famously loved; the cookies were shaped like pizza slices and included caraway seeds for flavor as well as plenty of butter. The Queen’s preferred shortbread recipe is likely to have been more decadent than earlier versions and may have helped give rise to the buttery and sweet shortbread cookies that we know and love today. The Queen’s fondness for these shortbread cookies also helped to further popularize the Scottish treats and elevated them from their peasant roots to a more elegant food worthy of royalty.

Image Credit: https://www.scottishscran.com/petticoat-tails-shortbread-recipe/

There Are a Variety of Shapes & Styles

Shortbread generally comes in one of three shapes. The shortbread cookies most favored by Mary, Queen of Scots are called “petticoat tails” – they are made by dividing a large circle of dough into segments like a pizza. Shortbread shaped into thick rectangles (sometimes called “fingers”) is a common type still seen today. Individual circles called “shortbread rounds” are another shape that you may be familiar with. Sometimes, these round shortbreads are baked with a traditional design on top that’s reminiscent of the decorations used on the ancient Yule bannock – a round cake linked to sun worship in pre-Christian times. In addition to the different shapes, there are also many regional varieties of shortbread. Some popular regional variations include shortbread made with coriander and caraway seeds; shortbread made with almonds and orange peel; and ginger shortbread. And of course, bakers today often experiment with unique ingredients and flavors to create gourmet versions of these simple treats.

They’re Cleverly Named

Ever wondered why shortbread is called, well, shortbread? It’s actually quite clever! In baking, the term “short” indicates a crisp, crumbly texture. Shortbread is known for this type of texture, so the name makes sense. But there’s also a second layer of meaning. The term “shortening” is sometimes used more generally in baking to mean any type of fat in the recipe; in the case of shortbread, the fat source is butter, and it’s essential for achieving shortbread’s famous crumbly texture. Since fat is such a key ingredient in shortbread, the cookie’s name is thought to be a reference to the fat (or “shortening”) called for in the recipe. It’s also interesting to note that shortening does indeed function as its name implies – it interferes with the formation of gluten strands in pastry dough, making them shorter. These shortened gluten strands result in a pastry with the tender, crumbly texture characteristic of the shortbread cookies we know and love today!