We Had No Idea These Celebrities Served In The Military

H/T War History OnLine.

I knew some of these celebrities had served; others I had never heard of or didn't know had served.

When it comes to celebrities who have served in the military at some point in their lives, there are usually two separate types: famous veterans, and veterans who are famous. The former are people who became famous for their military exploits, like “Chesty” Puller, while the latter are people who became famous through other means but served in the military at one point, like Elvis Presley. There are actually many high-profile celebrities who did a stretch in the military.

 

This is a list of a few celebrities whose military service may surprise you.

MC Hammer

MC Hammer smiles while holding microphone
Photo Credit: Rich Polk/Getty Images for Capitol Music Group

MC Hammer, or Stanley Kirk Burrell, is an American rapper who shot to celebrity status in the 1980s and 1990s with the release of a number of popular songs.

Before his entry into the music scene, Burrell had tried to achieve his dream of becoming a baseball player, but failed to make it through tryouts. After this, he joined the U.S. Navy, where he served for three years. He was a Petty Officer Third Class (Aviation Storekeeper) at the time of his honorable discharge.

Charles Bronson

Celebrities Charles Bronson and David Carradine relaxing with a cup of coffee, Cannes Film Festival, 1977
Photo Credit: Heinz Browers/United Archives via Getty Images

On top of speaking three different languages, legendary actor Charles Bronson served in the U.S. Army Air Forces during WWII. He entered the service in 1943 as part of the 760th Flexible Gunnery Training Squadron.

In 1945, he served in the 61st Bombardment Squadron, which operated from Guam in the Pacific. There, he flew 25 missions as a B-29 Superfortress tail gunner and even received a Purple Heart for his battle wounds.

Adam Driver

Adam Driver attends the European Premiere of Star Wars: The Last Jedi
Photo Credit: Gareth Cattermole/Getty Images for Disney

Adam Driver is most famous for his role as Kylo Ren in the Star Wars franchise, but he’s less well-known for his time in the U.S. Marine Corps.

He enlisted after the September 11 attacks in 2001 and joined the 1st Marines as a mortarman. After two and a half years in the Marines, he had a mountain biking accident and fractured his sternum just before his unit shipped out for Iraq. “To not get to go with that group of people I had been training with was…painful,” Driver said.

Gal Gadot

Photo Credit: Frazer Harrison/Getty Images

After winning the Miss Israel crown in 2004, Gal Gadot served in the Israel Defense Forces (IDF). Service in the IDF is mandatory for Israelis over 18, including women, who must serve for two to three years.

In the IDF, Gadot taught gymnastics and calisthenics, a position that gave her the ideal skills to perform in action movies: “You give two or three years, and it’s not about you. You give your freedom away. You learn discipline and respect.”

Chuck Norris

Chuck Norris in a scene from 1984's Missing in Action
Chuck Norris in 1984’s Missing in Action. (Photo Credit: Sunset Boulevard/Getty Images)

Probably the least shocking on this list is Chuck Norris. Norris is a legendary American film producer, actor, martial artist, and the only person to have counted to infinity, twice. Prior to his time as a movie star and the subject of internet memes, he served in the U.S. military.

He joined the Air Force in 1958 and served as an Air Policeman at Osan Air Base in South Korea. There, he discovered his interest in martial arts, something that would propel him through his career. He was discharged from the U.S. Air Force in 1962.

Arnold Schwarzenegger

Celebrities Arnold Schwarzenegger and Sally Field pose for photo, Arnold curling his biceps
Photo Credit: Bettmann / Getty Images

Schwarzenegger has a list of achievements that could fill multiple libraries. He is the most famous bodybuilder in history, and once he retired from that sport, he tried his hand in Hollywood, again reaching the top of his field as one of the most famous stars ever — and for a time, the highest-paid. He managed to squeeze in the time to become the governor of California too.

Born in Austria, Schwarzenegger served a year in the Austrian Army in 1965; service was compulsory for all males over 18 at the time. While in the Army, Arnold snuck off his base to participate in the junior Mr. Europe bodybuilding competition.

He won the competition, but upon his return, he was placed in a military prison for a week. “Participating in the competition meant so much to me that I didn’t carefully think through the consequences,” Schwarzenegger said.

Morgan Freeman

Morgan Freeman, one of many celebrities to serve in the military
Photo Credit: Axelle/Bauer-Griffin/FilmMagic

Freeman is well known for his interesting characters, deep voice, and calming persona. After leaving high school in 1955, he turned down a partial drama scholarship and instead chose to join the U.S. Air Force.

In the Air Force, Freeman served as a radar technician: “I took to it immediately,” he said. “I did three years, eight months, and ten days in all, but it took me a year and a half to get disabused of my romantic notions about it.”

Freeman left in 1959 and began his career in drama.

How McDonald’s Beat Its Early Competition and Became an Icon of Fast Food

H/T History.com.

Looking back at the origins of McDonald's.

The future fast-food giant started out as anything but swift, serving up slow-cooked barbecue. How did it become the behemoth it is today?
 
 

New Hampshire brothers Richard and Maurice McDonald opened the very first McDonald’s on May 15, 1940, in San Bernardino, California. Their tiny drive-in bore little resemblance to today’s ubiquitous “golden arches,” but it would eventually come to epitomize the fast-food industry, thanks to a pioneering system for food prep. 

The first McDonald’s started slow, but caught on fast

The first McDonald’s—located at the corner of 14th and North E Streets, just off Route 66—started out serving up barbecue slow-cooked for hours in a pit stocked with hickory chips imported from Arkansas. With no indoor seating and just a handful of stools at its exterior counters, the establishment employed female carhops to serve most customers who pulled into its parking lot. The brothers’ business quickly caught on. Sales soon topped $200,000 a year.

Richard “Dick” and Maurice “Mac” McDonald. (Credit: McDonald’s)

After World War II, drive-in competition in San Bernardino grew, and the McDonald brothers discovered something surprising about their barbecue restaurant: 80 percent of their sales came from hamburgers. “The more we hammered away at the barbecue business, the more hamburgers we sold,” said Richard McDonald, according to John F. Love’s book McDonald’s: Behind the Arches.

McDonald’s grew thanks to its ‘Speedee Service System’

The brothers closed their doors for three months and overhauled their business as a self-service restaurant where customers placed their orders at the windows. They fired their 20 carhops and ditched their silverware and plates for paper wrappings and cups so that they no longer needed a dishwasher. According to Love, they simplified their menu to just nine items—hamburgers, cheeseburgers, three soft drink flavors in one 12-ounce size, milk, coffee, potato chips and pie.

“Our whole concept was based on speed, lower prices and volume,” Richard McDonald said. Taking a cue from Henry Ford’s assembly-line production of automobiles, the McDonald brothers developed the “Speedee Service System” and mechanized the kitchen of their roadside burger shack. Each member of its 12-person crew specialized in specific tasks, and much of the food was preassembled. This allowed McDonald’s to prepare its food quickly—and even ahead of time—when an order was placed. All hamburgers were served with ketchup, mustard, onions and two pickles, and any customers who wanted food prepared their way would have to wait.

Original McDonald's

 

The original McDonald’s restaurant, featuring a ten-item menu built around a 15-cent hamburger, in San Bernardino, California, circa 1955.

CSU Archives/Everett

“You make a point of offering a choice and you’re dead,” Richard McDonald told The Chicago Tribune in 1985. “The speed’s gone.”

According to Love, the first customer at the newly reopened McDonald’s was a 9-year-old girl ordering a bag of hamburgers. The retooled restaurant struggled at first, though, and fired carhops heckled the brothers. Once McDonald’s replaced potato chips with french fries and introduced triple-thick milkshakes, however, the business began to take off with families and businessmen drawn by the cheap, 15-cent hamburgers and the low-cost menu.


McDonald’s begins to franchise 

With labor costs slashed and revenue growing to $350,000 a year by the early 1950s, the McDonald brothers saw their profits double. They had already established a handful of franchises in California and Arizona by the time a milkshake mixer salesman named Ray Kroc visited San Bernardino in 1954. Kroc couldn’t understand why the McDonalds could possibly need eight of his Multi-Mixers, capable of making 48 milkshakes at once, for just one location until he set eyes on the operation.

Seeing the potential in the business, the salesman quickly became the buyer. Kroc bought the rights to franchise the brothers’ restaurants across the country, and in 1955 he opened his first McDonald’s in Des Plaines, Illinois.

The First McDonald's

 

Exterior view of the first McDonald’s fast food restaurant with its neon arches illuminated at night, in Des Plaines, Illinois, circa 1955.

Hulton Archive/Getty Images

The relationship between Kroc and the McDonald brothers quickly grew very contentious as the aggressive salesman and the conservative Yankees had different philosophies about how to run their business. Kroc chafed at the requirement that he receive a registered letter from the McDonalds to make any changes to the retail concept—something the brothers were reluctant to grant. “It was almost as though they were hoping I would fail,” Kroc wrote in his 1977 autobiography, Grinding It Out.

Ray Kroc becomes the owner of the company 

In 1961, Kroc purchased the company from the McDonald brothers for $2.7 million. While the name of the chain may have been McDonald’s, the face of the restaurants quickly became Kroc’s. Plaques with his likeness were mounted on the walls of many franchises with a description of how “his vision, persistence and leadership have guided McDonald’s from one location in Des Plaines, Illinois to the world’s community restaurant.”

Ray Kroc of McDonald’s

 

Fred Turner and Ray Kroc, executive leaders of McDonald’s Corporation, looking at blueprints of a future restaurant in 1975.

McDonald’s/Everett

The brothers who lent their name to the business and pioneered the fast-food concept faded into the background. After selling the business, the founders kept their original San Bernardino restaurant, to Kroc’s annoyance, and renamed it “Big M,” with the golden arches on the marquee sharpened to form a giant letter “M.” To gain his revenge, Kroc opened a McDonald’s around the block that eventually drove the brothers out of business.

The original McDonald’s was torn down in the 1970s and later replaced by a nondescript building that housed the San Bernardino Civic Light Opera. In 1998, it became the headquarters of a regional fast-food chain, Juan Pollo Chicken, which operates a small unofficial museum with McDonald’s artifacts inside.

The Extra-Long History of the Hot Dog

H/T History.com.

Hot dogs: you either love them or you hate them.

From ancient Roman sausage to Nathan’s Coney Island hot dog, the history of tubular meat may stretch back millennia.
 

The hot dog, a quintessential American summer grill food, has origins that may go back millennia.

Historians believe its beginnings can be traced to the era of the notorious Roman emperor Nero, whose cook, Gaius, may have linked the first sausages. In ancient Rome, it was customary to starve pigs for one week before the slaughter. As the legend goes, Gaius was watching over his kitchen when he realized that one pig had been brought out fully roasted, but somehow not cleaned.

He stuck a knife into the belly to see if the roast was edible, and out popped the intestines: empty because of the starvation diet, and puffed from the heat. According to legend, Gaius exclaimed, “I have discovered something of great importance!” He stuffed the intestines with ground game meats mixed with spices and wheat—and the sausage was created.

After that, the sausage traveled across Europe, making its way eventually to present-day Germany. The Germans adopted the sausage as their own, creating scores of different versions to be enjoyed with beer and kraut. In fact, two German towns vie to be the original birthplace of the modern hot dog. Frankfurt claims the frankfurter was invented there over 500 years ago, in 1484, eight years before Columbus set sail for America. But the people of Vienna (Wien, in German) say they are the true originators of the “wienerwurst.” 

No matter which town might have originated this particular sausage, it’s generally agreed that German immigrants to New York were the first to sell wieners, from a pushcart, in the 1860s.

 


 

Two boys greedily eat hot dogs.

 

Credit: Hulton-Deutsch Collection/CORBIS/Corbis via Getty Images

The man most responsible for popularizing the hot dog in the United States was, however, neither German nor Austrian. His name was Nathan Handwerker, a Jewish immigrant from Poland. In 1915, Handwerker worked at a hot dog stand at Coney Island, where he made a whopping $11 a week slicing buns. The hardworking Handwerker lived entirely on hot dogs and slept on the kitchen floor for a year until he’d saved $300, enough to start a competing stand. He was a savvy businessman: Knowing his former boss charged 10 cents apiece for dogs, Handwerker charged only 5 cents. Customers flocked to him, his competitor went out of business, and Nathan’s Famous was born.

By the Depression, Nathan’s hot dogs were known throughout the United States. In fact, they were so prized as delicious, all-American eats that they were even served to royalty. When President Franklin Roosevelt hosted King George VI of England and his queen at a picnic in Hyde Park in 1939, first lady Eleanor decided to make grilled hot dogs part of the menu, a choice that received much press coverage at the time. 

One month before the picnic, Mrs. Roosevelt mentioned the hubbub in her syndicated newspaper column. “So many people are worried that the dignity of our country will be imperiled by inviting royalty to a picnic, particularly a hot dog picnic!” Ultimately, the hot dogs proved to be a great hit: The king enjoyed them so much he asked for seconds.

 
 

Bankrupt and Dying from Cancer, Ulysses S. Grant Waged His Greatest Battle

H/T History.com.

While dying, Ulysses “Sam” Grant did what most healthy men could not: he wrote his complete memoirs, leaving his family financially well off.

Aided by Mark Twain, the former president and Civil War hero raced to complete a literary masterpiece that saved his wife from destitution.
 

Shortly before noon on May 6, 1884, Ulysses S. Grant entered the office of his Wall Street brokerage firm a wealthy man. Hours later, he exited a pauper.

Thanks to a pyramid scheme operated by his unscrupulous partner, Ferdinand Ward, Grant’s investment firm had instantly collapsed, wiping out his life savings. “When I went downtown this morning I thought I was worth a great deal of money, now I don’t know that I have a dollar,” the swindled Civil War hero lamented to a former West Point classmate. In fact, Grant had all of $80 to his name. His wife, Julia, had another $130. Kind-hearted strangers responded by mailing Grant checks. Desperate to pay his bills, the former U.S. president cashed them.

Still smarting from bankruptcy’s bitter sting, Grant that summer suffered from an excruciating sting in his throat as well. When he finally visited a doctor in October, Grant learned he had incurable throat and tongue cancer, likely a product of his longtime cigar-smoking habit. 

Grant had been no stranger to financial misfortune. Failing as a farmer and a rent collector prior to the Civil War, he lived in a log cabin that he dubbed “Hardscrabble” and sold firewood on the streets of St. Louis to make ends meet. However, now that he was confronting the terrifying prospect of leaving Julia a penniless widow, the grizzled general who fought to save the Union undertook one final mission to save his family from impoverishment.

Mark Twain paid Grant to publish his memoirs

Ulysses S. Grant and his family

 

Former Civil War General and U.S. President Ulysses S. Grant (1822-1885) sits (center, with top hat) for a family portrait with his wife, Julia Dent Grant, and their children and grandchildren at the family’s seaside cottage in Long Branch, New Jersey, circa 1883.

Oscar White/Corbis/VCG/Getty Images

Divested of his property and possessions, Grant still retained something of great value—his recollections of past glories. Lurking behind the taciturn façade was a convivial storyteller who entertained friends such as Mark Twain with yarns of war and politics. “While we think of Grant as silent and reserved, he was a captivating raconteur with a dry wit and a ready fund of stories,” says Ron Chernow, Pulitzer Prize-winning author of Grant.

For years Twain had suggested that Grant pen his memoirs. Now destitute, the former president finally agreed to cash in on his celebrity. In need of financial rescue himself after a series of failed investments, the debt-ridden Twain inked Grant to a contract with his newly launched publishing house and gave him a $1,000 check to cover living expenses.

Engaged in a furious race against time as the cancer attacked his body, Grant dug into his writing with military efficiency, churning out as many as 10,000 words in a single day. “Grant approached his memoirs with the same grit and determination as he tackled his Civil War battles,” says Chernow, who also serves as executive producer of HISTORY’s documentary series “Grant.” “As in those encounters, he was thorough and systematic, a real stickler for precision and the truth. In his home, he amassed tall stacks of orders and maps that helped him to recreate his most famous battles with minute fidelity. In war and in writing, Grant had the most amazing ability to marshal all his energy in the pursuit of a single goal.”

Grant astounded Twain with not just the quantity, but the quality of his prose. “Grant prided himself on his writing skills,” Chernow says. “His wartime orders were renowned for their economy and exactness, and he made a point of writing all his own speeches as president—something unthinkable today.”

 

With just weeks to live, Grant made one final push

Ulysses S. Grant

 

Ulysses S. Grant reading on a house porch, thought to be the last photograph taken before his death, 1885.

Afro American Newspapers/Gado/Getty Images

Grant penned his manuscript until his hand grew too feeble in the spring of 1885, forcing him to employ a stenographer. Even speaking, however, became laborious as his condition deteriorated. Following the advice of doctors who vouched for the salubrious power of pure mountain air, Grant decamped at the onset of summer from his Manhattan brownstone to an Adirondack resort north of Saratoga Springs. In a cottage on the slopes of Mount McGregor, Grant launched his final campaign to complete his tome.

With excruciating pain accompanying every swallow, Grant was unable to eat solid food. His body withered by the day. The voice that once commanded armies could barely muster a whisper. “He endured great pain with incredible stoicism,” says Ben Kemp, operations manager at the U.S. Grant Cottage State Historic Site. While Grant’s doctors gave him morphine only sparingly in order to keep his mind clear for writing, they swabbed his throat with cocaine to provide topical pain relief and used hypodermic needles to inject him with brandy during the worst of his coughing fits.

Through it all, Grant persisted in honing his manuscript—editing, adding new pages and poring over proofs of his first volume—as he sat on the cottage porch on even the steamiest of days swaddled in blankets, a wool hat and a scarf covering his neck tumor, which was now “as big as a man’s two fists put together” according to the New York Sun. When his voice finally abandoned him, Grant scribbled his thoughts in pencil on small slips of paper.

When Twain visited Grant at the cottage, he brought the good news that he had already pre-sold 100,000 copies of the autobiography. A relieved Grant knew he had succeeded in giving Julia and his children financial security. “Taking care of his family is all that mattered at that point,” Kemp says. “Grant knew at that moment this was going to be a success. Like in a battle, that was the moment he knew the tide had turned.”

With his mission accomplished, Grant finally laid down his pen on July 16 after crafting a herculean 366,000 words in less than a year. “There is nothing more I should do to it now, and therefore I am not likely to be more ready to go than at this moment,” he wrote. Seven days later, Grant’s pulse flickered and ultimately gave out.

Grant’s autobiography was a commercial and literary smash

Ulysses S. Grant memoir

 

The personal memoirs of Ulysses S. Grant.

MPI/Getty Images

Employing an army of door-to-door salesmen, Twain sold more than 300,000 copies of the Personal Memoirs of Ulysses S. Grant. The two-volume boxed set even outsold Twain’s latest work, Adventures of Huckleberry Finn, and resulted in Julia Grant receiving $450,000 in royalties (equivalent to $12 million today).

Grant’s memoir proved not just a commercial success, but a literary one as well. Although Grant omitted discussion of his presidency or sensitive personal matters such as his drinking, many scholars consider his autobiography the finest memoir ever penned by an American president and perhaps the foremost military memoir in the English language. “The emotion of the situation probably lent energy and eloquence to his work,” Chernow says. “In all likelihood, he had narrated many of these stories in the years since the war and they had acquired a certain smoothness and polish in the retelling.”

“There was no doubt among his family and friends that Grant had willed himself to stay alive to complete the book,” Chernow says. “He may have originally undertaken the memoirs to provide for his wife after his death, but it must also have soothed and consoled him at the end of his life to recount his glorious victories in the Civil War.”

The Old Headquarters of Murder, Inc.

H/T Atlas Obscura.

Imagine running a murder-for-hire operation out of a candy store.

This otherwise innocuous bodega was once the headquarters of the most feared assassins’ guild in American history.

The Old Headquarters of Murder Inc. (Photo: Luke J Spencer, Atlas Obscura user)

 

ON THE CORNER OF SARATOGA and Livonia Avenues in Brownsville, Brooklyn, there used to be a 24-hour candy store.

During the 1930s and ’40s, the Midnight Rose Candy Store, located under the elevated portion of the 3 subway train, was run by a little old lady in her 60s, Mrs. Rosie Gold. It’s hard to imagine a more innocuous and unoffending picture. But all was not as it seemed at the Midnight Rose, for this was the secret headquarters of one of the most infamous and deadly groups in the history of organized crime: Murder, Inc.

Murder Incorporated was created in the 1930s to act as the execution squad of the newly formed National Crime Syndicate. The death squad was comprised mostly of Jewish and Italian gangsters centered around the Brownsville neighborhood of Brooklyn. Exact numbers aren’t known, but it is estimated that Murder Inc. carried out 400 to 1,000 executions, making the innocent-looking candy store responsible for more murders than anywhere else in the United States.

The National Crime Syndicate was the ruling elite of East Coast organized crime, counting amongst its ruthless members Lucky Luciano, Meyer Lansky, Bugsy Siegel and Dutch Schultz. Murder Inc. was run by Louis “Lepke” Buchalter and Albert Anastasia, known as the “Lord High Executioner.” The killers were paid a basic retainer plus a freelance fee of anywhere from $1,000 to $5,000 for each hit. Rosie Gold kept a row of pay phones along the back wall of the candy store; the members of Murder Inc. would pass the time at the Midnight Rose, sipping Rosie’s malted milks, until one of the phones rang with the details of a hit.

The group included such cold-blooded killers as Abe “Kid Twist” Reles, “Pittsburgh Phil” Strauss, Allie “Tic Toc” Tannenbaum, and Martin “Buggsy” Goldstein. Because the murders were carried out by men unknown to the victims, Murder Inc. was able to remain unassociated with executions that took place all along the East Coast, and as far west as Detroit. Victims were often dispatched at the whim of the National Crime Syndicate with Murder Inc.’s preferred weapon, the ice pick.

The assassins eventually met their downfall in 1940, when “Kid Twist” Reles was caught and turned police informant. After he ratted on his colleagues, the majority of Murder Inc.’s members went on to meet their own grisly ends in the electric chair at Sing Sing. On November 12, 1941, while staying at the Half Moon Hotel in Coney Island under 24-hour police supervision, Kid Twist was found dead after falling seven stories from his room. The case was never solved, but it seems likely that he paid the price for turning informant. As one member of the National Crime Syndicate is supposed to have said of Reles, “the canary could sing, but he couldn’t fly.”

With most of its members in prison or executed, Murder Inc. faded into memory; but immortalized in film, literature, and television shows such as “Boardwalk Empire” and Neil Kleid & Jake Allen’s comic book “Brownsville” (2006), its name has passed into legend.

Rosie Gold disappeared into obscurity, but her corner candy store is still there, on the corner of Saratoga and Livonia. For the past three years it has been a 24-hour bodega. The current owner, when interviewed, had no idea that his corner deli was once the headquarters of one of the most feared collections of assassins in American history. The row of telephones is gone, and with it the ring of an incoming call that would have meant the end of someone’s life at the deadly hands of Murder Incorporated.

The “Real” Cardiff Giant

H/T Atlas Obscura.

It turned out to be a giant hoax. Pun intended.

Fort Dodge Museum
Fort Dodge, Iowa

Hidden in the grounds of the Fort Dodge Museum is a “real” and terrifying petrified giant.

 

THE FORT DODGE MUSEUM, AS far as historical reconstructions go, is pretty conventional. This well-curated reconstruction of a pre-Civil War military fort has been around for decades. Historic additions like log cabins, a one-room schoolhouse, and a general store are standard fare in these places. But there is something unusual, a hidden treasure at the back of the grounds, that makes this place special.

An octagonal building stands just outside the back gate of the fort, with no particular signage to warn you of what you are about to experience. Entering the front door, you come to a significant depression below floor level. The building was constructed around a now-walled pit, the site of what appears to be a significant archaeological dig. Within the pit lies a petrified giant, over 10 feet tall. Everything about the giant is giant: his ears are huge, his head is massive, his arms lengthy, his feet colossal, his intimates… well, everything is huge.

In 1868, New York tobacconist George Hull, visiting his sister in a nearby Iowa town, started a debate with a traveling revivalist who loved to quote scripture in Genesis about giants in the earth. An idea sprang into Hull’s head: a grand hoax that might make him some money and expose the fool-preachers buzzing around America. Hull was a fired-up agnostic, an instigator, and an avid entrepreneur.

He traveled to the gypsum quarry in Ft. Dodge, bought a five-ton block of rock for a barrel of beer, and had it shipped to Chicago, where he hired an artist and stonecutter to create a giant in stone. He instructed them to take the utmost care in aging the stone and making it look as real as possible. The petrified “giant” was shipped to a farm near Cardiff, New York, and buried in a field for a year. With a few paid insiders, Hull pulled off one of the greatest hoaxes in American history. For over a year, Hull fooled scientists, scholars, preachers, and common folk into believing they had found the eighth wonder of the world. Hull especially loved preachers’ proclamations of the petrified giant as clear proof that the book of Genesis was true.

Hull made boatloads of money from people paying a dollar apiece to see the archaeological miracle. Mark Twain wrote a couple of articles and a short story about the Petrified Man; Twain appreciated a good gaff on the American people. P.T. Barnum tried to lease the Cardiff Giant, as it became known, but when rebuffed by Hull, he had his own made, declaring it the real giant. Years after the hoax was exposed (the giant became known as “Old Hoaxy”), it continued to travel across the U.S., displayed at state fairs, carnivals, and even a World’s Fair. Today, Hull’s original giant resides in the Farmers’ Museum in Cooperstown, New York. Barnum’s fake of the fake is in Marvin’s Marvelous Mechanical Museum in Farmington Hills, Michigan.

Fort Dodge’s own giant was “discovered” in a block of Fort Dodge gypsum, right on this spot, exactly 100 years after the first fake giant was found in New York. Artist Cliff Carson said that as he took his chisel to the gypsum, the stone fell away revealing a giant big toe. A few more hits revealed a huge foot. He had discovered the real giant, encased in gypsum. A Blue-Ribbon Commission, after carefully examining the body, declared this was a true petrified stone giant.

A few minutes with the giant will make you a believer in the real Cardiff Giant. All the others are hoaxes.

Know Before You Go

The Fort Dodge Museum is open every day from 10:00 a.m. to 5:00 p.m. Entrance fees range from free to $10. There is nothing on the website about the giant; this keeps it a surprise for visitors.

The Reason Why a Standard Piece of Paper Is 8.5 Inches by 11 Inches

H/T Mental Floss.

I wondered how these dimensions were arrived at.

Whether you’re printing documents at home or the office, you know it’s wise to keep reams of paper on hand. But when you visit an office supply store, there isn’t a wealth of options. The standard paper size is 8.5 inches wide by 11 inches long.

Which invites a bit of pulpy pondering: Who decided that?

According to Marketplace writer Jack Stewart, the answer resides in the early days of paper production, when workers dipped wooden paper molds into vats of pulp and water. Once the sheets dried, they had paper.

This technique was pioneered by the Dutch papermakers of the 1660s. Through trial and error, the frames settled into a standard length of 44 inches, which accommodated the outstretched arms of the laborers. Divided by four, that gave papermakers a sheet length of 11 inches.

The width is a little murkier. It may be that the Dutch allowed 17 inches on the mold to make room for watermarks. Cutting that in half yielded sheets 8.5 inches wide.

But that’s only part of the standardization process. With people using typewriters, copiers, and printers, it made little sense to have multiple sizes of paper available. There needed to be a one-size-fits-all philosophy. Paper measuring 8.5 inches by 11 inches fit the bill—or, in this case, the typewriter. The size allows a comfortable 65 to 78 characters per line across the 6.5 inches of type that remain after subtracting 1-inch margins.
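That arithmetic is easy to verify. Here is a minimal sketch of the numbers above; note that the pica (10 characters per inch) and elite (12 characters per inch) typewriter pitches are standard values I'm assuming, not figures from the article:

```python
# Rough arithmetic behind the 8.5" x 11" sheet, following the account above.

mold_length_in = 44                   # Dutch mold length, sized to workers' arm span
sheet_length_in = mold_length_in / 4  # 44 / 4 = 11 inches

mold_width_in = 17                    # width said to leave room for watermarks
sheet_width_in = mold_width_in / 2    # 17 / 2 = 8.5 inches

# Usable line width after subtracting a 1-inch margin on each side.
usable_width_in = sheet_width_in - 2 * 1.0  # 6.5 inches

# Assumed typewriter pitches: pica = 10 chars/inch, elite = 12 chars/inch.
for name, chars_per_inch in [("pica", 10), ("elite", 12)]:
    print(f"{name}: {usable_width_in * chars_per_inch:.0f} characters per line")

# Output:
# pica: 65 characters per line
# elite: 78 characters per line
```

That range of 65 to 78 characters matches the comfortable line lengths the article describes.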

This size became more ubiquitous when Presidents Herbert Hoover and Ronald Reagan both mandated the dimensions for all government forms in the 1920s and 1980s, respectively. (Hoover was looking to minimize paper waste.)

There is another standardized length of paper—14 inches. That extra 3 inches is thought to be the result of lawyers needing more room for wordy contracts, which is why it’s often referred to as legal-size paper. It’s gotten increasingly popular in restaurants, where the additional space is helpful to list menu items. Most paper, however, is the result of craftspeople who simply couldn’t get their arms around anything else.

[h/t Marketplace]

History of the Lottery in the United States

H/T Lottery.net.

When did the lottery start in the U.S.? It’s been a part of life since at least 1776, when the Continental Congress voted to use a lottery to raise money for the War of Independence.

Although the idea didn’t end up being used, lotteries were a popular way to raise funds in early America for expenses like paving roads, building wharves, and even constructing churches.

No one invented the lottery in America, because it was already used in England and spread to the New World. In fact, the Jamestown colony was partly financed by private lotteries in the 1600s.

A number of the Founding Fathers promoted lotteries, mostly unsuccessfully.

In 1768, George Washington held a lottery to fund building the Mountain Road in Virginia, but it failed.

Benjamin Franklin also unsuccessfully tried to use a lottery to buy cannon to defend Philadelphia during the Revolutionary War.

Thomas Jefferson was also a fan of lotteries. “Far from being immoral, they are indispensable to the existence of man,” he wrote. In 1826, the Virginia legislature gave Jefferson permission to conduct a private lottery to pay off his many debts. He died before it could be held, but it was unsuccessfully attempted by his children.

John Hancock was the exception to the rule, successfully using a lottery to finance the rebuilding of Faneuil Hall in Boston after it burned down in 1761.

Lotteries were widespread in the early American republic. In 1832 it was reported that 420 lotteries had been held in eight states during the previous year.

Lotteries also helped fund many college buildings, including at Columbia, Dartmouth, Harvard, and Yale.

After the Civil War, the Southern states used lotteries to finance Reconstruction. However, corruption by the private lottery organizers led to increasing opposition.

In 1868 Congress outlawed the use of the mail for lottery advertising “or other similar enterprises on any pretext whatsoever.” In 1878, the Supreme Court decided that lotteries had “a demoralizing influence upon the people.”

The Louisiana lottery leads to ban

However, the most successful lottery in the country continued to flourish. The Louisiana lottery was privately run by the Louisiana Lottery Company. At its height it was estimated to achieve sales of over $20 million per year. Prizes in the monthly drawings went up to the princely sum of $250,000, and twice a year special prizes could rise to $600,000.

The company had agents in every U.S. city, and 93% of its revenue came from out of state. Special trains were needed to transport the huge volume of mail, including thousands of ticket receipts, sent to the company’s headquarters in New Orleans.

The company gained a monopoly as Louisiana’s lottery provider in 1868 through the extravagant bribes paid by its founder, Charles T. Howard. In exchange it was allowed to keep all lottery proceeds tax-free.

Howard became a very powerful figure in Louisiana, although he wasn’t popular with everyone. The Metairie Jockey Club wouldn’t let him become a member, so when their racecourse ran into trouble, Howard purchased it and turned it into a cemetery – where he is buried in a huge tomb.

Despite paying thousands in bribes, the company still made an impressive 48% profit. One reason for this was that if there were unsold tickets before a drawing, they were put into the barrel the winning numbers were drawn from (the drawings were overseen by two former Confederate generals, Jubal Early and P.G.T. Beauregard). In many cases, this trick led to the company winning its own prize money.

In 1890, the lottery’s charter was up for renewal, and company officials bribed lawmakers to put the lottery in the state Constitution. However, this required a public vote, and furious citizens rejected the amendment.

The federal government had also had enough. President Benjamin Harrison denounced lotteries as “swindling and demoralizing agencies” and Congress banned sending lottery tickets by mail or taking them across state lines, finishing off the lottery.

As the abuses of the Louisiana lottery became known, they caused a huge national scandal and the public soured on lotteries.

States legalize lotteries in the twentieth century

Opinion on lotteries began to soften again during the early twentieth century, especially after the disaster of Prohibition, which ran from 1920-1933 and involved widespread organized crime related to illegal alcohol operations.

Nevada made casinos legal in the 1930s, and betting to benefit charity became more widespread throughout the country. However, the lingering memory of the Louisiana scandal kept lotteries from gaining public support for another thirty years.

In 1963, the New Hampshire legislature allowed a sweepstakes to raise money for education. The funds were badly needed because the state had no income or sales tax to finance educational programs.

Based on the popular Irish Sweepstakes, a ticket cost $3 and the winners of horse races at the Rockingham Park racecourse determined the biggest prizes. Despite the drawings not being held regularly, almost $5.7 million worth of tickets were sold in the first year.

Not to be outdone, New York started its own lottery in 1967. It proved spectacularly successful, bringing in $53.6 million in its first year. Just like today, residents of neighboring states without lotteries were tempted to cross the border to buy tickets.

The success of the New York lottery didn’t go unnoticed, and twelve more states introduced their own lotteries in the 1970s – Connecticut, Delaware, Illinois, Maine, Maryland, Massachusetts, Michigan, New Jersey, Ohio, Pennsylvania, Rhode Island, and Vermont.

Why was the Northeast such fertile ground for lotteries? There seem to be three reasons.

First, the states needed money but didn’t want to take the always-unpopular step of raising taxes.

Second, each state had a large Catholic population that widely tolerated gambling.

Third, there was a domino effect: a state is much more likely to start a lottery if a nearby state already has one. The governor of North Carolina, Michael Easley, expressed a popular view when he promoted a lottery by saying, “Our people are playing the lottery. We just need to decide which schools we should fund, other states’ or ours.”

Most lotteries in the 1970s were extremely slow-paced by today’s standards. In 1974, Massachusetts introduced the first instant win game using scratch-off tickets, but the majority of lotteries were “passive drawing games” – basic raffles where tickets printed with a number were sold. Players often had to wait weeks for a drawing, so the suspense must have been intense!

In the 1980s the lottery boom intensified, with seventeen more states plus the District of Columbia taking part: Arizona, California, Colorado, Florida, Idaho, Indiana, Iowa, Kansas, Kentucky, Missouri, Montana, Oregon, South Dakota, Virginia, Washington, West Virginia, and Wisconsin.

The 1990s brought a further expansion of the lottery to six states – Georgia, Louisiana, Minnesota, Nebraska, New Mexico, and Texas.

In the 2000s they were joined by North Carolina, North Dakota, Oklahoma, South Carolina, and Tennessee.

Lotteries today

Lotteries have come a long way from the 1960s – so what types of lotteries are there? They come in a variety of forms, from the instant-win scratch-off cards to multi-state draw games like Powerball and Mega Millions.

There’s something to appeal to every kind of player, whether you want instant gratification, more chances to win, or the potential for a bigger prize.

Research shows that the majority of the U.S. public approves of lotteries. Even many people who don’t buy tickets themselves still have a positive view. In a 2014 Gallup poll, 62% said gambling is “morally acceptable.” State lotteries are the most common type of gambling in the country, with about half of those polled saying they had bought a lottery ticket in the past 12 months.

Which states have the lottery today? Currently, there are only five states that do not have lotteries: Alabama, Alaska, Hawaii, Nevada, and Utah.

Alabama could be the next state to introduce lotteries, and there are also persistent attempts to pass a lottery bill in Hawaii. In the past, Alaska had enough oil money that it didn’t need a lottery, but views about an Alaska lottery may be changing since the state has recently been short of revenue.

How much money does the lottery make a year?

In 2017 Americans spent $73.5 billion on lottery tickets. That’s about $230 per year for every person in the country, which is an increase from the previous year. The total increases to $80 billion when electronic lottery games are counted.
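As a quick sanity check on that per-person figure (a minimal sketch; the roughly 320 million U.S. population for 2017 is my assumption, not a number from the article):

```python
# Check the per-capita lottery spending claim above.
total_spend = 73.5e9  # 2017 U.S. lottery ticket sales, per the article
population = 320e6    # assumed approximate 2017 U.S. population

print(f"${total_spend / population:.0f} per person per year")  # -> $230
```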

The state with the highest lottery revenue was New York, which took in $8,344,023,000 in 2016.

So it’s no surprise that there are only a handful of states and territories without lotteries, because the lottery is a big benefit to state budgets. It’s an attractive way to raise money without raising taxes.

Lotteries are accepted by the public where they have been introduced as long as they contribute towards the common good, such as education programs and college scholarships.

Lottery proponents argue that states like Alabama lose a lot of money from residents who cross the border into neighboring states to buy lottery tickets.

Lottery retailers near the border in states like Florida or Louisiana do a roaring trade, especially when there’s a big Powerball or Mega Millions jackpot.

The argument that the money could be spent locally instead and benefit good causes in-state is persuasive to many residents.

What You Didn’t Know About the Apollo 11 Mission

H/T Smithsonian Magazine.

From JFK’s real motives to the Soviets’ secret plot to land on the Moon at the same time, a new behind-the-scenes view of an unlikely triumph 50 years ago

The Moon has a smell. It has no air, but it has a smell. Each pair of Apollo astronauts to land on the Moon tramped lots of Moon dust back into the lunar module—it was deep gray, fine-grained and extremely clingy—and when they unsnapped their helmets, Neil Armstrong said, “We were aware of a new scent in the air of the cabin that clearly came from all the lunar material that had accumulated on and in our clothes.” To him, it was “the scent of wet ashes.” To his Apollo 11 crewmate Buzz Aldrin, it was “the smell in the air after a firecracker has gone off.”

 

All the astronauts who walked on the Moon noticed it, and many commented on it to Mission Control. Harrison Schmitt, the geologist who flew on Apollo 17, the last lunar landing, said after his second Moonwalk, “Smells like someone’s been firing a carbine in here.” Almost unaccountably, no one had warned lunar module pilot Jim Irwin about the dust. When he took off his helmet inside the cramped lunar module cabin, he said, “There’s a funny smell in here.” His Apollo 15 crewmate Dave Scott said: “Yeah, I think that’s the lunar dirt smell. Never smelled lunar dirt before, but we got most of it right here with us.”

Moon dust was a mystery that the National Aeronautics and Space Administration had, in fact, thought about. Cornell University astrophysicist Thomas Gold warned NASA that the dust had been isolated from oxygen for so long that it might well be highly chemically reactive. If too much dust was carried inside the lunar module’s cabin, the moment the astronauts repressurized it with air and the dust came into contact with oxygen, it might start burning, or even cause an explosion. (Gold, who correctly predicted early on that the Moon’s surface would be covered with powdery dust, also had warned NASA that the dust might be so deep that the lunar module and the astronauts themselves could sink irretrievably into it.)

Among the thousands of things they were keeping in mind while flying to the Moon, Armstrong and Aldrin had been briefed about the very small possibility that the lunar dust could ignite. “A late-July fireworks display on the Moon was not something advisable,” said Aldrin.

Armstrong collected the fragment of fine-grained basalt pictured on the left. Lunar rocks were stored onboard in stainless steel vacuum containers (NASA). On the right, Buzz Aldrin and Neil Armstrong participate in a simulation of deploying and using lunar tools on the surface of the Moon during an April 1969 training exercise. Aldrin (left) uses scoop and tongs to pick up a sample while Armstrong holds a bag to receive the sample in front of a Lunar Module mockup. Both are wearing Extravehicular Mobility Units. (NASA)

Armstrong and Aldrin did their own test. Just a moment after he became the first human being to step onto the Moon, Armstrong had scooped a bit of lunar dirt into a sample bag and put it in a pocket of his spacesuit—a contingency sample, in the event the astronauts had to leave suddenly without collecting rocks. Back inside the lunar module the duo opened the bag and spread the lunar soil on top of the ascent engine. As they repressurized the cabin, they watched to see if the dirt started to smolder. “If it did, we’d stop pressurization, open the hatch and toss it out,” Aldrin explained. “But nothing happened.”

The Moon dust turned out to be so clingy and so irritating that on the one night that Armstrong and Aldrin spent in the lunar module on the surface of the Moon, they slept in their helmets and gloves, in part to avoid breathing the dust floating around inside the cabin.

By the time the Moon rocks and dust got back to Earth—a total of 842 pounds from six lunar landings—the odor was gone from the samples, exposed to air and moisture in their storage boxes. No one has quite figured out what caused the odor to begin with, or why it was so like spent gunpowder, which is chemically nothing like Moon rock. “Very distinctive smell,” Apollo 12 commander Pete Conrad said. “I’ll never forget. And I’ve never smelled it again since then.”

In 1999, as the century was ending, the historian Arthur Schlesinger Jr. was among a group of people who were asked to name the most significant human achievement of the 20th century. In ranking the events, Schlesinger said, “I put DNA and penicillin and the computer and the microchip in the first ten because they’ve transformed civilization.” But in 500 years, if the United States of America still exists, most of its history will have faded to invisibility. “Pearl Harbor will be as remote as the War of the Roses,” said Schlesinger. “The one thing for which this century will be remembered 500 years from now was: This was the century when we began the exploration of space.” He picked the first Moon landing, Apollo 11, as the most significant event of the 20th century.

The trip from one small planet to its smaller nearby moon might someday seem as routine to us as a commercial flight today from Dallas to New York City. But it is hard to argue with Schlesinger’s larger observation: In the chronicle of humanity, the first missions by people from Earth through space to another planetary body are unlikely ever to be lost to history, to memory, or to storytelling.

The leap to the Moon in the 1960s was an astonishing accomplishment. But why? What made it astonishing? We’ve lost track not just of the details; we’ve lost track of the plot itself. What exactly was the hard part?

The answer is simple: When President John F. Kennedy declared in 1961 that the United States would go to the Moon, he was committing the nation to do something we simply couldn’t do. We didn’t have the tools or equipment—the rockets or the launchpads, the spacesuits or the computers or the micro-gravity food. And it isn’t just that we didn’t have what we would need; we didn’t even know what we would need. We didn’t have a list; no one in the world had a list. Indeed, our unpreparedness for the task goes a level deeper: We didn’t even know how to fly to the Moon. We didn’t know what course to fly to get there from here. And as the small example of lunar dirt shows, we didn’t know what we would find when we got there. Physicians worried that people wouldn’t be able to think in micro-gravity conditions. Mathematicians worried that we wouldn’t be able to calculate how to rendezvous two spacecraft in orbit—to bring them together in space and dock them in flight both perfectly and safely.

On May 25, 1961, when Kennedy asked Congress to send Americans to the Moon before the 1960s were over, NASA had no rockets to launch astronauts to the Moon, no computer portable enough to guide a spaceship to the Moon, no spacesuits to wear on the way, no spaceship to land astronauts on the surface (let alone a Moon car to let them drive around and explore), no network of tracking stations to talk to the astronauts en route.

“When [Kennedy] asked us to do that in 1961, it was impossible,” said Chris Kraft, the man who invented Mission Control. “We made it possible. We, the United States, made it possible.”

Ten thousand problems had to be solved to get us to the Moon. Every one of those challenges was tackled and mastered between May 1961 and July 1969. The astronauts, the nation, flew to the Moon because hundreds of thousands of scientists, engineers, managers and factory workers unraveled a series of puzzles, often without knowing whether the puzzle had a good solution.

trajectory of Apollo 11 mission
A computer-generated illustration shows the trajectory of the Apollo 11 mission and the stages of the spacecraft from launch to orbit and return. (Claus Lunau / Science Source)

 

In retrospect, the results are both bold and bemusing. The Apollo spacecraft ended up with what was, for its time, the smallest, fastest and most nimble computer in a single package anywhere in the world. That computer navigated through space and helped the astronauts operate the ship. But the astronauts also traveled to the Moon with paper star charts so they could use a sextant to take star sightings—like 18th-century explorers on the deck of a ship—and cross-check their computer’s navigation. The software of the computer was stitched together by women sitting at specialized looms—using wire instead of thread. In fact, an arresting amount of work across Apollo was done by hand: The heat shield was applied to the spaceship by hand with a fancy caulking gun; the parachutes were sewn by hand, and then folded by hand. The only three staff members in the country who were trained and licensed to fold and pack the Apollo parachutes were considered so indispensable that NASA officials forbade them to ever ride in the same car, to avoid their all being injured in a single accident. Despite its high-tech aura, we have lost sight of the extent to which the lunar mission was handmade.

The race to the Moon in the 1960s was, in fact, a real race, motivated by the Cold War and sustained by politics. It has been only 50 years—not 500—and yet that part of the story too has faded.

One of the ribbons of magic running through the Apollo missions is that an all-out effort born from bitter rivalry ended up uniting the world in awe and joy and appreciation in a way it had never been united before and has never been united since.

 

The mission to land astronauts on the Moon is all the more compelling because it was part of a decade of transformation, tragedy and division in the United States. The nation’s lunar ambition, we tend to forget, was itself divisive. On the eve of the launch of Apollo 11, civil rights protesters, led by the Rev. Ralph Abernathy, marched on Cape Kennedy.

In that way, the story of Apollo holds echoes and lessons for our own era. A nation determined to accomplish something big and worthwhile can do it, even when the goal seems beyond reach, even when the nation is divided. Kennedy said of the Apollo mission that it was hard—we were going to the Moon precisely because doing so was hard—and that it would “serve to organize and measure the best of our energies and skills.” And measure the breadth of our spirit as well.

Today the Moon landing has ascended to the realm of American mythology. In our imaginations, it’s a snippet of crackly audio, a calm and slightly hesitant Neil Armstrong stepping from the ladder onto the surface of the Moon, saying, “That’s one small step for man, one giant leap for mankind.” It is such a landmark accomplishment that the decade-long journey has been concentrated into a single event, as if on a summer day in 1969, three men climbed into a rocket, flew to the Moon, pulled on their spacesuits, took a few steps, planted the American flag, and then came home.

Cape Kennedy
An aerial view of Cape Kennedy, May 20, 1969, shows the Saturn V rocket as it was transported down the 3.5-mile approach to Launch Complex 39A. (NASA)
 

But the magic, of course, was the result of an incredible effort—an effort unlike any that had been seen before. Three times as many people worked on Apollo as on the Manhattan Project to create the atomic bomb. In 1961, the year Kennedy formally announced Apollo, NASA spent $1 million on the program for the year. Five years later NASA was spending about $1 million every three hours on Apollo, 24 hours a day.

One myth holds that Americans enthusiastically supported NASA and the space program, that Americans wanted to go to the Moon. In fact two American presidents in a row hauled the space program all the way to the Moon with not even half of Americans saying they thought it was worthwhile. The ’60s were tumultuous, riven by the Vietnam War, urban riots, the assassinations. Americans constantly questioned why we were going to the Moon when we couldn’t handle our problems on Earth.

As early as 1964, when asked if America should “go all out to beat the Russians in a manned flight to the Moon,” only 26 percent of Americans said yes. During Christmas 1968, NASA sent three astronauts in an Apollo capsule all the way to the Moon, where they orbited just 70 miles over the surface, and on Christmas Eve, in a live, prime-time TV broadcast, they shared pictures of the Moon’s surface, as seen out their windows. Then the three astronauts, Bill Anders, Jim Lovell and Frank Borman, read aloud the first ten verses of Genesis to what was then the largest TV audience in history. From orbit, Anders took one of the most famous pictures of all time, the photo of the Earth floating in space above the Moon, the first full-color photo of Earth from space, later titled Earthrise, a single image credited with helping inspire the modern environmental movement.

Anticipation for the actual Moon landing should have been extraordinary. In fact, as earlier in the decade, and despite years of saturation coverage of Apollo and the astronauts, it was anything but universal. Four weeks after Apollo 8’s telecast from lunar orbit, the Harris Poll conducted a survey and asked Americans if they favored landing a man on the Moon. Only 39 percent said yes. Asked if they thought the space program was worth the $4 billion a year it was costing, 55 percent of Americans said no. That year, 1968, the war in Vietnam had cost $19.3 billion, more than the total cost of Apollo to that point, and had taken the lives of 16,899 U.S. troops—almost 50 dead every single day—by far the worst single year of the war for the U.S. military. Americans would prove to be delighted to have flown to the Moon, but they were not preoccupied by it.

The big myth of Apollo is that it was somehow a failure, or at least a disappointment. That’s certainly the conventional wisdom—that while the landings were a triumph, the aimless U.S. space program since then means Apollo itself was also pointless. Where is the Mars landing? Where are the Moon bases, the network of orbital outposts? We haven’t done any of that, and we’re decades from doing it now. That misunderstands Apollo, though. The success is the very age we live in now. The race to the Moon didn’t usher in the space age; it ushered in the digital age.


Historians of Silicon Valley and its origins may skip briskly past Apollo and NASA, which seem to have operated in a parallel world without much connection to or impact on the wizards of Intel and Microsoft. But the space program in the 1960s did two things to lay the foundation of the digital revolution. First, NASA used integrated circuits—the first computer chips—in the computers that flew the Apollo command module and the Apollo lunar module. Except for the U.S. Air Force, NASA was the first significant customer for integrated circuits. Microchips power the world now, of course, but in 1962 they were little more than three years old, and for Apollo they were a brilliant if controversial bet. Even IBM decided against using them in the company’s computers in the early 1960s. NASA’s demand for integrated circuits, and its insistence on their near-flawless manufacture, helped create the world market for the chips and helped cut the price by 90 percent in five years.

NASA was the first organization of any kind—company or government agency—anywhere in the world to give computer chips responsibility for human life. If the chips could be depended on to fly astronauts safely to the Moon, they were probably good enough for computers that would run chemical plants or analyze advertising data.

 

NASA also introduced Americans, and the world, to the culture and power of technology—we watched on TV for a decade as staff members at Mission Control used computers to fly spaceships to the Moon. Part of that was NASA introducing the rest of the world to “real-time computing,” a phrase that seems redundant to anyone who’s been using a computer since the late 1970s. But in 1961, there was almost no computing in which an ordinary person—an engineer, a scientist, a mathematician—sat at a machine, asked it to do calculations and got the answers while sitting there. Instead you submitted your programs on stacks of punch cards, and you got back piles of printouts based on the computer’s run of your cards—and you got those printouts hours or days later.

 

The Cabbage Patch Kids’ Twisted History

H/T Yahoo.com.

I saw fights over these dolls; sometimes they were brutal assaults.

 

If you were a kid in the ‘80s, you probably, at some point, had a Cabbage Patch Kid. The soft-bodied baby-like dolls, each of which came with its own adoption certificate, were a worldwide phenomenon, even instigating riots amongst parents at toy stores.


Every doll also came with a signature on its rear end — that of Xavier Roberts, the manufacturer and supposed creator of the Cabbage Patch Kids. But a new Vice mini-documentary, “The Secret History of the Cabbage Patch Kids,” reveals that the dolls were actually invented by artist Martha Nelson Thomas, who died of ovarian cancer in 2013. According to the 16-minute doc, which is part of Vice’s American Obsessions series, Roberts stole the idea after coming across Nelson Thomas’s dolls at a fair, and eventually turned them into a $2 billion franchise.


Artist Martha Nelson Thomas with her ‘doll babies,’ the original Cabbage Patch Kids. (Photo: VICE)

Martha Nelson Thomas started working on her “doll babies” — her name for the line of toys — in the 1970s. “Martha was basically flat-out reinventing the doll,” Guy Mendes, a photographer and friend of Nelson Thomas’s, told Vice. “The doll babies were her brood. She shopped for them. She dressed them. They were expressions of her.”


According to Nelson Thomas’s husband, Tucker Thomas, Roberts first came across the doll babies at a craft fair. “[He] saw Martha’s dolls and purchased one of them,” Thomas told Vice. 

Soon, Roberts adopted the idea as his own and started mass-producing Cabbage Patch Kids in 1982. “He took her idea and he made a fortune,” Mendes said.


Xavier Roberts admitted that his Cabbage Patch Kids, some of which are pictured here, were inspired by Martha Nelson Thomas’s doll babies. (Photo: Molly V/Flickr)

Not only did Roberts create similar dolls to Nelson Thomas’s, but he used the same marketing concept, providing adoption papers with each toy. The one difference? The signature. “Martha did not sign her work. Most of the time she sold it with children in mind,” Thomas said. “Children were taking this doll in as a member of their family and becoming a mom for the doll. And to find somebody’s name stamped on it totally took away from that feel.”

Representatives of Roberts or Cabbage Patch Kids did not respond to Vice’s request for comment. 

According to the doc, Nelson Thomas eventually sued Roberts, who admitted to being inspired by her work but claimed to have created his own design. (Nelson Thomas never copyrighted the product.) The lawsuit, which was filed in 1975 and went to trial in 1985, was eventually settled outside of court, though the amount of the settlement has never been revealed. “She couldn’t tell us what the settlement was but she said her children would go to college,” Mendes said.

“We didn’t want the conflict to go on forever,” Thomas said. “That’s not a good way to live.”


Artist Martha Nelson Thomas sued Xavier Roberts, manufacturer of the Cabbage Patch Kids, after he allegedly copied her doll babies, above. The case settled for an undisclosed amount. (Photo: VICE). 

Nelson Thomas isn’t the first woman to have had her ideas allegedly stolen and turned into a childhood icon. A similar fate befell the inventor of the board game favorite Monopoly. According to The Monopolists: Obsession, Fury, and the Scandal Behind the World’s Favorite Board Game, Monopoly was originally created by stenographer Elizabeth Magie Philips, but was later appropriated by a man named Charles Darrow, who sold it to Parker Brothers in the 1930s and became a millionaire. His name, and the story of how his board game idea saved him from bankruptcy, became nearly synonymous with the game itself.

Philips spoke out about her stolen invention after it became a hit, but never saw more than $500 for it, according to the book. “In 1948, Magie died in relative obscurity, a widow without children,” wrote author Mary Pilon, whose book was adapted in the New York Times. “Neither her headstone nor her obituary mentions her role in the creation of Monopoly.”

Like Philips, Nelson Thomas is now being recognized for her innovative idea. While many Cabbage Patch lovers still remember the name Xavier Roberts, Nelson Thomas’s husband says his wife’s legacy is intact. “[Xavier Roberts] marketed a product very well. I really don’t begrudge him,” Thomas said. “Martha and I had a wonderful life together. It wasn’t elaborate but it was wonderful. I’m not going to trade in that life for a few dollars.”