10 Things You May Not Know About the Mexican-American War

H/T History.com.

Some of this I knew; some I did not.

Explore 10 fascinating facts about what has often been called America’s “forgotten war.”

1. Before invading Mexico, the U.S. tried to buy some of its territory.
In late 1845, President James K. Polk sent diplomat John Slidell on a secret mission to Mexico. Slidell was tasked with settling a longstanding disagreement about the border between the two countries, but he was also authorized to offer the Mexicans up to $25 million for their territories in New Mexico and California.

When the Mexicans refused to consider the offer, Polk upped the ante by ordering 4,000 troops under Zachary Taylor to occupy the land between the Nueces River and the Rio Grande—a region Mexico claimed as its own territory. Mexico replied by sending troops to the disputed zone, and on April 25, 1846, their cavalry attacked a patrol of American dragoons. Polk’s opponents would later argue the president had goaded the Mexicans into the fight. 

Nevertheless, on May 13, 1846, Congress voted to declare war on Mexico by an overwhelming margin.

2. The war marked the combat debut of several future Civil War generals.
Along with future presidents Zachary Taylor and Franklin Pierce, the U.S. force in Mexico included many officers who later made their name on the battlefields of the Civil War.

Union Generals Ulysses S. Grant, George Meade and George McClellan all served, as did many of their Confederate adversaries such as Robert E. Lee, Stonewall Jackson and George Pickett. Lee, then a captain in the Army Corps of Engineers, emerged from the war a hero after he scouted passes that allowed the Americans to outmaneuver the Mexicans at the Battles of Cerro Gordo and Contreras.

3. Santa Anna used the war to reclaim power in Mexico.
Most Americans considered Antonio Lopez de Santa Anna a mortal enemy for his actions at 1836’s Battle of the Alamo, but the charismatic general returned to power during the Mexican-American War thanks to a surprising ally: James K. Polk. 

Santa Anna was languishing in Cuba when the war began, having been driven into exile after a stint as Mexico’s dictator. In August 1846, he convinced the Polk administration that he would negotiate a favorable peace if he were allowed to return home through an American naval blockade. Polk took the general at his word, but shortly after setting foot on Mexican soil, Santa Anna double-crossed the Americans and organized troops to fight off the invasion. Along with reclaiming the presidency, he went on to lead the Mexicans during nearly all the war’s major battles.

4. Abraham Lincoln was one of the war’s harshest critics.
The invasion of Mexico was one of the first U.S. conflicts to spawn a widespread anti-war movement. Political opponents labeled “Mr. Polk’s War” a shameless land grab, while abolitionists viewed it as a scheme to add more slave states to the Union. Among the more notable critics was freshman Illinois congressman Abraham Lincoln, who took to the House floor in 1847 and introduced a series of resolutions demanding to know the location of the “spot of soil” where the war’s first skirmish took place. 

Lincoln maintained that the battle had been provoked on Mexican land, and he branded Polk a cowardly seeker of “military glory.” The so-called “Spot Resolutions” helped put Lincoln on the map as a politician, but they also damaged his reputation with his pro-war constituents. One Illinois newspaper even branded him “the Benedict Arnold of our district.”

5. It included the U.S. military’s first major amphibious attack.
The most significant phase of the Mexican-American War began in March 1847, when General Winfield Scott invaded the Mexican city of Veracruz from the sea. In what amounted to America’s largest amphibious operation until World War II, the Navy used purpose-built surfboats to ferry more than 10,000 U.S. troops to the beach in just five hours. The landings were mostly unopposed by the town’s outnumbered garrison, which later surrendered after an artillery bombardment and a 20-day siege. Having secured Veracruz, Scott’s army launched the war’s final thrust: a six-month, 265-mile fighting march to the “Halls of Montezuma” at Mexico City.

6. A band of Irish Catholics deserted the U.S. and fought for Mexico.
One of the war’s most storied units was St. Patrick’s Battalion, a group of U.S. soldiers who deserted the army and cast their lot with Mexico. The 200-man outfit was mostly made up of Irish Catholics and other immigrants who resented the prejudice they faced from Protestants in the United States. 

Under the leadership of an Irishman named John Riley, the “San Patricios” defected and became Santa Anna’s elite artillery force. They served with distinction at the Battles of Buena Vista and Cerro Gordo, but most of their unit was later killed or captured during an August 1847 clash at Churubusco. Following a court-martial, the U.S. Army executed around 50 of the soldiers by hanging. Several others were whipped and branded with a “D” for “deserter.” Though scorned in the United States, the San Patricios became national heroes in Mexico, where they are still honored every St. Patrick’s Day.

7. The Battle of Chapultepec gave rise to a famous legend in Mexico.
When they arrived in Mexico City in September 1847, U.S. forces found the western route into the capital blocked by Chapultepec Castle, an imposing fortress that was home to Mexico’s military academy. General Scott ordered an artillery bombardment, and on September 13 his troops stormed the citadel and used ladders to scale its stone façade. Most of the Mexican defenders soon withdrew, but a group of six teenaged military cadets remained at their posts and fought to the last. 

According to battlefield lore, one cadet prevented the capture of the Mexican flag by wrapping it around his body and leaping to his death off the castle walls. While Chapultepec was lost, Mexicans hailed the six young students as the “Niños Héroes,” or “hero children.” They were later honored with a large monument in Mexico City.

8. An American diplomat disobeyed orders to end the war.
As the war inched toward its conclusion in 1847, President Polk sent State Department clerk Nicholas P. Trist south of the border to seal a peace treaty with the Mexicans. Negotiations proceeded slowly at first, and in November 1847 Polk grew frustrated and ordered Trist to end the talks and return home. Trist, however, would do no such thing. Believing that he was on the verge of a breakthrough with the Mexicans, he disobeyed the President’s order and instead wrote a 65-page letter defending his decision to continue his peace efforts. Polk was left seething. He called Trist “destitute of honor or principle” and tried to have him removed from the U.S. Army headquarters, but he was unable to stop the negotiations. 

On February 2, 1848, Trist struck the Treaty of Guadalupe Hidalgo, an agreement in principle to end the war. While Polk reluctantly accepted the deal, he fired Trist as soon as the rogue diplomat returned to the United States.

9. The war reduced the size of Mexico by more than half.
Along with relinquishing all claims to Texas, the Treaty of Guadalupe Hidalgo also forced Mexico to accept an American payment of $15 million for 525,000 square miles of its territory—an area larger than Peru. The lands ceded by Mexico would later encompass all or part of the future states of California, New Mexico, Nevada, Utah, Arizona, Colorado, Wyoming, Oklahoma and Kansas.

10. It had one of the highest casualty rates of any American war.
The U.S. never lost a major battle during the Mexican-American War, but the victory still proved costly. Of the 79,000 American troops who took part, 13,200 died, for a mortality rate of nearly 17 percent—higher than those of World War I and World War II.

The vast majority were victims of diseases such as dysentery, yellow fever, malaria and smallpox. According to scholar V.J. Cirillo, a higher percentage of U.S. troops died from sickness during the Mexican invasion than in any other war in American history. Mexican casualties were also high, with most historians estimating as many as 25,000 dead troops and civilians.


The History of Lollipops

H/T Candy Creek.

Today, lollipops are typically defined as a hard candy that is eaten off a stick. Lollipops are available in a wide variety of shapes, sizes, and flavors, and are enjoyed by people around the world. The history of lollipops, and where their name originally came from, is debated, but the story begins thousands of years ago, perhaps with cavemen.


Eating sugary substances off a stick has been a practice in many civilizations throughout history. It has been speculated that the first instance of this was cavemen collecting and eating honey from beehives with a stick. The next, and slightly more advanced, development in this practice is thought to have come in ancient times. The Chinese, Arabs, and Egyptians used honey to preserve fruit and nuts. This mixture would be made on a stick to make it easier to eat as it hardened over time. In the Middle Ages, nobles ate boiled sugar off a stick. At this time, sugar was not produced in large quantities, making it a very expensive and luxurious treat, accessible only to the wealthy. Shortly after the end of the Middle Ages, this changed as technology improved and sugar cane was grown and processed in bulk.

Records show that in 17th-century England, an early version of what we know as the lollipop was sold throughout London by street vendors. These sugary treats were made of soft candy, since machines to insert the sticks automatically had not yet been invented. Although these candies were different in texture, and most likely appearance, from modern lollipops, the concept was the same: a delicious, sugary treat that can be eaten without making a mess, and enjoyed by children and adults alike.


The beginning of the 20th century brought about the modern lollipop. There is much debate about who first made hard candies on a stick, where the name lollipop came from, and who was the first to invent a machine to produce them. What we know for sure is that during the first half of the 1900s, a few different people, working in factories around the U.S., helped shape lollipops as we know them.


In the 1880s, George Smith of New Haven, CT, observed a chocolate company that was making chocolates and caramels on sticks. He found this an intriguing idea and decided to apply the technique to his own hard candy business, the Bradley Smith Company. In 1908, Smith named these candies Lolly Pops after a local racehorse and applied for a trademark on the name. It was years before he was granted the trademark, as there were records of candies called by this name in the past, but in 1931, Lolly Pops officially got their name. The Bradley Smith Company started out making the candies by hand, but in order to meet demand, they created their own patented machine to automate the process. These early lollipops were sold for a penny each.


There are records of another confectionery company in Connecticut, the McAviney Candy Company, also creating a product like the modern lollipop around the same time as the Bradley Smith Company. As the story goes, this happened almost by accident. The employees would use wooden sticks to stir the candy as it cooked, and throughout the day candy would accumulate on the stick. By the end of each day, there would be many left over “candy sticks” which employees would bring home to their children. Soon they started selling these candy sticks to the public.


Also in 1908, the Racine Confectionery Machine Company in Racine, Wisconsin was creating hard candy on a stick. They created the Racine Automatic Sucker Machine which placed hard candies on the ends of sticks. Shortly after in 1912, Samuel Born invented the Born Sucker Machine in California, which automatically inserted sticks into hard candy. This invention was widely celebrated in San Francisco.

No matter who was truly the first to put hard candy on a stick, all of these endeavors helped create the modern lollipop, one of the world’s most popular candies.

The Women Who Rode Miles on Horseback to Deliver Library Books

H/T Atlas Obscura.

I learned something new about libraries during the Depression.

Librarians are amazing.


A group of “book women” on horseback in Hindman, Kentucky, 1940. KENTUCKY LIBRARY AND ARCHIVES

They were known as the “book women.” They would saddle up, usually at dawn, to pick their way along snowy hillsides and through muddy creeks with a simple goal: to deliver reading material to Kentucky’s isolated mountain communities.

The Pack Horse Library initiative was part of President Franklin Roosevelt’s Works Progress Administration (WPA), created to help lift America out of the Great Depression, during which, by 1933, unemployment had risen to 40 percent in Appalachia. Roving horseback libraries weren’t entirely new to Kentucky, but this initiative was an opportunity to boost both employment and literacy at the same time.

A pack horse librarian at an isolated mountain house, carrying books in saddle bags and hickory baskets, year unknown. UNIVERSITY OF KENTUCKY LIBRARIES SPECIAL COLLECTIONS RESEARCH CENTER.
The WPA paid the salaries of the book carriers—almost all the employees were women, making the initiative unusual among WPA programs—but very little else. Counties had to have their own base libraries from which the mounted librarians would travel. Local schools helped cover those costs, and the reading materials—books, magazines, and newspapers—were all donated. In December 1940, a notice in the Mountain Eagle newspaper noted that the Letcher County library “needs donations of books and magazines regardless of how old or worn they may be.”

Old magazines and newspapers were cut and pasted into scrapbooks with particular themes—recipes, for example, or crafts. One such scrapbook, which is still held today at the FDR Presidential Library & Museum in Hyde Park, New York, contains recipes pasted into a notebook with the following introduction: “Cook books are popular. Anything to do with canning or preserving is welcomed.” Books were repaired in the libraries and, as historian Donald C. Boyd notes, old Christmas cards were circulated to use as bookmarks and prevent damage from dog-eared pages.

Pack horse librarians start down Greasy Creek to remote homes, date unknown. UNIVERSITY OF KENTUCKY LIBRARIES SPECIAL COLLECTIONS RESEARCH CENTER.


The book women rode 100 to 120 miles a week, on their own horses or mules, along designated routes, regardless of the weather. If the destination was too remote even for horses, they dismounted and went on foot. In most cases, they were recruited locally—according to Boyd, “a familiar face to otherwise distrustful mountain folk.”

By the end of 1938, there were 274 librarians riding out across 29 counties. In total, the program employed nearly 1,000 riding librarians. Funding ended in 1943, the same year the WPA was dissolved as unemployment plummeted during wartime. It wasn’t until the following decade that mobile book services in the area resumed, in the form of the bookmobile, which had been steadily increasing in popularity across the country.

Pack horse librarians cross a log bridge to reach home used as a distribution center for a mountain community, year unknown. UNIVERSITY OF KENTUCKY LIBRARIES SPECIAL COLLECTIONS RESEARCH CENTER.


In addition to providing reading materials, the book women served as touchstones for these communities. They tried to fill book requests, sometimes stopped to read to those who couldn’t, and helped nurture local pride. As one recipient said, “Them books you brought us has saved our lives.” In the same year as the call for books, the Mountain Eagle extolled the Letcher County library: “The library belong to our community and to our county, and is here to serve us … It is our duty to visit the library and to help in every way that we can, that we may keep it as an active factor in our community.”


Atlas Obscura has a selection of images of the Kentucky pack horse librarians.

Children greet the “book woman,” 1940. KENTUCKY LIBRARIES AND ARCHIVES
“Sometimes the short way across is the hard way for the horse and rider but schedules have to be maintained if readers are not to be disappointed. Then, too, after highways are left, there is little choice of roads,” c. 1940. KENTUCKY LIBRARIES AND ARCHIVES
Book delivery to a remote home, 1940. KENTUCKY LIBRARIES AND ARCHIVES
A man reading to two small children, c. 1940. KENTUCKY LIBRARY AND ARCHIVES
The library in Stanton, Kentucky, 1941.
Packing saddle bags with books, date unknown.
A trunk full of donated magazines, c. 1940. KENTUCKY LIBRARIES AND ARCHIVES
Making a scrapbook, c. 1940. KENTUCKY LIBRARIES AND ARCHIVES
Front porch delivery, c. 1940. KENTUCKY LIBRARIES AND ARCHIVES

The Enduring Mystery of H.H. Holmes, America’s ‘First’ Serial Killer

H/T Smithsonian Magazine.

Sometimes it is hard to recognize the devil when he is staring you in the face.

The infamous “devil in the White City” remains mired in myth 125 years after his execution

Mired in myth and misconception, the killer’s life has evolved into “a new American tall tale,” argues tour guide and author Adam Selzer. (Illustration by Meilan Solly / Photos via Wikimedia Commons under public domain and Newspapers.com)

Four days before H.H. Holmes’ execution on May 7, 1896, the Chicago Chronicle published a lengthy diatribe condemning the “multimurderer, bigamist, seducer, resurrectionist, forger, thief and general swindler” as a man “without parallel in the annals of crime.” Among his many misdeeds, the newspaper reported, were suffocating victims in a vault, boiling a man in oil and poisoning wealthy women in order to seize their fortunes.

Holmes claimed to have killed at least 27 people, most of whom he’d lured into a purpose-built “Murder Castle” replete with secret passageways, trapdoors and soundproof torture rooms. According to the Crime Museum, an intricate system of chutes and elevators enabled Holmes to transport his victims’ bodies to the Chicago building’s basement, which was purportedly equipped with a dissecting table, stretching rack and crematory. In the killer’s own words, “I was born with the devil in me. I could not help the fact that I was a murderer, no more than a poet can help the inspiration to sing.”

More than a century after his death, Holmes—widely considered the United States’ first known serial killer—continues to loom large in the imagination. Erik Larson’s narrative nonfiction best seller The Devil in the White City introduced him to many Americans in 2003, and a planned adaptation of the book spearheaded by Leonardo DiCaprio and Martin Scorsese is poised to heighten Holmes’ notoriety even further.

But the true story of Holmes’ crimes, “while horrifying, may not be quite as sordid” as popular narratives suggest, wrote Becky Little for History.com last year. Mired in myth and misconception, the killer’s life has evolved into “a new American tall tale,” argues tour guide and author Adam Selzer in H.H. Holmes: The True History of the White City Devil. “[A]nd, like all the best tall tales, it sprang from a kernel of truth.”

The three-story building at the center of the H.H. Holmes myth (Public domain via Wikimedia Commons)

The facts are these, says Selzer: Though sensationalized reports suggest that Holmes killed upward of 200 people, Selzer could only confirm nine actual victims. Far from being strangers drawn into a house of horrors, the deceased were actually individuals Holmes befriended (or romanced) before murdering them as part of his money-making schemes. And, while historical and contemporary accounts alike tend to characterize the so-called Murder Castle as a hotel, its first and second floors actually housed shops and long-term rentals, respectively.

“When he added a third floor onto his building in 1892, he told people it was going to be a hotel space, but it was never finished or furnished or open to the public,” Selzer added. “The whole idea was just a vehicle to swindle suppliers and investors and insurers.”

As Frank Burgos of PhillyVoice noted in 2017, Holmes was not just a serial killer, but a “serial liar [eager] to encrust his story with legend and lore.” While awaiting execution, Holmes penned from prison an autobiography filled with falsehoods (including declarations of innocence) and exaggerations; newspapers operating at the height of yellow journalism latched onto these claims, embellishing Holmes’ story and setting the stage for decades of obfuscation.

Born Herman Webster Mudgett in May 1861, the future Henry Howard Holmes—a name chosen in honor of detective Sherlock Holmes, according to Janet Maslin of the New York Times—grew up in a wealthy New England family. Verifiable information on his childhood is sparse, but records suggest that he married his first wife, Clara Lovering, at age 17 and enrolled in medical school soon thereafter.

Holmes’ proclivity for criminal activity became readily apparent during his college years. He robbed graves and morgues, stealing cadavers to sell to other medical schools or use in complicated life insurance scams. After graduating from the University of Michigan in 1884, he worked various odd jobs before abandoning his wife and young son to start anew in Chicago.

A highly exaggerated 1895 newspaper report detailing Holmes’ so-called murder castle (Public domain via Wikimedia Commons)

Now operating under the name H.H. Holmes, the con artist wed a second woman, Myrta Belknap, and purchased a pharmacy in the city’s Englewood district. Across the street, he constructed the three-story building that would later factor so prominently in tales of his atrocities. Work concluded in time for the May 1893 opening of the World’s Columbian Exposition, a supposed celebration of human ingenuity with distinct colonialist undertones. The fair drew more than 27 million visitors over its six-month run.

To furnish his enormous “castle,” Holmes bought items on credit and hid them whenever creditors came calling. On one occasion, workers from a local furniture company arrived to repossess its property, only to find the building empty.

“The castle had swallowed the furniture as, later, it would swallow human beings,” wrote John Bartlow Martin for Harper’s magazine in 1943. (A janitor bribed by the company eventually revealed that Holmes had moved all of his furnishings into a single room and walled up its door to avoid detection.)

Debonair and preternaturally charismatic, Holmes nevertheless elicited lingering unease among many he encountered. Still, his charm was substantial, enabling him to pull off financial schemes and, for a time, get away with murder. (“Almost without exception, [his victims appeared] to have had two things in common: beauty and money,” according to Harper’s. “They lost both.”) Holmes even wed for a third time, marrying Georgiana Yoke in 1894 without attracting undue suspicion.

As employee C.E. Davis later recalled, “Holmes used to tell me he had a lawyer paid to keep him out of trouble, but it always seemed to me that it was the courteous, audacious rascality of the fellow that pulled him through. … He was the only man in the United States that could do what he did.”

Holmes’ probable first victims were Julia Conner, the wife of a man who worked in his drugstore, and her daughter, Pearl, who were last seen alive just before Christmas 1891. Around that time, according to Larson’s Devil in the White City, Holmes paid a local man to remove the skin from the corpse of an unusually tall woman (Julia stood nearly six feet tall) and articulate her skeleton for sale to a medical school. No visible clues to the deceased’s identity remained.

The Chicago Chronicle‘s illustrations of Minnie and Anna Williams, two of Holmes’ likely victims (Newspapers.com)

Larson recounts Julia’s final moments in vivid detail—but as historian Patrick T. Reardon pointed out for the Chicago Tribune in 2007, the book’s “Notes and Sources” section admits that this novelistic account is simply a “plausible” version of the story woven out of “threads of known detail.”

Other moments in Devil in the White City, like a visit by Holmes and two of his later victims, sisters Minnie and Anna Williams, to Chicago’s meatpacking district, are similarly speculative: Watching the slaughter, writes Larson, “Holmes was unmoved; Minnie and Anna were horrified but also strangely thrilled by the efficiency of the carnage.” The book’s endnotes, however, acknowledge that no record of such a trip exists. Instead, the author says, “It seems likely that Holmes would have brought Minnie and Nannie there.”

These examples are illustrative of the difficulties of cataloguing Holmes’ life and crimes. Writing for Time Out in 2015, Selzer noted that much of the lore associated with the killer stems from 19th-century tabloids, 20th-century pulp novels and Holmes’ memoir, none of which are wholly reliable sources.

That being said, the author pointed out in a 2012 blog post, Holmes was “certainly both … a criminal mastermind [and] a murderous monster.” But, he added, “anyone who wants to study the case should be prepared to learn that much of the story as it’s commonly told is a work of fiction.”

Holmes’ crime spree came to an end in November 1894, when he was arrested in Boston on suspicion of fraud. Authorities initially thought he was simply a “prolific and gifted swindler,” per Stephan Benzkofer of the Chicago Tribune, but they soon uncovered evidence linking Holmes to the murder of a long-time business associate, Benjamin Pitezel, in Philadelphia.

Chillingly, investigators realized that Holmes had also targeted three of Pitezel’s children, keeping them just out of reach of their mother in what was essentially a game of cat and mouse. On a number of occasions, Holmes actually stashed the two parties in separate lodgings located just a few streets away from each other.

“It was a game for Holmes,” writes Larson. “… He possessed them all and reveled in his possession.”

Illustration of H.H. Holmes’ May 7, 1896, execution (Public domain via Wikimedia Commons)

In July 1895, Philadelphia police detective Frank Geyer found the bodies of two of the girls buried beneath a cellar in Toronto. Given the absence of visible injuries, the coroner theorized that Holmes had locked the sisters in an unusually large trunk and filled it with gas from a lamp valve. Authorities later unearthed the charred remains of a third Pitezel sibling at an Indianapolis cottage once rented by Holmes.

A Philadelphia grand jury indicted Holmes for Benjamin’s murder on September 12, 1895; just under eight months later, he was executed in front of a crowd at the city’s Moyamensing Prison. At the killer’s request (he was reportedly worried about grave robbers), he was buried ten feet below ground in a cement-filled pine coffin.

The larger-than-life sense of mystery surrounding Holmes persisted long after his execution. Despite strong evidence to the contrary, rumors of his survival circulated until 2017, when, at the request of his descendants, archaeologists exhumed the remains buried in his grave and confirmed their identity through dental records, as NewsWorks reported at the time.

“It’s my belief that probably all those stories about all these visitors to the World’s Fair who were murdered in his quote-unquote ‘Castle’ were just complete sensationalistic fabrication by the yellow press,” Harold Schechter, author of Depraved: The Definitive True Story of H. H. Holmes, Whose Grotesque Crimes Shattered Turn-of-the-Century Chicago, told History.com in 2020. “By the time I reached the end of my book, I kind of realized even a lot of the stuff that I had written was probably exaggerated.”

Holmes, for his part, described himself in his memoir as “but a very ordinary man, even below the average in physical strength and mental ability.”

He added, “[T]o have planned and executed the stupendous amount of wrongdoing that has been attributed to me would have been wholly beyond my power.”

Who Invented the Telescope?

H/T Space.com.

The telescope has an interesting history.

The telescope is one of humankind’s most important inventions. The simple device that made faraway things look near gave observers a new perspective. When curious men pointed the spyglass toward the sky, our view of Earth and our place in the universe changed forever.


But the identity of the ingenious mind who invented the telescope remains a mystery. It was probably inevitable that, as glassmaking and lens-grinding techniques improved in the late 1500s, someone would hold up two lenses and discover what they could do.

The first person to apply for a patent for a telescope was Dutch eyeglass maker Hans Lippershey (or Lipperhey). In 1608, Lippershey laid claim to a device that could magnify objects three times. His telescope had a concave eyepiece aligned with a convex objective lens. One story goes that he got the idea for his design after observing two children in his shop holding up two lenses that made a distant weather vane appear close. Others claimed at the time that he stole the design from another eyeglass maker, Zacharias Jansen.

Jansen and Lippershey lived in the same town and both worked on making optical instruments. Scholars generally argue, however, that there is no real evidence that Lippershey did not develop his telescope independently. Lippershey, therefore, gets the credit for the telescope, because of the patent application, while Jansen is credited with inventing the compound microscope. Both appear to have contributed to the development of both instruments.

Adding to the confusion, yet another Dutchman, Jacob Metius, applied for a patent for a telescope a few weeks after Lippershey. The government of the Netherlands turned down both applications because of the counterclaims. Also, officials said the device was easy to reproduce, making it difficult to patent. In the end, Metius got a small reward, but the government paid Lippershey a handsome fee to make copies of his telescope.


A 1754 painting by H.J. Detouche shows Galileo Galilei displaying his telescope to Leonardo Donato and the Venetian Senate. (Image credit: Public domain)

Enter Galileo

In 1609, Galileo Galilei heard about the “Dutch perspective glasses” and within days had designed one of his own — without ever seeing one. He made some improvements — his could magnify objects 20 times — and presented his device to the Venetian Senate. The Senate, in turn, set him up for life as a lecturer at the University of Padua and doubled his salary, according to Stillman Drake in his book “Galileo at Work: His Scientific Biography” (Courier Dover Publications, 2003).


Galileo’s ink renderings of the moon: the first telescopic observations of a celestial object. (Image credit: NASA)

Galileo was the first to point a telescope skyward. He was able to make out mountains and craters on the moon, as well as a ribbon of diffuse light arching across the sky — the Milky Way. He also discovered the rings of Saturn, sunspots and four of Jupiter’s moons.

Thomas Harriot, a British ethnographer and mathematician, also used a spyglass to observe the moon. Harriot became famous for his travels to the early settlements in Virginia to detail resources there. His August 1609 drawings of the moon predate Galileo’s, but were never published.

The more Galileo looked, the more he was convinced of the sun-centered Copernican model of the planets. Galileo wrote a book, “Dialogue Concerning the Two Chief World Systems, Ptolemaic and Copernican,” and dedicated it to Pope Urban VIII. But his ideas were considered heretical, and Galileo was called to appear before the Inquisition in Rome in 1633. He struck a plea bargain and was sentenced to house arrest, where he continued to work and write until his death in 1642.

Elsewhere in Europe, scientists began improving the telescope. Johannes Kepler studied the optics and designed a telescope with two convex lenses, which made the images appear upside down. Working from Kepler’s writings, Isaac Newton reasoned it was better to make a telescope out of mirrors rather than lenses and built a reflecting telescope in 1668. Centuries later the reflecting telescope would dominate astronomy.

Exploring the cosmos

The largest refracting telescope (one that uses lenses to gather and focus light) opened at Yerkes Observatory in Williams Bay, Wisconsin, in 1897. But the 40-inch (1 meter) glass lens at Yerkes was soon made obsolete by larger mirrors. The Hooker 100-inch (2.5 m) reflecting telescope opened in 1917 at Mount Wilson Observatory in Pasadena, Calif. It was there that the astronomer Edwin Hubble determined that the Andromeda Nebula was indeed (as some astronomers had argued) a galaxy far, far away (2.5 million light-years) from the Milky Way.

With the development of radio, scientists could begin to study not just visible light but other electromagnetic radiation from space. An American engineer named Karl Jansky was the first to detect radio emission from space, in 1931, when he found a source of radio interference coming from the center of the Milky Way. Radio telescopes have since mapped the shape of galaxies and detected the cosmic microwave background radiation that confirmed a prediction of the Big Bang theory.

Famous telescopes


Here are some of the more famous telescopes:


Hubble Space Telescope


This telescope launched in 1990. Some of Hubble’s major contributions include determining the age of the universe with more precision, finding more moons around Pluto, observing galaxies in the young universe, monitoring space weather on the outer planets, and even observing exoplanets, a use not anticipated for the telescope, since the first major exoplanet discoveries didn’t happen until the mid-1990s.


A flaw in its mirror was fixed with an upgrade from a space shuttle crew in 1993. Hubble underwent five servicing missions by shuttle crews, the last of them in 2009. It remains in good health to this day and is expected to overlap some observations with the James Webb Space Telescope. (Hubble is part of a set of four “great observatories” launched by NASA in the 1990s and 2000s; the other members, the Spitzer Space Telescope, the Compton Gamma Ray Observatory and the Chandra X-Ray Observatory, made many discoveries of their own.)


James Webb Space Telescope


This is the successor to Hubble, and its launch has been delayed several times over the years, with the latest estimate now 2020. Unlike Hubble, this telescope will be parked far from Earth, out of reach of repair crews. Its science will focus on four major themes: the universe’s first light, the formation of the first galaxies, the birth of stars, and the origins of life (including the study of exoplanets).


Kepler telescope


This planet-hunting machine has found more than 4,000 potential planets since launching in 2009. Initially, it focused on a patch of sky in the constellation Cygnus, but in 2013, problems with consistent pointing forced a new mission, known as K2, in which Kepler observes different regions of the sky in turn. One of Kepler’s major contributions is finding more super-Earths and rocky planets, which are harder to spot near bright stars.


Atacama Large Millimeter/submillimeter Array (ALMA)


This telescope array in Chile has 66 antennas, and its specialty is looking through the dust in young planetary systems (or through dusty stars and galaxies) to see how cosmic objects form. It became fully operational in 2013. ALMA’s sensitivity is unique because it has so many antennas working in concert. Some of its results include the clearest-ever image of the star Betelgeuse and precise measurements of black hole masses.


Arecibo Observatory


This observatory has been operating since 1963 and is famous for many radio astronomy studies. The Puerto Rican telescope is also known for the Arecibo Message, a transmission directed at the globular cluster M13 in 1974. The observatory was damaged in 2017 by Hurricane Maria, which devastated Puerto Rico. In popular culture, Arecibo was the location of the climax of the 1995 James Bond film “GoldenEye,” and it appeared in the 1997 movie “Contact.”


Karl G. Jansky Very Large Array 


This is a set of 27 telescopes located in the New Mexico desert. Construction began on the VLA in 1973. Some of the VLA’s major discoveries include finding ice on Mercury, peering into the dusty center of the Milky Way, and looking at the formation of black holes. The telescope array also was prominently featured in the 1997 movie “Contact” as the site where a purported extraterrestrial signal arrived.


W.M. Keck Observatory


The twin telescopes at the W.M. Keck Observatory in Hawaii are among the largest optical and infrared telescopes in operation. The telescopes started their work in 1993 and 1996. Some of their major discoveries include finding the first exoplanet “transiting” across its parent star and tracking star movements in the nearby Andromeda Galaxy.


Palomar Observatory


The Palomar Observatory, located in San Diego County, Calif., began work in 1949. The telescope is best known for discovering the small worlds Quaoar, Sedna and Eris in the Kuiper Belt, but its work also includes discovering supernovas (star explosions), tracking asteroids and looking at gamma-ray bursts.

License Plate History: A Timeline

H/T Your AAA Daily.

An interesting history.

World War II. The Supreme Court. Idaho potatoes. License plates have quite the story to tell.


(Library of Congress)

With some 270 million vehicles registered in the U.S., each one adorned with an alphanumeric metal panel, it’s easy for license plates to be overlooked – maybe even maligned for their connotation of trips to the DMV. But like other parts of the automobile’s past, there’s more to the history of license plates than what meets the eye. As unlikely as it may seem, these vehicle identifiers have been influenced by technology, culture and current events. Thus, they offer a unique window into our country’s past.

Let’s take a look at the last 100-plus years of license plate history.

  • Photos courtesy of the National Museum of American History

1901 – New York Requires Vehicles to Be Registered 

On April 25, 1901, New York Governor Benjamin Odell Jr. signed a bill into law that required vehicle owners to register their cars with the state. As part of the registration process, the law dictated all automobiles have “the separate initials of the owner’s name placed upon the back thereof in a conspicuous place, the letters forming such initials to be at least three inches in height.”

There was one catch: New York State did not issue the plates; owners were expected to create them on their own. This meant there was no standardization and early plates varied widely in materials, style and color. Motorists commonly used metal, wood or leather. Some even painted letters directly onto the vehicle.



1903 – Massachusetts Issues First State License Plates

Massachusetts becomes the first state to issue license plates to drivers. These cobalt blue plates were made of iron and covered with porcelain enamel.

The very first plate featured just the number “1.” It was issued to Frederick Tudor, who worked for the highway commission. The registration remains active today, held by a member of his family.

1928 – Idaho Introduces License Plate Slogans

Nowadays it’s very common for state slogans or other phrases to adorn license plates. That all started when Idaho began stamping “Idaho Potatoes” on all its license plates back in the 1920s.

1931 – The First Vanity Plates

Pennsylvania becomes the first state to issue customized license plates, beginning what would grow into a popular trend. At the time, however, drivers could only add their initials to the plate.

According to the American Association of Motor Vehicle Administrators, there were 9.7 million vehicles with personalized vanity plates in North America in 2007.

1944 – A Supply Shortage

During World War II, much of the country’s metal went toward the war effort. The resulting nationwide metal shortage forced states to use alternative materials for their license plates, including fiberboard, cardboard and even soybean-based plastic.


1957 – The Standard Size Is Set

Automobile manufacturers come to an agreement with international governments and standards organizations on the size of license plates. The standard plate size is set at 12 by 6 inches in the United States.

1971 – A New Material Arrives

The manufacturing company 3M introduces High Intensity Grade Reflective Sheeting. States began to require the new material be used in the production of license plates in order to improve visibility.

1977 – License Plates Reach the Supreme Court

The nation’s highest court delivers its decision in the case of Wooley v. Maynard. Up until that point, the state of New Hampshire had required all noncommercial vehicles to have license plates bearing the state motto “Live Free or Die.” Resident George Maynard cut off the words “or Die,” believing they conflicted with his religious beliefs. He was cited for violating the state law, fined, and, after refusing to pay, jailed for 15 days.

Maynard sued and the case eventually made its way to the Supreme Court. In a 6-3 decision, the court ruled that New Hampshire could not require citizens to display the state motto, stating “New Hampshire’s statute in effect requires that appellees use their private property as a ‘mobile billboard’ for the State’s ideological message…The First Amendment protects the right of individuals to hold a point of view different from the majority, and to refuse to foster, in the way New Hampshire commands, an idea they find morally objectionable.”

License plates are not the only automobile accessory to make it to court. Bumper stickers have been there too.

2000 – America’s Most Expensive License Plate

A 1921 Alaska license plate is sold for $60,000. To date, it is the country’s most expensive license plate. Its high value is a product of its rarity as the plate is one of only four known to exist.

Why so few? Alaska was not yet a U.S. state in the 1920s. It remained mostly undeveloped, with little infrastructure, including roads. Navigating its terrain by automobile was nearly impossible, so very few people owned one. Fewer cars meant fewer license plates.

The History of the Piñata

H/T Back Then History.

I learned a lot about the Piñata.

It Originated in China

Today, the piñata is a staple at many celebrations and plays a particularly central role in Mexican fiestas. You may think of it as just a simple object, but it has a surprisingly fascinating history! The piñata is thought to have originated over 700 years ago in Asia: the Chinese fashioned paper-covered animal figures (including cows, oxen and buffalo) to celebrate the New Year, decorating them with colorful harnesses and other trappings. They filled the figures with seeds and knocked them with sticks until the seeds spilled out. Afterwards, the remains were burned; the ashes were thought to bring good luck in the coming year. It is thought that Marco Polo encountered this Chinese practice and introduced it to the Western world.

It Became Part of Lenten Traditions in Europe

In the 14th century, the piñata entered Europe and was quickly adapted to the Christian season of Lent. The first Sunday of Lent was known as “Piñata Sunday” – the name comes from the Italian word pignatta, meaning “fragile pot,” because early European piñatas resembled clay pots. When the custom spread from Italy to Spain, the first Sunday in Lent there became known as the “Dance of the Piñata.” The Spanish fiesta featured a clay container called la olla (the Spanish word for pot); originally, it was not decorated, but over time decorations like tinsel, ribbon, and fringed paper were added.

Indigenous Peoples Had Their Own Version

When Spanish missionaries travelled to the Americas, they used the piñata to attract crowds and attention at their ceremonies. However, the indigenous peoples already had their own tradition that was similar; to celebrate the birthday of Huitzilopochtli, the Aztec god of war, Aztec priests put a clay pot on a pole in the temple at the end of each year. The clay pot was decorated with feathers and filled with small treasures. When broken with a stick, the treasures would fall at the god’s feet as an offering. The Mayans also played a sport where a player’s eyes would be covered and they would have to try to hit a hanging clay pot.


Missionaries Gave It Religious Meaning

The missionaries transformed these indigenous traditions for the purpose of religious instruction. They covered the traditional pot in colored paper so that it appeared different (and perhaps even scary) to the local peoples. The original piñata featured seven points, which the missionaries used to symbolize the Seven Deadly Sins in Christianity: envy, sloth, gluttony, greed, lust, wrath, and pride. (There is also a traditional ten-pointed piñata, which missionaries said symbolized the sins that come from breaking the Ten Commandments.) The stick used to break the piñata was said to represent love: love destroyed sin and temptation, imparting a religious lesson. Some people also say the piñata was meant to represent Satan. The treats (usually candies and fruits) that fell out of the broken piñata were said to represent God’s forgiveness of sins and a new beginning. Another interpretation holds that the fruits represented temptations and earthly pleasures, while yet another holds that sharing the fruits and candies represented a reward for keeping the faith, a share in divine blessings and gifts.

Today, the piñata has lost most of its religious meaning. Instead of being used as tool to teach the Christian catechism, it is simply a fun pastime at celebrations. It’s especially popular at Mexican fiestas and is used to mark special holidays, such as Christmas and Cinco de Mayo. It’s also popular at children’s parties, and many commercially available piñatas are made in the likeness of beloved children’s characters.


A Brief History of Chocolate

H/T Mental Floss.

A sweet history.

In 2017, two members of a Russian crime syndicate in the United States were charged with the transport and sale of 10,000 pounds of “stolen chocolate confections.” The indictment didn’t mention whether the thieves took a few bites for themselves, but if they did have a sweet tooth they’d hardly be alone: Napoleon Bonaparte was a fan of chocolate, which was said to be his drink of choice when working late. Thomas Jefferson fell in love with it while serving as minister to France, and proclaimed that it might soon be more popular than tea or coffee. And though she probably never said “let them eat cake,” Marie Antoinette was known to enjoy hot chocolate, which was served at the Palace of Versailles.

Chocolate’s worldwide popularity streak has lasted centuries, but it wasn’t always the sweet, easily accessible treat we know today. So what is chocolate, and how did it transform from sacred beverage to sweet snack?


Every chocolate product starts with the cacao tree. The plants were originally native to the Americas, but today they’re grown worldwide, primarily in tropical regions. The fruits of cacao trees are called pods; one pod is roughly the size of a football, and it contains around 40 almond-sized cacao beans—which are actually seeds.

When fermented and roasted, cacao beans develop a rich, complex flavor. They’re the key to making chocolate taste chocolatey. The word cacao, by the way, usually refers to the plant and its seeds before they’re processed, while chocolate describes products made from processed cacao beans. And if you’re wondering what the difference between cacao and cocoa is, there really isn’t one. Both versions of the plant name are technically correct, but in modern usage, cacao is increasingly applied to things closer to the plant while cocoa is used for the more processed stages.

There’s some debate over who first decided to turn raw cacao beans into processed chocolate. One long-standing theory posits that humans were first drawn to the pulp of the cacao pod, which they used to make an alcoholic beverage. The oldest evidence we have for the consumption of cacao products comes from about 5,000 years ago in what is now Ecuador.

At some point, chocolate migrated north: Evidence of cacao residue has been found in vessels from the Olmec people, in what is now southern Mexico. It’s still unclear if this cacao was the result of beer-like fermented beverages made from cacao pods or some kind of chocolate that would be more recognizable to us today.

According to art and hieroglyphs from Central America and southern Mexico, chocolate was a big part of Maya culture. It didn’t look or taste anything like a Hershey’s bar, though. Back then, chocolate was sipped rather than eaten, and to make these chocolate drinks, the Maya harvested beans from cacao pods and fermented them.

Fermentation is basically controlled rot. Microorganisms like yeast and bacteria break down the organic substances in food, changing the taste on a biochemical level without making the food go bad. Fermentation also generates heat, and when a pile of cacao beans ferments, it can exceed 120 degrees Fahrenheit. This heat is essential in developing chocolate’s signature flavor and aroma. It unlocks flavor compounds we associate with chocolate and activates enzymes that mellow the cacao beans’ natural bitterness. It’s also what kills the germ, or embryo, in the middle of a bean that would cause it to sprout, and dissolves any leftover pulp from the cacao pod surrounding the beans.

After they’re fermented for several days, cacao beans are dried, roasted, shelled, and ground into a paste called chocolate liquor. Roasting is an important step. It creates new flavor compounds and concentrates other flavors that were already there. It also burns off acetic acid, a natural byproduct of fermentation that can give chocolate an undesirable, vinegary flavor.


These early steps in the chocolate-making process haven’t changed much over the centuries. The main difference in the Maya preparation came after the beans were processed. Instead of using the ground cacao beans to make candy or desserts, they mixed the paste with water, cornmeal, and spices like chili peppers to make a thick, savory beverage. By pouring the liquid from one container to another a few times, they were able to give it a frothy head, which was a big part of the drink’s appeal.

Chocolate was especially popular among elite members of society. It was enjoyed by Maya rulers, and cacao beans and chocolate paraphernalia have been found in royal tombs. Priests drank chocolate and used it in religious ceremonies. Cacao was considered a gift from the gods, and it was featured in Maya weddings, where the bride and groom would exchange sips of the beverage to seal their union. After important transactions were agreed to, the two parties would share a drink of chocolate to make it official.

The Aztecs, who dominated central Mexico from around 1300 to 1521, were just as enamored with chocolate. They used cacao beans as currency. One bean was worth a tamale, while 100 beans were enough to get you a quality female turkey.

Chocolate played a role in Aztec religious ceremonies, too. In their book The True History of Chocolate, Sophie and Michael Coe mention a Spanish chronicler who wrote that sacrifice victims who weren’t in the mood to participate in the ritual dances leading up to their deaths were given chocolate—mixed with the blood from previous human sacrifices—to boost their spirits.

According to Aztec legend, the emperor Montezuma II (who, incidentally, is increasingly referred to as Moctezuma in English because it more closely resembles the original Nahuatl name) was rumored to have drunk a gallon of chocolate a day, but he didn’t just like it for the taste. Chocolate was believed to be an aphrodisiac, and he purportedly binged the drink to fuel his affairs.

Chocolate never lost its romantic reputation, but the scientific evidence for its amorous abilities is actually pretty limited. It contains the compounds tryptophan and phenylethylamine, and tryptophan does help the body make serotonin, which is associated with feelings of happiness and well-being. Phenylethylamine releases dopamine, otherwise known as the “feel-good” neurotransmitter. Tryptophan and phenylethylamine may qualify as aphrodisiacs, but there probably aren’t enough of them in cacao beans to produce any noticeable effects.


The word chocolate originated in Mesoamerica. Like the Aztecs and Maya, the Pipil people of what is today El Salvador brewed drinks from cacao beans, and they called these beverages chocola-tl. It’s thought that when the first Spaniards to visit the region heard the word, they basically kept it. The name still persists today, largely unchanged from its original language.

A number of European explorers, from Christopher Columbus to Hernan Cortes, have been credited with bringing chocolate back home after traveling to the Americas. But the first chocolate to land in Europe may not have come from a famous explorer at all. Some historians say Spanish missionaries were instrumental in getting cacao across the Atlantic. Upon returning from an overseas trip, Catholic friars presented a group of Maya dignitaries to the court of Prince Philip in 1544. The Maya brought with them gifts from the New World, including chocolate. This offering marks the first recorded evidence of chocolate in Spain.

Soon enough, chocolate spread to the rest of Europe, where it underwent its next big transformation. The drink was too bitter for European palates, so people started adding more sweeteners to the mix. Different countries added their own spices—the Spanish liked cinnamon and vanilla in their chocolate, while the French flavored their chocolate with cloves.

In Europe, as in Mesoamerica, chocolate was mostly enjoyed by the upper classes. In 17th century Britain, a pound of chocolate cost 15 shillings, which was about 10 days’ worth of wages for a skilled tradesman. In 1657, London opened its first chocolate house, a place where men could gather to gamble, do business, and discuss politics over a nice cup of cocoa.


Chocolate was already a global success story by the 19th century, but it might never have become the nearly ubiquitous treat we know today if it wasn’t for a Dutch chemist named Coenraad Johannes van Houten. In 1828, he discovered that by removing some of the fat, or cocoa butter, from chocolate liquor and treating it with alkaline salt, he could turn the ingredient into a new kind of powder. Alkaline substances are basically the opposite of acidic substances; adding the alkaline salts to chocolate created a product that had a more mellow, earthier taste. If you see natural cocoa powder and Dutch-process cocoa powder next to each other at the grocery store, know that the natural stuff will generally be more acidic than van Houten’s “Dutch” version.

Dutch cocoa powder was easier to mix with water than ground-up beans, but van Houten’s invention had implications far beyond that: his work eventually helped give us the first modern chocolate bars. The British candy company J.S. Fry & Sons created solid chocolate in 1847 by mixing melted cocoa butter back into cocoa powder and letting it harden. If you’re not familiar with J.S. Fry & Sons, you’ve likely heard of its rival Cadbury, which pioneered the heart-shaped chocolate box in the 1860s.

In the 1900s, the two companies worked together to import South American cacao beans to England, but the Cadburys eventually made a series of deals with farmers to cut their partner-rivals out of the supply chain. This led to some good old-fashioned Chocolate Beef: In his book, Chocolate: A Bittersweet Saga of Dark and Light, Mort Rosenblum tells the story of Cecil Fry’s funeral at Westminster Abbey. When Fry’s widow saw the patriarch of the Cadbury family file into the ceremony late, she apparently rose to her feet and shouted, “Get out, Devil.”


Swiss chemist Henri Nestlé created a powdered milk product in the mid-19th century, which a countryman by the name of Daniel Peter decided to add to chocolate. This was the debut of a new product called milk chocolate.

Today, the FDA defines milk chocolate as having at least 10 percent chocolate liquor and 12 percent milk solids. These standards are far from universal; in Europe, milk chocolate must contain at least 25 percent dry cocoa solids and 14 percent dry milk solids. (When it comes to white chocolate, on the other hand, the only product derived from cacao beans is cocoa butter. There’s some debate over whether it should be considered chocolate at all.)

The company many Americans associate with chocolate today didn’t arrive on the scene until fairly recently. Milton Hershey got his start in the candy business selling caramels, not chocolate bars. The entrepreneur fell in love with chocolate at the 1893 World’s Fair. He was so impressed by Germany’s chocolate production display that he bought their machinery when the exposition was over and started making chocolate professionally the next year. An early slogan for Hershey’s was “Our Milk Chocolates are highly nutritious, easily digested, and far more sustaining than meat.”

In 1900, Milton sold his caramel business for $1 million and fully devoted himself to the Hershey Chocolate Company. The company got so big that Milton Hershey built an entire town for his employees to live in. Now, people can visit Hershey, Pennsylvania, to enjoy candy-themed rides at Hersheypark, see how chocolate is made at Hershey’s Chocolate World, or take a bath in real chocolate at the Hotel Hershey.


The differences in cocoa content might lead some international readers to turn their noses up at a Hershey’s bar, but try one in a s’more and then thank the U.S. of A. and the Girl Scouts of America, who published what is debatably the first known recipe for “Some Mores” in the 1927 guidebook “Tramping and Trailing with the Girl Scouts.” And be thankful that it’s not worse: Back in 2007, a group of lobbyists sought to change the FDA’s definition of chocolate to allow for the removal of cocoa butter entirely, in exchange for more affordable, accessible alternatives like vegetable oils.

It seems this effort failed, so you can rest assured: The next time a pair of former-Soviet-bloc gangsters steal a few tons of chocolate here in the United States, cocoa butter will be part of the haul.

Genocide: Never Again, and Again


Powerful words of wisdom.


The 47th Vice President of the United States recently formally recognized the death of Armenians at the hands of the Ottoman Empire as genocide:

“Each year on this day, we remember the lives of all those who died in the Ottoman-era Armenian genocide and recommit ourselves to preventing such an atrocity from ever again occurring …

“… Let us renew our shared resolve to prevent future atrocities from occurring anywhere in the world. And let us pursue healing and reconciliation for all the people of the world.”

Here’s what happened to nearly two million Armenians:

“… Armenians in the area were blamed for siding with the Russians and the Young Turks began a campaign to portray the Armenians as a kind of fifth column, a threat to the state …”

“… A later law allowed the confiscation of abandoned Armenian property. Armenians were ordered to turn in any weapons that they owned to the authorities. Those in the army were disarmed and transferred into labor battalions where they were either killed or worked to death… ”

The University of Minnesota’s Center for Holocaust and Genocide Studies has compiled figures by province and district that show there were 2,133,190 Armenians in the empire in 1914 and only about 387,800 by 1922.

The pattern of government actions culminating in genocide is sadly familiar: ostracize, disarm, then kill, as Jews for the Preservation of Firearms Ownership points out:

“Disarmed people are neither free nor safe – they become the criminals’ prey and the tyrants’ playthings. When the civilians are defenseless and their government goes bad, however, thousands and millions of innocents die.”

But is the 47th VP preventing future atrocities? First came the ostracizing, from one of his comrades. Listen to the full clip of former CIA director Brennan, briefly quoted below:

“… the members of the Biden team who have been nominated or have been appointed are now moving in laser-like fashion to try to uncover as much as they can about what looks very similar to insurgency movements that we’ve seen overseas, where they germinate in different parts of the country, and they gain strength, and it brings together an unholy alliance frequently of religious extremists, authoritarians, fascists, racists, nativists, even libertarians … and so I really do think that the law enforcement, Homeland Security, intelligence and even the defense officials are doing everything possible to root out what seems to be a very, very serious and insidious threat to our democracy in our Republic.”

It’s beyond the pale to throw everyone who is concerned with current events into one basket, but it’s an effective way to set the stage for taking action. As infringements on our right to self-defense have become increasingly unpopular, the confiscationists have pivoted from “stop gun violence” to “save the Republic.”

One opinion piece argues that the administration’s message to garden-variety firearms enthusiasts should be: Don’t let seditious radicals imperil your access to the guns you cherish. Protect your hobby by backing enforcement. Hunting, recreational shooting and personal defense against criminal threats are all fine; anti-government, white supremacist militia activity is not.

Although I absolutely oppose white supremacy and the initiation of violence against anyone, including the government, guns are not a hobby, and their ultimate purpose is more important than anything on that list: stopping wayward governments’ violence against their people. What’s particularly terrifying about the piece is that its authors, “National Security Council veterans who have specialized in counterterrorism,” have already looked ahead optimistically to deploying the military against Americans:

“… the concern isn’t that [commonly-owned modern sporting rifles] will somehow enable militias to challenge the U.S. military on the battlefield, which they certainly will not …”

These confiscationists will never acknowledge the magnitude of genocide, because doing so destroys their narrative. Perhaps Ayn Rand rendered that reality into words best, back in 1963:

“Criminals are a small minority in any age or country. And the harm they have done to mankind is infinitesimal when compared to the horrors — the bloodshed, the wars, the persecutions, the confiscations, the famines, the enslavements, the wholesale destructions — perpetrated by mankind’s governments. Potentially, a government is the most dangerous threat to man’s rights: it holds a legal monopoly on the use of physical force against legally disarmed victims. When unlimited and unrestricted by individual rights, a government is men’s deadliest enemy. It is not as protection against private actions, but against governmental actions that the Bill of Rights was written.”

Taken from the internet, though I cannot locate the source:

“Grant me the serenity to accept that I don’t have the right to violate others;
The courage to change the things I can through voluntary interactions;
And the wisdom to know that I can’t delegate a right I don’t have to politicians to violate others on my behalf.”

Stay frosty, train, and pray.

— Dennis Petrocelli, MD is a clinical and forensic psychiatrist who has practiced for nearly 20 years in Virginia. He took up shooting in 2019 for mind-body training and self-defense, and is in the fight for Virginians’ gun rights.

This Odd Early Flying Machine Made History but Didn’t Have the Right Stuff

H/T Smithsonian Magazine.

A piece of aviation history most people are unaware of.

Aerodrome No. 5 had to be launched by catapult on the Potomac River on May 6, 1896, but it flew unpiloted for 3,300 feet.

Tandem Wings of Aerodrome No. 5
In 1891, Samuel P. Langley began experiments with large, tandem-winged models powered by small steam and gasoline engines that he called aerodromes. After several failures with designs that were too fragile and underpowered to sustain themselves, Langley had his first genuine success on May 6, 1896. (NASM)

The vessel floated in the shallows of the Potomac River on the leeward side of Chopawamsic Island, just off Quantico, Virginia. At first glance, it could have been mistaken for a houseboat—except for the large scaffold that protruded from the top of the superstructure.


Even more unusual on that calm spring day, 125 years ago, was what was hanging from the formidable framework—a 13-foot-long apparatus made of wood and metal tubing that had two sets of long silk-covered wings forward and aft. Weighing 25 pounds, the contraption also included a small steam-powered engine and two fabric-covered propellers.

History would be made that day, May 6, 1896, as this apparatus—a flying machine, known as Aerodrome No. 5—was started and then launched from a spring-loaded catapult. The Aerodrome would take off and travel for 90 seconds some 3,300 feet in an effortless spiral trajectory and then gently land in the river.


Flight of Aerodrome No. 5
On May 6, 1896, Aerodrome No. 5 completed two successful flights of 3,300 and 2,300 feet. (NASM)


The third Secretary of the Smithsonian Institution, Samuel Pierpont Langley, an astronomer who also enjoyed tinkering with his own creations, was aboard the boat. His winged invention had just made the world’s first successful flight of an unpiloted, engine-driven, heavier-than-air craft of substantial size.

With Langley that day was his friend Alexander Graham Bell, the inventor of the telephone, who watched in amazement. Bell later wrote about how Aerodrome No. 5, now held in the collections of the Smithsonian’s National Air and Space Museum in Washington, D.C., moved with “remarkable steadiness” while in the air. Bell’s account describes the historic moment:

… and subsequently swinging around in large curves of, perhaps, a hundred yards in diameter and continually ascending until its steam was exhausted, when at a lapse of about a minute and a half, and at a height which I judge to be between 80 and 100 feet in the air, the wheels ceased turning, and the machine, deprived of the aid of its propellers, to my surprise did not fall but settled down so softly and gently that it touched the water without the least shock, and was in fact immediately ready for another trial.

The world rightly remembers that in 1903 the Wright brothers achieved human flight at Kitty Hawk in North Carolina. “Langley’s Aerodrome No. 5 wasn’t practical and it wasn’t a working prototype for any real flying machine,” says Peter Jakab, senior curator at the museum. But the largely forgotten unpiloted flight that took place seven years before Kitty Hawk did move motorized flight from the drawing board into reality.

Langley was a renowned physicist who founded the Smithsonian Astrophysical Observatory, today located in Cambridge, Massachusetts. He built a telescope and recorded the exact movements of celestial bodies to create a precise time standard, including time zones. Known as the Allegheny Time System, it established the correct time, which was sent twice daily over telegraph wires and allowed trains to run on schedule, solving a significant problem of the era before standardized timekeeping.

“Langley’s real accomplishments in research were in astronomy,” says Jakab. “He had done a great deal of significant work in sun spots and solar research, some of that while at the Smithsonian.”

Langley also had an abiding curiosity in aviation. He became consumed with the possibility of human flight after attending a lecture in 1886 and began experimenting with a variety of small-scale models. His interest, while serving as Secretary of the Smithsonian—sort of the unofficial chief scientist of the United States at the time—spurred others to further investigate the new field of aeronautics.

“This was still a period when people didn’t think flight was possible,” Jakab says. “If you were a young person in the 1890s contemplating a career in engineering, flight was not exactly an area you would go into. It wasn’t taken seriously by a lot of people. The fact that someone like Langley was starting to study flight gave the field credibility.”


Bell's Photo of Aerodrome in Flight
With Langley that day was his friend Alexander Graham Bell, the inventor of the telephone, who took this photo and later wrote that the Aerodrome moved with “remarkable steadiness.” (Wikimedia Commons)

Langley had some success with small model aircraft, and conducted aerodynamic research with a large whirling arm apparatus he designed. He increased the size of his prototypes and began to develop small engines to power them. His first attempts at unpiloted powered flight failed.

After Aerodrome No. 5 completed its two successful flights, Langley began to boast that he would be the first to achieve piloted powered flight. He repeated the success six months later with a new, improved Aerodrome No. 6.

However, Langley’s designs were inherently flawed. While he had made limited strides in the understanding of lift, thrust and drag, he failed to see that his models, when scaled up to carry a pilot and a larger engine, were structurally and aerodynamically unsound and incapable of flight.

“Langley had this fundamentally flawed notion about the relationship between aerodynamics and power,” Jakab says. “He came up with the Langley Law, which basically said the faster you flew, the less drag there was. He believed the faster you would go, the less power you would need. As strange as that sounds to us today, that’s what his data seemed to be telling him then.”

The Smithsonian secretary also did not realize he needed a better control system for a pilot to guide the aircraft in flight. The tail only moved vertically, which provided minimal pitch, while the rudder was located in the center of the fuselage, which offered little aerodynamic effect. Langley also miscalculated the stress factors of building a much larger plane.

Weighing 25 pounds, the Aerodrome No. 5 also included a small steam-powered engine and two fabric-covered propellers. (NASM)


“He didn’t grasp that the flight loads on the structure increase exponentially as you increase the size of the craft,” Jakab says. “To build a full-size aircraft, Langley simply scaled up the smaller models. If you tried to use that same structural design for something four times the size, it was not going to sustain itself—and that’s exactly what happened.”

Langley began building larger prototypes in preparation for test flights. The U.S. Department of War took an interest and provided $50,000 in grants to fund the project. Langley also found a young scientist, Charles M. Manly, who was more than willing to pilot the craft on what they hoped would be the first flight.


On October 7, 1903, the full-scale aircraft, called the Great Aerodrome, was loaded on the houseboat on the Potomac River, not far from what is now Marine Corps Air Facility Quantico, and made ready for takeoff. With news reporters watching and photographers taking pictures, the Great Aerodrome was launched—and then it promptly collapsed upon itself and fell into the water. A second attempt on December 8 produced the same result. Less than 10 days later, the Wright brothers would fly into history with Orville at the controls while Wilbur steadied the Wright Flyer as it began its takeoff run.

As might be expected, Langley was humiliated by the press for his failures in flight. That defeat, along with an embezzlement scandal involving Smithsonian accountant William Karr, left him deeply distraught.

“Those two catastrophic failures in 1903 ended Langley’s aeronautical work,” Jakab says. “He was a broken man because he took a lot of ridicule. He spent a lot of money and did not achieve a great deal in this field.”


Samuel P. Langley
Samuel Pierpont Langley served as the third Secretary of the Smithsonian Institution from 1887 to 1906 and was the founder of the Smithsonian Astrophysical Observatory. (Smithsonian Institution Archives)

Langley died in 1906 at the age of 71. Jakab believes Langley should be remembered for what he accomplished in 1896. His successes with Aerodrome No. 5 and Aerodrome No. 6 are significant and worthy of recognition today. In fact, the Smithsonian Institution once honored May 6 as Langley Day.

“It used to be an unofficial holiday and employees would get the day off,” Jakab says with a hint of mischief in his voice. “I’ve always advocated that we should reinstitute Langley Day and have May 6 off, but the administration has not taken me up on that so far.”

Langley’s Aerodrome No. 5 will be on view in the “Early Flight” gallery at the National Air and Space Museum, currently undergoing a major renovation. The museum is slated to reopen in the fall of 2022.