Fact of the Day


DarkRavie

Fact of the Day - NOSELESS

Did you know.... Many old statues from past civilizations have been disfigured in a remarkably similar way.

 

Many statues that have survived for thousands of years—including Greek, Egyptian, and Roman works—have not always made it to the modern era completely intact: A fair number are actually missing their noses. (Various museums and cultural institutions have such specimens in their collections, including a statue in the possession of the Acropolis Museum in Athens.)  But where have all these old statues’ noses gone? There are several reasons that could explain the disfigurement. 

 

Bad Reputation
One of the reasons stems from a common belief in antiquity that to damage the image of a person was also to damage that person. This became a practice in Ancient Rome, where the defilement and destruction of statues was connected to the reputations of the people embodied in those artifacts. The removal of noses has also been linked to the real-life penalty of facial mutilation that existed during these ages and continued through the era of the Byzantine Empire.

 

Stopping the Supernatural
There is another reason for the iconoclasm of noses that goes beyond attitudes toward people in the physical world: The belief in the connection between the physical form of a statue and who it represented extended past human beings and into the realm of statues of the gods. Some past civilizations, including Ancient Egypt, believed souls could inhabit statues and that it was therefore possible to communicate with the gods via effigies of their forms. 

 

The disfigurement of a statue could thus sometimes be a superstitious act, one designed to halt or thwart the presence of a deity or a supernatural being. People specifically targeted the nose because of how crucial the appendage was to the process of living; noses do, after all, play a significant part in how people breathe. To remove the nose was therefore seen as a way to kill the spirit the statue was believed to represent. 

 

Got Your (Plaster) Nose
There was a period during the 19th century when some museums attempted to repair the missing noses by adding replacements to these statues—the Glyptotek in Copenhagen contains a cabinet of more than 100 such plaster appendages that were once used to perform these artistic nose jobs.

 

The cosmetic repairs were partly done because the financial value of a statue during this period increased the more complete it was. As such, people had some additional motivation to add fake noses to any less-than-whole faces. 

 

By the 20th century, however, cultural preference had shifted back to displaying these sculptures in the condition in which they were left in antiquity. Today, it’s more likely you’ll see a statue missing its nose than one sporting a fake schnoz.

Source: Why Are So Many Old Statues Missing Their Noses?


Fact of the Day - WATERMELON SNOW

Did you know.... Watermelon snow may sound like something from the Candy Land board game, but the phenomenon is very real — Aristotle even wrote about a “reddish” snowbank he found on Mount Parnassus in the fourth century BCE. Visitors to Antarctica, the Himalayas, the Rockies, the French Alps, and Yosemite National Park have also glimpsed this colorful occurrence. In the 1800s, Scottish botanist Robert Brown finally determined the culprit: a species of algae called Chlamydomonas nivalis. Under a microscope, single-celled C. nivalis appear green, but they also feature a secondary red pigment, astaxanthin, which is a carotenoid, part of the chemical family that can make carrots orange. This astaxanthin is dormant for much of the year, but when winter ice and snow start to thaw and the algae surface to divide and photosynthesize, they trigger their astaxanthin as a barrier against the sun’s harsh UV rays, turning red in the process. Some say this rosy snow smells sweet and fruity, although experts warn that eating large amounts can cause digestive problems. 

 

Algae are responsible for creating much of the world’s oxygen and form the basis of most food webs; thousands of species exist. Algae are also often associated with major color changes: Dunaliella algae are believed to be one origin of the pink lakes that draw shutterbugs to places such as Australia, Senegal, and Spain, and many experts hypothesize that Trentepohlia algae caused the red rains that fell in Kerala, India, between July and September 2001. Recently, cold climates on different continents have witnessed an increase in clusters (or blooms) of C. nivalis algae, and scientists are working to understand why. Besides red, the blooms can appear green, gray, or yellow. 

 

Watermelon seeds were found in King Tutankhamun’s tomb.
King Tut was just 19 years old when he died in approximately 1324 BCE. When British archaeologist Howard Carter unsealed his tomb in 1922, he found 116 baskets and 12 additional containers full of goods and treasures that were meant to help the late pharaoh transition to the afterlife. In 1988, a graduate student in London named Christian Tutundjian de Vartavan came across 30 small cardboard boxes that had been languishing in a Royal Botanic Gardens storage room since their contents were discovered by Carter. Within the boxes, de Vartavan found around 25 plant food species that had once been inside the tomb, including sesame seeds, millet, barley, black cumin seeds, coriander, and watermelon seeds. However, more than 3,300 years ago, wild watermelons were the opposite of the juicy, sweet produce we think of today, and were likely included less for their deliciousness than for their hydrating properties.

 

 

Source: Some parts of the world have pink snow, known as ‘watermelon snow.’


Fact of the Day - 911

Did you know... Should you ever have to call 911, don’t worry about how many bars you have — you can make emergency calls even without cell service or a SIM card. This has been the case since the Wireless Communications and Public Safety Act of 1999 took effect, as one provision of the law required the Federal Communications Commission to make 911 the universal emergency number for all telephone services. This is why iPhones sometimes say “SOS only” and Android phones display the message “emergency calls only” when you don’t have reception.

 

There’s a caveat, however: Calls made from phones without active service can’t automatically deliver your location to the dispatch center, which also won’t be able to call you back if you become disconnected. Another unfortunate side effect is an increase in prank calls made from phones without service, as they’re essentially untraceable; children given phones without service as toys can sometimes make errant emergency calls as well. 

 

Because call centers are required to find out whether an emergency actually exists, such calls are a burden on the system — so use this safety net responsibly if you ever have to use it at all. Nonetheless, this is still an improvement on the pre-911 system, which required people to remember the phone number of their local police or fire station.

 

911 was chosen as the emergency number by AT&T.
The idea of implementing a nationwide emergency number dates back to 1957, when the National Association of Fire Chiefs suggested adopting a single number for reporting fires. It took another decade for the FCC to formally meet with AT&T about doing so, and in 1968 the company established 911 as the chosen digits. There were several reasons for the choice: 911 is short, easy to remember, quick to dial, and had never been used in any other context.

 

 

Source: You can call 911 in the U.S. even with no cell service.


Fact of the Day - HEMINGWAY ADAPTATIONS

Did you know... While Hemingway wasn’t generally a fan of the adaptations of his works, these five films are must-watches.


Ernest Hemingway wasn’t a huge fan of cinema. According to his son Patrick, “pictures on the silver screen were nothing but pure illusion (…) and not to be taken seriously.” 

His relationship with the screen, which over the course of his lifetime developed from a technological curiosity into a cultural force, was undoubtedly shaped by his identity as a writer—an artist who expressed himself not in images but in words, and who by the time of his death had seen his age-old trade swept aside by a new, different medium. While the author himself would probably have begged to differ, the following five films are considered some of the best Hemingway adaptations out there. 

 

A Farewell to Arms (1932) 

 

 

Originally published in 1929 and based on his experience serving as an American ambulance driver during the First World War, A Farewell to Arms follows a wounded lieutenant who falls in love with the nurse who cares for him during his recovery, culminating in the couple’s ill-fated attempt to leave the war behind.

 

This adaptation, directed by Frank Borzage and starring Gary Cooper and Helen Hayes in the two leading roles, was nominated for four Academy Awards and ended up winning two: one for Best Cinematography, and another for Best Sound. Made before Hollywood’s Production Code was enforced, the film was—for a time—banned on account of its portrayal of sexuality and violence.

 

For Whom the Bell Tolls (1943)

 

 

Like A Farewell to Arms, this novel was released only a few years before its big screen adaptation, in 1940. Also steeped in personal experience, it follows an American volunteer fighting against fascist forces during the Spanish Civil War. This soldier, too, falls in love, forcing him to choose between duty and happiness.

 

Directed by Sam Wood, this adaptation was nominated for Best Picture. Gary Cooper returns in the leading role, this time starring alongside Ingrid Bergman, appearing in Technicolor for the first time. Aside from faithfully adapting the story, the film stays close to the novel’s themes of pacifism and the futility of war.

 

The Killers (1964)

 

 

Unlike most of the other entries on this list, this film isn’t based on a novel or novella but a short story. First seen in 1927 in Scribner’s Magazine and later republished alongside other stories as part of the Men Without Women collection, it tells the story of two hitmen who set out to kill a former boxer—a premise revisited by Quentin Tarantino in Pulp Fiction.

 

This film, directed by Don Siegel, is actually the story’s second big-screen adaptation. The only adaptation Hemingway somewhat liked (thanks in no small part to Gardner), it stars future U.S. President Ronald Reagan in one of his last acting roles before he began focusing exclusively on his political career.

 

The Old Man and the Sea (1958)

 

 

Arguably Hemingway’s best-known literary work—and one that helped earn him the Nobel Prize in Literature—The Old Man and the Sea follows an elderly Cuban fisherman who spends 84 days out at sea trying to catch a giant marlin, only to see the object of his obsession devoured by sharks. 

 

Brought to the big screen by John Sturges, this adaptation stars Spencer Tracy as the fisherman (named Santiago) and Felipe Pazos as Manolin, the young boy whom he refuses to bring along onto the water. Winning an Academy Award for Best Score, the film preserves and effectively communicates its source material’s musings on courage, perseverance, and loneliness.

 

Captain Khorsid (1987)
The only entry on this list produced outside of Hollywood, as well as the only one to bear a different name, Captain Khorsid is loosely based on Hemingway’s 1937 novel To Have and Have Not, about a cash-strapped boat captain forced to take up illegal smuggling during the Great Depression.

 

Filmed in Iran and directed by Nasser Taghvai, this adaptation transplants the story from Cuba to the Persian Gulf and reimagines the captain as a one-handed sailor transporting wanted criminals. Captain Khorsid is not just one of the best Hemingway adaptations; it’s also widely considered one of the best Iranian movies ever.

 

Hemingway’s gripes with cinema were deeply personal for other reasons, too, including the fact that he famously disliked most Hollywood adaptations of his work. He once told Ava Gardner, who played Lady Brett Ashley in an adaptation of The Sun Also Rises, that one of the only things he liked about the film was her. 

 

A stickler for realism, Hemingway was happy to learn that a TV version of The Killers cast an actual Swedish boxer in the role of a Swedish boxer, and saddened to hear that The Old Man and the Sea would not be directed by Vittorio De Sica—known for Bicycle Thieves—nor star nonprofessional actors. 

 

Whether he liked it or not, though, Hemingway was one of the most frequently adapted writers of the 20th century—a feat attributable as much to his popularity as to his curt, direct, no-frills writing style, not unlike the language of screenplays.

 

 

Source: Best Ernest Hemingway Adaptations


Fact of the Day - LESSER KNOWN VERSES

Did you know.... It’s easy to bungle or forget the lyrics to “The Star-Spangled Banner.” But it may shock you to learn that most people only know a quarter of the song to begin with. The U.S. national anthem actually contains four stanzas, the last three of which are almost always omitted in live performances for brevity’s sake. All four, however, are part of Francis Scott Key’s original 1814 poem on which the national anthem is based.

 

Key wrote the poem soon after the Battle of Baltimore in the War of 1812, during which British forces bombarded Maryland’s Fort McHenry for 25 hours. As the smoke cleared in the wake of the battle, Key saw the American flag still flying over the fort, signifying a U.S. victory.

 

The familiar first verse begins with “O say can you see” and ends with the question, “O say does that star-spangled banner yet wave / O’er the land of the free and the home of the brave?” It refers to “bombs bursting in air,” while the second verse discusses the “dread silence” after battle. The second verse also celebrates the flag still flying as a symbol of a U.S. victory after the fighting. 

 

In Key’s original manuscript, he swapped the question mark of the first verse for an exclamation point in the second, thus ending with a definitive and joyous declaration: “‘Tis the star-spangled banner, O long may it wave / O’er the land of the free and the home of the brave!” With two additional verses, the song ultimately totals 32 lines and 32 bars of music. If you were to perform it in its entirety, the anthem would take around six minutes or more to sing — a long time to be on your feet at the start of a game.

 

Oliver Wendell Holmes wrote a fifth verse for “The Star-Spangled Banner.”
Nearly 50 years after Francis Scott Key wrote “The Star-Spangled Banner,” the esteemed American poet Oliver Wendell Holmes Sr. penned an unofficial fifth verse. Holmes wrote the verse in 1861 at the start of the U.S. Civil War, advocating for freeing enslaved people in the name of liberty.

 

Holmes wrote about “a foe from within” — a stark contrast from Key’s original poem about the British invading America. Holmes’ fifth verse also speaks of “the millions unchain’d who our birthright have gained,” and how their freedom is essential for keeping the flag’s “bright blazon forever unstained!”

 

 

Source: ‘The Star-Spangled Banner’ contains several lesser-known verses.


Fact of the Day - CRYSTALS IN YOUR EARS?

Did you know.... Did you know that hidden within your inner ear are microscopic crystals called otoconia? These tiny grains of calcium carbonate help your brain sense gravity and linear movement. 

 

The crystals rest on a gelatinous membrane above sensory hair cells inside two small chambers called the utricle and saccule. When you tilt your head or move forward, backward, or sideways, the otoconia shift slightly, moving the membrane and bending the hair cells beneath them. That movement sends signals to your brain about your body’s orientation relative to gravity, helping you stay balanced and aware of your position in space.

 

That process is just one component of your balance system, which also relies on the semicircular canals (SCC) to detect rotation, the eyes to track visual movement, and sensory feedback from the muscles and joints. But without otoconia, your sense of “up” and “down” would blur, and even small motions could leave you disoriented.

 

With age — or occasionally after a head injury — some otoconia can become dislodged and move into the nearby canals. Once there, they disrupt normal fluid movement, sending conflicting signals to the brain and causing sudden spinning sensations known as benign paroxysmal positional vertigo (BPPV). Though the dizziness can be startling, BPPV is common and treatable. A series of gentle head and body movements can use gravity to guide the stray crystals back to their proper chamber, often relieving symptoms within minutes.

 

So yes, you really do have tiny “ear rocks” — and though they’re microscopic, they play a surprisingly large role in keeping your world steady. 

 

Astronauts often have trouble with balance and coordination when they return to Earth.
After months in weightlessness, the body’s vestibular system — including the otoconia — is no longer calibrated to the pull of Earth’s gravity. In space, those crystals don’t settle downward the way they do on Earth, so they send mixed signals to the brain about which way is up. When astronauts reenter Earth’s gravitational environment, they may feel dizzy, off-balance, or unsteady while their brains and vestibular systems readjust.

 

To help reduce those effects, NASA uses simulations that create brief moments of weightlessness to challenge the inner ear. That training helps astronauts’ brains adjust more quickly once they return to Earth. Even with such preparation, though, it can take days or weeks for the otoconia to “relearn” how to respond to gravity and for the brain to interpret those signals correctly again.

 

Source: You have tiny crystals inside your ears.


Fact of the Day - PICASSO? A THIEF?

Did you know.... When the “Mona Lisa” was stolen from the Louvre on August 21, 1911, the art world immediately went into mourning — and began wondering who was behind the dastardly deed. One man soon under suspicion was none other than Pablo Picasso, whose name was given to the authorities by Honoré-Joseph Géry Pieret, the former secretary of Picasso’s friend (and famed poet) Guillaume Apollinaire. Pieret had previously stolen at least two Bronze Age Iberian sculptures from the Louvre and sold them to the then-up-and-coming cubist artist, who used them as inspiration for his painting “Les Demoiselles d’Avignon.” (At the time, the Louvre’s security was rather lacking; the paintings weren’t even bolted to the walls.) A terrified Picasso and Apollinaire were eventually brought to court, where it was determined that Picasso was indeed in possession of stolen art — just not the “Mona Lisa.” (The Iberian statues were quickly returned, and the judge let both Picasso and Apollinaire off with a warning.)

 

The search for the missing “Mona Lisa” took two years, during which time its popularity grew exponentially as reproductions were splashed across newspapers worldwide. In December 1913, Vincenzo Peruggia — an Italian employee of a firm that cut glass for the Louvre — emerged as the real thief after he tried to sell the painting to an antique dealer in Florence. (Peruggia is said to have believed that the “Mona Lisa” rightfully belonged to Italy and expected a reward for “returning” it.) Fortunately, the dealer called the police. Peruggia later served eight months in prison for his crime.

 

Napoleon once hung the “Mona Lisa” in his bedroom.
When the portrait (painted by Leonardo da Vinci between 1503 and 1519) was first displayed at the Louvre in 1815, it didn’t take long for admirers to become smitten by it — and her. Shortly thereafter, a number of “suitors bearing flowers, poems, and impassioned notes climbed the grand staircase of the Louvre to gaze into her ‘limpid and burning eyes,’” according to Dianne Hales, author of Mona Lisa: A Life Discovered. It wasn’t just museumgoers who developed a fancy for the painting, though: Napoleon once hung it in his bedroom and referred to its subject as “Madame Lisa.” Years later, Hales adds, he became “infatuated with a young Italian woman who bore a remarkable resemblance to the lady in the painting.” That woman was Teresa Guadagni, who just so happened to be a descendant of Lisa del Giocondo, the actual subject of da Vinci’s masterpiece.

 

 

Source: Picasso was once suspected of stealing the ‘Mona Lisa.’


Fact of the Day - POMATOES

Did you know... Tomatoes: so easy a potato can grow them. Well, not quite, but the two do occasionally join forces and result in the aptly named “pomato” plant. That two-for-the-price-of-one hybrid occurs when a tomato plant is grafted onto a potato plant, which is relatively easy to do since both belong to the Solanum genus of the nightshade family. 

 

The pomato isn’t its own fruit, however — it’s a plant that grows both foods at the same time: tomatoes on the vine and potatoes under the soil. Peppers, eggplants, and tobacco are also members of the Solanum genus, and tomato plants can be grafted onto them as well.

 

Nicknamed the “ketchup ’n’ fries” plant and sometimes called “tomtatoes,” these plants have been grown since at least 1833. In addition to the novelty of growing two things at once, pomatoes can benefit from both plants’ natural advantages: potatoes’ cold resistance and tomatoes’ heat resistance. The potatoes and tomatoes grown from these hybrid plants don’t taste any different than their normal counterparts, but they are more convenient to grow.

 

Heinz ketchup has a speed limit.
As the brand practically synonymous with ketchup, Heinz has a reputation to uphold. A big part of that image is the consistency and viscosity of its flagship product, which is meant to be thick enough to pour onto your fries at a diner by turning the bottle upside down but not so smooth that the ketchup splatters everywhere.

 

As part of its quality control process, the company has even imposed a speed limit of 0.028 mph on the condiment, which is checked at its factories. That’s the exact speed at which Heinz ketchup should move when poured upside down from its bottles. The speed limit even inspired a promotional campaign in collaboration with Waze, in which anyone forced to go 0.028 mph while stuck in traffic could get a free bottle of ketchup.

 

 

Source: ‘Pomatoes’ are potato plants that can also grow tomatoes.


Fact of the Day - PUMPKIN PIE

Did you know.... America’s pumpkin spice obsession dates back to at least the 1500s. No Thanksgiving dessert spread is complete without pumpkin pie. But how did this tasty treat come to be such an integral part of the annual feast? Let’s dig in.

 

The Origins of Pumpkin Pie
While it’s possible—even probable—that pumpkins were served at the 1621 harvest festival that’s now considered the predecessor to Thanksgiving, attendees definitely didn’t dine on pumpkin pie (there was no butter or wheat flour to make crust).

 

The earliest known recipes for pumpkin pie actually come from 17th-century Europe. Pumpkins, like potatoes and tomatoes, were first introduced to Europe in the Columbian Exchange, but Europeans were more comfortable cooking with pumpkins because they were similar to their native gourds.

 

By the 18th century, however, Europeans on the whole lost interest in pumpkin pie. According to HowStuffWorks, Europeans began to prefer apple, pear, and quince pies, which they perceived as more sophisticated. But at the same time pumpkin pie was losing favor in Europe, it was gaining true staple status in America.

 

In 1796, Amelia Simmons published American Cookery, the first cookbook written and published in the American colonies. Simmons included two recipes for pumpkin pudding cooked in pastry crust. Simmons’s recipes call for “stewed and strained” pumpkin, combined with a mixture of nutmeg, allspice, and ginger (yes, it seems our pumpkin spice obsession dates back to at least the 1500s).

 

How Pumpkin Pie Became a Classic Thanksgiving Dessert
But how did pumpkin pie become so irrevocably tied to the Thanksgiving holiday? That has everything to do with Sarah Josepha Hale, a New Hampshire-born writer and editor who is often called the “Godmother of Thanksgiving.” In her 1827 abolitionist novel Northwood, Hale described a Thanksgiving meal complete with “fried chicken floating in gravy,” broiled ham, wheat bread, cranberry sauce, and, of course, pumpkin pie. For more than 30 years, Hale advocated for Thanksgiving to become a national holiday, writing regular editorials and sending letters to five American presidents. Thanksgiving was a symbol of unity in an increasingly divided country, she argued.

 

Abraham Lincoln eventually declared Thanksgiving a national holiday in 1863 (to near-immediate outcry from Southerners, who viewed the holiday as an attempt to enforce Yankee values). Southern governors reluctantly complied with the presidential proclamation, but cooks in the South developed their own unique regional traditions. In the South, sweet potato pie quickly became more popular than New England’s pumpkin pie (mostly because sweet potatoes were easier to come by than pumpkins).

 

Now, pumpkin pie reigns deliciously supreme as the most popular holiday pie across most of the United States, although there are those in the Northeast who may prefer apple, and the South is often split between apple and pecan, another Southern staple.

 

Source: Why Do We Eat Pumpkin Pie on Thanksgiving?


Fact of the Day - PHILADELPHIA CREAM CHEESE

Did you know.... The City of Brotherly Love has clear-cut claims on many food origins — cheesesteaks, stromboli, and even root beer. But ironically, Philadelphia Cream Cheese is not from Philly. The iconic dairy brand secured its misleading name (and gold-standard status) thanks to a marketing ploy that’s been working for more than 150 years … and it’s all because of Pennsylvania’s reputation for impeccable dairy. Small Pennsylvania dairies of the 18th and early 19th centuries were known for using full-fat milk and cream to make rich cheeses — in contrast to New York dairies, which mostly used skim milk — and because the perishables couldn’t be easily transported, they gained a reputation as expensive luxury foods. So when upstate New York entrepreneur William Lawrence began making his cream cheese (from skim milk, with lard added for richness) in the 1870s, he needed a name that would entice customers and convey quality despite it being made in Chester, New York, and not Philadelphia. Together with cheese broker and marketing mastermind Alvah Reynolds, Lawrence’s cheese was branded under the Philadelphia name in 1880, which boosted sales and promoted its popularity with home cooks well into the early 1900s. 

 

Lawrence is often credited with inventing cream cheese, and culinary lore frequently cites its creation as an accident. But some food historians say he wasn’t the first person to concoct the cheesy spread — recipes for it had been circulating for some time in newspapers and magazines. Lawrence did, however, create the first commercial cream cheese factory, which made the product accessible to home cooks. Lawrence eventually left the dairy industry for politics, becoming the mayor of Chester, but his legacy remains in every foil-wrapped block found in an American fridge.

 

Cream cheese has been wrapped in foil since the 1880s.
Besides the whipped or flavored versions that usually come in plastic tubs, most American cream cheese comes in a foil-wrapped block — and it’s almost always been that way. William Lawrence’s first mass-produced cream cheeses were wrapped in thick tissue paper commonly used by cheesemongers. But a few years into production, the rebranded Philadelphia Cream Cheese of the 1880s opted for foil packaging that helped the moldable cheese keep its shape — and more importantly, provided a firm wrapper that was an easier surface on which to print the brand name. Today, Kraft (the current owner of the Philadelphia brand) says the foil helps retain moisture and freshness.

 

 

Source: Philadelphia Cream Cheese isn’t actually from Philadelphia.


Fact of the Day - Brad's Drink?

Did you know... Pepsi has been nearly synonymous with cola for more than a century, but it wasn’t always called that. We have pharmacist Caleb Bradham to thank for the bubbly beverage, as well as its original name: Brad's Drink. Believing that his concoction had digestive benefits, Bradham sold it at his pharmacy in New Bern, North Carolina. Brad’s Drink didn’t last long, however — it was renamed Pepsi-Cola in 1898.

 

The new name was partly derived from the word “dyspepsia,” a technical term for indigestion, and was meant to convey the tasty beverage’s supposed medicinal properties. Bradham trademarked the name in 1903, and the company grew exponentially over the next few years, with 240 franchises opening across 24 states by 1910. Pepsi isn’t the only major company to undergo a name change, of course — 7-Eleven used to be known as Tote’m Stores, Nike was founded as Blue Ribbon Sports, and Canon was originally called Precision Optical Instruments Laboratory, among others.

 

Dr. Pepper used to be served warm.

Dr. Pepper used to be advertised as a hot holiday drink, a response to declining sales in the winter months. The original ad from the 1960s even came with helpful instructions: Simply warm the beverage in a saucepan until it steams, then pour it over a lemon slice. The result was a “distinctively different hot Dr. Pepper” and “the holiday favorite of the proud crowd,” per the festive commercial. Heating the drink to 180 degrees Fahrenheit eliminated the carbonation, leaving behind a sweet, flat flavor that was especially popular in the South.

 

Source: Pepsi was originally called ‘Brad’s Drink.’


Fact of the Day - WHEELED SUITCASES

Did you know... One small step for man took place before astronauts could even roll their suitcases across the spaceport. The first wheeled suitcase was invented in 1970, a year after the moon landing. It was the brainchild of inventor Bernard D. Sadow, who called it one of his best ideas, despite the fact that the product wasn’t immediately popular. 

 

Mind you, this wasn’t the upright luggage we know today. “The Luggage That Glides,” as Macy’s marketed the product after buying it, rolled on its side and was pulled with a strap attached to the top. The innovation may not have been very sophisticated, but it nonetheless improved ease and convenience by adding wheels to something that could certainly use them.

 

Sadow applied for a patent in 1970 and received it in 1972. “Whereas formerly, luggage would be handled by porters and be loaded or unloaded at points convenient to the street, the large terminals of today, particularly air terminals, have increased the difficulty of baggage-handling,” the patent stated. “Baggage-handling has become perhaps the biggest single difficulty encountered by an air passenger.” That remains true today, even with the 1987 invention of the Rollaboard, the now-ubiquitous style of upright wheeled suitcase.

 

No one has walked on the moon in more than 50 years.
A dozen people have walked on the moon, but no one has done it in more than half a century. Eugene A. Cernan was the last astronaut on the lunar surface, a feat he achieved as part of the Apollo 17 mission in December 1972. He previously served as the lunar module pilot of Apollo 10.

 

Cernan logged 566 hours and 15 minutes in space throughout his NASA career, 73 hours of which were spent on the surface of the moon. “As I take man’s last step from the surface, back home for some time to come … America’s challenge of today has forged man’s destiny of tomorrow,” he said as he climbed the ladder for the final time.

 

Source: The moon landing happened before wheeled suitcases were invented.


Fact of the Day - THE TERM "GAUDY"

Did you know.... No, it did not come from a famous architect with a similar-sounding name.

 

The term “gaudy” refers to something that is needlessly extravagant, intricate, or even tasteless. As far as historians can tell, the adjective first appeared sometime in the 16th century, though the noun “gaud,” meaning an ornamental trinket such as a bead, dates back to the early 15th century.

 

Is There Overlap Between the Term “Gaudy” and Antoni Gaudí?
Today, many people erroneously assume that the term derives from the name of the famous Catalan architect Antoni Gaudí, thanks to his unusual structures and colorful designs. 

 

Gaudí lived from 1852 to 1926, working throughout his life as an architect and designer. His most famous masterpiece is the Sagrada Familia, located in the heart of Barcelona and known for its sharp, angular, ornate style. Elsewhere in the city, Gaudí also designed Park Güell, Casa Batlló, and Casa Milà, all of which gained renown for their unique, extravagant, ornate designs. 

 

Gaudí came from a line of coppersmiths, a heritage that instilled in him a deep respect for skilled craftsmanship and the handling of materials. A sickly child, Gaudí spent a lot of time in nature, absorbing its forms and character before later echoing them in his architectural masterpieces and in his emphasis on sustainable practices. 

 

While Gaudí’s work as an architect reinforces the meaning of the term “gaudy,” the two are not directly related since he lived centuries after the word became popular. 

 

The Origins of the Term “Gaudy”
Some historians suggest that “gaudy” may be related to the Middle English word “gaudegrene,” which referred to a chartreuse-colored dye extracted from a plant. The color was eventually used to enhance ornamentation by creating vivid, eye-catching designs. The plant responsible for the dye was labeled “gaude” in Old French, which may have given rise to the word we use today. 

 

As early as 1434, the word “gaud” was also used to refer to a trinket. And, in some religious circles, the term also referred to a prayer bead. Historians suggest that the word evolved from the Latin word “gaudium,” which translates to “joy.” The gaud, then, was used in Catholic rosaries as a way to point to the Joyful Mysteries while also rejoicing in the early life of Jesus Christ. 

 

Source: Where Does the Term “Gaudy” Come From?


Fact of the Day - FIRST CHARACTER BALLOON

Did you know... Spoiler alert: The balloon was not deflated after the parade.

 

When Macy’s first introduced a helium-filled character balloon into the Thanksgiving Day Parade in 1927, people watching from the sidewalks of New York didn’t quite know what they were looking at.

 

Initially, the parade had relied on animals from the Central Park Zoo—everything from camels to bears to monkeys. This caused some issues, including fearful children and tired animal handlers. Macy’s executives started looking for an alternative that was safer and more controlled, but still visually dynamic. This led to an idea that was novel for its time: giant floating balloons shaped like characters! 

 

Macy’s needed someone for the job, so it hired Tony Sarg, a German-American puppeteer and illustrator already famous for his animated holiday window displays at Macy’s. Sarg envisioned upside-down marionette figures that floated overhead instead of dangling from strings, and he partnered with the Goodyear company to experiment with rubber fabric and large air-filled designs. 

 

Enter Felix the Cat
Felix the Cat was chosen as the first character balloon simply because he was everywhere at the time—newsreels, toys, print comics, store displays, even novelty clocks. To audiences in 1927, Felix was as recognizable as any movie star, which made him the obvious candidate to float above the parade.

 

The balloon itself looked almost comically simple by today’s standards: stiff, rounded edges, a painted-on grin, and a body that didn’t bend so much as tilt. But in that moment, seeing Felix drifting above Manhattan gave people the uncanny feeling that a cartoon character had stepped out of the screen and into the real world.

 

The first Felix balloon was filled with air—helium wouldn’t make its parade debut until the following year—but the early work between Sarg and Goodyear still set the blueprint for every balloon that followed.

 

Macy’s Originally Set Balloons Free—Including Felix the Cat
What Macy’s didn’t account for was how to get gigantic rubber balloons down once they’d been filled. Lacking the technology, or a better idea, handlers simply let the balloons go, and the Felix the Cat balloon eventually burst. The practice thrilled spectators, but it dismayed environmentalists and anyone who understood a little about aviation safety in the 1920s.

 

In 1932, a pilot decided to try to capture one of these renegade balloons with her biplane. She almost crashed when the balloon wrapped itself around her wing.

 

The near-miss was enough to end the tradition for good. Macy’s realized that releasing multi-story balloons into open airspace—especially when ambitious stunt pilots started chasing them like trophies—was a disaster waiting to happen. By the following year, the practice was officially scrapped, and the parade shifted toward safer, more controlled handling techniques that eventually evolved into the system used today.

 

From that point on, the balloons were treated less like oversized toys and more like aircraft. It was a far cry from the early days, when Felix the Cat could simply float off toward the Atlantic without anyone giving it a second thought, but that first balloon—simple, stiff, and a little awkward—changed the parade forever.

 

And while Felix didn’t survive his voyage, he became a legend the moment he was let go from the handlers’ grip and drifted out of sight.

 

Source: The First Character Balloon in the Macy’s Thanksgiving Day Parade—And What Happened to It


Fact of the Day - CROOKED PINE TREES

Did you know... Near the small town of Gryfino in northwestern Poland is a forest unlike any other. It’s not the biggest or the tallest, but it just might be the strangest. In this forest stand about 400 pine trees that have all been uniformly deformed into a shape resembling the letter “J.” These trees are bent by about 90 degrees at the start of their trunk, and then slowly grow upward some 50 feet, creating a curve that can be nearly 10 feet long in some cases. The effect is so stunning that the forest earned the nickname Krzywy Las, or “Crooked Forest,” and has become a significant tourist attraction. But perhaps the strangest aspect of this natural phenomenon is that no one is exactly sure why the trees are growing like this in the first place. 

 

Estimates show that these crooked pines were likely planted sometime in the late 1920s or early 1930s, though no records show who planted them. While some have theorized that a bizarre snowstorm or a strange effect of the Earth’s gravitational pull somehow deformed the trees, the leading theory is that their odd shape was created by human hands. This theory argues that local foresters interrupted the trees’ growth when the plants were between 7 and 10 years old, bending them so that furniture and boats could one day be fashioned from their unique shape. But with the outbreak of World War II and the invasion of Poland in 1939, the trees were abandoned and left to grow into their famously crooked forms. Sadly, many of these trees are now dying (perhaps partly as a result of visitor traffic), and so the Gryfino Forest District has begun a revitalization project, setting aside two 10-acre plots for recreating the crooked pines. The project will experiment with planting seeds from existing crooked pines to observe any unusual traits. The forest service is also clearing away the tops of some dead trees that pose hazards — while leaving behind their characteristic curve for tourists to enjoy.

 

There are several dozen trees living in the U.S. that have been to the moon.
In February 1971, NASA astronauts Alan Shepard and Edgar Mitchell landed on the moon as part of the Apollo 14 mission. Astronaut Stuart Roosa was orbiting overhead in the command module, and packed away in his personal kit were seeds from five tree varieties — loblolly pine, sycamore, sweetgum, redwood, and Douglas fir. This lunar journey was part of a joint experiment between NASA and the U.S. Forest Service to test the effects of zero gravity on seeds and their subsequent growth into trees. After Apollo 14 returned safely to Earth, these seeds were eventually planted throughout the U.S., where they became known as “moon trees.” One was planted at the White House, and many others throughout the country in state capitals and parks. (Scientists never found evidence that the trip to space had affected the seeds or trees.) The project was mostly forgotten as the fervor surrounding the Apollo program subsided, but in 1996 NASA astronomer and archivist David Williams received an email from a third-grade teacher in Indiana about a nearby “moon tree.” Fascinated by this lost piece of NASA history, Williams began cataloging the locations of these arboreal space voyagers. Today, you can visit several dozen “moon trees” that are surviving and thriving across the U.S.

 

Source: Poland has a forest of crooked pine trees — and no one is sure why.


Fact of the Day - SEA WATER TO FRESH WATER

Did you know..... When seawater freezes, it undergoes a remarkable natural desalination process — known as brine rejection — that expels much of its salt content. So when that seawater ice melts, the result is almost pure fresh water. It may seem impossible, but it all comes down to what happens at the molecular level when the salty water freezes.

 

As ocean temperatures drop below freezing, water molecules begin forming ice crystals with a highly organized structure. That structure cannot incorporate salt ions, so those ions are largely excluded, pushing the salt out of the ice. Initially, not all of the salt is rejected; some of it is trapped in the ice, forming pockets of concentrated brine. The brine remains in a liquid state because it requires lower temperatures to freeze. So at that stage, the sea ice still has a high salt content, but over time, the ice continues to eject the brine.

 

Thanks to this phenomenon, sea ice contains significantly less salt than the water it came from. In liquid form, ocean water has an average salinity of 35 parts per thousand, while newly formed seawater ice has a salinity of between 12 and 15 parts per thousand. As the ice grows thicker and brine rejection takes place, the salinity decreases significantly: Arctic first-year ice has an average salinity of 4 to 6 parts per thousand, and sea ice four years or older is nearly free of brine.

 

When seawater that has been frozen for years eventually melts, the water released is dramatically fresher than the ocean around it. In these cases, when nearly all the brine is gone, the ice can be fresh enough to provide drinking water when melted — something that’s often done during polar expeditions. 

 

In his 1911 book, Polar Exploration, the British polar scientist and explorer William Speirs Bruce described how whalers and exploring ships in the Arctic extracted water from pools on the ice, which was often drinkable fresh water. Today, polar expedition members still take an occasional drink from these pools.

 

The world’s largest waterfall is underwater.
When we think of the world’s mightiest waterfalls, we normally picture them cascading majestically over cliffs to a turbulent plunge pool far below. But the world’s largest waterfall is actually located in the ocean. Known as the Denmark Strait cataract, it flows beneath the Denmark Strait, which separates Iceland and Greenland. At the bottom of that strait, a series of cataracts — beginning some 2,000 feet beneath the surface — plunge to a depth of 10,000 feet, a drop of nearly 2 miles. 

 

This underwater waterfall exists due to density differences between the two water masses on either side of the Denmark Strait. When southward-flowing cold water from the Nordic Seas meets warmer water from the Irminger Sea, the cold, dense water quickly sinks below the warmer, less dense water and plunges over a huge drop in the ocean floor. The resulting downward flow is estimated to exceed 123 million cubic feet per second. By comparison, the discharge of the Amazon River into the Atlantic Ocean is just 7.74 million cubic feet per second, roughly one-sixteenth as much.

 

Source: Ice from seawater melts into fresh water.


Fact of the Day - SIGN LANGUAGE

Did you know.... Some deaf people have been known to continue signing even after they drift off to sleep, through unconscious movements of the hands. The phenomenon is similar to talking in your sleep: The brain’s language and motor circuits remain active (though suppressed) while you snooze, and during sleep-talking episodes brain activity more closely resembles the waking state, allowing speech to slip out. 

 

Sign and spoken languages involve similar neural processes, so while hearing people may talk in their sleep, this subconscious impulse takes the form of sleep signing or involuntary hand movements for deaf people. Sleep signing has been documented since as early as 1935, when electrophysiology studies found deaf people made signing motions while asleep. Bursts of activity were observed in the fingers and arms, a pattern not seen in hearing sleepers.

 

In one widely cited 2017 case, a 71-year-old deaf man with REM sleep behavior disorder was observed signing fluently in his sleep. Because that disorder prevents the usual temporary paralysis that occurs during REM (rapid eye movement) sleep, he was able to sign so clearly that researchers could even decode aspects of his dreams. 

 

This same mind-body link may also explain why dogs sometimes bark or twitch their paws during REM sleep and why some chimpanzees who know sign language have also been observed signing as they snooze.

 

Martha’s Vineyard once had its own sign language.
In the 18th and 19th centuries, Martha’s Vineyard, Massachusetts, was home to a surprisingly high number of deaf residents. At its peak, about one in 155 of the island’s residents was born completely deaf, a ratio much higher than in the general U.S. population, in which about one in 5,700 people is deaf. The higher rate on Martha’s Vineyard is believed to have been caused by a hereditary form of deafness traced to settlers from Kent, England.

 

To communicate, the islanders developed Martha’s Vineyard Sign Language (MVSL), a fully functional language. Many hearing residents also learned and used it daily, making communication seamless across the community. MVSL declined in the late 19th and early 20th centuries when American Sign Language gradually replaced it. It wasn’t rediscovered until the 1970s, when anthropologist Nora Ellen Groce traced the island’s unusually large deaf population and uncovered the language.

 

Source: Some deaf people use sign language in their sleep.


Fact of the Day - BEAVER DAM

Did you know... While there are a few human-made objects visible from space, there’s only one known example constructed by beavers: the world’s largest beaver dam, located in the Peace-Athabasca Delta of Canada’s Wood Buffalo National Park. You can’t see it from space with the naked eye, but the dam was discovered on October 2, 2007, using satellite imagery provided by Google Earth. It appears to have been built during the last five decades, as photographs taken of the same location in 1975 show limited beaver activity.

 

Estimates put the length of the dam at more than 2,600 feet, and based on satellite imagery, it covers a surface area of roughly 750,000 square feet. The pond created by the dam is estimated to hold nearly 2.5 million cubic feet of water. To use an analogy Canadians would surely approve of, that’s roughly the same amount of water needed to fill 1,600 standard ice hockey rinks.
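That hockey-rink analogy is easy to sanity-check with a quick back-of-envelope calculation in Python. The rink dimensions and ice thickness below are my own assumptions (a standard NHL sheet with roughly an inch of ice), not figures from the source:

```python
# Back-of-envelope check: how many hockey rinks' worth of ice
# could the beaver pond's water fill?
pond_volume_ft3 = 2.5e6        # ~2.5 million cubic feet (from the fact above)

rink_area_ft2 = 200 * 85       # standard NHL rink: 200 ft x 85 ft (assumed)
ice_thickness_ft = 1.1 / 12    # ~1.1 inches of ice (assumed)
rink_volume_ft3 = rink_area_ft2 * ice_thickness_ft

rinks = pond_volume_ft3 / rink_volume_ft3
print(round(rinks))            # comes out right around 1,600
```

With those assumed dimensions the answer lands right around 1,600 rinks, so the comparison holds up.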

 

Given its remote location miles from any paved road or trail, accessing the dam requires a multiday trek through wetlands and forest. That inaccessibility poses such a challenge that only one person is known to have ever visited the dam itself. In July 2014, adventurer Rob Mark completed the perilous trek, snapping a celebratory selfie with the beaver lodge behind him. Upon his arrival, he noted how difficult it is to grasp the dam’s enormity up close, and how its sheer size is better appreciated in photographs taken from space.

 

China’s Three Gorges Dam slowed the Earth’s rotation.
Massive infrastructure projects can, depending on their scale, measurably affect the Earth’s rotation. One such example is China’s Three Gorges Dam, which measures 600 feet tall and 7,500 feet long and spans the Yangtze River. When filled, the dam’s reservoir is able to hold 10 trillion gallons of water.

 

In 2005, NASA scientist Benjamin Fong Chao calculated that a completely filled reservoir would slow the Earth’s spin, increasing the length of each day by 0.06 microseconds (or 60 billionths of a second). The effect comes from redistributing mass farther from the Earth’s rotation axis: Much like a spinning skater extending their arms, a planet whose mass sits farther from its axis has a greater moment of inertia and therefore spins more slowly. Raising water high above sea level does exactly that, nudging the planet’s rotation rate down ever so slightly.
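That figure can be sanity-checked with the angular-momentum relation ΔT/T ≈ ΔI/I. Below is a very rough Python sketch; the moment of inertia, the reservoir's latitude, and the simplifying idea that the water is drawn from the world ocean are all my own assumptions, not Chao's actual model:

```python
import math

# Order-of-magnitude check of the day-length change using
# conservation of angular momentum: dT / T ~ dI / I.
day_s = 86400.0
I_earth = 8.0e37                 # Earth's moment of inertia, kg*m^2 (approx.)
R = 6.371e6                      # Earth's mean radius, m

gallons = 1e13                   # 10 trillion gallons (from the fact above)
mass_kg = gallons * 3.785        # ~3.785 kg of water per gallon

# Moving that mass from the ocean (mean squared distance from the spin
# axis, averaged over a sphere, is (2/3) R^2) to the reservoir at ~30.8 deg N:
d2_ocean = (2 / 3) * R**2
d2_reservoir = (R * math.cos(math.radians(30.8)))**2

dI = mass_kg * (d2_reservoir - d2_ocean)   # change in moment of inertia
dT_us = day_s * dI / I_earth * 1e6         # day-length change, microseconds
print(f"{dT_us:.2f} microseconds")
```

This crude estimate lands around a tenth of a microsecond, the same order of magnitude as Chao's 0.06-microsecond figure, which is about all a calculation this rough can promise.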

 

Source: The world’s largest beaver dam is visible from space via satellite imagery.


Fact of the Day - STRINGED POPCORN

Did you know.... The history of why people started using popcorn as Christmas tree decor.

 

The tradition of decorating evergreen trees at Christmastime first emerged in Europe in the Middle Ages. From there, it was taken to America by German immigrants, while in England the practice was popularized by the German spouses of King George III and Queen Victoria in the late 1700s and early 1800s. Ornamented trees have been a popular and widespread Christmas staple ever since. 

 

The History of Decorating Christmas Trees
Precisely what we’ve hung on our Christmas trees to decorate them each year has changed over the centuries, though. The Lutheran Germans who first helped to establish the tradition typically decorated their so-called “paradise tree” (a relic of ancient religious plays and performances) with a mixture of apples, dried fruits, sweet bread, wafers, and other sugary treats, all intended to act as symbols of the biblical Garden of Eden.

 

The Victorians—inspired by a famous sketch of Queen Victoria’s royal family standing around a tree decorated with toys, candles, and baubles—were quick to see a market in manmade ornaments and shiny trinkets, and produced glass baubles and glittering metal toys that could be suspended from the branches with ribbons. And as manufacturing techniques improved, Christmas turned kitsch in the 20th century, with the first mass-produced strings of electric lights and reflective plastic tinsel.

 

But it was way back in 1842 that a German-born teacher named Charles Minnigerode is known to have used garlands of popcorn to decorate a Christmas tree that he put up in Williamsburg, Virginia. Minnigerode certainly wasn’t the first person to use popcorn as a decorative object (the Aztecs had used popped corn kernels on the likes of ceremonial headdresses and even jewelry), yet his Christmas tree is the earliest that we know to have been adorned with garlands of popcorn. But why use popcorn at all? 

 

Well, at least part of the answer to that question involves the date of Minnigerode’s tree. The tradition of decorating Christmas trees was certainly fairly well established by that time, but the mass production and widespread availability of metal ornaments, sparkly toys, and colorful glass baubles were still several decades away.

 

As a result, people had to take something of a do-it-yourself approach to finding or making Christmas decorations for their trees, relying on items and materials that were easy to come by at home. So, as well as strings of readily available popcorn, early American trees were often decorated with the likes of cutout paper shapes, seed pods, dried slices of apple, pine cones, and sugarwork—anything and everything that people could easily lay their hands on. 

 

Moreover, as the country’s staple food in the mid-19th century, corn would have been not only plentiful and freely available at that time, but relatively inexpensive too, making it all the more attractive as a makeshift Christmas decoration. The use of popped corn was also a neat way of continuing the long-standing tradition of using edible produce as Christmas decorations. 

 

In fact, as well as popcorn, American Christmas trees in the 19th century were often festooned with the likes of sugar cookies, raisins, apples, threaded strings of dried cranberries, and marzipan—and even Minnigerode’s tree is known to have also been decorated with nuts as well as garlands of popcorn.

 

All of this harks back to the ancient Lutheran tradition of the “paradise tree,” with the decorative use of fruits, sweets, and other foodstuffs intended to symbolize the bounty of the Garden of Eden. 

 

So when you’re stringing together garlands of popped corn for the tree this year, you’re not only continuing a long-standing American festive tradition, but an even longer-standing practice whose roots lie in the European Middle Ages.

 

Source: Why Do People String Popcorn on Christmas Trees?

Link to comment
Share on other sites

Fact of the Day - ONE NOSTRIL AT A TIME

Did you know.... The human nose is a biological wonder. It can smell up to 1 trillion odors, trap harmful debris in the air before it enters your lungs, and even help regulate emotion. But arguably its most important job is to condition the air you breathe before that air enters your respiratory tract. This means warming and humidifying the air before it passes to your throat and beyond. To do this, the nose undergoes a nasal cycle in which one nostril sucks in the majority of the air while the other nostril takes in the remaining portion. A few hours later (on average), the nostrils switch roles. This cycle is regulated by the body’s autonomic nervous system, which swells or deflates erectile tissue found in the nose. Although we don’t notice this switch throughout the day, if you cover your nostrils with your thumb one at a time, you’ll likely observe that air flow through one is significantly higher than in the other. This is also why one nostril tends to be more congested than the other when you have a cold (the nondominant one gets more filled with mucus).

 

There are a few possible reasons for this nasal back-and-forth. Some scientists theorize that the cycle actually improves our sense of smell. Because scent molecules degrade at differing rates, some smells are easier to identify through fast-moving air (in the dominant nostril), while others are more easily picked out in slower currents of the nondominant, usually more congested, nostril. Very few smells can get past our nose undetected thanks to this alternating nasal superpower.

 

The size of a human nostril is determined by climate.
Nostrils come in all shapes and sizes, and like most other parts of the human body, that’s the result of millions of years of evolution. In 2017, scientists confirmed a long-held theory that climate plays a vital role in determining the size of our nostrils. People whose ancestors hail from warm, humid climates have little need for nostrils to humidify air before it enters the lungs. As a result, their nostrils are wider. But in cold, dry climates — where air easily irritates the lining of the nose and throat — smaller nostrils create a more “turbulent” air flow, causing the air to mix in the nose. This turbulent mixing interacts with the nose’s mucus-covered lining, which warms and humidifies the air before it passes to the lungs. Over the long, grinding process of evolution, as humans traveled farther from the equator, smaller nostrils were naturally selected as better-suited for the cold and dry areas of the world.

 

Source: People breathe primarily out of one nostril at a time.

