
Fact of the Day


DarkRavie


Fact of the Day - MOO DENG PYGMY HIPPO


Did you know... “Moodeng Moodeng” has been released in four languages, with a video to accompany each.

 

The world’s most famous pygmy hippo, Moo Deng (which is also a type of meatball and means “bouncy pork” in Thai), has entertained the internet since she made her debut in the summer of 2024. The 4-month-old celebrity gained fame on the social media accounts of Thailand’s Khao Kheow Open Zoo, where she resides. People quickly became obsessed with Moo Deng thanks to her petite stature and cute features. Now, fans can relish the adorable hippo’s existence through a new song.

 

As the Associated Press reports, renowned Thai composer Mueanphet Ammara produced and wrote the song for the zoo animal and her fans. GMM Music, a large music company based in Thailand, released it.

 

The catchy tune was recorded in four languages—Thai, English, Chinese, and Japanese—and has four separate videos. Each lasts for only 50 seconds—though if you’re a fan of the pudgy hippo’s antics, it’s time well spent. The videos feature clips of Moo Deng thrashing around, tripping over her own feet, and generally being silly. For example, the English version shows one clip of Moo Deng falling to the beat of the music for the duration of the song. Meanwhile, the Thai version shows a collection of viral Moo Deng clips.

 

The song plays up the creature’s rubbery appearance, with lyrics like “boing, boing” and “bounce with me, Mom.” You can listen to the English version of “Moodeng Moodeng” below:

 

 

Moo Deng may be the biggest celebrity success story of the year. She’s inspired a plethora of fan art, memes, and merchandise. Even makeup influencers have shared Moo Deng-inspired looks, complete with rosy cheeks, grey eyeshadow, and glossy skin. It turns out the pygmy hippo’s fame has been good for Khao Kheow Open Zoo’s business, too. Her popularity has boosted the zoo’s social media presence, which now has over 530,000 followers on Facebook and more than 133,000 on X. The attraction has also seen a huge increase in weekend visitors since the baby hippo went viral.

 

Source: Moo Deng the Viral Pygmy Hippo Has Inspired a Thai Pop Song


Fact of the Day - CORNUCOPIA


Did you know... Long before Americans made it about Thanksgiving, the cornucopia invoked magic goats and god-bulls.

 

The cornerstone of Thanksgiving decor is the not-so-humble cornucopia: a horn whose broad opening overflows with the fruits of a bountiful harvest. Today the symbol may seem as American as, well, Thanksgiving—but it didn’t start out that way.

 

Baby Zeus’s Big Break
In Greek mythology, the Titan ruler Kronos fears that his children will usurp him, so he eats each baby soon after their birth. His wife (and sister) Rhea saves their son Zeus from this fate by enlisting Amalthea to raise him secretly in Crete.

 


'The Nurture of Jupiter' (the Roman equivalent of Zeus)

 

Amalthea is either a goat or a water nymph with access to goats, depending on which version of the story you’re reading; in any case, baby Zeus grows up on goat’s milk. At some point, either Zeus breaks off one of the goat’s horns, or the goat snaps off her own horn and gives it to Zeus. The horn is then filled with fruit (and flowers or herbs), either by Zeus’s godly power or manually by Amalthea. In some accounts, Zeus’s power guarantees that the horn will never run empty. Suffice it to say that most of the details of this tale are up for debate—but the main point is that Zeus was directly or indirectly behind the creation of the inaugural horn of plenty, which came from a goat.

 

According to that legend, at least. Ancient Romans had their own origin story involving Hercules. In Book 9 of Ovid’s Metamorphoses, the river god Achelous describes transforming into a bull during a battle with Hercules, who then tears off one of his horns. “Grasping one of my horns in his brutal hand, he broke it, tough as it was, and tore it away from my forehead, leaving me maimed,” Achelous says. “The horn was filled by the naiads with fruit and with fragrant flowers and, thus made holy, enriches the Spirit of Plenty.”

 


An engraving of 'Hercules and Achelous' 

 

After Achelous finishes his story, one of his nymph servants appears, Ovid wrote, “displaying the Horn of Plenty and carrying all the choicest fruits of the autumn to serve the guests for the second course.” It’s from Latin that we get the word cornucopia: Cornū cōpiae means “horn of plenty.”

 

The cornucopia didn’t stay confined to either originator. Various deities representing agriculture or prosperity have been depicted with it, including the Greek goddess of harvest, Demeter (Zeus’s sister), and her Roman counterpart, Ceres; and the Greek goddess of chance, Tyche, and her Roman counterpart, Fortuna.

 

Eventually, Americans co-opted it, too.

 

A Cornucopia of Cornucopias
It’s unclear who first featured a cornucopia in their Thanksgiving decor; mentions of them date at least as far back as the 1870s.

 

“Below the speaker’s stand was an immense cornucopia, from whose open mouth rolled melons, apples, peaches, pears, oranges, lemons, limes and olives,” one newspaper correspondent wrote of a Presbyterian service in Santa Barbara, California, on Thanksgiving in 1875. “This great ‘Horn of Plenty’ was backed by sheaves of grain, interspersed with feathery pampas grass.”

 


A Thanksgiving greeting card from the early 20th century.

 

Early-20th-century columnists penned instructions for DIY cornucopia centerpieces made from wire covered with linen, wrapping paper, and even silk. “The cornucopia, or horn of fruitfulness and abundance, always used by the Greeks and Romans as the symbol of plenty, is an apt expression of the sentiment that prevails on Thanksgiving day,” Vermont’s St. Johnsbury Republican wrote in 1913. “The contents should be arranged so that the cornucopia is overflowing, the fruits and flowers running out of the horn and over the table.”

 

By the 1930s, the cornucopia’s association with Thanksgiving was solid enough that people started thinking outside the box. In 1930, for example, the Chronicle Tribune of Marion, Indiana, printed a recipe for “cornucopia sandwiches” that required rolling “fan-shaped pieces” of de-crusted bread and smoked sturgeon into cones and tucking “small sprigs of parsley” into each opening. “Other fillings such as creamed mushrooms, olive and cheese, creamed chicken or lobster, in fact any tasteful moist filling makes the Cornucopia sandwich a delight,” the recipe’s creator wrote. Another widely published recipe from 1940 involved rolling pastry dough into cones, baking them, and then filling the diminutive horns with candied cranberries. 

 

The mid-20th century saw gargantuan cornucopias stuffed with gargantuan fake fruit (“The bananas are six feet long and the apples are two and a half feet in diameter!”); smaller edible ones formed from “chilled leaves of lettuce”; and all manner of the centerpiece variety, from wicker to gold ceramic. The Macy’s Thanksgiving Day Parade has done a lot to keep the trend of massive cornucopias alive in the modern era; and directions for making cornucopia appetizers from, say, ice cream cones and candy are now just a quick internet search away.

 

 

 

In short, Americans have been trying to outdo themselves in the cornucopia department for decades—but it’s hard to compete with gods.

 

 

Source: Horns Aplenty: The Ancient Greek Origins of the Cornucopia


Fact of the Day - CATS COVERING THEIR FACES


Did you know... The explanations range from trying to keep warm to protecting their whiskers.

 

Cats are pros at relaxing. They spend almost two-thirds of their lives asleep. If you’ve ever noticed your feline friend covering their face as they nap, you may have wondered why. It turns out the action makes it easier for them to catch some Z’s.

 

Though domestic pets usually don’t have to stress about killing prey, house cats are hardwired to conserve energy for hunting. According to experts, covering their faces helps them fall asleep in several ways.

 

Many pet experts believe cats cover their eyes to block out sunlight. Felines usually seek sunny spots for naps, but the glare can irritate their sensitive eyes and prevent them from getting sleep. By covering their faces, they create a dark environment for themselves, making it easier to doze off.

 

Cats may also try to stay warm and retain body heat by curling up tight. Certified cat behaviorist Stephen Quandt explained to Woman’s World, “Curling up into a ball, tail wrapping, and face covering all help them conserve body temperature.” 

 

Pet food brand Canidae explains that cats often cover their faces while sleeping for security. They know their faces are vulnerable, so covering them is a way to feel safe. Their whiskers are especially sensitive, helping felines sense parts of their environment they may not see or feel otherwise. Therefore, felines might shield their faces from things that can touch the follicles and cause sensory overload. If there is no good place to bury their faces, they may use their paws instead. It’s also possible that cats like sleeping this way simply because it’s comfortable, similar to a person scrunching into the fetal position in bed.

 

The last—and possibly funniest—reason is that cats may fall asleep in this position while grooming. Sometimes, they’ll lick their paws and wipe them across their faces to clean themselves. Doing this may tucker them out and cause them to nod off on the job.

 

 

Source: Why Do Cats Cover Their Faces While Sleeping?


Fact of the Day - "EVIL" PRINCE ALBUM


Did you know... It wasn’t just the cover of the Purple One’s shelved 1987 LP ‘The Black Album’ where things got dark. From the contractual dispute in which his name changed to an unpronounceable symbol to the studio effort he gave away for free via a conservative British newspaper, musical genius Prince made a number of career moves that sent the record industry (and no doubt his team of accountants) into the depths of despair. But even by the Purple One’s disruptive standards, withdrawing a new album just days before its release, as he did in 1987, appeared to be a truly baffling act of self-sabotage—even more so when you learn exactly why it didn’t hit shelves. 

 

A Return To His Roots
Prince’s 1987 untitled 10th LP—known as The Black Album due to its monochromatic front cover and lack of both title and artist name—was intended to prove that he hadn’t entirely abandoned his cultural roots: After blockbuster albums such as Purple Rain and Sign o’ the Times, the superstar was accused by some of deliberately courting a white pop audience. 

 

That’s why the majority of The Black Album’s eight tracks are firmly grounded in the sounds of distinctly Black music, whether it’s the James Brown-esque jazz-funk instrumental “2 N*** United 4 West Compton,” the missing link between Mary Poppins and Funkadelic that is “Superfunkycalifragisexy,” or the groove-laden “Cindy C,” a lascivious come-on to the supermodel Ms. Crawford only someone as sexually magnetic as Prince could get away with.

 

Accompanied by the likes of backing vocalist Sheila E (whose birthday party inspired three tracks) and saxophonist Eric Leeds (who provides the only other songwriting contribution on album closer “Rockhard in a Funky Place”), Minneapolis’s finest still found the time to get a little weird, too. “Dead On It” is a bizarre hip-hop parody that takes aim at rappers’ inability to hold a tune; recorded at a time when Eric B & Rakim and Public Enemy were revolutionizing the music world, this is, ironically, Prince at his most off-key.

 

 

 

Then there’s “Bob George” in which he assumes the identity of a gun-wielding domestic abuser who suspects his girlfriend of having an affair with the titular character (whose name apparently came from former manager Bob Cavallo and one of Prince’s fiercest critics, Nelson George). “That skinny motherfucker with the high voice,” sings an unrecognizable pitch-shifted Prince in a meta self-own amid squalling guitars, robotic beats, and gunfire shots. 

 

Hitting the Self-Destruct Button
While always intriguing, The Black Album didn’t quite hit the same heights as Prince’s earlier ‘80s classics. Esteemed critic Robert Christgau, who managed to source a copy from an industry insider, said that “at its weirdest” it was “an unpleasant impersonation of a dumbfuck B-boy that’s no lost masterpiece.” But it wasn’t quality control that prompted its maker to hit the self-destruct button shortly before its scheduled December 8, 1987 release. 

 

Indeed, it was a spiritual epiphany experienced during a nightmarish trip on MDMA that caused Prince to reevaluate what he’d just committed to record—and conclude it was a work of pure evil (which he would later say had been channeled through a devilish entity named Spooky Electric). The star demanded that his label, Warner Bros., withdraw each and every single copy in existence. 

 

“I was very angry a lot of the time back then,” Prince explained to Rolling Stone three years later. “And that was reflected in that album. I suddenly realized that we can die at any moment, and we’d be judged by the last thing we left behind. I didn’t want that angry, bitter thing to be the last thing. I learned from that album, but I don’t want to go back.”

 

 

 

Of course, at the time, half a million copies of The Black Album were on the verge of being shipped from loading docks to retail outlets. But proving just how seriously Prince had invested in his demonic theory, he agreed to take the financial hit himself, using his own royalties to pay for their destruction. Luckily, what sounded like a gargantuan task on paper proved to be relatively straightforward.

 

“It was a top security release,” ex-Warner Bros. executive vice president Jeff Gold later told the BBC. “There was no single, there was no video, there was no announcement. Nobody knew it was coming. So because of that, there was a lot of security around it in the pressing plants. So when Prince decided it could not come out, it was relatively easy for the people at Warner Bros. to say, ‘Alright, destroy every one.’”

 

Inevitably, the LP did find its way onto the bootleg market. But just in case the message still hadn’t gotten across, the music video for “Alphabet Street,” the lead single from Prince’s less malevolent 1988 follow-up Lovesexy, contained the warning, “Don’t buy The Black Album, I’m sorry.” Yet Prince’s attitude toward the record did appear to soften over time. 

 

 

 

He performed “Bob George” (where he’d pretend to get shot), “Superfunkycalifragisexy,” and “When 2 R in Love” on the 1988–’89 Lovesexy tour, with the latter ballad also getting a new life on the Lovesexy album. And then he reluctantly agreed to an official limited release edition on November 22, 1994; Warner also offered free copies to the first 1000 people who mailed their counterfeit versions back in return. Despite its “lost classic” status, however, The Black Album only peaked at a lowly No. 47 on the Billboard 200.

 

Like much of Prince’s material, the LP—which was also called “The Funk Bible”—remained unavailable to stream in the early Spotify age (the majority, however, is now available). It did eventually show up, though, on Tidal in 2016 to commemorate what would have been the star’s 58th birthday; just two months earlier, he’d tragically died from an opioid overdose.

 

A Collector’s Item
Prince’s untimely passing was inevitably capitalized on by those who’d been lucky enough to get their hands on his “evil” album before he yanked the project. Soon after, a promotional version of The Black Album fetched $15,000 online. A year later, one of five original copies discovered by chance in two Warner Bros. mailers went for nearly three times that amount at auction. And then in 2018, another first edition, also on vinyl, stored away by a crafty pressing plant employee, broke the Discogs all-time record when it was snapped up for a cool $27,500.

 

Remarkably, The Black Album wasn’t the first Prince LP to get such treatment. For reasons unknown, the Purple One also canceled 1986’s Camille, a funk LP recorded under the guise of the titular feminine alter-ego, weeks before its scheduled release. And although the majority of its eight tracks were quickly repurposed elsewhere, including “Rockhard in a Funky Place,” the original remains in-demand: In 2016, a rare pressing sold for $58,787.

 

Yet thanks to its mysterious backstory, The Black Album remains the ultimate enigma in Prince’s vast back catalog. “He loved that album, but it seemed dark to him,” backing vocalist Cat Glover later told Wax Poetics about the moment he decided to leave it in the vault. “Something hit him that night that made him change—an enlightenment, a higher power.” Whether the record originated from evil or not, it undoubtedly provided one of the most colorful chapters in Prince’s deep mythology. 

 

 

Source: The “Evil” Album Prince Withdrew Days Before Its Release


Fact of the Day - COMFORT FOOD


Did you know.... In times of stress, sadness, or solitude — or when we simply need a pick-me-up — comfort foods have long been a source of solace. These indulgent, often calorie-rich dishes seem to possess an almost magical ability to soothe our emotions and lift our spirits. But what exactly is the science behind this phenomenon? Why do certain foods have such a profound effect on our mood, and why can they give us such a mental boost?

 

The answer lies within a complex interplay of biological, psychological, and cultural factors. As it turns out, our brains and bodies respond to comfort foods in ways that extend beyond the basic concepts of nutrition. Here, we delve into the fascinating science behind comfort foods, from the neurochemical responses they elicit to the psychological mechanisms at play. 

 

 

We Have Neurotransmitters to Thank

Comfort foods can have a powerful impact on our brain chemistry, triggering the release of neurotransmitters — chemical messengers that allow neurons to communicate with each other throughout the body — that can affect our mood and emotions. Arguably the most significant neurotransmitter associated with the consumption of comfort foods is dopamine, which is released by our brain’s hypothalamus. Dopamine gives us feelings of pleasure, satisfaction, and motivation. In other words, it floods us with good feelings after doing something we enjoy, thus reinforcing our desire to engage in these behaviors.

 

Many comfort foods, particularly those high in carbohydrates, can also increase the production of serotonin, another neurotransmitter associated with feelings of happiness and well-being. This is why we often crave sugary or starchy foods when we’re feeling down, as they can help our bodies make serotonin, thereby decreasing the stress hormone cortisol and making us feel calmer.

 

A Shortcut to Childhood

Psychology goes hand in hand with neurochemistry when it comes to understanding the wondrous effects of comfort foods. Foods can easily trigger emotional associations — both unconsciously and consciously — to past joyful and pleasant experiences. Many comfort foods are linked to positive memories from childhood, which explains why we often crave the foods we ate as kids, whether it’s mac and cheese, chicken pot pie, or chocolate cake. Consuming these foods can evoke nostalgia and its associated positive emotions, which in turn releases those feel-good neurotransmitters.

 

Comfort Foods Are Social and Cultural

The psychology behind comfort foods is heavily related to our cultural influences. Not only do comfort foods elicit a strong emotional connection to our family and upbringing, but sharing these comfort foods with others can also strengthen social bonds and foster a sense of belonging, which can have a positive impact on our well-being. A 2015 study found that comfort foods were associated with our close relationships, reminding us of our social ties and therefore helping us feel less lonely. 

 

Culture, of course, is a wonderfully diverse and colorful concept. Different cultures have their own traditional comfort foods, which often reflect beloved local ingredients and cooking methods — from chicken congee in China to moussaka in Greece. These foods are often closely tied to cultural identity and shared experiences, again providing us with a sense of nostalgia and belonging. Because of this, what any one person considers comfort food is highly individual.

 

For example, memoirist and civil rights activist Maya Angelou said, “The best comfort food will always be greens, cornbread, and fried chicken.” Austrian celebrity chef Wolfgang Puck, meanwhile, prefers wiener schnitzel and mashed potatoes, “because it reminds [him] of [his] youth.” Wherever we come from and whatever our cultural background may be, it’s good to know we can always turn to our particular comfort foods when we have need of them. 

 

 

Source: What Makes Something a “Comfort Food”?


Fact of the Day - CADUCEUS


Did you know... Whoever started putting the ancient Greek icon on American public health officers’ uniforms had it confused with another snake-and-staff symbol.

 

The caduceus is the “Born in the U.S.A.” of Greek iconography: Everyone seems to think it stands for something it doesn’t. 

 

This ancient symbol, also called the Staff of Hermes, depicts two snakes intertwining around a stick that is capped by wings. You’ve likely seen it on the logos of the U.S. Army Medical Department [PDF], the Office of the Surgeon General and Commissioned Corps of the U.S. Public Health Service, and some medical professional associations and health-related businesses. But for almost all its roughly 5000-year history, the caduceus had nothing to do with medicine.

 

In Greek mythology, Hermes was the messenger of the gods, gliding between the worlds of humanity and immortality on his winged sandals. He was also the class clown of the 12 Olympic gods, the frenemy of his half-brother Apollo, and the guide who ushered souls into the afterlife. Among his inventions were fire, the alphabet, and (perhaps less impressively) dice. Hermes was also the god of trade, wealth, language, thieves, and travelers, and in his spare time, he was one hell of a lyre player.

 

In many visual depictions, Hermes carries the caduceus staff. In some myths, it was a gift from Apollo, exchanged for a lyre made from a turtle shell. In others, it commemorates an incident in which Hermes stopped two serpents from fighting with the touch of his staff, a symbol of the messenger god’s power as a peacemaker. The actual design may just be a carryover from the iconography of serpent worship that dates back to the dawn of civilization.

 

In the centuries following the heyday of ancient Greece, the caduceus endured as a symbol of travel and commerce, two subjects under Hermes’s godly domain. The icon was worked into emblems of the customs departments of China, Bulgaria, Russia, and other countries.

 

In the U.S., the caduceus took on a brand-new meaning. Its misuse as a medical symbol seems to date to 1902, when it was added to the uniform of the United States Medical Corps. Someone in the corps may have confused the caduceus with the Rod of Asclepius, the signature walking stick of the actual Greek god of medicine, which took the form of a wingless staff with a single snake wrapped around it. While the caduceus made its way to other American military and government offices related to medicine, the Rod of Asclepius can be found on the flag of the World Health Organization and the blue Star of Life that has become a cross-cultural symbol of emergency medical providers, plus countless emblems of national medical agencies.

 

But for the U.S., one of the world’s most flamboyant and image-conscious countries, it surely makes sense to use the cooler-looking symbol. Its ancient origin as an icon of commerce seems fitting for the nation with the most expensive healthcare on Earth. 

 

 

Source: The Caduceus: The Mistaken Meaning of the Medical Symbol


Fact of the Day - TURKEYS AWAY

 

Did you know.... “As God is my witness, I thought turkeys could fly!“

 

When WKRP in Cincinnati aired its seventh episode more than 45 years ago on October 30, 1978, no one—including creator Hugh Wilson, who passed away in January 2018—had any idea the freshman series was about to become part of television holiday special history. And they didn’t even have to use any actual turkeys to do it.

 

“Turkeys Away,” which was credited to the late writer Bill Dial, was a Thanksgiving-themed entry for the sitcom about an Ohio-based radio station and its eccentric staff. For a holiday tie-in, Wilson decided to use an anecdote he had heard from Atlanta radio executive Jerry Blum: that another station had once arranged a publicity stunt in which a number of turkeys were thrown out of either a helicopter or a truck—the exact details are lost to time—and proceeded to horrify the gathered crowd with an unintended turkey massacre.

 

Wilson thought this would be a fine premise for a show. As he explained to the Classic TV History blog in 2012, the incident morphed into a plot in which station manager Arthur Carlson (Gordon Jump) arranged for an equally misguided stunt, where broadcaster Les Nessman (Richard Sanders) narrates from the street in a style reminiscent of the Hindenburg disaster.

 

Nessman’s growing horror as the birds fall “like sacks of wet cement” to the pavement below was inspired by watching footage of the accident prior to shooting. (In 1997, Sanders was present for a homage to the episode, when WKRQ in Indiana humanely dropped toy turkeys from a chopper that could be redeemed for the real thing.)

 

You’ll have to watch the complete episode to fully appreciate the payoff—including one of the most often-quoted closing lines in sitcoms—but know that no turkeys were actually harmed.

 

 

Source: How ‘WKRP in Cincinnati’ Made Holiday TV History With “Turkeys Away”


Fact of the Day - TIME AND AGE


Did you know... When we’re younger, time tends to feel as if it drags on forever. Think of those long, lazy summers that seemed never-ending, or how it could feel like an eternity watching the clock tick away and waiting for the final school bell to finally ring. But as we grow older, many of us feel like time is moving more quickly. This curious phenomenon has nothing to do with any change in the measurement of time, of course; a minute today is the same length it was 50 years ago. According to some scientific theories, this sensation actually has to do with how our brains process the experiences around us, which changes as we age, leading to a feeling of increasing rapidity. Let’s take a look at some potential explanations for this odd yet seemingly universal experience.

 

Experiments Have Revealed How We Perceive Time Differently

In the 1960s, psychologist Robert Ornstein conducted a series of experiments leading up to the publication of his 1969 work On the Experience of Time. Two tests were particularly notable: In the first experiment, Ornstein showed subjects two diagrams — one with a complicated design and another featuring a comparatively simple pattern. Subjects were presented with each image for an identical period of time, but when asked which one had appeared for longer, test subjects chose the more complex diagram.

 

Ornstein also conducted a second experiment with audio files featuring clicking sounds and basic household noises. Some of the recordings were more intricate, containing more clicks produced at a quicker frequency. When Ornstein asked his subjects to tell him which audio file was longer, they chose the more complex one with the greater number of sounds.

 

Ornstein concluded that across the board, people’s perception of time appeared to slow down when they were presented with greater amounts of new and complex information. He posited that our brains require extra time to process unfamiliar experiences, resulting in a feeling of time essentially moving in slow motion.

 

Childhood vs. Adulthood

So what do Ornstein’s experiments have to do with time slowing down as we age? Well, when we’re young, our days are filled with first-time experiences rife with complex and often novel information that our brains work hard to process. There are countless new lessons to learn, new locations to explore, and new sensations to feel. In the context of Ornstein’s experiments, these are akin to seeing the more complex diagrams or hearing those more detailed audio files. 

 

When we’re younger, it takes time for our brains to take in and process all the sights and sounds we’ve never experienced before. This overwhelming flood of knowledge may contribute to the sensation of time moving more slowly. As we grow older, however, we often find ourselves falling into familiar routines. Days, weeks, or even months can pass in which our lives remain largely unchanged. Our brains aren’t working as hard on a daily basis to process and analyze new experiences, so time can feel like it’s moving faster.

 

This is all subjective, of course, as some older people may actively seek out stimulating activities that keep the brain active and therefore help “slow things down.” But generally speaking, time tends to blend together more and more as we age, when it isn’t broken up by fresh, original experiences as frequently as in our younger years.

 

How Our Brains Change as We Age

Even if you make a concerted effort to seek out challenging new experiences as you age, your perception of time will likely still be affected by the inevitable changes to your body and brain. Older folks often perceive time as passing more rapidly due to physical changes in their neural receptors, which become larger and more complex as we mature. These changes mean it takes longer for signals to traverse the nervous system and reach the brain, so our bodies are unable to process details as efficiently as in our younger days.

 

As Psychology Today explains, younger people are physically capable of processing more mental images than their older counterparts, whose brains function less efficiently as they age. Let’s say there’s a 15-year-old and a 65-year-old who are both witness to the same experience over the course of one hour. When looking back, the teenager will likely recount more vivid memories than the adult, given the capabilities of their brain. The teen may feel as if they’re recounting those events in slow motion, considering all of the details they can recall.

 

The adult, on the other hand, may remember fewer details, and so it may seem like that time flew by. In other words, younger people recall past memories as if they were watching slow-motion footage, which allows them to look at every minor detail. But for older folks, certain details may be missed entirely as the footage flies by.

 

Another possible explanation for time appearing to speed up is a concept called proportional theory. Essentially, this theory suggests that the way we perceive time is related to the amount of time we’ve already experienced. Younger people have been alive for less time, so each new experience seems more substantial in comparison.
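Proportional theory lends itself to a quick numerical illustration. The sketch below is a toy model only — the `perceived_year_fraction` helper is a name I’ve invented for illustration, not a validated psychological formula:

```python
# Toy illustration of proportional theory: each new year is weighed
# against the total time already lived, so one more year represents
# a smaller and smaller share of life as we age.

def perceived_year_fraction(age: int) -> float:
    """Return the share of a person's lived experience that one more year adds."""
    if age < 1:
        raise ValueError("age must be at least 1")
    return 1 / age

for age in (5, 15, 40, 80):
    print(f"At age {age}, one year is {perceived_year_fraction(age):.1%} of life so far")
```

On this (admittedly crude) model, a year at age 5 is a fifth of everything you’ve ever known, while a year at 80 is barely one percent — which matches the intuition the theory is trying to capture.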

 

No Singular Answer

There’s still a great deal of uncertainty as to exactly how the human brain functions, and, in turn, why time seems to speed up as we get older. But researchers who have studied this topic generally agree that new and unfamiliar situations can make time seem to slow down. If you’ve begun to feel like the minutes are flying by, try exposing yourself to something new, such as a trip to somewhere you’ve never visited, learning a new language, or even stopping into local restaurants where you’ve never eaten. These may help you “slow down” and allow you to soak up each moment like you did when you were younger, when the world was a less familiar place.

 

Source: Why Life Feels Faster the Older You Get


Fact of the Day - PHONY MEANS FAKE. WHY?


Did you know... Conmen are the reason the word exists.

 

From disingenuous people to counterfeit goods, most of us have dealt with fakes at some point. A common term for something that’s not real is the word phony. But where did the word come from, anyway? Perhaps unsurprisingly, the credit goes to con artists.

 

Merriam-Webster reports that the word entered the English language through an old British scam. It involved coating a brass ring with a thin layer of gold to make the jewelry appear more valuable than it was. The scammer would intentionally drop the item and pick it up when another person noticed it. They would then suggest to their mark that the ring’s value be split between them. Once the stranger was convinced of the ring’s supposed value, the conman then offered to give it to them in exchange for money—an amount too high for brass.

 

The jewelry piece was called a fawney, borrowed from the Irish word fáinne, which translates to “ring.” The earliest known use of fawney likely dates back to the late 1700s. As a noun, fawney referred to the thief that performed the scam. The phrase to go on the fawney also referred to the ring-dropping trick. The ring’s key role in the ploy likely birthed the word phony.

 

According to the Oxford English Dictionary, the word phony first appeared in print in 1893. At the time, it was used as an adjective to describe shady horse racing bookmakers. By the early 1900s, people were using the word as a general noun for fake things and insincere people. Phony grew increasingly popular from the early 1900s to the 1960s, with The Catcher in the Rye solidifying its place in the lexicon in 1951. Usage of the word has declined since then, but it hasn’t vanished from the English vocabulary. When you encounter something—or someone—fake out in the world, sometimes phony is still the right word for the job.

 

 

Source: Why Does ‘Phony’ Mean Fake?


Fact of the Day - "BEYOND THE PALE"


Did you know... To get to the answer, we need to discuss Latin, wooden stakes, Catherine the Great, and, of course, Shakespeare. 

 

Towards the end of Jane Eyre, our eponymous heroine—having rebuffed her cousin St. John’s marriage proposal—lets us know that a touch of frost has settled on their relations as a result, and “he contrived to impress me momently with the conviction that I was put beyond the pale of his favour.”

 

Much like St. John, who eventually made a long journey to India without Jane, beyond the pale—which means “unacceptable”—has made its own long journey: not to India, but to idiom. It’s a trek involving wooden stakes, Catherine the Great, and, of course, Shakespeare. 

 

The Etymology of Pale
To chart the journey, we must first dust off our Latin dictionaries. Pale ultimately derives from the Latin word pālus, meaning “wooden stake.” (It also gives us the extant verb impale, and the noun palisade, or a fence made from pales.) The Oxford English Dictionary’s (OED) first citation for pale in this sense dates back to the Wycliffite Bible—a 1382 translation from the Latin Vulgate into English—in which a passage from Ecclesiasticus describes someone camping near a house, “and in the walles of it pitcheth a pale.”

 

Within just a few years, pale left its post as a noun for a wooden stake and began to be used to describe a fence made of wooden stakes; around the 1450s, it came to mean “A district or territory within determined bounds,” per the OED.

 

Historical Pales
History is rife with these large boundaries. Following the Battle of Crécy in 1346, English monarchs ruled an area of northern France known as The Pale of Calais. For more than 200 years, this area became an important economic center for England—as well as a strategic outpost on the continent. In 1558, France unexpectedly reclaimed the area following a siege. The retaking of the Pale precipitated a terrible year for the ruling English Queen Mary I (a.k.a. Bloody Mary), who died that November.

 


 

In 1791, Catherine the Great’s Pale of Settlement kettled Jewish people into a designated living area within a corner of the Russian Empire, where they lived in impoverished conditions and were heavily restricted in how they could make a living. The Pale of Settlement lasted right up until World War I, when many parts of the area became an active warzone and Jews were forced to flee from the invading German army into interior Russia. The Pale was officially abolished after the February Revolution in 1917.

 

One of the most famous of these boundaries was the one in Ireland known simply as The Pale; it centered around Dublin and stretched from Bray in Wicklow to Dundalk in County Louth. In 1470—a century before the Tudor Conquest of Ireland got going in earnest—the Pale represented the last part of Ireland under English control. Inside its fortified ditches and ramparts, the Pale’s occupants lived separately from the Irish natives.

 

Pale Goes Figurative
Having first referred to an actual stake, and then gaining another sense as a word for a large boundary, pale still wasn’t done evolving: The word came to be used to delineate figurative rather than actual boundaries—or, as the OED puts it, “A realm or sphere of activity, influence, knowledge, etc.; a domain, a field.”

 

The first figurative use of pale comes from a 1483 translation of J. de Voragine’s Golden Legende. Rendering the Latin into English, W. Caxton uses pale to describe permissible conduct in the monasteries: “monkes wyth hym went for to dwelle in deserte / for to kepe more straytelye the professyon of theyr pale.”

 


 

Shakespeare also uses the figurative sense of pale in The Winter’s Tale when he writes about “the red blood raigns in ye winters pale.”

 

The earliest appearance of the phrase beyond the pale identified so far is a 1612 commentary on the Epistle to Titus. As explained by WordOrigins.org, the writer “connects Paul’s admonition in Titus 2:3 that women should not gossip and slander with his commanding women be silent in church from 1 Corinthians 14:33,” saying (in modern spelling), “And thus the Apostle by this precept backeth the former, the due observance of which would cut off much false accusing in such meetings; and in the neglect of it, it is impossible but that the tongue will be walking without his own hedge, and wandering beyond the pale of it.” Pale is clearly being used both metaphorically and literally as an enclosed area. But what enclosed area?

 

The Origins of Beyond the Pale
According to various sources, the origin tale of beyond the pale is an open and shut case. In Ireland, English colonizers barricaded themselves from what they believed to be an uncivilized native population. Lawlessness lay outside the pale; therefore, anything beyond that was unacceptable.

 

It’s a neat story, but almost certainly apocryphal. According to the OED, “the theory that the origin of the phrase relates to any of several specific regions, such as the area of Ireland formerly called the Pale … is not supported by the early historical evidence and is likely to be a later rationalization.” While there’s no definite answer to where the phrase actually comes from, the best guess is that it’s a reference to a generic bounded area (or even a metaphorical bounded area) like the hedge in 1612 rather than any particular historical Pale.

 

The OED’s earliest example of beyond the pale meaning “outside the limits of acceptable behaviour; unacceptable or improper” can be found in the daintily titled book The Compleat History of the Lives, Robberies, Piracies, and Murders Committed by the Most Notorious Rogues, published in 1720. In it, we learn about Acteon’s roving eye being “beyond the Pale of Expedience.” From there, the phrase grew in popularity, appearing in works like The Pickwick Papers (in which Charles Dickens has Mr. Pott say that Mr. Slurk is “a man who has placed himself beyond the pale of society”), Jane Eyre, and others.

 

When we hear the word pale today, we probably think of complexion first, maybe a bucket (pail) second, and fence later (if at all). But as long as there are people around to transgress, there will always be a handy way to tell them their behavior is unacceptable. Lose this from our language, however, and that truly will be beyond the pale.

 

 

Source: Where Does the Phrase ‘Beyond the Pale’ Come From?


Fact of the Day - FLIP A COIN


Did you know... Coin flipping is a time-honored tradition for making decisions. Long before the NFL used the method to determine opening kickoffs, Romans employed coin tossing to settle personal disputes (though they called it “heads or ships,” a reference to the Roman coin’s two-faced Janus on one side and the prow of a ship on the other). While the mechanics of coin flipping are simple enough — guess a side and flip — the physics of how a coin flips are anything but. By exploring this complicated motion, scientists have discovered that coin flips are not as random (and thus impartial) as most of us think.

 

A 2023 study from the University of Amsterdam flipped 350,757 coins across 46 different currencies and discovered that a coin landed on its starting side 50.8% of the time — close to 50/50, but not quite. In other words, if a coin started heads up, there was a slightly greater chance it would land heads up, too. This supports a theoretical prediction from 2004, which argued that coin tosses land as they started about 51% of the time. This small difference likely won’t dissuade humans from the coin flip tradition, however. A more serious concern comes from a 2009 study, which revealed that coin tosses can be easily manipulated with just a few minutes of practice. So if you’re relying on the “randomness” of a coin toss to determine important decisions, make sure you trust the person doing the flipping.
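A 0.8% bias is easiest to appreciate in simulation. Below is a minimal sketch that simply assumes a coin landing on its starting side with probability 0.508, the study’s estimate; the `flip` helper is my own illustrative code, not anything from the study itself:

```python
import random

SAME_SIDE_PROB = 0.508  # same-side landing rate estimated by the 2023 study

def flip(start: str, rng: random.Random) -> str:
    """Flip a coin that begins face-up on `start` ('heads' or 'tails')."""
    other = "tails" if start == "heads" else "heads"
    return start if rng.random() < SAME_SIDE_PROB else other

rng = random.Random(42)  # fixed seed so the run is reproducible
trials = 100_000
same = sum(flip("heads", rng) == "heads" for _ in range(trials))
print(f"Landed on starting side in {same / trials:.1%} of {trials:,} flips")
```

Over a handful of flips the bias is invisible, which is why it took hundreds of thousands of recorded tosses to measure it; only at large sample sizes does the same-side rate settle near 50.8% rather than 50%.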

 

The U.S. was one of the first countries to have a decimal currency.
The United States has been pretty slow on the metric uptake, but when it comes to rationalizing currency, it’s actually one of the leaders. Although the first (incomplete) example of decimalization occurred in Czarist Russia around 1704, the U.S. decimalized its currency with the Coinage Act of 1792, which established that 100 pennies make a dollar. This was a huge improvement, especially for the nonmathematically inclined, over the British system, wherein 1 pound equals 20 shillings, 1 shilling equals 12 pence, and 1 penny equals 4 farthings. However, this decimal system only pertained to coins at the time. Paper money didn’t enter circulation until 1861, when an embattled Union government, desperate for money during the Civil War, produced the first banknotes — known as “greenbacks.”
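The arithmetic burden of the old British system is easy to demonstrate. A minimal sketch (the constants and the `to_farthings` helper are illustrative names of my own):

```python
# Pre-decimal British money: 1 pound = 20 shillings,
# 1 shilling = 12 pence, 1 penny = 4 farthings.
FARTHINGS_PER_PENNY = 4
PENCE_PER_SHILLING = 12
SHILLINGS_PER_POUND = 20

def to_farthings(pounds: int, shillings: int = 0, pence: int = 0) -> int:
    """Express a pre-decimal amount as a single count of farthings."""
    total_pence = (pounds * SHILLINGS_PER_POUND + shillings) * PENCE_PER_SHILLING + pence
    return total_pence * FARTHINGS_PER_PENNY

# One pound breaks into 20 * 12 * 4 = 960 farthings,
# versus a flat 100 cents to the dollar.
print(to_farthings(1))        # 960
print(to_farthings(0, 2, 6))  # "two and six" -> 120 farthings
```

Three different conversion factors (20, 12, 4) versus a single factor of 100 is exactly why the decimal system was such a relief for the nonmathematically inclined.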

 

 

Source: Coin flips are not actually random.


Fact of the Day - DETROIT LIONS AND DALLAS COWBOYS


Did you know...  Just like turkey and cranberry sauce, the Detroit Lions and Dallas Cowboys playing football on Thanksgiving Day is a holiday tradition.

 

Every year since 1934, the Detroit Lions have taken the field for a Thanksgiving game, no matter how bad their record has been. It all goes back to when the Lions were still a fairly young franchise. The team was founded in 1929 in Portsmouth, Ohio, as the Spartans. Portsmouth, while surely a lovely town, wasn’t quite big enough to support a pro team in the young NFL. So Detroit radio station owner George A. Richards bought the Spartans and moved the team to Detroit in 1934.

 

Although Richards’s new squad was a solid team, they were playing second fiddle in Detroit to the Hank Greenberg-led Tigers, who had gone 101-53 to win baseball’s 1934 American League pennant. In the early weeks of the 1934 season, the biggest crowd the Lions could draw for a game was a relatively paltry 15,000. Desperate for a marketing trick to get Detroit excited about its fledgling football franchise, Richards hit on the idea of playing a game on Thanksgiving. Since Richards’s WJR was one of the bigger radio stations in the country, he had considerable clout with his network and convinced NBC to broadcast a Thanksgiving game on 94 stations nationwide.

 

The move worked brilliantly. The undefeated Chicago Bears rolled into town as defending NFL champions, and since the Lions had only one loss, the winner of the first Thanksgiving game would take the NFL’s Western Division. The Lions not only sold out their 26,000-seat stadium, they also had to turn fans away at the gate. Even though the juggernaut Bears won that game, the tradition took hold, and the Lions have been playing on Thanksgiving ever since.

 

How ’bout them Cowboys?
The Cowboys, too, jumped at the opportunity to play on Thanksgiving as a little extra bump for their popularity. When the chance to take the field on Thanksgiving arose in 1966, though, it wasn’t obvious the game would pay off. Sure, the Lions had filled their stadium for their Thanksgiving games, but that was no assurance that Texans would warm to holiday football so quickly.

 

Tex Schramm, then the general manager of the Cowboys, was something of a marketing genius, though; among his other achievements was the creation of the Dallas Cowboys Cheerleaders.

 

Schramm saw the Thanksgiving Day game as a great way to get the team some national publicity, even as they struggled under young head coach Tom Landry. Schramm signed the Cowboys up for the game even though the NFL was worried that the fans might just not show up—the league guaranteed the team a certain gate revenue in case nobody bought tickets. But the fans showed up in droves, and the team broke its attendance record as 80,259 fans crammed into the Cotton Bowl. The Cowboys beat the Cleveland Browns 26-14 that day, and a second Thanksgiving pigskin tradition caught hold. Since 1966, the Cowboys have missed having Thanksgiving games only twice.

 

Who is playing on Thanksgiving this year?
Home Team: Detroit Lions

Away Team: Chicago Bears

Kickoff Time: 12:30 p.m. ET

Broadcasting Network: Fox

Halftime Performer: Shaboozey

 

Home Team: Dallas Cowboys

Away Team: New York Giants

Kickoff Time: 4:30 p.m. ET

Broadcasting Network: CBS

Halftime Performer: Lainey Wilson

 

Home Team: Green Bay Packers

Away Team: Miami Dolphins

Kickoff Time: 8:20 p.m. ET

Broadcasting Network: NBC

Halftime Performer: Lindsey Stirling 

 

For 2024, the Detroit Lions will host the Chicago Bears and the New York Giants will travel to Dallas to play the Cowboys.

 

In 2006, because six-plus hours of holiday football was not sufficient, the NFL added a third game to the Thanksgiving lineup. This game is not assigned to a specific franchise. For 2024, the Green Bay Packers will welcome the Miami Dolphins.

 

 

Source: Why Do the Lions and Cowboys Always Play on Thanksgiving?


Fact of the Day - CASINOS AND WINDOWS


Did you know... You may assume it’s to hide the passage of time from patrons, but the truth is less straightforward.

 

It is rare to see a ray of sunlight in a modern-day casino. The vast caverns full of blinking lights and second-hand smoke infamously lack windows to the outside world. Many think this is intentional, a psychological trick played by the house, which, as the saying goes, always wins.

 

“You should forget when you entered the casino and how long you have been playing there,” one of the self-appointed experts on Quora explains. “For this reason, casino designers removed windows from the building structure to prevent players from seeing outside. Because the light would be an alarm for you spending the whole night at the casino!”

 

This theory has been repeated on Medium, YouTube, and other forums where citations are optional.

 

The truth is more complicated. Casino operators dismiss the idea that they conspire to disrupt their patrons’ sense of time. (In an article for Time Out Chicago, the manager of the Horseshoe Casino in Hammond, Indiana, pointed out that nearly everyone carries around a phone that doubles as a clock.) But thought leaders in casino design have been open about how they keep gamblers engaged, and some of their approaches avoid the use of natural lighting. 

 

Bill Friedman, former manager of Castaways and the Silver Slipper in Las Vegas, wrote what might be the definitive book on the subject, Designing Casinos to Dominate the Competition. Released in 2000, the book is organized into 13 principles (like “Gambling Equipment Immediately Inside Casino Entrances Beats Vacant Raised Entrance Landings and Empty Lobbies” and “Short Lines of Sight Beat Extensive Visible Depth”). He exhaustively covers every aspect of casino interior design in over 600 pages.

 

Friedman promotes a maze-like layout that keeps visitors curious about what’s beyond the next corner while also giving them small cubbies in which to be entranced by specific games. The overall intent is to keep the gambler focused on the machines rather than the larger environment. As neuroscientist Colin Ellard put it, “In this view, attention paid to the walls, floors, or ceilings of a casino represents wasted potential profit.”

 

So bright open windows would conflict with Friedman’s general method, but the book doesn’t explicitly endorse blocking them to tamper with the gamblers’ internal clock. Friedman himself has said it’s a “myth” that “Nevada’s casino operators removed clocks and sunlight glass doors and windows, to manipulate players to gamble longer.” Instead, he said, “it was the serious players who demanded they remove them” because they willfully “use a fantasy to escape reality while they gamble.” In addition to distracting gamblers from what’s happening inside the casino, natural light can create an unpleasant glare on playing cards and slot machine screens.

 

Not every casino employs Friedman’s maze concept, though. The competing idea is the “playground” layout pioneered by Roger Thomas, who designed Wynn Resorts. Starting in the late ’90s, Thomas designed casino floors as open spaces with comfortable furniture, artwork on the walls, and orchids in vases on tables. Many were flooded with natural light. This design is also not without an element of manipulation. “They make you feel comfortable, of course, but they also constantly remind you to have fun,” Thomas told the New Yorker. When people feel relaxed, they gamble more. So the cave atmosphere is not even a prerequisite in Vegas.

 

Even harsh critics of casinos don’t cite the sunlight manipulation theory much. In her book Addiction by Design: Machine Gambling in Las Vegas, Natasha Dow Schüll, a cultural anthropologist at New York University, does not mention it. Instead, Schüll criticized the gambling industry’s deemphasis of craps, roulette, and other games that involve socialization and the elevation of video-screen slot machines, a “solitary, absorptive activity [that] can suspend time, space, monetary value, social roles and sometimes one’s very sense of existence.”

 

In other words, casinos have better ways of disrupting the senses of their patrons than lying to them about the time. 

 

Source: Why Don’t Casinos Have Windows?


Fact of the Day - TALKING ANIMALS? A CHRISTMAS LEGEND?


Did you know.... The legend that animals gain the power of speech on Christmas Eve has roots all the way back in the manger.

 

For all its very logical and sensible legends and traditions, Christmas has quite a few strange ones too (like, say, gravity-defying reindeer). Some rare bits of Christmas mythology are even stranger still—like the one that claims that at the stroke of midnight on Christmas Eve, animals gain the power of speech.

 

Tidings of Vengeance and Death


The legend—most common in parts of Europe—has been applied to farm animals and household pets alike. It operates on the belief that Jesus’s birth occurred at exactly midnight on Christmas Day, leading to various supernatural occurrences. Many speculate that the myth has pagan roots or may have morphed from the belief that the ox and donkey in the Nativity stable bowed down when Jesus was born. In any case, the story has since taken on a life of its own, with different versions ranging from sweet to scary.

 

According to The Christmas Troll and Other Yuletide Stories by Clement A. Miles, variations of the legend can be surprisingly sinister for holiday lore. One tells the story of vengeful pets plotting against their masters, like this tale from Brittany:

 

“Once upon a time there was a woman who starved her cat and dog. At midnight on Christmas Eve she heard the dog say to the cat, ‘It is quite time we lost our mistress; she is a regular miser. To-night burglars are coming to steal her money; and if she cries out they will break her head.’
‘Twill be a good deed,’ the cat replied. The woman in terror got up to go to a neighbor’s house; as she went out the burglars opened the door, and when she shouted for help they broke her head.”

 

Another tale, this time hailing from the German Alps, features animals foretelling their caretakers’ death. On Christmas Eve, a young farm servant hides in the stables hoping to witness the animals’ speech, where he overhears an alarming conversation between two horses:

 

“We shall have hard work to do this day week,” said one horse.
“Yes, the farmer’s servant is heavy,” replies another horse.
“And the way to the churchyard is long and steep,” says the first.

 

The servant dies a few days later, leaving those horses to do some heavy lifting.

Away in a Manger

 

A more modern version of the tale first aired on ABC in 1970, and while it’s animated and for children, it’s still surprisingly grim. In the made-for-TV cartoon titled The Night The Animals Talked, animals gain the power of speech and sing a song exalting their newfound ability—to insult each other: “You can bicker with anyone you hate / It’s great to communicate.”

 

By the time the animals realize that they’ve been given the ability in order to spread the message of Jesus’s birth, it’s too late. While running through the streets of Bethlehem, they lose their speech one by one. The ox, last to lose the ability, is left to lament that so many humans seem to waste the gift of speech.

 

And then there’s “The Friendly Beasts,” a lighter version of the legend in the form of a Christmas carol. The hymn takes a less literal approach to the “talking animals” theory, instead focusing more on the connection each animal had to Jesus’s birth: “‘I,’ said the donkey, shaggy and brown, ‘I carried His mother up hill and down’; ‘I,’ said the cow, all white and red, ‘I gave Him my manger for His head,’” and so on with the sheep and dove.

 

The song’s origins purportedly lie in a mostly forgotten French medieval feast day, the Fete de L’Ane, or the Feast of the Ass, which honors Mary, Jesus, and Joseph’s flight into Egypt, and the donkey who transported them. The carol was born of an early Latin hymn commonly sung at the feast, “Orientis partibus Adventavit asinus,” or “From the East the ass has come,” which included a chorus of “Hail, Sir donkey, hail!”

 

Christmas Bees Are Singing


The variations of Christmas legends about special or supernatural animal behavior are diverse and far-reaching. Not all necessarily involve animals speaking. In John Howison’s 1821 Sketches of Upper Canada, the author recounts a Native American who told him that on Christmas night “all deer fall upon their knees to the Great Spirit.” William Henderson’s 1879 book Folk-lore of the Northern Counties of England and their Borders recounts the legend that, on Christmas Eve, bees assemble into a type of choir:

 

“Thus the Rev. Hugh Taylor writes: ‘A man of the name of Murray died about the age of ninety, in the parish of Earsdon, Northumberland. He told a sister of mine that on Christmas Eve, the bees assemble and hum a Christmas hymn, and that his mother had distinctly heard them do this on one occasion when she had gone out to listen for her husband’s return. Murray was a shrewd man, yet he seemed to believe this implicitly.’”

 

In some cases, the myth of the singing bees circles back to that of the kneeling oxen: “[…]In the parish of Whitebeck, in Cumberland, bees are said to sing at midnight as soon as the day of the Nativity begins, and also that oxen kneel in their stalls at the same day and hour.”

 

So, singing bees, plotting pets, clairvoyant horses, praying oxen, and more, all to illustrate the power of Christmas Eve—short of supernatural power, it certainly has a strong hold on the collective human imagination.

 

 

Source: Talk is Sheep: Behind the Christmas Eve Myth That Animals Speak at Midnight


Fact of the Day - PLUMBERS


Did you know.... For many of us, the day after Thanksgiving is primarily known as Black Friday — the kick-start to the winter holiday shopping season. But for workers in one industry, it goes by a slightly different moniker: Brown Friday. The nickname comes from the high number of service calls plumbers receive the day after a holiday that strains people’s waistbands and kitchen sinks. Many plumbers say the Friday following Thanksgiving is twice as busy as any other day of the year.

 

While Brown Friday gets its unappealing name from the sewage byproducts workers are often hired to handle, many plumbers report that service calls for bathroom fixes aren’t as common on that day. Instead, kitchen sinks, garbage disposals, and drains are the top offenders (though plumbers acknowledge that having more guests does put additional pressure on a home’s wastewater system). Most post-Thanksgiving plumbing issues stem from two culprits: grease and potato peels. Hot grease washed down sink drains eventually cools and solidifies, leading to buildup that can plug pipes. And when a massive heap of starchy potato peels makes its way down a partially clogged pipe, the grease and peels can congeal to create a kitchen nightmare. Fortunately, experts say there’s an easy way to prevent a Thanksgiving catastrophe: Toss meats, bones, and stringy or dense foods like those potato peels into the trash can instead of down the sink. 

 

Garbage disposals were illegal in New York City until 1997.
The first garbage disposal — the InSinkErator — was patented in 1935, but it was decades before the scrap-busting appliances were officially permitted in the nation’s largest city. While New Yorkers were initially free to use kitchen garbage disposals, the city reversed course in the 1970s over concerns that food scraps would overload the city’s then-aging sewer system. The ban, however, didn’t last. In the mid-1990s, New York City sanitation officials gave out 200 grinders as part of a study to evaluate the impact of garbage disposal use, and by 1997 the ban was repealed entirely. Still, more than 25 years later, it’s a rarity to find a New York City apartment with an under-sink disposal. That’s because many of the city’s apartment buildings were constructed in the early 20th century and still use original plumbing, which landlords worry could clog or break down due to sludgy food particles.

 

 

Source: The day after Thanksgiving is the busiest day of the year for plumbers.


Fact of the Day - EVERGREEN TREES


Did you know... There’s quite a long history behind the most iconic Christmas decoration.

 

Many people look forward to bringing home a traditional tree for Christmas. There’s nothing like picking out the biggest evergreen from the farm, decorating it, and waiting for the day to open presents. The trees are a ubiquitous part of the holiday season, but how did the tradition start? Pagan solstice celebrations, German influences, and Queen Victoria’s reputation are the primary reasons the custom became mainstream.   

 

Believe it or not, evergreens initially had nothing to do with Christianity—although religion was involved. According to History, many ancient cultures viewed the sun as a god whose strength ebbed and flowed throughout the year. When the winter solstice (the shortest day and longest night of the year in the Northern Hemisphere) arrived on December 21 or 22, it was a sign that the sun god was about to regain his strength. They brought greenery indoors to mark the sun’s return and evoke the lusher, warmer times soon to come. People from many parts of the world, including Egypt, Rome, and northern Europe, celebrated the winter solstice in this way. 

 

As Texas A&M University history professor Troy Bickham wrote on the school’s blog, many Europeans continued to practice the pagan winter solstice traditions even as Christmas gained popularity. However, the Christmas tree as we know it today didn’t rise to prominence until the 16th century. In their effort to distance themselves from the Catholic church, Protestant leaders in Germany promoted the Christmas tree to replace images of the nativity, also known as the birth of Jesus. German religious reformer Martin Luther is said to have been the first to decorate a tree for Christmas by adding lit candles. 

 

The earliest European settlers in North America considered Christmas, and the evergreen tradition, sacrilegious. Puritan colonists in Massachusetts even fined people who were caught celebrating the holiday. As German people migrated to the soon-to-be United States in the 18th century, they brought their Christmas traditions with them, and Christmas trees gained popularity accordingly.

 

Queen Victoria is largely responsible for solidifying the Christmas tree’s place in popular culture. The queen’s mother and husband hailed from Germany and influenced the royal holiday celebrations, which included the decorating of a large evergreen tree and placing gifts underneath its boughs. The Christmas tree custom became widespread in England after an 1848 issue of the Illustrated London News showed the queen and her family decorating one. America’s middle class, in turn, deeply admired Victorian culture at the time and adopted the trend. Today, 25 to 30 million Christmas trees are sold annually in the U.S.

 

 

Source: How Did Evergreen Trees Become a Christmas Symbol?


Fact of the Day - CHRISTMAS CRACKERS


Did you know.... George Bernard Shaw is said to have quipped that Britain and America are two countries separated by a common language. But around the holidays, there is at least one more way in which these two nations differ: Only Brits eat their Christmas dinner wearing flimsy paper hats.

 

It’s a sight that’s probably most familiar to Americans thanks to festive scenes in quintessentially British movies, such as Bridget Jones’s Diary and About a Boy. To British people, however, the festive paper hat is as much a part of the big day as roast turkey (rather than ham) and the king’s (rather than the president’s) speech.

 

What Is a Christmas Cracker?
These crown-shaped paper hats are found inside Christmas crackers, traditional festive novelties—far more popular in the UK than the U.S.—that are typically placed on the Christmas dining table or hidden among the decorations on the Christmas tree. Each one consists of a small cardboard tube containing an array of silly throwaway prizes (including a paper hat), which is in turn wrapped inside a longer roll of brightly colored paper and often fastened with decorative ribbons or foil bows.

 

Running through the cracker is a thin firecracker-like “banger,” containing a tiny explosive patch of a friction-sensitive chemical called silver fulminate. When the time comes, neighboring diners at the Christmas dinner table grab one end of the cracker and wrench it apart, with the banger producing a loud “crack” sound as it snaps open. The winner is the person whose side of the torn cracker remains attached to the tube that contains the prizes, which they then get to keep.

 

A Brief History of Christmas Crackers
Christmas crackers date back to the mid-1800s, when a London confectioner named Tom Smith first began adding paper mottos (originally short love poems) into packets of sugared almonds that he sold in twisted tissue-paper packets from his shop on Goswell Road in Clerkenwell.

 

 

 

The idea of adding an explosive “crack” to the equation apparently came to Smith when he heard the loud pop of a log burning in his fireplace, and he spent years concocting a way to safely replicate the surprising sound in his novelty bonbon packets. He patented his first design in 1847 and spent until the 1860s honing the friction mechanism needed to set off his silver fulminate banger—during which time his novelty crackers became the talk of the city. By the end of the 19th century, his business was employing some 2,000 staff.

 

Adding the Crown
Smith may have come up with both the Christmas cracker itself and its eponymous “crack,” but it was apparently his son who added the paper hat into the mix, alongside a handful of other novelty items. The prizes included in Smith’s crackers needed to be light and compact, so in that respect a tissue-paper hat might seem like a sensible choice. But why make it crown-shaped?

 


 

The festive crown actually has a far longer heritage than the Christmas crackers in which it’s now found. According to the BBC, the hat-wearing can be traced to the ancient Romans’ Saturnalia festival, held in mid-December, “which also involved decorative headgear.” In the medieval era, the festive period from Christmas to Twelfth Night was seen as a time of misrule, when a servant would be crowned as a “king” or “queen” and made to preside over the holiday season’s madcap celebrations. It seems that the addition of the paper crown to the Christmas cracker in the 1800s may have been a jokey nod to this age-old tradition of festive misrule, which we’ve maintained ever since.

 

 

Source: Why Do Christmas Crackers Come With Paper Crowns?


Fact of the Day - THE WORD OF 2024


Did you know.... Each year, Oxford University Press—the publisher behind the esteemed Oxford English Dictionary—chooses a word or phrase from the national discourse to be its Word of the Year. For 2024, the selection is a popular term condemning the consumption of low-quality information. The unofficial mental health diagnosis is dubbed brain rot.

 

The OED defines brain rot (sometimes collapsed to brainrot) as “the supposed deterioration of a person’s mental or intellectual state, especially viewed as the result of overconsumption of material (now particularly online content) considered to be trivial or unchallenging. Also: something characterized as likely to lead to such deterioration.” In other words, one can suffer from brain rot by marathoning endless conspiracy theory videos, and the videos themselves can be labeled brain rot.

 

Per The New York Times, signs of brain rot can include the increased insertion of internet slang or meme references into everyday conversation. Those who take pride in their knowledge of online culture may interpret it as a compliment. Still, it’s mostly meant to indicate someone is losing touch with reality. In some cases, it’s labeled as a genuine mental health struggle: Connecticut’s Newport Institute, which offers inpatient counseling, regards brain rot as synonymous with digital or screen dependency.

 

Oxford’s methodology for plucking brain rot from a pile of buzzwords starts with editors selecting six words that saw increased usage and relevance over the past year. The online use of brain rot jumped 230 percent between 2023 and 2024. Public voting was also taken into account.

 

Here are the other candidates for Word of the Year:

  • Demure: A reserved and restrained appearance
  • Dynamic Pricing: The act of altering costs based on real-time demand
  • Lore: A body of knowledge surrounding a person or subject
  • Romantasy: A genre blending romance and fantasy
  • Slop: Pig feed, yes, but in this context low-grade content often generated by language models

 

Brain rot may have surged in the past year, but its use dates back to 1854, when Henry David Thoreau referenced it in his book Walden. He, too, wielded it to describe a diminished mental acuity. “While England endeavours to cure the potato rot, will not any endeavour to cure the brain-rot—which prevails so much more widely and fatally?”

 

In a statement acknowledging the rise of brain rot, Oxford Languages president Casper Grathwohl related the following: “Looking back at the Oxford Word of the Year over the past two decades, you can see society’s growing preoccupation with how our virtual lives are evolving, the way internet culture is permeating so much of who we are and what we talk about. Last year’s winning word, ‘rizz,’ was an interesting example of how language is increasingly formed, shaped, and shared within online communities. ‘Brain rot’ speaks to one of the perceived dangers of virtual life, and how we are using our free time. It feels like a rightful next chapter in the cultural conversation about humanity and technology. It’s not surprising that so many voters embraced the term, endorsing it as our choice this year.”

 

In some cases, being Word of the Year can affect a term’s usage. When Oxford announced rizz as its 2023 selection, the popularity of the term—slang for charisma, or an abundance of confidence—spiked.

 

 

Source: ‘Brain Rot’ Is the Word of 2024, According to Oxford


Fact of the Day - NIAGARA FALLS


Did you know... Niagara Falls is one of the world’s great geological wonders, dumping a rush of 3,160 tons of water over its crest every second. That’s likely why more than 100,000 people showed up to see part of the waterfall suddenly run dry in 1969 — a feat that was orchestrated not by nature, but by engineers.

 

Niagara Falls consists of three waterfalls: Horseshoe Falls (the largest), Bridal Veil Falls (the smallest), and American Falls, which today stands 190 feet tall. But major rock falls in 1931 and 1954 shortened the American Falls’ drop by nearly half, threatening its structural stability. So in June 1969, the U.S. Army Corps of Engineers “dewatered” the massive cataract in an attempt to survey its sturdiness and give crews an opportunity to remove the enormous rock pile sitting below (although that plan was later abandoned due to cost concerns). Temporary cofferdams were built above the American Falls, diverting water to the other two drop-offs and effectively drying out the waterfall for months. 

 

Seeing the underlying rocky riverbed was a rare sight; onlookers hadn’t seen the bare crest since March 1848, when an ice dam on the Niagara River curtailed the watery curtain. This time, scientists used the water shutoff to map the waterfall’s face, collect core samples, and install water pressure monitors. Work at the site was completed in five months, and by November 1969, the U.S. Army Corps of Engineers removed its dam, unveiling a restored waterfall to a surge of visitors.

 

A vice president’s daughter helped popularize honeymooning at Niagara Falls.
Today’s newlyweds often look to tropical destinations for a post-wedding getaway, though at one time, the ultimate honeymoon spot was in upstate New York. For decades, Niagara Falls was considered the “honeymoon capital of the world.” The fact that the massive waterfall had such a draw for lovers has to do with its early history of attracting high-profile couples. Historians point particularly to Theodosia Burr Alston, daughter of third Vice President Aaron Burr (arguably best known for his infamous duel with Alexander Hamilton). Theodosia and her new husband, Joseph Alston, visited the spot in 1801 after their nuptials; a few years later, Niagara Falls received another publicized visit from Jerome Bonaparte (brother to Napoleon) and his bride, Elizabeth Patterson. With the help of easy railroad access and a community that catered to tourism, Niagara Falls reached its peak popularity as a honeymoon destination in the 1950s, but the tradition hasn’t entirely faded. Newlyweds who visit today receive certificates signed by the mayor of Niagara Falls to commemorate their honeymoon choice.

 

 

Source: In 1969, the Army Corps of Engineers turned off part of Niagara Falls.


Fact of the Day - WAS THE HOVERBOARD REAL?


Did you know.... When director Robert Zemeckis appeared in promotional footage to coincide with the release of 1989’s Back to the Future Part II, he was asked about the hoverboard, the futuristic skateboard that glides over the ground and is employed by Marty McFly (Michael J. Fox) when he finds himself chased by rival Biff Tannen and his goons in 2015 Hill Valley.

 

“The hoverboard is a board that hovers on magnetic energy, and it works just like a skateboard except it doesn’t have any wheels,” he said. “You don’t have to have any pavement to hover on.”

 

Zemeckis went on to explain the product was part of a larger toy conspiracy: “They’ve been around for years, it’s just that parents’ groups have not let the toy manufacturers make them. We got our hands on some and we put them in the movie.”

 

The hoverboard was, of course, a fictional construct, one that simply wasn’t possible in the real world given the technology of the 1980s. (Or now, but more on that later.) But thanks to Zemeckis’s tongue-in-cheek answer and a wave of stories spread on school playgrounds, a number of people were convinced the hoverboard was real and that the future was being suppressed by fun-hating authoritarians.

 

Hello, McFly
Urban myths about consumer products are a common part of childhood folklore. In the 1970s, word spread that the actor portraying “Mikey” in commercials for Life cereal, John Gilchrist, had perished after consuming a deadly combination of Pop Rocks and soda. (The fatal gastrointestinal blow supposedly came when the carbonation in both proved too much for his tiny stomach.)

 

But Gilchrist was fine. So were kids who enjoyed Bubble Yum chewing gum even as rumors spread that the secret to its chewiness was processed spider eggs. Such stories may have given kids a chance to express a rising distrust of authority or conventional wisdom—so believing some futuristic technology was being kept from them wasn’t much of a leap.

 

 

The hoverboard depicted in Back to the Future Part II was the work of John Bell, a concept designer recruited by Robert Zemeckis and his co-writer and producer Bob Gale following the success of 1985’s Back to the Future. That film—which followed teenager Marty McFly as he uses his pal Doc Brown’s DeLorean to travel back to 1955 and make sure his parents get together while also saving Doc’s life—teased another installment at its conclusion. (Doc, for those who may not recall, visits Marty after returning from the “future” of 2015.)

 

Once they were on board, Bell and his colleague Dave Carson began to conceive of what Hill Valley might look like in 30 years. Bell initially envisioned the hoverboard as a shrunken type of hovercraft with the aesthetics of a drag racer—exhaust pipes and all. (While it was the first hoverboard in Future lore, it was not the first-ever appearance of a hovering skateboard in fiction. One is mentioned in the 1967 sci-fi novel The Hole in the Zero by M.K. Joseph.)

 

Zemeckis asked Bell to simplify the design. Marty’s hoverboard ultimately resembled little more than a skateboard without wheels and featured a prominent logo for toymaker Mattel, which the script—or possibly someone at Universal’s tie-in marketing arm—imagined as a possible manufacturer for the product. Marty swipes a pink hoverboard from a little girl and flees, with Biff and his henchmen in pursuit. (Biff’s board actually adheres to Bell’s idea of a souped-up racer, with customized touches.)

 

The sequence was accomplished through conventional means, including magnets embedded in sneakers (so they’d appear to snap onto the decks) and wires that suspended the actors above the ground. Rear projection created a sense of movement.

 

But it wasn’t without risk: During a sequence in which several stunt performers were suspended on a crane and soaring toward a building, one stuntwoman, Cheryl Wheeler, fell from a height of 30 feet and sustained serious injuries. (Wheeler had replaced another stunt performer who was wary of how the sequence was being staged and bowed out.)

 

Even simulating a hoverboard in action was dangerous; the real thing, had it existed, would probably have proved a catastrophe. But in a special intended to promote the film, both Zemeckis and the narrator spoke earnestly of the hoverboards, as though they were genuine products. The promo video aired on network television prior to the film’s debut on November 22, 1989. And by that point, both parents and their kids were assuming Mattel’s toy would be something they could stick under a Christmas tree.

 

The Conspiracy
Industrial Light & Magic, the effects house that worked on the films, began to receive calls shortly after the promotional footage aired; parents wanted to know where hoverboards could be purchased. According to Caseen Gaines, who authored 2015's We Don't Need Roads, a book on the making of the trilogy, one parent cited the Zemeckis interview as the reason the product was credible.

 

A letter that a person named Lance Hall sent to the Miami Herald in December 1989 was typical of the ensuing consumer confusion, which married wishful thinking (a real hoverboard) with the conspiratorial.

 

“I just saw the movie Back to the Future II,” Hall wrote. “My brother says the hoverboards Michael J. Fox rides on are real but it’s illegal to sell them because they’re too dangerous. Do hoverboards really exist?”

 

The Herald reached out to Kris Kelley, a spokesperson for Amblin, the Steven Spielberg production company behind the Back to the Future series. “[Zemeckis] thought it would be a good joke, but it wasn’t taken that way,” Kelley said. “We began to hear from parents who wanted them outlawed.”

 

Mattel was also brought into the fray. A 1-800 number set up by the company intended to assist consumers with toy assembly or operating instructions during the holiday season was instead bombarded with queries about the hoverboard, which had by that point developed a reputation for possibly maiming children.

 

“In most cases, we make it clear it was made specifically for the movie,” Mattel spokesperson Glenn Bozarth said. “But if they’ve got a sense of humor, we tell them to wait until 2015.”

 

As it turns out, Bozarth’s joke was something of a prediction.

 

Hovercrafting
In 2014, a video began circulating online featuring skateboard legend Tony Hawk and Christopher Lloyd, the actor who portrayed Doc Brown in the Back to the Future films. They were demonstrating HUVr, which was purportedly a functional hoverboard akin to the one seen in the sequel.

 

 

The footage plays it straight, with little hint of the joke behind it: It was produced by Funny or Die, the humor label behind a series of viral videos like Will Ferrell’s encounter with his toddler landlord. Some viewers took it seriously, and the are-hoverboards-real conversation was renewed.

 

Not long after, in 2015, luxury automaker Lexus offered a glimpse of a hoverboard that was no joke. The device—which resembled Jabba’s barge from Return of the Jedi more than the slim board used by McFly—operated on superconductors and magnets cooled by liquid nitrogen. It could not, as the fictional board did, hover over solid, unprepped pavement. But given proper surface elements, it could achieve lift.

 

 

But Lexus never intended to develop it further, or put it up for sale—it was created for an ad campaign with technology sourced from German engineers. Making it actually move was another engineering challenge altogether. The board could surge forward, but testers, including The Verge, concluded it was highly impractical. For one thing, the board required a superconductor track, which is something cities are unlikely to construct. For another, balance was hard to maintain. (And so was the liquid nitrogen, which was good for about 20 minutes of travel time.)

 

Other enterprising (or nostalgic) souls have tried to develop real hoverboards, with mixed results. The Mattel prop hoverboard from the film, signed by Fox, sold for $501,200 in 2021—a price that reflects its place not only in movie history but in urban-legend infamy.

 

But why did Zemeckis conceive of his tall tale in the first place? In all likelihood, the filmmaker was growing disenchanted with audiences demanding to know how effects were accomplished, and so he began to explain that they weren’t effects at all.

 

“I remember when someone asked Bob how he did the hoverboard sequences in Back to the Future,” Michael J. Fox said in 1996. “Bob would say, ‘What do you mean, how did we do it? It’s a real hoverboard. It flies. Michael just practiced a lot.’ ”

 

 

Source: When People Thought the ‘Back to the Future II’ Hoverboard Was Real

