DarkRavie Posted October 8, 2024

Fact of the Day - EGG CREAMS

Did you know... Foods tend to get their names from their appearance or ingredients, though not all are so clear-cut. Take, for instance, the egg cream, a beverage that has delighted the taste buds of New Yorkers (and other diner patrons) since the 1890s. If you’ve never sipped on the cool, fizzy drink known for its chocolate flavor and foamy top, you should know: There are no eggs or cream in a traditional egg cream drink.

According to culinary lore, the first egg cream was the accidental invention of Louis Auster, a late-19th- and early-20th-century candy shop owner in New York’s Lower East Side. Auster’s sweet treat arrived in the 1890s, at a time when soda fountains had started selling fancier drinks, and it was a hit — the enterprising inventor reportedly sold upwards of 3,000 egg creams per day by the 1920s and ’30s. However, Auster kept his recipe well guarded; the confectioner refused to sell his formula, and eventually took his recipe to the grave.

The origins of the drink’s name have also been lost to time. Some believe the name “egg cream” came from Auster’s use of “Grade A” cream, which could have sounded like “egg cream” with a New York accent. Another possible explanation points to the Yiddish phrase “echt keem,” meaning “pure sweetness.” Regardless of the misleading name, egg creams are once again gaining popularity in New York, though you don’t have to be a city dweller to get your hands on the cool refreshment. Egg creams can be easily made at home with just three ingredients: milk, seltzer, and chocolate syrup.

Chocolate syrup was once marketed as a health tonic.
Centuries before it became a dessert, chocolate was employed medicinally. In Mesoamerica, where chocolate originated, cacao was used among Indigenous communities to treat indigestion, fatigue, and even some dental problems. Europeans of the 17th century also consumed chocolate for health purposes, hoping to cure a variety of ailments. By the late 1800s, pharmaceutical publications widely advertised chocolate powders and syrups, promoting them as healthful aids that also masked the bitter flavors of other medications. Brands like Hershey’s began marketing their syrups and chocolates to everyday consumers as health tonics that were wholesome and nutritious — even “more sustaining than meat.” Eventually, however, regulations against dubious health claims and patent medicines, combined with equipment improvements and declining sugar prices, set the stage for chocolate to be considered more treat than tonic, even as some health claims for it have endured.

Source: Egg creams contain neither eggs nor cream.
DarkRavie Posted October 9, 2024

Fact of the Day - NAMING HURRICANES

Did you know... Beryl, Helene, Milton—recent hurricane monikers aren’t exactly the most popular girls’ and boys’ names. The National Oceanic and Atmospheric Administration (NOAA) predicted a busier-than-normal hurricane season in 2024—and we’ve already seen plenty of evidence of an above-average year. In the chaos of preparing for storms to make landfall, we would also have to deal with the confusion of telling them apart were it not for a naming system that has been used for decades.

Prior to the 1950s, Atlantic hurricanes were identified simply by the year and the order in which they occurred. This system was imperfect, however, especially when meteorologists and the media had to keep tabs on multiple storms at the same time. So in 1953, the U.S. began using a list of female names ordered alphabetically to better clarify which hurricanes were coming when. Male names were assigned to storms in 1978, and in 1979 the coed database of names we now use to track Atlantic storms was officially adopted.

The list includes 21 names for each year, with names for the letters Q, U, X, Y, and Z missing from the lineup. In years when more than 21 named storms form, the extras are labeled from a supplemental list of names (letters of the Greek alphabet served this purpose until that practice was retired in 2021). The catalog has enough names to last six hurricane seasons, after which it gets recycled.

When hurricanes are especially deadly or destructive, those names may be retired. In those cases, the World Meteorological Organization convenes to decide on a new name to fill the empty slot. Andrew, Katrina, Ike, and Sandy are a handful of names that have lost their place on the list in recent decades. The World Meteorological Organization retired at least one name each year between 2015 and 2022. While the organization doesn’t take suggestions, it does make the updated list available for the public to see years in advance.

Source: How Do Hurricanes Get Their Names?
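Not from the original article, but for the curious, here is a minimal sketch of the bookkeeping described above—the 21-name alphabetical lists, the six-year rotation, and name retirement. The names and helper functions below are placeholders for illustration only, not the official WMO lists or any real forecasting software.

```python
# A rough sketch of the rotation described above. The names are placeholders,
# not the official WMO lists, and retirement is simplified to a single swap.

LETTERS = "ABCDEFGHIJKLMNOPRSTVW"  # 21 letters; Q, U, X, Y, and Z are skipped

# Six alphabetical lists of 21 names each; the catalog recycles every six seasons.
name_lists = [[f"{letter}-name-{i}" for letter in LETTERS] for i in range(6)]

def names_for_season(year: int, first_year: int = 2024) -> list:
    """Return the list used for a given Atlantic season (reused every 6 years)."""
    return name_lists[(year - first_year) % 6]

def retire_name(year: int, old: str, replacement: str) -> None:
    """Swap out a retired name so later cycles of that list use the new one."""
    season = names_for_season(year)
    season[season.index(old)] = replacement

# Example: the 2030 season reuses the 2024 list, minus any retired names.
retire_name(2024, "B-name-0", "NewB-name-0")
assert names_for_season(2030) is names_for_season(2024)
```

In practice the lists are curated by the WMO's hurricane committee rather than generated, but the rotation-and-replacement logic works the same way.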
DarkRavie Posted October 10, 2024

Fact of the Day - VACUUM SPACE

Did you know... Roborock recently sent a robot vacuum into the vacuum of space for the first time ever, showcasing the engineering excellence, durability, and performance of its S8 MaxV Ultra in a unique and memorable way. Now, the company has teamed up with Mental Floss to explain why space is a vacuum—all while pushing the boundaries of smart cleaning technology back on Earth.

You may have heard that space is a vacuum and wondered, “So why doesn’t it suck us off the Earth, like a robot vacuum cleaner sucks dirt off my rug?” The answer goes back to how the universe got started, and how the Earth came to be. Now that Roborock has sent its S8 MaxV Ultra into space, they’re partnering with Mental Floss to break down the answer to this big question.

Defining Vacuum
Before we get into it, though, let’s talk about what a vacuum is and isn’t. A vacuum isn’t really a thing—it’s more the absence of anything at all. So when someone says “space is a vacuum,” what they really mean is that it’s pretty much empty. If you waved a bottle around on a spacewalk and sealed it tight, there probably wouldn’t be a single atom inside of it. Your robot vacuum maintains a region of empty space, too, but it’s different from the vacuum of space: The dust and hairs are swept into the vacuum by differences in pressure caused by suction.

A Packed Beginning
Space wasn’t always empty. For the first 400,000 years after the big bang, every inch of the universe was packed with a dense, swirling soup of stuff—mostly hydrogen. All these tightly packed atoms bounced off one another incessantly, keeping the soup incredibly hot at first. But as time went on, space expanded, stretching the soup thinner (like what might happen if you spilled real soup across a table). The thinned-out soup also cooled down and began to congeal.

Thanks to gravity, matter doesn’t like to remain spread out; eventually, everything clumps. The first little clumps of matter formed by pure accident—some patches of space happened to contain a bit more stuff than others. And since all matter has gravity, the more stuff there is in a patch, the more it tugs everything else toward it. Across eons, these puddles of matter collected more matter. Eventually, some clumps reached a critical mass, and their pull became so intense that they rapidly collapsed into stars. The debris they left behind then formed the planets that orbit them. The near-absolute vacuum of space was left in the areas between.

Gravity formed the “clumps” we live in, but it’s also what keeps things on Earth from flying off into space. The vacuum of space does exert a pull on the gas in our atmosphere. But just like you can pull your finger out of a vacuum’s hose with a bit of force, the force of gravity is able to keep the air in our atmosphere from getting sucked out into space.

Cabin Pressure
When you’re watching a sci-fi movie and a hole gets blown in the side of a spaceship, everything starts getting sucked out. That’s because the pressure inside the ship is much higher than the near-zero pressure of space, so the cabin air rushes out through the hole—and there’s nothing outside pushing back to hold it in. All matter gravitates, but a ship doesn’t consist of enough matter to pull gas in—it takes something as big as the Earth.

Space isn’t actually completely empty.
If you waved around a big trash bag instead of a tiny bottle, you might catch a few atoms of hydrogen that managed to escape the gravitational pull of all the stars and planets. And just like the densely packed gas of the early universe was extremely hot, the hydrogen left in space is so loosely packed that it’s freezing cold. The vacuum of space is also why it’s silent. Sound is just vibration, and without any air molecules to shake, sound can’t travel. Now, the next time someone mentions the “cold vacuum of space,” you can explain how it got to be such an uninviting place.

Want to take your cleaning beyond limits? Pick up Roborock’s S8 MaxV Ultra. The combination robot vacuum and mop is equipped with a FlexiArm Design Side Brush and Extra Edge Mopping System, which helps it get into corners and other hard-to-reach areas, ensuring a thorough clean, as well as a voice assistant that will jumpstart cleaning tasks—no WiFi necessary. You can get the Roborock S8 MaxV Ultra on Roborock’s official website.

Source: Why Is Space a Vacuum?
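A quick back-of-the-envelope check on the cabin-pressure point in the post above: the “suction” is really the cabin air pushing outward, and that push is surprisingly large. The hole size and cabin pressure here are assumed round numbers for illustration, not figures from the article.

```python
# Back-of-the-envelope force estimate for a hull breach (assumed values).

CABIN_PRESSURE_PA = 101_325   # roughly sea-level air pressure, in pascals (N/m^2)
SPACE_PRESSURE_PA = 0         # effectively zero outside the ship
HOLE_AREA_M2 = 0.25           # assume a breach about half a meter on a side

# Net outward force = pressure difference x area of the hole
force_newtons = (CABIN_PRESSURE_PA - SPACE_PRESSURE_PA) * HOLE_AREA_M2
print(f"Outward push on the breach: about {force_newtons:,.0f} N")
# ~25,000 N, comparable to the weight of a small truck -- the air (and anything
# loose in it) is shoved out from inside rather than pulled out by the vacuum.
```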
DarkRavie Posted October 11, 2024

Fact of the Day - WEAPONIZED BUTTS?

Did you know.... An important investigation of the muscular marsupials’ bony backsides.

A butt that can kill sounds like a boast from a Nicki Minaj rhyme. But, according to a viral TikTok video, such a thing may be found in nature. A user named “drtanveerk” posted a 24-second computer animation with a supposed wild fact about wombats claiming that, when chased by predators, the muscular marsupials run back to their dens, pointing their backsides toward the entrance. “This is because a wombat’s butt is made mostly of thick cartilage which is hard and tough for predators to bite,” a voice says. “If a predator tries to follow it into the den, the wombat will use its butt to crush the predator’s head against the ceiling. In this way, the wombat can even kill the predator by twerking it to death.”

Another video posted by the same account claims that three quick chimes on an airplane’s overhead sound system alert the crew to an emergency. This is true. Another says that, when under assault, ants can willfully explode to coat an adversary with toxic goo. That is true of some species. But the wombat’s deadly derrière? Is that for real?

Wombats do have exceptionally tough bums. Their backside consists of four bone plates covered by thick cartilage, skin, and fur. This makes it difficult for predators to snatch them from behind, and wombats have guarded the entrances of dens using their rears as a shield. But experts are not sure their butts are instruments of death. A biologist from the University of Adelaide told the Guardian in 2020, “It is possible, by their physiology and everything, but there is no evidence that that is actually happening.” Wildlife biologists have told the Washington Post and Australian Geographic that they have found dead foxes and dingoes outside wombat dens. But the wombats could just be tidying up. They often take over the dens of other animals, whose remains may still be there.

Inspired by the viral video, a reporter for Popular Science asked Lisa Martin, a wildlife care supervisor from the San Diego Zoo Wildlife Alliance, directly. She said she doubted wombats would make such skilled use of their dens, but it’s possible one could shake its butt “in such a way that caused the rock to go into the skull or make a cut,” and “it might look like the wombat surely intended for that damage to happen to the predator.”

Still, wombat butts are one of nature’s wonders, if only because wombats poop in cubes and control their marsupial pouch with a sphincter muscle.

Source: Do Wombats Use Their Butts as Deadly Weapons?
DarkRavie Posted October 12, 2024

Fact of the Day - CHILDREN'S HORROR BOOM

Did you know.... Just as adult horror fiction was beginning to lose steam, authors like R.L. Stine and Christopher Pike traumatized a new generation of readers in the 1990s.

Undead children lure families to an abandoned town, where they murder them and drink their blood to sustain themselves. A teenage girl beats another girl to death with a baseball bat in a high-school shower. At a summer camp, a young counselor’s face is ground away on a pottery wheel, leaving behind a “bloody mass of raw pulp.”

Readers of a certain age won’t be surprised to learn that the above scenarios are taken from the pages of 1990s kid lit: R.L. Stine’s Welcome to Dead House from the Goosebumps series, Christopher Pike’s Die Softly, and Stine’s Fear Street book Lights Out, respectively. By the start of the decade, adult horror fiction was drowning in its own figurative blood. Writers and cover artists were constantly trying to one-up each other in shock value, resulting in an oversaturated market. But if horror has taught us anything, it’s to expect a coda—a final scene where Jason lunges out of the lake, Freddy reaches through the window, or Michael’s supposedly lifeless body has lumbered off to parts unknown. Horror fiction didn’t die when grown-ups stopped buying it; it just slunk off to the children’s section, where it turned the 1990s into a golden age of genuinely frightening kids’ horror literature.

YA Comes of Age
In 1990, young adult (or YA) literature was a relatively new phenomenon. The category is usually traced to the late ’60s, when publishers began in earnest to market books specifically to teen readers. Almost from the beginning, YA lit embraced grim themes such as gang violence (S.E. Hinton’s The Outsiders) and addiction (Robert Lipsyte’s The Contender). YA fiction, which is typically aimed at readers between the ages of 12 and 18, took a turn into even darker territory in 1974 with Robert Cormier’s The Chocolate War. The book was so controversial that teachers in Panama City, Florida, were placed under police protection after opposing local efforts to ban it from schools. Cormier’s novel wasn’t a horror story—it’s about a boy who is singled out for increasingly vicious harassment after refusing to sell chocolate for a school fundraising drive—but its blend of disturbing content and anti-authoritarianism helped establish the YA category as a haven for stories that rubbed adults the wrong way.

The Adult Horror Boom
Around the time that YA lit was carving out a place for itself in the American book market, horror literature was finding its footing. At this point, it might be helpful to understand the difference between category and genre, two terms that have very specific meanings in the publishing industry. Genre is a term that describes a book’s content, while category refers to a book’s target audience. Horror and romance are genres; YA, adult, and middle grade are categories.

Horror novels and stories have been around for hundreds of years, but it wasn’t until the late 1960s and early 1970s that publishers began marketing scary stories as horror novels. According to horror writer Grady Hendrix, you can largely credit the horror section of your local bookstore to the publication of three powerhouse novels: Ira Levin’s Rosemary’s Baby in 1967, followed by Thomas Tryon’s The Other and William Peter Blatty’s The Exorcist in 1971. Book sales have ebbed and flowed, but horror fiction has been a distinct genre ever since.

The rise of the YA market and the proliferation of horror fiction began to converge by the late 1970s, when authors such as Lois Duncan were writing thrillers and supernatural suspense novels for the teen market. But the kids’ horror boom of the late 20th century still needed one final spark to ignite an explosion. It got just that in 1981 courtesy of Alvin Schwartz, a writer and folklorist who reached into the archives of the Library of Congress and came out with the nightmares that would haunt an entire generation of American children.

Enter Scary Stories
By 1981, Alvin Schwartz, a prolific author of children’s books, had published several volumes of folklore, including 1973’s Witcracks: Jokes and Jests from American Folklore and the following year’s Cross Your Fingers, Spit in Your Hat: Superstitions and Other Beliefs. While sifting through the archives, Schwartz began collecting spooky folktales, urban legends, and ghost stories from oral and written traditions around the world. In 1981, a collection of those stories, accompanied by nightmarish ink-wash illustrations by Stephen Gammell, was published as Scary Stories to Tell in the Dark.

It wasn’t the first collection of truly scary, sometimes disturbing stories aimed at kids—Judith Bauer Stamper’s Tales for the Midnight Hour preceded it by four years. But thanks to its combination of short, compelling stories, Schwartz’s distinctive voice, and Gammell’s nightmare-fuel illustrations, Scary Stories to Tell in the Dark became a sensation, leading to follow-ups in 1984 and 1991. The books proved that kids could not only handle grisly, genuinely frightening horror stories—they were champing at the bit for more.

Publishers were happy to oblige. With the adult horror publishing boom in full swing and Stephen King in the ascendant, Dell Publishing launched Twilight: Where Darkness Begins, a line of young adult horror novels by writers such as Betsy Haynes, Richie Tankersley Cusick, and splatterpunk provocateur Richard Laymon, writing as Carl Laymon. Bantam Books countered in 1983 with its Dark Forces line. The series of teen shockers rode the coattails of the decade’s Satanic Panic with cautionary tales about Ouija boards, devilish video games, and sinister heavy metal music.

Two years later, the young adult horror market was doing so well that a struggling writer named Kevin Christopher McFadden ditched the adult-oriented mysteries and sci-fi tales he’d been writing in favor of YA horror and suspense. He published 1985’s Slumber Party under the pen name Christopher Pike and never looked back. The next year, a kids’ humor writer named R.L. Stine followed suit, penning a teen horror novel called Blind Date at the behest of an editor. It worked out pretty well for him, and several more YA horror titles followed. Stine closed out the decade with another teen horror milestone: 1989’s The New Girl, the first book in his Fear Street series. “Forget the funny stuff,” Stine said in a 2015 interview with the Los Angeles Times. “Kids want to be scared.”

Pike ’n’ Stine
By 1991, the adult horror market had played out, a victim of its own excess. In his horror fiction retrospective Paperbacks from Hell, Hendrix writes that publishers had begun to lose money on adult horror titles thanks to a flooded market and a tendency to chase trends—never a formula for success in an industry where it can take years to get a manuscript from a writer’s head to a bookstore endcap. But horror never really dies; it just shapeshifts into new markets, mediums, and subgenres. Adult horror fiction was languishing in the early ’90s, but kids’ horror was thriving, and it was pushing new boundaries in terms of content.

Once the teen horror market was firmly established and writers such as Pike and Stine (sometimes referred to, not exactly charitably, as “Pike ’n’ Stine”) were dominating bestseller lists, middle-grade genre fiction followed suit. The MG horror market caught fire in 1992, when Scholastic, the publisher behind the teen-centric Point Horror series, introduced R.L. Stine’s Goosebumps series with its debut installment, Welcome to Dead House. Aimed at readers aged 8 to 12, Dead House wasn’t the sort of watered-down ghost story parents might expect their kids to read. It was a legitimately creepy tale of undead ghouls who lured entire families to their deaths. The book’s young protagonists survive their ordeal, but not before the family dog is murdered and they face down a horde of zombified townsfolk. Scholastic immediately followed up with another Goosebumps installment, Stay Out of the Basement. (Stine’s Fear Street, meanwhile, was still going strong. Just a couple of months after Goosebumps kicked off, he introduced the Fear Street Cheerleaders series with The First Evil.)

“More and more people are jumping on the horror bandwagon,” Barnes & Noble representative Ann Rucker told the syndicated newspaper supplement Kids Today in 1995. “It’s a very in thing now.” Goosebumps became a sales juggernaut. In 1996, Gannett News Service reported that 3.5 million Goosebumps novels were being sold every month in the United States. That year the series accounted for an incredible 15 percent of Scholastic’s $1 billion annual revenue, according to The New York Times. Other publishers answered with their own middle-grade horror titles, including Simon & Schuster’s Christopher Pike-authored Spooksville series and Betsy Haynes’s Bone Chillers. “Maybe it’s a fad, but I think kids have always loved scary stories,” Haynes told The Miami Herald in 1996. “There’s nothing like getting that little thrill, a little jolt, sitting in your home, knowing it will be all right in the end.”

The ’90s kids’ horror boom wasn’t restricted to bookstores and libraries. It was a momentous decade for small-screen horror, thanks to innovative series such as The X-Files, Tales from the Crypt, Twin Peaks, and Buffy the Vampire Slayer. In October 1991—months before the first Goosebumps book hit stores—Nickelodeon kicked off its Are You Afraid of the Dark? kids’ horror series with a pilot episode titled “The Tale of the Twisted Claw,” a reworking of W.W. Jacobs’s classic three-wishes horror story “The Monkey’s Paw.” Fox launched a Goosebumps TV series in 1995, adapting dozens of Stine’s novels and short stories over the course of its original four-year run.

The Horror Ends (Sort of)
While many parents and educators were simply happy to see kids get excited about books, other adults were considerably less enthused. Thanks to their subversive content, the kids’ horror books of the ’90s roused the same force that took on horror comics in the 1940s and ’50s and rock music in the 1980s: adult outrage. The precedent had already been set in the ’80s, when Scary Stories to Tell in the Dark was targeted by conservative activists. Book-banning advocates came for Stine, Pike, and their peers in the ’90s, demanding that the books be restricted to older readers or tossed from library shelves altogether.

In 1993, community watchdog Mildred Kavanaugh wrote to The Olympian demanding that librarians remove some of the more graphic kids’ horror titles from their shelves. Kavanaugh was especially disturbed by Pike’s Monster, singling it out as “vile drivel” and warning that it and similar books could “provoke some young people into self-destruction and violent acts.” Young readers were quick to defend their books—13-year-old Krista Smith fired back at Kavanaugh, suggesting that the would-be moral guardian should “spend her time writing interesting books for kids [her] age instead of complaining about the ones already written”—but they were often fighting a losing battle. The same year, Pike’s Final Friends trilogy was removed from a middle school library in Bothell, Washington, on the grounds that it supposedly promoted “alcohol, euthanasia, and cheating on tests.”

Publishers also committed the same cardinal sin that doomed the adult horror fiction market: They flooded the market. In 1997, The New York Times reported a precipitous drop in Goosebumps sales. Ray Marchuk, Scholastic’s vice president for finance and investor relations, admitted to the outlet that “nontraditional booksellers” such as gas stations and supermarkets were returning large numbers of unsold Goosebumps paperbacks. “As soon as there’s full saturation and they see slowness of sales, they dump it,” Marchuk said.

The original Goosebumps book series officially ended in 1997, but the brand survived thanks to a seemingly endless rollout of spinoffs such as Horrorland and Slappy World. Stine floated other series, including Mostly Ghostly and The Nightmare Room, but none of them gained traction like Goosebumps had. Pike ended the decade with 1999’s The Grave, but a car accident sidelined him for much of the next 10 years; he didn’t write another book until 2010. Other MG and YA horror authors soldiered on, but by the time the 2000s rolled around, kids’ horror had been largely tamed, and young readers were clamoring for a different sort of literary escape: Harry Potter and the Sorcerer’s Stone made its American debut via Scholastic in 1998.

Even there, though, Stephen King saw the ghost of the ’90s kids’ horror boom. “I have no doubt that Stine’s success was one of the reasons Scholastic took a chance on a young and unknown British writer in the first place,” King wrote for Entertainment Weekly (via Stephen King Fanclub). “[Stine’s] books drew almost no critical attention—to the best of my knowledge, Michiko Kakutani never reviewed Who Killed the Homecoming Queen?—but the kids gave them plenty of attention.”

Source: It Came From the Kid Lit Section: Remembering the Children’s Horror Boom of the 1990s
DarkRavie Posted October 13, 2024

Fact of the Day - TRICK-OR-TREATING

Did you know.... October 31 wasn’t always all about costumed children going door to door for candy.

Come nightfall on October 31, you can expect to see hordes of costumed children roaming their neighborhoods while clutching buckets or bags stuffed with candy. This sugar-fueled tradition is, for many, a highlight of childhood. Trick-or-treating—which has admittedly changed in recent decades with the rise of trunk-or-treat events—evolved from several different rituals. Historians link it to a few different ancestors, some old and some new.

One is the Celtic festival of Samhain, which marked the transition to the new year, as well as the end of the harvest and beginning of the winter. The ancient Celts believed that during this short window (October 31 to November 2 in our modern calendar), the realms of the living and the dead overlapped and that spirits both good and bad could walk among the living. To confuse and ward off the evil spirits, they would sometimes impersonate them with costumes of white clothing, masks, or blackened faces. If they encountered a spirit during the feast, the costumed Celts would be mistaken for one of the otherworldly beings and left alone.

As Christianity gained influence in the British Isles, the old pagan customs were Christianized and adapted to help ease the Celts’ conversion. Three Christian holidays—All Hallows’ Eve, All Saints’ Day, and All Souls’ Day, together known as Hallowmas—were placed on the same days as Samhain. All Hallows’ Eve eventually got shortened to Hallowe'en, and then Halloween, in conversation and casual usage.

Going around the neighborhood for goodies may be an offshoot of souling, which started in the Middle Ages, also in the British Isles. Soulers, mostly children and some poor adults, would go to local homes during Hallowmas and collect food or money in return for prayers said for the dead on All Souls’ Day. A secular version of souling, called guising, eventually sprang up and is first recorded in Scotland in the 19th century. Guisers went door to door and earned food treats or money by offering a small performance, like telling a joke or singing a song. Some accounts of both these traditions make note of costumes that borrowed both from Samhain and British mummery. (They also mention soulers and guisers carrying vegetable lanterns, precursors to the jack-o’-lantern.)

In Trick or Treat: A History of Halloween, horror author and Halloween historian Lisa Morton argues that, rather than old British customs, trick-or-treating is rooted in a more modern, more American practice with no ties to the usual ghouls and ghosts. Belsnickling, derived from the German mumming tradition of Pelznickel, was a Christmastime tradition in German-American communities in which children would dress in costume and then call on their neighbors to see if the adults could guess the identities of the disguised guests. In one version of the practice, the children were rewarded with food or other treats if no one could identify them. “This same custom appears in some early descriptions of trick or treat,” Morton writes, “lending credence to the possibility that it derived from its Christmas cousin.”

Whether it was born of guising or belsnickling, trick-or-treating emerged from the ethnic enclaves as its own, wholly North American custom in the early 20th century. In 1927, a newspaper in Alberta made one of the earliest recorded uses of trick or treat (“The youthful tormentors were at back door and front demanding edible plunder by the word 'trick or treat' to which the inmates gladly responded and sent the robbers away rejoicing”), and the term and the practice spread throughout the 1930s. After a lull caused by WWII sugar rationing, trick-or-treating surged in popularity in the 1950s and became enshrined in pop culture with appearances in national media like The Jack Benny Show and Peanuts comic strips.

Source: Why Do We Go Trick-or-Treating on Halloween?
DarkRavie Posted October 14, 2024

Fact of the Day - HOLLYWOOD MONSTERS

Did you know... Undead. Hungry for blood. Centuries-old. Hollywood movie monsters have been keeping audiences awake at night and fearful about turning dark corners for nearly a century. First popularized on the big screen in the 1930s during the silent-to-sound transition, these iconic black-and-white creations continue to frighten moviegoers and inspire modern updates in film, TV, and beyond. From a blood-thirsty vampire and an oversized ape to a creature from the deep, here are the origins of seven haunting old-school movie monsters.

Dracula (1931)
Universal Pictures hesitated before making a Dracula movie. When first presented with the idea in the 1920s, the studio worried about negative audience reactions to a supernatural tale centering on a bloodthirsty vampire. But then a successful play based on Bram Stoker’s novel Dracula (1897) arrived in theaters, and Universal became desperate for a hit. Dracula was greenlit with the intention of placing silent screen superstar Lon Chaney Sr., known as the “Man of a Thousand Faces,” in the title role. However, Chaney died in 1930. Bela Lugosi, who’d won over audiences in U.S. theatrical productions of Dracula, was subsequently hired to portray the vampire onscreen. His good looks, Hungarian accent, and ability to carry off a tux and cape (attire that had initially been seen on the stage) helped make the movie a hit.

Frankenstein (1931)
Dracula‘s success prompted Universal to search for another monster movie for Lugosi, now a star. The studio opted for Frankenstein, based on Mary Shelley’s 1818 book, with Lugosi slated to play Frankenstein’s monster, a dead criminal’s body reanimated by science. Lugosi wasn’t thrilled about a role that called for his face to be hidden under layers of makeup, but he needn’t have been concerned. When James Whale was brought on to direct, he didn’t want Lugosi in the part, and instead selected Boris Karloff. The monster’s makeup was applied by Jack Pierce, who used his skills to create a flat dome on Karloff’s head to reflect the skull surgery the monster would have endured. Other touches, such as neck bolts and shortening the sleeves of Karloff’s coat to suggest long arms, resulted in an unforgettable archetype. Paired with Karloff’s acting abilities, which communicated the monster’s existential pain, this film won over critics and succeeded at the box office.

The Mummy (1932)
Universal soon wanted to feature Karloff in another monster movie: The Mummy. Rather than being based on a book or play, this movie was partially inspired by the Egypt-mania that overtook the world following the discovery of King Tut’s tomb in 1922. The screenplay was penned by a former reporter, John L. Balderston, who’d written about the tomb. The film also spoke to fears of a so-called Curse of Tutankhamun, which had supposedly claimed the lives of several people with ties to the tomb’s opening. In the story, an Egyptian priest (Karloff) who was buried alive for trying to resuscitate his dead lover is himself restored to life when someone reads a magical scroll. Karloff appeared onscreen in bandages and in makeup that gave him an ancient, withered face (again thanks to Pierce’s skills). The movie, another hit for Karloff and Universal, installed mummies forever in the pantheon of movie monsters.

King Kong (1933)
Universal featured many cinematic monsters, but it wasn’t the only studio to cash in on the phenomenon. In 1933, RKO Pictures wowed moviegoers with a rampaging giant ape known as King Kong. King Kong’s beginnings can be traced to Merian C. Cooper filming exotic locations across the globe in the 1920s. His voyages sparked an idea for a movie that would feature a real gorilla in New York City — but then the Great Depression nixed any notion of getting the funds to shoot abroad or transport a gorilla. Cooper found a job at RKO, where he saw Willis O’Brien using stop-motion animation on another film. Cooper and RKO head David O. Selznick believed that this technique could work for a movie about an enormous ape on the loose in New York City. The result, which Cooper co-directed, was the perennially popular King Kong.

The Wolf Man (1941)
Universal’s Werewolf of London (1935) wasn’t a big hit, but the studio eventually decided to try another werewolf film. In The Wolf Man, Lon Chaney Jr. plays Larry Talbot, who returns from America to his family’s Welsh estate. He’s bitten by a wolf soon after his arrival, which leads to his transformation into the Wolf Man. Screenwriter Curt Siodmak drew on legends of men transforming into destructive wolves, as well as lore that a werewolf emerges during a full moon and can only be killed by silver. The movie’s original title was Destiny, to evoke how outside forces can overshadow personal will. Audiences flocked to the film and empathized with the Wolf Man, cursed with an affliction he cannot control.

Creature From the Black Lagoon (1954)
Creature From the Black Lagoon was the last in Universal’s old-school era of monster success. The initial idea for the film came from writer and producer William Alland, who in the 1940s heard a tale about a fish-man living in the Amazon and wrote a story treatment in 1952. But it was the look of the monstrous creature that made the film stand out. This was largely conceived by Milicent Patrick, an artist employed by Universal’s special effects shop (though her male boss claimed credit at the time). For her creature designs, Patrick studied prehistoric life from the Devonian period, a time about 400 million years in the past, when some species were leaving the oceans to live on land. Though it had the option to shoot in color, Universal stuck to black and white for this movie; this cost-saving choice links this monster to earlier ones.

Gojira (1954) / Godzilla (1956)
Hollywood wasn’t the only place to birth movie monsters. In 1954, Japan’s Toho Studios released Gojira, about an ancient reptile brutally awakened by a nuclear test. Director Honda Ishiro wanted to make the movie in part due to the devastation wreaked by the nuclear bombs dropped on Hiroshima and Nagasaki during World War II. Further inspiration came early in 1954, when crew members of a Japanese fishing vessel got radiation sickness from a nuclear test. Gojira, his name a combination of the Japanese words for whale (“kujira”) and gorilla (“gorira”), was embraced by Japanese audiences. The film was re-edited for its U.S. release: scenes were added in which Raymond Burr played an American reporter following the story of this monster, but this version erased any message about the dangers of nuclear weapons and testing. Gojira was given a new name as well, becoming Godzilla, King of the Monsters.

Source: Hollywood Movie Monsters and Their Scary Origins
DarkRavie Posted October 15, 2024

Fact of the Day - YADA-YADA-YADA

Did you know.... The dispensing of pesky details dates much further back than the 1997 ‘Seinfeld’ episode that popularized it.

When actress Suzanne Cryer appeared in the Neil Simon-penned play Proposals in New York City in 1997, she could hear the audience whispering. It shortly dawned on her what the murmuring was about. People were saying “yada, yada, yada.” Cryer had just appeared in an episode of Seinfeld titled “The Yada Yada,” in which her character Marcy uses the phrase as a kind of exposition filler in conversation—one that George Costanza (Jason Alexander) increasingly finds suspicious. There’s no doubt most people in attendance for Proposals, as well as everywhere else, are familiar with the idiom thanks to that episode. But yada yada yada as a piece of American slang can be traced much further back than George’s relationship troubles.

Yada, Yatter, and Yaddega
According to the Oxford English Dictionary, yada yada yada is likely a linguistic descendant of yatter, a Scottish word dating to 1827 that means “idle talk” or “incessant chatter or gossip.” Yatter was malleable and may have influenced similar expressions in Yiddish, like yatata or yaddega-yaddega, in the early to mid-20th century, to denote a blithe response to conversation or content. It often meant, “Then other things happened, but it’s not relevant or interesting.”

The phrase appeared in gossip columns and comic strips of the 1940s, as well as a Judy Garland and Bing Crosby duet, “Yah Ta Ta.” (“When I’ve got my arm around you and we’re going for a walk, must you ya-ta-ta, ya-ta-ta, talk talk talk?”) Most notably, Richard Rodgers and Oscar Hammerstein composed a different song, “Ya-ta-ta,” for the musical Allegro in 1947, which was intended to satirize the sort of insubstantial small talk found at gatherings. Instead of dialogue, cast members at a cocktail party just repeatedly murmured, “yatata yatata.”

The yada yada yada version may have been coined by controversial stand-up comedian Lenny Bruce, who once wrote “yaddeyahdah” in his 1967 book, The Essential Lenny Bruce. Bruce may have been influenced by Jewish comedians he watched growing up in the 1940s—comics who may, in turn, have adopted the phrase from vaudeville performers. It’s hard to know whether Bruce or simply cultural traditions kept the phrase afloat. But either way, the phrase was an uncommon sight in print until the 1970s. In a 1975 profile of actress Elizabeth Ashley, her use of yada yada yada is noted, which author Sally Quinn defined as “a nonsense line denoting unnecessary explanation.” Later, in 1988, Miami Herald book reviewer Debbie Sontag began a review of Jay McInerney’s Story of My Life with: “Yada, yada, yada. That’s how celebrity writer Jay McInerney’s 20-year-old narrator, the brash-tongued, screwed-up Alison Poole, describes the way her friends drone on, repeating themselves but saying very little. Well, yada, yada, yada is the story of Story of My Life. It dribbles on and on, readable but not compelling, a yada-yada novel about yada-yada people.” (Sontag did not like the book.)

“The Yada Yada”
Yada yada yada is similar to other hand-wave expressions like etc., or blah, blah, blah. Yet yada yada yada didn’t seem to enjoy the same popularity as those conversation fillers until it became part of the Seinfeld lexicon and its laundry list of idioms (“the double dip,” “spongeworthy,” and so on). Credit for its currency in modern language goes to Peter Mehlman, a Seinfeld writer who told Cracked in 2023 that he once encountered an editor who used the phrase yada yada. It occurred to Mehlman that, for his script, yada yada could be symbolic of details that the speaker didn’t want the listener to be privy to. Mehlman co-wrote the script with Jillian Franklyn, who kept in her notes a reminder to play with the phrase blah blah blah. A similar one, yada yada yada, won out, however.

In the episode, which aired in 1997 during the show’s eighth season, George is seeing Marcy (Suzanne Cryer), who tends to say “yada yada yada” over seemingly pertinent details. “I notice she’s big on the phrase ‘yada yada,’” Jerry observes. “That’s good. She’s very succinct.” (Jerry is more concerned that his dentist, Tim Whatley, has abruptly converted to Judaism and now appears too comfortable making Jewish jokes.) George is initially impressed by Marcy’s brevity. (“So, I’m on 3rd Avenue, minding my own business, when yada yada yada, I get a free massage and a facial.”) He adopts it for himself to gloss over his strained relationship with his parents and his dead fiancée. But soon, Marcy appears to be obfuscating troubling details. (“My old boyfriend came over late last night, and yada yada yada … anyway, I’m really tired today.”) Ultimately, George realizes her use of yada yada yada is omitting a fundamental character flaw: a shoplifting habit. (Oddly enough, Mehlman believed the phrase anti-dentite, which Kramer lobs at Jerry over his concerns about Jerry’s offensive dentist jokes, would be the quotable line of the episode.)

Should you use yada yada yada or add commas (yada, yada, yada)? It probably doesn’t matter, though the former might carry a staccato rhythm more conducive to whatever subject matter you’re trying to obscure. Then again, yada yada yada might be one of those rare everyday phrases meant to be spoken rather than written; its meaning is communicated better through inflection and hand gestures. Yada yada yada, or something like that.

Source: Where Did ‘Yada Yada Yada’ Come From?
DarkRavie Posted October 16, 2024

Fact of the Day - CHATTERING CATS

Did you know... We know that cats are prone to purring, meowing, and even hissing, but what about chirping and chattering? The first time you notice it, it can be a bit jarring (and quite funny) to hear your cat making these strange sounds at a beast outside the window. But don’t let the noises startle you. There’s a natural explanation for the very un-catlike vocalizations your pet starts emitting at passing birds and squirrels.

Our feline friends let out a diverse range of sounds—cats can reportedly make as many as 21 distinct types of vocalizations, though it’s estimated that the number might be even higher. When it comes to the noises your cat makes when it sees birds or other small prey animals, they can usually be classified into two types: chirping and chattering.

Why do they make these noises? Because they’re staring at their next meal, of course! The chirping and chattering occur because your cat is (to the best of their ability) attempting to mimic the sounds of the animal they’re after. Some theorize that this noise is the result of evolution, meaning that cats who made these specific noises were able to catch their prey with greater success.

It doesn’t matter if you give Fluffy four meals a day—you can’t change the fact that your cat is a predator, and it’s only natural for their hunting instincts to kick in when they see an appetizing animal. Your cat’s prey instinct isn’t something that’s going to suddenly stop, and if it’s not channeled in a healthy direction, you might be the next item on the menu. This is why playtime is so healthy for felines. When a cat is “bad” and starts to go after your hands or furniture, it’s because their prey instinct isn’t being directed toward any other, more appropriate objects.

It isn’t just house cats that chirp and chatter, either. It’s actually quite common to see big cats like cheetahs and lions making the very same sounds that your beloved pet expresses toward the living room window.

Source: Why Do Cats ‘Chatter’ and ‘Chirp’ at Birds?
DarkRavie Posted October 17, 2024

Fact of the Day - HISTORY OF BEDSHEET GHOSTS

Did you know.... These days, we tend to view bedsheet ghosts as the most benign of specters—but it wasn’t always that way.

Francis Smith stared nervously at the three judges in London’s main criminal courthouse. Smith, a mild-mannered excise tax collector, had no known criminal history and certainly no intention of becoming the centerpiece of one of 19th-century England’s most unusual murder trials. But a week earlier, Smith had made a criminally foolish mistake: He had shot and killed what he believed to be a ghost.

The spectators inside the courthouse sat hushed as the prosecutor and a cross-examiner questioned about half a dozen eyewitnesses. Each person had seen Smith in the village of Hammersmith (now a part of London) the night of the crime, or they had previously seen the ghost that Smith was zealously hunting. One such eyewitness, William Girdler, the village night-watchman and Smith’s ghost-hunting partner, had not only seen the white-sheeted specter lurking across the street—he had chased it.

“When you pursued it,” the cross-examiner asked, “how did it escape?”

“Slipped the sheet or table-cloth off, and then got it over his head,” Girdler responded. “It was just as if his head was in a bag.”

“How long had the neighborhood been alarmed with its appearance?”

“About six weeks or two months.”

“Was the alarm great and general?”

“Yes, very great.”

“Had considerable mischief happened from it?”

“Many people were very much frightened.”

Girdler was telling the truth. The people of Hammersmith had reported seeing a ghost for weeks now, and they were terrified. The specter was verifiably violent. It assaulted men and women, and during its two-month campaign of harassment and intimidation, it had successfully evaded capture. Rumors swirled that it could manifest from graves in an instant, and sink back into the mud just as quickly. At the time, the magazine Kirby’s Wonderful and Scientific Museum reported that the ghost was “so clever and nimble in its retreats, that they could never be traced.”

When Ann Millwood took the stand, the cross-examiner asked if she was familiar with these reports. “Yes, I heard great talk of it,” Millwood explained, “that sometimes it appeared in a white sheet, and sometimes in a calf-skin dress, with horns on its head, and glass eyes.” That wasn’t all. The ghost also reportedly took the shape of Napoleon Bonaparte; other accounts said that its eyes radiated like glow-worms and that it breathed fire.

It must have been incredibly difficult for Millwood to describe the ghost’s appearance, especially in front of a public audience. The ghoul she characterized looked nothing like her late brother Thomas, the young man whom Francis Smith had mistakenly murdered.

In 19th-century Britain, seeing a ghost—at least, a person dressed up as one—was not uncommon. Ghost impersonating was something of a fad, with churchyards and cobblestoned alleyways regularly plagued by pranksters, louts, and other sheet-wearing hoaxsters who were up to no good. Historian Owen Davies tracks the origin of ghost impersonators in his wide-ranging book, The Haunted: A Social History of Ghosts, tracing the first reports of fake ghosts to the Reformation, when critics of Catholicism accused the Church of impersonating the dead to convert doubters. (According to one account by the reformer Erasmus, a priest once fastened candles to a cast of crabs and released them in a dark graveyard in hopes of imitating the lost, wandering souls of purgatory.)

But for most ghost impersonators, candle-strapped crustaceans were unnecessary; all you needed was a white sheet. Up until the 19th century, the bodies of the poor weren’t buried in coffins but simply wrapped in fabric—sometimes the sheet of the deathbed—which would be knotted at the head and feet. Ghost impersonators adopted the white sheet as their de facto wardrobe as early as 1584, when Reginald Scot, a member of Parliament and witchcraft aficionado, wrote that “one knave in a white sheet hath cozened [that is, deceived] and abused many thousands that way.” It’s from this practice that the trope of a white-sheeted ghost originated.

Seventeenth- and 18th-century Britain is sprinkled with accounts of phony phantoms. Take Thomas Wilmot, a famed crook and highwayman who once disguised himself as a spirit to steal money. (His appearance—chalked-up skin and a sheet-bound head—sent a table of gamblers scrambling for an exit. Wilmot pocketed the cash they left on the table.) And by the 1760s, so many white-sheeted pranksters were prowling in cemeteries that annoyed citizens were paying bounties to get rid of them. According to the Annual Register, one ghost in southern Westminster “struck such terror into the credulous inhabitants thereabouts, that those who could not be brought to believe it a ghost, entered into a subscription, to give five guineas to the person, who would seize him.”

These pranks had consequences. In 1792, a ghost impersonator in Essex spooked a farm-worker steering a wagon; the horses jumped, the driver tumbled, and his leg was crushed by one of the wagon’s wheels. He died from his injuries. Twelve years later, soldiers in London’s St. James’s Park spotted the specter of a headless woman, an event that authorities took very seriously, if only because it was distracting—and reportedly harming—its security guards. In the 1830s, a ghost impersonator was tried for manslaughter because he literally frightened an 81-year-old woman to death.

It was dangerous for the so-called ghosts, too. In 1844, six men chased a ghost impersonator and beat him so badly that he had to visit the hospital. In 1888, a mob of 50 villagers—all armed with sticks—surrounded a “ghost” and only released him after he agreed to donate money to a local infirmary. (Some ghost-busts startled investigators for other reasons: Davies writes that, in 1834, an investigation of an unoccupied haunted house revealed “nothing more than some boisterous love-makers.”)

Like many other pastimes in 19th-century Britain, ghost impersonating was a gendered activity: Women, especially young female servants, were often restricted to mimicking poltergeist activity indoors—rapping on doors, moving furniture, throwing rocks at windows—while the sheet-wearing hijinks were reserved for young men who, far too often, had scuzzy intentions. Most accounts of ghost impersonating, both modern and historical, gloss over the fact that men often used their ghostly cover to intimidate, harass, sexually assault, and even rape women. In his precise and critical account of ghost impersonators, Spirits of an Industrial Age, the historian Jacob Middleton argues that ghost impersonating was not only the domain of juvenile pranksters, but also that of sexual predators.

This was made most painfully clear during the 1830s, the height of hauntings by “Spring-Heeled Jack.” Every day, London’s women had to contend not only with the persistent threat of cads and street harassers, but also with men the press dubbed “Monsters,” menaces who stalked, grabbed, groped, slashed, and stabbed women in the breasts and buttocks. These criminals were piquerists, people who took sexual pleasure in piercing the skin of women, and a spate of attacks in the 1780s put all of London on edge. In the early 1800s, these boors started to take cover by dressing as ghosts. Spring-Heeled Jack, called a “monster in human form,” was among them: Hiding in alleyways after sunset, he would seek out lone women, knock on their doors, and attempt to tear away their clothes with hooks. Thanks to London’s sensationalist press, tales of Spring-Heeled Jack would bloat into urban legend.

But even before Spring-Heeled Jack, on a normal evening, the women of Hammersmith were justified in feeling worried about stepping outside after dark. Organized police forces were a relatively new idea in Great Britain, and solitary neighborhoods such as Hammersmith were protected by little more than a roving constable or watchman. Reports of the Hammersmith ghost intensified that anxiety. (The community’s men weren’t much help. As the Morning Post reported, “[The ghost] was seen on Monday evening last pursuing a woman, who shrieked dreadfully. Although there were four male passengers in the stage coach, which passed at the time, not one durst venture to the rescue of the distressed female.”) It wasn’t until after weeks of attacks that bands of locals, their bellies sloshing with ale supplied by the nearest public house, began taking to the streets to stop the menace.

It was at the intersection of these two sad facts that the tragedy at Hammersmith unfolded: Francis Smith went out on January 3, 1804 to catch a ghost, while Thomas Millwood went out to ensure that his wife, who was walking home alone in the dark, did not meet one.

Thomas Millwood had been told he resembled the Hammersmith ghost. A bricklayer, Millwood wore a white jacket, white trousers, and a white apron, an ensemble that scared a carriage-riding couple one dark Saturday night. When the passerby exclaimed to his wife, “There goes the ghost!” Millwood turned and uncorked a few colorful and unprintable words, asking if the man wanted “a punch in the head.” After the incident, a family member named Phoebe Fullbrooke implored Millwood to change his wardrobe at night. “Your clothes look white,” she said. “Pray do put on your great coat, that you may not run any danger.” Millwood mumbled something about how he hoped the town’s vigilantes would catch the ghost, but he neglected to take the advice and continued walking home in his white work clothes.

A few nights later, Francis Smith and William Girdler went ghost hunting. Compelled by reports of the ghost’s violence, the men carried firearms. Hammersmith’s spirit had choked a man, and the village swirled with rumors that it had even attacked a pregnant woman who later died of shock. According to one report, the apparition caused “so much alarm, that every superstitious person in that neighborhood had been filled with the most powerful apprehensions.” But superstitions mattered little. Ghost or not, there was undoubtedly a public menace in Hammersmith, and people wanted it gone. A bounty of 10 pounds would be awarded to anybody who caught it.

That same night, Thomas Millwood stopped at his father’s house and began chatting with his sister Ann. Sometime between 10 and 11 p.m., she suggested he leave and escort his wife, who was still in town, back home. “You had better go,” Ann said. “It is dangerous for your wife to come home by herself.” Millwood agreed and stepped outside, wearing his white bricklayer’s clothes. He didn’t know that he was walking down the same unlit lane as Francis Smith and his shotgun.

When Smith spotted the white figure gliding in his direction, he lifted his fowling piece to his shoulder and yelled, “Damn you, who are you? Stand, else I’ll shoot you.” The air stood silent. He yelled a second time and stared down the barrel. Not hearing any response, Smith fired. Millwood’s sister heard the gunshot and screamed for Thomas, but, like Smith, she heard no response. She later found her brother lying face up on the dirt lane, his face stained black with gunpowder, his white clothes stained red. The Caledonian Mercury reported the sad news later that week: “We have to announce to the public an event, in some of its circumstances so ludicrous, but in its result so dreadful, that we fear if the reader should even laugh with one side of his mouth, he must of necessity cry with the other.”

The moment the smell of spent gunpowder hit his nose, Smith knew he’d made a mistake. Millwood had been killed instantly; the shot entered his lower left jaw and exited through the back of his neck. Smith barged into the White Hart pub in visible distress, possibly in shock, and waited to be arrested. One week later, he stood trial at London’s Old Bailey courthouse. The jury deliberated for 45 minutes before returning with a conviction of manslaughter. The three judges rejected the verdict. “The Court have no hesitation whatever with regard to the law,” Justice Rooke exclaimed, “and therefore the verdict must be—‘Guilty of Murder’ or ‘a total acquittal from want of evidence.’ ” In other words, the jury could not be wishy-washy. Smith was either guilty of murder or not guilty of murder—the jury needed to decide. Within minutes, Smith was convicted of murder. He was sentenced to hang the next Monday; his body would be dissected in the name of science.

Reports of Smith’s trial were lurid. As the Newgate Calendar tells it, “When the dreadful word ‘Guilty!’ was pronounced [Smith] sank into a state of stupefaction exceeding despair.” His feelings were likely intensified by the admission of John Graham, a Hammersmith shoemaker who days earlier admitted to starting the Hammersmith ghost hoax. (Graham began impersonating the specter to scare his apprentices, who he complained were filling his children’s heads with nonsense about ghosts. Unfortunately, his prank appears to have inspired violent copycats to engage in what the Caledonian Mercury called “weak, perhaps wicked frolic.”)

In the end, Smith would be lucky. His sentence was sent to His Majesty King George III, who not only delayed the execution but eventually granted Smith a full pardon. The Hammersmith ghost trial, however, would haunt England’s legal system for almost another two centuries. Smith’s case would remain a philosophical head-scratcher: If somebody commits an act of violence in an effort to stop a crime from occurring—only to realize later that they were mistaken and that no crime was being committed—is that person still justified in using violence? Or are they the criminal? British law would not make room for this gray area until the 1980s.

Meanwhile, the tragedy in Hammersmith failed to deter England’s many ghost impersonators. Pranksters and creeps alike continued wearing bedsheets in dark cemeteries and alleyways for almost another century. In fact, the ghost of 1803 and 1804 would not be the last specter to haunt the village of Hammersmith. Two decades later, a ghost would return. But this time, villagers whispered rumors that this haunting was real, caused by the angry soul of a white-clad bricklayer named Thomas Millwood.

Source: Assault, Robbery, and Murder: The Dark History of “Bedsheet Ghosts”
DarkRavie Posted October 18, 2024 Author Report Share Posted October 18, 2024 Fact of the Day - "PEANUTS" SPECIAL Did you know.... It’s pumpkin spice season, and that means it’s time for Linus, Lucy, Snoopy, That Round-Headed Kid, and the whole gang to appear in It’s the Great Pumpkin, Charlie Brown. One of the most beloved of the animated “Peanuts” specials, it’s based on Charles M. Schulz’s long-running comic strip. The newspaper comic debuted in 1950, and the nearly 18,000 strips published before Schulz’s death in 2000 make “Peanuts” perhaps the longest-running story ever told by one person. Whether you’re waiting in the pumpkin patch with Linus or trick-or-treating (not for rocks!) with everyone else, here are five fun facts about some of America’s favorite cartoon specials. First, There Were Fords In 1956, country and gospel singer Tennessee Ernie Ford became the host of the prime-time musical variety program The Ford Show, which was sponsored by the Ford Motor Company (no relation). In 1959, Ford licensed the “Peanuts” comic strip characters to do TV commercials and intros for the show, hiring film director and animator José Cuauhtémoc “Bill” Melendez to bring the figures to life. Melendez, who started his career at Walt Disney Studios, was the only artist whom Schulz would authorize to animate the characters. The multitalented Melendez also provided the “voices” for Snoopy and Woodstock. And Then Came Christmas The animated commercials (and The Ford Show) were a huge hit. On December 9, 1965, the 30-minute A Charlie Brown Christmas made its debut on CBS. Some predicted that the show’s use of child actors, lack of a laugh track, and jazz soundtrack would render it a flop. Instead, A Charlie Brown Christmas won an Emmy and a Peabody and became an annual tradition, airing on broadcast television for 56 years before moving to the Apple TV+ streaming service in 2020. Jazz composer and pianist Vince Guaraldi’s score became a bestselling album, with more than 5 million copies sold. It’s the second-oldest recurring holiday animation, coming after Rudolph the Red-Nosed Reindeer, which made its first appearance in 1964. A “Peanuts” Special Probably Killed (Aluminum) Christmas Trees A Charlie Brown Christmas was a critique of the materialism and commercialism of the Christmas season, and was especially harsh on the mid-’60s mania for shiny aluminum trees. The Mirro Aluminum Company (then known as the Aluminum Specialty Company) of Manitowoc, Wisconsin, began producing Evergleam aluminum trees in 1959, and at its peak in 1964, made around 150,000 of them a year. In the special, Lucy orders Charlie Brown to “get the biggest aluminum tree you can find … maybe paint it pink!” Charlie Brown instead chooses a half-dead, barely needled little fir. Sales of the shiny fake trees plummeted soon after. Halloween and Thanksgiving Came After Christmas The first “Peanuts” special was such a hit that it soon spawned an entire industry of “Peanuts” specials. Many were themed around holidays, including Arbor Day. It’s the Great Pumpkin, Charlie Brown, which aired in 1966, has our poor hero receiving rocks instead of candy while trick-or-treating. The plot of A Charlie Brown Thanksgiving, meanwhile, which aired in 1973, had Peppermint Patty inviting the gang to Charlie’s house for dinner — when he was supposed to eat with his grandmother. Linus, Snoopy, and Woodstock pull together a feast of toast, popcorn, pretzels, and jelly beans … but there’s a happy traditional turkey for everyone at the end. 
There Are a Ton of “Peanuts” Specials, and They’re Still Popular In addition to the holiday-themed programs (which included shows for New Year’s, Valentine’s Day, and Easter), the “Peanuts” specials empire includes a full-length feature, A Boy Named Charlie Brown. Greenlit in 1969 after the success of other specials, A Boy Named Charlie Brown has its namesake competing in the National Spelling Bee, only to blow his chances by misspelling the word “beagle.” There are also documentaries and television series, including new releases like Welcome Home Franklin, which aired for the first time in 2024. Source: Great Facts About “Peanuts” Specials 1 Link to comment Share on other sites More sharing options...
DarkRavie Posted October 20, 2024 Author Report Share Posted October 20, 2024 Fact of the Day - GIVE A TOAST Did you know.... Today, cultures around the world have specific rules and phrases for the common toast. In South Korea, one accepts a drink with two hands, and in Italy, locking eyes is absolutely essential. But how exactly does the word “toast,” as in dry bread, figure into all of this? Well, it turns out dunking literal pieces of toast into a drink during celebrations in someone’s honor was commonplace centuries ago. Historians believe the practice came from the idea that the bread soaked up unwanted bitter or acidic sediments found in wine, thus making the drink more enjoyable. By the 18th century, the term “toast” somehow became more entwined with the person receiving the honor than the bread itself, which is also where the phrase “toast of the town” originates. Although dipping crusty bread into your beverage isn’t a common custom today, you don’t have to look hard to find remnants of the practice in literature. In William Shakespeare’s The Merry Wives of Windsor, the hard-drinking Falstaff quips, “Go, fetch me a quart of sack [wine]; put a toast in ’t,” a reference to the bread-dipping ritual. Lodowick Lloyd’s The Pilgrimage of Princes, written in 1573, also contains the passage “Alphonsus … tooke a toaste out of his cuppe, and cast it to the Dogge,” confirming that the alcohol-infused bread didn’t always go to waste after being dunked. Because general toasting in 16th- and 17th-century Europe was often an excuse to drink heavily, many temperance movements, including one in Puritan Massachusetts, banned the practice in the name of health. Of course, these bans didn’t stick, and today toasts — sans actual bread — are central to some of the biggest celebrations in our lives. Libation is an ancient drink-pouring ritual found in many cultures. Today the word “libation” is mostly used as a stand-in for “alcoholic beverage,” but such a definition omits the complex history of the religious and secular ritual known as libation — the act of pouring out a drink to honor the deceased or a deity. Libation is one of the most widespread yet least understood rituals in human history. The act of pouring out liquid (whether on the ground or on an elaborate altar) can be found in cultures throughout the world dating back to the Bronze Age. The Papyrus of Ani, dated 1250 BCE, reads, “Pour libation for your father and mother who rest in the valley of the dead,” and religions with seemingly little connection, such as Greek paganism, Judaism, Christianity, and traditional African religions, all feature some sort of libation ceremony. Even tribes in pre-Columbian South America, separated by an entire ocean from these other examples, performed similar liquid sacrifices. Today, forms of libation rituals still occur in Kwanzaa celebrations, weddings, the hit comedy show Key & Peele, and in bars around the world, where patrons (usually metaphorically) “pour one out” for the dearly departed. Source: The concept of toasting comes from putting a piece of toast in one’s drink. 1 Link to comment Share on other sites More sharing options...
DarkRavie Posted October 20, 2024 Author Report Share Posted October 20, 2024 (edited) Fact of the Day - HISTORIC CEMETERIES Did you know... Three historic cemeteries, designed as refuges for the dead, are bringing their landscapes back to life for native plants and animals. The weathered gravestones at Mount Olivet Cemetery, the final resting place of more than 180,000 people, mark a city of the dead that is incongruously alive. On a slope overlooking northeast Washington, D.C., near the graves of the Lincoln assassination conspirator Mary Surratt and White House architect James Hoban, oak trees rustle their leaves. Golden dragonflies zip through sunbeams and bumblebees flit between purple coneflowers. A blue jay screeches while a downy woodpecker hammers at a branch. A buckeye butterfly on a black-eyed susan in one of Mount Olivet's new rain gardens. All of the activity is concentrated around newly constructed rain gardens—ditches partitioned by weirs, filled with gravel and soil and native plants—that catch stormwater running off the cemetery’s hillsides. Instead of flooding the grounds and overwhelming the sewer system, the water is safely filtered back underground and into the Anacostia River watershed. Though a summer drought left them dry, the gardens bloom with ironweed and black-eyed susans and buzz with insects, the foundations of a vigorous ecosystem among the dead. A completed rain garden, which took over one lane of a formerly two-lane roadway, at Mount Olivet Cemetery. Mount Olivet, founded in 1858 as a racially integrated Catholic burial ground, is not the only historical American cemetery being reimagined as a resilient haven for wildlife. Other 19th-century garden cemeteries—a style of burial ground that combined reverence for the dead with Romantic interpretations of nature—are retrofitting their secluded groves and quiet ponds as sanctuaries for flora and fauna that have long been absent from the surrounding urban landscape. Native plants and fungi with appropriately spooky names (snakeroot, weeping widows, witch’s butter) are sprouting among the graves, while modern infrastructure to address flooding also creates places for frogs, butterflies, and birds. Even bats and coyotes are moving in. “To Rob Death of a Portion of Its Terrors” For centuries, Christians buried their dead in churchyards. The graves were protected by a fence as well as the comforting shadow of the church itself. Society’s outcasts were interred beyond the village limits, in spiritual and actual wilderness. “Nature was not welcomed” within the churchyards, James R. Cothran and Erica Danylchak write in Grave Landscapes: The Nineteenth-Century Rural Cemetery Movement. Some members of the clergy even associated certain plants and trees with pagan religions or viewed nature as evil. Puritan immigrants carried this attitude with them to New England in the 17th and 18th centuries. “For the first settlers in the Massachusetts Bay Colony,” historian Blanche M. G. Linden writes, “the woods were a netherworld where the faithful could be bewildered by evil spirits, witches, or even the devil himself. A hostile, menacing nature loomed large around the Puritan town.” Boston’s leaders thus established burial grounds in the town’s center—and quickly ran out of space. The Granary Burying Ground, founded in 1660, became so overstuffed that bodies had to be buried four deep in unmarked, common plots. Boston’s Granary Burying Ground was overstuffed by the early 19th century. 
By the dawn of the 19th century, Puritanism had declined and so had the care of Boston’s graveyards. The city’s board of health found “the odor being such as to sicken persons in the vicinity. The tombs were exceedingly dilapidated, giving free vent to gases, and in some instances men cutting grass had fallen into them. The soil of both the Granary and King’s Chapel was fairly saturated with buried remains, the two cemeteries containing about 3000 bodies” in a combined area smaller than two football fields. The situation demanded innovative solutions. Dr. Jacob Bigelow, a lecturer in medicine and botany at Harvard, gathered some of the city’s civic leaders at his home and proposed a new style of burying ground, one that would “rob death of a portion of its terrors.” The cemetery would be located outside the city in a plot of woodland already known as Sweet Auburn, “in which the beauties of nature should, as far as possible, relieve from their repulsive features the tenements of the deceased” and soothe the grief of mourners, he wrote. Jacob Bigelow When it opened in 1831, Mount Auburn Cemetery epitomized what historian Philippe Ariès calls the “Age of the Beautiful Death,” wherein the earthly trials of the deceased were left behind for an afterlife of peace with loved ones in eternity. The concept dovetailed with Americans’ changing view of nature, influenced by European Romanticism and the contrast between America’s industrialization and its rural past. Experience in nature—or a suburban approximation of it, improved by human hands—came to be “the essential source of moral, intellectual, poetic, and spiritual energy,” Denise Otis writes in Grounds for Pleasure: Four Centuries of the American Garden. In this way, Mount Auburn and the garden cemeteries that followed were always built more for the living than for the dead. Mount Auburn Cemetery: A Refuge for the Living Then, as now, visitors to Mount Auburn enter through a large gate in the Egyptian Revival style. Walking paths named after plants wind through glens where headstones and monuments nestle in the vegetation. Weeping willows arch over picturesque ponds. Benches and fountains give visitors places to rest and contemplate. Before public parks were common, Mount Auburn served as a lush refuge where families could socialize, have picnics, and breathe fresh, unpolluted air among the monuments. Now, it’s adding to that legacy by welcoming back native flora and fauna as well as citizen scientists. ‘Mount Auburn Cemetery’ The effort began in 2014 with a meeting of teams of biologists, ecologists, herpetologists, ornithologists, hydrologists, habitat restoration specialists, and landscape designers, says Paul Kwiatkowski, director of urban ecology and sustainability at Mount Auburn Cemetery. After walking among and observing the landscape’s features, the teams developed a “mini masterplan” with two initial goals, he tells Mental Floss: removing invasive and undesirable plants and replanting with native species, which provide better habitat for wildlife while improving Mount Auburn’s aesthetic appearance; and conducting terrestrial and aquatic species surveys to determine what was already living in the cemetery’s ponds and meadows. “This effort provided the information required to expand a plan for the reintroduction of native amphibian species that once resided at the cemetery, but were no longer present,” Kwiatkowski says. 
“We have since successfully reintroduced breeding populations of American toads, spring peepers, and gray treefrogs, and we have a list of amphibians, reptiles, and fish that we hope to introduce in the future.” The landscape of Mount Auburn Cemetery. The initial projects evolved into Mount Auburn’s Wildlife Action Plan, which continues to develop new initiatives to engage the community in its ongoing conservation activities. Beginning in 2016, the cemetery launched its Citizen Science Naturalist Program, which recruits local experts from Boston’s many colleges and universities to help design and implement biodiversity research field projects on the grounds. The program also runs field training and a classroom to create a team of knowledgeable citizen scientists to support the studies. More than 300 people have attended classroom instruction since 2016 and 50 field assistants take part in ongoing field work each year, which includes studies of urban bats and coyotes, surveys of breeding bird populations, insect counts, fungi and lichen monitoring, and the reintroduction of the native red-backed salamander. More than 500 species of animals, plants, and fungi have been spotted in Mount Auburn and logged into the popular iNaturalist app. All of the conservation and horticultural work that takes place at Mount Auburn has to be done without harming the historic monuments and burial plots, which exist in all sizes and dimensions. For example, 1960s-era markers installed at grade with the ground present a challenge for replanting, says Dennis Collins, the cemetery’s horticultural curator. At that time, “nearly all of the grounds were simply covered in grass and maintained with regular mowing. To introduce alternative groundcovers around these monuments, care must be taken to allow visitors to find them, and to read the inscriptions,” Collins tells Mental Floss. An American robin feeds her nestlings in Mount Auburn Cemetery. When installing new wildflower meadows around the flat plaques, the staff has surrounded the gravesites with patches of grass so visitors can find their loved ones among the sprawl of nature. Mount Olivet Cemetery: From Roads to Rain Gardens Founded just before the Civil War, Mount Olivet Cemetery brought Mount Auburn’s winding paths, sheltering groves, and contemplative atmosphere to Washington, D.C. Its exposed location on the crest of a hill creates the illusion of its monuments floating above the rest of the cityscape. Weathered white obelisks, praying angels, and crosses rise up from the lawn against blue sky. A section of Mount Olivet Cemetery in Washington, D.C. This landscape has caused the cemetery a serious problem in recent years, however. Rainfall doesn’t soak harmlessly into the ground, but hits impervious pavement and flows downhill into Hickey Run, a tributary of the Anacostia River. On the way, it collects oil, sediment, and trash and carries it directly into the already-polluted waters. The Catholic Archdiocese of Washington, which owns the cemetery, was paying tens of thousands of dollars in impervious area charges—calculated according to its square footage of surfaces like roads and lawns—to the city each year and needed to reduce its thousands of gallons of runoff. The stormwater issue was not unique to Mount Olivet. 
To find a sustainable, citywide solution, the District of Columbia established a first-of-its-kind stormwater retention credit (SRC) market in 2014, which allows property owners to retrofit impervious surfaces with water-retaining green infrastructure, like flower beds and shrubbery. That generates credits that the property can sell on the SRC market. Other properties purchase the credits to meet a portion of their legally mandated stormwater retention requirements. Thus, both seller and buyer save money in stormwater fees while enhancing the landscape. Mount Olivet was eager to be the first property to test this model [PDF] and began collaborating with The Nature Conservancy’s Maryland and D.C. chapter in 2017 to bring a pilot project to fruition. “One thing that came up repeatedly in conversations, both with the Archdiocese and also during events that we were holding around this, was [that] they wanted to connect the work happening at Mount Olivet with the pope’s encyclical on the environment, [which] had come out fairly recently at that time, about care for the Earth,” Matthew Kane, the conservancy’s associate director of communications, tells Mental Floss. “They saw that this work had a clear connection to that call, to better care for the world around them—which was a really wonderful message for them to be bringing to their audiences and to the members of the Catholic church in Washington,” he says. The Nature Conservancy set up a company to manage the cemetery’s generation of SRC credits and arranged private funding to develop several qualifying green infrastructure projects at Mount Olivet. The first, a major initiative to install nine rain gardens, was completed in summer 2024. Based on a study of the site’s topography, soil composition, and layout, which included the use of ground-penetrating radar and historical documents to determine the location of graves, engineers replaced underused roadways with linear rain gardens—effectively turning one lane of several two-way roads into long basins planted with native flora. During storms, rain flows downhill and collects in these basins, where it trickles slowly through layers of substrate back into the ground. The gardens don’t just reduce runoff; they also provide habitat for pollinating insects and enrich the cemetery’s aesthetic virtues for visitors. The Archdiocese and The Nature Conservancy also worked with a local nonprofit, Casey Trees, to replace disused sidewalks with new plantings. At the conclusion of all phases of the project in 2024, the numbers are impressive. It generated 217,717 credits for the cemetery to sell on the stormwater market, the proceeds from which can be reinvested in more green infrastructure developments. So far, the rain gardens have replaced more than 44,000 square feet of impervious surfaces and have kept 5,357,233 gallons of stormwater from flooding and polluting Hickey Run. The groups have also planted 420 trees and 182 shrubs, where new populations of birds are starting to move in. “Through this project, I feel like the cemetery has really embraced being a space for people to visit and to enhance their environmental footprint,” Aileen Craig, The Nature Conservancy’s D.C. program director, tells Mental Floss. “They’ve been looking for other initiatives and other ways to continue this work. 
The inspiration that this project has been to the cemetery itself to do more, and to other cemeteries that are part of the Catholic Archdiocese, is very special.” Green-Wood Cemetery: An Evolving Landscape By the mid-19th century, at least six other U.S. cities had established a rural cemetery based on Mount Auburn’s philosophy and design. Green-Wood in Brooklyn, New York, opened in 1838 and encompasses 478 acres of natural hills, valleys, and ponds carved by the retreat of ice age glaciers. Among the knolls and groves lie 570,000 graves marked with statues, busts, monuments, and elaborate mausoleums in a mix of architectural styles, honoring a similar amalgam of deceased New Yorkers, from Samuel F. B. Morse and Louis Comfort Tiffany to Charles Feltman, inventor of the hot dog. Sylvan Water, the largest pond at Green-Wood Cemetery. In addition to its permanent residents, Green-Wood is home to a varied array of living things. It boasts some of the city’s oldest and largest trees, including a former state champion sassafras. An extensive survey of the cemetery’s wildlife and ecosystems, published in 2018, found 129 species of birds, three reptiles, three amphibians, 12 mammals (including six bat species), and more than 90 moths and butterflies [PDF], among other taxa. In practical terms, “Green-Wood is the largest privately owned piece of land in all of New York City,” says Matt Rea, director of strategic partnerships at The Nature Conservancy. Any issues involving the landscape have an impact on the surrounding community and city infrastructure, he says, but efforts to correct them can also have a large-scale effect. Like Mount Olivet, Green-Wood had its own problems with stormwater flooding one of the ponds and frequently overflowing the neighborhood’s combined sewer system (in which sewage and storm runoff move through the same pipes). In 2021, Green-Wood’s vice president of horticulture, Joseph Charap, heard about The Nature Conservancy’s projects at Mount Olivet. “We have 186-year-old infrastructure that was never built to handle the amount of rainfall that we’re receiving today,” he tells Mental Floss. “I said, ‘you know, they’re doing that there, and we have all these water resources and we are connected to the combined sewer’ … there was a direct line of inspiration.” A hermit thrush, one of the dozens of migratory bird species seen at Green-Wood, rests on one of the botanically named path signs. Green-Wood and The Nature Conservancy collaborated to study the overflow patterns, design a stormwater reduction project, and acquire city and state funding to build it. The final plan focused on Sylvan Water, the cemetery’s largest pond, which routinely flooded its service yard and cut off access to one of its entrance gates. The cemetery and its partners settled on a five-pronged project. At Sylvan Water, engineers installed an algorithm-based, adaptive system that predicts how much rainfall will occur in a given storm, then remotely opens a valve to release that amount of water from the pond into the sewer prior to the event. The system allows rainwater to collect in the pond during storms without overwhelming the surrounding infrastructure. Adjacent to Sylvan Water, the cemetery built a reuse vault that filters water from the pond to be used for irrigating the shrubs, grasses, and trees throughout the property. It also constructed a huge, underground tank—like “a giant swimming pool,” Charap says—that can hold 66,000 gallons of stormwater before gradually releasing it into the sewer. 
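To make the pre-release idea at Sylvan Water a little more concrete, here is a minimal, hypothetical sketch in Python: forecast the storm, estimate the inflow the pond will receive, and open the valve early enough to free up that much capacity beforehand. Every name and number below is invented for illustration; this is not Green-Wood's or The Nature Conservancy's actual control software.

GALLONS_PER_INCH_PER_SQFT = 0.623  # one inch of rain on one square foot is roughly 0.623 gallons

def gallons_to_prerelease(forecast_rain_inches: float,
                          contributing_area_sqft: float,
                          available_pond_storage_gal: float) -> float:
    """Estimate how much water to drain from the pond before a forecast storm.

    The idea described in the article: release just enough water ahead of the
    event that the pond can hold the predicted inflow without overflowing into
    the combined sewer while the storm is underway.
    """
    predicted_inflow = (forecast_rain_inches
                        * contributing_area_sqft
                        * GALLONS_PER_INCH_PER_SQFT)
    shortfall = predicted_inflow - available_pond_storage_gal
    return max(0.0, shortfall)  # never a negative release

# Example with made-up numbers: 1.5 inches forecast over a roughly 10-acre
# contributing area (435,600 square feet), with 200,000 gallons of spare
# capacity left in the pond.
if __name__ == "__main__":
    release = gallons_to_prerelease(1.5, 435_600, 200_000)
    print(f"Open the valve early enough to pre-release about {release:,.0f} gallons")

The real system presumably layers weather-service forecasts, live sensor readings, and safety margins on top of this basic volume calculation, but the core trade-off is the same: drain only what the forecast requires, so the pond keeps water for reuse while never overtopping into the sewer.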
At ground level, four large sections of asphalt were replaced with permeable pavers that absorb water, and a small bioswale (a stormwater-catcher similar to a rain garden) was installed. One part of the plan remains to be built: an emergent habitat of native aquatic plants around the edge of the water body that will offer beauty as well as a nurturing environment for insects, birds, and amphibians—including northern spring peepers and northern gray tree frogs, two native frog species that were not seen during the 2018 survey. Those plantings will complement other areas of the cemetery that have been transformed from turf into native wildflower meadows. A spring peeper, a frog that depends on emergent habitats for shelter. With the stormwater system newly finished, it’s too soon to observe how well it’s working. Usually, there’s a three-month learning phase as the system calibrates. But “if we did our jobs as engineers and designers, Green-Wood won’t see anything,” Rea says. “The only thing they will see is that it doesn’t flood anymore like it used to flood.” Like Mount Olivet, Green-Wood aims to be a test of a new approach to reimagining private cemeteries as good environmental stewards, using a variety of funding sources for public benefit. Charap hopes that “Green-Wood can serve as a model for successful, publicly funded projects on private land that have a direct impact on the city at large”—an effect that dovetails with the history of garden cemeteries as the country’s first public green spaces, as places to celebrate life as well as commemorate those who have passed on. “They have this environmental legacy as part of their framework,” he says. “These landscapes have been evolving ever since they were first conceived. Cemeteries are … now reflecting the evolution of views of the natural world, just as much as they do mortality.” Source: From Snakeroot to Salamanders, Life Thrives in Cemeteries Edited October 20, 2024 by DarkRavie 1 Link to comment Share on other sites More sharing options...
DarkRavie Posted October 21, 2024 Author Report Share Posted October 21, 2024 Fact of the Day - CURVED BANANAS, WHY? Did you know.... Considering most fruits have a spherical or ovate shape, the average banana’s long, curved appearance is something of an anomaly. This unique curvature is due to a scientific concept called negative geotropism, where the stem flexes upward as the plant grows, rather than being pulled straight down by the forces of gravity. While most fruits simply absorb sunlight and grow downward toward or into the earth, bananas begin to curve as they strive to find sufficient sunlight to fuel their growth. This has to do with the unique presence of photosensitive hormones called auxins, which influence how bananas react to light. Some bananas grow in lush rainforests with dense canopies, which can obscure the fruit from getting enough light. In these cases, bananas will grow toward the sky to break through the light-blocking canopy. But negative geotropism still occurs even in other environments where there’s plenty of direct sunlight. The auxins are distributed unevenly along the side of the banana facing the sun, triggering accelerated growth on that side and causing the fruit to curve away from the earth’s gravitational pull. In the very early stages of development, bananas actually grow at a straight downward angle, only developing their signature shape later on. As the fruit matures, it will begin to flex upward in search of additional sustenance. But even as this happens, gravitational forces will continue to pull the banana down toward the ground and away from the sun. This combination is what ultimately gives bananas their distinct curve. A visual artist once sold two bananas for $120,000 each. In 2019, visual artist Maurizio Cattelan unveiled a conceptual piece titled “Comedian” at the Art Basel exhibition in Miami Beach. This unusual artistic work consisted of a banana that had been duct-taped to the wall. For years, Cattelan had dreamed of creating a sculpture in the shape of a banana; he often brought a banana with him on his travels and hung it on the wall for inspiration. But eventually, he gave up on the idea of creating a new sculpture and instead decided to exhibit the banana itself. He brought three editions of “Comedian” with him to Miami, two of which immediately sold for $120,000. Given the high level of interest, Cattelan raised the price of the third one to $150,000, which also promptly found a buyer. A week later, performance artist David Datuna ate one of the pricey fruits right off the wall, criticizing the artwork for embodying wealth inequality and food insecurity. Source: Bananas are curved because they grow toward the sun. 1 Link to comment Share on other sites More sharing options...
DarkRavie Posted October 22, 2024 Author Report Share Posted October 22, 2024 Fact of the Day - WASH BEFORE USE Did you know.... New cookware may come with contaminants, such as metal shards and “finishing chemicals.” Stocking your kitchen with new purchases can be as fun as cooking in it. It’s tempting to try out your favorite recipes after unboxing your shiny new pots and pans, but there are good reasons to hold off. According to experts, skipping that preliminary clean is unsanitary. Some kitchen items—like tongs, mugs, and dishes—are displayed unwrapped in retail stores. These items have likely accumulated dust from sitting on the shelf—not to mention germs from shoppers and employees handling them. If you order cookware online and it comes in a package, that doesn’t mean you don’t need to clean it before the first use. New, unwashed utensils can pose food safety hazards to those who eat with them. Brian Chau, the food scientist and food systems analyst behind Chau Time, told Southern Living that “finishing chemicals” are added to kitchenware during the manufacturing process. These chemicals may leave behind residue that isn’t food-grade, so they qualify as contaminants with the potential to cause illness. Certain items may also carry small shards of metal left over from production. Though they may be too tiny to notice, they can cause irritation and pain for those with irritable bowel syndrome even when ingested in small amounts. The best way to ensure your new cookware and dinnerware is food-safe is to clean it before putting it away. This is easy enough for dishwasher-safe goods, but you should always follow the manufacturer’s cleaning instructions to avoid damaging any new additions to your home. Your kitchen isn’t the only spot where you need to worry about tackling germs. If you don’t remember the last time you cleaned your washing machine, it’s likely crawling with bacteria. And, yes—you need to wash new clothing purchases before wearing them the same way you need to clean new plates. Here’s the (very disgusting) reason why. Source: Why You Should Always Wash New Kitchen Items Before Using Them 1 Link to comment Share on other sites More sharing options...
DarkRavie Posted October 23, 2024 Author Report Share Posted October 23, 2024 Fact of the Day - 9 LIVES Did you know... Cats, like people, only live once. As much as pet owners everywhere would love it to be true, it’s a myth that cats actually have nine lives. You’ve probably heard the saying at some point in your life, but where did something so improbable originate? This belief doesn’t have a clear beginning—but we can turn to several parts of history to get an idea of why humans started throwing the phrase around. Cats in Culture The idea that cats have nine lives has been around for a long time—although it’s hard to pinpoint exactly how long. Most sources cite an old English proverb that states: “A cat has nine lives. For three he plays, for three he strays, and for the last three he stays.” There’s no physical evidence of where the proverb originated or how often it was said; it was likely passed down orally. It’s also believed the idea behind the proverb was probably pointing out the unique personalities that cats often possess, and was not meant to be a factual statement about their lifespans. William Shakespeare mentions the actual “nine lives” part of the phrase in Romeo and Juliet with the line, “Good king of cats, nothing but one of your nine lives.” But England isn’t the only potential source. Cats were revered by ancient Egyptians, who believed they had divine energy. According to their culture, the sun god, Atum-Ra, gave birth to the other eight gods and also took the form of a cat. The nine lives connection may have come from this symbolism—Ra plus eight gods equals nine. In China, both cats and the number nine have positive significance, with cats once being worshipped and the number nine representing longevity. Not every place believes nine is the magic number: According to one Reddit thread, people in Germany and some Spanish-speaking nations say that cats have seven lives. This lucky life-extending number may have to do with the significance of those numbers in those cultures. Do cats always land on their feet? Wherever the phrase began, the idea that these animals have nine lives is undoubtedly prevalent today. This most likely has to do with the behaviors cats have that allow them to survive seemingly impossible situations. Cats are very intelligent and defensive; anyone who owns one knows how quick they are to scare or launch an attack. The little furry beasts also have lightning-fast reflexes that allow them to escape dangerous situations quickly. The average feline’s heightened senses are truly incredible as well. Cats have excellent eyesight in the dark, as well as keen hearing and sense of touch. They essentially have every advantage of sensing danger before it reaches them—which gives them extra time to evade life-threatening situations. The phrase cats always land on their feet may play a role in the nine lives idiom as well. While cats don’t actually always land on their feet, they do have a reflex that helps them orient themselves from virtually any position. There’s still some speculation among experts about how this works and how effective it is, but it does seem to be a trait unique to felines. While these characteristics may not be enough to grant extra lives, they certainly help cats get the most out of their current ones. Source: Why Do We Say Cats Have Nine Lives? 1 Link to comment Share on other sites More sharing options...
DarkRavie Posted October 24, 2024 Author Report Share Posted October 24, 2024 Fact of the Day - LIQUOR? A SPIRIT? Did you know... The way your behavior can change after a few shots of tequila might make you feel like you’ve been inhabited by the spirit of someone who was clearly the life of the party during their corporeal heyday. According to VinePair, one theory suggests that we call some liquors spirits because of alcohol’s association with one spirit in particular: the Holy Spirit, which, together with God and Jesus, forms the Holy Trinity in most Christian denominations. This is based primarily on certain places in the Bible where the effects of the Holy Spirit are juxtaposed with alcohol’s effects. In the New Testament, for example, when Jesus’s disciples are “filled with the Holy Spirit” during Pentecost and begin to speak in other languages, some bystanders jokingly write off their strange behavior as a symptom of having drunk too much wine. A more likely explanation has to do with the etymology of the word alcohol, which is thought to have come from either of two old Arabic words. As Scientific American reports, the first option is al-ghawl, which literally means “spirit” and is even mentioned in the Qur’an as a spirit or demon that imbues wine with its intoxicating effects. Though that origin story seems logical enough, the second option, al-koh’l, is pretty plausible, too—and more widely accepted. The word al-koh’l described an eyeliner made from stibnite, a black powdery mineral. Since the method of transforming stibnite into makeup was similar to how people distilled liquids, al-koh’l may have gotten co-opted to mean “anything that was distilled.” And spirit—according to this idea—emerged as an alchemist term to represent volatile substances that got separated in the distillation process. When alcohol showed up in English during the 16th century, it was used to describe a powder before it became the spirit or essence distilled from some other substance, as in “alcohol of wine.” All things considered, it’s not surprising that people eventually just started calling them “spirits.” Source: Why Are Some Liquors Called “Spirits”? 1 Link to comment Share on other sites More sharing options...
DarkRavie Posted October 25, 2024 Author Report Share Posted October 25, 2024 Fact of the Day - PLANTS MAKE SOUNDS? Did you know.... For those of us not blessed with a green thumb, it’d certainly be helpful if our plant friends could tell us when they need attention. Well, it turns out they do — we just can’t hear them. In early 2023, scientists from Tel Aviv University revealed the results of an investigation into whether plants make sounds in ultrasonic frequencies. Previous studies had established that plants can hear sounds, despite not having ears, so it seemed possible that they could create sounds without mouths. After isolating plants in a soundproofed acoustic chamber and a greenhouse and then recording them, the researchers were able to train a machine learning algorithm to differentiate sounds among three disparate plant states: unstressed, cut, or dehydrated. Unstressed plants made little noise and continued along in their usual happy routine of photosynthesizing, but cut and dehydrated plants let out frequent small pops and clicks in a range too high for humans to hear. Stressed plants produced up to 40 of these clicks per hour, while dehydrated plants increased clicks as they got more and more parched. Although tomato and tobacco plants were originally tested, other crops were found to produce similar noises. It’s possible some animals that can hear in frequencies beyond human capabilities could respond to these noises. If a moth were trying to find a suitable plant to lay its eggs, for example, it might skip one that’s popping in distress. But big mysteries remain: For one thing, scientists don’t know how plants are making these sounds in the first place. All we know for sure is that the quiet lives of plants are not nearly as quiet as they seem. Trees can “talk” to one another. Since the mid-19th century, naturalists have often regarded trees as solitary, monolithic figures, but recent research refutes this idea and suggests that trees are remarkably social. That’s because trees in a forest can communicate via a symbiotic relationship known as mycorrhiza. The name, which is Greek for “fungus” and “root,” essentially explains how it works. Fungal threads called mycelium provide nutrients to trees, which in turn deliver sugars generated from photosynthesis. Because mycelium is ubiquitous throughout a forest, it essentially networks trees together — in what some scientists refer to as a “wood-wide web.” Trees can communicate when they are stressed, share information about potential threats, or deliver nutrients to struggling members of the web, especially if they’re in the same family. One study analyzed six different 10,000-square-foot stands of Douglas fir in British Columbia and discovered that nearly all the trees were connected to each other by at most three degrees of separation. They also discovered that one “hub tree,” an older specimen, was connected to at least 47 other trees (and likely many more), including cross-species trees such as the paper birch. Source: Plants make sounds when they’re stressed. 1 Link to comment Share on other sites More sharing options...
DarkRavie Posted October 26, 2024 Author Report Share Posted October 26, 2024 Fact of the Day - HALLOWEEN COSTUMES Did you know... The origins of these classic costumes are probably not what you would expect. What would a witch costume be without a pointy hat? Why do pirates wear so many accessories that would be impractical on the Seven Seas? And how did it come to be that throwing on a bedsheet was all you needed for a ghost costume? The outfits we wear for Halloween have a story to tell—one often far removed from the historical reality they’re said to represent. And if you find yourself looking for a conversation topic with a person dressed as Batman, maybe they’d like to hear how a Renaissance polymath inspired the look. With that in mind, here are the stories behind five of the most popular adult Halloween costumes as predicted by the National Retail Federation. (We’ve skipped over cats, as that costume is fairly self-explanatory.) Witch It’s often said that the standard witch outfit emerged from medieval women called alewives who brewed and sold beer (or ale, as much as that distinction still exists). The story goes that the women selling beer needed the tall hat to help them stand out in a crowd. That’s almost certainly fiction. In her book Ale, Beer, and Brewsters in England: Women's Work in a Changing World, 1300-1600, Judith M. Bennett writes that alewives were often depicted in a negative light, with at least one late poem (circa 1517) describing a fictional alewife doing all sorts of wicked things, including dealing with a witch. And while it doesn’t explicitly identify the alewife as a witch, the implication is likely there. But by 1517, the alewife was in the process of disappearing (at least in England), with Bennett noting that brewing was largely a man’s game by 1600. That’s problematic for two reasons: The first is that, in England, the peak witchcraft trial period was around 1563-1712; it largely occurred throughout continental Europe around the same time. Secondly, during the peak witchcraft trial period, artistic depictions of witches tended to show them as either naked or looking like everybody else in the community. The classic witch outfit doesn’t emerge until the 18th century at the very earliest, when alewives are mostly out of the picture. While it’s possible individual alewives might have been accused of witchcraft, it’s unlikely they created the archetype for witches in general. As for where the outfit does come from—there is no clear answer. One popular explanation is antisemitism, which traces the witches’ hat to the headpiece Jewish people were forced to wear in several countries. People have also proposed that the hat represents a Quaker hat, a capotain (most famous as the “Pilgrim hat”), or even a reference to the goddess Diana. But it’s very possible there’s no deeper meaning to the outfit and it harkens back to those earlier depictions of witches when they wore everyday clothing. There are many 17th century paintings of women in black robes and tall hats with no suggestion of witchcraft. This leads some authors to suggest that in the 17th and 18th centuries, the modern witch’s outfit was a perfectly standard outfit for people to wear. As the outfit started to become a bit out-of-date, the imagery turned into a parody of rural and folksy elderly women and, from there, witches. Vampire Vampires are suave, handsome, and look great in a tux. Unless that vampire is the original Dracula. 
In Bram Stoker’s novel, Dracula is described as “a tall old man, clean shaven save for a long white moustache, and clad in black from head to foot, without a single speck of colour about him anywhere” (elsewhere in the story, a de-aged Dracula is described as having a black moustache and pointed beard, but “His face was not a good face”). According to Smithsonian Magazine, the tuxedo element emerged in the 1924 stage production of the story. Because of the requirements of a visual medium, Dracula’s powers of seduction had to be made visibly evident—hence, a good-looking guy wearing a fancy outfit. This production also gave us the now-iconic large collar on the cape (with the cape itself also being credited to the stage production). According to writer David J. Skal, “Originally, the collar had a distinct theatrical function: to hide the actor’s head when he stood with his back to the house, thus allowing him to slip out of the cape and down a wall panel or trapdoor, effectively disappearing before the audience’s eyes. Though the trick collar had no subsequent purpose in film adaptations, it has become a signature feature of vampire costuming for all time.” Batman Batman co-creator Bob Kane has listed many influences for the character over the years. Zorro is apparent, but Kane also said one of his most important influences was The Bat Whispers, a 1930 film that tells the story of a thief who dresses as a giant bat (ish) to rob his victims (because the movie ends with an entreaty to not reveal the twist ending, this is a spoiler-free summary). A final influence was a Leonardo da Vinci drawing called the “Ornithopter” that, Kane felt, would make the person wearing it look like a giant bat. Except beyond a vaguely bat-inspired framework, Kane’s creation had little in common with the modern superhero. Kane’s was flashier, wearing a Robin-esque mask and a red suit with more explicitly bat-like wings à la the ornithopter. The modern Batman design is more readily attributable to the under-appreciated Bill Finger. According to Kane, “One day I called Bill and said, ‘I have a new character called the Bat-Man and I’ve made some crude, elementary sketches I’d like you to look at.’ He came over and I showed him the drawings. At the time, I only had a small domino mask, like the one Robin later wore, on Batman’s face. Bill said, ‘Why not make him look more like a bat and put a hood on him, and take the eyeballs out and just put slits for eyes to make him look more mysterious?’ At this point, the Bat-Man wore a red union suit; the wings, trunks, and mask were black. I thought that red and black would be a good combination. Bill said that the costume was too bright: ‘Color it dark gray to make it look more ominous.’ The cape looked like two stiff bat wings attached to his arms. As Bill and I talked, we realized that these wings would get cumbersome when Bat-Man was in action, and changed them into a cape, scalloped to look like bat wings when he was fighting or swinging down on a rope. Also, he didn’t have any gloves on, and we added them so that he wouldn’t leave fingerprints.” While there were many other influences—it’s now widely acknowledged the first story and some of the early artwork were reworkings of other media—and some changes over the years (the Batman logo on his chest especially has changed dramatically depending on the editors and artists), Finger’s character design helped forge one of the most memorable and popular superheroes of all time. 
Pirate Compare two artistic depictions of Blackbeard: the first from the 18th century—less than two decades after his death (though it is not necessarily a perfect depiction)—and the second from the early 20th century. The earlier image barely seems like a pirate. Beyond the hat, the many guns, and the smoke, there’s simply not that much that would be out of place in the 21st century. The later picture, however, is much more piratical. There’s a headscarf, a sash, wide pants, even an earring. Why the change? It’s generally credited to one man: Howard Pyle. Pyle was an illustrator in the late 19th/early 20th century, a time that saw Golden Age pirates pop up in comic operas and stories like Treasure Island and Peter Pan. This naturally made Pyle want to illustrate pirates as well, but he didn’t go to the archives for his research. Part of his philosophy was that his illustrations had to stand alone, as he famously noted, “Don’t make it necessary to ask questions about your picture. It’s utterly impossible for you to go to all the newsstands and explain your pictures.” In keeping with that philosophy, Pyle looked elsewhere for his pirates. According to Anne M. Loechle’s Ye Intruders Beware: Fantastical Pirates in the Golden Age of Illustration, Spain was exotic to 19th-century Americans, and even to much of Europe. The country was a popular destination for artists and travel writers. Those people gave accounts that are all but indistinguishable from modern depictions of pirates, with sashes, wide pants, and handkerchiefs around the head. Pyle may have been naturally drawn to the exoticness of Spain while coming up with his designs for pirate outfits. But there might be something more. Pyle was working at a time when tensions between Spain and the United States were increasing, and the pirate can in many ways be contrasted with the era’s stereotypically white Navy man, with Loechle writing “The unexplored, maritime terrain [the pirate] shares with this U.S. sailor highlights their even greater difference: the Navy seaman is a white man; the pirate is racially ambiguous. With his headscarf, wide sash, short pants, and swarthy complexion, he looks nothing like the Anglo-Saxon cowboy or sailor. Instead, American illustrators chose to emulate contemporaneous Spanish gypsies and Spanish genre subjects. The pirate gained popularity despite, or more likely because of, the indeterminate nature of his national and racial identity.” Pyle wasn’t just an illustrator. He also taught other artists, and many of his students went on to create famed pirate images based on his example, forever turning 19th-century Spaniards into the default American image of the pirate. Ghost The origin of the classic bedsheet ghost is traditionally traced to Renaissance-era burial practices. People were buried in a shroud or a winding sheet, often instead of a coffin. This sheet then migrated to the stage. In the early 16th century, beyond some flour to whiten the face, there was little to distinguish ghost characters from non-ghost characters on stage. This began to change by the late 16th century. A visual language emerged, with white sheets coming to represent ghosts (though not necessarily just for dramatic purposes: According to Performing the Unstageable: Success, Imagination, Failure by Karen Quigley, when ghosts show up in Shakespeare’s Richard III, the actors playing the ghosts had other roles and didn’t have time to switch full outfits. 
A sheet over the other costume likely proved a quick fix). And while modern audiences look at the bedsheet ghost as a source of humor and the epitome of the low-effort Halloween costume, in centuries past its ancestor was serious. Deadly serious. There are many accounts of ghost impersonators from the 16th to 19th centuries where it ends badly for either the hoaxer or the victim, whether that’s the hoaxer being beaten to within an inch of their lives or the hoaxed being robbed. One particularly notable example is from 1704, when thief Arthur Chambers is said to have been staying at a house he was planning to rob. The story goes that he then pretended his brother died and got permission to have the coffin brought to the house on its way to the burial. Chambers then wrapped himself in a winding sheet, dusted his face with flour, and hid himself in the coffin. According to one 18th century account he “[arose] from his mansion of death . . . and going downstairs into the kitchen with his winding sheet about him, set himself down in a chair, opposite to the maid, which frighting her out of her wits, she fell a screaming out, and crying ‘a Spirit, a Spirit, a Spirit.’” Chambers made off with 600 pounds’ worth of goods. So how did such a harrowing visage become a punchline? According to Owen Davies in The Haunted: A Social History of Ghosts, in the 1920s and ‘30s, comedians took note of these hoaxes and incorporated them into their bits. This meant in films like Laurel and Hardy’s Habeas Corpus or Buster Keaton’s Neighbors, people somehow got covered by a sheet and were mistaken for ghosts—and while the characters in the film were terrified, the people in the audience were laughing. Davies writes, “As a consequence the slapstick ghost robbed the white sheet of its power to scare. Many millions today believe that the spirits of the dead walk the earth, but surely few people, if confronted with a white sheet on a dark night, would seriously cry ‘Ghost!’ Laurel and Hardy helped put paid to that.” Source: The Surprising History Behind 5 Popular Halloween Costumes 1 Link to comment Share on other sites More sharing options...
DarkRavie Posted October 27, 2024 Author Report Share Posted October 27, 2024 Fact of the Day - GREENER OCEANS Did you know.... Climate change isn’t just raising the temperature of the world’s oceans — it’s also changing their color. As the oceans absorb the excess heat generated by greenhouse gases, that heat is altering the aquatic life in their waters. New research published in Nature in 2023 shows that the familiar blue hue of the oceans has been steadily transforming over the past 20 years into a greener shade, especially in tropical and subtropical areas of the world. The color of the ocean is dependent on a variety of factors, but one key is the light absorption of H2O. Water usually readily absorbs longer wavelengths of light — red, yellow, and green — and scatters blue. However, a concentration of marine life can cause emerald waters. The newly green hue detected by the 2023 research likely reflects a change in the ocean’s phytoplankton — the algae responsible for 70% of the world’s oxygen, and which also provides the foundation for the marine food web. Scientists monitored ocean color using NASA’s Aqua satellite, and found marked shifts toward green in about 56% of the world’s oceans between 2002 and 2022. Statistical simulations showed that added greenhouse gases are to blame, although it’s not exactly clear how, since areas that warmed the most at the surface weren’t the ones that turned green the most. Some scientists theorize that the change may have to do with reduced mixing in the layers of ocean water, caused by the heat, which limits the nutrients that rise to the surface and consequently affects the types of plankton that can survive. But don’t go color-correcting your photos just yet: While satellites can detect the change in ocean hue, the change is slight enough that most humans probably wouldn’t notice a difference. Human eyes are most sensitive to the color green. Human eyes can only perceive a small fraction of the electromagnetic spectrum. The visible spectrum, which excludes things like radio waves, X-rays, infrared, ultraviolet, and (very dangerous) gamma rays, stretches from around 380 nanometers (nm) to about 740 nm. Glimpse a rainbow — or recite the elementary school acronym ROYGBIV — and you’ll notice that green lies right in the middle of our visual sweet spot. The color occupies around 520 to 565 nm, and the light sensitivity of the human eye in daytime peaks at about 555 nm, which is a green that’s close to yellow. Because of this advantageous middle-of-the-road placement, the human eye can discern more shades of green than any other color. Since seeing green is also less of a strain on our visual system, the color positively affects our mood — essentially, our nervous system gets to relax. Green’s innate ability to “placate and pacify” is one reason the hue can often be found in places of healing, especially hospitals. Source: The oceans are becoming greener. 1 Link to comment Share on other sites More sharing options...