
Fact of the Day




Fact of the Day - FINGERNAILS


Did you know.... Fingernails are an amazing biological invention that plays an important and active role in our day-to-day lives, and in the bigger picture of human history. Nails help us grasp and grip objects, which gave earlier Homo sapiens a distinct advantage in manipulating tools and building our modern society. But beyond their primarily utilitarian purpose, nails have also served as status symbols or miniature art canvases in a number of societies throughout history. These six facts explore the biology, history, and artistry of fingernails, and why they’re so intimately tied to the human experience.

 

1. The Only Mammals With Fingernails Are Primates


Among mammals, fingernails are unique to the order of primates; other mammals instead have claws to take down prey or climb trees. Fingernails are essentially flattened claws but are better suited to support broad fingertips, which help some primates swing among tree branches. Homo sapiens developed especially broad fingertips to help grip and manipulate tools some 2.5 million years ago, and nails provide strength and protection for those fingertips. Small blood vessels in the nails maintain blood flow to our fingers even when we’re gripping something very tightly, and the hard covering helps protect against injury. Fingernails also offer protection from viruses and bacteria, aid in fine motor movements (such as scratching or picking), and provide a level of sensation via an intricate network of nerves underneath the nail bed.

 

2. Hair and Nails Are Made of the Same Protein


Human hair and nails (along with the outer layer of our skin, aka the epidermis) are made of a fibrous protein called keratin, which offers structure and helps protect cells against damage. Your body produces it naturally, but foods such as broccoli, kale, salmon, and sweet potatoes may help boost production. Hair is formed from three cylindrical layers of keratin, while nail plates are made of multiple layers of transparent keratin. Alpha-keratin can also be found in animal fur and claws, and beta-keratin (which differs slightly on a molecular level) is present in reptiles and birds.

 

3. Human Nails Grow 1 Nanometer Per Second


Fingernails are always growing. In the second it took you to read the previous sentence, your nails grew 1 nanometer (or one-billionth of a meter). But even with 86,400 seconds in a 24-hour period, it’s virtually impossible to notice any day-to-day growth without a microscope. In a month, the average human’s fingernails grow roughly 3.47 millimeters (and toenails grow even more slowly, gaining an average of just 1.62 millimeters). However, there are a few factors that can affect the speed of nail growth. Some research suggests our nails grow faster when we’re younger, and then slow down as we age. There also appears to be a correlation between faster nail growth and a person’s dominant hand. And many people experience rapid nail growth during pregnancy, due to increases in the hormones estrogen and progesterone.
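For anyone who wants to double-check that conversion, here is a rough back-of-the-envelope calculation in Python (a minimal sketch; both the 1-nanometer-per-second rate and the 3.47-millimeter monthly figure are rounded averages, so they only agree approximately):

# Convert a nail growth rate of ~1 nm per second into mm per month.
# Illustrative arithmetic only; the article's figures are rounded averages.
NM_PER_MM = 1_000_000        # 1 mm = 1,000,000 nm
SECONDS_PER_DAY = 86_400
DAYS_PER_MONTH = 30

rate_nm_per_s = 1.0
mm_per_month = rate_nm_per_s * SECONDS_PER_DAY * DAYS_PER_MONTH / NM_PER_MM
print(f"{mm_per_month:.2f} mm per month")   # prints 2.59, the same ballpark as the ~3.47 mm cited above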

 

4. Manicures Are as Old as History Itself


Manicures can’t be traced to one specific culture, but there’s evidence that they’ve existed in some form for millennia. Archaeologists have discovered Egyptian mummies with gilded nails, and a gold manicure set from Babylonia dating to around 3200 BCE. Some cultures also used henna and kohl to color their fingertips. Around 3000 BCE, the Chinese formulated an early version of nail polish, using gelatin, beeswax, egg whites, and crushed rose petals and orchids to produce different shades of red. The practice eventually fell out of fashion during the Middle Ages, but it made a comeback among wealthy women in Europe during the Renaissance and Victorian eras. Today, the nail care industry is worth billions of dollars, and an estimated 120,000 manicurists and pedicurists work in the U.S. alone.

 

5. Fingernails Can Help Diagnose Certain Diseases


Fingernails can be a great indicator of your overall health. Although a majority of malformed nails can be explained by external injury or poor nutrition or digestion, some abnormalities may be caused by more serious medical conditions. Extreme rounding of the nails, known as clubbing, can be a symptom of oxygen deficiency, for example, possibly related to various lung disorders. Horizontal ridges, known as Beau’s lines, could be a sign of kidney problems or diabetes. The color of your nails can indicate that something is amiss, too. Yellow nails are commonly associated with fungal infections but may also be a sign of thyroid disease, while very white nails may point to liver problems such as hepatitis.

 

6. Fingernails Don’t Really Keep Growing After You Die


There’s a common myth that our fingernails continue to grow even after death, but it’s just that — a myth. When we’re alive, our nails grow at a rate of around 0.1 mm per day (a little more than 3 mm per month), thanks to something called the germinal matrix at the base of the nail. The germinal matrix uses glucose to create new cells that push the old cells up and out toward the fingertip. However, once the human body stops functioning, it also stops producing glucose, which means the matrix can’t create new cells. The origins of this myth may have something to do with a different biological function, though: While our nails don’t continue to grow after death, the dehydrated skin around them does shrink, which can make nails look like they’ve grown longer.
 

 

Source: Amazing Facts About Fingernails


Fact of the Day - U.S. PASSPORTS


Did you know.... American consular officials issued the first passports in the late 1700s. This single sheet of paper was valid for just a few months — a far cry from the current 28-page blue book that’s issued for 10 full years. These days, a passport has become one of the most important documents you’ll ever own, opening up a world of adventures and giving travelers peace of mind on their journeys. But how much do you really know about this critical travel document? Here are six facts you might not know about U.S. passports.

 

1. More Than 143 Million Passports Were in Circulation as of 2020


There were 143,116,633 passports in circulation in 2020, according to the U.S. Department of State. A small proportion of those were second passports, but overall that accounts for more than 40% of the U.S. population of 331 million. However, that hasn’t always been the case: In 1989, just 7,261,711 U.S. passports existed for a population of approximately 245 million, and as recently as 20 years ago, that number still remained stubbornly below 50 million. Despite the significant increase in passport holders, the U.S. figure still falls short of the percentage in many other countries. For example, the Australian government reported that 4,614,941 Australians held a passport in 2019, equivalent to 57% of the population. According to the 2011 census (the most recent data available), just 15% of the 63 million people living in the U.K. did not have a passport. One of them was the Queen — who doesn’t need one because it is she who issues them. Several factors might explain the discrepancies. Thanks to the rapid growth of budget airlines, travel within Europe can be extraordinarily cheap, but a passport is needed to access those international flights. In contrast, visiting many foreign destinations from the U.S. requires a long and relatively expensive flight. The significant differences in climate, topography, and culture make international travel enticing to many foreign nationals, but in the U.S., there’s already a great deal of variety within its borders. Paid holiday entitlements also vary considerably between countries, as does the culture and tradition of gap year travel.

 

2. U.S. Passport Holders Can Travel to 186 Countries Without Having to Arrange a Visa


The strength of a nation’s passport can be measured in the number of countries its holders are entitled to enter under normal circumstances — either visa-free or by purchasing a visa on arrival (and not having to arrange one in advance). According to the Henley Passport Index, which is based on data from the International Air Transport Association, the U.S. passport currently allows access to 186 countries. That ranks No. 7 among all countries, and is on par with New Zealand, Belgium, Norway, and Switzerland. Japan tops the 2022 chart with 193 countries, closely followed by Singapore and South Korea with 192. As countries alter their entry requirements, the rankings change; the U.S. most recently took first place in 2014.  

 

3. The Cover Color Is the World’s Most Common, But It Hasn’t Always Been Blue


According to the Passport Index from Arton Capital, the world’s passports can be grouped into shades of just four colors: red, blue, green, and black. Blue, the current choice of the U.S. government, is the most popular, preferred by 83 nations. Red is second on the list with 65 countries, followed by green and black with 44 and 7 countries, respectively. Passport cover colors don’t always stay the same, either: The U.S. passport has actually been all four at some point in its history. In his book The Passport: The History of Man’s Most Travelled Document, author Martin Lloyd notes that America’s first modern-style passport, issued starting in 1926, was red. Green replaced red from 1941 to 1976, when today’s blue option was introduced to match the flag in honor of the country’s bicentennial celebrations. There are a few exceptions: Diplomats use black passports, and anyone traveling without diplomatic status but who is on government business uses red documents.

 

4. You Can’t Use Just Any Old Photograph


Passport photos have to meet a long list of conditions today, but that wasn’t always the case. In the early days of passports, there were few restrictions on the photograph you could use. People posed with family members and pets, smoked cigarettes, or played musical instruments. Today’s photos must be recent, taken in the last six months, and set against a plain white or off-white background. If you wear glasses, you must remove them unless you can prove you have a medical exemption. No filters or selfies, either. The list doesn’t end there: Don’t even think about wearing a uniform or camouflage gear. Ditch the hat, unless it’s worn because of your religion, and lose your headphones. Jewelry and piercings are considered acceptable, so long as they don’t obscure your facial features, as are permanent tattoos. If your physical appearance changes significantly, it’s likely you’ll need a replacement passport.

 

5. The Artwork in the Current U.S. Passport Protects Against Forgery


A 2016 makeover of the U.S. passport moved the machine-readable chip, which contains biometric data about the holder, inside polycarbonate paper to make it more secure. The practice of adding extra pages was banned. Inks, too, got cleverer — depending on the viewing angle, the word “U.S.A.” in the current passport looks green or gold. Even the hot foil stamping on the cover is a feature that aims to make forgery trickier. The passport’s artwork is also used to frustrate potential forgers. The current passport design, dubbed “American Icon,” features a wide range of patriotic images including the Statue of Liberty and Mount Rushmore, bison, bears, bald eagles, and longhorn cattle. The ideals and ethos of a nation are summed up through excerpts from the Declaration of Independence and rousing quotes from Martin Luther King, Jr., Anna Julia Cooper, Henry Ward Beecher, and former presidents. For really mind-blowing passport artwork, however, take a look at some of the passports issued by the Nordic nations. Shining Norway’s passport under a UV light reveals a hidden image of the Northern Lights. Finnish passports act like flip-books: Flip the passport pages quickly enough and the pictures create a moving image of a moose. Staying one step ahead of forgers is a constant battle, and another, more advanced U.S. passport redesign may soon be in the works.

 

 

Source: Things You Might Not Know About U.S. Passports


Fact of the Day - IDES OF MARCH


Did you know... As mid-March approaches, you’ll no doubt hear the oft-repeated saying “Beware the ides of March.” It’s a strangely archaic phrase that doesn’t make much sense to modern ears without knowing some important historical context, as well as the ins and outs of ancient moon-based calendars  — what are “ides,” anyway? Here are six amazing facts about this famous phrase, and its relation to arguably one of the most important moments in Western history.

 

1. The Phrase Comes From William Shakespeare

In Act 1, Scene 2 of William Shakespeare’s Julius Caesar, Roman politician (and future assassin) Marcus Junius Brutus and the play’s eponymous character are approached through a crowd by a soothsayer who has a warning — “Beware the ides of March.” The two Romans dismiss the fortuneteller as a “dreamer” and go about their business as usual. Of course, the warning proved deadly accurate; for the Romans, the “ides” was the middle of the month, and Julius Caesar was famously assassinated on March 15, 44 BCE. Roman historians say that in reality (not just Shakespeare’s fictionalized version), the soothsayer’s name was Spurinna. He was Etruscan, an ancient people often associated with divination, and served as a haruspex — someone who inspects the entrails of sacrificed animals for clues about the future. However, there’s no record of Spurinna pinpointing the ides of March specifically; instead, he warned Caesar to be wary of the next month generally, a period that would end on March 15. Scholars believe this was likely just a calculated guess, as Roman politicians were already turning against Caesar, who had been named dictator for life, and the famed military leader was leaving the capital for another military campaign on March 18. If Caesar was going to be assassinated, it would likely be in the month of March.

 

2. The “Ides” Were Part of Rome’s Archaic Lunar Calendar


Although the phrase “the ides of March” carries with it a sinister connotation because of the bloody business done on that day two millennia ago, the ides — along with the nones and kalends — are simply ancient markers of the moon’s phases that were part of Rome’s lunar calendar. “Kalends” referred to the new moon (or first of the month), “ides” meant the middle of the month (the 13th in some months and the 15th in others), and “nones” referred to the quarter moon. For a time, the ides of March was actually the beginning of the new year in Rome.

 

3. Caesar Himself Got Rid of Ides Entirely

Although the ides of March is closely associated with Julius Caesar, the famous Roman leader was directly responsible for tossing out the old, lunar-based calendar entirely. In 45 BCE, Caesar — after consulting top mathematicians and astronomers — instituted the solar-based Julian calendar, a timekeeping system remarkably similar to the calendar we use today. To implement the new system, Caesar created what has since become known as “the year of confusion,” in which the year 46 BCE lasted for 445 days so the new Julian calendar could begin on January 1. One scholar even argues that this drastic change could’ve been seen by conspiratorial senators as an attack on Roman tradition, and the assassins might’ve purposefully selected the “ides of March” as a symbolic gesture against Caesar and his reforms.

 

4. Every Year Romans Reenact Caesar’s Assassination on March 15


Every year (barring worldwide pandemics) Romans reenact the murderous drama that unfolded near the Curia of Pompey two millennia ago. (A curia is a structure where Roman senate members would meet.) However, it wasn’t until 2015 that members of the Roman Historical Group got the chance to recreate Caesar’s final moments on the exact spot where the assassination happened, after finally getting access to the ruins of the curia itself. The reenactment generally unfolds in three parts — first with the senators’ accusations, followed by Caesar’s actual assassination, and then concluding with speeches from both Brutus and Mark Antony justifying their actions. In an interview with NBC News, the Caesar impersonator said this annual bit of theater is about honoring the ancient leader, because “Rome wouldn’t have been as great without him.”

 

5. Caesar Was Deified as a Roman God

Although the Roman pantheon was largely borrowed from ancient Greece, Rome added a few deified originals of its own. One of the most important was the two-faced Janus, the god of doorways and transitions and the namesake of the month of January. But Rome also deified many of its most important leaders, and named months after some of them. After Caesar’s death on the ides of March, a Roman cult known as divus Julius pushed for Caesar’s official divinity. Caesar’s adopted heir, Octavian (known to history as Augustus), later became Rome’s first emperor and similarly received the divinity treatment. The effects of this Roman imperial cult can be seen in today’s calendar, as July and August are named for the two ancient rulers.

 

6. The Location of Caesar’s Murder Is Now a Cat Sanctuary


The Curia of Pompey used to be home to the hustle and bustle of toga-wearing senators going about the business of empire, but it’s now the domain of cats. First excavated in 1929, during Benito Mussolini’s rule, the Largo di Torre Argentina houses the remains of the curia where Caesar met his end, as well as the ruins of several temples. However, today the Colonia Felina di Torre Argentina takes care of more than 100 cats that prowl the ancient grounds. Although visitors can glimpse the ruins from street level some 20 feet above ground, only cats are usually allowed to slink among the grounds where the ides of March earned its infamous reputation.
 

 

Source: Intriguing Facts About the Ides of March


Fact of the Day - UNSUNG HEROES


Did you know.... Even those unfamiliar with the details of the civil rights movement of the 1950s and '60s can point to Martin Luther King Jr. and Rosa Parks as key figures of the era. A deeper dive will reveal names including A. Philip Randolph, James Farmer, Whitney M. Young, and Medgar Evers — leaders who earned their place among the luminaries of the period for spurring legal and social progress. However, not everyone earned due recognition for their contributions, whether because of personality clashes or deep-seated prejudices that went beyond matters of race. Here are five lesser-known civil rights influencers who helped to change the course of history.

 

1. Bayard Rustin


Well before the nation watched the struggle for Black equality unfold on television, Bayard Rustin was at the forefront of a previous generation of activists as a co-founder of the Congress of Racial Equality (CORE). CORE’s main objective was to use “nonviolent direct action” while fighting for civil rights. Rustin later helped Martin Luther King launch the Southern Christian Leadership Conference (SCLC), and is credited as a primary organizing force behind the 1957 Prayer Pilgrimage for Freedom and the 1963 March on Washington. But Rustin was also an openly gay man, and as such, was always in danger of being marginalized despite his obvious brilliance as an adviser and strategist. He was forced out of the SCLC after a congressman threatened to spread rumors about an affair between King and Rustin, and while he returned to pull together the March on Washington, internal opposition forced him to accept a lesser public role in the proceedings. Rustin later served as president and co-chair of the A. Philip Randolph Institute, and continued his push for economic progress even as the wider public movement lost steam. By the time of his death in 1987, Rustin was something of a historical footnote, despite having his fingerprints all over the major civil rights victories of his day.

 

2. Claudette Colvin


Nine months before Parks was arrested for refusing to surrender her bus seat to a white passenger in Montgomery, Alabama, the same thing happened to 15-year-old Claudette Colvin. So why was the Parks incident the one that ignited the Montgomery bus boycott and transformed the issue into a national story? As Colvin herself later conceded, the then-42-year-old Parks, a secretary for the NAACP, was considered by some to be a more respectable symbol for the boycott, particularly after it was discovered that the unwed teenager had become pregnant. Nevertheless, Colvin wound up playing a crucial role as events unfolded, as she was named a plaintiff in the 1956 Browder v. Gayle case that challenged the constitutionality of Alabama's segregated buses and provided the legal backbone for the boycott's triumph. Colvin left Alabama soon after and spent most of the following decades living anonymously in New York City, though her contributions have finally earned some long-overdue recognition in recent years.

 

3. Fannie Lou Hamer


If King served as the face and eloquent voice of the civil rights struggle, then Fannie Lou Hamer represented its rank-and-file members who were sparked to action because they were "sick and tired of being sick and tired." Born into a Mississippi family of sharecroppers, Hamer was fired after attempting to register to vote in 1962. She used that experience to fuel a tireless dedication to voting rights and launch the Mississippi Freedom Democratic Party (MFDP) in 1964. That summer, Hamer entered the national spotlight with a powerful speech before the Democratic National Convention's credentials committee in which she recalled being subjected to a brutal beating in jail. But her presence also underscored the limitations of her position in the pecking order; President Lyndon B. Johnson dismissed her as an "illiterate woman," and even ostensible ally Roy Wilkins of the NAACP said she was "ignorant." Still, Hamer kept up the fight for equal rights even as she struggled to summon the respect she deserved. She later spearheaded the foundation of the Freedom Farm Cooperative in 1969 and the National Women's Political Caucus in 1971.

 

4. James Meredith


Even when compared to other activists who overcame intimidation and violence to participate in demonstrations, James Meredith stands out for his astonishing displays of courage. In the fall of 1962, the 29-year-old Air Force veteran integrated the University of Mississippi. His mere presence at the university caused an uproar and ignited a massive riot that drew 30,000 U.S. troops, federal marshals, and national guardsmen into the fray. Four years later, Meredith embarked on a solo "March Against Fear" out of Memphis, Tennessee, but was shot before he could complete the planned 220-mile walk to Jackson, Mississippi. While he drew praise from King, most notably in the famed "Letter from Birmingham Jail," Meredith was never one to conform to the expectations of others. In 1967, he raised eyebrows by endorsing the reelection campaign of former Mississippi Governor Ross Barnett, who once vehemently opposed Meredith's entry into the state's flagship university. Two decades later, after several failed attempts to run for office, Meredith supported the Louisiana gubernatorial campaign of former KKK grand wizard David Duke. Today, a statue commemorating Meredith's achievement stands on the Ole Miss campus, though the rest of his complicated story is often omitted from history lessons.

 

5. Pauli Murray


Pauli Murray was enormously influential as a lawyer, writer, and teacher. She became California's first Black deputy attorney general in 1945, as well as the first African American to earn a Doctor of Juridical Science from Yale Law School two decades later. Additionally, the acclaimed scholar saw her legal arguments used in the groundbreaking cases of Brown v. Board of Education (1954), which struck down segregation in public schools, and Reed v. Reed (1971), which extended the rights under the 14th Amendment's Equal Protection Clause to women. Publicly critical of the sexism rife within the ranks of the civil rights movement, Murray helped launch the National Organization for Women (NOW) in 1966. Eventually, she found herself out of step with its leadership and stepped away. On her own once again, Murray resigned from her teaching post and entered New York's General Theological Seminary, en route to one final historic achievement in 1977 as the first African American woman to be ordained as an Episcopal priest.

 

 

Source: Unsung Heroes of the Civil Rights Movement


Fact of the Day - DINOSAURS


Did you know... For nearly 200 million years, Earth was the domain of the dinosaurs. Although many people picture giant, green-skinned reptiles roaming the hothouse jungles of the Mesozoic, dinosaurs were incredibly varied creatures — large and small, warm- and cold-blooded — and roamed every continent (yes, including Antarctica). But with some 66 million years or so of separation between humans and dinosaurs, and with many of these wondrous creatures’ secrets hidden away under layers of rock, paleontologists are still trying to understand these amazing beings. Here are six fascinating facts about dinosaurs that debunk long-lasting myths, and explain why paleontology is one of the most exciting scientific fields today.

 

1. An Asteroid Didn’t Kill All the Dinosaurs

According to the prevailing theory among scientists, some 66 million years ago, an asteroid we now call Chicxulub slammed into the coast off the Yucatan Peninsula, triggering Earth’s fifth mass extinction in its more than 4 billion-year-long history. The debris ejected into the atmosphere streaked through the sky, and the resulting friction superheated the atmosphere, causing forest fires around the globe. After a prolonged winter caused by a thick haze of ash blotting out the sun, some 75% of all living species on Earth went extinct. Although many of those species were land-dwelling dinosaurs, one group largely survived the devastation — beaked avian dinosaurs known today as birds. The first avian dinosaur, Archaeopteryx, popped up around 150 million years ago. This proto-bird had teeth, though through evolution, a subset of these flying dinos traded teeth for beaks. Some scientists theorize that these beaks gave birds a post-apocalyptic advantage, because they could more easily dine on the hearty nuts and seeds found throughout the world’s destroyed forests.

 

2. Science Is Still Debating the Existence of the Brontosaurus


Paleontologists have been debating the existence of the giant sauropod named Brontosaurus for nearly 150 years. The story starts during the fast-and-loose “Bone Wars” period of paleontology in the late 19th century. During that time, a bitter rivalry developed between American paleontologists Edward Drinker Cope and Othniel Charles Marsh. It was Marsh who discovered the skeleton of a long-necked Apatosaurus in 1877, but the fossil was missing its skull. Marsh incorrectly paired the body with the skull of another dinosaur (likely a Camarasaurus). Two years later, when a more complete Apatosaurus skeleton wound up in his possession, the specimen was unrecognizable compared to Marsh’s Frankenstein dino, so he instead created a whole new species — Brontosaurus, meaning “thunder lizard.” Scientists spotted the mistake in 1903, but the name stuck in the public’s mind. A century later, however, scientists examining more fossils determined that a close cousin of Apatosaurus with a thinner, less robust neck did exist, and resurrected the name Brontosaurus to describe it. Still, not all paleontologists accept the revived name for the genus — as beloved as it is.

 

3. Dinosaurs Didn’t Live in Water

Although many aquatic reptiles existed during the Age of the Dinosaurs, they were not dinosaurs. The most famous of these water-dwelling creatures was Ichthyosaurus, which is actually a distinct marine vertebrate — not a dino. The term “dinosaur” instead mostly refers to terrestrial reptiles that walked with their legs under them (not splayed to the side like crocodilians). Other factors such as foot and neck size also help define what is and isn’t a dinosaur. Although nearly all dinosaurs were terrestrial, a few lived a semi-aquatic existence. The Spinosaurus, which lived 99 million to 93 million years ago, shows evidence of eating fish, and Ankylosaurus lived near coastlines.
Similarly, species like the flying pterosaurs (often called pterodactyls) — which could be as large as a fighter jet or as small as a paper airplane — are distant cousins of dinosaurs, not dinosaurs themselves, although media coverage frequently refers to them that way.

 

4. Dinosaurs and Mammals Coexisted


Mammals and dinosaurs coexisted during most of the Mesozoic Era (252 million to 66 million years ago). The earliest known mammals, the morganucodontids, appeared around 200 million years ago and were about the size of a shrew. During the Age of the Dinosaurs, mammals remained small, never really exceeding the size of a badger, and were a go-to food source for carnivorous dinos (though sometimes the opposite was also true). Things changed when a giant asteroid smacked into Earth at the end of the Cretaceous period. Mammals’ small size meant they could burrow underground and escape scorching surface temperatures. As for food, mammals were perfectly content with eating insects and aquatic plant life (which also survived the asteroid’s impact), while large herbivorous dinosaurs went hungry. Over the next 25 million years, mammals underwent a drastic growth spurt as the Age of Mammals began to take shape.

 

5. The Film "Jurassic Park" Is a Bit of a Misnomer

The entry point for many into the world of dinosaurs is Steven Spielberg’s 1993 film Jurassic Park, which inspired an entire generation of paleontologists. Despite its outsized impact on the field, the film does get a few things wrong about dinosaurs. For one, many dinosaurs are now thought to have sported feathers, whereas Jurassic Park’s dinos represent the lizard-esque depiction popular in times past. Also, the film’s very name is a misnomer, as the dinosaurs that take up the most screen time — such as the Tyrannosaurus rex, Velociraptor, and Triceratops — all lived during the Cretaceous period (145 million to 66 million years ago). This may seem like a small difference, but the Age of the Dinosaurs was surprisingly long. In fact, the T. rex lived closer to humans, separated by more than 60 million years, than to the Stegosaurus, which lived in the Jurassic period some 80 million years before the “king of the tyrant lizards.”

 

6. We’re Living in a Golden Age of Dinosaur Discovery


Paleontology is far from a static field. Every year, an estimated 50 new dinosaur species are discovered — that’s basically a new dinosaur every week. Roughly half of those species are being discovered in China, a country that only recently opened up to paleontological pursuits. Technology has also upended the field, with CT scans able to examine the interiors of dino skulls, while other tomographic imaging techniques can render 3D recreations of bones. Dinosaurs may be creatures buried in Earth’s geological past, but uncovering that past has a bright and exciting future.

 

 

Source: Fascinating Facts About the World of Dinosaurs


Fact of the Day - COOKIES


Did you know.... The Girl Scouts organization is known for exuding compassion, promoting leadership, and perhaps most famously of all, selling cookies. Since the group was established in 1912 by Juliette Gordon Low, Girl Scouting has blossomed into a global movement — a far cry from its humble origins as a single troop of 18 girls in Savannah, Georgia. In the United States, Girl Scouts raise money for their cause by selling their highly popular and ultra-decadent namesake brand of cookies. In honor of those mouthwatering snacks (which are on sale now!), here are six delectable facts about Girl Scout Cookies to sink your teeth into.

 

1. There Are Three Mandatory Flavors Sold Each Year


Though there have been many changes to the kinds of Girl Scout Cookies sold over the decades, three stalwart flavors are mandated each year: Thin Mints, Do-si-dos (also called Peanut Butter Sandwiches), and Trefoils. None of these varieties existed in their current form in the earliest years of cookie sales, but a version of Thin Mints can be traced back to 1939, when troops started selling a flavor known as “Cooky-Mints.” By the 1950s, shortbread had joined the lineup, alongside the renamed Chocolate Mints and sandwich cookies in vanilla and chocolate varieties. Peanut Butter Sandwiches hit the scene soon after, and by 1966, all three of the aforementioned flavors were among the group’s bestsellers. Other cookies came and went in the decades that followed, but Thin Mints, Do-si-dos, and Trefoils have been staples since the 1970s — and for good reason. Thin Mints are the Girl Scouts’ No. 1 bestselling cookie variety, and the most searched-for Girl Scout Cookies in the majority of U.S. states. Do-si-dos rank fifth in sales (after Samoas/Caramel deLites, Peanut Butter Patties/Tagalongs, and Adventurefuls), and Trefoils feature a version of the Girl Scout logo and were inspired by the original Girl Scout Cookie recipe.

 

2. The “Cookie Queen” Sold 100,000 Boxes


Elizabeth Brinton may not be a household name, but she’s a legend among Girl Scout Cookie sellers. From 1978 to 1990, Brinton sold 100,000 boxes of cookies before ultimately hanging up what she called her “cookie coat.” She began by selling cookies door to door, but in 1985 she pivoted to setting up shop at a local Virginia metro station to sell the treats to passengers during rush hour. Brinton sold 11,200 boxes in that year alone, and was soon dubbed the “Cookie Queen” by the media. She went on to set the record for the most Girl Scout Cookies sold in a single year, with 18,000 boxes, though that number was nearly doubled in 2021 by Girl Scout Lilly Bumpus, who sold a staggering 32,484 boxes. Brinton’s career record of 100,000 boxes has since been surpassed, too, but the Girl Scout who broke it, Katie Francis, actually consulted the Cookie Queen for advice. Brinton told Francis to “think outside of the box” — a maxim that served her well back in the 1980s. In 1985, Brinton wrote to her local congressman, Frank Wolf, to ask for his help in selling cookies to then-President Ronald Reagan, and in 1986, Wolf accompanied her to the White House, where she sold one box of every flavor to President Reagan. She also sold a few boxes to Reagan’s Vice President, George H.W. Bush, and Supreme Court Justices Sandra Day O’Connor, Harry A. Blackmun, and William H. Rehnquist.

 

3. Girl Scouts Sold Calendars Instead of Cookies During World War II


Due to wartime shortages, the Girl Scouts briefly pivoted away from the culinary world during World War II. The U.S. government began rationing sugar in May 1942, and butter in March 1943 — both integral ingredients in the Girl Scout Cookie creation process. Because of this, the Girl Scouts had trouble filling orders, though in certain instances local troops were supplied ingredients by benefactors, or Girl Scouts baked cookies specifically for members of the military. Most troops, however, had to find other ways to raise money, so in 1944, the Girl Scout National Equipment Service began producing calendars to be sold for 25 cents. Fortunately for both the Scouts and their customers, the cookie drought was only temporary. By 1946, ingredients were no longer being rationed, and cookie sales resumed and then grew; by 1950, the line of Girl Scout Cookies had been expanded to add new flavors.

 

4. Girl Scout Cookies Were Originally Homemade


It may be hard to fathom today, given the sheer breadth of the current cookie operation, but Girl Scout Cookies were originally homemade. A troop in Muskogee, Oklahoma, baked and sold the first cookies in a school cafeteria in 1917, and other troops soon followed suit. A few years later, in 1922, the magazine The American Girl published a recipe to be used by Girl Scouts all over the country. It was just a simple sugar cookie containing butter, sugar, milk, eggs, vanilla, flour, and baking powder, but it was a hit with consumers. Throughout the 1920s, Girl Scout Cookies were baked by troop members with help from their parents and members of the local community. The treats were subsequently packaged in wax paper, sealed with a sticker, and sold for 25 to 35 cents per dozen. It wasn’t until 1934 that the Girl Scouts of Greater Philadelphia Council became the first council to sell commercially baked cookies; within two years, the national organization began licensing the cookie-making process to commercial bakeries.

 

5. Girl Scout Cookies Differ Slightly Depending on Which Bakery Made Them


In the late 1940s, 29 bakers were licensed to make Girl Scout Cookies. Today, Girl Scouts get their goods from just two licensed bakeries: ABC Bakers in Virginia and Little Brownie Bakers in Kentucky. Depending on which bakery produces the cookies your local troop sells, you may find that the snacks have slightly different names. For instance, Tampa residents receive Samoas from Little Brownie Bakers, whereas people who live just a few hours away in Orlando chow down on the virtually identical Caramel deLites from ABC Bakers. And it’s not just the branding that may differ from city to city. Cookies might also look or taste different due to minor discrepancies in each bakery’s recipes. For example, ABC’s Thin Mints are crunchier and mintier than Little Brownie’s richer and chocolatier version, and Caramel deLites are heavier on the coconut flavor than Samoas. A few cookies are also specific to one bakery: Currently, S’mores are made only by Little Brownie Bakers, while Lemonades are exclusive to ABC Bakers. (Little Brownie has a completely different lemon cookie called Lemon-Ups.) No matter which bakery provides the cookies, though, you’re in for an indulgent treat.

 

6. Over 50 Flavors Have Been Discontinued


Some Girl Scout Cookie flavors are likely never to go away, due to their enduring popularity, but not all cookies are so lucky. Some 51 former varieties have come and gone in the decades since the snacks were first introduced. That’s not to say these bygone flavors didn’t have their fans, of course; many people look back fondly upon these scrumptious but discontinued treats, which include Kookaburras, a combination of Rice Krispies and chocolate, and Golden Yangles, a savory cheddar cheese cracker. There’s always the possibility of a comeback, though, as Lemon Chalet Cremes made a brief return in 2007 after having been phased out in the 1990s. It was a short-lived run, but you can still hold out hope that your favorite former flavor may return someday.

 

 

Source: Delectable Facts About Girl Scout Cookies


Fact of the Day - ELEVATORS


Did you know.... Today, riding an elevator is a mundane activity, but little more than two centuries ago, these mechanical contraptions were steam-powered, death-defying wonders. In the years since, these mostly unseen pieces of urban infrastructure have become a key part of what makes modern cities possible. Without them, a city’s upward trajectory would be impossible, and the design of our world would be unimaginably different. Here are six amazing facts about the humble elevator, from its surprisingly ancient origins to the many places it may take us in the future.

 

1. Greek Mathematician Archimedes Invented an Elevator in 236 BCE

The elevator is a surprisingly old invention. According to writings from the ancient Roman engineer Vitruvius (the same Vitruvius who inspired Leonardo da Vinci’s “Vitruvian Man”), the Greek mathematician Archimedes invented a primitive elevator back in 236 BCE. Archimedes’ contraption bore little resemblance to today’s people-movers: It worked via manpower, with ropes drawn around a drum that was then turned by a capstan, a large revolving cylinder often used to wind ropes on ships. Although the attribution was written after Archimedes’ death, the invention makes sense for the great Greek thinker, who was famous for his exploration of compound pulley systems. Elevators join the list of other surprising ancient inventions, including such wonders as the world’s first steam engine and the world’s first computer.

 

2. Before the Modern Elevator, Top Floors Were Undesirable


Today the most luxurious high rises are crowned with multimillion-dollar penthouses, but before the rise of elevators (pun intended), the most desirable floors were those closest to the ground. The first building to include elevators at the design stage was the 130-foot Equitable Life Building in downtown Manhattan, which was built in 1870. Society was slow to adjust to the elevator, and the building was designed to look like it had fewer floors than it did. Also, the insurance company that worked out of the building still occupied the “valuable” lower floors, while the custodian enjoyed the upper floors. The era of the penthouse didn’t arrive in full swing until the 1920s, when the decade’s economic boom brought a flurry of construction projects to New York City and other cities around the world.

 

3. An American Inventor Created the First Modern Passenger Elevator

A key part of the very first passenger elevator was invented by Elisha Graves Otis, who founded the Otis Elevator Company, a manufacturer still in business today. Otis invented a safety device that would prevent an elevator car from falling if the cable broke. Before Otis’ invention, elevators were dangerous contraptions primarily reserved for moving cargo in factories, warehouses, and mines. In 1854, Otis introduced his “safety elevator” at New York City’s Crystal Palace, also known as the “Exhibition of the Industry of All Nations,” where he asked someone to cut the rope that was holding him up. Once the rope was cut, the platform dropped only a few inches before the safety mechanism caught it. This demonstration helped sway public opinion by showing that elevators could be a safe means of vertical transportation. Today, elevators are considered statistically safer than stairs.

 

4. People Once Trained for Years To Be Elevator Operators


Although Elisha Otis invented a safer elevator, that didn’t mean the device was foolproof. For decades, operating an elevator was considered a highly skilled job that required years of study in some parts of the world, such as Germany. In the late 19th century, elevators were operated using “shipper ropes,” and operators were trained on the precise timing of pulling these ropes to arrive at the right floor. A well-trained operator was highly desirable, since they made the difference between a smooth ride and a death-defying jumble of starts and stops. Over the decades, the job of the elevator operator became increasingly automated. In 1887, American inventor Alexander Miles designed the first automatic elevator doors, after reading about several accidents involving people falling down elevator shafts. But it wasn’t until the 1960s — a little over a century after Elisha Otis introduced the first safety elevator — that automated elevator cars began to replace human operators entirely.

 

5. The Fastest Elevator in the World Travels Up to 67 Feet Per Second

In the early days, elevators could only travel at about 40 feet per minute. After some 150 years of innovation, the world’s fastest elevator can now travel 67 feet in a second (or around 46 miles per hour). This elevator is located in Shanghai Tower in China, which also includes the longest continuous elevator run, at 1,898 feet. Originally installed by the Japanese company Mitsubishi Electric in 2015, the elevator got an upgrade in 2016, allowing it to traverse a path from the second-level basement to the tower’s 119th floor in just 53 seconds. The elevator in the CTF Finance Center, also located in China, comes in a very close second, traveling at 65 feet per second.
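As a quick sanity check on those numbers (illustrative arithmetic only, using the figures quoted above), the top speed and the average speed over the full run can be worked out in a few lines of Python:

# Convert the quoted top speed from feet per second to miles per hour,
# and estimate the average speed over the 1,898-foot, 53-second run.
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

top_speed_fps = 67
print(top_speed_fps * SECONDS_PER_HOUR / FEET_PER_MILE)   # ~45.7 mph, i.e., "around 46 miles per hour"

run_length_ft = 1898
run_time_s = 53
print(run_length_ft / run_time_s)   # ~35.8 ft/s average, lower than the top speed
                                    # because the car has to accelerate and brake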

 

6. German Engineers Designed a Sideways Elevator in 2017

 

Since their invention two millennia ago, elevators have done just two things — go up and go down. However, in 2017 a German elevator company began testing an elevator that can travel in any direction. Nicknamed the “Wonkavator” after the multidirectional elevator seen in 1971’s Willy Wonka & the Chocolate Factory, the machine was hailed as “the biggest development in the elevator industry” since the device’s invention. However, a sideways elevator is only the beginning of what’s in store for the technology’s future. Scientists (and sci-fi writers) have also hypothesized about the feasibility of a space elevator that can ferry future astronauts from the Earth’s surface to outer space — completely forgoing the need for expensive, pollution-belching rockets.

 

 

Source: Amazing Facts About the Ups and Downs of Elevators


Fact of the Day - GRACE KELLY


Did you know.... Although she appeared in just 11 feature films, Grace Kelly endures as a larger-than-life figure due to her magnetic screen presence, her impeccable fashion sense, and a fairy-tale marriage that whisked her from Tinseltown to the royal palace of a glamorous European city-state at the height of her career. Here are seven facts about a leading lady who lived a life seemingly scripted by the Hollywood machine she left behind.

 

1. Grace Kelly Hailed From an Accomplished Family

The Philadelphia-based Kelly clan was a group of high achievers: Grace's father, Jack Sr., won three Olympic gold medals for rowing, earned a fortune from his construction business, and had significant political connections; her mother, Margaret, was a model and the first woman to teach physical education at the University of Pennsylvania. Two of Grace's uncles also enjoyed success in the entertainment industry: Walter Kelly was a vaudeville star whose career stretched to the advent of talking pictures, and George Kelly, who served as a valuable mentor to his niece, was a Pulitzer Prize-winning playwright.

 

2. Pre-Fame, Grace Kelly Was a Highly Paid Model


Despite her parents' finances (and because they disapproved of her acting ambitions), a teenage Kelly insisted on paying her own tuition to attend New York City's American Academy of Dramatic Arts in the late 1940s. Fortunately for her, the beauty and poise that soon became familiar to theater audiences were already apparent, and Kelly quickly found work with the John Robert Powers modeling agency. According to Donald Spoto's High Society: The Life of Grace Kelly, the budding actress appeared in a series of print ads and commercials for shampoo, soap, toothpaste, beer, and cigarettes, with earnings of more than $400 per week making her one of the city’s highest-paid models at the time.

 

3. A Failed Screen Test Fueled Her Later Success

Sometime between 1950 and 1952 (sources differ on the year), Kelly auditioned for the part of a desperate Irish woman in a New York City-based drama called Taxi (1953). She was passed over for the role, but her screen test eventually found its way to celebrated director John Ford, who lobbied for the little-known actress to be included in his high-profile adventure film Mogambo (1953). Separately, Alfred Hitchcock also saw something intriguing in the same Taxi screen test, leading to Kelly’s first true starring role, in Dial M for Murder (1954).

 

4. She Enjoyed a Running Gag With Alec Guinness


As told in Spoto's High Society, Kelly and Alec Guinness engaged in a running gag that lasted more than two decades after their time together on the prank-filled set of The Swan (1956). After Kelly relentlessly teased her co-star about an overzealous fan, Guinness retaliated by having a concierge slip a tomahawk into her hotel bed. A few years later, Guinness was surprised to return to his London home and discover the same tomahawk nestled between his bedsheets. He later enlisted English actor John Westbrook to redeliver the item while Kelly and Westbrook toured the U.S. for a poetry reading during the 1970s, but her highness got the last laugh when Guinness again found the tomahawk in his Beverly Hills hotel bed in 1979.

 

5. Her Romance With Prince Rainier Got Off to a Rocky Start

Per High Society, Kelly was in France to attend the 1955 Cannes Film Festival when she agreed to travel to Monaco to meet Prince Rainier III (part of a scheme put together by the magazine Paris-Match for a photo story). However, the prince was delayed by a commitment elsewhere, and by the time he rushed back to his palace an hour late, his fed-up guest was ready to leave. When Rainier asked if she wanted to tour the palace, Kelly coolly replied that she'd already done so while waiting. They subsequently relaxed while walking through the palace garden, their brief meeting giving rise to an epistolary friendship that turned romantic, and eventually led to their "wedding of the century" in April 1956.

 

6. As Princess Grace of Monaco, She Devoted Herself to Charity


Along with giving birth to Prince Albert and Princesses Caroline and Stephanie, Kelly transitioned to life as Princess Grace by immersing herself in charitable initiatives in her adopted country. After taking over the presidency of the Monaco Red Cross in 1958, the erstwhile actress launched the World Association of Children’s Friends (AMADE) in 1963 and the Princess Grace Foundation the following year. Additionally, the princess opened the city-state's first day care in 1966, and channeled her longtime love of flowers into the formation of the Monaco Garden Club two years later.

 

7. Princess Grace Starred in a Little-Seen Comedy Just Before Her Death

A glance at a standard Kelly bio gives the impression that her screen career ended with her marriage, save for the occasional documentary appearance. However, the princess did deliver one final acting performance — albeit as a fictionalized version of herself — in the early '80s mistaken-identity comedy Rearranged. Initially intended as a promotion for the Monaco Garden Club’s annual competition, the half-hour-long short was a hit, sparking plans to expand the piece into an hour-long American TV special. However, its star's untimely death following a September 1982 car accident torpedoed those plans, and the original short remains within Monaco’s royal archives, largely unavailable to viewers.

 

 

Source: Facts About Hollywood Star Turned Real-Life Princess Grace Kelly


Fact of the Day - ALCATRAZ


Did you know.... Alcatraz Island, known colloquially as “The Rock,” was once the most notorious prison in the United States. Located 1.25 miles offshore from San Francisco, the island saw Civil War prisoners in the 1860s, mob bosses in the 1930s, and much more. Today, it’s one of the Bay Area’s most popular tourist attractions, and an on-island museum tells the story of the prison’s past. These seven facts span the many ages of Alcatraz and reveal how it became one of the most infamous sites in American history.

 

1. The Word “Alcatraz” Means “Pelican” in Archaic Spanish

In 1775, Spanish explorer Juan Manuel de Ayala became the first European to sail into San Francisco Bay. He named the bay and its islands, including one he called “Alcatraces.” Although the island’s name was anglicized over the decades, it is widely believed to come from a Spanish word meaning “pelican” or “strange bird.” The island was once a particular hot spot for California brown pelicans (Pelecanus occidentalis californicus), which were so plentiful in the 19th century that one French observer noted that when a group of pelicans took off in flight, it created winds like a hurricane. Although the birds’ numbers dwindled sharply due to hunting and the use of DDT over the decades, the pelican rebounded in the latter part of the 20th century, and was removed from the Endangered Species List in 2009.

 

2. Before Becoming a Prison, Alcatraz Was a Military Outpost


Although Alcatraz is known as one of America’s most infamous prisons, its first official U.S. role was as a military outpost. With California joining the U.S. in 1850 after being ceded by Mexico two years prior, and with hundreds of thousands of people flooding the state as part of the California Gold Rush, the U.S. military needed to protect San Francisco Bay. Alcatraz, along with Fort Point and Lime Point, formed a “triangle of defense” that guarded the bay’s entrance. At one point, the U.S. even installed 100 cannons on the 22-acre island, making it the most heavily armed military outpost in the Western U.S. But by the decade’s end, the first prisoners had been brought to the island, and Alcatraz played host to both Confederate prisoners and Union deserters during the Civil War.

 

3. Alcatraz Was Home to the First Lighthouse on the U.S. West Coast

During the island’s days as a military outpost, the U.S. constructed a lighthouse to serve vessels crisscrossing the busy shipping lanes of San Francisco Bay. Although the lighthouse tower was built by 1852, the Fresnel lens — a compact lens designed to make lighthouses brighter — didn’t arrive until 1854. Luckily, the delay didn’t cost the lighthouse the impressive accolade of being the first lighthouse constructed on the West Coast of the United States. Sadly, the original structure was damaged beyond repair by the catastrophic 1906 San Francisco earthquake. A replacement was built, however, and it still operates to this day.

 

4. Prison Life at Alcatraz Wasn’t Always Bad


Alcatraz became a federal prison in 1934, after being transferred to the U.S. Department of Justice and the Federal Bureau of Prisons. It was designed as a maximum security penitentiary meant for the most difficult inmates in the federal system, and was partly an attempt to show the public that the government was being tough on the widespread crime of the 1920s and ’30s. Although Alcatraz cut an intimidating figure, some prisoners reported that the experience wasn’t so bad. The first warden of Alcatraz made sure the food was good to dissuade rioting, and a menu in the 1940s even included “bacon jambalaya, pork roast with all the trimmings, or beef pot pie Anglaise.” Prisoners lived one man to a cell, which wasn’t a certainty in other federal prisons, and had basic rights to food, shelter, clothing, and medical care. Through good behavior, prisoners could earn privileges that included work on the island and even playing music. In fact, Alcatraz’s reputation far surpassed those of some other federal prisons, and occasionally inmates around the country even requested transfers to “The Rock.”

 

5. Al Capone Wrote Love Songs While an Inmate at Alcatraz

Arguably the prison’s most famous inmate was Al Capone, who was known at Alcatraz as Prisoner 85. Although a ruthless mob leader who ran the Italian American organized crime syndicate known as the Chicago Outfit, Scarface was finally put behind bars for tax evasion in 1931. In a few instances, he resorted to violence when provoked, but he mostly spent time playing banjo in the prison band the Rock Islanders, and writing love songs. In 2017, Capone’s handwritten lyrics to one song, titled “Humoresque,” sold at auction for $18,750. The lyrics included such memorable lines as “You thrill and fill this heart of mine, with gladness like a soothing symphony, over the air, you gently float, and in my soul, you strike a note.” Capone was eventually released from prison in November 1939, after more than seven years behind bars, by which time he was in ill health due to an untreated case of syphilis.

 

6. No One Has Ever Escaped From Alcatraz (Probably)

mqdefault.jpg

Of the 14 escape attempts at Alcatraz, all failed — except, possibly, one daring attempt (forever immortalized in the 1979 film Escape From Alcatraz). On June 12, 1962, an early morning bed check at the prison revealed that three inmates were missing from their beds — and in a made-for-Hollywood twist, they’d been replaced by papier-mâché heads constructed in secret to fool the night guards. After hacking together homemade life vests (an idea they got from the DIY magazine Popular Mechanics), the escapees tried their luck crossing the bay toward San Francisco. The FBI discovered the vests on Cronkhite Beach and found other bits of evidence (including letters sealed in rubber) scattered throughout the bay — but the authorities never found any evidence of the men living in the U.S. or abroad, and concluded they likely drowned in the bay’s frigid waters. The FBI closed the case on December 31, 1979, but the U.S. Marshals Service has continued to investigate.

 

7. Native Americans Occupied Alcatraz

One problem with running a prison on an island is that it can be pretty expensive to maintain, and so in March 1963, the century-old military outpost-turned-penitentiary closed its doors — but that wasn’t the end of its story. In November 1969, a group of Native Americans led by activist Richard Oakes traveled to Alcatraz and began an occupation of the island that lasted 19 months. The group cited the 1868 Treaty of Fort Laramie, which they argued entitled Native people to reclaim retired or abandoned federal land, as the basis for their seizure. They issued a proclamation that included a letter to the “Great White Father and All His People,” which highlighted the hypocrisy of the U.S. government’s treatment of Native Americans both past and present. Over the following months, the occupation grew to as many as 600 people, before numbers began to dwindle in January 1970. The government cut off electricity and water to the island, food became scarce, and in June 1971 U.S. marshals forcibly removed the final 15 occupiers. A highly publicized moment of Indigenous activism, the protest brought considerable attention to the plight of America’s Native peoples. In 1970, President Richard Nixon even ended the U.S.’s decades-long termination policy — an effort to end federal recognition of tribes and assimilate Native Americans into mainstream society. The occupation of Alcatraz was one of the first major intertribal protests, and part of a rich history of modern Native American activism.

 

 

Source: Amazing Facts About Alcatraz

 

 


Fact of the Day - EMPIRE STATE BUILDING 

images?q=tbn:ANd9GcRwXgGNXJmCMW9hNFbalbh

Did you know.... In a metropolis filled with architectural marvels both new and old, the Empire State Building still carries major clout as a defining landmark of New York City. Whether it’s because of the classy art deco design, the attention-grabbing light displays, or the far-reaching views offered from its observation decks, the Great Depression-era skyscraper remains a top tourist attraction and one of the most photographed buildings in the world. Here are six facts you might not know about the longtime stalwart of 34th Street and Fifth Avenue in Manhattan.

 

1. The Empire State Building Was Built in 410 Days

ppkr.jpg

The brainchild of financier John J. Raskob, the Empire State Building was conceived at a time when multiple developers were racing to leave their imprint on the New York City skyline — and it became a reality with mind-boggling speed. Fueled by the labor of as many as 3,400 daily workers, the structure climbed off the ground at a peak rate of 4.5 stories per week following its formal groundbreaking on March 17, 1930. Remarkably, the massive building — comprising 60,000 tons of steel, 200,000 cubic feet of Indiana limestone and granite, 730 tons of aluminum and stainless steel, and 10 million bricks — was completed ahead of schedule (and below budget) after just 410 days. President Herbert Hoover officially dedicated the new skyscraper on May 1, 1931.

 

2. The Empire State Building Was the Tallest Building in the World for Four Decades

empire-state-building-mint-EMPIRE0921-56

Although it's since been dwarfed by giants such as the United Arab Emirates' 2,720-foot Burj Khalifa, the Empire State Building once set the standard for human ambition to reach for the skies. At 102 stories and 1,250 feet tall (not counting the later addition of an antenna, which added 204 feet), it was the first building to pass the 100-story mark, and its height easily surpassed the 1,046-foot record set by the Chrysler Building in 1930. The Empire State Building remained the world's tallest building until the 110-story Twin Towers of Lower Manhattan's World Trade Center both pushed past 1,360 feet in the early 1970s.

 

3. The Empire State Building Has Its Own Zip Code

52367_main__69557.1584836361.jpg?c=2

Since May 1980, with the designation of the skyscraper’s very own 10118 ZIP code, the Empire State Building’s tenants have enjoyed the postal privileges of a small city. The change was part of an effort to speed up mail delivery in Manhattan by giving higher-volume addresses their own digits. Of the 63 new ZIP codes introduced in the borough that year, 39 were assigned to individual buildings that received at least 5,000 pieces of mail per day. The Empire State Building easily surpassed that cutoff with a daily intake of 35,000 pieces of mail.

 

4. The Building’s Colorful Light Displays Began in 1976

628397_poster_l.jpg

Among the Empire State Building's famed features are the crowning lights that frequently change colors to honor cultural events, organizations, and local sports champions. The building first shone a beacon following Franklin D. Roosevelt's presidential election in November 1932, but the multicolored displays that New Yorkers have come to know and love date back to the red, white, and blue bicentennial celebration of July 1976. The lights have since flashed in a range of colors, such as pink to commemorate Breast Cancer Awareness Month, blue for Frank Sinatra's 1998 death, and even neon green in 2009 for the 25th anniversary of the first Teenage Mutant Ninja Turtles comic book. The building switched to LED lights in 2012, giving operators the ability to choose from 16 million colors and add special effects like ripples, sparkles, and strobes.

 

5. Competitors Race to the Top in the Annual Empire State Building Run-Up

ny-empire-state-building-run-01.jpg

For those with energy to burn (and maybe a masochistic bent), the Empire State Building Run-Up offers runners from around the world a chance to scale most of the skyscraper on foot. An annual tradition since 1978, the Run-Up covers 1,576 steps over 1,050 vertical feet, from the lobby to the 86th-floor observatory. The record for what is billed as "the world’s first and most famous tower race" was set by Australian Paul Crake, who completed the grueling climb in 9 minutes and 33 seconds in 2003. And while that's obviously slower and more strenuous than the under-a-minute elevator ride, the climb holds some appeal: lines to visit the observatory can stretch the wait for an elevator to upwards of 45 minutes.

 

6. It's Been Featured in More Than 250 Movies

images?q=tbn:ANd9GcQpfO1oSeO_YFblBPvkXAB

As one of the world’s most famous structures, the Empire State Building has made numerous appearances on the big screen. Just how many is impossible to determine, considering the number of low-budget films that fly under the radar, but the Empire State Building's website once cited an estimate of "more than 250 movies." The most famous ones include King Kong (1933), which features the titular ape swatting at planes from the newly completed skyscraper; Independence Day (1996), which sees the Empire State Building destroyed by a giant alien spaceship; Sleepless in Seattle (1993), which features an unforgettable meeting between the main characters in the film’s finale; and Andy Warhol's Empire (1965), which focuses solely on the iconic building over the course of its eight-hour run time.
 

 

Source: Towering Facts About the Empire State Building


Fact of the Day - TABLE ETIQUETTE

06cedf5addcfd6431774f55821e51087.jpg

Did you know... When we’re eating casually at home, most of us don’t have a large formally set dining table complete with multiple pieces of silverware and glassware. We can stick to a few basic rules that we learned as children, like not speaking with our mouths full of food. But at a fancy event, or when we’re trying to impress someone important, the rules may seem a little more complex and overwhelming. Here are six table etiquette guidelines that you might not know.

 

1. When It Comes to Silverware, Work From the Outside In

2_formal_table_settings.jpg

A formal dinner setting might have three or more forks, and just as many knives and spoons. It can all get a bit confusing. You may be confronted with a shellfish fork, a soup spoon, or a fish knife and fork, all in addition to the main dinner knife and fork. For some multiple-course meals, utensils may be brought in with each course. This is especially true for salad and dessert courses, and it makes it easier to know what to use. When in doubt, the basic rule to remember is that you should always start at the outside and work your way inward so that the largest tools are used for the main course. Another helpful tip is to wait for the host or hostess to begin eating. Not only is it good manners to do so, but it also allows you to see which implement they are using.

 

2. Put Your Napkin on Your Lap

images?q=tbn:ANd9GcRRdYugThz02WXaPV7R8h2

The first paper napkins are believed to have appeared in ancient China, where they were used in little baskets that carried tea cups. Before that, many cultures (including the Romans) used finger bowls for wiping food remnants from their hands. During the Middle Ages, most people used whatever was available, usually a sleeve, for wiping their mouths. That slowly changed, with nobles using a separate cloth or nappe. This may have started as a giant tablecloth, but eventually became what we recognize today as a cloth napkin. Of course, napkins eventually developed their own rules of etiquette. When sitting down to eat, it is polite to take the napkin and spread it on your lap. Do not tuck it into the neck of your shirt. Use it to gently dab at your mouth during the meal and, when finished, leave the napkin loosely folded on the table.

 

3. Wait on the Bread

beurre.jpg

It’s an all-too-common scenario. By the time the entrée arrives at a restaurant, everyone has eaten their fill of bread. But at a formal dinner, the bread is to be eaten with the courses, rather than by itself. So as tempting as the smell of freshly baked bread may be, wait. There are also rules about how to eat the bread. Do not spread the entire slice with butter. Likewise, don’t cut a bread roll in half and butter both halves. The reasoning is that this may leave you with butter smeared across your face. The correct way to eat it is to break off a small piece and butter just that piece. Continue to butter one bite at a time. And to avoid confusion, the bread plate is to your left.

 

4. Consider Adopting the Continental Style

4204dfe6f9d30c4d24466b317b962d85

The American method of using eating utensils is often very different from the Continental, or European, method, which can lead to some confused looks on either side of the Atlantic. Each style is correct, but one may be more appropriate depending on the setting. The Continental style is to hold the fork in your left hand with the tines facing down. The knife is held in the right hand. The index finger of each hand is extended along the utensil. In the American method, by contrast, the fork is held in the left hand while cutting, then transferred to the right hand for eating. Etiquette experts advise that the Continental style may be “the most diplomatic.” Again, if in doubt, it is always wise to default to copying your host.

 

5. No Elbows on the Table (But Only While Eating)

80a6e3a57676a2921a6a1361e4117f8b--etique

Where and why did the rule about no elbows on the table originate? No one seems to know for sure, but the rule is common to many cultures. There is even a reference to it in the Old Testament of the Bible. In the 16th century, the Dutch philosopher Erasmus warned that only those weakened by old age or infirmity should rest their elbows on the table. More recently, Emily Post continues to caution against it, unless engaging in conversation between courses. Some believe that the use of elbows could once have been seen as a sign of intimidation or potential violence. Martha Stewart claims that resting one’s elbows increases the likelihood of slouching, which was once considered, in itself, rude. Whatever the reasoning, most people agree that elbows on the table while eating can be seen as impolite and can intrude upon your neighbor’s space.

 

6. Pay Attention to Local Customs

makan-banyakjpg-20210417104420.jpg

Table etiquette varies from one country to another. To avoid insulting a host when dining overseas, it can be useful to brush up on local manners. If eating with your hands in India and parts of the Middle East, remember to always use the right hand, as the left is considered unclean. Slurping one’s noodles may be a definite faux pas in the U.S., but in Japan and China, it is a sign of appreciation. In France, any bread on the table is to be eaten during the meal, not before. Furthermore, to avoid offending your French dinner host, both hands should rest on the table and not in your lap when you’re not eating. Meanwhile, never use your fork as a scoop for your peas in the United Kingdom. Although it may seem very impractical, the “proper” way is to use the tines of your fork to lightly squash a small amount at a time, or to anchor the peas to a bit of mashed potato.

 

 

Source: Table Etiquette Tips You Might Not Know


Fact of the Day - KATHARINE HEPBURN

f7f404d731b1dcc17ae1b8b6318f34ef.jpg

Did you know... She was a true Hollywood luminary, the headliner of such classic films as The Philadelphia Story (1940), The African Queen (1951), and Guess Who’s Coming to Dinner (1967). Yet Katharine Hepburn was far more than a screen persona propped up by a camera and lights: She clashed with studio executives over her refusal to dress like a typical starlet, navigated her own way out of professional slumps, and largely lived and loved as she saw fit over a career that spanned more than six decades. Here are six facts about this one-of-a-kind leading lady.

 

1. Hepburn’s Stage Career Got Off to a Rough Start

Fresh out of Bryn Mawr College in the late 1920s, the ambitious but unrefined actress struggled to hold on to several of the stage roles she relentlessly pursued. She was fired from productions of The Big Pond, Death Takes a Holiday, and The Animal Kingdom, and was briefly replaced before delivering a breakout performance in 1932's The Warrior's Husband. Even after making a successful leap to Hollywood with celebrated turns in 1933’s Morning Glory and Little Women, Hepburn was humbled by a widely panned return to Broadway that year in The Lake, and bought out her contract to avoid the embarrassment of continuing with the production on tour.

 

2. Hepburn Endured a Close Call With the Leopard of “Bringing Up Baby”

GettyImages-607388836-700x529MobileImage

Hepburn spent several scenes with a dangerous co-star in 1938's Bringing Up Baby, and it wasn't Cary Grant. She initially got along pretty well with Nissa the leopard — the titular "Baby" of the screwball comedy — who enjoyed nuzzling his head into Hepburn’s perfume-laden negligee. However, a leopard never changes its spots, and something in its primal brain was triggered when the leading lady changed to a dress weighted with metal pieces to enhance its swirling capabilities. As she recalled in her memoir Me: Stories of My Life: "[O]ne quick swirl and that leopard made a spring for my back, and [the trainer] brought that whip down right on his head. That was the end of my freedom with the leopard."

 

3. The A-Lister Was an Excellent Athlete

Raised by parents who encouraged the athletic development of their children and provided the financial means for doing so, Hepburn and her siblings engaged in a wide array of sports while growing up. She was particularly adept at golf, thanks to the private lessons she received as a teenager, and more than held her own in high-level competitions before pursuing her acting career. Hepburn also was known for her daily workouts on the tennis court by the time she was an A-list star, and continued to play regularly into her 80s. Fans can watch the screen great show off her natural skills in both sports in the 1952 comedy Pat and Mike.

 

4. Hepburn Enjoyed Instant Chemistry With Longtime Co-Star and Lover Spencer Trac

51039501-10-1024.jpg

Upon meeting 5-foot-9 Spencer Tracy shortly before they were to begin shooting 1942's Woman of the Year together, the 5-foot-7 Hepburn remarked that she would refrain from wearing high heels in his presence. Tracy soon had his revenge: After Hepburn knocked over a glass of water during an early take, Tracy continued with his lines while handing her a handkerchief, essentially forcing her to wipe up the mess while in character. It was that sort of spirited interaction that fueled their unparalleled screen chemistry over nine films, as well as their open-secret, real-life romance, which endured from their first production until his death in 1967.

 

5. She Performed Her Own Stunts

It wasn't quite Jackie Chan territory, but Hepburn insisted on doing her own stunts to preserve the authenticity of her shoots. Yes, that's her dangling from Grant's grasp off the scaffold at the end of Bringing Up Baby, and that's her tumbling into an unsanitary Venetian canal in Summertime (1955). Furthermore, advancing years did little to dampen her enthusiasm for such exertion: She endured horseback rides across treacherous terrain for Rooster Cogburn (1975), less than a year after undergoing hip surgery, and insisted on doing her own dives into frigid waters for On Golden Pond (1981), a few weeks after having an operation for a separated shoulder.

 

6. Hepburn Won a Slew of Awards Later in Her Career

201810242317557492.jpg

For all her early successes in films like Morning Glory and The Philadelphia Story, Hepburn didn't fully hit her stride until reaching an age when many actresses struggle to land quality roles. She received the bulk of her 12 Academy Award nominations after age 40, and three of her record four Oscar wins after turning 60. Additionally, Hepburn picked up the first of her two Tony nominations just before turning 63, and claimed her lone Emmy five years later. It was partly due to that record of longevity, and her embrace of both the joys and vulnerabilities of aging in her performances, that inspired the American Film Institute to name her the top female screen star of all time.

 

 

Source: About Iconoclastic Screen Star Katharine Hepburn

 

 


Fact of the Day - HIBERNATION

p06h3pw8.jpg

Did you know... When the weather warms in spring, animals begin stirring from hibernation. But they haven’t spent the previous few months just catching up on beauty sleep. Hibernation is a complex physiological state that helps animals survive seasons when resources are low. Here are a few facts on this unusual adaptation, and the critters that have mastered it.

 

1. Hibernation’s Purpose Is To Conserve Energy

s324_2_002i.jpg

Hibernation is a form of torpor — a dormant state in which an animal’s body temperature cools and its heart rate and metabolism slow to conserve energy. Torpor can last as little as a few hours: Hummingbirds in the Andes, for example, cool their internal temperature by as much as 33 degrees Celsius and enter torpor overnight, saving energy until the following morning. Hibernation is basically torpor that lasts several weeks to several months. The degree of hibernation among animals can vary from the practically dead (the Arctic ground squirrel can lower its body temperature below the freezing point of water) to relatively active (bears don’t lower their body temperatures as much and periodically wake up during hibernation).

 

2. A Lack of Food Usually Sets Hibernation in Motion

767de860-b9c6-4e17-b775-66c0bc5b40b4.jpg

Biologists used to think that cold weather was the signal for animals to start hibernating, since those in the temperate Northern Hemisphere disappeared into their dens when summer turned into fall and emerged when winter turned to spring. Then scientists discovered that numerous species in the tropics also hibernate, which suggested that hibernation was triggered by a seasonal lack of food instead of a change in temperature. These tropical species enter a hibernation-like state called estivation during hot and dry periods when water or food is scarce.

 

3. Hibernation Is Different From Sleep

shutterstock_626917214-400x250.jpg

When animals hibernate, they’re not just sleeping for weeks on end. “Light” hibernators like bears cycle between periods of rest, when their body temperature and functions are dormant, and brief periods of wakefulness when they change position, urinate, or even get some actual sleep. Female bears and other mammals may give birth and raise their young during this time. Deep hibernators, like some species of groundhog, mice, and bats, may remain practically motionless for months.

 

4. Hibernators Wake Up Hungry

speckled-ground-squirrel-spotted-souslik

Hibernating animals eventually wake up, signaled by the changing temperature of their environment or possibly by an internal “alarm clock.” The months spent in their cozy dens are actually not that relaxing: When animals emerge from torpor, they’re often underweight, tired, hungry, and thirsty. Their first post-hibernation acts are to drink water, hunt or forage for food, and size up potential mates.

 

5. A Huge Variety of Animals Hibernate

antarctic-cod-antarctica-toothfish-isola

Numerous species of warm-blooded animals experience some degree of torpor, but only a small percentage of them are considered true hibernators. A single species of bird, the common poorwill, and a single fish, an Antarctic cod (which isn’t warm-blooded, but does produce antifreeze-like proteins in its body), are known to hibernate. The practice is much more common among mammals; hibernating mammals include echidnas, insect-eating bats, at least one species of armadillo, the fat-tailed dwarf lemur, badgers, ground squirrels, marmots, jumping mice, dormice, and black and brown bears.

 

6. There Have Been a Few Cases of Human “Hibernation”

?t=resize:fill:408:255,enlarge:1

You may have noticed one mammal that doesn’t hibernate — us. But there are a handful of cases in which humans have endured a lethally low body temperature and lived, with no lasting effects. The most famous is the ordeal of Mitsutaka Uchikoshi, a 35-year-old Japanese civil servant, who slipped on a mountain trail and broke his hip in October 2006. He was rescued after 24 days suffering from extreme hypothermia “similar to hibernation,” his doctors said. After nearly two months in the hospital, he emerged with no residual injury. In 2012, a Swedish man was stranded in his snowed-in car for two months but survived, despite having severe hypothermia and no food, possibly due to having entered a torpor-like state.

 

7. Hibernation Might Help Humans Get to Mars

7cVYhPGzBCmaDTkw1DvG4T.jpeg

Research into animal hibernation has the potential to help humans. Understanding why hibernators can withstand extremely low body temperatures and slowed metabolism without injury might give us clues for recovering from heart attacks, preserving human organs for transplant, or conducting complex surgeries. Scientists are even experimenting with “induced hibernation” as a way to conserve astronauts’ energy on long journeys through space, and to reduce the amount of resources needed on future missions to Mars.

 

 

Source: Facts About Hibernation


Fact of the Day - MYSTERIES

th48awehfe.jpg?w=584

Jack the Ripper

Did you know... Although humans often prefer stories with a simple beginning, middle, and end, history doesn’t always line up so nicely. These six moments from the past represent some of the most head-scratching conundrums that still stump scientists, FBI investigators, and even amateur sleuths. Some of them might never be solved, but that doesn’t mean it’s not fun to try.

 

1. What Happened to the “Lost Colony” of Roanoke?

na_63ef6d03d6599.jpg

The legend of the Roanoke colony is so enduring because it lies at the heart of the founding of America. Starting in 1584, 23 years before the establishment of the Jamestown colony in nearby Virginia, three English expeditions landed at Roanoke Island, nestled between the Outer Banks and mainland North Carolina, although these initial forays failed to establish a permanent settlement. In 1587, John White, along with roughly 115 colonists, traveled from England and established a colony on Roanoke Island. White sailed back to England later the same year to get supplies, but upon his return three years afterward (having been delayed by the Spanish Armada), he found Roanoke completely abandoned. There was no sign of foul play. The houses had been dismantled and a palisade-like fortification built in their place, and the word “Croatoan” had been carved into a post — a reference to the nearby island of Croatoan, now called Hatteras Island, as well as the tribe that lived there. White tried to travel to the island, but storms prevented him from doing so, and he sailed back to England. He died in 1593, never having returned to Roanoke, and no one truly knows what happened to the colonists — no bodies have ever been found. Theories range from the practical (confrontation or assimilation with Native Americans) to the supernatural or extraterrestrial, but it’s unlikely historians will ever know for sure.

 

2. What Happened Aboard the Mary Celeste?

images?q=tbn:ANd9GcS-St3iPYmJZFzXDkK5IVt

The world’s oceans have swallowed many ships since the dawn of the Age of Sail in the 16th century, but no story is quite like the curious case of the Mary Celeste. On November 7, 1872, the Mary Celeste set sail for Genoa, Italy, loaded with 1,700 barrels of alcohol as cargo. Fast-forward nearly a month later, and a British merchant vessel named Dei Gratia spotted the ship some 400 miles east of the Azores in the mid-Atlantic. But something was wrong — no one on board the Mary Celeste was responding to the Dei Gratia’s signals. After boarding, sailors found the ship mostly undamaged, but abandoned. There was little to no sign of struggle, and six months of food onboard. Only the lifeboat and navigational tools were missing. The ship’s captain, his family, and his crew have never been found. The theories put forward to explain the ship’s abandonment include pirates, an earthquake, or a mutiny. However, the most colorful theory includes a giant squid attack.

 

3. Who Was D.B. Cooper?

300px-D.B._Cooper_Composite_Sketch_B.jpg

On November 24, 1971, a man calling himself Dan Cooper (later erroneously reported as D.B. Cooper) boarded Northwest Orient Flight 305 traveling from Portland, Oregon, to Seattle, Washington. Described as a mid-40s white man dressed in a business suit, Cooper ordered a bourbon and soda before alerting the stewardess that he had a bomb in his briefcase. Cooper then handed the stewardess a list of demands, saying that he wanted parachutes, a refueling truck, and $200,000 in cash waiting for him when the plane landed in Seattle. He added the phrase, “no funny stuff.” After an exchange of the flight’s passengers for the money and other goods, the plane took off for Cooper’s requested destination in Mexico City — but he didn’t get far. While flying over southern Washington, Cooper strapped on one of the parachutes he had demanded and jumped out of the plane. Nine years later, a boy found $5,800 in southern Washington with serial numbers that matched the money stolen by Cooper. The FBI has described the case as “one of the longest and most exhaustive investigations in our history,” although it is no longer currently investigating it. Over 100 suspects have been evaluated, but the mysterious criminal has yet to be identified.

 

4. What Is the Purpose of the Nazca Lines?

63571_shutterstock-211638601-nazca.jpg

The Nazca Lines are massive geoglyphs — sometimes more than a thousand feet long — carved into the ground some 250 miles south of Lima, Peru. At first glance, these lines might look similar to crop circles, and their full shapes can only be taken in from the cockpit of a helicopter or airplane. Depicting animals, plants, and various shapes, the Nazca Lines were created by the Nazca people some 2,000 years ago. Archaeologists have studied the lines for 80 years (and are still discovering new geoglyphs), but still don’t know for sure why ancient people created such massive monuments they couldn’t even see. Early theories suggested the lines had some sort of astronomical or calendrical purpose — not unlike Stonehenge — although more recent theories suggest the structures could’ve been tied to irrigation or elaborate religious ceremonies. Whatever the reason, the Nazca Lines remain a mystery etched into the very face of the planet.

 

5. Where Are the Gardner Museum Paintings?

e75bd40d-6164-4e69-b3e6-2cdd321c4a9f_338

Museum heists are common throughout history (and Hollywood), but the ne’er-do-wells are usually captured in the following months, or sometimes years. Unfortunately, the Isabella Stewart Gardner Museum in Boston, Massachusetts, wasn’t so lucky. In the early morning of March 18, 1990, two burglars dressed as police officers subdued the museum’s two security guards and purloined 13 works of art worth over $500 million, including pieces by Johannes Vermeer, Rembrandt van Rijn, Edgar Degas, Govaert Flinck, and Édouard Manet. By 8:30 a.m., several hours after the heist, the police (the actual police) found the guards handcuffed in the basement. Four years later, a mysterious letter sent to the museum offered to return the artworks for $2.6 million. Although the museum agreed, a second letter revealed the mysterious author had been spooked by FBI involvement, and the deal fell through. A Netflix documentary and a popular podcast have explored the heist, and the FBI even offered a $10 million reward for information leading to the artworks’ recovery, but despite it all, the 13 masterpieces — as well as the two burglars — have yet to be found.

 

6. What Happened to Amelia Earhart?

Amelia%202.jpeg

In the 1930s, Amelia Earhart wasn’t just one of the most famous pilots in the world — she was arguably the most famous woman in the world.  In 1928, she had become the first woman to fly across the Atlantic; in 1932, she became the first woman to make a solo nonstop transcontinental flight, from L.A. to Newark. So it’s no wonder her disappearance on July 2, 1937, while trying to circumnavigate the globe, sent a shockwave through society whose ripples can still be felt. On that fateful summer day in 1937, Earhart and her navigator, Fred Noonan, set out from Lae, New Guinea, flying a Lockheed Model 10 Electra and headed for Howland Island, a Pacific island that measures only 1 square mile. Although Earhart was in contact with the U.S. Coast Guard ship near the island, the famous pilot never arrived. In her last transmission, she noted her position and that she was running low on fuel. Neither Earhart, her navigator, nor her plane was ever seen again. The leading theory is that Earhart simply crashed into the ocean, but an extensive search of the surrounding area has turned up nothing. Other theories suggest Earhart possibly landed on a nearby island in line with her last coordinates. In 2017, another theory suggested that Earhart survived as a Japanese prisoner, and some argued that she can be seen in a grainy photo taken on the then-Japanese Marshall Islands shortly after the crash (though some experts have poured cold water on the idea). It’s unlikely we’ll ever know what happened to one of history’s most famous aviators, but that won’t keep people from looking for answers.

 

 

Source: Most Perplexing Mysteries in History


Fact of the Day - DOGS

_medium.jpeg

Did you know.... There are few creatures on the planet as cuddly, loyal, and beloved as dogs. Many people pamper their pooches and treat them as members of the family, and in return, dogs provide unconditional love and companionship. In some cases, they can even be trained to sniff out diseases, detect explosives, or assist people with disabilities. But no matter their breed or purpose, they’re incredible animals — with a fascinating history to boot. Here are six faithful facts about dogs.

 

1. The World’s Oldest Dog Is Over 30

 

Dog lovers are living in a historic era: As of this writing, the longest-surviving dog on record is alive and well. Bobi is a purebred Rafeiro do Alentejo, which is a Portuguese breed known to be good at protecting livestock and guarding property. On average, dogs of this type live between 12 and 14 years, but Bobi was born on May 11, 1992, and recently celebrated his 30th birthday… in human years! According to the Guinness Book of World Records, that milestone makes Bobi not only the oldest dog currently living in the world, but the oldest dog to ever live. The previous record holder for history’s oldest dog was an Australian cattle dog named Bluey, who was born in 1910 and lived for 29 years and five months. Bobi is the first dog on record to surpass three decades in age, which is especially miraculous considering that he nearly didn’t survive infancy. Bobi’s owner, Leonel Costa, worked hard to save and take care of Bobi when the pooch was born, despite being just an 8-year-old kid himself at the time. His efforts obviously paid off, as Bobi has lived a long and full life in Leiria, Portugal, consuming a delicious diet of unseasoned human food soaked in water.

 

2. President Harding’s Dog Had His Own Seat at Cabinet Meetings

laddie-boy31.jpg

These days, presidential pets are almost as famous as their commanders in chief — but that wasn’t always the case. The first White House animal to really achieve celebrity status was President Warren G. Harding’s pup, an Airedale terrier named Laddie Boy, who lived in Washington, D.C., during the Harding administration from 1921 to 1923. Laddie Boy was a fixture at the President’s side from the very beginning. In fact, on March 5, 1921, one day after taking office, Harding interrupted his first official Cabinet meeting to introduce the dog, who had just arrived from Ohio. After that, Laddie Boy became a regular at Cabinet meetings, and even had his own chair at the table. As part of Harding’s attempt to appeal to the average person and capitalize on his campaign slogan, which promised a “Return to Normalcy,” Laddie Boy also accompanied the President on the golf course, helped welcome foreign delegates, and once participated in the annual White House Easter Egg Roll. The pooch was beloved not only by Harding but by the press as well; newspapers would publish pretend interviews with fictitious quotes from Laddie Boy, much to the delight of the public. After Harding’s death, more than 19,000 newsboys donated pennies to be turned into a copper statue of Laddie Boy, which now belongs to Washington’s Smithsonian Institution.

 

3. The Dog Who Played Toto Earned $125 per Week

619389b6452498602df7c9e1d6c263cb--wizard

Few dogs are more famous than the cairn terrier who played Toto in 1939’s The Wizard of Oz. Terry, as she was known off-screen, was born in 1933 and rescued by dog trainer Carl Spitz after being abandoned by her original owners. Despite a tumultuous start to her life, she went on to have a prolific career in Hollywood that even many human actors would envy. Spitz ran his own Hollywood Dog Training School, and although he initially took in Terry just as a pet, she quickly became his biggest star. His technique used silent hand signals, which gave him (and his dogs) an edge over other trainers who had to vocally call out their commands. With his help, Terry landed an uncredited role in 1934’s Ready for Love, and later that year starred alongside Shirley Temple in Bright Eyes. After appearing in several other films, Terry ascended to superstardom when she booked the role of Toto in The Wizard of Oz, earning $125 a week for portraying Dorothy’s trusted companion — more than some of her human castmates made.

 

4. The Beatles’ “A Day in the Life” Features a Whistle Only Dogs Can Hear

Fr1ivUKWIAAm6rY.jpg

Back in 2013, the Beatles’ Paul McCartney revealed a little canine-related trick the band had included on their seminal 1967 album, Sgt. Pepper’s Lonely Hearts Club Band. At the end of the album’s final track, “A Day in the Life,” the Fab Four added a tone pitched at 15 kilohertz, making the sound audible to dogs but difficult for most adult humans to hear. The tone was reportedly added at the request of John Lennon, who asked producer George Martin to dub in the high-pitched frequency. “We’d talk for hours about these frequencies below the sub that you couldn’t really hear and the high frequencies that only dogs could hear,” McCartney explained. “If you ever play Sgt. Pepper, watch your dog.”
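
For the curious, a tone like that is easy to reproduce: the minimal sketch below writes a few seconds of a 15 kHz sine wave to a WAV file using NumPy and Python's built-in wave module. The amplitude, duration, and file name are illustrative choices for this example, not details of the actual recording.

```python
# Minimal sketch: write a 3-second 15 kHz sine tone to a WAV file.
# Dogs hear well above 15 kHz; many adult humans struggle to hear it at all.
import wave
import numpy as np

SAMPLE_RATE = 44100      # samples per second (CD quality)
FREQ_HZ = 15_000         # the roughly 15 kHz tone mentioned above
DURATION_S = 3.0         # clip length in seconds

t = np.arange(int(SAMPLE_RATE * DURATION_S)) / SAMPLE_RATE
samples = 0.3 * np.sin(2 * np.pi * FREQ_HZ * t)    # keep the amplitude modest
pcm = (samples * 32767).astype(np.int16)           # convert to 16-bit signed PCM

with wave.open("dog_whistle_tone.wav", "wb") as wav:
    wav.setnchannels(1)        # mono
    wav.setsampwidth(2)        # 2 bytes = 16 bits per sample
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(pcm.tobytes())
```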

 

5. Spiked Dog Collars Originated in Ancient Greece

320px-Roccale_1.jpg

Though they serve a more decorative purpose in modern times, spiked dog collars had an important use in ancient Greece, where they were originally conceived as protection for dogs patrolling farms, which were vulnerable to wolf attacks. Inspired by standard dog collars that had been developed in the ancient Egyptian city of Naucratis — with which the Greeks frequently traded — Greek dog owners designed a couple of spiked options to defend against predators. One type of collar was made of metal, forming a kind of chain-link guard with spikes, whereas the other was made of leather with metal spikes poking through the material and secured by rivets. In both cases, the spikes were meant to protect the dog’s throat and potentially injure the attacking wolf. Artistic depictions suggest the collars could also be symbols of status, perhaps ornamented with engravings. Dogs were important creatures within Greek society, and had a profound cultural impact as well. In his epic poem the Odyssey, for example, Homer wrote of Argos, the faithful dog who waits 20 years for the return of the hero Odysseus.

 

6. Some Domesticated Dog Breeds Date Back Millennia

1daeb2b5422c2e4311027d9b98f1556c.jpg

There’s some debate around the oldest breed of domesticated dog, but if you go by the Guinness Book of World Records, that title belongs to the saluki, which is believed to have originated circa 329 BCE (though some experts posit it may date back even further to around 7000 BCE). Sometimes called the royal dog of Egypt, the saluki was heralded in ancient Egyptian society; the dogs were even honored with mummification after death, much like pharaohs at the time. In many Arab cultures throughout the Middle East, hunters used salukis — which boast incredible speed — to track and take down gazelle. The breed eventually made its way to England by the mid-1800s, and was finally recognized by the American Kennel Club in 1927.

 

 

Source: Faithful Facts About Dogs


Fact of the Day - FOUNDING FATHERS

Was-The-American-Constitution-Really-Bas

Did you know.... Few figures loom as large in American history as the Founding Fathers. Although wrapped in myth and shrouded in legend, these leaders lived fascinating lives molding a fractious colony into a new nation. Although their stories have been meticulously detailed — through their own writings as well as centuries of biographies and classroom textbooks — not everything about them is well known. Which famous general lost more battles than he won? Which two Founding Fathers died on the same day? Which one invented a strange musical instrument? Here are seven little-known facts about the men who created a nation.

 

1. John Adams and Thomas Jefferson Died on the Same Day

John Adams and Thomas Jefferson, bitter political rivals and, at times, close friends, died on the very same day — July 4, 1826, 50 years after signing the Declaration of Independence. The two were among the last survivors of the original revolutionaries who helped forge a new nation after breaking with the British Empire. During their presidencies, the two diverged on policy and became leaders of opposing political parties, but at the urging of fellow Founding Father Benjamin Rush, around 1812, Adams and Jefferson began a correspondence that lasted the rest of their lives. On his deathbed at the age of 90, Adams’ reported last words were “Jefferson still lives,” but he was mistaken — Jefferson had died five hours earlier at Monticello, his Virginia estate.

 

2. James Madison Was the Shortest President in U.S. History

160px-James_Madison_by_Gilbert_Stuart_(c

Although James Madison’s signature doesn’t adorn the Declaration of Independence, as the nation’s fourth President and chief architect of the Bill of Rights, he’s widely regarded as one of the most influential Founding Fathers. Madison had a large impact on early U.S. history even though he is also the country’s shortest President thus far, standing just 5 feet and 4 inches tall. That makes Madison a full foot shorter than America’s tallest President, Abraham Lincoln (and no, that height doesn’t include Lincoln’s signature stovepipe hat).

 

3. John Hancock Was Accused of Smuggling

On May 24, 1775, John Hancock became the presiding officer over the Second Continental Congress. A little more than a year later, his signature became famous when he wrote his name in grandiose letters, taking up some 6 square inches, on the Declaration of Independence. (Legend says Hancock wanted the king to be able to see it without spectacles.) However, Hancock was also known as an importer, and — at least when it came to British tea — was accused of being a smuggler. The British seized his sloop Liberty in 1768 on suspicion of smuggling, which instigated a riot. Fellow Founding Father and lawyer John Adams defended Hancock, and the charges, which rested on flimsy evidence in the first place, were eventually dropped.

 

4. Sam Adams Might Never Have Brewed Beer

image.php?type=thumbnail_580x000&url=02a

Sam Adams was the most influential member of the Sons of Liberty, a loosely organized political organization that formed in opposition to the Stamp Act in 1765. But to many Americans, he’s also the name behind one of the most successful beer brands in the U.S. The company says it picked the name because its founder, Jim Koch, “shared a similar spirit in leading the fight for independence and the opportunity for all Americans to pursue happiness and follow their dreams.” That’s good, because it’s not clear whether Sam Adams actually ever brewed beer. After his father’s death in 1748, Adams inherited his malt house, which is where grains are converted into malt that’s then sold to brewers. But within only a few years, the business was bankrupt and the malt house itself was crumbling; the whole family estate was then put up for auction. Adams proved more effective as a political firebrand than as a “maltster.”

 

5. George Washington Lost More Battles Than He Won

General George Washington embodies the phrase “losing the battle but winning the war,” because during the American Revolution, he lost more battles than he won. Despite some military experience from the French and Indian War, Washington had little experience fielding a large fighting force, and the Continental Army was filled with soldiers who were far from professional fighters. However, Washington’s resilience, determination, and long-term strategy eventually won the day. According to Washington’s aide Alexander Hamilton, the plan was simple: “Our hopes are not placed in any particular city, or spot of ground, but in preserving a good army … to take advantage of favorable opportunities, and waste and defeat the enemy by piecemeal.” Washington, also aided by competent generals such as Nathanael Greene and assisted by the French Navy, decisively ended British ambitions in the colonies at the Battle of Yorktown in 1781.

 

6. Benjamin Franklin Invented a Musical Instrument Used by Mozart and Beethoven

Unbekannt_-_Czech_glass_harmonica_from_t

In the mid-1700s, while serving as a delegate for the American colonies in Europe, Benjamin Franklin experienced a popular musical performance — singing glasses. Intrigued by the beautiful sound of a wet finger on glass, Franklin developed an instrument known as a “glass armonica” in 1761. Working with a glassblower in London, Franklin altered the thickness of glass bowls, interlocked along a rod, in order to produce a range of pitches. Far from being one of Franklin’s odder ideas (like his failed phonetic alphabet), the glass armonica was an 18th-century sensation. Some of the era’s greatest composers, including Wolfgang Amadeus Mozart and Ludwig van Beethoven, wrote music for the instrument. However, it was largely forgotten by the 1820s — many musicians complained of dizziness and other symptoms after playing it, with some blaming lead poisoning or the instrument’s vibrations as the cause. Today, a few musicians still practice the subtle, ethereal art of the glass armonica.

 

7. Alexander Hamilton Was Captain of One of the Oldest U.S. Army Regiments in Existence

Alexander Hamilton is known for many things — he was the prolific writer behind the Federalist Papers, the first secretary of the treasury, the creator of the U.S. Coast Guard, and the inspiration for one of Broadway’s biggest musicals. What’s less celebrated about Hamilton is his military career, though when fighting broke out, the eager immigrant from Nevis island in the Caribbean joined the cause. On March 14, 1776, Hamilton was named captain of the New York Provincial Company of Artillery, and soon fought in the battles at Kip’s Bay and White Plains, among others. Hamilton slowly climbed up the military ladder, first serving as General George Washington’s aide and then as commander of a light infantry battalion at the decisive Battle of Yorktown. However, it’s his original artillery company that holds a singular distinction. Known today as 1st Battalion, 5th Field Artillery Regiment, Hamilton’s former artillery unit is one of the oldest active regiments still serving in the U.S. Army.

 

 

Source: Amazing Facts About America’s Famous Founding Fathers


Fact of the Day - BLACK INVENTORS

statue-to-george-washington-carver-at-hi

Did you know... The world would be unrecognizable without the groundbreaking contributions of Black inventors. Whether it’s the country’s most popular toy or a well-known piece of lifesaving battlefield gear, the extraordinary men and women who dreamed up these ideas did so while facing virulent racism and systemic injustice, yet persevered to make the world a better — or at least more interesting — place.

 

1. While Working at NASA, Lonnie Johnson Invented the Super Soaker

lonnie-johnson-smiling-and-shooting-feat

Inventor Lonnie Johnson has quite the résumé. A nuclear engineer by profession, Johnson worked at Oak Ridge National Laboratory, joined the Air Force, then jumped ship to NASA’s Jet Propulsion Laboratory in 1979, where he worked on Galileo — a robotic orbiter studying Jupiter and its moons. While at NASA, Johnson worked on a heat pump that used water instead of Freon. “I was experimenting with some nozzles that I machined, and I shot a stream of water across the bathroom,” Johnson told CNN in 2020. “I thought, ‘Geez, maybe I should put this hard science stuff aside and work on something fun like a water gun.’” In 1989, Johnson licensed his famous invention, and in two years, the Super Soaker became the No. 1 toy in America, making more than $200 million in sales.

 

2. Alexander Miles Made Elevators Less Harrowing

Alexander Miles first found success as a barber, and then as an elevator innovator. The first passenger elevator debuted in 1853, but riding one was less than ideal. Because elevator doors had to be manually operated, elevator-related deaths were far too common. An owner of many buildings himself, Miles saw firsthand the dangers of elevators and decided to do something about it. Using a flexible belt attached to the elevator cage, drums above and below the doors on each floor, and other equipment, Miles’ invention automated the process of opening and closing elevator doors. Miles was granted a patent for his invention in 1887. At the time of his death in 1918 in Seattle, the barber-turned-inventor was the wealthiest Black man living in the Pacific Northwest.

 

3. The Inventor of the Home Security System Was a Nurse

GettyImages-854103122-HERO.jpg?itok=6sU2

Necessity is the mother of invention, and that can certainly be said of Marie Van Brittan Brown and her home security system. In the mid-1960s, Brown lived in a rough neighborhood in Queens, New York, while working as a nurse. She was often alone at night, so she decided to design her own peace of mind. Her invention featured four peepholes on the front door and a motorized camera that could look through the holes at varying heights. The camera was connected to a television inside the home, and a microphone both inside and outside the door allowed her to interrogate uninvited visitors. For added security, Brown also devised a way to alert police via radio. This ingenious use of cameras and closed-circuit television helped Brown score a patent for her security system in 1969. Today, Brown’s invention is widely regarded as the cornerstone of modern home security systems.

 

4. George Washington Carver Was the First Black American Honored With a National Monument

George Washington Carver is one of the greatest minds in American history. Primarily an agricultural scientist, he invented hundreds of products using sweet potatoes, soybeans, and peanuts (but not peanut butter, as a persistent myth suggests). Carver was born enslaved around 1864, but 30 years later, and after many trials, he earned a bachelor's degree in science. And he put that degree to work. Carver developed crop rotation methods, invented the Jesup wagon (a sort of mobile classroom he used to teach farmers about agricultural science), and created scores of peanut-based products, including milk, Worcestershire sauce, cooking oils, paper, cosmetics, and wood stains. Carver died in early 1943 having dedicated his life to science, and a grateful nation honored him for his efforts later that same year, when President Franklin Delano Roosevelt established the George Washington Carver National Monument. It was the first national monument dedicated to a Black American — or to any non-President.

 

5. Traffic Signal Inventor Garrett Morgan Was Also a Hero

garretmorgan_nventor_BPTN-300x300.png

Garrett Morgan’s life as an inventor began at the turn of the 20th century, when he started working at a sewing machine factory. After learning the inner workings of his machines, Morgan patented an improvement that earned him some much-needed income. He later developed a hair-straightening cream that made him financially independent and able to pursue his own interests. In 1914, Morgan developed “safety hoods” for firefighters to wear when battling blazes, and the underlying design eventually found its way into the trenches of World War I. Then, in 1916, Morgan became a local hero when a tunnel explosion under Lake Erie trapped workers in close quarters with noxious fumes. Upon hearing of the accident, Morgan and his brother donned their breathing devices and saved two people’s lives. However, Morgan’s greatest invention came in 1923, when he developed a three-position traffic signal to control stop-and-go traffic at intersections. He acquired patents for the device in the United States, Britain, and Canada, and it saved thousands of lives over the years.

 

6. George Crum Accidentally Invented the Potato Chip

In the 1850s, George Crum (born George Speck) worked at Moon’s Lake House, a high-end restaurant in upstate New York. The legend goes that one day a surly customer didn’t like the way Crum prepared his french fries and complained they were too thick. With a not-so-subtle amount of spite, Crum cut some fresh potatoes incredibly thin and then fried them up for his needy patron. To Crum’s surprise, the thinly sliced fry — or potato chip, as we call it today — became a big hit, and soon the restaurant became known for its “Saratoga chips.” Although the owner of the restaurant tried to take credit for the invention, as did others, Crum soon opened his own establishment and provided a basket of chips on every table. The potato chip remained a local delicacy in upstate New York until Herman Lay began building his snack food empire in the 1920s.

 

 

Source: Amazing Facts About Black Inventors Who Changed the World


Fact of the Day - COLOR GREEN

d2e72095f3a148550978352978010e53.jpg

Did you know.... The color green is intimately tied to the human experience. The hue fills our world as the color of nature, and its particular wavelength has a fascinating relationship with our visual sense. The color can represent positive notions (peace and fertility) as well as negative ones (greed or envy). Although it’s considered a secondary color in traditional pigment mixing, since it can be made from the primaries yellow and blue, green is perhaps the most important hue in the visual spectrum — and these five mind-blowing facts explain why.

 

1. Human Eyes Are Most Sensitive to the Green Wavelength of Light

images?q=tbn:ANd9GcQLPhrPN_3aNQfIqwJnEXr

Electromagnetic radiation comes in a variety of types, including radio waves, gamma rays, and visible light. The human eye can perceive wavelengths of roughly 380 to 740 nanometers (nm), also known as the visible light range. The wavelength determines the color we see: For example, at 400 nm our eyes perceive the color violet (hence the name “ultraviolet” for wavelengths directly under 400 nm), whereas at 700 nm our eyes glimpse red (but can’t see the “infrared” wavelengths just beyond it). In the middle of this spectrum of visible light is the color green, which occupies the range between 520 and 565 nm and peaks at 555 nm. Because this is right in the middle of our visual range, our eyes are particularly sensitive to the color under normal lighting conditions, which means we can more readily differentiate among different shades of green. Scientists have also found that the color green positively affects our mood in part because our visual system doesn’t strain to perceive the color — which allows our nervous system to relax.
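
To make the "peak at 555 nm" point concrete, the sketch below scores a few wavelengths with a rough Gaussian stand-in for the eye's daytime (photopic) sensitivity curve. The peak and spread values are ballpark figures chosen for illustration, not official CIE data, and the helper function name is invented for this example.

```python
# Rough sketch: rank equal-power lights by approximate perceived brightness,
# using a Gaussian stand-in for the photopic luminosity curve (peak near 555 nm).
# The ~42 nm spread is an illustrative approximation, not official CIE data.
import math

def approx_luminous_efficiency(wavelength_nm, peak_nm=555.0, spread_nm=42.0):
    """Relative daytime sensitivity, normalized to 1.0 at the peak wavelength."""
    return math.exp(-((wavelength_nm - peak_nm) ** 2) / (2 * spread_nm ** 2))

for name, wl in [("violet", 400), ("blue", 470), ("green", 555), ("red", 700)]:
    print(f"{name:>6} ({wl} nm): {approx_luminous_efficiency(wl):.3f}")
# Green at 555 nm scores 1.000; equally powerful violet and deep-red lights
# score close to zero, which is why mid-spectrum greens look so bright to us.
```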

 

2. The Color Green Has Meant Many Things Throughout History

Osiris+on+his+throne+underworld.jpg

Today, the color green is associated with a variety of feelings and social movements. Turning a shade of green can indicate nausea, but you can also become “green” with envy. Green is closely associated with money and capitalism, while also embodying aspects of nature and the environmentalist, or “Green,” movement. However, these cultural definitions have changed over millennia, and have different associations in different parts of the world. For example, in ancient Egypt, green was often linked with both vegetation and death, and Osiris (god of fertility and death) was often depicted as having green skin. These days, green is prevalent throughout the Muslim world — adorning the flags of Muslim-majority nations such as Iran and Saudi Arabia — because it was supposedly the prophet Mohammad’s favorite color. Many African nations also include the color green in their flags to represent the natural wealth of their continent, and Confucius believed green (more specifically jade) represented 11 separate virtues, including benevolence, music, and intelligence.

 

3. Hollywood Uses Green Screens Because of Human Skin Tones

in-the-big-film-studio-professional-crew

If you’ve seen any big-budget Hollywood film, it probably used some variety of green screen-enabled special effects. In fact, some version of green screen technology, also known as “chroma keying,” has been around since the early days of film. The reason why screens are green is actually pretty simple — human skin is not green. When a camera analyzes chrominance, or color information, it can easily separate green (or blue) from the rest of the shot so that a video backdrop can be inserted. However, the technology isn’t foolproof, as green clothes can blend in with backgrounds. (That’s why meteorologists don’t wear green on St. Patrick’s Day.) Because of this deficiency, among other reasons, some productions are shifting to high-tech LED panels to recreate otherworldly locations.
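
As a rough illustration of that idea (and not how any particular studio pipeline works), the sketch below marks pixels where the green channel clearly dominates red and blue, then swaps in pixels from a background image. The threshold and the tiny demo arrays are invented for this example; real keyers work in more suitable color spaces and soften the matte's edges.

```python
# Minimal chroma-key sketch with NumPy: wherever the green channel clearly
# dominates, swap in the background pixel. The threshold is illustrative.
import numpy as np

def chroma_key(foreground, background, dominance=40):
    """foreground/background: uint8 arrays of shape (H, W, 3) in RGB order."""
    fg = foreground.astype(np.int16)
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # A pixel counts as "green screen" if green exceeds both red and blue
    # by the dominance margin (in 0-255 units).
    is_green = (g - r > dominance) & (g - b > dominance)
    out = foreground.copy()
    out[is_green] = background[is_green]
    return out

# Tiny demo: a 2x2 "shot" where the top row is pure green screen.
fg = np.array([[[0, 255, 0], [10, 250, 20]],
               [[200, 180, 170], [90, 60, 50]]], dtype=np.uint8)
bg = np.full((2, 2, 3), 30, dtype=np.uint8)   # dark replacement backdrop
print(chroma_key(fg, bg))
```

The same logic explains the green-clothing pitfall mentioned above: any foreground pixel that satisfies the "green enough" test gets replaced along with the screen.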

 

4. The Color Green May Have Killed Napoleon Bonaparte


In 1775, German-Swedish chemist Carl Wilhelm Scheele made a green-hued pigment that eventually bore his name: Scheele’s green. Unfortunately, the pigment was extremely dangerous, since it was made with arsenic. However, its rich hue ignited a craze for green, and the pigment was used in wallpaper, clothing, and even children’s toys. In fact, some historians believe that Napoleon Bonaparte died from the Scheele’s green pigment embedded in the wallpaper of his bedroom on the island of St. Helena. That wasn’t the end of green’s deadly reputation, however. Decades later, impressionist painters such as Paul Cézanne and Claude Monet used a green pigment called Paris green that was highly toxic, if less dangerous than Scheele’s green. Experts suggest that the chemical could have contributed to Cézanne’s diabetes and Monet’s blindness.

 

5. No One Is Sure Why the Backstage Room Is Called a “Green Room”


One early reference to a “green room” in the sense of a waiting room appears in The Diary of Samuel Pepys, the famed journal kept by a civil servant in 1660s London. Pepys mentions a “green room” when going to meet the royal family — likely a reference to the color of the walls. A “green room” was then tied to the theater in English playwright Thomas Shadwell’s 1678 comedy A True Widow, which includes the line: “Selfish, this Evening, in a green Room, behind the Scenes.” However, Shadwell doesn’t mention why it was called a green room. One notable London theater did have a dressing room covered in green fabric, but other theories behind the term reference actors going “green” because of nervousness, amateur or young (aka “green”) actors, or a place where early actors literally waited “on the green” lawns of outdoor theaters — among many other ideas. It’s possible we’ll never know the origin of the phrase for sure.
 

 

Source: Fascinating Facts About the Color Green


Fact of the Day - CELEBRITY INVENTORS


Did you know... Not all inventors fit the image of the white-haired, bespectacled eccentric scribbling out notes while surrounded by beeping machines and steaming beakers. Some are gorgeous actors or gifted musicians who achieve fame and fortune in their chosen fields, yet still are motivated to fulfill a need or solve a problem afflicting the public. Here are six such celebrities who found the time between photo shoots, interviews, and the demands of their day jobs to follow their personal passions to the patent office.

 

1. Marlon Brando


He may not have originated the "method acting" technique, but Marlon Brando was an innovator when it came to his enthusiasm for drumming. Late in life, the Oscar winner devoted his energy to developing a conga drum that could be tuned by way of a single lever at the bottom, as opposed to the usual five or six bolts along the top. Although he received four patents prior to his death in 2004, Brando likely needed to put in more work to make his creation a reality; one drum manufacturer interviewed for a 2011 NPR article indicated that the actor's design was practical, but not cost-effective enough for production.

 

2. Hedy Lamarr


During her Hollywood heyday, Hedy Lamarr was known as "the most beautiful woman in the world," a designation that ignored the impressive brain power behind those green eyes. Determined to aid the Allied cause during World War II, Lamarr teamed with composer George Antheil to devise a radio transmission technique that defied enemy disruption efforts by randomly jumping to different frequencies. Although it was initially dismissed by the U.S. Navy, the secret communication system is now recognized as a precursor to the wireless technology that fills our everyday lives. Lamarr also dabbled in more mundane creations, like an improved stoplight and dog collar, and was inducted into the National Inventors Hall of Fame in 2014.
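
The core idea behind the patent, hopping among frequencies in a sequence known only to sender and receiver, can be sketched in a few lines of code. The channel list, shared seed, and one-character-per-hop framing below are invented purely for illustration and are not drawn from the 1942 patent (which synchronized the hops with a player-piano-roll mechanism).

```python
# Toy frequency-hopping sketch: transmitter and receiver derive the same
# pseudorandom channel sequence from a shared secret seed, so an eavesdropper
# who doesn't know the seed can't predict (or easily jam) the next channel.
# The channel list and message framing here are invented for illustration.
import random

CHANNELS_MHZ = [88.1, 89.5, 91.3, 94.7, 97.9, 101.1, 103.3, 105.5]

def hop_sequence(shared_seed: int, hops: int) -> list[float]:
    """Generate the agreed-upon channel for each time slot."""
    rng = random.Random(shared_seed)
    return [rng.choice(CHANNELS_MHZ) for _ in range(hops)]

def transmit(message: str, shared_seed: int) -> list[tuple[float, str]]:
    """Send one character per time slot on the next channel in the sequence."""
    sequence = hop_sequence(shared_seed, len(message))
    return list(zip(sequence, message))

def receive(bursts: list[tuple[float, str]], shared_seed: int) -> str:
    """Reassemble the message by listening on the same channel sequence."""
    sequence = hop_sequence(shared_seed, len(bursts))
    return "".join(char for expected, (channel, char) in zip(sequence, bursts)
                   if channel == expected)

if __name__ == "__main__":
    seed = 1942  # the shared secret both ends agree on in advance
    bursts = transmit("TORPEDO", seed)
    print(receive(bursts, seed))        # correct seed -> "TORPEDO"
    print(receive(bursts, seed + 1))    # wrong seed -> most characters lost
```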

 

3. Eddie Van Halen


While he is rightly celebrated for dazzling solos for his namesake band, Eddie Van Halen was also a craftsman who constantly sought out ways to improve the guitar-playing experience for himself and others. In 1987, the rocker patented his musical instrument support, a plate that props up a guitar against the player's body and frees the hands to "explore the musical instrument as never before." Van Halen also acquired patents for a tension adjustment mechanism for stringed instruments, the design and implementation of a noise-canceling humbucking pickup, and a guitar peghead.

 

4. Zeppo Marx


The youngest of the Marx Brothers, Herbert "Zeppo" Marx was largely overshadowed as the straight man of the comedic quartet, but he later came into his own as an agent, businessman, and health-minded inventor. His first patent was for a vapor delivery pad for distributing moist heat, intended to replace the inefficient method of dipping towels in hot water to apply to achy body parts. The erstwhile entertainer later received multiple patents related to cardiac monitoring applications, one of which made headlines as the pulse-tracking "heart wristwatch."

 

5. Jamie Lee Curtis


While she'd already achieved stardom by way of roles in films such as Halloween (1978) and Trading Places (1983), Jamie Lee Curtis showed she was just as burdened as the next parent when she patented a new and improved diaper in the late 1980s. The solution was a simple one, as her infant garment came with a front pocket for wipes to eliminate the need to hunt down both items during stressful moments. Although she let the patent expire because of concerns over the product's biodegradability, Curtis continued her pursuit of the perfect diaper with another patent in 2017, this time including a plastic bag to make disposal even more tidy.

 

6. Bill Nye


Best known as the “Science Guy” from his popular 1990s PBS show, Bill Nye has engaged in a wide-ranging career that includes stints as a mechanical engineer, a stand-up comic, and, yes, an inventor. Befitting his brainy reputation, Nye designed a noise-and-vibration-reducing device called a hydraulic pressure resonance suppressor for use on the Boeing 747 jumbo jet, and he later received patents for his educational lens and digital abacus. More surprising are his patents for a throwing technique trainer, to help budding baseball players, and his toe shoe, to provide additional support for the grueling regimen of a ballerina.

 

 

Source: Celebrities Who Doubled as Inventors


Fact of the Day - NAMED AFTER INTERNATIONAL CITIES


Did you know.... Place names in the United States have a variety of influences. Many spots retain their Indigenous names, while some are derived from a European language; others are twists on cities that already existed. (Here’s looking at you, New York). If you’ve ever driven along the interstate and seen a sign directing you to a European capital or ancient civilization, that’s because a handful of American cities enthusiastically adopted the name of another place. Here are the stories behind eight of the most interesting cities in the U.S. that are named after other cities abroad.

 

1. Melbourne, Florida


With palm trees and blue seas all around, you might be forgiven for mistaking this Florida city for its Australian namesake. But it’s not the geographical similarities that led to this city being named after the Melbourne Down Under. The area around the Indian River Lagoon began rapidly developing in the late 19th century, and when a post office became necessary to serve the community, the settlement needed a name. The area’s inhabitants found inspiration in the town’s first postmaster, Cornthwaite John Hector, who had spent many of his formative years in Melbourne, Australia. (However, it wasn’t Hector who proposed calling the settlement Melbourne; a local woman suggested it.) On December 23, 1888, straws were drawn to select a new name — and Melbourne won. Today, this harbor city retains many historic Victorian wooden houses that wouldn’t look out of place in Melbourne, Australia.

 

2. Athens, Georgia

Like its European counterpart, Athens, Georgia, is a center of academia, culture, and the arts. The University of Georgia, the first state-chartered public university in the U.S., was founded here in 1785. After classes began in 1801, a burgeoning city sprang up around campus. In 1806, the city was incorporated, and the Georgian governor at the time, John Milledge, suggested the name Athens, as the Greek capital was home to Europe’s earliest intellectuals, including revered philosophers Plato and Aristotle. Today, the 19th-century Greek Revival buildings in the city center, including the Taylor Grady House, remind visitors of the Parthenon everywhere they look. And, like its Greek counterpart, Athens, Georgia, enjoys a thriving student scene: you can’t miss the music of up-and-coming indie bands emanating from the campus town’s trendy bars and restaurants.

 

3. Paris, Texas


A railroad boom town on the northern edge of the Lone Star State, Paris, Texas, is arguably the most famous of all America’s towns named after the French capital; indeed, locals call it “the second-largest Paris in the world.” But no one is exactly sure why the two cities share a name. Popular belief is that an employee of the town’s founder, George Washington Wright, came up with the idea when the town was incorporated in 1844. The employee, Thomas Poteet, lobbied to call the new town Paris in honor of his French ancestors. But other theories abound, from a local girl winning naming rights in a beauty pageant and choosing Paris, to a group of bored men simply plucking the name out of thin air. Whatever the origins of its name, Paris, Texas, gained fame thanks to the 1984 road movie of the same name. Visit today, and you’ll find old-school trolleys taking tourists around town, particularly to see the 65-foot-high replica Eiffel Tower topped with a red Texan hat. (Just in case you weren’t sure which Paris you were in.)

 

4. Memphis, Tennessee

With its neon lights and blues music on every corner, Memphis, Tennessee, feels a world away from its ancient Egyptian namesake. The Tennessee city was built thousands of years after Memphis, Egypt, was abandoned, but both cities have something in common: They were constructed alongside great rivers. It’s not certain why the three men (including future President Andrew Jackson) who founded the Tennessee city named it after the one in Egypt, but perhaps they felt the Mississippi River evoked the spirit of the Nile and the prosperous trading and temple city built on its banks. Modern Memphis is known throughout the world for its music scene. Elvis Presley built Graceland on the city’s outskirts, and Sun Studios, the birthplace of rock ‘n’ roll, is steps away from Beale Street. While there aren’t many obvious similarities with the ruined city on the banks of the Nile, there is one big homage to its Egyptian connection: the Memphis Pyramid.

 

5. Boston, Massachusetts


As one of the first cities built by English settlers in the U.S., it’s no surprise that Boston’s name comes from a town in England. Established in 1630, just a decade after the Plymouth colony (a name itself taken from the Devon port from which the settlers had departed) was founded, this Massachusetts city was named for Boston, Lincolnshire. Many of the colony’s most prominent early citizens, including the governor and his deputy, hailed from the English market town, which was a hotbed of religious nonconformism at the time. It’s estimated that about 250 people left Boston, England, for the shores of the New World in the 1630s (a significant portion of its population). The Puritans of early Boston named the city after their English home, as they hoped to set a shining example of how life could be for their former homeland. It quickly flourished, and trade with Europe soon brought the Massachusetts city wealth and a growing population that now far eclipses that of its namesake.

 

6. Portland, Maine

Maine’s largest city wasn’t always named Portland. It was originally called Casco, a name either derived from the Abenaki Indigenous word aucocisco, meaning “a place of herons,” or the Spanish word for “helmet.” Later, English settlers called their city on the peninsula Falmouth, after a port town in England. However, Falmouth was destroyed during the Revolutionary War, and its survivors built a new city in its place in 1786, naming it Portland after a peninsula on the Jurassic Coast of England. The resemblance between the American town and its English namesake is striking. Both places sit on rocky peninsulas, and both have iconic lighthouses looking out over the Atlantic. Portland, Maine, is also the closest transatlantic port in the U.S. to Europe, so its ties with the continent are strong.

 

7. Toronto, Ohio


Some American cities didn’t need to look too far abroad to find a suitable name. One such example is the city of Toronto, located on the shores of the Ohio River. A respected businessman from Toronto, Ontario, W. F. Dunsbaugh was working in this part of Ohio in the 1880s when the city was named in his honor. Whether it was his own suggestion or the citizens came up with the idea is unclear, but it certainly seemed that Toronto, Canada, was “a place worth emulating,” according to locals at the time. Toronto, Ohio, is perhaps better known by the name on its welcome sign: Gem City. That’s not because of an abundance of precious stones, but rather the many riverboat captains who stopped here to pick up supplies and were so impressed with the variety of wares available that they called it “a gem of a place.”

 

8. Berlin, Connecticut

This charming town just outside of Hartford was originally known as Pagonchawnischage (“the great white oak place”) by the area’s Mattabasset Indigenous peoples and later, bizarrely enough, as the Great Swamp Society by the first ecclesiastical group in the area. When it was incorporated in 1785, the area was renamed Berlin, after the capital of Prussia (now the capital of Germany). Although the town has a German name, it takes its Connecticut home to heart. (Quite literally, as it’s located at the geographic center of the state.) Berlin also proudly proclaims to be the “home of the Yankee peddler,” the traveling salesmen who sold mid-19th-century Americans everything from nutmeg to hardware. When you’re done admiring the area’s history, explore the verdant woodlands surrounding town or check out neighboring New Britain, another place that searched for identity abroad.

 

 

Source: The Stories Behind 8 U.S. Places Named After International Cities

