Channel: History
Viewing all 1838 articles

Nike's incredible road to becoming the world's dominant sneaker retailer


Why so many Americans think they're part Cherokee



"I cannot say when I first heard of my Indian blood, but as a boy I heard it spoken of in a general way," Charles Phelps, a resident of Winston-Salem in North Carolina, told a federal census taker near the beginning of the 20th century.

Like many Americans at the time, Phelps had a vague understanding of his Native American ancestry. On one point, however, his memory seemed curiously specific: His Indian identity was a product of his "Cherokee blood."

The tradition of claiming a Cherokee ancestor continues into the present.

Today more Americans claim descent from at least one Cherokee ancestor than any other Native American group. Across the United States, Americans tell and retell stories of long-lost Cherokee ancestors.

These tales of family genealogies become murkier with each passing generation, but as with Phelps, contemporary Americans profess their belief despite not being able to point directly to a Cherokee in their family tree.

Recent demographic data reveals the extent to which Americans believe they're part Cherokee. In 2000, the federal census reported that 729,533 Americans self-identified as Cherokee.

By 2010, that number increased, with the Census Bureau reporting that 819,105 Americans claimed at least one Cherokee ancestor. Census data also indicates that the vast majority of people self-identifying as Cherokee — almost 70% of respondents — claim they are mixed-race Cherokees.

Why do so many Americans claim to possess "Cherokee blood"? The answer requires us to peel back the layers of Cherokee history and tradition.


Most scholars agree that the Cherokees, an Iroquoian-speaking people, have lived in what is today the Southeastern United States — Virginia, West Virginia, Kentucky, North and South Carolina, Georgia, and Alabama — since at least A.D. 1000. When Europeans first encountered the Cherokees in the mid-16th century, Cherokee people had well-established social and cultural traditions.

Cherokee people lived in small towns and belonged to one of seven matrilineal clans. Cherokee women enjoyed great political and social power in Cherokee society. Not only did a child inherit the clan identity of his or her mother, but women also oversaw the adoption of captives and other outsiders into the responsibilities of clan membership.

As European colonialism engulfed Cherokee Country during the 17th and 18th centuries, however, Cherokees began altering their social and cultural traditions to better meet the challenges of their times. One important tradition that adapted to new realities was marriage.

The Cherokee tradition of exogamous marriage, or marrying outside one's clan, evolved during the 17th and 18th centuries as Cherokees encountered Europeans on a more frequent basis. Some sought to solidify alliances with Europeans through intermarriage.

It is impossible to know the exact number of Cherokees who married Europeans during this period. But we know that Cherokees viewed intermarriage as both a diplomatic tool and as a means of incorporating Europeans into the reciprocal bonds of kinship.


Eighteenth-century British traders often sought out Cherokee wives. For the trader, the marriage opened up new markets, with his Cherokee wife providing both companionship and access to trade items such as the deerskins coveted by Europeans. For Cherokees, intermarriage made it possible to secure reliable flows of European goods, such as metal and iron tools, guns, and clothing.

The frequency with which the British reported interracial marriages among the Cherokees testifies to the sexual autonomy and political influence that Cherokee women enjoyed. It also gave rise to a mixed-race Cherokee population that appears to have been far larger than the racially mixed populations of neighboring tribes.

Europeans were not the only group of outsiders with which 18th-century Cherokees intermingled. By the early 19th century, a small group of wealthy Cherokees adopted racial slavery, acquiring black slaves from American slave markets. A bit more than 7% of Cherokee families owned slaves by the mid-1830s; a small number, but enough to give rise to a now pervasive idea in black culture: descent from a Cherokee ancestor.

In the early-20th century, the descendants of Cherokee slaves related stories of how their black forebears accompanied Cherokees on the forced removals of the 1830s.

They also recalled tales of how African and Cherokee people created interracial families. These stories have persisted into the 21st century. The former NFL running back Emmitt Smith believed that he had "Cherokee blood." After submitting a DNA test as part of his 2010 appearance on NBC's "Who Do You Think You Are," he learned he was mistaken.


Among black Americans, as among Americans as a whole, the belief in Cherokee ancestry is more common than actual blood ties.

Slaves owned by Cherokees did join their owners when the federal government forced some 17,000 Cherokees from their Southeastern homeland at the end of the 1830s. Cherokee people and their slaves endured that forced journey into the West by riverboats and overland paths, joining tens of thousands of previously displaced Native peoples from the eastern United States in Indian Territory (modern-day eastern Oklahoma). We now refer to this inglorious event as the Trail of Tears.

But the Cherokee people did not remain confined to the lands that the federal government assigned to them in Indian Territory. During the late 19th and early 20th centuries, Cherokees traveled between Indian Territory and North Carolina to visit family and friends, and Cherokee people migrated and resettled throughout North America in search of social and economic opportunities.

While many Native American groups traveled throughout the United States during this period in search of employment, the Cherokee people's advanced levels of education and literacy — a product of the Cherokee Nation's public-education system in Indian Territory and the willingness of diaspora Cherokees to enroll their children in formal educational institutions — meant they traveled on a scale far larger than any other indigenous group.

In these travels it's possible to glimpse Cherokees coming into contact with, living next door to, or intermarrying with white and black Americans from all walks of life.


At the same time that the Cherokee diaspora was expanding across the country, the federal government began adopting a system of "blood quantum" to determine Native American identity. Native Americans were required to prove their Cherokee, or Navajo, or Sioux "blood" to be recognized. (The racially based system of identification also excluded individuals with "one drop" of "Negro blood.")

The federal government's blood-quantum standards varied over time, helping to explain why recorded Cherokee blood quantum ranged from full-blood to one-2,048th. The system's larger aim was to determine who was eligible for land allotments following the government's decision to terminate Native American self-government at the end of the 19th century.

By 1934, the year that Franklin Roosevelt's administration adopted the Indian Reorganization Act, blood quantum became the official measure by which the federal government determined Native American identity.

In the ensuing decades, Cherokees, like other Native American groups, sought to define "blood" on their own terms. By the mid-20th century, Cherokee and other American Indian activists began joining together to articulate their definitions of American Indian identity and to confront those tens of thousands of Americans who laid claim to being descendants of Native Americans.

Groups such as the National Congress of American Indians worked toward the self-determination of American Indian nations and also tackled the problem of false claims to membership. According to the work of Vine Deloria, one of NCAI's leading intellectuals, "Cherokee was the most popular tribe" in America. "From Maine to Washington State," Deloria recalled, white Americans insisted they were descended from Cherokee ancestors.


More often than not, that ancestor was an "Indian princess," despite the fact that the tribe never had a social system with anything resembling an inherited title like princess.

So why have so many Americans laid claim to a clearly fictional identity? Part of the answer is embedded in the tribe's history: its willingness to incorporate outsiders into kinship systems and its wide-ranging migrations throughout North America. But there's another explanation, too.

The Cherokees resisted state and federal efforts to remove them from their Southeastern homelands during the 1820s and 1830s. During that time, most whites saw them as an inconvenient nuisance, an obstacle to colonial expansion. But after their removal, the tribe came to be viewed more romantically, especially in the antebellum South, where their determination to maintain their rights of self-government against the federal government took on new meaning.

Throughout the South in the 1840s and 1850s, large numbers of whites began claiming they were descended from a Cherokee great-grandmother. That great-grandmother was often a "princess," a not-inconsequential detail in a region obsessed with social status and suspicious of outsiders. By claiming a royal Cherokee ancestor, white Southerners were legitimating the antiquity of their native-born status as sons or daughters of the South, as well as establishing their determination to defend their rights against an aggressive federal government, as they imagined the Cherokees had done.

These may have been self-serving historical delusions, but they have proven to be enduring.


The continuing popularity of claiming "Cherokee blood" and the ease with which millions of Americans inhabit a Cherokee identity speak volumes about the enduring legacy of American colonialism. Shifting one's identity to claim ownership of an imagined Cherokee past is at once a way to authenticate one's American-ness and to absolve oneself of complicity in the crimes Americans committed against the tribe across history.

That said, the visibility of Cherokee identity also owes much to the success of the three federally recognized Cherokee tribes. Today, the Cherokee Nation, the United Keetoowah Band of Cherokee Indians, and the Eastern Band of Cherokees make up a combined population of 344,700.

Cherokee tribal governments provide community members with health services, education, and housing assistance; they have even teamed up with companies such as Google and Apple to produce Cherokee-language apps. Most Cherokees live in close-knit communities in eastern Oklahoma or the Great Smoky Mountains in North Carolina, but a considerable number live throughout North America and in cities such as New York, Chicago, San Francisco, and Toronto.

Cherokee people are doctors and lawyers, schoolteachers and academics, tradespeople and minimum-wage workers. The cultural richness, political visibility, and socioeconomic diversity of the Cherokee people have played a considerable role in keeping the tribe's identity in the historical consciousness of generation after generation of Americans, whether or not they have Cherokee blood.

SEE ALSO: Hurricane Joaquin is moving slowly — here's what that means for the East Coast


NOW WATCH: Animated map shows the history of immigration to the US

6 of the craziest bayonet charges in military history



Bayonet fighting is a lost art to many, but it has served as a tried-and-true tactic ever since the first riflemen realized they could turn to a blade if they still needed to kill something when their ammunition ran out.

Here are 6 times America and its allies decided to press cold steel into their enemies' chests, including two charges from the Global War on Terror.

1. Two National Guard battalions shove an entire Chinese division off a hill with their bayonets.


While attempting to take two hilltops to the south of Seoul, South Korea, in early 1951, the 65th Infantry Regiment of the 3rd Infantry Division fought for two days up a Chinese-held hill. On the morning of the third day, the crest of the hill was in sight, and the Puerto Rican fighters decided that it was time they were atop it.

So, two battalions fixed bayonets and charged against a Chinese division. In the resulting clash, the unit was credited with killing 5,905 enemy soldiers and capturing 2,086.

2. The 20th Maine holds Little Round Top at Gettysburg with a downhill bayonet charge.

In one of the most famous counterattacks in American history, the 20th Maine under Union Col. Joshua L. Chamberlain found itself running out of ammunition on Little Round Top, an important hill at the Battle of Gettysburg.

Chamberlain and his 386 men, including 120 mutineers added to the regiment just before the battle, charged down the hill and defeated two Confederate regiments. Chamberlain himself was nearly killed multiple times during the charge.

3. Marines take Peleliu Airfield with a daring bayonet charge across open ground.

The 1st Marine Division was attempting to take the Japanese-held Peleliu Airfield on Sept. 16, 1944. When they realized they weren't making enough progress with rifle fire alone, they lined up four battalions and charged across the open ground with fixed bayonets. While they took heavy losses, they reached the enemy, engaged at close quarters, and took the airfield.

4. Revolutionary War Gen. “Mad” Anthony Wayne orders a daring charge and threatens to kill any soldiers who fire.


To retake a position at Stony Point, New York, Gen. “Mad” Anthony Wayne ordered his outnumbered and outgunned men not to fire, on pain of death.

The Americans crept up to the British defenders at night and charged through the lines with fixed bayonets and sabers. When it was all over, the Americans had retaken Stony Point with 15 men killed and 85 wounded while the British suffered 63 dead, 70 wounded, and 442 captured.

5. The British dismount their heavily-armed vehicles in Iraq to attack insurgents with their bayonets.

A group of British soldiers from the Prince of Wales’ Royal Regiment was ambushed by fighters from Muqtada al-Sadr’s forces on May 14, 2004.

The enemy was firing from an actual trench, so Company Sgt. Maj. David Falconer ordered his men to fix bayonets and enter the trenches. The British charged across open ground and dropped into the trenches. With bayonets and rifles, the men fought for the next four hours, killing about 30 enemy soldiers with no major casualties before a British tank arrived and ended the battle. Falconer and another soldier were awarded the British Military Cross.

6. Capt. Lewis Millett orders two bayonet charges in 4 days during the Korean War.


On Feb. 4, 1951, then-Capt. Lewis Millett was leading a bayonet charge up an occupied hill in Korea when one of his platoon leaders went down. Millett organized a rescue effort with bayonets while under fire and finished taking the hill.

Then, only three days later, he was leading an attack up Hill 180 when one of his platoons was pinned down by enemy fire. Millett took another platoon up to rescue them, ordered both platoons to fix bayonets, and led a charge up the hill and captured it. He’s personally credited with bayonetting at least two men in the assault while clubbing others and throwing grenades.

SEE ALSO: The US military took these incredible photos this week


NOW WATCH: China has been upgrading its military and is now stronger than ever

The true story behind Google's hilarious first name: BackRub (GOOG, GOOGL)


Google cofounders Larry Page and Sergey Brin are definitely fans of wordplay, and they seem to have a thing for company names that are both goofy and significant at the same time.

A perfect example of this is when Google rolled out its new operating structure, Alphabet, earlier this summer. Page explained the name in an exclamation-laden blog post:

"We liked the name Alphabet because it means a collection of letters that represent language, one of humanity's most important innovations, and is the core of how we index with Google search! We also like that it means alpha‑bet (Alpha is investment return above benchmark), which we strive for!"

But this certainly wasn’t the first time the duo had experimented with language. Back in 1996, before Google even existed as an entity, Page and Brin were already making up nerdy names for search engines.

According to Stanford’s David Koller, and Google’s own website, Page and Brin’s 1996 foray into the world of search engines was initially called “BackRub.”

Yes, BackRub.

They called it this because the program analyzed the web’s “back links” to understand how important a website was, and what other sites it related to. BackRub operated on Stanford’s servers until it eventually took up too much bandwidth.
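
BackRub's link analysis eventually grew into what became PageRank. As a loose, hypothetical illustration of the basic idea (simply counting how many "back links" point at each page), and not Google's actual algorithm, a toy Python sketch might look like this:

    # Toy illustration only: rank pages by how many other pages link to them.
    # The page names and link data below are made up for this example.
    from collections import Counter

    outgoing_links = {
        "a.example": ["b.example", "c.example"],
        "b.example": ["c.example"],
        "c.example": ["a.example"],
    }

    # Count "back links": how many pages point at each target page.
    backlinks = Counter(target for targets in outgoing_links.values() for target in targets)

    print(backlinks.most_common())  # c.example has the most back links, so it ranks first here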

But by 1997, Page seems to have decided that the BackRub name just wasn’t good enough. According to Koller, Page and his officemates at Stanford began to workshop different names for the search engine technology, names that would evoke just how much data they were indexing.

The name “Google” actually came from a graduate student at Stanford named Sean Anderson, Koller writes. Anderson suggested the word “googolplex” during a brainstorming session, and Page countered with the shorter “googol.” Googol is the digit 1 followed by 100 zeroes, while googolplex is 1 followed by a googol zeros.   
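
For a sense of the scale behind the name, here is a minimal Python check (not from the article) of that definition:

    # A googol is the digit 1 followed by 100 zeros.
    googol = 10 ** 100

    assert str(googol) == "1" + "0" * 100  # 1 followed by exactly 100 zeros
    # A googolplex, 10 ** googol, is 1 followed by a googol zeros -- far too large to ever write out.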

Anderson checked to see if that domain name was taken, but accidentally searched for “google.com” instead of “googol.com.” Page liked that name even better, and registered the domain name for Brin and himself on September 15, 1997.

From BackRub to Google to Alphabet — makes you wonder what's next.


NOW WATCH: Google's self-driving car has a huge problem

The secrets of Wall Street: slavery, terrorism, and hidden vaults

Al Capone: the life and death of the infamous gangster known as 'Scarface'

A relic of medieval history explains why glasses make people look smart



Today, more than half the US population wears glasses. 

But just about everyone, including psychologists, agrees that four-eyed dweebs look smarter and more qualified for jobs than people who don't wear glasses. 

There's a historical explanation for the stereotype: Glasses signaled that you needed your eyesight more than other people.

For several hundred years after they were invented in 1296, glasses were reviled because they revealed a key weakness in the wearer's biology, says Neil Handley, curator of the British Optical Association Museum at the College of Optometrists.

Back in the day, if you had glasses it was because you were counted among a select few whose jobs required the ability to see fine details.

If you worked in a field or a factory, glasses did nothing for you. But if you were a doctor, banker, teacher, or worker in one of those new-fangled offices that came on the scene in the 1700s, those fine details likely made up the bulk of your work.

You can see glasses being equated with smarts all the way back in the 17th century.

One piece in particular — a portrait of a Venetian man completed sometime around 1610-1620 — is believed to be one of the earliest commissioned portraits to feature spectacles, Handley explained in a 2012 lecture.

We don't know who he is, but he "would have been known at the time," and thus could have decided not to pose with his glasses.

"Gone is the fear of what the eyewear might negatively imply," Handley says. "His only fear seems to be that the glasses might fall off and his hands are outstretched as if to catch them. To this man the spectacles might perhaps signify intelligence, literacy, and social standing."

That significance stuck around for the next couple hundred years.

But then things changed.

As Kerry Segrave explains in "Vision Aids in America: A Social History of Eyewear and Sight Correction Since 1900," eyesight became formally important in 1908, when the first legislation emerged requiring states to offer optometry services. By the mid-1950s, eyeglasses advertisements were no longer selling to "customers," but "patients." 

Glasses had become a way for the masses to correct their eyesight, rather than a tool to make high-minded work easier.

Not that social norms, moving at the glacial pace they do, ever caught up. People still think glasses make you look smarter because an old truth gradually shed its accuracy and left only the husk — in the form of a favorable stereotype — behind.

Since then, research has found that even the kind of glasses matters: Thick, blocky frames make you look smarter than thin ones.

So go hipster, and look smart.



NOW WATCH: Why airplanes still have ashtrays in the bathroom

Meet the world's deadliest female sniper who terrorized Hitler's Nazi army



In early 1941, Lyudmila Pavlichenko was studying history at Kiev University, but within a year, she had become one of the best snipers of all time, credited with 309 confirmed kills, 36 of which were German snipers.

Pavlichenko was born in 1916 in a small town in Ukraine.

She was described as an independent, opinionated tomboy who was "unruly in the classroom," as the Smithsonian notes.

When Pavlichenko was 14, her family relocated to Kiev, where she worked as a metal grinder in a munitions factory.

Like many young people in the Soviet Union at that time, Pavlichenko participated in OSOAVIAKhIM, a paramilitary sporting organization which taught youths weapons skills and etiquette. 

“When a neighbor’s boy boasted of his exploits at a shooting range,” Pavlichenko said, according to the Smithsonian, “I set out to show that a girl could do as well. So I practiced a lot.”

On June 22, 1941, Hitler broke his pact with Joseph Stalin and German troops poured into the Soviet Union. Pavlichenko rushed to join the Soviet army and defend her homeland, but she was initially turned away because of her gender.

"She looked like a model, with well-manicured nails, fashionable clothes, and hairstyle. Pavlichenko told the recruiter that she wanted to carry a rifle and fight. The man just laughed and asked her if she knew anything about rifles,"Soviet-Awards.com wrote of Pavlichenko's effort to join the military.

Even after Pavlichenko presented her marksman certificate and a sharpshooter badge from OSOAVIAKhIM, officials still urged her to work as a nurse.

“They wouldn’t take girls in the army, so I had to resort to all kinds of tricks to get in,” explained Pavlichenko.

Eventually, the Red Army gave her an "audition," handing her a rifle and pointing out two Romanians downrange who were working with the Germans. She shot both soldiers with ease and was then accepted into the Red Army’s 25th Chapayev Rifle Division.

Pavlichenko then shipped out to the battle lines near Odessa and in Moldova. In very little time she distinguished herself as a fearsome sniper, killing 187 Germans in her first 75 days at war.

Snipers in these battles fought between the enemy lines, often far from their companies. It was extremely dangerous, painstaking work, as she had to sit perfectly still for hours on end to avoid detection by enemy snipers. After making a name for herself in Odessa and Moldova, Pavlichenko was moved to Crimea to fight in the battle of Sevastopol.

Her reputation earned her ever more dangerous assignments; eventually she faced off one-on-one with enemy snipers. The Smithsonian reports that she dueled and killed 36 enemy snipers, some of whom were highly decorated themselves.

“That was one of the tensest experiences of my life,” Pavlichenko reportedly said.

She spent eight months fighting in Sevastopol, where she earned praise from the Red Army and was promoted. On several occasions she was wounded, but she was only removed from battle after taking shrapnel to the face when her position was bombed by Germans desperate to stop her mounting kill count.

She had become a well-known figure in the war, both as a protagonist in the Red Army's domestic propaganda and as the scourge of German soldiers all over the Eastern Front. The Germans even went so far as to address her over loudspeakers, offering her comfort and candy should she defect and join their ranks.

Pavlichenko became a sniper instructor and was soon invited to the White House.

She became the first Soviet soldier to visit the White House, where she met with President Franklin Roosevelt and first lady Eleanor Roosevelt.

Pavlichenko became angry at the US media for the blatantly sexist way they questioned her about the war. Her look and dress were criticized. When she was asked whether she wore makeup to battle, she responded, “There is no rule against it, but who has time to think of her shiny nose when a battle is going on?”

“I wear my uniform with honor. It has the Order of Lenin on it. It has been covered with blood in battle. It is plain to see that with American women what is important is whether they wear silk underwear under their uniforms. What the uniform stands for, they have yet to learn,” she told Time Magazine in 1942.

Pavlichenko was one of 2,000 female snipers who fought for the Red Army in World War II, and one of the 500 who survived.

Her score of 309 kills likely places her within the top five snipers of all time, but her actual total was probably much higher, as a confirmed kill had to be witnessed by a third party.

After the war, Pavlichenko went back to finish her master's degree at Kiev University.

In April of this year, Pavlichenko's story was immortalized in a film called "Battle for Sevastopol" in Russia and "Indestructible" in Ukraine.

The film was shot during the 2013 EuroMaidan protests in Ukraine and financed by both Russian and Ukrainian backers at the start of a conflict that would become bloody and divisive. Even so, the film stands as a testament to the outstanding career of Pavlichenko, a hero shared by both countries.


SEE ALSO: The incredible story of the man who volunteered to enter Auschwitz and exposed the horrors of the Holocaust



Many of the world's first cars ran on electricity



Today, cars that run on gas are the norm. But when the automobile was an emerging technology in the early 1900s, Americans bought more electric vehicles than gas cars.

According to Elon Musk's interview with Wait But Why's Tim Urban, 38% of the cars available at the turn of the century were electric. Steam powered another 40% of autos, while cars that ran on gas made up just 22% of the total.

Steam power was the oldest and best-understood technology at the time, which is why steam cars accounted for the largest share of the available automobiles.

Electricity was the newest and hottest technology.

In the years leading up to the 1900s, electricity made the unthinkable possible, including telephones, light bulbs, movies, and the radio. Some of the inventors who made these possible, including Edison and Tesla, were also pushing to invent viable electric cars.

"If someone in the year 1900 had to bet on the outcome of the battle between external steam combustion, internal gasoline combustion, and electricity as the future standard for powering cars, they'd have probably put their money on electricity," Urban writes.

The New York Times also referred to the electric car as the ideal out of the three — it was quieter, cleaner, and cheaper.

But in 1908, Henry Ford's Model T came along. Quickly and cheaply made, the Model T transformed how the world saw the automobile. According to Urban, electric cars had been considered ideal by many, but Ford figured out how to make gas cars faster, sturdier, and most importantly, cheaper. In other words, Ford figured out how to improve gas cars faster than anyone developing electric cars at the time, and those gas-powered autos started to devour the market.

By 1914, 99% of American cars ran on gas. Electric car manufacturers couldn't catch up to Ford's profitable business model, and by 1920, electric cars "dropped entirely out of commercial production."

The rest is history, though a period in history that Musk thinks must end soon.

Read the rest of the Elon Musk interview with Wait But Why.


NOW WATCH: The biggest science mistakes in 'The Martian'

This declassified video shows the US military testing biological warfare — on the US



From 1949 to 1969, the US government carried out at least 239 tests on unsuspecting US civilians that were meant to simulate biological weapon attacks.

Officials back then used what they believed were harmless "simulants" of actual bioweapons. But Leonard Cole, the author of the investigative book "Clouds of Secrecy: The Army's Germ Warfare Tests Over Populated Areas," which documented the tests, tells Tech Insider that these supposedly harmless germs are "all considered pathogens now."

A newly declassified 1952 Department of Defense film, released on Sept. 30, 2015 in response to a FOIA request by the site Government Attic, shows the enthusiasm with which the DoD viewed those tests at the time.

In both tone and content, the film is hard to believe, especially viewed from today's post-Cold War perspective.

Government Attic posted a copy of the video on October 12, and Tech Insider verified its authenticity by comparing it to a version provided by an official at the US National Archives.

We've broken the film down into GIFs for easy viewing (pardon the quality), but the full version, which we've uploaded to YouTube, is at the end of the post.

The film details the US capabilities for using biological weapons at the time, and the ways that testing in inhabited areas helped them develop these strategies.

Most of the film is dedicated to the "offensive" capabilities gained by these experiments.

Such attacks are meant to devastate food supplies and incapacitate both armed forces and "the human population that directly supports them."

See the rest of the story at Business Insider

NOW WATCH: 4 ways to stay awake without caffeine

The discovery of a trove of 47 teeth from a Chinese cave is rewriting human history



WASHINGTON (Reuters) — A trove of 47 fossil human teeth from a cave in southern China is rewriting the history of the early migration of our species out of Africa, indicating Homo sapiens trekked into Asia far earlier than previously known and much earlier than into Europe.

Scientists on Wednesday announced the discovery of teeth between 80,000 and 120,000 years old that they say provide the earliest evidence of fully modern humans outside Africa.

The teeth from the Fuyan Cave site in Hunan Province's Daoxian County place our species in southern China 30,000 to 70,000 years earlier than in the eastern Mediterranean or Europe.

"Until now, the majority of the scientific community thought that Homo sapiens was not present in Asia before 50,000 years ago," said paleoanthropologist Wu Liu of the Chinese Academy of Sciences' Institute of Vertebrate Paleontology and Paleoanthropology.

Our species first appeared in East Africa about 200,000 years ago, then spread to other parts of the world, but the timing and location of these migrations have been unclear.

University College London paleoanthropologist María Martinón-Torres said our species made it to southern China tens of thousands of years before colonizing Europe perhaps because of the entrenched presence of our hardy cousins, the Neanderthals, in Europe and the harsh, cold European climate.

"This finding suggests that Homo sapiens is present in Asia much earlier than the classic, recent 'Out of Africa' hypothesis was suggesting: 50,000 years ago," Martinón-Torres said.

Liu said the teeth are about twice as old as the earliest evidence for modern humans in Europe.

"We hope our Daoxian human fossil discovery will make people understand that East Asia is one of the key areas for the study of the origin and evolution of modern humans," Liu said.

Martinón-Torres said some migrations out of Africa have been labeled "failed dispersals." Fossils from Israeli caves indicate modern humans about 90,000 years ago reached "the gates of Europe," Martinón-Torres said, but "never managed to enter."

It may have been hard to take over land Neanderthals had occupied for hundreds of thousands of years, Martinón-Torres said.

"In addition, it is logical to think that dispersals toward the east were likely environmentally easier than moving toward the north, given the cold winters of Europe," Martinón-Torres said.

Paleoanthropologist Xiujie Wu of the Institute of Vertebrate Paleontology and Paleoanthropology said the 47 teeth came from at least 13 individuals.

The research appears in the journal Nature.

(Reporting by Will Dunham; Editing by Sandra Maler)

UP NEXT: Scientists discovered a new, extinct human relative that may have 'buried' its dead

SEE ALSO: 50 groundbreaking scientists who are changing the way we see the world


NOW WATCH: This is how scientists discovered an ancient species related to humans

Here's the critical difference between marriage today and 30 years ago



Marriage has always been a gamble, but the modern game is harder — with higher stakes than ever before.

Struggling marriages make people more unhappy today than in the past, while healthy marriages have some of the happiest couples in history, according to a comprehensive analysis published in 2007 regarding marital quality and personal well-being.

When Eli Finkel sought to understand why marriage is more extreme at both ends today than in the past, he discovered something intriguing yet discouraging: Marriages in the US are more challenging today than at any other time in our country's history.

Finkel is a professor of social psychology at Northwestern University and is known for developing a surprisingly simple marriage-saving procedure, which takes 21 minutes a year. (The procedure involves three seven-minute online writing sessions, where couples describe their most recent disagreement from the perspective of a hypothetical neutral bystander — something they are also encouraged to try out in future arguments.)

Finkel, together with his colleagues at the Relationships and Motivation Lab at Northwestern, has gone on to publish several papers on what they call "the suffocation model of marriage in America."

In their latest paper on this front, they explain why — compared to previous generations — some of the defining qualities of today's marriages make it harder for couples to cultivate a flourishing relationship. The simple answer is that people today expect more out of their marriage. If these higher expectations are not met, it can suffocate a marriage to the point of destroying it.


Finkel, in an Opinion article in The New York Times summarizing their latest paper on this model, discusses the three distinct models of marriage that relationship psychologists refer to:

  • institutional marriage (from the nation's founding until 1850)
  • companionate marriage (from 1851 to 1965)
  • self-expressive marriage (from 1965 onward)

Before 1850, people were hardly walking down the aisle for love. In fact, American couples at this time, who wed for food production, shelter, and protection from violence, were satisfied if they felt an emotional connection with their spouse, Finkel wrote. (Of course, old-fashioned, peaceful-seeming marriages may have been especially problematic for women, and there were an "array of cruelties that this kind of marriage could entail," Rebecca Onion wrote recently in Aeon.)

Those norms changed quickly when an increasing number of people left the farm to live and work in the city for higher pay and fewer hours. With the luxury of more free time, Americans focused on what they wanted in a lifelong partner, namely companionship and love. But the counter-cultural attitude of the 1960s led Americans to think of marriage as an option instead of an essential step in life.

This leads us to today's model, self-expressive marriage, wherein the average modern, married American is looking not only for love from their spouse but for a sense of personal fulfillment. Finkel writes that this era's marriage ideal can be expressed in the simple quote "You make me want to be a better man," from James L. Brooks' 1997 film "As Good as It Gets."


These changes to marital expectations have been a mixed bag, Finkel argues.

"As Americans have increasingly looked to their marriage to help them meet idiosyncratic, self-expressive needs, the proportion of marriages that fall short of their expectations has grown, which has increased rates of marital dissatisfaction,"Finkel's team writes, in their latest paper. On the other hand, "those marriages that succeed in meeting these needs are particularly fulfilling, more so than the best marriages in earlier eras."

The key to a successful, flourishing marriage? Finkel and his colleagues describe three general options:

  • Don't look to your marriage alone for personal fulfillment. In addition to your spouse, use all resources available to you including friends, hobbies, and work.
  • If you want a lot from your marriage, then you have to give a lot, meaning that in order to meet their high expectations, couples must invest more time and psychological resources into their marriage.
  • And if neither of those options sounds good, perhaps it's time to ask less of the marriage and adjust high expectations for personal fulfillment and self-discovery.

Other researchers, like sociologist Jeffrey Dew, support the notion that time is a crucial factor in sustaining a successful marriage.

Dew, who is a professor at the University of Virginia, found that Americans in 1975 spent, on average, 35 hours a week alone with their spouse while couples in 2003 spent 26 hours together. Child-rearing couples in 1975 spent 13 hours a week together, alone, compared to couples in 2003 who spent 9 hours a week together. The divorce rate in America was 32.8% in 1970 and rose to 49.1% by 2000.

While that doesn't necessarily mean less time together led to divorce or that the people who stayed together were happy, Finkel's research suggests that higher expectations and less investment in the relationship may be a toxic brew.

Marriage has become as tricky but also as potentially rewarding as climbing Mt. Everest: Obtaining a sense of personal fulfillment from your partner is as hard as achieving the summit. This is both good and bad because it means that you are reaching for the pinnacle of what marriage has to offer — which explains why couples in healthy marriages are happier now than in the past — but it also means that meeting those expectations and feeling satisfied in marriage is harder than ever.

"The good news is that our marriages can flourish today like never before," Finkel writes for The New York Times. "They just can't do it on their own."

SEE ALSO: Scientists Have Discovered How Common Different Sexual Fantasies Are

CHECK OUT: 5 Ways To Tell If Someone Is Cheating On You


NOW WATCH: This one ingredient is making a lot of Americans fat

21 pictures of New York City in the early 1900s



New York City, like most older American cities, has changed drastically over the centuries.

But one thing that hasn't changed is its residents' desire to photograph it.

A vast trove of photos in the Library of Congress gives us the opportunity to look back at New York just as it was entering the 20th century.

These images give us an idea of what life was like in the early 1900s — how landmarks have changed or, remarkably, stayed the same.

Eric Goldschein wrote an earlier version of this story.

SEE ALSO: The 35 best Reuters photos of the year so far

DON'T FORGET: Follow Business Insider's lifestyle page on Facebook!

Federal Hall, which now stands as a museum and memorial, was originally home to the first Congress, US Supreme Court, and federal executive-branch offices.



Manhattan's City Hall is the oldest such building in the US.



Times Square wasn't yet bombarded with advertisements at the turn of the 20th century.



See the rest of the story at Business Insider

The 25 weirdest things people brewed 'coffee' from during the Civil War



Times were tough back in ye old times of the Civil War. Food was being rationed, hundreds of thousands of soldiers were dying, and coffee was a hot commodity that the Union army was guzzling faster than the Confederates and civilians could keep up with.

"Nobody can 'soldier' without coffee," the New York Times wrote in an article about coffee's importance in the Civil War. "Union troops made their coffee everywhere, and with everything: with water from canteens and puddles, brackish bays and Mississippi mud, liquid their horses would not drink."

But soon the beans ran out, and people needed to get creative.

Take this excerpt, for instance, from the Weekly Arkansas Gazette in Little Rock from June 15, 1861, the year the Civil War began:

A very good coffee can be made, costing only 12½ cents, by mixing one spoonful of coffee with one spoonful of toasted corn meal, boil well and clear in the usual way. I have used it for two weeks, and several friends visiting my house say they could not discover any thing peculiar in the taste of my coffee, but pronounced it very good. Try it and see if we cannot get along comfortably, even while our ports are blockaded by the would be kind I can assure you it is very pleasant, though not strong enough to make us drunk.

Here are 25 of the most bizarre things people used to make what came to be called "Confederate coffee," which, in most cases, didn't contain any caffeine and was in fact more of a tea.

These ingredients were either dried, browned, roasted, or ground before steeping or dissolving into hot water to make "coffee."

1. Almond

2. Acorn

3. Asparagus

4. Malted barley

5. Beans

6. Beechnut

7. Beets

8. Carrot

9. Chicory root

10. Corn

11. Corn Meal

12. Cottonseed

13. Dandelion root

14. Fig

15. Boiled-down molasses

16. Okra seed

17. Pea

18. Peanuts

19. Persimmon seed

20. Potato peel

21. Sassafras pits

22. Sugar cane seeds

23. Sweet potato

24. Wheat berries

25. Wheat bran

Yum?


NOW WATCH: 4 ways to stay awake without caffeine

These trading cards from the early 1900s show a bizarrely accurate vision of women of the future



Women have always worked hard for the money.

Around 1902, a French artist imagined professions that women would hold in the future.

He designed a set of trading cards that depicted these women. On the surface, it may seem that most of the predictions are both progressive and accurate.

But the cards are not what they seem.

 

The antique cards by Albert Bergeret show women in historically male professions, like this soldier.



Around the turn of the 20th century, there were female lawyers in Europe and the U.S., but there were restrictions on where they could practice.



"These are not just women of the future," Alice Kessler-Harris, history professor at Columbia University, tells Tech Insider. "They are women of the moment who were breaching the barriers of traditional gender roles."



See the rest of the story at Business Insider

NOW WATCH: 4 ways to stay awake without caffeine


FDR had a top-secret bulletproof train car beneath Grand Central


It is no secret now that Franklin D. Roosevelt faced the unique challenge of hiding the crippling effects of what was believed to be polio throughout his terms as president. He went to great lengths to hide the disease from the press and the public. One of his many tricks of disguise can still be seen on a secret track hidden below Grand Central Terminal.

Produced by Justin Gmoser. Additional camera by Sam Rega

Follow BI Video: On Facebook

 


Israel's Prime Minister is getting slammed for statements linking a 1940s Palestinian leader to the Holocaust



Prime Minister Benjamin Netanyahu was pounded Wednesday with a barrage of condemnations after he claimed that Nazi leader Adolf Hitler only decided on the mass extermination of Europe’s Jews after receiving input on the matter from Jerusalem’s then-grand mufti, Haj Amin al-Husseini, a Palestinian nationalist widely acknowledged as a fervent Jew-hater.

Critics accused Netanyahu of “absolving” Hitler of responsibility for the Holocaust, a charge the prime minister later brushed off, saying he had merely intended to drive home the enormity of the mufti’s role as the originator of contemporary Palestinian “incitement” against Jews.

During an address Tuesday to delegates at the World Zionist Congress in Jerusalem, Netanyahu posited that the Nazi fuehrer did not initially intend to annihilate the Jews, but rather sought to expel them from Europe.

According to the prime minister’s version of the events, Hitler changed his mind after meeting with Husseini — who was grand mufti of Jerusalem from 1921 to 1948, and president of the Supreme Muslim Council from 1922 to 1937 — in Berlin near the end of 1941.

“Hitler didn’t want to exterminate the Jews at the time [of the meeting between the mufti and the Nazi leader]. He wanted to expel the Jews,” Netanyahu said.

“And Haj Amin al-Husseini went to Hitler and said, ‘If you expel them, they’ll all come here [to mandatory Palestine],'” continued the prime minister.

“‘So what should I do with them?’ He [Hitler] asked,” according to Netanyahu. “He [Husseini] said, ‘Burn them.'”

Netanyahu was speaking in the context of enduring Palestinian accusations to the effect that Israel is seeking to take control of the Temple Mount in Jerusalem; the mufti was one of the first to peddle such allegations against Jews in Mandatory Palestine.

The charges have been fueling a recent wave of attacks against Israelis in and around Jerusalem. Israel has repeatedly denied allegations that it wishes to change the status quo on the Mount, which houses the Al-Aqsa Mosque and is holy to both Jews and Muslims. As per the status quo, Jews may visit the Temple Mount but not pray there.

An overwhelming majority of Holocaust historians reject the notion that Husseini planted the idea of a “Final Solution” for Europe’s Jews in Hitler’s mind.

Tom Segev, a leading Israeli historian who has conducted extensive research on the Holocaust, told The Times of Israel Wednesday that the notion that Hitler needed to be convinced to exterminate the Jews was “entirely absurd.” He stressed that “one can surely say that [Husseini] was a war criminal, but one cannot say Hitler needed his advice.”

Segev, born in Jerusalem to parents who escaped Nazi Germany in 1933, further stressed that by the time Husseini and Hitler met in 1941, the annihilation of the Jews had already begun. In fact, hundreds of thousands of Jews had been killed by the Nazis and their collaborators by the time of the meeting.

“So the mufti told Hitler, ‘Burn them,’ and Hitler goes, ‘Oh, what a great idea,’” Segev added ironically.

Other commentators pointed out that Hitler had discussed the possible extermination of European Jewry as early as 1939, even before World War II began and certainly before he met with Husseini. The order to carry out a Final Solution against Jews was given in July 1941 — months ahead of the mufti and Hitler’s meeting — after which the infamous Wannsee Conference was called in order to finalize the logistics and details of the mass-murder operation.

The Wannsee Conference, held on January 20, 1942, came after the meeting between Hitler and Husseini.

The theory that Husseini played a role in the origin of the plan to commit genocide against the Jews has been raised by a number of historians including David Dalin and John Rothmann, but the notion has been rejected by a vast majority of Holocaust scholars.

Netanyahu’s speech on Tuesday was not the first time the Israeli leader offered his alternate version concerning the mufti’s role in the perpetration of the Holocaust.

“Haj Amin al-Husseini was one of the leading architects of the Final Solution,” he said in 2012 during a speech at the Knesset. “He, more than anybody else, convinced [Hitler] to execute the Final Solution, and not let the Jews leave [Europe]. Because, God forbid, they would come here. Rather that they would be annihilated, burned, there.”

The prime minister was criticized across the political spectrum Wednesday for his comments, which were described as inaccurate at best and, at worst, as a tailwind to Holocaust denial. Implying that the mufti planted the idea for the Final Solution in Hitler's mind was, to some, tantamount to absolving Hitler and the Nazis, at least partially, of orchestrating the unprecedented, systematic genocide of the Jews.

“This is a dangerous distortion of history and I demand that Netanyahu correct it immediately because it trivializes the Holocaust, the Nazis, and the terrible dictator Adolf Hitler’s share in the terrible tragedy of our people in the Holocaust,” Israeli opposition leader Isaac Herzog said in a statement. “It falls like ripe fruit straight into the hands of Holocaust deniers, and involves them in the Palestinian conflict.

“Netanyahu has forgotten that he is not only the Israeli prime minister, he is also the prime minister of the Jewish people. No one will teach me what a hater of Israel the mufti was. He gave the order to kill my grandfather, Rabbi [Yitzhak HaLevi] Herzog, and actively supported Hitler,” Herzog added.

Zehava Galon, leader of the liberal Meretz party, was even more vituperative, asserting that she felt “ashamed” for Netanyahu.

“This is not a speech by [extreme right-wing Austrian politician] Jorg Haider. This is not part of the doctorate of [Palestinian Authority President Mahmoud] Abbas [which accused the Zionist movement of collaborating with Nazism and played down the extent of the Holocaust]. It is an absolutely accurate quote by Israeli Prime Minister Benjamin Netanyahu… It’s unbelievable,” Galon said in a statement.

“Perhaps we should exhume the 33,771 Jews killed at Babi Yar in September 1941, two months before the mufti and Hitler ever met, and let them know that the Nazis didn’t intend to destroy them. Perhaps Netanyahu will tell that to my relatives in Lithuania murdered by the Nazis along with nearly 200,000 members of the Jewish community there, well before the mufti and Hitler met,” she continued.

“I am ashamed for you, Mr. Prime Minister,” Galon added.

Joint (Arab) List party leader Ayman Odeh accused Netanyahu of distorting history in order to incite against the Palestinian people.

“The victims of the Nazi monstrosity, among them millions of Jews, are converted into cheap political propaganda to assist the refusal of peace,” Odeh said. “Netanyahu proves every day how dangerous he is to the two nations, and how far he is willing to go to consolidate his power and justify his catastrophic policies.”

Echoing Odeh’s words, the Palestinian Authority’s former chief negotiator, Saeb Erekat, asserted that “Netanyahu hates Palestinians so much that he is willing to absolve Hitler for the murder of 6 million Jews.” He added that “on behalf of the thousands of Palestinians that fought alongside the Allied troops in defense of international justice, the State of Palestine denounces [Netanyahu’s] morally indefensible and inflammatory statements.”

PA President Mahmoud Abbas accused Netanyahu of placing “responsibility on Haj Amin al-Husseini for the killing of Jews during the Holocaust.” He said that by implying that “Hitler was not responsible [for the Holocaust], Netanyahu wants to change history. He is changing the history of the Jews.” In a 1984 book based on his PhD dissertation, Abbas claimed that the Nazis had collaborated with the Zionists, who had also exaggerated the number of Jews killed in the Holocaust.

In a statement on Wednesday afternoon, Netanyahu asserted that his comments had been misconstrued. Hitler, he said, “was responsible for the extermination of six million European Jews — no one doubts that.” But, he added, “we must not ignore that the mufti, Haj Amin al-Husseini, was among those who encouraged him to adopt the Final Solution. There is much testimony to that effect, including the testimony of Eichmann’s deputy at the Nuremberg trials.”

Addressing reporters on the tarmac at Ben-Gurion Airport as he prepared to head to Berlin, Netanyahu said Wednesday he had no intention of “absolving Hitler of his responsibility,” but had rather meant to show that “the forefather of Palestinian nationalism, which was without a state and without what is referred to as an ‘occupation,’ without the territories and without settlements, was already aspiring, through systematic incitement, to annihilate the Jews.

“To my chagrin, Haj Amin al-Husseini is to this day a revered figure in Palestinian society — he appears in textbooks and is celebrated as the father of the nation — and the incitement that began with him, the incitement to kill Jews, yet persists,” he said. “The incitement must stop if we are to end the murders. The paramount thing is to acknowledge the historical facts and not ignore them.”

SEE ALSO: All of the ways US intelligence thought Hitler may try to disguise himself


NOW WATCH: Netanyahu berates UN members for their 'utter silence' on Iran deal

From 1860-1916 the British Army required every soldier to have a mustache


Today I found out that uniform regulation in the British Army between the years 1860 and 1916 stipulated that every soldier should have a moustache.

Command No. 1,695 of the King’s Regulations read:

The hair of the head will be kept short. The chin and the under lip will be shaved, but not the upper lip…

Although the act of shaving one’s upper lip was trivial in itself, it was considered a breach of discipline.

If a soldier were to do this, he faced disciplinary action by his commanding officer which could include imprisonment, an especially unsavory prospect in the Victorian era.

Interestingly, it is during the imperial history of Britain that this seemingly odd uniform requirement emerged. 

Initially adopted at the tail end of the 1700s from the French, who also required their soldiers to have facial hair which varied depending on the type of soldier (sappers, infantry, etc.),  this follicular fashion statement was all about virility and aggression.

Beard and moustache growth was rampant, especially in India where bare faces were scorned as being juvenile and un-manly, as well as in Arab countries where moustaches and beards were likewise associated with power.

It wasn’t all plain sailing for the moustache though; back home British citizens were looking on it as a sign of their boys ‘going native’ and it was nearly stamped out completely.

However, in 1854, after significant campaigning, moustaches became compulsory for the troops of the East India Company’s Bombay Army.  While not in the rules for everyone else yet, they were still widely taken up across the Armed Forces and during the Crimean War there were a wide variety of permissible (and over the top) styles.

By the 1860s, moustaches were compulsory across the armed forces, and they became as much an emblem of military service as the uniform itself.

In 1916, the regulation was dropped and troops were allowed to be clean-shaven again. This was largely because such a superficial requirement was being ignored in the trenches of WWI, especially as moustaches could sometimes get in the way of a good gas mask seal.


The order to abolish the moustache requirement was signed on October 6, 1916 by General Sir Nevil Macready, who himself hated moustaches and was glad to finally shave his off. While the requirement is no longer in force today, there are still regulations governing moustaches: if worn, they can grow no further than the upper lip. It is also still extremely common for British soldiers in Afghanistan to wear beards, as facial hair is still associated with power and authority in many Islamic regions.

Bonus Moustache Facts:

  • As alluded to, during the Napoleonic era, French soldiers were required to wear facial hair of various sorts.  Sappers were required to have full beards.  Grenadiers and other elite troops had to maintain large bushy moustaches.  Infantry chasseurs were required to wear goatees with their moustaches.  This requirement has long since died out except in the case of sappers in the Foreign Legion, who are still strongly encouraged to maintain a full, robust beard.
  • Russian non-officer soldiers were required to wear moustaches under Peter the Great's reign.  On the flip side, while it had previously been extremely common for Russian soldiers to wear beards, Peter the Great didn't find beards so great and banned them not only in the military but also among civilians, with the lone exception that members of the clergy could wear them.
  •  Moustache, mustache, and mustachio are all technically correct spellings to describe hair on the upper lip.  Mustachio has relatively recently fallen out of favor for generically describing all moustaches, now more typically referring to particularly elaborate moustaches.  Moustache is the most common spelling today in the English speaking world, though North Americans usually prefer mustache.
  • The English word “moustache” comes from the French word of the same spelling, “moustache”, and popped up in English around the 16th century.  The French word in turn comes from the Italian word “mostaccio”, from the Medieval Latin “mustacium” and in turn the Medieval Greek “moustakion”.  We now finally get to the earliest known origin which was from the Hellenistic Greek “mustax”, meaning “upper lip”, which may or may not have come from the Hellenistic Greek “mullon”, meaning “lip”.  It is theorized that this in turn came from the Proto-Indo-European root “*mendh-“, meaning “to chew” (which is also where we get the word “mandible”).
  • Western women who can grow them tend to wax or shave their moustaches, but Mexican artist Frida Kahlo celebrated not only her 'stache but also her unibrow, famously including both in her self-portraits.
  • The oldest known depiction of a man with a moustache goes all the way back to 300 BC.  The depiction was of an Ancient Iranian horseman.
  • “De befborstel” is the Dutch slang for a moustache grown for the specific purpose of stimulating a woman’s clitoris.
  • The longest moustache ever recorded, measured in Italy on March 4, 2010, came in at 14 feet (4.29 m) long.  The proud owner of that magnificent 'stache was India's Ram Singh Chauhan.


  • Names of the Various Styles of Moustache:
    • Hungarian: Extremely bushy, with the hairs pulled to the side and extending past the upper lip by as much as 1.5 cm.
    • Dali: Named after artist Salvador Dali (who incidentally once published a book, with Philippe Halsman, dedicated to Dali’s moustache, titled: Dali’s Mustache), styled such that the hair past the corner of the mouth is shaved, but the non-shaved hair is allowed to grow such that it can be shaped to point upward dramatically.
    • English Moustache: Thin moustache with the hair pulled out sideways from a part in the middle of the upper lip, with the ends at the corners of the mouth shaped slightly upwards.
    • Imperial: Includes not only hair from above the upper lip, but also extends beyond into cheek hair, all of which is curled upward.
    • Fu Manchu: Moustache where the ends are styled downwards, sometimes even beyond the bottom of the chin.
    • Handlebar Moustache: a somewhat bushy version of the Dali, but without the strict regulation of having the hair shaved past the side of the lips.
    • Horseshoe: Similar to the Handlebar, but with vertical extensions coming off the sides that extend downwards sharply to the jaw, looking something like an upside-down horseshoe (think Hulk Hogan).
    • Chevron: Thick moustache covering the whole of the upper lip (think Jeff Foxworthy).
    • Toothbrush: The moustache made popular by Charlie Chaplin, but whose popularity went into sharp decline thanks to one Adolf Hitler.
    • Walrus: very similar to the Hungarian, except without the strict length limit on the hair overhanging the upper lip. 


Contrary to a myth you may sometimes hear, there is no evidence whatsoever that Adolf Hitler decided to grow a toothbrush moustache to mimic Charlie Chaplin.

Chaplin did parody Hitler in The Great Dictator and sported the now-infamous moustache in that film.  The toothbrush moustache was popularized in Germany by Americans and had become extremely popular there by the end of WWI.

Hitler originally went with the previously most popular 'stache in Germany, the Kaiser Moustache, which was turned up at the ends, often with the help of scented oil.

He continued to wear this ‘stache at least up to and during WWI. 

Alexander Moritz Frey, a soldier who served with Hitler during WWI, stated that Hitler was ordered to trim his moustache while in the trenches to facilitate wearing a gas mask; he shaved the sides off and went with the toothbrush moustache instead.

Chaplin stated that he used the toothbrush moustache because it looked funny and allowed him to show his expressions more fully than a larger comical moustache covering more of his face would have.


Here's what all 50 state names really mean


If you want to understand a state's history, start by looking at its name.

The map below shows the breakdown of all the states' etymologies. Algonquian and Latin account for the most names, with eight states each.

 

US state name etymologies

But the etymologies of some names have become muddled over the years. Alternate theories exist for several, and at least one name appears to have been invented outright by a novelist.

Scroll through the list to find your home state's meaning and how the name originated:

Alabama: From the Choctaw word albah amo meaning "thicket-clearers" or "plant-cutters."

Alaska: From Russian Аляска, itself from the Aleut word alaxsxaq, meaning "the object toward which the action of the sea is directed."

Arizona: From the O'odham (a Uto-Aztecan language) ali sona-g via Spanish Arizonac, probably meaning "small spring." An alternate theory derives the name from Basque aritz ona, "good oak."

Arkansas: From a French pronunciation of an Algonquian name for the Quapaw people: akansa. This word, meaning either "downriver people" or "people of the south wind," combines the Algonquian prefix a- with the Siouan word kká:ze, used for a group of tribes that included the Quapaw.

California: In his popular novel "Las sergas de Esplandián," published in 1510, writer Garci Ordóñez de Montalvo named an imaginary realm California. Spanish explorers of the New World may have mistaken Baja California for the mythical place. Where Montalvo learned the name and what it meant remain a mystery.

Colorado: Named for the Rio Colorado (Colorado River), which in Spanish means "ruddy" or "reddish."

Connecticut: Named for the Connecticut River, which stems from Eastern Algonquian, possibly Mohican, quinnitukqut, meaning "at the long tidal river." 

Delaware: Named for the Delaware Bay, named after Baron De la Warr (Thomas West, 1577 – 1618), the first English governor of Virginia. His surname ultimately comes from de la werre, meaning "of the war" in Old French.

Florida: From Spanish Pascua florida, meaning "flowering Easter." Spanish explorers first sighted the area around Easter in 1513. The state name also relates to the English word florid, an adjective meaning "strikingly beautiful," from Latin floridus.

Georgia: Named for King George II of Great Britain. His name originates with Latin Georgius, from Greek Georgos, meaning farmer, from ge (earth) + ergon (work). 

Hawaii: From Hawaiian Hawai'i, from Proto-Polynesian hawaiki, thought to mean "place of the Gods." Originally named the Sandwich Islands by James Cook in the late 1700s.

Idaho: Originally applied to the territory now part of eastern Colorado, from the Kiowa-Apache (Athabaskan) word idaahe, meaning "enemy," a name given by the Comanches. 

Illinois: From ilinwe, a French spelling of the Algonquian people's name for themselves, Inoca (also written Ilinouek), from Old Ottawa for "ordinary speaker."

Indiana: From the English word Indian + -ana, a Latin suffix, roughly meaning "land of the Indians." Thinking they had reached the Indies, European explorers mistakenly called the native inhabitants of the Americas Indians. India itself comes from a Latin word, via Greek, meaning "region of the Indus River."


Iowa: Named for the natives of the Chiwere branch of the Siouan family, from Dakota ayuxba, meaning "sleepy ones."

Kansas: Named for the Kansa tribe, natively called kká:ze, meaning "people of the south wind." Despite having the same etymological root as Arkansas, Kansas has a different pronunciation.

Kentucky: Named for the Kentucky River, from Shawnee or Wyandot language, meaning "on the meadow" (also "at the field" in Seneca). 

Louisiana: Named after Louis XIV of France. When René-Robert Cavelier, Sieur de La Salle claimed the territory for France in 1682, he named it La Louisiane, meaning "Land of Louis." Louis stems from Old French Loois, from Medieval Latin Ludovicus, a Latinized form of the Old High German name Hluodwig, meaning "famous in war."

Maine: Uncertain origins; potentially named for the French province of Maine, itself named for a river whose name is of Gaulish (an extinct Celtic language) origin.

Maryland: Named for Henrietta Maria, wife of English King Charles I. Mary originally comes from Hebrew Miryam, the sister of Moses.

Massachusetts: From Algonquian Massachusetts, a name for the native people who lived around the bay, meaning "at the large hill," in reference to Great Blue Hill, southwest of Boston.

Michigan: Named for Lake Michigan, which stems from a French spelling of Old Ojibwa (Algonquian) meshi-gami, meaning "big lake."

Minnesota: Named for the river, from Dakota (Siouan) mnisota, meaning "cloudy water" or "milky water."

Mississippi: Named for the river, from French variation of Algonquian Ojibwa meshi-ziibi, meaning "big river."


Missouri: Named for a group of native peoples among Chiwere (Siouan) tribes, from an Algonquian word, likely wimihsoorita, meaning "people of the big (or wood) canoes."

Montana: From the Spanish word montaña, meaning "mountain," which stems from Latin mons, montis. U.S. Rep. James M. Ashley of Ohio proposed the name in 1864.

Nebraska: From a native Siouan name for the Platte River, either Omaha ni braska or Oto ni brathge, both meaning "water flat."

Nevada: Named for the Sierra Nevada mountain range along the state's western boundary; the name means "snowy mountains" in Spanish.

New Hampshire: Named for the county of Hampshire in England, which was named for city of Southampton. Southampton was known in Old English as Hamtun, meaning "village-town." The surrounding area (or scīr) became known as Hamtunscīr.

New Jersey: Named by one of the state's proprietors, Sir George Carteret, for his home, the Channel island of Jersey, a bastardization of the Latin Caesarea, the Roman name for the island.

New Mexico: From Spanish Nuevo Mexico, from Nahuatl (Aztecan) mexihco, the name of the ancient Aztec capital.

New York: Named in honor of the Duke of York and Albany, the future James II. York comes from Old English Eoforwic, earlier Eborakon, an ancient Celtic name probably meaning "Yew-Tree Estate."


North Carolina: Both Carolinas were named for King Charles II. The proper form of Charles in Latin is Carolus, and the division into north and south originated in 1710. The name Charles ultimately derives from a Germanic word, karl, meaning "free man."

North Dakota: Both Dakotas stem from the name of a group of native peoples from the Plains states, from Dakota dakhota, meaning "friendly" (often translated as "allies").

Ohio: Named for the Ohio River, from Seneca (Iroquoian) ohi:yo', meaning "good river." 

Oklahoma: From a Choctaw word, meaning "red people," which breaks down as okla "nation, people" + homma "red." Choctaw scholar Allen Wright, later principal chief of the Choctaw Nation, coined the word. 

Oregon: Uncertain origins; potentially from Algonquian.

Pennsylvania: Named not for William Penn, the state's proprietor, but for his late father, Admiral William Penn (1621-1670), at the suggestion of Charles II. The name literally means "Penn's Woods," a hybrid formed from the surname Penn and Latin sylvania.

Rhode Island: It is thought that Dutch explorer Adrian Block named modern Block Island (a part of Rhode Island) Roodt Eylandt, meaning "red island" for the cliffs. English settlers later extended the name to the mainland, and the island became Block Island for differentiation. An alternate theory is that Italian explorer Giovanni da Verrazzano gave it the name in 1524 based on an apparent similarity to the island of Rhodes.


South Carolina: See North Carolina.

South Dakota: See North Dakota.

Tennessee: From Cherokee (Iroquoian) village name ta'nasi' of unknown origin.

Texas: From Spanish Tejas, earlier pronounced "ta-shas;" originally an ethnic name, from Caddo (the language of an eastern Texas Indian tribe) taysha meaning "friends, allies."

Utah: From Spanish yuta, name of the indigenous Uto-Aztecan people of the Great Basin; perhaps from Western Apache (Athabaskan) yudah, meaning "high" (in reference to living in the mountains).

Vermont: Based on the French words for "green mountain," mont vert.

Virginia: A Latinized name for Elizabeth I, the Virgin Queen.

Washington: Named for President George Washington (1732-1799). The surname Washington means "estate of a man named Wassa" in Old English.

West Virginia: See Virginia. West Virginia split from Confederate Virginia and officially joined the Union as a separate state in 1863.

Wisconsin: Uncertain origins but likely from a Miami word Meskonsing, meaning "it lies red"; misspelled Mescousing by the French, and later corrupted to Ouisconsin. Quarries in Wisconsin often contain red flint.

Wyoming: From Munsee Delaware (Algonquian) chwewamink, meaning "at the big river flat."


Here's how the 40-hour workweek became the standard in America



In 1890, the US government began tracking workers' hours. The average workweek for full-time manufacturing employees was a whopping 100 hours.

Seventy-five years ago, on October 24, 1940, the eight-hour day and 40-hour workweek became standard practice in a range of industries. It was a long, drawn-out battle between workers and government officials.

We take a look back at the history of the 40-hour workweek, as well as how it's evolved in the last few years. 

The history of the 40-hour workweek 

August 20, 1866: A new organization named the National Labor Union asked Congress to pass a law mandating the eight-hour workday. Their efforts technically failed, but they inspired Americans across the country to support labor reform over the next few decades. 

May 1, 1867: The Illinois Legislature passed a law mandating an eight-hour workday. Many employers refused to cooperate, and a massive strike erupted in Chicago. That day became known as "May Day." 

May 19, 1869: President Ulysses S. Grant issued a proclamation that guaranteed a stable wage and an eight-hour workday — but only for government workers. Grant's decision encouraged private-sector workers to push for the same rights. 

1870s and 1880s: While the National Labor Union had dissolved, other organizations including the Knights of Labor and the Federation of Organized Trades and Labor Unions continued to demand an eight-hour workday. Every year on May Day, strikes and demonstrations were organized to bring awareness to the issue.

May 1, 1886: Labor organizations called for a national strike in support of a shorter workday. More than 300,000 workers turned out across the country. In Chicago, demonstrators fought with police over the next few days. Many on both sides were wounded or killed in an event that's now known as the "Haymarket Affair." 

1906: The eight-hour workday was instituted at two major firms in the printing industry.

September 3, 1916: Congress passed the Adamson Act, a federal law that established an eight-hour workday for interstate railroad workers. The Supreme Court upheld the act's constitutionality in 1917.

September 25, 1926: Ford Motor Company adopted a five-day, 40-hour workweek.

June 25, 1938: Congress passed the Fair Labor Standards Act, which limited the workweek to 44 hours.

June 26, 1940: Congress amended the Fair Labor Standards Act, limiting the workweek to 40 hours. The act went into effect on October 24, 1940.


How the 40-hour workweek has evolved

Recent research suggests that the 40-hour workweek may be on its way out — at least among professionals and executives.

In a survey by tax and professional services firm EY, half of managers around the world reported logging more than 40 hours a week. In the US, a whopping 58% of managers said they worked over 40 hours a week. Presumably, some of that time is spent at home answering emails, instead of at the office.

Meanwhile, there's evidence that some Americans see working around the clock as a kind of status symbol. While many people claim to be working 60- or 80-hour workweeks, much of that time isn't very productive. In fields like finance and consulting, some workers may only be pretending to work 80-hour weeks, a recent study suggests. 

Yet for lower-income Americans, who may not view overwork the same way, there are some signs of progress.

In June 2015, the Obama administration's Department of Labor proposed a rule change that would expand the number of Americans who qualify for overtime pay. Salaried workers earning up to $50,440 a year would be eligible for time-and-a-half overtime wages when they work more than 40 hours per week. Currently, the salary threshold below which workers automatically qualify for overtime is just $23,660.
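
For readers who want to see the arithmetic, here is a minimal sketch in Python of how time-and-a-half pay works under a 40-hour rule. The hourly rate, hours, and eligibility logic below are illustrative assumptions tied to the 2015 proposal's figures, not an official calculator.

# Illustrative sketch of time-and-a-half overtime pay under a 40-hour rule.
# The threshold, wage, and hours below are example figures, not official guidance.

OVERTIME_THRESHOLD_SALARY = 50_440   # proposed annual salary cap for eligibility (2015 proposal)
STANDARD_WEEK_HOURS = 40

def weekly_pay(hourly_rate: float, hours_worked: float, annual_salary: float) -> float:
    """Return gross weekly pay, applying time-and-a-half to hours beyond 40
    for workers whose annual salary falls under the eligibility threshold."""
    regular_hours = min(hours_worked, STANDARD_WEEK_HOURS)
    overtime_hours = max(hours_worked - STANDARD_WEEK_HOURS, 0)
    pay = regular_hours * hourly_rate
    if annual_salary <= OVERTIME_THRESHOLD_SALARY:
        pay += overtime_hours * hourly_rate * 1.5   # time-and-a-half premium
    else:
        pay += overtime_hours * hourly_rate         # no overtime premium
    return pay

# Example: a $20/hour worker earning under the threshold who logs 50 hours
# gets 40 * 20 + 10 * 30 = $1,100 for the week.
print(weekly_pay(hourly_rate=20.0, hours_worked=50, annual_salary=45_000))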

No matter your profession, the truth is that working longer hours can be counterproductive because you start putting out lower-quality work as time goes on. 

In general, research suggests that we can handle working 60-hour weeks for three weeks — after that, we become less productive.

