Here's how Marcus Aurelius got himself out of bed every morning
Watch Leonard Nimoy in a Marine Corps instructional video from 1954
Long before he played the greatest Starfleet officer of all time and directed the immortal 'The Voyage Home,' Leonard Nimoy spent 18 months in the Army Reserve. According to Military.com, Nimoy achieved the rank of sergeant and spent much of his army service "putting on shows for the Army Special Services branch which he wrote, narrated, and emceed."
Nimoy acted in the following instructional film along with future "Davy Crockett" star Fess Parker. It addressed what was then called combat fatigue, or the emotional and psychological toll of warfare. The film shows how Marine Corps psychologists were supposed to treat combat fatigue sufferers, giving a glimpse into how the wartime military of the 1950s dealt with the still-vital question of how to address the mental health needs of its troops. Nimoy appears as the first of the two Marines in the clip to undergo treatment.
This clip was made in 1954, shortly after the end of the Korean War and 12 years before Star Trek premiered on NBC.
SEE ALSO: Actor Leonard Nimoy dies at 83
Join the conversation about this story »
NOW WATCH: 14 things you didn't know your iPhone headphones could do
21 beautiful, vintage photographs of NASA's glory days
Space exploration's golden age was arguably at its very start, when ambition was boundless and progress came in great strides.
A massive collection of vintage photos from this era went up for auction on February 26 at London's Bloomsbury Auction.
The nearly 700 photographs — original prints, not reproductions — come from the collection of a single European collector.
The auction lasted nearly ten hours and brought in a total of £489,440 (more than $755,500) from more than 300 bidders.
Here are 21 of them, in chronological order, starting in 1946 with the first image of Earth from space.
On October 24, 1946, mankind got its first photograph taken from outer space, at an altitude of 65 miles. A camera attached to a V-2 Rocket, a product of German engineering during World War II, was set up to snap a photo every second and a half. The rocket crashed back to Earth, its film roll kept safe by a steel casing.
Ed White was the first American astronaut to take a spacewalk, on June 3, 1965. A cosmonaut (as Soviet space explorers are called) by the name of Alexei Leonov beat him to it by almost three months — though Leonov had a brush with death to do so, as he was forced to let oxygen out of his suit before reentering his spacecraft. Spacewalks are an important part of an astronaut's toolkit; astronauts exit their vessels in order to make repairs on the outside.
Another shot of Ed White's historic spacewalk. "You looked like you were in your mother’s womb," White's copilot James McDivitt later told him.
Source: Bloomsbury Auctions
The credit card was invented by a man who forgot his wallet at dinner
Recently I was reading Jason R. Hastie's "The Dollar Code: Get Out of Debt With One Number," where I came across the story of Frank McNamara, the creator of the credit card.
The story caught my attention and I decided to do a little research of my own.
Here's what I found.
According to the Diner's Club, the idea of the credit card came to Frank McNamara in 1949 while he was having dinner at a restaurant in New York City.
When it was time to pay the bill, McNamara realized he had forgotten his wallet.
What happened next depends on whom you ask: According to "The Dollar Code," McNamara negotiated his way out of washing dishes to pay for his dinner by signing for it instead and promising to pay the restaurant back. According to NerdWallet, he had to call his wife and ask her to deliver some cash.
According to Jonathan Levine's 2008 "Credit Where It Is Due: A Social History of Consumer Credit in America," this dinner — apparently known in the industry as "the first supper" — is a parable that didn't actually happen.
However, it is undisputed that McNamara went on to create the first American consumer-facing credit card company, the Diner's Club, which he founded in 1950 with Ralph Sneider.
According to the Pittsburgh Post-Gazette, the idea behind the Diner's Club was sign now, pay later. Members of the club would be able to sign for their dinner, and then pay the bill later. McNamara started the club with 27 participating restaurants and 200 memberships, which he sold to his friends and acquaintances for $3 each.
At the same time, Alfred Bloomingdale — grandson of the founder of Bloomingdale's department stores — founded Dine and Sign in Los Angeles, another credit card business, according to the New York Times.
After a friend notified Bloomingdale of the Diner's Club, Bloomingdale had a series of meetings with McNamara and Sneider, which resulted in a merger of the two companies. Bloomingdale was named vice president of the new Diner's Club.
A man ahead of his time, Bloomingdale was the one to predict the eventual demise of cash and rise of the credit card: "The day will come when the plastic card will make money obsolete," he said.
According to NerdWallet, the Diner's Club offered charge cards — which are almost extinct today — that allowed customers to borrow credit from a middleman, use that money to buy something, and then pay the middleman back in full at the end of each billing period.
The Diner's Club generated a profit by charging stores a 7% fee on all purchases and requiring customers to pay a $3 annual fee.
Although merchants weren't exactly happy with the idea of a credit card that could be used everywhere — fewer people would be using their individual store credit cards — McNamara's cards caught on quickly with customers, expanding to 20,000 users in the first year. In its second year, the Diner's Club made $60,000 and established franchises in Canada, Cuba, and France.
According to Bankrate, eight years after the founding of the Diner's Club, American Express and Carte Blanche started issuing cards. Banks joined in as well: Bank of America originally issued the Visa card as the BankAmericard, then turned the card into a national franchise that could be issued by local banks across the US (interstate banking didn't yet exist in 1958).
In 1967, four banks in California founded a competitor for BankAmericard, known as the MasterCharge program. This program became MasterCard 12 years later.
Credit card use didn't really take off until 1978, though, when a Supreme Court ruling allowed nationally chartered banks to charge out-of-state customers the interest rate set in the bank's home state.
From there, credit card use only grew. According to ABC News, today more than 75% of Americans own at least one credit card, and in 2012, there were a total of 26.2 billion credit card transactions in the US alone.
SEE ALSO: Here's How Many Credit Cards You Should Have
Scientists discovered why the Washington Monument is shrinking
Though some discrepancies in the Washington Monument's height may be the result of different measurement methods, they appear to be partly due to lightning strikes.
This video originally appeared on Slate Video. Watch More: slate.com/video
Jim Festante is an actor/writer in Los Angeles and regular video contributor to Slate. He's the author of the Image Comics miniseries The End Times of Bram and Ben.
US and Norwegian troops reenacted a successful WWII special forces operation in the snow
In commemoration of the 70th anniversary of a successful sabotage campaign against the Nazis, members of the Minnesota National Guard reenacted the mission in Snaasa, Norway, the US Army reports.
The reenactment included a 12-mile cross-country ski trek that retraced the movements of US and Norwegian troops in 1945. The original WWII operation was a sabotage campaign aimed against the occupying Nazi force during the closing days of the war.
By 1945, the Nazis had occupied Norway for five years and hundreds of thousands of German soldiers remained in the country. Eager to end the war in Europe, the US and Norwegian resistance members carried out a targeted campaign of destroying Norwegian railroads. The goal was to hinder movement so as to prevent the large detachment of Nazis from reinforcing German positions in central Europe.
The special operations were carried out by Norwegian-speaking Americans from the 99th Infantry Battalion, which largely recruited from Minnesota and the Dakotas. These operators were trained by the Office of Strategic Services, the precursor to the CIA, before uniting with Norwegian forces.
The reenactment also included a mock demolition of the Jorstad railroad bridge. The exercise ended with a ceremony at a memorial near the bridge.
Members of the Norwegian, US, and German armed forces attended the ceremony. It honored both the soldiers who carried out the sabotage and the Allied troops who died hours after the operation, on January 13, 1945, when a train derailed because of the destroyed bridge.
SEE ALSO: Here's a Marine's advice for braving the extreme cold
Here's what Microsoft co-founder Paul Allen actually found at the bottom of the ocean in the Philippines
Microsoft co-founder and philanthropist Paul Allen announced that he has discovered Musashi, a World War II Japanese battleship that was sunk by US forces over 70 years ago.
Allen and his research team found the ship in the Sibuyan Sea, more than eight years after their search began.
Produced by Jason Gaines. Video courtesy of Associated Press.
Hundreds of medieval bodies unearthed under a supermarket in Paris
More than 200 bodies were recently unearthed in several mass burials beneath a Paris supermarket.
The bodies, which were lined up head to feet, were found at the site of an ancient cemetery attached to the Trinity Hospital, which was founded in the 13th century.
Though it's not clear exactly how these ancient people died, the trove of bodies could reveal insights into how people in the Middle Ages buried their dead during epidemics or famine, the researchers involved said.
Supermarket renovations
The burials were discovered during renovations to the basement of the Monoprix Réaumur-Sébastopol supermarket, located in the second-arrondissement neighborhood of Paris.
As workers lowered the floor level of the basement, they found a shocking surprise: the bodies of men, women and children, neatly arranged in what looked to be mass graves.
The site was once the location of the Trinity Hospital, which was founded in 1202 by two German noblemen. The hospital was conceived not just as a place to provide care for the sick, but also as one where weary pilgrims and travelers could rest and enjoy themselves, according to a 1983 presentation given at the French Society on the History of Medicine.
But in 1353, during the height of the Black Death, the hospital also opened a cemetery, which provided a lucrative side business for the religious folk who operated the hospital, according to the presentation.
During that catastrophic period, hundreds of people a day died in the Hôtel-Dieu de Paris, the city's oldest hospital, and burial space was tough to find in the crowded city. Occasionally, the overflow bodies were buried at the Trinity Hospital site, according to the presentation.
Mass death
So far, archaeologists have uncovered about eight mass burial pits on different levels of the site. Seven of those sites hold between five and 20 individuals, while the remaining pit contains more than 150 bodies, according to a statement about the findings.
The bodies were laid down methodically in neat rows, head to feet, with one burial extending beyond the boundaries of the excavation. The pits contain the skeletons of men and women, old and young, none of which show obvious signs of injury or disease.
Given the huge number of skeletons found, it seems likely the bodies were buried during some mass medical crisis, when too many people were dying at once to provide individual burials, the researchers note in the statement.
As a follow-up, the team plans to use radioactive isotopes of carbon (forms of carbon with different numbers of neutrons) to estimate when these people lived. By combining this data with ancient texts and maps of medieval Paris, researchers hope to reveal how and when these people died.
In the 1500s, the Trinity Hospital converted to a site where little boys and girls trained as apprentices. By the 1700s, the site fell into disrepair. During the French Revolution, the hospital was destroyed and the remaining structures were turned into stables for animals, according to the presentation.
Follow Tia Ghose on Twitter and Google+. Follow Live Science @livescience, Facebook & Google+. Originally published on Live Science.
Copyright 2015 LiveScience, a Purch company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
SEE ALSO: CT scan finds a mummified Buddhist monk was stuffed with scripts in Chinese writing
NOW WATCH: Scientists Discovered What Actually Wiped Out The Mayan Civilization
Circumcision has a long and complicated history
On a recent Saturday morning, Craig Adams stood outside the Robert Wood Johnson University Hospital in New Brunswick, New Jersey.
It was sunny but cold. Adams, who had turned 40 the day before, wore white sneakers and a black T-shirt over a long-sleeve shirt.
A fuzz of thinning hair capped his still-youthful face.
His appearance would have been unremarkable if not for the red splotch of fake blood on the crotch of his white trousers.
The stain had the intended effect: drivers rounding the corner were slowing down just enough to see the sign he was holding, which read "No Medical Excuse for Genital Abuse".
Next to him, Lauren Meyer, a 33-year-old mother of two boys, held another sign, a white poster adorned only with the words: "Don’t Cut His Penis".
She had on a white hoodie with a big red heart and three red droplets, and a pair of leopard-print slipper-boots to keep her feet warm for the several hours she would be outside.
Meyer’s first son is circumcised; she sometimes refers to herself as a "regret mother" for having allowed the procedure to take place.
It was two days after Christmas. Adams and Meyer had each driven about an hour to stand by the side of a road holding up signs about penises.
On that same day, a woman stood alone at what qualifies as a busy intersection in the small town of Show Low, Arizona. She also wore white trousers with a red crotch, and held aloft anti-circumcision signs. A few more people did the same in the San Francisco Bay area.
The protests were triggered by a recent event, but the issue at stake was an ancient one. Circumcision has been practised for millennia. Right now, in America, it is so common that foreskins are somewhat rare, and may become more so.
A few weeks before the protests, the Centers for Disease Control and Prevention (CDC) had suggested that healthcare professionals talk to men and parents about the benefits of the procedure, which include protection from some sexually transmitted diseases, and the risks, which the CDC describes as low. But as the protesters wanted drivers to know, there is no medical consensus on this issue.
Circumcision isn’t advised for health reasons in Europe, for instance, because the benefits remain unclear. Meanwhile, Western organisations are paying for the circumcision of millions of African men in an attempt to rein in HIV – a campaign that critics say is also based on questionable evidence.
Men have been circumcised for thousands of years, yet our thinking about the foreskin seems as muddled as ever. And a close examination of this muddle raises disturbing questions.
Is this American exceptionalism justified? Should we really be funding mass circumcision in Africa? Or by removing the foreskins of men, boys and newborns, are we actually committing a violation of human rights?
***
The tomb of Ankhmahor, a high-ranking official in ancient Egypt, is situated in a vast burial ground just outside Cairo. A picture of a man standing upright is carved into one of the walls. His hands are restrained, and another figure kneels in front of him, holding a tool to his penis.
Though there is no definitive explanation of why circumcision began, many historians believe this relief, carved more than four thousand years ago, is the oldest known record of the procedure.
The best-known circumcision ritual, the Jewish ceremony of brit milah, is also thousands of years old. It survives to this day, as do others practised by Muslims and some African tribes.
But American attitudes to circumcision have a much more recent origin. As medical historian David Gollaher recounts in his book Circumcision: A History of the World’s Most Controversial Surgery, early Christian leaders abandoned the practice, realising perhaps that their religion would be more attractive to converts if surgery wasn’t required.
Circumcision disappeared from Christianity, and the secular Western cultures that descended from it, for almost two thousand years.
Then came the Victorians. One day in 1870, a New York orthopaedic surgeon named Lewis Sayre was asked to examine a five-year-old boy suffering from paralysis of both legs.
Sayre was the picture of a Victorian gentleman: three-piece suit, bow tie, mutton chops. He was also highly respected, a renowned physician at Bellevue Hospital, New York’s oldest public hospital, and an early member of the American Medical Association.
After the boy’s sore genitals were pointed out by his nanny, Sayre removed the foreskin. The boy recovered. Believing he was on to something big, Sayre conducted more procedures. His reputation was such that when he praised the benefits of circumcision – which he did in the Transactions of the American Medical Associationand elsewhere until he died in 1900 – surgeons elsewhere followed suit.
Among other ailments, Sayre discussed patients whose foreskins were tightened and could not retract, a condition known as phimosis. Sayre declared that the condition caused a general state of nervous irritation, and that circumcision was the cure.
His ideas found a receptive audience. To Victorian minds, many mental health issues originated with the sexual organs and masturbation. The connection had its roots in a widely read 18th-century treatise entitled Onania, or the Heinous Sin of Self-Pollution, and All Its Frightful Consequences, in Both Sexes, Considered. With Spiritual and Physical Advice to Those Who Have Already Injur’d Themselves By This Abominable Practice.
The anonymous author warned that masturbation could cause epilepsy, infertility, "a wounded conscience" and other problems. By 1765 the book was in its 80th printing.
Later puritans took a similar view. Sylvester Graham associated any pleasure with immorality.
He was a preacher, health reformer and creator of the graham cracker.
Masturbation turned one into "a confirmed and degraded idiot", he declared in 1834.
Men and women suffering from otherwise unlabelled psychiatric issues were diagnosed with masturbatory insanity; treatments included clitoridectomies for women, circumcision for men.
Graham’s views were later taken up by another eccentric but prominent thinker on health matters: John Harvey Kellogg, who promoted abstinence and advocated foreskin removal as a cure.
(He also worked with his brother to invent the cornflake.)
"The operation should be performed by a surgeon without administering anesthetic," instructed Kellogg, "as the brief pain attending the operation will have a salutary effect upon the mind, especially if it be connected with the idea of punishment."
Counter-examples to Sayre’s supposed breakthrough could be found in operating theatres across America.
Attempts to cure children of paralysis failed. Men, one can assume, continued to masturbate. It mattered not.
The circumcised penis came to be seen as more hygienic, and cleanliness was a sign of moral standards. An 1890 journal identified smegma as "infectious material".
A few years later, a book for mothers – Confidential Talks on Home and Child Life, by a member of the National Temperance Society – described the foreskin as a "mark of Satan". Another author described parents who did not circumcise their sons at an early age as "almost criminally negligent".
By now, the circumcision torch had passed from Sayre to Peter Charles Remondino, a popular San Diego physician descended from a line of doctors that stretched back to 14th-century Europe.
In an influential 1891 book about circumcision, Remondino described the foreskin as a "malign influence" that could weaken a man "physically, mentally and morally; to land him, perchance, in jail or even in a lunatic asylum". Insurance companies, he advised, should classify uncircumcised men as "hazardous risks".
Further data came from studies of the "Hebrew penis", which showed a "superior cleanliness" that had protective benefits, according to John Hutchinson, an influential surgeon at the Metropolitan Free Hospital of London.
Hutchinson and others noted that Jews had lower rates of syphilis, cancer and mental illness, greater longevity, and fewer stillbirths – all of which they attributed to circumcision. Remondino agreed, calling circumcision "the real cause of differences in longevity and faculty for enjoyment of life that the Hebrew enjoys".
By the turn of the 20th century the Victorian fear of masturbation had waned, but by then circumcision had become a prudent precaution, and one increasingly implemented soon after birth. A desire to prevent phimosis, STDs and cancer had turned the procedure into medical dogma.
Antiseptic surgical practices had rendered it relatively safe, and anaesthesia made it painless. Once a procedure for the relatively wealthy, circumcision had become mainstream. By 1940, around 70 per cent of male babies in the United States were circumcised.
In the decades since, medical practice has come to rely increasingly on evidence from large research studies, which, as many American doctors see it, have supported the existing rationale.
When the CDC made its recent statement, for example, it cited studies showing that circumcision reduces the risk of urinary tract infection, several STDs, penile cancer, phimosis, balanitis (inflammation of the foreskin and head of the penis) and HIV.
The CDC even noted benefits for women with circumcised partners, namely a lower risk of cervical cancer linked to human papillomavirus.
The mechanism behind these benefits is simple: the warm and moist region under the foreskin can house the bacteria and viruses that cause disease. A circumcised penis can’t be colonised so easily; without the blanket, it’s harder to hide. Circumcision also removes a large quantity of Langerhans cells, a component of the immune system that, according to some research, is targeted by HIV.
During the second half of the last century, an accumulation of studies demonstrated the beneficial impacts of these mechanisms. At times the research helped all but end the debate over circumcision. By the 1970s, for instance, more than 90 per cent of US men were circumcised, according to one study. The American foreskin had become a thing of the past.
***
Today circumcision is among the most common surgeries in the US: an estimated 1.2 million infants are circumcised each year, at a cost of up to $270 million. Its popularity has fluctuated since the peak of the 1970s; the CDC’s most recent estimate puts the current rate at 60 per cent of newborns. This may in part be because the American Academy of Pediatrics (AAP) for a time equivocated over the issue. But in 2012 the AAP announced that the benefits of circumcision outweighed the risks, suggesting that rates may rise again.
Yet whether it’s 60 or 90 per cent of American men who are circumcised, what’s more remarkable is that American parents are almost alone in the Western world in their desire to separate boys from their foreskins for reasons other than religion. This difference of opinion is decades old.
It began in 1949, when a British paediatrician and scientist named Douglas Gairdner published the first investigation of the rationale for circumcision in English-speaking countries. He found the procedure to be unwarranted.
Phimosis, the condition Sayre held responsible for so many neuroses, was essentially a non-issue, said Gairdner. He discovered something that had somehow gone undocumented before: that most foreskins remain unretracted well into the toddler years.
Phimosis is the natural state of the penis, Gairdner concluded. (Later work would confirm that the foreskin sometimes does not fully retract until the teenage years.) This was just the beginning. Gairdner showed that balanitis and posthitis, forms of inflammation that were considered cause for circumcision, were uncommon.
He found no data to show that circumcision could prevent venereal diseases and little evidence for a lesser risk of cervical cancer. Cleaning the intact foreskin would do as much to thwart penile cancer as would removing it, he added.
At the National Health Service, which was founded a year before Gairdner’s paper appeared, officials heeded his advice and refused to cover circumcision unless it was medically necessary. By 1958, the circumcision rate in the United Kingdom had fallen to close to 10 per cent. Excluding British men who are circumcised for religious reasons, the rate is now 6 per cent or lower.
The situation is much the same elsewhere in Europe. The Victorian focus on circumcision was concentrated in English-speaking countries, and its popularity never spread.
When European experts examine the evidence, they generally see no reason that it should. In 2010, for instance, the Royal Dutch Medical Association reviewed the same studies the AAP looked at.
It concluded that, aside from preventing urinary tract infections, which can be treated with antibiotics, the health benefits of circumcision are "questionable, weak, and likely to have little public health relevance in a Western context".
How can experts who have undergone similar training evaluate the same studies and come to opposing conclusions? I’ve spent months scrutinising the medical literature in an attempt to decide which side is right. The task turned out to be nearly impossible.
That’s partly because there is so much confused thinking around the risks and benefits of circumcision, even among trained practitioners. But it’s also because, after reading enough studies, I realised that the debate doesn’t have a scientific conclusion. It is impossible to get to the bottom of this issue because there is no bottom.
Assessing the true risks of circumcision is the first challenge. Immediate complications are usually easily treatable, and also relatively rare – the AAP report states that problems like bleeding and infection occur in up to 1 in 100 circumcisions.
But the frequency of later problems is less well understood. Some studies find few; others conclude that as many as one in four patients suffer some kind of complication after the surgery and subsequent wound healing.
The possible late problems are many. The remaining foreskin tissue can adhere to the penis. The opening of the urethra may narrow, making urination painful and preventing the bladder from fully emptying, which in turn can lead to kidney problems.
Craig Adams, the New Jersey protester, had to have surgery to correct such a problem when he was five years old. Lauren Meyer’s first son had surgery for the same reason when he was three.
Other late complications include a second surgery to correct an incomplete circumcision, a rotated penis, recurrent phimosis, and concealment of the penis by scar tissue, a condition commonly known as buried penis.
The AAP acknowledges some uncertainty surrounding the data on risks, but not in a way that a parent looking for advice is likely to fully grasp. "The true incidence of complications after newborn circumcision is unknown," the AAP’s recent report states.
But complications are risks. "They’re saying, ‘The benefits outweigh the risks but we don’t know what the risks are,’" says Brian Earp, research fellow at Oxford University’s Uehiro Centre for Practical Ethics. "This is basically an unscientific document."
The debate about the effectiveness of circumcision can be just as convoluted. One way of thinking about this is the number needed to treat (NNT), a figure that answers the question: how many people need to be treated with this approach in order to prevent one illness?
For the ideal treatment the answer is one. But penile cancer is rare and circumcision doesn’t provide complete protection against it, so around 900 circumcisions are needed to prevent a single case. That’s a very high NNT. By comparison, 50 people need to take aspirin to prevent one cardiovascular problem.
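The arithmetic behind NNT is simple: it is the reciprocal of the absolute risk reduction, i.e. the difference between the event rate without treatment and the rate with it. The short sketch below illustrates the calculation; the specific risk percentages are hypothetical inputs chosen only so the results line up with the two figures quoted above (roughly 900 for circumcision and penile cancer, 50 for aspirin).

```python
def number_needed_to_treat(risk_untreated: float, risk_treated: float) -> float:
    """NNT = 1 / absolute risk reduction (ARR).

    Both arguments are event rates expressed as fractions (0.02 == 2%).
    """
    arr = risk_untreated - risk_treated
    if arr <= 0:
        raise ValueError("treatment shows no absolute risk reduction")
    return 1.0 / arr

# Hypothetical rates: a 0.2% baseline risk reduced by 1/900 of a
# percentage point yields an NNT of about 900.
print(round(number_needed_to_treat(0.002, 0.002 - 1 / 900)))  # → 900

# A 2-percentage-point absolute reduction yields an NNT of 50.
print(number_needed_to_treat(0.02, 0.0))  # → 50.0
```

Note that NNT depends only on the *absolute* risk reduction: a treatment can halve a relative risk and still have a very high NNT if the condition is rare, which is exactly why a rare outcome like penile cancer drives the figure so high.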
It’s also worth noting that other preventive methods can have a greater impact on penile cancer. The American Cancer Society suggests avoiding smoking, for example. The same logic applies to sexually transmitted diseases. Studies show that circumcision reduces the chances of a man contracting herpes, for example.
But the risk of this and every known STD can be stopped or at least dramatically reduced by correct and consistent condom use. "The benefits can all be obtained in other ways," says Adrienne Carmack, a Texas-based urologist who opposes routine infant circumcision.
Even the premise behind this debate – that the usefulness of circumcision can be determined by weighing the risks and benefits – is questionable. A drug for a deadly disease has a lot of leeway in terms of side-effects. Cancer patients are willing to endure chemotherapy if it means they get to live, for example.
But when the person is healthy and too young to weigh the risks and benefits themselves, the maths changes. "Your tolerance for risk should go way down because it’s done without consent and it’s done without the presence of disease," says Earp.
These uncertainties undermine the case for circumcision. They don’t completely destroy it though. Even after the criticisms are factored in, circumcision does bring some benefits, such as reducing the risk of urinary tract infections in young boys. What the uncertainties do is raise questions about whether those benefits justify the procedure.
And this is where an evidence-based approach breaks down. Because the procedure results in the loss of something whose value cannot be quantified: the foreskin. If you view the foreskin as disposable, circumcision might be worth it. For those who see the act as the removal of a valuable body part, the reverse is likely true.
More than the medical data, it’s these unquantifiable feelings about the foreskin that shape doctors’ thinking about circumcision, or at least that of male doctors.
Because when it comes to medical opinions on circumcision, the foreskin status of the opiner matters. A 2010 survey in the Journal of Men’s Health found that close to 70 per cent of circumcised male physicians supported the procedure.
An almost identical fraction of uncircumcised physicians were opposed. The AAP Task Force behind the 2012 statement was made up mainly of men, all of whom were circumcised and from the US, where newborn circumcision is the norm.
"Seen from the outside, cultural bias reflecting the normality of nontherapeutic male circumcision in the United States seems obvious," wrote a group of European physicians in response to the AAP.
It’s also likely that most of these critics were not circumcised. "We never deny that we are from a non-circumcising culture," said Morten Frisch, lead author of the response and an epidemiologist who studies sexual health at Statens Serum Institut in Denmark. "While we claim that the US view is culturally biased, the opposing view from the AAP was that it’s us who are culturally biased, and to an extent they are right."
These cultural divisions make it nearly impossible to sort through the medical literature. Rather than clarifying, the debate gets bogged down in accusations of poor research and bias.
Brian Morris, a molecular biologist at the University of Sydney who is an outspoken proponent of circumcision, recently circulated a 23-page critique of a study by Frisch. The Danish researcher’s work was "an ideological rant against male circumcision", said Morris, who asked colleagues to complain to the journal that published it.
In response, Frisch called out Morris for citing his own "pro-circumcision manifesto" as source material for his critique and, in a video response on YouTube, said that Morris had accused "us of racism and dishonesty and all sorts of things… in order to have the editors reject the paper".
"Both sides tend to be highly selective on which bits of evidence they want to quote," says Basil Donovan, an epidemiologist focused on sexual health at the University of New South Wales and a community-based infectious disease physician. Professional discussions have become so heated that Donovan rarely participates. "I stay out of the area," he said. "I want to have a life, I don’t want people bombing the front door."
***
None of this is much help to a circumcised man who is wondering about a body part he never knew. Then again, many circumcised men want to know something besides the health benefits. They want to know whether removal of the foreskin negatively impacts sex.
Some of the most compelling data in this area came from a pathologist named John Taylor, who in 1996 published the first description of the cells that make up the foreskin. An uncircumcised Englishman, Taylor was initially motivated by the prospect of his Canada-born children being circumcised.
That’s what led him to examine the foreskins of 22 uncircumcised corpses. He wanted to know whether the tissue had any functional value – if foreskin cells are specialised and serve some particular purpose, Taylor reasoned, that should be weighed when considering circumcision.
Specialised cells were exactly what Taylor found. Measuring about 6.5 centimetres long when fully grown, the foreskin is a mucosal membrane that contains copious amounts of Meissner’s corpuscles, touch-sensitive cells that are also present in our lips and fingertips.
"We only find this sort of tissue in areas where it has to perform specialised function," Taylor later told an interviewer from Intact Canada, an organisation seeking to end circumcision.
The mucosal inner surface is kept wet by a natural lubricant, and the tip contains elastic fibres that allow it to stretch without becoming slack. "This is sexual tissue, and there’s no way you can avoid the issue."
One of Taylor’s most noteworthy discoveries was the "ridged band", an accordion-like strip of flesh about 10 to 15 millimetres long that is as sensitive as the fingertips. During an erection, the band is turned inside out, placing highly sensitive cells at the base of the penis. In later work, Taylor and a colleague described the band as far more sensitive than the glans, the part of the penis left exposed after circumcision.
"The only portion of the body with less fine-touch discrimination than the glans penis is the heel of the foot," they wrote. The penis still works without a foreskin, of course. But the foreskin is erogenous tissue. It also keeps the penis protected and moist.
Without it the exposed surface is smoother, drier, more sensitive to changes in temperature and more easily irritated by clothing. A thickening of the surface of the glans, known as keratinisation, can also decrease sensitivity.
Foreskin cells don’t grow back. Efforts to restore the foreskin by pulling the flesh downwards – a practice attempted by some men who’ve experienced sexual problems or who simply dislike having had their foreskin removed without their say – can create an overhang of skin, but can’t replace the sensitive cells.
Taylor, who died in 2010, believed that the foreskin is as important as the glans to sexual function. "Doctors doing this procedure don’t know what they’re removing," he told Intact Canada.
If Taylor is right, circumcised men should have less sensitive penises. One way to test that idea is to touch a lot of penises, circumcised and not, in a laboratory setting. At least one group has done so.
In 2006, a team of US scientists and anti-circumcision activists used stiff nylon thread to measure the sensitivity of 19 points on the foreskin (when present) and glans of almost 160 men. The most sensitive spot on circumcised men was the circumcision scar; in uncircumcised men, it was the foreskin.
Many men also wonder if circumcision leads to sexual problems. Again there’s tentative evidence that it does. In 2011, Morten Frisch published data on the sexual experiences of more than 5,500 men and women. (This was the study disputed by Brian Morris.)
Few people reported problems, but of those who did – trouble achieving orgasm, for instance, or, for women, pain during sex – most were circumcised men or their female partners.
Another opportunity to study the question arose when widescale circumcision was introduced in South Korea around 1950, largely as a result of the US presence there in the years after World War II. Researchers at Seoul National University asked recently circumcised men about sexual function before and after the procedure.
Of the approximately 140 men who were sexually active before and after the surgery, nearly half said masturbation was now less pleasurable. Of the 28 men from this group who said sex was also now less enjoyable, most attributed the difference to a decrease in sensation.
Still, these data are far from conclusive, and other researchers have reached the opposite conclusion. Morris, the circumcision advocate, reviewed 36 studies, encompassing a total of around 40,000 men, and found no impairment in sensitivity, orgasm achievement, erectile function or any other measure of sexual function connected to circumcision.
And so the debate goes on, offering little clarity to the people who need it most: parents wondering if they should circumcise their newborn sons.
***
All of this – the benefits, the harms, the bias, the anger – could justify a randomised clinical trial of circumcision. These experiments are the surest way to judge the usefulness of a treatment, and could eliminate the angst over the decision.
Yet circumcision has never been the subject of one. It’s hard to see that changing. American parents would presumably be happy to have such a study to inform their thinking, but few would want their babies to take part in it.
Actually, that point about trials isn’t entirely accurate: there have been randomised controlled trials of circumcision – three, to be exact. Just not in America.
The studies took place in Uganda, Kenya and South Africa between 2002 and 2006. Their primary purpose was to determine whether circumcision reduces the risk of HIV transmission from women to men during sex.
Each was large, involving around 3,000 subjects, and lasted around two years. Adult volunteers were randomly assigned to be circumcised or not, and the circumcised men ended up with fewer cases of HIV. Follow-up analyses have confirmed that the protective benefits persist.
This was big news in a region living through some of the worst of the AIDS epidemic. In South Africa, for example, around 6 million people are HIV-positive. The studies suggested that circumcision could reduce the risk of a man in the region acquiring HIV from heterosexual sex by 60 per cent.
Based on this, a 2007 analysis estimated that if every man in sub-Saharan Africa were circumcised over a five-year period, countries in the region could cut their HIV rates from 12 per cent to 6 per cent by 2020.
Once the potential became clear, donors decided to attempt something almost as ambitious.
In 2007, the United States Agency for International Development (USAID) and the Bill & Melinda Gates Foundation, together with other donor organisations, launched a $1.5 billion campaign to circumcise 80 per cent of boys and men across eastern and southern Africa by the end of this year, a total of about 20 million people.
One afternoon last July I watched the final stages of this extraordinary campaign play out in Iringa, a city in the southern highlands of Tanzania. A pick-up loaded with a DJ and booming sound system was parked at a dirt crossroad bordered by concrete shops and lean-tos covered in corrugated metal.
A young woman – peer promoter was her job title – spoke through a microphone. She wore a black T-shirt with "tohara", the Swahili word for circumcision, across the front.
A crowd gathered, and she asked circumcised onlookers to give testimonials about the importance of the procedure. Barefoot children sat listening on fence posts and danced to the music when the peer promoter took a break.
This was a demand-creation activity – an outreach effort designed to generate interest in circumcision. Iringans had good reason to be interested. Sixteen per cent of the local population have HIV, partly because truckers overnight there, and prostitution near the truck stops is common.
Jhpiego, a nonprofit health organisation affiliated with Johns Hopkins University that was running the event, has placed circumcision clinics at health facilities in the area, advertised on the radio and posted giant billboards at heavy-traffic intersections.
In the crowd listening to the lessons on tohara I met Violet Msuya, a 21-year-old student holding her niece on her hip. "I want my man to be clean," she said through a translator when I asked about her interest. "If that man is clean, it will help me avoid cervical cancer and HIV." She told me that she hadn’t had sex with an uncircumcised man, but had heard from friends that a foreskin makes sex less pleasant.
Later that day, at one of Iringa’s larger hospitals, I talked with Gabriel, a 20-year-old who was about to be circumcised. "Circumcision will reduce my chance of being infected with HIV by 60 per cent," he told me.
He added that it would be easier for him to stay clean without a foreskin, and said he’d heard through the media that circumcision could reduce his risk of cancer.
It was his second visit to the hospital. Gabriel had chickened out the first time, but mustered the courage to return after discovering he could be circumcised with a device known as PrePex, a circular clamp that is applied to the penis.
He sat on an operating table as Dennis Fischer, the clinic’s physician, demonstrated the health benefits of circumcision for me using a wooden dildo covered with a brown felt foreskin.
Gabriel was still sitting on the table when I left, his thin, jean-clad legs dangling over the side, awaiting the PrePex. In a few days, he would return to have the clamp removed and the dead flesh cut off.
Before the campaign, Tanzania, which is home to over 100 ethnic groups, had a mixed prevalence of circumcision. Some groups, like the Maasai, practised traditional circumcision.
So did the country’s Muslims. Others, including the Christian population, had rarely done so. Changing that required millions of dollars in infrastructure and salaries, and a collision with a variety of beliefs.
Men feared circumcision would leave them impotent, or automatically convert them to Islam. Parents worried that their sons’ penises would not grow. When the programme first began, there were rumours about discarded foreskins being ground up for use in meat stock in America, or being sent to Europe to make cosmetics.
Even the country’s administration was resistant. "It took two years to convince government officials," says Sifuni Raphael Koshuma, a surgeon from Dar es Salaam who leads the PrePex research.
Since then, organisations like Jhpiego have been so successful that circumcision is now fashionable. Even married couples embrace it. After visiting Iringa I drove out to Usokami, a rural clinic where the mud houses have no electricity or running water and bicycles are more common than cars. At the clinic I met Meshak Msigwa, 42, who told me his wife had encouraged him to get circumcised.
He spoke to me from behind a blue hospital curtain, and his sentences were punctuated by the metallic click of scissors as a doctor snipped off his foreskin. I asked him if having the surgery implied that he or his wife – both are HIV-negative – would cheat. He told me that won’t happen. "I swore in church I would be faithful to my wife," he said.
The goal of circumcising 80 per cent of men and boys over the age of ten in Iringa is nearly accomplished. Jhpiego is now conducting what the programme administrators refer to as a "mop-up", targeting specific clinics where the total number of circumcisions has been low.
The organisation is also promoting routine early infant medical circumcision. In Tanzania and other countries in Africa, as in America, the foreskin is becoming a thing of the past.
There is another similarity between the situation in Africa and that in America: in both cases, the scientific evidence for circumcision is less certain than advocates make out. A 60 per cent risk reduction is a long way from total protection, for one thing.
Michel Garenne, an epidemiologist at the Pasteur Institute in Paris, notes that many interventions with that kind of efficacy – an early version of the cholera vaccine, the rhythm method as contraception – have not been recommended as wide-scale public health measures because the benefits don’t translate to a broad population that is repeatedly exposed to infection.
The same is true of HIV: a man who repeatedly has sex without a condom runs a high risk of contracting the virus, regardless of his circumcision status. "If the randomised controlled trials had shown 99 per cent efficacy, that would be one thing," says Garenne. "But they haven’t."
There is also a problem with the information given to those who volunteer for surgery. I met many newly circumcised men who repeated what Gabriel had heard: circumcision reduces the risk of contracting HIV by 60 per cent. Yet this figure is what epidemiologists call the relative risk reduction.
It tells us that in the clinical trials there were 60 per cent fewer new HIV infections among the circumcised men than the uncircumcised group. It says nothing about the actual risk of contracting HIV.
That risk depends very much on sexual behaviour. Critically, if men have frequent sex with infected women they will likely get HIV, regardless of whether they are circumcised. It’s also crucial, but perhaps not appreciated by all volunteers, that circumcision does not reduce the chances of an HIV-positive man infecting his female partner.
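The gap between relative and absolute risk reduction can be sketched with some hypothetical numbers. The figures below are chosen purely for illustration (they are not the actual trial counts): a 60 per cent relative reduction can correspond to a much smaller change in any individual's absolute risk.

```python
# Illustrative sketch: relative risk reduction (RRR) vs absolute risk reduction (ARR).
# The infection counts below are made up for demonstration, not taken from the trials.

def risk(infections, participants):
    """Proportion of participants infected over the study period."""
    return infections / participants

# Suppose 50 of 1,500 uncircumcised men and 20 of 1,500 circumcised men
# contracted HIV over the trial period (hypothetical figures).
control_risk = risk(50, 1500)    # ~3.3%
treatment_risk = risk(20, 1500)  # ~1.3%

# Relative risk reduction: the 60% figure quoted to volunteers.
rrr = (control_risk - treatment_risk) / control_risk

# Absolute risk reduction: the change in an individual's actual risk.
arr = control_risk - treatment_risk

print(f"Relative risk reduction: {rrr:.0%}")  # 60%
print(f"Absolute risk reduction: {arr:.1%}")  # 2.0%
```

The same 60 per cent relative figure would hold whether the underlying risks were 3.3 per cent versus 1.3 per cent or 33 per cent versus 13 per cent, which is why the headline number says little about any one man's chance of infection.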
The campaign organisers know all of this, of course. It’s one reason why every man who is circumcised is also counselled in the ABCs of HIV prevention: Abstinence, Being faithful and Condom use. The campaign administrators also talk of "condom fatigue".
They know that men will forgo condoms on occasion, and circumcision reduces the risk when they do. "It’s a single, one-off procedure," says Ronald Gray, of Johns Hopkins University, who led the trial in Uganda. Because the benefit, however large or small, is conferred for life, it’s worth it, Gray argues.
Still, no one knows what the level of protection will be outside the confines of the clinical trials, in which volunteers were counselled and tested for HIV every few months, receiving money at each clinic visit.
The impact of circumcision on HIV rates among women is particularly hard to predict, and it’s possible that the procedure could confer a false sense of protection on circumcised men. "My impression is the campaign is as likely to have a positive effect as a negative effect," says Garenne. "We’ll know in 20 years."
Transitioning to routine early infant circumcision, as is happening in Tanzania and a few other locations, is also controversial. "The evidence in adults is also true for infants," says Emmanuel Njeuhmeli, a senior USAID official working on the circumcision campaign.
But so far we only have data on adult circumcision. In the absence of better evidence, should governments be recommending a surgical procedure to citizens who are too young to agree to the procedure? "It’s highly questionable in terms of medical ethics," says Garenne.
Such concerns aren’t likely to have much impact, because the thinking about circumcision in Africa is settled for now. The procedure is voluntary, but opting out is getting harder. Radio advertisements persuade men that circumcised penises are cleaner and sexier.
Food vouchers are sometimes used as incentive to get circumcised. "It’s really increasingly becoming a sort of socially coerced activity," says Oxford’s Brian Earp. "That’s not voluntary any more." Njeuhmeli isn’t sure that’s a problem. If circumcision can help halt HIV, why not stigmatise foreskins? "When you reach 80 per cent coverage, the remaining 20 per cent of men are definitely being stigmatised," he says. "Is it a bad stigma or a good stigma? I honestly don’t know."
***
If I were a new mother in a country hit hard by HIV, I would at least strongly consider having my infant son circumcised. There are uncertainties, but if circumcision can put a dent in the epidemic, then I understand why parents would look at the evidence and choose the procedure. In the United States the picture is less clear. HIV rates here are much lower and the route of transmission is usually not heterosexual sex. What should parents do?
After reading the literature, I’m unconvinced by the evidence used to justify circumcision for health reasons. I’ll explain why by means of a thought experiment. Imagine that infant male circumcision had never been a part of American medical practice, but was common in, say, Spain or Senegal or Japan.
Based on what we know about the health benefits of the procedure, would American doctors recommend introducing the procedure? And would that evidence be enough for American parents to permanently remove a part of their child’s body without his agreement?
Remember what the evidence tells us. Either the benefits can be obtained by a milder intervention (antibiotics and condoms in the case of urinary tract infections and sexually transmitted diseases), or the risk is low and open to other preventive measures (penile cancer), or the concern is rarely justified (HIV in the United States).
Remember also that Western countries where circumcision is rare do not see higher rates of the problems that foreskin removal purports to prevent: not STDs, not penile cancer, not cervical cancer, not HIV. It’s hard to imagine circumcision being introduced on this basis. It’s equally difficult to picture studies on the benefits of the procedure being done.
The main reason we have circumcision in the US today is not the health benefits. It’s because we’re used to it. After all, if circumcision is not definitively preventing a life-threatening issue that cannot be prevented by other means, can removal of a body part without the agreement of the child be justified? We are so accustomed to the practice that operating on an infant so that he resembles his father seems acceptable. I’ve heard many people give this as their reason. It isn’t a good one.
It’s disconcerting to think that circumcising infant boys may be a violation of their human rights. We castigate cultures that practise female genital mutilation (FGM).
Rightfully so: no one should be coerced into such a violation. But removal of the clitoral hood, one form of FGM, is anatomically analogous to removal of the foreskin.
Some forms of FGM, such as nicking or scratching the female genitalia, are unequivocally deemed a human rights violation but are even milder than the foreskin removal done in US hospitals.
Thinking about male circumcision as an unnecessary and irreversible surgery forced on infants, I can’t but hope that the troubled history of the foreskin will come to an end, and that the foreskin will be known for its presence rather than its absence. I understand why some people demand an immediate end to circumcision.
And I understand why a man would stand on a street corner for hours on a cold day wearing red-stained trousers, angry at what was done to him without his agreement and trying to prevent other men from suffering the same fate.
ISIS just 'bulldozed' the ancient Assyrian city of Nimrud
The Islamic State group began bulldozing the ancient Assyrian city of Nimrud in Iraq, the government said, in the jihadists' latest attack on the country's historical heritage.
The Islamic State, also known as ISIS or ISIL, "assaulted the historic city of Nimrud and bulldozed it with heavy vehicles," the tourism and antiquities ministry said on an official Facebook page.
An Iraqi antiquities official confirmed the news, saying the destruction began after noon prayers on Thursday and that trucks that may have been used to haul away artifacts had also been spotted at the site.
"Until now, we do not know to what extent it was destroyed," the official said on condition of anonymity.
Nimrud, one of the jewels of the Assyrian era, was founded in the 13th century B.C. and lies on the Tigris River about 30 kilometers (18 miles) southeast of Mosul, Iraq's second-biggest city and the main hub of ISIS in the country.
"I'm sorry to say everybody was expecting this. Their plan is to destroy Iraqi heritage, one site at a time," said Abdulamir Hamdani, an Iraqi archaeologist from Stony Brook University.
"Hatra of course will be next," he said, referring to a beautifully preserved city in Nineveh province that is more than 2,000 years old and is a Unesco world heritage site.
"I'm really devastated. But it was just a matter of time," he said.
Nimrud is the site of what was described as one of the greatest archaeological finds of the 20th century when a team unearthed a collection of jewels and precious stones in 1988.
The jewels were briefly displayed at the Iraqi national museum before disappearing from public view, but they survived the looting that followed the 2003 US invasion and were eventually found in a Central Bank building.
Most of Nimrud's priceless artifacts have long been moved to museums, in Mosul, Baghdad, Paris, London, and elsewhere, but giant "lamassu" statues — winged bulls with human heads — and reliefs were still on site.
The destruction at Nimrud on Thursday came a week after the jihadist group released a video showing militants armed with sledgehammers and jackhammers smashing priceless ancient artifacts at the Mosul museum.
That attack sparked widespread consternation and alarm, with some archaeologists and heritage experts comparing it with the 2001 demolition of the Bamiyan Buddhas in Afghanistan by the Taliban.
In the jihadists' extreme interpretation of Islam, statues, idols, and shrines amount to recognizing other objects of worship than God and must be destroyed.
The video released by ISIS last week showed militants knocking statues off their plinths and rampaging through the Mosul museum's collection.
It also shows jihadists using a jackhammer to deface an imposing granite Assyrian winged bull at the Nergal Gate in Mosul.
"These artifacts behind me are idols for people from ancient times who worshipped them instead of God," a bearded militant said in the video.
"The prophet removed and buried the idols in Mecca with his blessed hands," he said, referring to the Muslim Prophet Muhammad.
Many of the artifacts destroyed in the Mosul museum were from Nimrud and Hatra.
Unesco director general Irina Bokova demanded an emergency meeting of the Security Council and called for the International Criminal Court to look into the Mosul museum destruction.
ISIS spearheaded a sweeping offensive last June that overran Nineveh province, where Mosul and Nimrud are located, and swept through much of Iraq's Sunni Arab heartland.
The Mosul region was home to a mosaic of minorities, including the Assyrian Christians, who consider themselves to be the region's indigenous people.
ISIS has systematically destroyed heritage sites, including Sunni Muslim shrines that it also considers heretical, in areas it controls, and repeatedly attacked members of religious minorities.
Iraqi security forces and allied fighters are battling to regain ground from the jihadists with backing from an international anti-ISIS coalition as well as neighboring Iran.
But major operations to drive ISIS out of Nineveh are likely months away, leaving the province's irreplaceable historical sites at the mercy of militants who have no regard for Iraq's past.
This broken 700-ton generator demonstrates everything that went wrong with the reconstruction of Iraq
It has been almost four years since US forces withdrew from Iraq, and the fate of the country has never seemed more uncertain.
ISIS continues to control large swathes of the country, Iran's influence is growing, and the Kurds are in an increasingly strong position to declare independence and fracture the country for good.
This splintering of Iraq illustrates how difficult it is to reconstruct a country after years of war and destruction. Unfortunately, the seeds of this current dysfunction were in place well before the US military began its withdrawal in 2011.
No story demonstrates the difficulty of Iraqi reconstruction, and the mistakes made, quite as vividly as the Mother of All Generators (MOAG).
Almost immediately after the invasion of Iraq in 2003, the country faced persistent energy problems. Rolling blackouts were common and Iraqis could count on only a few hours of power a day. To rectify this, the US Agency for International Development (USAID) bought a $50 million Siemens V94 generator, which was designated for a new power plant in Kirkuk. It was supposed to single-handedly increase Iraq's power generation by 6%.
But the program encountered problems from the get-go, former USAID regional coordinator Kirk W. Johnson told The Daily Beast in an interview about his book "To Be a Friend Is Fatal: The Fight to Save the Iraqis America Left Behind."
Since the 700-ton generator was too heavy to airlift to its final destination, MOAG was first transported by sea to the Syrian port of Tartous. From Tartous it was driven to the Tishrim Dam east of Aleppo at a painstakingly slow speed of five miles per hour.
But the Syrians refused to allow the generator to cross the dam in retaliation for US sanctions on the country.
USAID was forced to reroute MOAG overland through Syria to Jordan. To reach Kirkuk from Jordan, the generator would be forced to pass through the Iraqi province of Anbar, the center of the ongoing Sunni insurgency. Instability in the province necessitated that the generator's movement be delayed, as a "single Kalashnikov round could destroy" it, Johnson said.
This rerouting caused the generator to sit on the Jordanian border for all of 2004 and the first three months of 2005. James Stephenson, a veteran member of USAID, notes in his book Losing The Golden Hour how the generator's delivery was further delayed until after the battle of Fallujah and the subsequent clearing of insurgents. Moving the generator before the city was pacified — with its maximum convoy speed of five miles per hour — would have given the insurgents an easy and very tempting target. But the costs of protecting the generator in Jordan ran around $20,000 a day in private security fees, Johnson notes.
All this time, Kirkuk continued to face power shortages. On April 2, 2005, MOAG finally reached its destination in Kirkuk after a 640-mile journey through Iraq, with 250 to 300 military personnel accompanying it alongside Humvees and a number of helicopters.
Ultimately, all of this work and money was completely wasted.
"[N]obody had bothered to train the Iraqi plant workers in the operations and maintenance of this state-of-the-art generator," Johnson told The Daily Beast. "So, months after it was handed over in a triumphant ribbon-cutting ceremony, the generator was broken."
Today, Kirkuk is at the frontline in the fight against ISIS. Kurdish forces took over the city in June of 2014 shortly after ISIS blitzed through much of the rest of northern Iraq. The city remains a point of dispute between Baghdad and Iraq's Kurdistan Regional Government, each of which claims the city as part of its sphere of control.
Hardly anyone realizes the classic board game Monopoly started as an early feminist's attack on capitalism
Ralph Anspach wasn't going to stop making his Anti-Monopoly game just because Parker Brothers told him to. At least not without a fight.
It was 1974 and Anspach, an economics professor at San Francisco State University, was caught in a legal battle with the makers of the popular board game Monopoly for allegedly infringing on the game's copyright. The premise of Anspach's game, as its title suggests, was to bust trusts rather than create them. He wanted to use it as a teaching tool, especially for children.
In his quest to prove that Monopoly's roots far preceded its 1935 patent, he discovered that its origins dated back to 1904, in a game that was very similar to his own. After a long legal battle, Parker Brothers ended up with the Anti-Monopoly name, but let Anspach print the game under license — more importantly, the court validated that Anspach had proven that Monopoly was not as original as it had seemed to be.
Mary Pilon wrote a piece for the Wall Street Journal about Anspach in 2009 after coming across his story. Her research grew into her new book "The Monopolists: Obsession, Fury, and the Scandal Behind the World's Favorite Board Game." It's the first journalistic account of the true origin of the game, which Parker Brothers' parent company Hasbro says has sold more than 275 million copies across 111 countries in 43 languages.
For decades after its 1935 launch, Parker Bros' Monopoly board game included an origin story in its instruction manual that was a celebration of the American Dream: Charles Darrow, an unemployed salesman determined to support his family during the Great Depression — or at the very least entertain them — tinkered away in his basement on a game about buying property. Parker Brothers initially turned down the game, but after it gained popularity through word of mouth, the company bought Monopoly, it became a sensation, and both Darrow and Parker Brothers enjoyed fame and fortune.
If the instruction manual told the full truth, it would begin with Elizabeth "Lizzie" Magie designing the Landlord's Game in 1903 as a teaching tool. It was the Progressive Era in the U.S. and Magie, the daughter of an abolitionist, was a suffragist and Georgist, a follower of the writer and economist Henry George.
George's 1879 book "Progress and Poverty" was a foundational text of the Progressive movement, and reports from the time say it sold several million copies, making it one of the most widely read books in America for a time, second only to the Bible.
"The amount of wealth being created in this country was something nobody had ever really seen before," Pilon says, and George was searching for ways to protect regular people from being exploited by wealthy land owners.
A main tenet of George's philosophy is the single-tax theory, which essentially replaces all taxes deemed unfair with a tax on land only, not the properties built on top of them.
George died in 1897, and so Magie believed she was doing her part to keep the fight alive through her game. She included two rule sets with her game: the anti-monopolist rules and the monopolist rules. The idea was that she could expose the evils of land-grabbing by having players see how it works.
It turned out that most people found the monopolist rules more fun. And though Magie patented the Landlord's Game in 1904, the nature of game culture at the time, combined with the lack of a mass-production deal, turned it into a "folk game," as Pilon calls it: groups of people throughout the country would learn the game through word of mouth and develop their own variations.
Unofficial offspring of the Landlord's Game became popular in Progressive and academic communities, including the radical leftist utopian community of Arden, Delaware, which counted Pulitzer Prize-winning author Upton Sinclair and controversial economist Scott Nearing among its residents.
Magie obtained a patent for an updated version of the game in 1924, but by the early '30s, the game and its original intentions were significantly overshadowed by the monopoly folk game.
Among its biggest fans was a large group of Quakers in Atlantic City. It was this version that Charles Darrow played with some friends.
Not everything about the official Monopoly origin story was false. Darrow was unemployed at the height of the Depression and at his wit's end. One of his sons had scarlet fever, and Darrow lacked the funds to get him proper treatment.
One day he decided that he would try marketing that board game he had played. He got his successful cartoonist friend, F.O. Alexander, to spice up the board with some illustrations.
After Milton Bradley and Parker Brothers each turned it down once, Darrow built a large enough following for his game, now called Monopoly, that Parker Brothers bought it for $7,000 plus residuals in 1935.
The game company sold 278,000 copies of Monopoly in its first year, and then 1,751,000 the next year, which Pilon says brought Parker Brothers millions in profits.
In letters Pilon includes in her book, Darrow told Parker Brothers that he was inspired by a game he had played with friends, which was based on one they had learned from a college professor. He kept his language vague.
By the time the game took off, Parker Brothers had become aware that Darrow's Monopoly drew heavily on earlier games, and the company began acquiring any "folk game" offshoots that were still out there.
At one point very early in Monopoly's life, an article ran that exposed the game's true origins to the public. "Very likely your grandma and grampa played Monopoly," begins an article in the January 26, 1936 issue of the Washington Evening Star. "It isn't new."
The article tells the story of how Magie patented the Landlord's Game in 1904 as a teaching tool for Georgist economics.
But just two months before this article was published, Parker Brothers had wisely inked a deal with Magie to avoid a scandal, which Magie signed in hopes that the game company would promote her work as much as they did Monopoly.
Parker Brothers printed a modified version of the Landlord's Game in 1939, with Magie's face on the cover, along with two more of her games, but "there's little evidence they were ever seriously marketed," Pilon says.
Pilon tells us that the 1940 Census lists Magie's occupation as "Maker of games" with an income of "$0.00." She died in 1948.
The real story behind Monopoly has always been out there, and it has been well known in the board-game community since a court confirmed Anspach's research in the '70s. But Pilon says that before she started her book, false information was widespread across the Internet. She thinks part of the reason is that the original Darrow story just sounded better.
"I think that when we think about innovation and how things are made," Pilon says, "we love lightbulb stories because they're romantic, they're beautiful, they're Cinderella stories. But the truth is when things are made it's often a collaborative effort with lots of product testing — it's way more complicated. And we don't think to question origin stories."
SEE ALSO: Here are the core lessons from a book that Mark Zuckerberg and Bill Gates think everyone should read
Long before drones, the US tried to automate warfare during the Vietnam War
In this excerpt from Andrew Cockburn's 'Kill Chain: The Rise of the High-Tech Assassins', Cockburn delves into the strategic, historical, and technological developments that led to the widespread use of drones in the 21st century.
[By mid-1967] F-4 fighter-bombers and other aircraft strewed hundreds and then thousands of sensors across the jungle.
Fleets of assorted aircraft were deployed to circle day and night and relay radio signals from the sensors back to Nakhon Phanom, a military base on the west bank of the Mekong River in northeast Thailand that was so secret it officially did not exist.
The base hosted a whole variety of unacknowledged “black” activities, but at its heart, behind additional layers of razor wire and guard posts, sat an enormous air-conditioned building, the largest in Southeast Asia, that was home to Task Force Alpha, the “brain” of the automated battlefield.
Behind air locks pressurized to keep the omnipresent red dust of northeast Thailand away from the delicate machines, technicians monitored incoming sensor signals as they were fed to two IBM 360/65 mainframe computers, the very fastest and most powerful in existence at that time.
Teams of programmers on contract from IBM labored to rewrite software that would make sense of the data. Not coincidentally, the layout of the darkened, aseptic “war room” resembled that of the command centers of the air force ballistic missile early warning system back in the US waiting for signs of a Soviet nuclear attack.
In the view of the military command, the highly classified project represented the first step into a world in which human beings, with all their messy, unpredictable traits, would be eliminated, except as targets.
By the time Task Force Alpha began operating Igloo White, the secret code name for the overall electronic barrier (the military likes to preserve the illusion of security with a proliferation of code names), this approach was already failing against the Vietnamese “people’s war,” but there was little inclination for a change of heart.
General William Westmoreland, the army chief of staff and former commander in Vietnam, expressed the vision most concisely in October 1969: "On the battlefield of the future," he declared in a lunchtime speech to the Association of the US Army, a powerful pressure group, "enemy forces will be located, tracked and targeted almost instantaneously through the use of data links, computer assisted intelligence evaluation, and automated fire control.
With first round kill probabilities approaching certainty, and with surveillance devices that can continually track the enemy, the need for large forces to fix the opposition will be less important.”
As it so happened, other components of the air force were fighting a very different kind of war. Marshall Harrison, a former high school teacher, spent 1969 piloting a slow-flying air force plane with excellent visibility called an OV-10 Bronco over South Vietnam as a forward air controller tasked with tracking and fixing the enemy for bombing by jet fighter-bombers.
Equipped with no surveillance device more sophisticated than his own eyes, he learned to look for signs that no sensor would ever catch: “fresh tracks along a trail, smoke coming from areas where there should be no smoke, too many farmers toiling in the paddy fields... small vegetable patches where they shouldn’t be.”
No computer would calculate, as he learned to do, that footprints found on a muddy trail early in the morning, if it had rained only during the night, probably belonged to the enemy, since civilians were wary of moving at night for fear of being killed if caught breaking curfew.
Needless to say, Harrison did not encounter senior officers at the barebones strips where he was usually based. The futuristic complex at Nakhon Phanom, on the other hand, could move privileged visitors to awe. “Just as it is almost impossible to be an agnostic in the Cathedral of Notre Dame,” reported Leonard Sullivan, a high-ranking Pentagon official who visited in 1968, “so it is difficult to keep from being swept up in the beauty and majesty of the Task Force Alpha temple.”
Sullivan’s boss, Dr. John S. Foster, was even more unbridled in his enthusiasm, telling a congressional committee in 1969 that “this system has been so effective... that there has been no case where the enemy has successfully come through the sensor field... It is a very, very successful system.”
Foster’s support was potent. His title, director of defense research and engineering, a post he occupied from 1965 to 1973, belied the immense power of his office, since the research projects he authorized and paid for could turn into multibillion-dollar production programs.
The suave, smooth-talking physicist-bureaucrat was an ardent proponent of high-technology weapons programs, the more esoteric the better, dispensing billions of dollars for weapons development without excessive concern for cost or practical results.
Among the aspects of the electronic fence that most excited Foster, an avid model plane hobbyist, was the plan to deploy unmanned planes — drones — not only to relay sensor signals back to Thailand but also ultimately to attack targets.
Remotely piloted aircraft had been a topic of military interest ever since World War I, when a prototype radio-controlled biplane designed to attack enemy trenches had been tested and discarded for lack of accuracy and reliability, not to mention frequent crashes.
Further radio-control experiments in the inter-war years led to the actual coining of the term drone by a pair of naval scientists in 1936, according to an official history, “after analyzing various names of insects and birds.”
In World War II the US Navy had brought about the death of the heir to the Kennedy fortune by enlisting him in Operation Aphrodite, a scheme to fly remote-controlled B-24 bombers packed with explosives into German submarine pens.
Human pilots were required to handle takeoff and to switch on the radio controls.
When Joseph Kennedy Jr. flipped the switch prior to bailing out, the plane promptly blew up. None of Aphrodite’s other eleven attempts were successful. By the 1960s drones had carved out a useful niche mission as semirealistic targets for aerial gunnery training.
Come the Vietnam War, they were adapted for reconnaissance, though without much success, being easy targets for enemy gunners.
Foster, however, cherished the notion that they could soon begin replacing manned attack planes in various roles, and so money poured into a variety of drone programs under development by corporations such as Boeing, Vought, and Teledyne Ryan.
Excerpted with permission from 'Kill Chain: The Rise of the High-Tech Assassins' by Andrew Cockburn, published by Henry Holt and Company, LLC. Copyright © 2015 by Andrew Cockburn. All rights reserved.
The Marines were the first US ground troops to land in Vietnam 50 years ago
On the morning of March 8, 1965, 3,500 US Marines landed on a beach in South Vietnam, becoming the first US ground troops to be committed to the Vietnam War, The Guardian reports.
While it sent a clear message to North Vietnamese forces that American troops were moving beyond a purely supporting role for South Vietnam, the Marine landing was an administrative one in friendly territory.
The Marines of 3rd Battalion, 9th Marines would not come under enemy fire in their initial foray into the country, according to Global Security.
Instead of encountering bullets, the Marines were greeted by welcoming South Vietnamese troops and pretty girls giving them leis of flowers.
“Nevertheless, a new phase of the Vietnam war had begun. About one-third of the Marine ground forces and two-thirds of the Marine helicopter squadrons in the Western Pacific had been committed to South Vietnam,” reads an official Marine Corps history of the service’s involvement in Vietnam.
It wouldn’t be long before US troops were involved in major combat operations. In August, four Marine infantry battalions launched Operation Starlite in order to repel Vietcong forces from the area around the Chu Lai Air Base.
"The landing was carefully stage managed. The troops were given a warm welcome by a delegation of smiling children and traditionally dressed Vietnamese women brandishing garlands of flowers. A sign held aloft read: 'Welcome, Gallant Marines,'" The Guardian recounted. "Nobody on the beach that day had any idea of the long and tortuous conflict that was to follow."
Nearly 185,000 US troops had been deployed to Vietnam by the end of 1965.
SEE ALSO: Here's a Marine's advice for braving the extreme cold
A spherical bunker in Russia was the most secure place in the entire Cold War
The Dead Hand system was the Soviet Union's last line of deterrence in the event of a crippling nuclear strike, a way to inflict millions of casualties on its enemies even if the Soviet chain of command were decapitated.
The system was set up in 1985, not long after the most serious war scare of the late Cold War period. It's been likened to a real-life doomsday machine, and it isn't definitively known whether the Kremlin has dismantled it.
David Hoffman, a Washington Post contributing editor and author of a Pulitzer Prize-winning history of the Cold War nuclear arms race, recently described the mechanics of the Dead Hand system on arms control scholar Jeffrey Lewis's podcast. The Dead Hand or "perimeter," which Hoffman characterized as a "system of guaranteed retaliation," had a critical human safeguard, bunkered in an incredibly strange underground chamber.
As Hoffman told Lewis, the Perimeter system's on-switch was "briefly" located in "a deep hardened underground bunker in the shape of a sphere, very deep and very hardened, probably the most secure place of all time in the Cold War."
In this isolated spheroid room, "a couple of duty officers" would sit and wait for three conditions to be met: "predelegation," decapitation, and active nuclear war.
Hoffman explains that the duty officers could make the perimeter system operational only if the Kremlin activated the system through a "predelegation switch," if there was "a complete loss of communication with the national command authority," and if "a complex system of sensors" had detected a nuclear detonation inside of the Soviet Union.
Three lights in the bunker had to be flashing to indicate that the three criteria were met. The duty officers then had the option of issuing launch commands to small missiles that would fly the entire width of the Soviet Union and "give commands electronically to all the missile silos below to launch," Hoffman said.
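As Hoffman describes it, the arrangement amounts to a three-condition interlock: all three lights must be on before the officers even gain the option to act. A minimal sketch of that logic (the function and parameter names here are illustrative, not taken from any real system):

```python
def launch_option_available(predelegation_on: bool,
                            command_comms_lost: bool,
                            detonation_detected: bool) -> bool:
    """The duty officers may consider a launch only when all three
    'lights' are on: the Kremlin flipped the predelegation switch,
    contact with the national command authority is gone, and sensors
    registered a nuclear detonation inside the Soviet Union."""
    return predelegation_on and command_comms_lost and detonation_detected

# With any one condition unmet, the option never arises:
print(launch_option_available(True, True, False))   # False
print(launch_option_available(True, True, True))    # True
```

Note that even with all three conditions met, the system only enabled a human decision rather than launching automatically, which is exactly the "human firewall" Hoffman describes later.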
In a sense, the spherical bunker really was the most secure place in the Cold War: inexplicably, the Soviet Union never told the US about the existence of its automated deterrence system and the US may not have even realized it existed until after the Soviet Union fell.
The US had figured out the system's "hardware," Hoffman says: the CIA knew about the existence of command rockets that had been tested to beam information to military bases, for instance, but wasn't sure what this hardware was actually for. "Dead Hand" didn't become publicly known in the US until 1993, when researcher Bruce Blair revealed its existence in a New York Times opinion piece.
Bizarrely, Moscow built a system aimed at deterring a strategic nuclear attack against the Soviet Union — and then didn't communicate its existence to the one country capable of launching such a strike.
There's another enduring mystery to the Dead Hand as well. As Hoffman explains, historians and researchers aren't sure whether the duty officers in the spherical bunker were "automatons" trained to activate the system as soon as all three lights turned green — or if they would have been more circumspect before deciding to escalate an ongoing nuclear war that the Soviets were almost necessarily losing.
Luckily for the Soviet Union and perhaps even the survival of the human race, researchers don't have a test case for how those duty officers would have behaved in this situation. But Hoffman says there's evidence that the Soviets valued the "human factor," even if the Kremlin created a secret system that would have killed millions of people in the course of a nuclear war it would already be losing.
"The Soviets did look briefly at totally computer driven automatic system and they decided against it," he said, opting for "a human firewall in a deep, safe bunker."
(As the podcast notes, there's a scene in the 1964 Stanley Kubrick film "Dr. Strangelove: Or How I Learned to Stop Worrying and Love the Bomb" where Peter Sellers' titular character notes that there's no point in having a nuclear doomsday system in place if it's kept secret. The Dead Hand shows that one of "Strangelove's" darkest jokes is closer to the truth than the filmmakers could possibly have realized, but the scene also shows why it's heinously dangerous to set up a doomsday device that's entirely automated — something the architects of the Dead Hand sensibly decided against.)
Listen to the entire podcast here
SEE ALSO: These are the 12 largest man-made explosions in history
Here's why the Apple Watch always shows the time as 10:09 in advertisements (AAPL)
If you see an advertisement for a watch, chances are you'll see the watch's time set to 10:10.
Watchmakers have traditionally chosen 10:10 as their display time because it ensures that the watchmaker's logo, which is usually engraved beneath the 12, isn't obscured by the watch hands. On top of that, having the hands at 10:10 is symmetrical.
Apple, however, chooses to display a slightly different time on all of its Apple Watch promotions, setting the time one minute ahead to 10:09 rather than 10:10.
It's no mistake, either. Apple has a history of choosing a display time that has some significance, famously setting the time on all of its iPhone promotional materials and images to 9:41, the approximate time of day when Steve Jobs first unveiled the iPhone to the world back in 2007.
So why 10:09 for the Apple Watch? Apple appears to be making a statement about being ahead of the curve when it comes to smartwatches, and the facts back this theory up.
Many of the most famous watchmakers have a preference for the exact time displayed on their watches, according to Quartz. Rolex loves 10:10:31, TAG Heuer prefers 10:10:37, and Bell & Ross always opts for uniformity with 10:10:10. Timex, one of the few watchmakers who deviate from the 10:10 norm, displays the time 10:09:36.
Diving deeper, it appears that Apple wants the Apple Watch's time to be ahead of even Timex, and displays a specific time of 10:09:00 or 10:09:30, both of which allow Apple to consider itself "ahead of the times" with the Apple Watch.
So there you have it: it all boils down to Apple using a cheeky pun to symbolically stake its claim to the smartwatch market, all while tipping its hat to an age-old watchmakers' tradition.
Update: Apple blogger Dave Mark over at The Loop has a different theory.
I think this is more about symmetry, about attention to detail, than about being ahead of the curve. At 10:10, the hour hand will be 1/6 of the way between the 10 and the 11 on the watch face. If the minute hand is precisely on the 2 (as it would be at 10:10), the minute and hour hands would not be symmetrical. At 10:09, the hands would be much closer to symmetrical perfection.
That sounds more like Apple logic to me.
Mark makes a great point, and his theory could also explain why the second hand is always at the 12 or the 6 position, as both maintain the symmetry of the watch face. Either way, there's no doubt that Apple was deliberate with its choice.
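Mark's symmetry argument is easy to check with a little clock-hand arithmetic: the minute hand sweeps 6° per minute and the hour hand 30° per hour, and symmetry means the two hands mirror each other across the 12-6 axis. A quick sketch (function names are our own, purely for illustration):

```python
def hand_angles(h, m, s=0):
    """Angles of the hour and minute hands, in degrees clockwise from 12."""
    minutes = m + s / 60
    minute_angle = minutes * 6                    # 360 degrees / 60 minutes
    hour_angle = ((h % 12) + minutes / 60) * 30   # 360 degrees / 12 hours
    return hour_angle, minute_angle

def asymmetry(h, m, s=0):
    """Degrees by which the hands miss mirror symmetry about the 12-6 axis."""
    hour_angle, minute_angle = hand_angles(h, m, s)
    return abs(minute_angle - (360 - hour_angle))

# At 10:10:00 the hands are off by 5 degrees; at 10:09:30, only 1.75 degrees.
# (Perfect symmetry falls at about 10:09:14, i.e. 60/6.5 minutes past 10.)
```

Running the numbers confirms Mark's intuition: a display time in the 10:09 minute sits much closer to mirror symmetry than 10:10 does.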
We went inside the top-secret tunnel under Grand Central that only presidents use
It is no secret now that Franklin D. Roosevelt faced the unique challenge of hiding his crippling illness, believed at the time to be polio, throughout his terms as president. He went to great lengths to hide the disease from the press and the public. One of his many tricks of disguise can still be seen on a secret track hidden below Grand Central Terminal.
Produced by Justin Gmoser. Additional camera by Sam Rega.
The Queen has reigned for so long that other world leaders look like temps next to her
People forget just how historically long Queen Elizabeth's reign has been.
She is the second-longest-serving, still-living monarch in the world today, having been Queen for 62 years. And she will overtake Queen Victoria as the longest-reigning British monarch ever if she is still on the throne on September 9, 2015.
The graphic below shows the Queen's reign compared to the periods in power of the leaders of the United States, France, Germany, Russia and the British prime ministers.
In her time, she has waved goodbye to Hitler, Stalin, Roosevelt, Thatcher, Kennedy, de Gaulle and Brandt.
When Adolf Hitler became chancellor in 1933, Queen Elizabeth was already in Buckingham Palace, as Princess Elizabeth Alexandra Mary. Churchill was the fifth prime minister of her life.
Margaret Thatcher, Britain's most iconic politician of the last 50 years, was in office for just one fifth of the Queen's reign, and US president Ronald Reagan for less than a sixth.
Compared to the leaders we hear the most in the news these days, the Queen is just so much bigger: Nicolas Sarkozy, arguably one of the most influential leaders in Europe in the last few decades, is the second to last square in the fourth line of the graphic. He is almost invisible next to the Queen.
To the Queen, other world leaders must feel like temp workers. Here one day, gone the next.
PS: Among the US presidents, President Kennedy is the small light blue square between President Eisenhower and President Johnson. He was in office for just two years, and his name does not fit into the tiny space available.
Here's how humans have evolved over the past 550 million years
These charts helped US troops identify enemy aircraft during World War II
Distinguishing between friendly and enemy forces can be difficult in war, and World War II posed its own basic battlefield challenges. US soldiers — most of whom were conscripted men who might not have had extensive military experience — needed to be able to quickly recognize their forces' aircraft. This could be a matter of life and death for both soldiers and pilots, as noted by a joint manual from the US War and US Navy Departments issued during the war.
"The existence of these problems was soon apparent when," the manual notes, "after two months, the casualties of the British Advanced Air Striking Force in France amounted to:— Shot down by the Germans, eight: Shot down by the French, nine."
To help rectify the problem of friendly fire, US soldiers were given the Recognition Pictorial Manual to help build their knowledge of enemy and friendly aircraft.
Below are the graphical depictions of Allied and Axis planes that US servicemen had to study, in hopes of achieving "the highest general level of proficiency in recognition."
The pictorial depictions of aircraft were separated into two pages labeled as 'Friend' and 'Foe.'
Each section was then further subdivided by the aircraft's country of origin. Axis aircraft were grouped into those belonging to the Reich, Japan, and Italy, while Allied aircraft were divided between the USA and the UK.
Aside from pictorial representations of the aircraft and their countries of origin, the manual also gives an approximation of each plane's size: each box within the charts designates a 100-foot-by-100-foot area. This information helped US military personnel better spot and recognize both friendly and enemy aircraft at a time when radar technology was still new and not fully deployed.
SEE ALSO: This chart shows the astounding devastation of World War II