Some urban legends are so persuasive, they’ve even made their way into the common classroom, only to be debunked long after they’ve been accepted as canon. Author George Orwell once wrote, "myths which are believed in tend to become true," but in the case of the following false teachings, they have not, and they will not. Here are lies your teacher taught you in school.
Space and the Great Wall
The lie: The only man-made structure that can be seen from outer space is the Great Wall of China, and the astronauts who landed on the Moon could see it from its surface.
The truth: According to NASA, it’s actually very, very difficult for even low-orbiting astronauts to spot the Great Wall of China, and the moonwalkers from the Apollo 12 mission decidedly did not see the structure from the surface of the Moon … or any other human structure, for that matter. One of China’s own cosmic cadets has even helped disprove the long-held belief that the Great Wall is visible from space with the naked eye (it is more readily seen with the use of radar imagery, however).
There are some structures visible from just outside Earth’s atmosphere — including the Pyramids of Giza in Egypt, along with some of the lights, roadways, and major bridges of large cities — but none of those things are within eyeshot of someone on the Moon. The Great Wall rumor is said to have begun as early as 1938, but it couldn’t be dispelled until space boots hit the lunar ground on November 19, 1969 and confirmed that the sight of Earth from its Moon is just a bluish-white blob, with no signs of human life.
We have blue blood
The lie: Human blood runs blue within the veins and only turns red once it’s exposed to oxygen. That’s why it usually looks blue beneath the skin’s surface (and why bruises are often deep blue or purple, too), yet you’ve never seen blue blood outside the body.
The truth: Our blood flows red, both inside the body and out. Blood is red because of the iron-containing molecules (hemes) that bind with oxygen in hemoglobin. When blood does become deoxygenated, it takes on a darker shade of red, but it does not turn blue. Human blood cells are rarely deprived of all oxygen anyway, since they continuously transport oxygen from the lungs to the brain and other body tissues throughout the circulatory system before recirculating through the heart.
The reason veins may look blue to the eye is that they’re made of connective tissues that tend to appear bluish-white, thanks to fancy technical optic factors like skin pigmentation, skin thickness, and — in general — the way light wavelengths reach and reflect off those tissues beneath the skin’s surface, if at all. The blood cells in bruises are also red … well, until our janitorial cells, called phagocytes, come to clean ’em up through the green-yellow-brown adaptation process, that is. Certain sea animals and land insects might indeed bleed blue, but not us. Not ever, even if we turn royal, rich, and snobby.
That ridiculous "10 percent of our brains" thing
The lie: Most humans only use about 10 percent of their brains, and only exceptional geniuses throughout history have been known to use more. There are parts of the brain that may never be tapped by any human being.
The truth: According to leading scientists, we actually use almost all of our brains, and there simply is no sleepy sector staying dormant throughout the day and depriving us of some grand grey matter potential. Some experts can accept that just 10 percent of the brain’s neurons may be firing at any given moment, during a given activity, but that doesn’t mean the rest of the mind is completely inactive. Even the neurons that aren’t firing at that moment may still be receiving signals, ready to fire up again an instant later.
Dr. Barry Gordon of Johns Hopkins School of Medicine summarized the current neurological perspective on the matter by telling Scientific American that, despite the idea’s appeal, "It turns out … we use virtually every part of the brain, and that [most of] the brain is active almost all the time … the brain represents three percent of the body’s weight and uses 20 percent of the body’s energy." That’s way too much energy for 10 percent of effort.
The myth may have originated from a William James quote — in his 1907 essay The Energies of Men, he wrote, "We are making use of only a small part of our mental and physical resources." It also may have been a misquote of Albert Einstein, who was rumored to have insulted a journalist for using just a tenth of his mind. No matter how it began, or how many stupid Limitless or Lucy-like movies Hollywood keeps churning out, modern neurological imaging has since invalidated the theory.
Just five senses
The lie: We have five human sensory functions: sight (optical), sound (auditory), smell (olfactory), taste (gustatory), and touch (tactile).
The truth: The original pentagon of senses, first demarcated by the philosopher Aristotle in Ancient Greece, is still around. However, some contend that those five should be further subcategorized for accuracy’s sake, and others have a few additions to throw on the list as well.
According to the braintrust at Harvard University, many experts would now divide certain members of the big five into smaller groups — like splitting eyesight into varying perceptions such as color, brightness, and depth, for example. Meanwhile, neurologists have also devised a list of other central human senses, some derivative of the originals and some standing on their own. First, there’s equilibrioception — your sense of balance, which relies in small part on vision but much more on the vestibular system of the inner ear. Second, nociception is the sense of pain, which can be distinguished from simple touch by the brain’s distinct reaction to it. Third is proprioception, which relies on specialized touch receptors called spindles. This function allows the brain to know where your limbs are at all times, without seeing or hearing them. Fancy, huh?
There’s also thermoception which, as the name indicates, is your sense of hot and cold. Temporal perception is another name giveaway, as it represents your sense of the passage of time (in other words, thank your pesky basal ganglia for making Mondays drag on, guys). Finally, there’s interoception, which is what the major organs use to let us know when to pay attention to them — like when we need food or water, or to get out of a smoke-filled room in order to breathe. Not all scientists would include that last bit on their lists, but the general consensus is now that you need at least one more hand to count your senses.
Post-mortem hair and nail growth
The lie: Human hair and fingernails continue to grow after death.
The truth: Sorry to let down the more morbid members of the crowd, but without the oxygen a beating heart carries throughout the body, skin cells die within about 12 hours, so hair growth won’t be happening after you’ve passed. Plus, the glucose required to make new fingernail cells and push out the old ones to create the appearance of growth will no longer be produced, so your nails won’t grow (much) either.
That’s not to say a decaying corpse won’t appear to grow a five o’clock shadow and a tiny bit of length on its nails — dehydration draws back the skin on the hands and face, making nails and facial hair appear longer. However, you’re not going to find a coffin full of hair and nails if you ever exhume a dead body.
Edison and the lightbulb
The lie: Thomas Edison invented the lightbulb.
The truth: The so-called Father of Invention might’ve earned his name by dubious means. Yes, Edison did successfully commercialize his version of the light bulb, but a few others came along before his design hit the market in 1879. Most notably, British scientist Humphry Davy created what was called an arc lamp in 1806, using charcoal rods. The result was too bright for individual home use, but eventually brainiacs figured out how to contain incandescent light in a bulb, limiting oxygen contact so the glowing element wouldn’t quickly burn itself up. That’s when Frederick de Moleyns patented his bulb technology in 1841. It wasn’t perfect, by any means, but it was still a start.
Edison eventually came along and hired Francis Upton to work in his Menlo Park lab, and the pair figured out a way to make the bulbs practical and safe for home consumption — namely, developing the kind of power lines needed to make them work — and voila! The incandescent light, as we know it, was born. But no, it wasn’t his idea. He just made other people’s ideas better.
"War of the Worlds" panic
The lie: Orson Welles’ radio broadcast of H.G. Wells’ War of the Worlds on October 30, 1938 sent listeners into a state of mass panic, with people really believing that the nation was under attack from aliens.
The truth: The crowd that tuned in for Welles’ show was actually fairly small compared to the much larger audience with ears on rival programs like NBC’s comedic Chase & Sanborn Hour. Plus, the unrest that did result from the broadcast was substantially less than the news made it out to be. Reports of people flooding hospitals and swarming the streets were proven false, and even though calls to the police in New Jersey — the setting of the story — did increase, they weren’t all the doom-and-gloom dial-ins reporters would have people believe. Some suspect the papers that told tales of large-scale chaos stemming from the show may have been attempting to discredit radio at large, since it was arriving as a viable competitor to the traditional written news medium.
The tongue taste map
The lie: The tongue is divided into a map of four major sectors of taste bud receptors for sweet, sour, salty, and bitter.
The truth: The tongue is covered with thousands of taste buds, and while there are some individual cells that distinctly identify the varying tastes, they are actually spread across the tongue in no discernible pattern. Also, there are actually five different tastes to interpret (so far): the original four plus a fifth called umami, a Japanese term for savory. So your grade-school science teacher and his darn science fair demonstration of bitter and sweet and lies can just march right on out of your memory.
Don’t swallow that gum!
The lie: If you swallow chewing gum, your body can’t digest it, so it stays in your stomach for as many as seven years.
The truth: It’s true that your body may not be able to digest some of the synthesized materials in chewing gum, but according to experts, it’ll simply pass through your system in a matter of days, just like other indigestibles (e.g., corn, unchewed seeds). There is still a chance that too much of it could cause an obstruction, though, so just gulp that gum in moderation, guys.
Australian toilet water
The lie: Toilet water drains counterclockwise in the Northern hemisphere but clockwise in the Southern hemisphere, just like hurricanes do, because of Coriolis forces.
The truth: If you call someone in Australia from the United States and time your flushes, first, don’t call collect like Bart Simpson did. Second, expect that you probably won’t see any directional difference in the drainage — and if you do, it won’t be because of your locations. While Coriolis forces do affect the movement of large bodies of water, like the open ocean, you’re much less likely to see the effect in something as small as a toilet bowl. If you let water sit in a non-directional container long enough that it has absolutely no residual movement (at least a day) and then open the drain without introducing any rotation, you should be able to see the effect. It worked in this Smarter Every Day/Veritasium video of one trial, at least. However, your toilet is built to inject water with a directional spin, and a day is too long to leave your load in the crapper, even if it’s for science.
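For the numerically curious, a back-of-the-envelope estimate shows why the effect vanishes at toilet scale. This is a minimal illustrative sketch using the standard Coriolis parameter formula; the 45-degree latitude and 30-second drain time are assumptions for the sake of the example, not figures from any study:

```python
import math

OMEGA = 7.292e-5  # Earth's rotation rate, rad/s

def coriolis_parameter(lat_deg):
    """f = 2 * Omega * sin(latitude): the vertical component of
    Earth's rotation felt by a horizontal flow at that latitude."""
    return 2 * OMEGA * math.sin(math.radians(lat_deg))

f = coriolis_parameter(45.0)     # mid-latitude value, about 1e-4 rad/s
drain_time = 30.0                # seconds; a generous flush-and-drain
turn_deg = math.degrees(f * drain_time)

print(f"Coriolis parameter: {f:.2e} rad/s")
print(f"Rotation imparted over one flush: {turn_deg:.2f} degrees")
# ≈ 0.18 degrees — utterly swamped by the angled rim jets
```

In other words, over the seconds a bowl takes to empty, Earth’s rotation could nudge the water by a fraction of a degree, while the jets spin it through full circles — which is why only a day-long, dead-still tank reveals the effect.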
The Independence Day Declaration signing
The lie: The Fourth of July commemorates the Founding Fathers’ signing of the Declaration of Independence on July 4, 1776, as they launched the Revolutionary War against England to win national independence.
The truth: The celebration of July 4th as America’s day of independence is a little misunderstood, despite your teacher’s best efforts. While the document is dated July 4, 1776, it was actually July 2 when the 13 American colonies voted to declare independence from Great Britain — July 4 was merely the date on which the revised document was approved. Oddly enough, the document’s signers aren’t even believed to have put quill to Declaration parchment on the famed date. They may have waited as long as a month to sign it, and Britain didn’t receive its copy until the end of August that year. And you thought breaking up with someone over dial-up AIM took forever.