Treatment for alcohol use disorders works best if the patient actively understands and incorporates the interventions provided in the clinic. Multiple factors can influence both the type and degree of neurocognitive abnormalities found during early abstinence, including chronic cigarette smoking and increasing age. A new study is the first to look at the interactive effects of smoking status and age on neurocognition in treatment-seeking alcohol dependent (AD) individuals. Findings show that AD individuals who currently smoke show more problems with memory, ability to think quickly and efficiently, and problem-solving skills than those who don't smoke, effects which seem to become exacerbated with age.
Physicists are developing a new tool that uses low-energy near-infrared light and fiber optics for optogenetic stimulation of cells. They believe it will be a useful tool for mapping physical and functional connections in the brain.
The neural machinery underlying our olfactory sense continues to be an enigma for neuroscience. A recent review in Neuron seeks to expand traditional ideas about how neurons in the olfactory bulb might encode information about odorants. One of the main authors, Terry Sejnowski, had the floor for a brief while at last week's national BRAIN Initiative meeting, where discussion of neural codes was a key issue. The Neuron review was published the day after the meeting, and it supports the previously established idea that the olfactory bulb is in many ways structurally comparable to the retina. The authors note however, that due to the apparent sparsity and lack of topographical organization in the olfactory front end, the particular blend of temporal coding used there should differ significantly from that used in the retina.
Computers are getting smarter and faster; what's lagging is a way to scan and transfer our brain.
(Medical Xpress)-The instability of "white matter" may contribute to humans' greater cognitive decline during aging compared with chimpanzees, scientists from the Yerkes National Primate Research Center at Emory University have found.
Melbourne researchers have identified an immune protein that has the potential to stop or reverse the development of type 1 diabetes in its early stages, before insulin-producing cells have been destroyed.
A ground-breaking advance in colonoscopy technology signals the future of colorectal care, according to research presented today at Digestive Disease Week (DDW). Additional research focuses on optimizing the minimal withdrawal time for colonoscopies and exploring safer methods for removing polyps.
Regular consumption of coffee is associated with a reduced risk of primary sclerosing cholangitis (PSC), an autoimmune liver disease, Mayo Clinic research shows. The findings were being presented at the Digestive Disease Week 2013 conference in Orlando, Fla.
In the May issue of Food Technology magazine published by the Institute of Food Technologists (IFT), Associate Editor Karen Nachay writes about how food manufacturers are trying to overcome formulation challenges to develop better-tasting, low- and reduced-sodium products.
Engineers combine layers of flexible materials into pressure sensors to create a wearable heart monitor thinner than a dollar bill. The skin-like device could one day provide doctors with a safer way to check the condition of a patient's heart.
In the crucial early stages of a possible heart attack, EMTs on the scene now rely on slow and unreliable proprietary technology to transmit vital ECG data to physicians at a hospital for evaluation. But a new iPhone app using standard cell phone networks may help speed the process and, ultimately, cut delays in treatment for heart attack patients.
The type of sensors that pick up the rhythm of a beating heart in implanted cardiac defibrillators and pacemakers are vulnerable to tampering, according to a new study conducted in controlled laboratory conditions.
New research indicates that women's reproductive function may be tied to their immune status. Previous studies have found this association in human males, but not females.
Our visions of the future tend to be forged in the pages of science fiction. But for the past half-century, a number of prominent thinkers, activists, and scientists have made significant contributions to our understanding of what the future could look like. Here are 10 recent futurists you absolutely need to know about.
Above image courtesy Dylan Cole.
A few months ago we told you about 9 historical figures who may have predicted our future. Now it's time to focus on major contributions made during the past five decades.
Needless to say, there were dozens upon dozens of amazing futurists who could have been included in this article, so it wasn't easy to pare down this list. But given the breadth and depth of futurist discourse, we decided to select ten thinkers whose contributions should be considered seminal and highly influential to their field of study.
And as for any futurists I might have missed, please add them in the comments and let's discuss! But let's not get into futurists who are also scifi writers - that's a separate list I'll get to in the future.
1. Robert Ettinger
He's known as the intellectual father of the cryonics movement. Physicist Robert Ettinger, who died in 2011 and is currently in cryonic stasis, was an early advocate of immortalism, or what we would today call radical life extension. In his 1964 book, The Prospect of Immortality, Ettinger argued that whole-body or head-only freezing should be used to place the recently deceased into a state of suspended animation for later revival. To that end, he made the case that governments should immediately start a mass-freezing program. He also believed that the onset of immortality would endow humanity with a higher, nobler nature.
"Someday there will be some sort of psychological trigger that will move all these people to take the practical steps they have not yet taken," he wrote, "when people realize that their children and grandchildren will enjoy indefinite life, that they may well be the last generation to die."
Ettinger is also considered a pioneer in the transhumanist movement by virtue of his 1972 book, Man Into Superman.
Image: Christopher Barnatt/ExplainingTheFuture.
2. Shulamith Firestone
Back in 1970, at the tender age of 25, Shulamith Firestone kickstarted the cyberfeminist movement by virtue of her book, The Dialectic of Sex: The Case for Feminist Revolution. To come up with her unique feminist philosophy, Firestone took 19th and 20th century socialist thinking and fused it with Freudian psychoanalysis and the existentialist perspectives of Simone de Beauvoir.
Firestone argued that gender inequality was the result of a patriarchal social structure that had been imposed upon women on account of their necessary role as incubators. She felt that pregnancy, childbirth, and child-rearing imposed physical, social, and psychological disadvantages upon women - and that the only way for women to free themselves from these biological impositions would be to seize control of reproduction. She advocated for the development of cybernetic and assistive reproductive technologies, including artificial wombs, gender selection, and in vitro fertilization. In addition, she advocated for the dissemination of contraception, abortion, and state support for child-rearing.
She would prove to be a major influence on later thinkers like Joanna Russ (author of "The Female Man"), sci-fi author Joan Slonczweski, and Donna Haraway (who we'll get to in just a bit).
3. I. J. Good
British mathematician I. J. Good was one of the first thinkers - if not the first - to properly articulate the problem that is the pending Technological Singularity. Predating Hans Moravec, Ray Kurzweil, and Vernor Vinge by several decades, Good penned an article in 1965 warning about the dramatic potential for recursively improving artificial intelligence.
Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an 'intelligence explosion,' and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make.
The phrase intelligence explosion has since been adopted by futurists critical of "soft" Singularity scenarios, like a slow takeoff event, or Kurzweilian notions of the steady, accelerating growth of all technologies (including intelligence). His work has influenced AI theorists like Eliezer Yudkowsky, Ben Goertzel, and of course, the Machine Intelligence Research Institute (formerly the Singularity Institute for Artificial Intelligence).
Interestingly, Good served as a cryptologist at Bletchley Park with Alan Turing during World War II. He also worked as a consultant on supercomputers for Stanley Kubrick for the 1968 film, 2001: A Space Odyssey.
4. K. Eric Drexler
Back in 1959, the renowned physicist Richard Feynman delivered an extraordinary lecture titled "There's Plenty of Room at the Bottom" in which he talked about the "experimental physics" of "manipulating and controlling things on a small scale." This idea largely languished, probably because it was ahead of its time. It wouldn't be until 1986 and the publication of K. Eric Drexler's Engines of Creation: The Coming Era of Nanotechnology that the idea of molecular engineering would finally take root and take its modern form.
Drexler, by virtue of this book and his subsequent lectures and writings, was the first futurist to give coherency to the prospect of molecular nanotechnology. Given the potential for working at such a small scale, Drexler foresaw the rise of universal assemblers (also called molecular assemblers, or simply "fabs") - machines that can build objects atom by atom (basically Star Trek replicators). He predicted that we'll eventually use nanotech to clear the environment of toxins, grow rockets from a single seed, and create biocompatible robots that will be injected into our bodies. But unlike Robert Ettinger, Drexler actually came up with a viable technique for reanimating individuals in cryonic suspension; he envisioned fleets of molecular robots guided by sophisticated AI that would reconstruct a person thawed from liquid nitrogen.
But he also foresaw the negative consequences, such as weaponized nanotechnology and the potential for grey goo - an out-of-control scourge of self-replicating micro-machines.
As an aside, Drexler also predicted hypertext.
Image: New Scientist.
5. Timothy Leary
Timothy Leary is typically associated with drug culture and the phrase "turn on, tune in, drop out," but his contributions to futurism are just as significant - and surprisingly related. He developed his own futurist philosophy called S.M.I².L.E., which stands for Space Migration, Intelligence Increase, and Life Extension. These ideas grew out of Leary's life-long interest in seeing humanity evolve beyond its outdated morality, and they would prove to be highly influential within certain segments of the transhumanist community.
As a futurist, Leary is also important in that he was an early advocate for cognitive liberty and the potential of neurodiversity. Through his own brand of psychedelic futurism, he argued that we have the right to modify our minds and create our own psychological experiences. He believed that each psychological modality - no matter how bizarre or unconventional - could still be ascribed a certain value. What's more, given the extreme nature of certain psychedelic experiences, he also demonstrated the potential for human consciousness to function beyond what's considered normal.
6. Donna Haraway
Donna Haraway made a name for herself with the publication of her 1985 essay, "A Manifesto for Cyborgs: Science, Technology, and Socialist Feminism in the 1980s." At the time, it was seen as a reaction to the rise of anti-technological ecofeminism, but it has since been interpreted and reinterpreted by everyone from postmodernist lefties to transhumanist postgenderists.
Referring to Haraway as a Cyborgian Socialist-Feminist, the futurist and sociologist James Hughes describes her legacy this way:
Haraway argued that it was precisely in the eroding boundary between human beings and machines, and between women and machines in particular, that we can find liberation from the old patriarchal dualisms. Haraway says she would rather be a cyborg than a goddess, and proposes that the cyborg could be the liberatory mythos for women. This essay, and Haraway's subsequent writings, have inspired a new cultural studies sub-discipline of "cyborgology," made up of feminist culture and science fiction critics, exploring cyborgs and the woman-machine interface in various permutations.
And as Wired's Hari Kunzru noted, "Sociologists and academics from around the world have taken her lead and come to the same conclusion about themselves. In terms of the general shift from thinking of individuals as isolated from the "world" to thinking of them as nodes on networks, the 1990s may well be remembered as the beginning of the cyborg era."
7. Peter Singer
He's primarily regarded as a philosopher, ethicist, and animal rights advocate, but Princeton's Peter Singer has also made a significant impact on futurist discourse - albeit through rather unconventional channels.
Singer, as a utilitarian, social progressive, and personhood-centered ethicist, has argued that the suffering of animals, especially apes and large mammals, should be put on par with the suffering of children and developmentally disabled adults. To that end, he founded the Great Ape Project, an initiative that seeks to confer basic legal rights to non-human great apes, namely chimpanzees, bonobos, gorillas, and orangutans. It's a precursor to my own Rights of Non-Human Persons Program, which also includes dolphins, whales, elephants - and makes provisions for artificial intelligence. Singer has also suggested that chickens be genetically engineered so that they experience less suffering.
And in his 1999 book, A Darwinian Left: Politics, Evolution, and Cooperation, Singer argued that there is a biological basis for human selfishness and hierarchy - one that has thwarted our attempts at egalitarian reform. What's needed, says Singer, is the application of new genetic and neurological sciences to identify and modify the aspects of human nature that cause conflict and competition - what today would be regarded as moral enhancement. He supports voluntary genetic improvement, but rejects coercive eugenic pseudo-science.
8. Freeman Dyson
Theoretical physicist and mathematician Freeman Dyson is one of the first thinkers to consider the potential for megascale engineering projects.
His 1960 paper, "Search for Artificial Stellar Sources of Infrared Radiation," outlined a way for an advanced civilization to utilize all of the energy radiated by its sun - an idea that has since inspired other technologists to speculate about similar megastructures, like Matrioshka brains and Jupiter brains.
9. Nick Bostrom
Swedish philosopher and neuroscientist Nick Bostrom is one of the finest futurists in the business, renowned for taking heady concepts to the next level. He has suggested, for example, that we may be living in a simulation, and that an artificial superintelligence may eventually take over the world - if not destroy us altogether. Indeed, one of his primary concerns is assessing the potential for existential risks. An advocate of transhumanism and human enhancement, he co-founded the World Transhumanist Association in 1998 (now Humanity+), and currently runs the Future of Humanity Institute at Oxford.
Image: Nick Bostrom.
10. Aubrey de Grey
Love him or hate him, gerontologist Aubrey de Grey has revolutionized the way we look at human aging.
He's an advocate of radical life extension who believes that the application of advanced rejuvenation techniques may help many humans alive today live exceptionally long lives. What makes de Grey unique is that he's the first gerontologist to put together an actual action plan for combating aging; he's one of the first thinkers to conceptualize aging as a disease unto itself. Rather than looking at the aging process as something inexorable or hopelessly complicated, his macro-approach (Strategies for Engineered Negligible Senescence, or SENS) consists of a collection of proposed techniques that would work not just to rejuvenate the human body, but to stop aging altogether.
Back in 2006, MIT's Technology Review offered $20,000 to any molecular biologist who could demonstrate that de Grey's SENS is "so wrong that it was unworthy of learned debate." No one was able to claim the prize. But a 2005 EMBO report concluded that none of his therapies "has ever been shown to extend the lifespan of any organism, let alone humans." Regardless of the efficacy of de Grey's approach, he represents the first generation of gerontologists to dedicate their work to the problem that is human aging. Moreover, he's given voice to the burgeoning radical life extension movement.
Big names in medicine are set to give an upbeat assessment of the war on AIDS on Tuesday, 30 years after French researchers identified the virus that causes the disease.
Mathematicians have developed a mathematical model to disrupt the flow of information in a complex real-world network, such as a terrorist organization, using minimal resources.
Scientists at the University of California, Davis have engineered a strain of photosynthetic cyanobacteria to grow without the need for light. They report their findings today at the 113th General Meeting of the American Society for Microbiology.
(AP)-A woman who lost both hands, her left leg and right foot after contracting a flesh-eating disease has been fitted with prosthetic hands.
Whether you need to brush up on your chemistry, or just love it when someone sets the Periodic Table to music, AsapSCIENCE's The NEW Periodic Table Song is for you.
This rundown of the elements in numerical order is set to Jacques Offenbach's Infernal Galop, but was otherwise written, produced, and performed by Mitchell Moffit. Here are the lyrics in case you missed anything:
There's Hydrogen and Helium
Then Lithium, Beryllium
Boron, Carbon everywhere
Nitrogen all through the air

With Oxygen so you can breathe
And Fluorine for your pretty teeth
Neon to light up the signs
Sodium for salty times

Magnesium, Aluminium, Silicon
Phosphorus, then Sulfur, Chlorine and Argon
Potassium, and Calcium so you'll grow strong
Scandium, Titanium, Vanadium and Chromium and Manganese

CHORUS
This is the Periodic Table
Noble gas is stable
Halogens and Alkali react aggressively
Each period will see new outer shells
While electrons are added moving to the right

Iron is the 26th
Then Cobalt, Nickel coins you get
Copper, Zinc and Gallium
Germanium and Arsenic

Selenium and Bromine film
While Krypton helps light up your room
Rubidium and Strontium then
Yttrium, Zirconium

Niobium, Molybdenum, Technetium
Ruthenium, Rhodium, Palladium
Silver-ware then Cadmium and Indium
Tin-cans, Antimony then Tellurium and Iodine and Xenon and then Caesium and...

Barium is 56 and this is where the table splits
Where Lanthanides have just begun
Lanthanum, Cerium and Praseodymium

Neodymium's next too
Promethium, then 62's
Samarium, Europium, Gadolinium and Terbium
Dysprosium, Holmium, Erbium, Thulium
Ytterbium, Lutetium

Hafnium, Tantalum, Tungsten then we're on to
Rhenium, Osmium and Iridium
Platinum, Gold to make you rich till you grow old
Mercury to tell you when it's really cold

Thallium and Lead then Bismuth for your tummy
Polonium, Astatine would not be yummy
Radon, Francium will last a little time
Radium then Actinides at 89

Actinium, Thorium, Protactinium
Uranium, Neptunium, Plutonium
Americium, Curium, Berkelium
Californium, Einsteinium, Fermium
Mendelevium, Nobelium, Lawrencium
Rutherfordium, Dubnium, Seaborgium
Bohrium, Hassium then Meitnerium
Darmstadtium, Roentgenium, Copernicium

Ununtrium, Flerovium
Ununpentium, Livermorium
Ununseptium, Ununoctium
And then we're done!!
More than a decade ago, British parents refused to give measles shots to at least a million children because of a vaccine scare that raised the specter of autism. Now, health officials are scrambling to catch up and stop a growing epidemic of the contagious disease.
The use of a smartphone application significantly improves patients' preparation for a colonoscopy, according to new research presented today at Digestive Disease Week (DDW). The preparation process, which begins days in advance of the procedure, includes dietary restrictions and requires specific bowel preparation medication to be taken at strict intervals. The better the preparation, the easier it is for doctors to see cancer and precancerous polyps in the colon. The study, which was conducted by the gastroenterologists of Arizona Digestive Health in Phoenix, featured the first doctor-designed app of its kind.
Did you know that working around a certain chemical can make your jaw glow green and have to be chopped off? Not your teeth. Not your bones. Not your head. Your jaw. Learn what happens when biochemistry gets terrifyingly specific.
Around the mid-1800s, the first instances of "phossy jaw" cropped up. It wasn't called phossy jaw at first. It was just a terrible disease that seemed to afflict mostly those who worked in factories that manufactured matches. It began as generalized pain in the jaw and swelling of the gums. Then abscesses would crop up along the jaw, and more and more bone would be exposed. The bone would glow in the dark. That got people's attention, but not as much as the people who had to have their jaws amputated in order to prevent infection from spreading to the rest of the body.
It didn't take long before researchers narrowed down the culprit for this particular ill - the new chemical known as yellow (now called white) phosphorus. Eventually, the use of white phosphorus in everyday materials was stopped, and proper safety standards were put in place in the few places that still used the chemical. But the horror of "phossy jaw" remained. The horror wasn't just due to the physical devastation that phosphorus caused, but to the element of the bizarre. If a chemical is inhaled, it might damage the lungs. If it is touched, it might poison the skin. If it enters the body, it might poison the whole body or act most strongly in one tissue or organ. But how could it specifically single out the jawbone? Why, of all the places in the body, did it only show up there? If you had read, in a novel, about a poison that made the jaw bone glow, would you believe it?
Phosphorus has the ability to affect tissue this way because of the way bone is maintained in the body. Although it seems static and unchanging, bone is thinning, thickening, breaking and repairing all the time. Two major cell types responsible for this are osteoclasts and osteoblasts. The osteoclasts are the cells that break down bone and, in a way, ingest it. The osteoblasts build it back up. This breakdown and buildup of bone is crucial to keeping the tissue healthy. Phosphorus binds to bone. Osteoclasts break down that bone tissue and assimilate it. The chemical poisons them and they die, and so bone no longer gets broken down. Bone "turnover," or rejuvenation, stops. In some tissues this turnover rate is naturally higher than in others, and the jaw, particularly the tissue around the teeth, has a high turnover rate. The osteoclasts get poisoned, the process stops, and the tissue sickens and dies in the most gruesome and bizarre way possible. Sometimes fiction doesn't hold a candle to life.
(AP)-Former Navy Secretary Richard J. Danzig, who has served as a bio-warfare adviser to the president, the Pentagon, and the Department of Homeland Security, urged the government to stockpile an anti-anthrax drug while serving as a director for the company that supplied it, according to a report published Sunday.
What is the reason for the widely reported fact that men are more likely than women to die of cancer? New evidence from population studies suggests that free testosterone could be a key driver of cancer aggressiveness in a broad range of solid tumors and sarcomas, not just gender-specific cancers, according to researchers at the Danbury Hospital Research Institute. The conclusions, published in PLOS One, are based on analyses of about 1.2 million cases from the National Cancer Institute's Surveillance, Epidemiology and End Results (SEER) database.
Sometimes, more medical information is a bad thing. The influential United States Preventive Services Task Force recommends against most women getting genetic screenings for breast cancer susceptibility. Why? Because the tests are imperfect: for every woman who truly carries a gene associated with early-onset breast cancer, even more will falsely test positive, leading spooked patients into needless surgery or psychological trauma. Super-cheap genetic testing from enterprising health startups, such as 23andMe, has complicated cancer detection for us all by increasing the accessibility of imperfect medical information.
After discovering a mutated BRCA1 gene, known to raise the lifetime risk of breast cancer to 60 to 80 percent, actress Angelina Jolie underwent a radical preventive double mastectomy. Her brave confession in the New York Times brought much-needed attention to breast cancer awareness, but such information is dangerous in the hands of a statistically illiterate population.
For instance, as New York Times statistical guru Nate Silver once reminded me, while breast cancer mammograms are 75 percent accurate, a woman who tests positive has only about a 10 percent chance of actually having cancer. Since the vast majority of women don't have cancer, far more women will falsely test positive (here is a helpful blog post with the numbers worked out). Most importantly, surveys reveal that many people don't understand the math behind false positives in cancer testing, and may make uninformed decisions as a result.
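The counterintuitive figure follows from Bayes' theorem. Here's a minimal sketch with illustrative round numbers - roughly 1 percent prevalence among screened women, 80 percent sensitivity, and a roughly 10 percent false-positive rate are assumptions for demonstration, not figures from the article:

```python
# Why a positive mammogram doesn't mean a ~75% chance of cancer:
# Bayes' theorem with illustrative screening numbers.
prevalence = 0.01        # assumed: ~1% of screened women have breast cancer
sensitivity = 0.80       # assumed: P(test positive | cancer)
false_positive = 0.096   # assumed: P(test positive | no cancer)

# Total probability of a positive result, over both groups
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)

# Posterior probability of cancer given a positive test
p_cancer_given_positive = sensitivity * prevalence / p_positive

print(f"{p_cancer_given_positive:.1%}")  # prints "7.8%"
```

Even with a fairly accurate test, the posterior stays low because true cases are rare relative to false alarms - which is exactly the problem with mass screening of low-risk populations.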
The same math holds true for the mutated BRCA1/2 genes of Jolie's confession: researchers estimate that a tiny 0.11 to 0.12 percent of women have the faulty gene. "I believe in doing genetic testing for BRCA1/2 with appropriate counseling," writes the University of Southern California's David Agus, one of Steve Jobs' cancer doctors. "The answers are not simple in this case and require experienced professionals to discuss with the patient."
While such testing traditionally cost hundreds of thousands of dollars, a cottage industry of cheap genetic testing has sprung up. 23andMe, one of the most popular services, offers testing for as little as $99, and has even dared to weigh in on the BRCA controversy on the company blog.
Citing a new study that found no negative emotional consequences among patients after learning of their BRCA1 mutations, 23andMe concludes, "The findings are important given that a frequent criticism of direct-to-consumer testing is based on the assumption that it causes either serious emotional distress or triggers deleterious actions on the part of consumers. Given the absence of evidence for serious emotional distress or inappropriate actions in this subset of mutation-positive customers who agreed to be interviewed for this study, broader screening of Ashkenazi Jewish women for these three BRCA mutations should be considered."
Sometimes, however, voluntary surveys don't tell the whole story. In its cover story on Jolie's decision, TIME magazine recounts the tale of one woman who likely had unnecessary preventative surgery after learning about a genetic defect. "She freaked out and had a bilateral mastectomy," said Otis Brawley, chief medical officer for the American Cancer Society, who worried that this patient's particular mutation was not as troubling as she worried it was.
Interestingly, TIME's author, Kate Pickert, argues that the financial cost of genetic testing has stalled a mass run on genetic tests. Even a new provision under the Affordable Care Act (a.k.a. Obamacare) mandates 100 percent insurance coverage only for patients with a family history of genetic flaws.
But, at just $99 (and probably far less in the future), financial barriers are crumbling. This isn't to say that genetic screening is bad; it just complicates things for the rest of us, especially those who don't understand statistics. The more women get tested, the more false positives there will be, and the less confident patients and physicians become in a course of action.
Maybe our only hope out of this cheap-testing spiral is technology that makes detection more accurate and more predictive. One promising solution is a new bra that constantly monitors deep tissue for cancerous signs.
So, perhaps, before long, we will innovate our way out of this dilemma.
PSYCHOLOGY - Ketamine shows significant therapeutic benefit in people with treatment-resistant depression
Patients with treatment-resistant major depression saw dramatic improvement in their illness after treatment with ketamine, an anesthetic, according to the largest ketamine clinical trial to date, led by researchers from the Icahn School of Medicine at Mount Sinai. The antidepressant benefits of ketamine were seen within 24 hours, whereas traditional antidepressants can take days or weeks to demonstrate a reduction in depression.
In 2008 researchers from the University of Southern Denmark showed that the drug thioridazine, which has previously been used to treat schizophrenia, is also a powerful weapon against antibiotic-resistant bacteria such as staphylococci (Staphylococcus aureus).
It's not every day that you come across a childhood game in a psychology experiment - particularly one you thought you had made up. But the "finger tapping" game gives us some insight into our past, and explains why people can't understand you even when you think you're being clear as day.
A few days ago I wrote about cryptomnesia, the phenomenon of people believing that they had invented a thought that they had in fact only remembered. (If you've ever been reminded by stone-faced companions that the joke you thought you made up was actually from The Simpsons, you've had a run-in with cryptomnesia.) What do I come across a few days later but a childhood game - one I thought I had invented with my friends - actually featuring in a psychology experiment.
In 1990, Elizabeth Newton came up with a test in which one person "taps out" a song with their finger, and sprang it on about fifty students. Each tapped out a popular, familiar song of their choice, while an assigned partner attempted to guess what the song was. Tappers thought their partners would be able to guess their songs about fifty percent of the time. In fact, listeners guessed correctly only about three percent of the time.
I never kept tabs on the games I played with my friends - we were a wild bunch, and would have no truck with formal statistics when it came to our crazy finger tapping games - but I remember the sense of frustration when my friends were unable to guess the music that I was tapping out as plain as day. I could hear how the taps perfectly coincided with the notes of the song. Why couldn't they?
They couldn't because all they heard was tap . . . tap tap tap . . . tap tap, tap-tap, tap tap (that, by the way, was Rule, Britannia) and it didn't correspond to any song they'd ever heard. Like babies, we tend to believe that whatever information is obvious to us must be obvious to the world at large. Although we know intellectually that other people can't possibly know what's on our minds, there remains a lingering sense that we're communicating everything to the outside world. Games like Taboo and Pictionary capitalize on both sides of that frustration - especially when a player gets stuck in a loop because they can't possibly imagine that anyone could be dense enough not to understand what they've been communicating, while their team is going out of their minds with frustration because two circles and a square don't help us understand what you're trying to say, no matter how many times you underline them, Gary! (Sorry, I may be remembering something traumatic.)
But we don't need official games, or psychological experiments, to trip us up in this regard. In life it often seems clear to us that we've communicated something - enough information to get to a destination, our own discomfort at a situation, or the fact that we're only joking when we make a sarcastic remark - only to be surprised when people don't understand us. We're not as transparent, either with our mouths or our body language and expression, as we think we are. We don't understand that other people aren't trapped in our head with us.
Top Image: Jin