Judith Potts writes in the Daily Telegraph:
One day in 2007 my late mother – then a sprightly 93-year-old – said to me “I do wish these people would get off my sofa. They sit there all day and only go if I tap them on their heads or shoulders.” She and I were the only people in the room.
I was extremely alarmed when she described her “visions”. Not only were there the faceless people on her sofa but other apparitions which peppered her daily life and had been doing so for about 18 months. I suspect she decided to confide in me at that point because some of the “visions” were becoming difficult to tolerate. Up to that moment, she was terrified – not of the visions, but of losing her sanity.
Listening to her descriptions of gargoyle-like creatures evading capture, an Edwardian funeral procession – complete with plumed horses, carriages and clergy in red cassocks – and an urchin hopping from room to room, I was perplexed. I drove home pondering how to help, picking up a newspaper on the way. To my enormous surprise, the paper carried an article about exactly my mother’s experience and I learned that the condition had a name – Charles Bonnet Syndrome.
With this lucky discovery, I began to do some research and was able to reassure my mother that she was not alone – in fact it is reckoned that there might be two million people in the UK suffering from this condition….
The medical term for mum’s “visions” is Visual Hallucinations and they occur when there is partial or total loss of vision, caused by macular degeneration, diabetic retinopathy or cataracts. The brain, which controls sight through the eye, fills in the blank spots with stored images. These can be from real life, from film or television, from books or radio. While these visual hallucinations tend to happen to people as they age, anyone who experiences loss of vision can be affected, even children.
We shall have to see whether the conclusions drawn by this reporter (for Religion News) reflect his own biases or are indeed what the pontiff has in mind.
The pope’s homily was striking for its repeated references to environmental protection, highlighting what is likely to be a central theme of his papacy and setting up the 76-year-old pope as a leading activist against climate change.
Should people be allowed to see details of their own genetic information?
The public discussion about DNA testing tends to focus on ethical dilemmas: What if doctors find that a person’s father isn’t really their father? Should they tell a patient about a DNA glitch if it’s only occasionally linked to disease? What if, while looking for mutations that could explain a known sickness, they stumble on others that might predict late-life dementia or indicate the presence of HIV? Would adding this data to someone’s medical record affect health insurance rates? What if—gasp—we end up with a real-life Gattaca?
These questions are worth talking about. But the genetics community and popular press spend too much time debating when and how the medical establishment should “protect” people from their children’s or their own DNA.
For example, many bioethicists argue that DNA glitches shouldn’t be disclosed if they’re ambiguous or linked to untreatable conditions. Doing so “may create unwanted psychosocial burdens on parents,” according to a commentary on newborn sequencing in the Journal of the American Medical Association.
Ah yes, bioethicists.
They really are a useless, worthless bunch, philosopher-princelings and wannabe clergymen who hawk their insulting and condescending babble to a political class always looking for new reasons to tell people what not to do.
It is way past time for the 25 or so states that have restricted the right of citizens to obtain their genetic information from direct-to-consumer companies to lift their bans.
Indeed it is.
Grim reading, I fear, for New Year’s Day, but the (London) Daily Mail has a report here on the evolution of the ‘Liverpool Pathway’, a National Health Service procedure which “involves withdrawal of lifesaving treatment, with the [terminally] sick sedated and usually denied nutrition and fluids. Death typically takes place within 29 hours.”
The notion that allowing someone to die—even if heavily sedated—through starvation or thirst is somehow humane is grotesque. At some level a good number of these patients know what is going on, and often they do indeed suffer.
And then there’s this:
Up to 60,000 patients die on the Liverpool Care Pathway each year without giving their consent, shocking figures revealed yesterday. A third of families are also kept in the dark when doctors withdraw lifesaving treatment from loved ones.
Despite the revelations, Jeremy Hunt last night claimed the pathway was a ‘fantastic step forward’.
So much for consent.
Naturally, anti-euthanasia vigilantes are up in arms over this news, but they would do better to ask themselves what their own pressure to keep patients alive, regardless of what those patients themselves want, has done to contribute to the creation of this ‘pathway’.
Those opposed to empowering agonized patients to shape their own exit for themselves often talk darkly about a slippery slope. Greedy relatives, stingy governments and all that. Well, the individual being starved or dried out to death (sometimes, it seems, without even the courtesy of being asked for his consent), not to speak of someone trapped in the coils of an excruciating disease from which he has no means to extricate himself, may well feel that he has already arrived at the bottom of the abyss.
Appalling. Simply appalling.
The Washington Post reports:
Sen. Marco Rubio (R-Fla.) said in an interview that he isn’t certain what the age of the earth is, and that parents should be able to teach their kids both scientific and religious attempts to answer the question.
“I’m not a scientist, man. I can tell you what recorded history says, I can tell you what the Bible says, but I think that’s a dispute amongst theologians and I think it has nothing to do with the gross domestic product or economic growth of the United States,” Rubio told GQ. “I think the age of the universe has zero to do with how our economy is going to grow. I’m not a scientist. I don’t think I’m qualified to answer a question like that.”
The U.S. Geological Survey notes that scientists have estimated the earth’s age to be about 4.5 billion years old. Rubio, who identifies himself as Catholic, noted there are both faith-based and scientific theories about the earth’s beginnings. He said that he is “not sure” we will ever be able to fully answer the question of how old our planet is.
“At the end of the day, I think there are multiple theories out there on how the universe was created and I think this is a country where people should have the opportunity to teach them all,” he added. “I think parents should be able to teach their kids what their faith says, what science says. Whether the Earth was created in 7 days, or 7 actual eras, I’m not sure we’ll ever be able to answer that. It’s one of the great mysteries.”
Parents can, of course, teach their children whatever idiocy they may choose at home or in church, but as for the rest of what Rubio has to say, well…
Ten years after the publication of The Blank Slate, The Daily Telegraph’s Ed West asks some awkward questions:
…The blank slate doctrine affects almost every area of our lives. Take, for example, recent moves in Ireland to set quotas on women in politics, a move that is moderate compared to quota systems already implemented in Scandinavia. Whether one thinks this is right or not, what is wrong is that the starting premise is a totally pseudoscientific view of human nature – gender feminism.
As Pinker wrote, there are two types of feminism: “Equity feminism is a moral doctrine about equal treatment that makes no commitments regarding open empirical issues in psychology or biology. Gender feminism is an empirical doctrine committed to three claims about human nature. The first is that the differences between men and women have nothing to do with biology but are socially constructed in their entirety. The second is that humans possess a single social motive – power – and that social life can be understood only in terms of how it is exercised. The third is that human interactions arise not from the motives of people dealing with each other as individuals but from the motives of groups dealing with other groups – in this case, the male gender dominating the female gender.
“In embracing these doctrines, the genderists are handcuffing feminism to railroad tracks on which a train is bearing down.”
Gender feminism is no more scientific than astrology, yet the idea of total equality of outcomes is still some sort of vague official goal among the European elite, largely because “people’s unwillingness to think in statistical terms has led to pointless false dichotomies”, between “women are unqualified” and “fifty-fifty absolutely”…
…But just as the good name of feminism has been stigmatised by its radical wing, the whole of the social sciences have been damaged by the blank-slate orthodoxy, which has led to widespread anti-intellectualism, since the public at large come to view academia as a font of convenient untruths and agenda-driven nonsense (or to use the popular phrase, political correctness). Worst of all it has actually made it harder to help the most vulnerable, because we fail to take account of the fact that some people are less smart than others or, as Savulescu pointed out, more prone to vice or violence; and it has even made society less sympathetic to people who, because they have been less blessed by nature, lose out in the rat race.
A decade after The Blank Slate, why is human nature still taboo?
Mr. West effectively answers his own question here:
I don’t agree with Pinker about everything… His belief that there is no soul – “the ghost in the machine” – I find too awful to contemplate.
And that’s just why so many people cling to the belief in the blank slate.
Others, more sinister, find it politically useful.
Read the whole thing.
Jules Evans over-worries:
I had evangelists either side of me at dinner. The woman on my right was beautiful, charming and, technically speaking, psychotic. I mean that in the nicest possible way. Her eyes grew wide as she told me how God regularly spoke to her, cared for her, entered her. She believed she had witnessed many miracles, that her eyes had been opened to a hidden level of reality.
A western psychiatrist would nod and tick off the classic symptoms of psychosis: hearing voices, feeling guided by spirits, feeling singled out by the universe, believing you have magical abilities to save the world. Our psychiatric wards are full of people locked up for expressing such beliefs.
We define ourselves, as a culture, by our attitude to such experiences. Before the modern age, they were very common and were categorised as heavenly or demonic visitations. Some of the founding figures of civilisation were, technically speaking, psychotic: Socrates, the father of western rationalism, had a daemon who gave him orders.
But since the 17th century such phenomena have been shifted to the margins of our secular, scientific, post-animist culture and defined as pathological symptoms of a physical or emotional disease. Today, if you tell your doctor about such experiences, you are likely to receive a diagnosis of schizophrenia and be prescribed debilitating anti-psychotic drugs.
And yet such experiences are very common. A new paper by Heriot-Maitland, Knight and Peters in the British Journal of Clinical Psychology (BJCP) estimates that 10-15 percent of the population encounter “out-of-the-ordinary experiences” (OOEs) such as hearing voices. By automatically pathologising and hospitalising such people, we are sacrificing them to our own secular belief system, not unlike the Church burning witches.
Actually, it’s completely “unlike”. There were no (true) witches, while the pathological conditions are real. That they are far from rare doesn’t make them any less of a disease.
But that doesn’t mean that the treatment always has to be medical in the conventional sense of the word, and here Mr. Evans makes a good point:
Perhaps we need to find a more pragmatic attitude to revelatory experiences, an attitude closer to that of William James, the pioneering American psychologist and pragmatic philosopher. James studied many different religious experiences, asking not “Are they true?” but rather “What do they lead to? Do they help you or cause you distress? Do they inspire you to valuable work or make you curl up into a ball?”
We can evaluate the worth of a revelatory experience without trying to find out if the experience “really” came from God or not.
Fair enough, therapeutically speaking, but keep the pills handy….
The Atlantic recently ran a piece on the use of smartphone apps as behavioral trainers. It is an interesting enough topic in its own right but it was a good reintroduction to B. F. Skinner too. I hadn’t thought about him for ages. The description of the angry response he generated made me think that I should:
In 1965, when Julie Vargas was a student in a graduate psychology class, her professor introduced the topic of B. F. Skinner, the Harvard psychologist who, in the late 1930s, had developed a theory of “operant conditioning.” After the professor explained the evidently distasteful, outmoded process that became more popularly known as behavior modification, Vargas’s classmates began discussing the common knowledge that Skinner had used the harsh techniques on his daughter, leaving her mentally disturbed and institutionalized. Vargas raised her hand and stated that Skinner in fact had had two daughters, and that both were living perfectly normal lives. “I didn’t see any need to embarrass them by mentioning that I was one of those daughters,” she says.
Vargas is a retired education professor who today runs the B. F. Skinner Foundation out of a one-room office in Cambridge, Massachusetts, a block away from Harvard Yard. The foundation’s purpose is largely archival, and Vargas spends three days a week poring over boxes and shelves full of lab notes, correspondence, and publications by her father, who died in 1990. A prim but engaging woman, Vargas can’t seem to help seething a bit about how her father’s work was perceived. She showed me a letter written in 1975 by the then wildly popular and influential pediatrician Benjamin Spock, who had been asked to comment on Skinner’s work for a documentary. “I’m embarrassed to say I haven’t read any of his work,” Spock wrote, “but I know that it’s fascist and manipulative, and therefore I can’t approve of it.”
The other, greater (if fictional) Spock would have found that most irrational…
Behaviorism exploded in prominence in the 1950s and ’60s, both in academic circles and in the public consciousness. But many academics, not to mention the world’s growing supply of psychotherapists, had already staked their careers on the sort of probing of thoughts and emotions that behaviorism tends to downplay. The attacks began in the late 1950s. Noam Chomsky, then a rising star at MIT, and other thinkers in the soon-to-be-dominant field of cognitive science acknowledged that behavior modification worked on animals but claimed it did not work on people—that we’re too smart for that sort of thing.
You have to laugh at that.
While Skinner’s argument that behavior modification techniques could be used to improve society raises quite a few quis custodiet issues, to say the least, that controversy has no relevance to the question of whether these techniques actually work. Soft machines that we are, they seem to…
And you have to laugh at that too.
It’s not referred to in the article, but the good doctor also claimed that it was possible to create ‘superstitions’ in pigeons (his test species of choice). To be sure, those findings have since been challenged, but consider their implications and…
Yes, you are laughing again.
Good work, Dr. Skinner.
From a WSJ review of a new book chronicling the Minnesota Study of Twins Reared Apart:
The Minnesota study’s IQ results hit a nerve years before their publication in 1990, overshadowing other controversies that might have been. Many of its findings are bipartisan shockers. Take religion, which almost everyone attributes to “socialization.” Separated-twin data show that religiosity has a strong genetic component, especially in the long run: “Parents had less influence than they thought over their children’s religious activities and interests as they approached adolescence and adulthood.” The key caveat: While genes have a big effect on how religious you are, upbringing has a big effect on the brand of religion you accept. Identical separated sisters Debbie and Sharon “both liked the rituals and formality of religious services and holidays,” even though Debbie was a Jew and Sharon was a Christian.
Just another example of the ‘God Gene’ at work, I suppose, and, as such, just another reminder that there is little or no prospect of ever weaning mankind off religion (Professor Dawkins, please note).