Showing posts with label Steven Pinker.

Thursday, March 03, 2022

Ranking Dangers

The March-April issue of SKEPTICAL INQUIRER contains an article titled "The World's Most Deadly Animal," by Harriet Hall. The various candidates for this honor are ranked by the number of human beings they kill annually. A character in Robert Heinlein's TUNNEL IN THE SKY declares humans to be the deadliest animals. The villain in Richard Connell's classic short story "The Most Dangerous Game," who hunts his captives like wild beasts, would agree. Hall's list of the top ten most dangerous animals comes from this source:

Science Focus

Sharks and black widow spiders, the creatures many people picture when they hear "deadly animals," don't even make it into the top ten. Lions do, barely, at the bottom; hippos, elephants, and crocodiles all outrank them. The human animal (counting only homicides) rates second rather than first. The deadliest animal on Earth, as measured by people killed every year? The mosquito.

As Hall points out at the beginning of her article, our tendency to overlook mosquitoes and another high-ranking insect, the assassin bug, highlights the "availability heuristic": facts and incidents that stick in our minds because of their sensational content tend to be perceived as more common than they actually are. There's a widespread attitude of "Why is it getting so hot, and how did we get into this handbasket?" when in fact teenage pregnancy, adolescent illicit drug use, violent crime, drunk-driving fatalities, and the worldwide number of deaths in battle have all decreased over the past few decades. We sometimes forget that frightening incidents and trends make headlines BECAUSE they're unusual, not commonplace. The occasional shark attack draws much more attention than the bites of thousands of malaria-carrying mosquitoes.

Steven Pinker, in the section on phobias in his book HOW THE MIND WORKS, postulates that adult phobias are instinctive childhood fears that haven't been outgrown. These universal fears, which fall into certain well-defined categories, reflect the threats most hazardous to our "evolutionary ancestors"—snakes, spiders, heights, storms, large carnivores, darkness, blood, strangers, confinement, social scrutiny, and leaving home alone. In modern cities, the brain's fear circuitry often fails to function for optimal protection. "We ought to be afraid of guns, driving fast, driving without a seatbelt. . . not of snakes and spiders." Children have no innate aversion to playing with matches or chasing balls into traffic; instead, a survey of Chicago school kids revealed their greatest fears to be of lions, tigers, and snakes. Many writers of horror fiction draw upon intuitive awareness of our hard-wired terrors. The cosmic entity in Stephen King's IT targets children because, while adults obsess over mundane hazards such as heart attacks and financial ruin, children's fears run deeper and purer.

Margaret L. Carter

Carter's Crypt

Thursday, June 17, 2021

Canine Conversations

A speech-language pathologist, Christina Hunger, claims to have taught her dog, Stella, to "talk":

Can That Dog on Instagram Really Talk?

The communication method depends on a soundboard like those used by some apes, with the animal pushing buttons that stand for words; each button plays a recording of a word such as "outside" or "play." According to the author of the above article, Jane C. Hu, a cognitive scientist, there's little doubt that Stella "understands" the meanings of some buttons in the sense that she knows pushing a particular button produces a particular result. Was she deliberately combining words to form a message when she pushed "outside" followed by "Stella"? Maybe. I'm highly skeptical, however, that she combined "good" and "bye" to make "goodbye" or that "'Later Jake' (Jake is Hunger’s partner), in response to him doing a chore, meant 'do that later'," and Hu seems to agree. Granted, it would be big news to discover "a dog could plan future events and express those desires," but does Stella's performance prove her capable of abstract thought to that extent?

I'm not a cognitive scientist, a linguist, or a zoologist. Reacting as an interested layperson, though, I don't go as far in the skeptical direction as a critic of ape communication I read about somewhere, who dismissed an ape's situation-appropriate use of "please" as the animal's having merely been trained to push that particular key before making a request. How is that different from a toddler's understanding of "please"? A toddler doesn't start out knowing what the word "means"; it's simply a noise the child has to make to get adults to listen when he or she wants something.

Another catch in interpreting Stella's dialogues with her mistress, as pointed out by Alexandra Horowitz, a psychology professor and expert on dog cognition, is that the dog's "vocabulary" is limited by the available buttons. Also, it's possible that Stella, instead of acting independently, may be responding to unconscious signals from her owner. Yet we know dogs do "understand" some words in the sense of associating specific sounds with things, people, and actions. A border collie (recognized as one of the most intelligent breeds) named Rico is famous for his 200-word vocabulary. When ordered to fetch any one of the objects whose names he knew, he could retrieve it from a different room, a procedure that eliminated the risk of his picking up cues from a human observer:

Rico

Psychologist Steven Pinker, author of THE LANGUAGE INSTINCT, takes a dim view of attempts to teach animals some form of human language, as if learning to "talk" would prove the animals' intelligence. He maintains that rather than trying to induce apes and dolphins to communicate like us, we should focus on understanding their own innate modes of communication. He may have a point. If IQ were measured by how many different odors one could distinguish, how would our "intelligence" compare to that of dogs?

Margaret L. Carter

Carter's Crypt

Thursday, May 30, 2019

The Omnivore's Dilemma

No, not the book of that name, which was the only reference that popped up on a full page of Google results. I first encountered this term in the section on "Disgust" in HOW THE MIND WORKS, by Steven Pinker, who attributes it to psychologist Paul Rozin. The omnivore's dilemma encapsulates the double-edged nature of our ability to digest a vast variety of foods. Thanks to that flexibility, human beings can survive in almost any environment on Earth. The negative side of this advantage is that we can't be sure whether a new potential food source is safe to eat until we've tried it.

As Pinker puts it, "Disgust is intuitive microbiology." After a certain age (when they outgrow the "put everything in their mouths" phase), children avoid things we would consider intrinsically disgusting, such as decayed organic matter or body fluids and excretions. Most people will even refuse to put in their mouths harmless items that resemble disgusting objects (e.g., fake vomit). Contact or resemblance equals contagion, an emotional aversion that overrides mere rationality. But what accounts for "disgust" reactions to items that we dismiss as inedible but many other cultures classify as food?

Pinker points out that we accept a very narrow range of animal products as food, even though those we shun are perfectly edible. Most Americans confine their animal diets to chickens, pigs, cattle, sheep, and selected types of fish and other seafood. Of the mammals we raise for food, many of us eat only certain parts and avoid the rest (organ meats, feet, tails, and so on). Pinker discusses how we learn these dietary prejudices as a byproduct of the omnivore's dilemma. In infancy and early toddlerhood, the "put everything in their mouths" stage, children have to eat what their parents offer them. When the child gets mobile enough to forage for himself or herself (in a hunter-gatherer society), the "picky" stage sets in. (It's probably not a coincidence that the food-finicky phase coincides with the drop in appetite that comes when the rapid growth spurt of early life slows down.) Now the child regards new foods with suspicion. The foods the parents offered during those early months are accepted as edible. All other potential foods are, by definition in the child's worldview, repulsive. Whatever isn't explicitly permitted is forbidden and therefore disgusting. As a practical corollary of this process, it seems parents should try to introduce their toddlers to as many different foods as possible during the sensitive learning period.

I was reminded of this section in HOW THE MIND WORKS (a fascinating, highly readable book—check it out) by Facebook videos of our seven-month-old grandson trying his first solid foods. He likes avocado. Until recently, he liked applesauce. Last week, he rejected it; maybe that's just a temporary fluke. Babies, like human beings in general, crave sweet tastes, because in a state of nature our ancestors depended on sweetness to tell them when fruit was edible. This natural attraction to sugar inspires infant-care experts to advise starting babies on less sweet foods (e.g., vegetables) first, rather than letting them get fixated on sugary things such as fruit right off the bat.

Pinker, by the way, says that not only are most parts and products of animals considered disgusting (see above), but also most or all disgusting things come from animals. Vegetables may be rejected because they taste bitter, but they're not viewed as disgusting. I reacted to that statement with "Speak for yourself, Dr. Pinker." As a child, I was disgusted—i.e., nauseated—by several kinds of vegetables because they were served in a cooked-to-mush condition. The combination of the change in taste from overcooking and the yucky texture made my stomach revolt. I believe that the cliché of children hating vegetables arises from the crimes perpetrated on perfectly harmless plants by 1950s cooking styles and the prevalence of over-processed canned veggies in the American diet of that period.

One especially interesting issue: What about bugs? Why don't many cultures—ours included—eat insects and similar arthropods (e.g., spiders)? We often pay high prices for the privilege of consuming certain other arthropods, such as lobsters. And we happily eat one kind of insect secretion (honey). Yet we abhor the termites and grubs that form an important part of our ape relatives' diets. The easy answer in American culture is that bugs aren't included among the "permitted" items we're fed in childhood. But why aren't we?

An article from SCIENTIFIC AMERICAN attempting to answer that question:

What's Stopping Us from Eating Insects?

And one from the anthropology website "Sapiens":

Why Don't More Humans Eat Bugs?

Neither of these articles exactly repeats Pinker's hypothesis (which makes a lot of sense to me), although the second essay touches upon it: gathering enough insects or other small arthropods to provide sufficient protein isn't a very efficient process. It takes a lot of time and energy. Therefore, people incorporated bugs into their diets only if those creatures were abundant (in the tropics, for instance) and nothing better was readily available. Where a society could obtain plenty of protein from more efficient sources, such as raising herd animals, its people didn't bother to eat bugs. And since whatever isn't permitted during the early learning period is by definition forbidden, bugs are disgusting to most of us. This cultural phenomenon drives the humorous appeal of the popular children's novel HOW TO EAT FRIED WORMS, since at a certain age many kids develop a sort of queasy fascination with yucky things.

One lesson for future interplanetary explorers might be that colonists should conscientiously expose their children from infancy to all sorts of safe native foods in extraterrestrial environments, even if the parents find those items repugnant.

Margaret L. Carter

Carter's Crypt

Thursday, April 18, 2019

Hopeful Futures

Kameron Hurley's column for the April issue of LOCUS explains how her writing has recently shifted from a pessimistic to an optimistic view of human possibilities. She decided "being grim and nihilistic is boring" rather than "exciting or edgy." Instead, in a world that seems increasingly dark, she finds her writing "to be a perfect outlet for exploring how people can still make good decisions in bad situations."

The Future Is Intrinsically Hopeful

This message resonates with me. As argued by Steven Pinker in THE BETTER ANGELS OF OUR NATURE and ENLIGHTENMENT NOW, we are living in the best of times, not the worst of times (although, admittedly, with considerable room for improvement).

A few striking quotes from Hurley's essay on why she believes in the future:

"Humanity didn’t survive this long because of its worst impulses. We survived this long because, despite all of that, we learned how to work together."

"What a time to be a creator, when believing humanity has a future that is not just a series of dystopic post-apocalypse nightmares is the most radical position one can have."

"What if what we are presenting to our audiences, as artists, is 'This is how the world could be really different. Have you thought about how to get there?'"

"Increasingly, I find that writing any type of work at all is hopeful....It is profoundly optimistic to assume there is a generation after ours that will create a society one hundred years from now that is recognizable to us at all."

The last two quotes seem to me to encapsulate a major theme and purpose of science fiction. Dystopian futures serve the important function of warning us and potentially motivating us to change our course: "If this goes on...." The other classic SF question, "What if...?" is equally or more important, however. One reason the original STAR TREK became so beloved was surely its optimism about human destiny. At the height of the civil rights movement, the series portrays an Enterprise crew of men and women (even if female characters didn't fully come into their own until later iterations of the ST universe) from many races and cultures working together to discover new worlds. In the middle of the Cold War, STAR TREK envisions Russians, Americans, and Asians exploring space as a team. And many of those "predictions" have come true! THE ORVILLE, as a drama-comedy homage to ST, further develops that hopefulness about mutual tolerance and cooperation and the joy of discovery in the context of 21st-century sociopolitical concerns.

Writing as if we "believe in the future" can infuse readers with hope and perhaps inspire them to create that kind of future.

Margaret L. Carter

Carter's Crypt

Thursday, August 09, 2018

Defining Deviancy

In sociological discourse, we encounter the term "defining deviancy down." This phrase refers to behavior that used to be condemned but now is tolerated. It's an academic way of grumbling, "Society is going to the dogs." Profanity and obscenity in what used to be called "mixed company," for example. Open sale of sexually explicit literature. "Four-letter words," extreme gore, and onscreen sex in movies. Going to houses of worship or expensive restaurants without wearing a coat and tie or a dress (as appropriate). (In my childhood, it was frowned upon for a girl or woman to shop at an upscale department store without dressing up.) For boys, wearing a T-shirt to school (the crisis in one episode of LEAVE IT TO BEAVER centered on this transgression); for girls, going to school in pants instead of skirts. Individuals of opposite sexes living together outside of marriage. Unmarried women becoming pregnant and having babies openly instead of hiding their condition in shame. Ubiquitous gun violence in the inner cities—in WEST SIDE STORY, the introduction of a gun into the feud between the rival gangs was framed as a shocking escalation of the conflict.

In many respects, however, we've defined "deviancy" upward since what some people nostalgically recall as the good old days of the 1950s. Smoking, for example. In my childhood, most adults smoked cigarettes, and they did it almost anytime, anywhere. In grocery stores! At the doctor's office! Air pollution from the big-engined, gas-guzzling cars that used to be status symbols is now frowned upon. So are the racial slurs often heard in casual conversation back then. Dogs nowadays don't run loose in our communities like Lassie and Lady (my main sources of information on dogs until my parents acquired one, who didn't seem nearly as intelligent as Lady, the Tramp, and their friends). Leash laws didn't become widespread until my teens. Alleged humor based on physical abuse of women by men used to be common in the media. Ralph on THE HONEYMOONERS regularly threatened to hit his wife ("to the moon, Alice!"), though he never did so on screen, and in THE QUIET MAN, John Wayne spanked Maureen O'Hara in the middle of the road. Public intoxication, including drunk driving, was also casually treated as funny, as in many of P. G. Wodehouse's Jeeves stories and the novels of Thorne Smith (author of TOPPER). Most adults seemed to regard bullying as a commonplace childhood rite of passage that kids had to learn to cope with, as long as it didn't cause significant injury. As for safety features such as seat belts in cars, there was no law requiring passengers to wear them, because seat belts didn't exist.

Where some societal changes are concerned, factions differ on whether they constitute improvement or deterioration. Some contemporary parents wouldn't think of letting their children visit friends, roam around the neighborhood, or ride a bus on their own at ages that were considered perfectly normal until recent decades. Conversely, if adults from the 1950s could witness today's trends, most of them would probably consider "helicopter parenting" harmful as well as ridiculous. Are the emergence of same-sex marriage, dual-career households, and legal access to abortion good or bad changes? The answer to that question depends on one's political philosophy. Does a decline in church and synagogue membership mean we've become a society of secularists and atheists, or does it simply mean that, because we no longer have so much social pressure to look "religious," for the most part only sincere believers join religious organizations? (C. S. Lewis noted that an alleged "decline" in chapel attendance among university students in fact reflected a sudden drop as soon as attendance became optional instead of compulsory.)

Whether you think current trends in behavior, customs, and morals are mainly positive or negative probably influences whether you believe Steven Pinker, for instance, is right or wrong when he claims in ENLIGHTENMENT NOW that we're living in the best of times rather than the worst.

Margaret L. Carter

Carter's Crypt

Thursday, July 05, 2018

Illusions of Safety

Last week, five people on the staff of our local newspaper were killed by a gunman who attacked their office because he had a long-standing grudge against the paper. (It's worth noting that the paper did not miss putting out a single issue.) Naturally, the rector of our church preached on the incident. He drew upon Psalm 30, which includes the beautiful verse, "Weeping may spend the night, but joy comes in the morning." To reach that epiphany, however, the psalmist has to recall a time when he felt confident in his security but then experienced the apparent loss of that safety and protection. Our rector talked about how we might have existed in a "bubble," thinking we were safe from such unpredictable mass violence, that it would never strike where we live. Now the bubble has been burst.

That reflection reminded me of what the media repeatedly told us after 9-11: "Everything has changed." Then and now, that remark brings to mind an essay by one of my favorite authors, C. S. Lewis, "On Living in an Atomic Age" (collected in the posthumous volume PRESENT CONCERNS). Lewis reminds us that such catastrophic events change nothing objectively. What has changed is our perception. That idea of safety was always an illusion. To the question, "How are we to live in an atomic age?" Lewis replies:

"'Why, as you would have lived in the sixteenth century when the plague visited London almost every year, or as you would have lived in a Viking age when raiders from Scandinavia might land and cut your throat any night; or indeed, as you are already living in an age of cancer, an age of syphilis, an age of paralysis, an age of air raids, an age of railway accidents, an age of motor accidents.' In other words, do not let us begin by exaggerating the novelty of our situation. Believe me, dear sir or madam, you and all whom you love were already sentenced to death before the atomic bomb was invented: and quite a high percentage of us were going to die in unpleasant ways."

As he says somewhere else (in THE SCREWTAPE LETTERS, maybe), the human death rate is 100 percent and cannot be increased or decreased. The bottom line is NOT that, knowing the inevitability of death, we should make ourselves miserable by brooding over our ultimate fate. Just the opposite—we should live life abundantly. It's one thing to take sensible precautions, quite another to live in fear. Lewis again:

"If we are all going to be destroyed by an atomic bomb, let that bomb when it comes find us doing sensible and human things—praying, working, teaching, reading, listening to music, bathing the children, playing tennis, chatting to our friends over a pint and a game of darts—not huddled together like frightened sheep and thinking about bombs. They may break our bodies (a microbe can do that) but they need not dominate our minds."

Steven Pinker's two most recent books, THE BETTER ANGELS OF OUR NATURE and ENLIGHTENMENT NOW, offer an antidote to the mistaken belief that we live in a uniquely, horribly violent age. Although Pinker and Lewis hold radically different world-views (Pinker is a secular humanist), both counsel against despair. Pinker demonstrates in exhaustive, rigorous detail that in most ways this is the best era in history in which to live—and not only in first-world countries. The instantaneous, global promulgation of news makes shocking, violent events loom larger in our minds than they would have for past generations. (But what's the alternative—to leave the public uninformed?)

We can deplore evils and work for solutions without losing our perspective.

Margaret L. Carter

Carter's Crypt

Thursday, June 07, 2018

Common Assumptions

In his essay "On the Reading of Old Books" (written as the introduction to a 1943 translation of St. Athanasius's book on the Incarnation), C. S. Lewis explains why he thinks it vital for modern people to read old books:

"All contemporary writers share to some extent the contemporary outlook—even those, like myself, who seem most opposed to it. Nothing strikes me more when I read the controversies of past ages than the fact that both sides were usually assuming without question a good deal which we should now absolutely deny. They thought that they were as completely opposed as two sides could be, but in fact they were all the time secretly united—united with each other and against earlier and later ages—by a great mass of common assumptions. We may be sure that the characteristic blindness of the twentieth century—the blindness about which posterity will ask, 'But how could they have thought that?'—lies where we have never suspected it, and concerns something about which there is untroubled agreement between Hitler and President Roosevelt or between Mr. H. G. Wells and Karl Barth."

Therefore, says Lewis, we need the literature of past ages to awaken us to the truth that the "common assumptions" of one era aren't necessarily those of another, and ours might actually be wrong. Speaking of the "contemporary outlook" of Lewis's own period, through much of the twentieth century experts in psychology and sociology held the shared assumption that no inborn "human nature" existed, that the human mind and personality were almost infinitely malleable—the theory of the "blank slate." We meet versions of that belief in works as different as Lewis's THE ABOLITION OF MAN (where he views the prospect with alarm), Huxley's BRAVE NEW WORLD, Orwell's 1984, Skinner's WALDEN TWO, and Heinlein's first novel (published posthumously), FOR US, THE LIVING. Later research in psychology, neurology, etc. has decisively overturned that theoretical construct, as explored in great detail in Steven Pinker's THE BLANK SLATE.

Whatever our positions on the political spectrum, in the contemporary world we embrace certain common assumptions that may not have been shared by people of earlier periods. We now believe everybody should receive a free basic education, a fairly new concept even in our own country. In contrast to our culture's acceptance of casual racism a mere sixty years ago, racial prejudice is now unequivocally condemned. Whatever their exact views, all citizens except members of lunatic fringe groups deny being racists. Outward respect for individual rights has become practically worldwide. Dictatorships call themselves republics and claim to grant their citizens fundamental human rights. In our country, all sides claim they want to protect the environment and conserve energy; disputes revolve around exactly how to go about reaching those goals.

Everybody in the civilized world supposedly respects and values human life, even if in some regions and subcultures there's little evidence of this value being practiced. One universally accepted principle in the modern, industrialized world is that children and especially babies are so precious that we should go to any lengths to protect them and extend their lives. For instance, expending huge amounts of energy and money to keep a premature baby alive is considered not only meritorious but often obligatory. The only differences on this topic among various factions of our society involve how much effort is reasonable and where the cutoff line should be drawn (e.g., how developed a preemie should be to receive this degree of medical attention, at what stage and for what reasons abortion should be allowed, etc.). Yet in many pre-industrial societies, it was obligatory to allow a very premature newborn or one with severe birth defects to die; expending resources on an infant who would almost certainly die anyway would have been condemned as detrimental to the welfare of the family and tribe. The development of advanced medical technology has probably played a vital part in replacing attitudes like that with the opposite belief contemporary society holds.

It's likely that alien cultures we encounter will have different universal assumptions from our own. In Heinlein's STRANGER IN A STRANGE LAND, Mike (the human "Martian") reports that on Mars competition between individuals occurs in childhood instead of adulthood. Infants, rather than being cherished, are cast out to survive as best they can, then re-admitted to the community after they've proven their fitness. To creatures who've evolved as units in a hive mind, the value we place on individual rights would make no sense. A member of a solitary species wouldn't understand the concept of loyalty to a group. Where might the "characteristic blindness" of our time and place in history be lurking?

Margaret L. Carter

Carter's Crypt

Thursday, February 22, 2018

Is the World Improving?

Psychologist Steven Pinker has just published a new book, ENLIGHTENMENT NOW, a follow-up to his 2011 book THE BETTER ANGELS OF OUR NATURE: WHY VIOLENCE HAS DECLINED. In that earlier work, he demonstrated with page after page of hard facts that we're living in the least violent period in recorded history. ENLIGHTENMENT NOW, subtitled "The Case for Reason, Science, Humanism, and Progress," expands that project to support the claim that human well-being has increased in virtually every measurable way since the dawn of the Enlightenment in the seventeenth to eighteenth centuries. (I have to confess that I bristled a bit at the title itself, since "Enlightenment," like "Renaissance," was a self-designated label meant to dismiss previous eras as centuries of benighted superstition, barbarism, and stagnation.)

Contrary to the widespread belief that the world is going to Hell in a handbasket, according to Pinker this is the best time in history to be born, even in third-world nations. The headlines that make many people wonder, "Why is it getting so hot, and what are we doing in this handbasket?" represent, in Pinker's view, a distortion of the facts. (Why a handbasket, by the way? If all of us are in it collectively, wouldn't a bushel basket make more sense? Or a laundry basket? Of course, then we'd lose the alliteration.)

Health, education, the spread of representative government, and overall quality of life (evaluated by leisure time, household conveniences, access to information and entertainment, etc.), among many other metrics, have measurably improved. Fewer children die in childhood, fewer women die in giving birth, many diseases have been conquered or even eradicated, in the U.S. drug addiction and unwed teen pregnancy have decreased, fewer people worldwide live in extreme poverty, and in the developed world even the poorest possess wealth (in the form of clean running water, electricity, and other modern conveniences) that nobody could have at any price a couple of centuries ago. As for violence, Pinker refers in both books to what he calls "The Long Peace," the period since 1945 in which no major world powers have clashed head-on in war. What about the proxy wars such as the Korean and Vietnam conflicts? Faded away with the Cold War itself. Anarchy and bloody conflicts in third-world countries? While horrible present-day examples can easily be cited, their number has also decreased. Pinker also disputes, with supporting figures, the hype about "epidemics" of depression and suicide.

Despite Pinker's convincing array of statistics, readers may still find themselves protesting, "But—but—school shootings!" Why do we often have the impression that the condition of the world is getting worse when it's actually getting better?

For one thing, as we all know, "If it bleeds, it leads." News media report extraordinary, exciting events. Mass murder shocks us BECAUSE we're used to expecting our daily lives to remain peaceful and safe. Yet even the editorial page of our local paper recently noted that, although high-profile episodes of "rampage killings" (as Pinker labels them) seem to have occurred with alarming frequency lately, incidence of gun violence in general in the U.S. is down. We tend to be misled by the "availability heuristic" (things we've heard of or seen more frequently or recently, or that we find disturbing, loom large in our consciousness, appearing more common than they really are) and the "negativity bias" (we recall bad things more readily and vividly than good ones). Then there's the well-known confirmation bias, the inclination to notice facts in support of a predetermined position and ignore those that refute it. As for the actual numbers for mass murder, the stats for 2015 (the latest year for which he had data while writing the book) classify most rampage killings under the category of terrorism. The total number of deaths from "terrorism" in the U.S. in that year was 44, as compared to over 15,000 fatalities from other kinds of homicides and vastly more deaths from accidents (motor vehicle and other).
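
For readers who like to see the arithmetic, here's a minimal sketch in Python of just how lopsided those 2015 figures are. It uses only the two numbers quoted above, treating the "over 15,000" homicide count as a floor rather than an exact statistic:

```python
# Back-of-the-envelope comparison of the 2015 U.S. figures cited above:
# 44 deaths classified as "terrorism" vs. over 15,000 other homicides.
terrorism_deaths = 44
other_homicides = 15_000  # "over 15,000," so this is a lower bound

ratio = other_homicides / terrorism_deaths
share = terrorism_deaths / (terrorism_deaths + other_homicides)

print(f"Other homicides outnumber terrorism deaths by at least {ratio:.0f} to 1")
print(f"Terrorism accounts for at most {share:.2%} of these homicide deaths")
```

Even with the homicide total treated as a bare minimum, the vivid category amounts to roughly a third of one percent of the mundane one, which is precisely the mismatch the availability heuristic invites us to overlook.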

What does Pinker's thesis that the arc of history bends toward justice (and peace, health, and prosperity) imply for the prospect of encountering alien civilizations? Isaac Asimov believed we're in no danger of invasion from hostile extraterrestrials because any culture advanced enough to develop interstellar travel would have developed beyond violence and war. Pinker would probably agree. I'm still dubious of this position, considering that one of the most technologically advanced nations of the twentieth century perpetrated the Holocaust. Moral advancement may tend to grow in step with scientific development, but I don't see that trend as inevitable. The reason I think an alien invasion is unlikely is that any species capable of interstellar travel would have the intelligence and technological skills to get anything it needs in much easier ways than crossing vast expanses of space to take over an already inhabited planet. I trust that any hypothetical aliens we eventually meet will be intelligent enough to realize, as most of the nations on Earth have, that trade and exchange of ideas trump genocidal conquest as methods of getting what they want from other sapient species. Much of science fiction has traditionally offered hope, as in many of Robert Heinlein's novels. Today, amid the fashion for post-apocalyptic dystopias, we can still find optimistic fiction. S. M. Stirling's Emberverse, which begins with the downfall of civilization in DIES THE FIRE, focuses throughout the series on cooperation in rebuilding society rather than on the initial collapse.

While Pinker doesn't deny that our world is far from a utopian paradise, that there's a lot of work yet to be done, and that any mass murder rampage is one too many, this is fundamentally an optimistic book. It's a refreshing reminder that we're not necessarily doomed.

Margaret L. Carter

Carter's Crypt

Thursday, September 14, 2017

Mundane Psionics

Many characters in fantasy and science fiction possess psychic superpowers. They can read thoughts, view events at a distance or (maybe by touching an object) in the past, or see the spirits of the dead. In a sense, we don't have to fantasize about having such abilities, because we already do, sort of. Through writing, we can transmit our thoughts directly into the minds of other people we'll never meet face-to-face. While reading, we receive the thoughts of the writers, even if they died centuries ago. Film allows us to travel in time, in that it shows us scenes from the past. We can even see dead people in the prime of life. Through recording technology, we hear their voices.

Psychologist Steven Pinker, in "The Seven Words You Can't Say on Television" (a chapter in his book THE STUFF OF THOUGHT), speculates on why taboo words—profanity and obscenity—have been forbidden or restricted in most human cultures. Often against our will, "dirty" words force images into our minds that we may not want to entertain. Unlike eyes, ears don't have "earlids" to shut out objectionable sounds spoken by other people. Also, as he points out, "understanding the meaning of a word is automatic"; "once a word is seen or heard we are incapable of treating it as a squiggle or noise but reflexively look it up in memory and respond to its meaning." Language equals thought control. The official Newspeak dialect in Orwell's 1984 strives to make heretical thoughts literally "unthinkable"—at least as far as "thought is dependent on language."

Many fantasy novels postulate that magic depends on a special, often secret language. In one of my favorite series, Diane Duane's Young Wizards stories, learning wizardry consists mainly of mastering the Speech, the universal language of reality understood by all creatures, including those we ordinarily think of as inanimate. A wizard affects the world by using the Speech to persuade an object, creature, or system to change. However, some speech acts in the mundane world also alter reality. Enactive speech not only describes an event but makes it happen, e.g., taking an oath of office or uttering the words "I now pronounce you husband and wife."

A prayer in the Episcopal Book of Common Prayer titled "For Those Who Influence Public Opinion" makes this petition: "Direct, in our time, we pray, those who speak where many listen and write what many read, that they may do their part in making the heart of this people wise, its mind sound, and its will righteous." A heavy responsibility for authors, especially in this divisive, volatile era!

Margaret L. Carter

Carter's Crypt

Thursday, June 29, 2017

The Sense of Style

I've been rereading THE SENSE OF STYLE, by Steven Pinker, published in 2014. The lucid and witty cognitive scientist Pinker, one of my favorite nonfiction authors, explores the question of what constitutes good writing by connecting grammar and style with the way the brain handles language. He begins by reminding us, “Complaints about the decline of language go at least as far back as the invention of the printing press.” Contemporary writing isn’t uniquely dreadful, regardless of complaints about what the Internet and texting have done to the thought processes of today’s youth. He analyzes several passages of nonfiction to unpack why they’re effective (and, in one case, to uncover weaknesses in the style and strategy of the writer). Although he concentrates on nonfiction, his detailed explanations of why and how these prose samples work would be illuminating for fiction authors, too.

With the help of sentence “tree” diagrams, he demonstrates why the brain finds some sentences easier to comprehend and others difficult. I must confess I had trouble following the trees (the old-fashioned sentence-diagramming method I grew up with makes more intuitive sense to me, probably just because I'm used to it), but visually oriented readers may find them helpful. Pinker shows us what kinds of structures create coherence in sentences and paragraphs. He explains the problems that make for incoherent writing, especially the “curse of knowledge,” his term for what happens when a writer assumes the audience shares his or her background and degree of expertise in the subject matter. Speaking of “his or her,” Pinker tackles the issue of gender-neutral pronouns and defends the use of “they” for that purpose. He illuminates the proper uses of punctuation, especially commas. In the final chapter, “Telling Right from Wrong,” he works through a long list of “errors” condemned by purists and offers his rationale for why each “rule” is or isn’t justified. Though I don’t agree with all his conclusions (e.g., “lay” and “lie” are not and will never be the same verb, and the former should not be substituted for the latter except in passages of dialogue; "between you and I" is an abomination against nature; he tolerates dangling participles to a degree that I can't accept), I found the entire book entertaining and informative. His distinctions between grammatical vs. ungrammatical and formal vs. informal strike me as refreshingly sensible, even if I don't agree with him on where to draw the line in every case.

He makes short work of the grammatical superstitions that forbid splitting infinitives, starting sentences with coordinating conjunctions (e.g., "and" or "but"), and ending sentences with prepositions. I enjoyed and learned from his analyses of many other groundless prohibitions whose invalidity is less obvious. I wish he had also addressed a baffling fetish one of my former editors held—she insisted inanimate nouns couldn't have possessive forms. Say what? "A midsummer night's dream"; "the Church's one foundation"; "the dawn's early light"; "the twilight's last gleaming"; "New Year's Eve"? If there was ever a pointless "rule" that could generate awkward, wordy sentences through attempts to "correct" the "errors," that's one.

He brings up one problem, related to the "curse of knowledge," that frequently trips me up: Writers often string together phrases and clauses in the order they spontaneously come to mind instead of the order that facilitates smooth reader comprehension. In self-editing, one of the first things I usually have to fix is the bad effect of this stream-of-consciousness writing on my sentences. While I was dimly aware of this weakness, his explanation highlighted and clarified it for me.

I won't claim this will be the last style manual you'll ever need; he doesn't aim to cover every possible stylistic and grammatical pitfall. However, I think any writer would benefit from this book and find it a pleasure to read. Besides its useful content, THE SENSE OF STYLE functions as an example of elegant writing in its own right.

Margaret L. Carter

Carter's Crypt