Showing posts with label Robert Heinlein. Show all posts

Thursday, December 05, 2024

Welcome to the Future

Recommended for fans and writers of near-future science fiction: YOU CALL THIS THE FUTURE? THE GREATEST INVENTIONS SCI-FI IMAGINED AND SCIENCE PROMISED (2007), by Nick Sagan, Mark Frary, and Andy Walker, systematically explores fifty examples of scientific, technological, and social developments predicted in fiction from the perspective of which have come true or might plausibly do so. The possibilities range from those that already exist in some form (e.g., cloning, telemedicine, AI marketing, e-books, bionic organs, space tourism) all the way to concepts that may remain flights of fancy, such as warp drive and time travel.

Coverage of each topic is divided into three parts, headed Scientific History, Sighting in Sci-Fi, and Reality. Some also include a section titled "Tech Spec," such as facts about "truth serum" and an explanation of the procedure involved in cloning the famous sheep Dolly. Inventions and developments fall under the categories of Travel and Transportation (of course including flying cars and personal jetpacks); Computers, Cyborgs, and Robots; Communications; Weapons and Security; and the very broad field of Life, Health, and Sex. Had the authors not restricted each category to ten items (apparently a deliberate choice), that last category could doubtless have included much more. The text is both highly readable and informative, as well as illustrated with numerous photos and drawings. Commendably, there's also an index. In addition to the book's entertainment value, it could serve as a quick reference source for SF authors.

Although published recently enough to reflect most of the cutting-edge technology we currently have, it leaves plenty of room for speculation about science-fiction devices and techniques that don't exist yet. J. D. Robb's "In Death" mystery series, set around 2060, has featured a combination handheld computer and portable phone called a "link" since the publication of the first novel in 1995. That vision has come true way ahead of schedule. On the other hand, I'm still waiting for the household cleaning robot Robert Heinlein promised we'd have in 1970, in his 1957 novel THE DOOR INTO SUMMER.

On the third hand, consider all the wonders we enjoy that weren't even imagined just decades ago, as celebrated in Brad Paisley's upbeat song "Welcome to the Future":

Welcome to the Future

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, November 07, 2024

One Person, One Vote?

In Terry Pratchett's Discworld series, Lord Vetinari, the Patrician who rules the largest city, embraces the principle of "one man, one vote." He's the man; he has the vote.

In "Franchise," by Isaac Asimov, elections are indirectly decided by AI as follows: An algorithm operated by the government's super-computer analyzes the political and cultural traits and beliefs of every citizen in the nation. One person is selected as the archetypal representative of the entire population and becomes the Voter for that year. Not that he actually makes a choice. He's exhaustively examined and interviewed, after which the computer registers his "vote."

At the end of Robert Heinlein's essay "Who Are the Heirs of Patrick Henry?" (which seems to advance the -- to me -- peculiar notion that signing a nuclear nonproliferation treaty equals destroying our freedoms, but that's beside the point), reprinted in his collection EXPANDED UNIVERSE (1980), he defends STARSHIP TROOPERS from the mistaken charges often brought against it (e.g., advocating a "militaristic" society, as portrayed in the abomination of a movie by the same title). He then goes on to discuss the general topic of voting rights. He expands on the novel's premise that a democratic society should require citizens to earn the franchise, rather than having it automatically bestowed on everyone over the age of eighteen. In STARSHIP TROOPERS, only veterans of public service can vote. That service can be either civil or military, and if military it needn't be in a combat role. Furthermore, citizens exercise the franchise only after their service ends, so the government isn't in any sense run by the military.

Heinlein speculates on other ways the right to vote and hold office might be "earned." He says the Founding Fathers "never intended to extend the franchise to everyone." They expected voters to be "stable" members of the community, as demonstrated by property ownership, the employment of others, journeyman status in a trade, and the like. Well, yeah, but they didn't extend the franchise to women, Blacks, or Native Americans either. If Heinlein seriously advocated material "stability" as a prerequisite for voting, he would've favored disenfranchising the poor and most of the working class, a practically guaranteed method of keeping them poor.

His essay plays around with fanciful alternative ideas for voter qualifications. (1) The sale of voting rights, with the proceeds being the main source of government revenue. Heinlein maintains that the potential for corruption by the wealthy would be minimal, because most rich people wouldn't bother to spend lavishly on multiple franchise slots. I'd be less optimistic on that point. (2) Requiring a minimum level of "intelligence and education" by making each voter solve an equation upon stepping into the booth. Variations on that method: Pay a small fee for the opportunity to try to vote; if you pass the test, you get your money back. Or, more drastically, those who fail the test are instantly euthanized to improve the gene pool. (3) Why hasn't the quality of government improved with the enfranchisement of women, as some idealists predicted it would? Maybe we didn't go far enough. In a spirit of fairness, let's bar men from voting, practicing law, and holding office for 150 years. "An all-female government could not possibly be worse than what we have been enduring."

Foreshadowing a comment that has sparked widespread outrage in the current election cycle, he suggests taking that last modest proposal even further. On the grounds that "a woman who is mother to a child knows she has a stake in the future," suppose we legally restrict voting, practicing law, and office-holding to mothers?

He also mentions Mark Twain's "The Curious Republic of Gondour," which can be read here:

The Curious Republic of Gondour and Other Whimsical Sketches

Under the law of Gondour, every citizen has one vote. However, people gain additional votes on the basis of education or wealth, with level of education more heavily weighted. People who control more votes win higher social status and more respect.

I trust Heinlein wasn't seriously proposing any of these innovations -- all of which, except the female-only franchise, would mean disadvantaged groups would become steadily more disadvantaged -- but they're entertaining to fantasize about. As for the election-booth intelligence test, I'm reminded of a short story about a dystopian future in which every child, upon reaching a certain age, undergoes a mandatory IQ exam. Those who score too HIGH don't come home. At least in our reality there's little danger that systematic dumbing down of the population will become official government policy. I hope.

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, September 26, 2024

Quantity Versus Quality?

Are quantity and quality incompatible strategies or goals? Not according to the observations in these and other similar essays:

Quantity Leads to Quality

The Origin of a Parable

The second article concerns a ceramics class whose teacher divided students into two groups. One would be graded on the quality of the work produced, while the other would be graded solely on amount of output. "Well, came grading time and a curious fact emerged: the works of highest quality were all produced by the group being graded for quantity. It seems that while the 'quantity' group was busily churning out piles of work -- and learning from their mistakes -- the 'quality' group had sat theorizing about perfection."

Similarly, we've heard of writers who endlessly polish the first paragraph, first page, or first three chapters to perfection, but when an agent or editor requests the full manuscript, the rest of the work doesn't measure up to the meticulously crafted opening.

The "quantity vs. quality" opposition seems to underlie the contemptuous -- and invalid -- dismissal of prolific writers as "hacks," as if high productivity automatically implies mediocrity. Stephen King used to publish two or three books per year and most years still produces at least two. Nora Roberts regularly releases two "J. D. Robb" mysteries per year and at least one "Nora Roberts" romance (probably more, but I don't keep close track of her output in that genre). Those figures may sound "prolific," but consider: A professional, full-time author, living solely on writing income, probably treats that vocation like a "job," writing several hours most days. Even a slow writer can produce at least 1000 words in two hours, and a faster one more like 1000 words per hour. Postulate only three hours per day, possibly a low estimate for a bestselling pro. 3000 words per day add up to 90,000 words in a month, a draft of a typical novel (if weekends aren't included, allow five to six weeks). A producer of "doorstops" like Stephen King, at that rate, might take two months for a first draft. With that time allotment, the writer could generate three novels in six months -- presumably not continuously, but with breaks in between -- with half the year free for revising, editing, polishing, marketing, and business minutiae. This kind of schedule, of course, assumes abundantly flowing story ideas, but from what I've read, the typical professional writer never has a shortage of those.
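As a quick sanity check on the arithmetic above, here's a minimal back-of-the-envelope sketch. The figures (1,000 words per hour, three hours per day, a 90,000-word draft) are only the assumptions stated in this post, not industry data:

```python
# Drafting-schedule arithmetic, using this post's assumed figures.
WORDS_PER_HOUR = 1000   # a faster writer's pace, per the post
HOURS_PER_DAY = 3       # a modest daily writing stint
NOVEL_WORDS = 90_000    # a typical novel-length draft

words_per_day = WORDS_PER_HOUR * HOURS_PER_DAY        # daily output
days_per_draft = NOVEL_WORDS / words_per_day          # writing every day
weeks_weekdays_only = NOVEL_WORDS / (words_per_day * 5)  # five-day weeks

print(words_per_day)         # 3000
print(days_per_draft)        # 30.0 -- about a month
print(weeks_weekdays_only)   # 6.0  -- five to six weeks without weekends
```

At that pace a "doorstop" of roughly 180,000 words would indeed take about two months per draft, matching the post's estimate.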

I'm reminded of Robert Heinlein's famous rules for success as an author: (1) Write. (2) Finish what you write. (3) Send it to an editor who might publish it. (4) Repeat number 3 until somebody buys it. I don't remember whether he addressed the question of when it's time to give up on a story -- maybe when you've exhausted all possible markets? However, I clearly recall the other "rule" he sometimes added: Never rewrite except to editorial order.

His point was that your time is better used in creating new stories than obsessively revising old material. He would probably agree with the "quantity over quality" proponents who maintain that each fresh project gives you a chance to learn something new about your craft. I would allow for one exception, though -- when you're deeply emotionally invested in one piece of work and have your heart set on getting it "right." My first vampire novel, DARK CHANGELING, conceived in embryonic form when I was thirteen or fourteen, went through multiple incarnations before I felt ready to submit it. After that, rejection feedback showed me its remaining flaws. Another extensive revision finally got it published. The protagonist, half-vampire psychiatrist Roger Darvell, continues to hold a special place in my heart. On the other hand, throughout that multi-decade process, I was writing and publishing other stuff, too.

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, August 15, 2024

Changing the Past

I've recently finished the latest book by S. M. Stirling (best known for alternate-history SF), a time-travel adventure, TO TURN THE TIDE, first volume in a new series. Partially inspired by L. Sprague de Camp's vintage novel LEST DARKNESS FALL (but Stirling's book is better), TO TURN THE TIDE transports a Harvard professor of history and four graduate students to central Europe in 165 A.D., the era of the Roman Empire under Marcus Aurelius. They know they've made a one-way trip, since the time machine is stationary instead of a vehicle like the one in H. G. Wells's classic, so they decide to use the literal ton of supplies sent with them as planned by the inventor of the machine (who accidentally fails to come along as he'd meant to). They set out to change history for the better, beginning with simple improvements, e.g., sterile medical procedures and wheelbarrows, and building on their early successes. In this first installment of the series, their innovations consist of "Type A" changes, things the inhabitants of that era and locale can implement with available tools and materials once they're given the concepts. "Type B" developments, those that require inventing the tools to make the tools to construct the new things, will come later.

In fiction, altering the past in an attempt to improve the future produces a wide range of effects. At one extreme, we have Ray Bradbury's story of a tiny, accidental change with disastrous results, when a visitor to the age of dinosaurs crushes a butterfly, thereby generating a future worse than the one he originally came from (yet unrealistically similar, but, then, it's a short story with no real pretense of scientific rigor). At the other extreme, some of Heinlein's fiction, notably THE DOOR INTO SUMMER, postulates that any alteration you make in the past isn't a real change at all. You're just doing whatever you did in the first place but weren't aware of in hindsight until after you went back and did it. (Is your head spinning yet?) Likewise, in one of the Harry Potter books, the actions of Harry and Hermione when using the time turner simply cause things to happen just as they had all along, previously unknown to the characters. In Diana Gabaldon's "Outlander" series, Claire (the traveler from the 20th century to the mid-18th) and Jamie strive to prevent the 1745 Jacobite rising and Bonnie Prince Charlie's invasion of Scotland. Although not completely powerless, they find their major goals unattainable. After the war unfolds on schedule, culminating in the catastrophic battle of Culloden despite their strenuous efforts to influence the course of events, they realize they can make only minor changes. It's as if the flow of time resists any significant alterations.

Time travel seems to work similarly in Connie Willis's series about mid-21st-century historians from Oxford. The transporting device can't send them anywhere close to a major historical event. If they deliberately or inadvertently aim for a critical nexus point, the traveler is simply bounced to a different nearby location. Thus the timeline corrects itself, smoothing out any ripples the characters create. Or so they believe -- this postulate is tested in the two-volume World War II epic BLACKOUT / ALL CLEAR, in which the historians fear they may have triggered disastrous changes in the original history.

The major theoretical issue with trying to improve the future -- one's own present -- by altering the past is what happens if you succeed. You would have had no reason to go into the past in the first place, and therefore you couldn't have performed the actions that result / resulted / will or would result in achieving your goal. Many time-travel authors simply ignore this paradox. Some stories work on the premise that the travelers exist in a sort of bubble, in which only they remember both the original timeline and the new one, while everybody else is oblivious that anything has changed. The most logical solution is the outcome Stirling implies: The paradox makes it impossible to reshape one's own original history. Instead, the chrononaut's actions generate a new timeline branching off from the point of intervention. The protagonist of TO TURN THE TIDE can never find out whether that's what happens in the history he and his friends are creating, but the question is moot anyway. In the future they left, every person and thing they knew and loved has almost certainly been wiped out in a nuclear holocaust. Their hope is to spawn a new future without that apocalyptic destruction, even though they'll never know whether they've succeeded.

Although the "branching timelines" model makes the most rigorous sense, I do enjoy stories in which the protagonist achieves positive change by tweaking the past and returns home to enjoy the fruits of his or her efforts.

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, January 11, 2024

Robotic Companions

A robotic device called ElliQ, which functions as an AI "companion" for older people, is now available for purchase by the general public at a price of only $249.99 (plus a monthly subscription fee):

Companion Robot

As shown in the brief video on this page, "she" has a light-up bobble-head but no face. Her head turns and its light flickers in rhythm with her voice, which in my opinion is pleasant and soothing. The video describes her as "empathetic." From the description of the machine, it sounds to me like a more advanced incarnation of inanimate personal assistants similar to Alexa (although I can't say for sure because I've never used one). The bot can generate displays on what looks like the screen of a cell phone. ElliQ's makers claim she "can act as a proactive tool to combat loneliness, interacting with users in a variety of ways." She can remind people about health-related activities such as exercising and taking medicine, place video calls, order groceries, engage in games, tell jokes, play music or audiobooks, and take her owner on virtual "road trips," among other services. She can even initiate conversations by asking general questions.

Here's the manufacturer's site extolling the wonders of ElliQ:

ElliQ Product Page

They call her "the sidekick for healthier, happier aging" that "offers positive small talk and daily conversation with a unique, compassionate personality." One has to doubt the "unique" label for a mass-produced, pre-programmed companion, but she does look like fun to interact with. I can't help laughing, however, at the photo of ElliQ's screen greeting her owner with "Good morning, Dave." Haven't the creators of this ad seen 2001: A SPACE ODYSSEY? Or maybe they inserted the allusion deliberately? I visualize ElliQ locking the client in the house and stripping the premises of all potentially dangerous features.

Some people have reservations about devices of this kind, naturally. Critics express concerns that dependence on bots for elder care may be "alienating" and actually increase the negative effects of isolation and loneliness. On the other hand, in my opinion, if someone has to choose between an AI companion and nothing, wouldn't an AI be better?

I wonder why ElliQ doesn't have a face. Worries about the uncanny valley effect, maybe? I'd think she could be given animated eyes and mouth without getting close enough to a human appearance to become creepy.

If this AI were combined with existing machines that can move around and fetch objects autonomously, we'd have an appliance approaching the household servant robots of Heinlein's novel THE DOOR INTO SUMMER. That book envisioned such marvels existing in 1970, a wildly optimistic notion, alas. While I treasure my basic Roomba, it does nothing but clean carpets and isn't really autonomous. I'm not at all interested in flying cars, except in fiction or films. Can you imagine the two-dimensional, ground-based traffic problems we already live with expanded into three dimensions? Could the average driver be trusted with what amounts to a personal aircraft in a crowded urban environment? No flying car for me, thanks -- where's my cleaning robot?

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, December 21, 2023

Dangerous Gifts

The solstice is upon us! There's hope that within a few weeks darkness will stop falling at 5 p.m. Happy winter holidays!

It might seem natural that if people with arcane psychic talents existed, they would dominate the ungifted majority, whether officially or not, overtly or subtly, gently or cruelly. They might constitute a ruling class like the laran-wielding aristocrats of Marion Zimmer Bradley's Darkover, an order of official problem-solvers like the Heralds of Mercedes Lackey's Valdemar, or an autocratic clique like the sociopathic tyrants of the STAR TREK episode "Plato's Stepchildren." More often than not, however, far from holding exalted status, fictional possessors of such talents are regarded with ambiguity or hostility by their societies.

For example, the Slans in A. E. van Vogt's classic 1946 novel face relentless persecution because of their powers. Fictional vampires surely inspire deeper horror than many other imaginary monsters because of the hypnotic mind control that renders their victims helpless and even unwilling to resist. Zenna Henderson's People, refugees from a distant planet living secretly on Earth, although benign, are often confronted with suspicion or fear when ordinary earthlings discover their powers. In the Sime-Gen series by Jacqueline Lichtenberg and Jean Lorrah, Gens regard the much less numerous Simes with terror not only because they drain life-energy but because they're suspected of occult abilities such as mind-reading.

Historical romance author Mary Jo Putney recently published the first novel in a new series called "Dangerous Gifts." In this book's slightly altered version of Regency England, psychic powers are known to exist but often viewed negatively. The hero lives happily among a circle of people who share similar gifts, and he works for the Home Office using his abilities for the good of his country. As a child, though, he was brutally rejected by his father because of his wild talents. At the beginning of the story, the gifted heroine is being held prisoner by villains who keep her mind clouded as they plot to use her powers for their nefarious goals. Putney has also written a YA series about an alternate-world Britain where magic is considered a lower-class pursuit, a shameful defect if it shows up in a noble family. The magically endowed heroine's upper-class parents send her to an exclusive but very strict academy that exists to train gifted young people to suppress their powers.

In fiction, miracle workers in general often inspire fear and revulsion rather than awe. Consider Mike, the "Martian" in Heinlein's STRANGER IN A STRANGE LAND. In real life, too, such people sometimes meet violent ends.

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, November 30, 2023

Animal Facial Expressions

A study at the University of Kansas Medical Center "discovered that cats use nearly 300 distinct facial expressions to communicate with one another":

Cats' Facial Expressions

In contrast, humans have 44 different facial expressions, and I was surprised to read that dogs have only 27. Feline expressions of emotion often involve ear movements and whiskers, however, so it's not so strange that they have more "distinct" expressions than we do. I was also surprised that cats' "facial signals" play such a large part in their communications with each other. As this article points out, cats are more social than people usually assume.

Chimpanzees convey a lot of information to each other by subtle facial movements:

How Chimps Communicate with a Look

Lisa Parr, director of the Yerkes National Primate Research Center, discusses how small changes in expression can communicate different emotions. Chimps were tested on how well they could distinguish and identify the significance of other chimps' facial expressions. Studying these behaviors in chimpanzees may contribute to better understanding of human nonverbal communication.

Dogs may have developed some types of facial expressions specifically to communicate with us:

How Dog Expressions Evolved

Of course, as this article mentions, a lot of canine communication occurs through body language. Maybe that's why they haven't evolved as many variations on facial expressions as we have. Also, scent plays a vital role in dogs' experiences of the world, a sensory dimension we almost entirely lack compared to canines.

Quora features questions about why animals of the same species tend to look so much "alike," while human beings have distinct individual appearances. Some answers explain, in addition to the human-centered bias that causes us to make finer distinctions among members of our own species, that many animals have less variation in facial appearance than we do because they rely on other senses such as smell to recognize each other.

If intelligent Martians existed, we might think they all look alike, as the narrator of Heinlein's DOUBLE STAR does at the beginning of the novel. On the other hand, the Martians would probably have trouble telling Earth people apart.

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, October 19, 2023

Defining Death

I've been reading a book called WHEN THE "DEAD" ROSE IN BRITAIN, by Nicole C. Salomone. After a forty-page overview of the history of medicine in Europe and Britain, the author delves into "premature burial and the misdiagnosis of death," mostly in the eighteenth and nineteenth centuries. Among the various related topics covered, there's a chapter on European vampire legends, the main reason I bought the book. Over hundreds of years, doctors as well as clergymen and philosophers debated and analyzed in great detail the dividing line between life and death and the criteria for diagnosing death. They distinguished between apparent death (or suspended animation) and absolute death, from which no recovery was possible.

Some physicians explained the essence of aliveness as the "vital spark," rather tautologically defined as the force that maintained life in the body. Later, it was suggested that the vital spark was in fact electricity, a hypothesis seemingly validated by the fact that an electrical current sent through an animal cadaver can make its limbs move. The recognition of the absence of breath and heartbeat as probable but not certain evidence of death inspired development of techniques for resuscitation, some of which produced concrete benefits in reviving victims of drowning and eventually led to CPR as we know it today. Societies for "the Recovery of Persons Apparently Dead" were organized. Salomone seems to accept as fact most of the recorded accounts of people misdiagnosed as dead, often prepared for interment and buried or dissected. On the other hand, the lack of specific details in many of those stories (e.g., names and precisely identified locations) leads me to think a lot were what would now be called urban legends. In any case, a widespread belief in and fear of premature burial in the nineteenth century resulted in the invention of numerous models of "safety coffins."

In modern times, medicine and the law have determined that life resides in the brain. Permanent cessation of brain activity -- "brain death" -- equals the demise of the person. Robert Heinlein's very uneven brain-transplant novel, I WILL FEAR NO EVIL, includes an extended dialogue on this issue, for me the most interesting scene in the book.

If a person has apparently died and been restored to life, was he or she actually dead during the period of "apparent death"? Are "near-death experiences" genuine glimpses of the afterlife or merely the random firing of nerve impulses? Maybe such people are only "mostly dead," like the hero in THE PRINCESS BRIDE.

If science eventually develops a technique for uploading a person's consciousness into a computer, as often envisioned in speculative fiction, is a person whose body has died with the mind preserved in this way alive or dead?

In the Star Trek universe, given that the transporter disintegrates the transportee into component particles that are reassembled at the destination, do people being teleported survive the experience? Or, as Dr. McCoy speculates, do you die every time you step onto the transporter pad, to be replaced by an exact duplicate? If it's an exact duplicate, though, how could you tell? Your memories and personality seem unimpaired. Furthermore, what about the episodes when a transporter accident creates two of the same person? Does destroying one of them or even merging them together (or splitting a new individual generated from two people by the transporter into his component halves, as debated in one VOYAGER episode) count as murder? In the eighteenth century, when the foolproof way of determining whether someone was alive or dead was to wait until the body started to decompose, the quandary was simple by comparison.

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, August 03, 2023

Retro Futures

Watching the first few episodes of STAR TREK: STRANGE NEW WORLDS, which takes place during Captain Christopher Pike's command of the Enterprise, started me thinking about the phenomenon of science fiction set in the near future with technology that gets overtaken and surpassed by real-life inventions. "Retrofuturism" brings to mind elevator operators in Huxley's BRAVE NEW WORLD (a world that relies on reproductive tech far beyond our present capacity) or slide rules coexisting with a lunar settlement in Heinlein's HAVE SPACESUIT, WILL TRAVEL. It's an inescapable hazard of writing about the near future that "cutting edge" can quickly become dated. The TV Tropes site has a page about retrofuturism under the term "Zeerust":

Zeerust

The page includes examples from the Star Trek universe under "Live-Action TV." The best-known one from the original series, of course, is the communicator. To avoid having its communicators look outdated in comparison to real-life cell phones, the prequel series ENTERPRISE had to feature devices more "modern" than those shown chronologically later in-universe.

In the original series, Captain Pike appears after the accident that made him a quadriplegic. According to Wikipedia, he operates his whole-body automated chair by brain waves, a not-implausible distant-future invention, in view of the brain-computer interface devices currently in development. Captain Pike, however, can communicate only by activating Yes or No lights on his wheelchair. In our own time, the late Stephen Hawking used a computer program that allowed him to speak through an artificial voice -- although, toward the end of his life, at the rate of only about one word per minute. Thereafter, as explained on Wikipedia, an "adaptive word predictor" enhanced his ability to communicate. The system developed for him used "predictive software similar to other smartphone keyboards." Therefore, surely by two or three centuries in the future, Captain Pike could have equipment that would enable him to produce full sentences in a completely natural-sounding manner.

As the opposite of retrofuturism or Zeerust, much science fiction displays exaggerated optimism about the futuristic features of the near future. Heinlein, in THE DOOR INTO SUMMER, predicted that advanced household robots and commercially available cryogenic "long sleep" would exist in 1970. In the same year, he has the protagonist invent what amounts to an engineering drafting program, something we've had for decades although Heinlein's versions of robotic servants haven't materialized yet. TV Tropes references this phenomenon here:

I Want My Jet Pack

As Yogi Berra is alleged to have said, "It's tough to make predictions, especially about the future."

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, January 26, 2023

Clones as Organ Donors

I've recently read an excellent novel called NEVER LET ME GO, by Kazuo Ishiguro, author of THE REMAINS OF THE DAY. (I've seen the film adaptation of the latter, but I don't plan to watch the movie of NEVER LET ME GO. The story just strikes me as too depressing to view as a dramatization, without being filtered through the narrator's voice as in the book—and I generally LIKE sad stories.) NEVER LET ME GO traces the youth and coming-of-age of children cloned for the sole purpose of serving as organ donors. Kathy, the narrator, and her friends have always known, on some level, what their purpose and inevitable destiny are, but their vague awareness becomes more explicit as they grow to adulthood. The reader learns about their world along with them, through extended reminiscences by Kathy, who as a young adult serves as a "carer" for other donors until she eventually has to assume the latter function herself. She knows once she progresses from carer to donor, she will probably live through three or at most four donations before she "completes," i.e., dies. The clones don't serve as donors for the specific individuals whose DNA they share (whose identities, of course, they never know) but as general organ banks. The characters we follow grow up in a sort of orphanage / boarding school, where they live a fairly good life; they later learn that theirs is one of the best group homes, whereas others treat their inmates worse. We never learn details about the other homes, the background of the cloning project, or the science underlying it. Nor do we find out how the public was induced to accept this radical development. The novel seems to take place in an alternate mid-twentieth century. This version of England has pre-cellphone, pre-internet technology, yet judging from the apparent ages of older donors mentioned in passing, reliable human cloning has existed for well over twenty years.

The novel focuses on the relationships among the characters, their gradual discovery of the full truth about their own status, and the ethics of treating human beings as manufactured products. Therefore, it doesn't delve into the scientific dimensions of the cloning process. Toward the end of the book, a retired guardian (as their teachers are called) mentions controversies over whether the donors have souls. Nobody brings up the obvious fact that a clone is simply an identical twin conceived at a different time, who grows like any other person and is as human, with as much of a soul (if souls exist) as anybody else. Another unanswered question raised in the story is why the characters can't have babies. There's no biological reason for clones to be infertile. Are they genetically manipulated to be that way? Surgically sterilized in childhood?

Wouldn't it be more efficient for donors to provide spare parts specifically for the people from whom they're cloned? No risk of organ rejection that way. Some of Heinlein's imagined futures include clones produced to supply organs for their originals. In these books, it's clear the cloned bodies never come alive, are never persons at all but only inert shells. One such body is used in THE NUMBER OF THE BEAST to fake the death of Lazarus Long's mother. In principle, an individual could achieve immortality by having his or her brain transplanted into a cloned body when the birth body wears out.

For most purposes, though, why grow a whole body at all? Surely it would be easier to develop cloning technology that could generate particular organs as needed. You could get a new heart, liver, kidney, or whatever with your own DNA and with none of the ethical issues involved in mass-producing live, conscious people to serve as spare-part factories.

So, although NEVER LET ME GO raises fascinating issues, and its characters' plight is deeply moving, it doesn't seem to me a likely portrayal of a realistic scenario.

Margaret L. Carter

Carter's Crypt

Thursday, January 12, 2023

Quest for Longevity

The cover story of the January 2023 NATIONAL GEOGRAPHIC, a 35-page article titled "The Science of Living Longer and Better," explores several different approaches, both theoretical and practical, to the goal of extending the human lifespan. The genetically programmed maximum age for us seems to be around 120 years. However, very few people make it that far.

Numerous drugs enable mice to live as much as 60% longer than normal. Why don't they work on people? Why do certain animals such as naked mole rats and some bats live significantly longer, in proportion to their size, than we do? Why do Greenland sharks live at least 250 years, maybe longer? Altering a single gene in a certain species of roundworms doubles their lifespan while keeping them youthfully energetic, but we're more complicated than worms. Why do people in some societies tend to enjoy longer, healthier lives than the average? Environment? Diet? Exercise? Other lifestyle factors? Some scientists have tried promising drug therapies on themselves, with mixed results. Animal studies show life extension outcomes from severe restriction of calorie intake, but, again, such a regimen hasn't produced similar effects on human subjects. Anyway, personally, if I could lengthen my lifetime by a decade or two that way, I wouldn't bother; adding on years of semi-starvation would be no fun.

Stipulating the natural human upper age limit as about 120 years suggests that the Howard Families project in Robert Heinlein's METHUSELAH'S CHILDREN couldn't work the way the novel portrays it. By the date of the novel, the 22nd century, the typical Howard Families member lives to 150, retaining the appearance and vitality of a person in the prime of life. This situation exists before rejuvenation therapies are invented later in the story. Simply interbreeding bloodlines of naturally long-lived people couldn't extend their maximum ages past the 120-year limit if genes for such extension don't already exist. Moreover, real-life super-centenarians, however vigorous, still look their age, not so youthful they have to adopt new identities to avoid unwelcome attention. The only way the "Methuselahs" of Heinlein's novel could survive and remain young-looking to the age of 150 would be if Lazarus Long had already spread the mutated gene responsible for his apparent immortality through most of the Howard population. (Given the character of Lazarus as portrayed in the later book TIME ENOUGH FOR LOVE, that hypothesis seems not unlikely.) That explanation wouldn't work for the early generations such as Lazarus's own mother and her contemporaries, though. There's no plausible way mere selective breeding for a century or so could produce human beings who live over 100 years with the appearance of well-preserved middle age.

So if we want lifespans like Heinlein's characters, we'll have to develop futuristic technologies similar to those speculated about in the NATIONAL GEOGRAPHIC article. Even so, surpassing the natural limit of 120 years would seem to require something radically beyond those techniques, maybe direct alteration of DNA—such as the hypothetical "cellular reprogramming" mentioned in the article.

Margaret L. Carter

Carter's Crypt

Thursday, October 27, 2022

Is Time Travel Impossible?

A character in C. S. Lewis's posthumously published novel fragment THE DARK TOWER asserts it is. (Granted, one faction within Lewis scholarship maintains THE DARK TOWER wasn't actually written by him, but I don't find that claim convincing. Anyway, the issue doesn't affect the point of the story.) He argues that physical travel to the past or future can't be done for a basic, irrefutable reason: A corporeal trip into a different time necessarily carries all the atoms in one's own body into that other time. But in the past, all those particles existed in other entities in the physical world, whether inanimate objects, living creatures, liquids, gases, whatever. In the future, those same particles will again be distributed through the environment. The only way you could materialize in a different moment would be if duplicates of each of your atoms, molecules, etc. existed in the same place at the same time. According to the laws of physics as we know them, that's impossible. Therefore, physical time travel is forever, irrevocably ruled out, unless we invoke magic rather than science.

That story is the only place where I've encountered this argument, which strikes me as highly convincing. On this hypothesis, other temporal "locations" could only be viewed, never visited. Accordingly, Lewis's character has invented a device for viewing other times, although it turns out the true situation is more complicated than he believed.

While I've come across other stories of observing rather than traveling to some non-present time, I don't remember any that offer a theoretical grounding for the impossibility of temporal travel in the flesh. It's not unusual in time-travel fiction, however, for a traveler to be unable to exist in the same location more than once in the same moment. In Dean Koontz's LIGHTNING, a traveler can't visit a place/time where he already is/was. He's automatically shunted away from that point. In Connie Willis's series about time-traveling historians from a near-future Oxford University, the same prohibition applies, but it's not clear whether the simultaneous existence of two of the same person is outright impossible or would produce a catastrophic result if it accidentally happened. In such works as the Harry Potter series, THE TIME TRAVELER'S WIFE, and Robert Heinlein's "By His Bootstraps," on the other hand, any number of you can be in the same point in space/time at once.

To me, the former rule seems more plausible, because it at least softens the problem of the same material object existing in two places at once, although I've enjoyed lots of fiction in the second category. One possible way to get around the problem raised in Lewis's DARK TOWER: Instead of a corporeal leap into a different time, travelers might project their consciousness and build temporary bodies in the other time by "borrowing" stray particles from the surrounding air, water, and earth. When the traveler released the borrowed matter to return to his or her point of origin, the particles would dissipate harmlessly into the environment. Another method of bypassing the problem shows up in the new QUANTUM LEAP series: The leaper's consciousness occupies the body of a person in the past, presumably suppressing the host's personality in a sort of temporary, benign possession. (The time-shift operated differently in the original series, while this version does leave unanswered the question of where the leaper's body is while his immaterial consciousness travels to multiple past eras.)

Margaret L. Carter

Carter's Crypt

Thursday, September 15, 2022

The Meaning of Money

What gives money (or any "moneylike" form of currency) its value? What makes us willing to accept it in exchange for concrete items of value? Cory Doctorow dissects this conundrum in his latest LOCUS column:

Moneylike

After an attempt to define money, he explores its origin. He rejects the familiar hypothesis of its having been invented to solve the cumbersome difficulties of barter, labeling this a "folk-tale." Instead of a "bottom-up" model of the creation of media of exchange, he describes money as a "top-down" system imposed by governments, which required the existence of currency to collect taxes in order to provision their armies. Where, then, does the money itself come from? It's generated by governments, and problems can occur if the state issues either too much or too little of it. Doctorow illustrates and analyzes this model at length in an extended parable. Items other than official currency can be "moneylike," such as gift certificates. Elaborating on the concept of "moneylike" media of exchange, he goes into detail about how cryptocurrency works, especially with reference to internet ransomware.

Robert Heinlein includes a discussion of what constitutes value in STARSHIP TROOPERS, where the narrator's high-school teacher refutes the claim that labor alone creates value. Heinlein's TIME ENOUGH FOR LOVE contains an amusing scene in which Lazarus Long, acting as the banker for a frontier planet colony, destroys a batch of paper money, to the horror of the man he's dealing with. Lazarus has to explain that money doesn't consist of a physical thing with objective value, but a consensus reality people agree on. As long as Lazarus keeps a record of the serial numbers from the bills he gets rid of, there's no need to preserve the bills themselves (which pose a theft risk).
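Lazarus's point, that currency is a shared record rather than the paper itself, can be illustrated with a toy sketch. (This is purely illustrative Python with invented names like FrontierBank; nothing here comes from Heinlein's text beyond the idea of logging serial numbers before burning worn-out bills.)

```python
# Toy model of Lazarus Long's banking lesson: money is a ledger consensus,
# not paper. Destroying physical bills changes no one's balance, as long as
# the bank records which serial numbers are no longer in circulation.

class FrontierBank:
    def __init__(self):
        self.balances = {}        # account name -> credits on the books
        self.circulating = set()  # serial numbers of bills still outstanding
        self.retired = set()      # serial numbers of bills destroyed
        self.next_serial = 1

    def deposit(self, account, amount):
        self.balances[account] = self.balances.get(account, 0) + amount

    def issue_bills(self, count):
        """Print `count` one-credit bills and record their serial numbers."""
        serials = set(range(self.next_serial, self.next_serial + count))
        self.next_serial += count
        self.circulating |= serials
        return serials

    def destroy_bills(self, serials):
        """Burn worn-out bills; the record, not the paper, carries the value."""
        self.circulating -= serials
        self.retired |= serials

bank = FrontierBank()
bank.deposit("settler", 100)
bills = bank.issue_bills(50)
bank.destroy_bills(bills)               # burn all fifty bills
assert bank.balances["settler"] == 100  # no depositor loses a credit
assert bank.circulating == set()        # no paper outstanding, books intact
```

The horror of Lazarus's customer assumes the value lives in the paper; the sketch shows it living in the records, which is why the bills themselves are only a theft risk to be disposed of.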

In one of Terry Pratchett's Discworld novels, the capital city adopts the Golem Standard. What could serve as a better backing for currency than objects that are almost impossible to steal, counterfeit, or destroy (especially since they're sapient and can defend themselves)?

In the Star Trek universe, conflicting information about the future economy appears in the various series. In the original series, Starfleet personnel must get paid somehow, as shown by Uhura's purchase of a tribble in "The Trouble with Tribbles." Outside of Starfleet, the existence of money is confirmed in "Mudd's Women" and the episode in which Spock poses as a Vulcan merchant. Supposedly by the time of STAR TREK: THE NEXT GENERATION the ubiquity of replicators has made the Federation a post-scarcity society with no need for money. Yet on the fringes (as in DEEP SPACE NINE) and outside the Federation's borders, as made clear by the Ferengi veneration of profit, money exists. Gold-pressed latinum as a medium of exchange is explained on the premise that it's one of the few substances incapable of being replicated. (We have to assume dilithium crystals must fall into the same category, or else obtaining them wouldn't be such a vital preoccupation in the original series.) It seems reasonable that luxury goods in the form of items not produced by replicators, such as the Picard family's wines, would require a medium of exchange for their sale. Or are we to assume creators of such products make them for the sheer joy of the process and give them away? Regardless of post-scarcity abundance, widespread actions like that would imply a radical change in human nature that we don't witness among the Terrans of the Star Trek universe in any other behavioral category.

Margaret L. Carter

Carter's Crypt

Thursday, June 30, 2022

Communicating with Pets

Netflix has a new series called THE HIDDEN LIVES OF PETS. Although it sounds as if it should reveal what our pets do when we're not watching, it actually deals with the intelligence, sensory perceptions, etc. of domestic animals. Dogs, cats, and birds feature prominently, of course, but also such creatures as rabbits, small rodents, turtles, and even soccer-playing goldfish. The episode about communication between people and animals includes a lot of video footage about a dog named Bunny who has become famous for learning to use electronic push-buttons to "talk." This system goes way beyond the battery-operated collars that attempt to translate canine barks and body language into verbal messages (prerecorded and linked to various dog behaviors by the owner):

Petspeak Collars

Here's an article about Bunny, who became the subject of a research project at UC San Diego after she and her mistress, Alexis Devine, amassed millions of TikTok followers:

Bunny the Talking Dog

This dog communicates by pressing buttons on a floor mat, each activating a prerecorded word. As the article mentions, this system is similar to the experiments in which apes learn to select symbols on keyboards to express their wants. At the age of 15 months (in November 2020) Bunny had mastered 70 buttons, including terms such as "scritches," "outside," "play," and "ouch." More problematic words such as "more," "now," "happy," and even "why" are included. While watching the video clips on the Netflix program, I wondered whether an animal could really grasp an abstract concept such as "why." Our dog responds appropriately to quite a few words in addition to the basic commands, such as "upstairs," "downstairs," "inside," "outside," "food," "leash," and "plate." All those refer to concrete objects or actions, though.

Scientists at the Comparative Cognitive Lab "comb through" the Bunny videos rather than checking only a sample. “We want to make sure we’re not just getting cherry-picked clips.” They also watch for the possibility that the dog might be reacting to subtle cues from her human partner instead of recognizing what the buttons represent. And could she "understand" words at all in the sense we mean it? Even Bunny's owner believes she's "made an association between pressing a button and something happening" rather than learning language as we do. On the other hand, human infants start by simply associating sounds with objects, too. Fitting the words into the brain's inborn grammar template comes a little later.

The Petspeak collar and Bunny's button mat remind me of the "voder" the Venusian dragon in Robert Heinlein's BETWEEN PLANETS uses to "talk." Since the highly intelligent dragons don't have vocal organs suitable for human speech, the dragon character wears an electronic device that converts his communications into audible English sentences. It doesn't duplicate the STAR TREK universal translator, being programmed only for dragon-to-English conversion, but in the distant future something like it might be used to communicate with extraterrestrials.

Margaret L. Carter

Carter's Crypt

Thursday, June 09, 2022

Types of Telepathy

In reading THE SCIENCE OF STAR TREK, by Mark Brake, I'm naturally reminded of Vulcan telepathy (not discussed much if at all in this book, though). I don't recall the scope and nature of Spock's telepathic power being strictly defined in the original series. For complete access to the consciousness of another, Vulcans must perform a mind meld. From the episode with the alien Horta, we know language poses no barrier. Spock comprehends the thoughts of aliens through mind melds even if the other species aren't humanoid. However, he seems to exercise some limited form of telepathy without melding; in one later episode, we witness him silently "making a suggestion" to a humanoid antagonist who's not mentally on guard. The "Empath" episode introduces a young woman whose species, if she's typical, is mute. Rather than truly telepathic, they're empathic, sensing emotions but not thoughts. It seems unlikely that this species could have a technologically advanced culture, with no ability to communicate precise concepts, especially abstract ones.

Some theories of telepathy assume the participants must share a language for mutual understanding. Others postulate a universal mental "language" so that access to someone's thoughts automatically allows total comprehension. The title character of "The Mindworm," C. M. Kornbluth's classic psychic vampire tale, can hear the surface thoughts of everybody near him but can understand them only if the subject is mentally verbalizing in a language he knows (a limitation that proves his undoing when he clashes with Eastern European immigrants who recognize him from their native folklore).

Does a telepath "hear" only what the subject is thinking of at the moment or delve at will into all the contents of the person's mind? If the former, can you mask your secrets by deliberately thinking of something else? The telepath in Spider Robinson's VERY BAD DEATHS, so sensitive to the clamor of other people's minds that he lives as a hermit, picks up only surface thoughts. In Robert Heinlein's TIME FOR THE STARS, the telepathic twins "just talk," communicating silently in much the same way they do aloud. Trying to open themselves totally to each other's minds produces chaotic confusion, like being inside someone else's dream, so they don't bother.

On the other hand, some fictional telepaths can rummage through people's minds and quickly learn everything about the subject's past and present. Trying to conceal anything from a psychic with this power by simply thinking of pink elephants would be futile.

Here's a big question that I've never seen addressed, except implicitly in the STAR TREK "Empath" episode: Would a completely telepathic species have a language at all? It seems to me that they wouldn't have a reason to evolve it naturally. On the other hand, for any kind of advanced civilization to develop, surely they would have to invent language sooner or later. They would need a system of writing in order to keep records. They would need a way to communicate at long distance. Even if they got along without speech, surely written language would be a prerequisite for complex societies and any but the most rudimentary technology. Since it wouldn't arise spontaneously, geniuses among them would have to invent it, as cultures on Earth invented mathematical notation. A first-contact premise of interstellar explorers from Earth meeting extraterrestrials whose only form of language is written, to whom audible speech is an alien concept, would make an exciting, challenging story.

Margaret L. Carter

Carter's Crypt

Thursday, May 19, 2022

Time Travel as a Curse

If you've read Audrey Niffenegger's THE TIME TRAVELER'S WIFE, you know it's a highly unusual approach to time travel. In fact, I haven't come across any other science-fiction or fantasy novel quite like it. Henry, the traveler, bounces through time uncontrollably and at random. Most often, he lands in moments related to his own life, but not always. Visiting points in the past and future in no particular order, he arrives at each destination disoriented, nauseated, and naked, for he doesn't take anything along on the temporal jaunts. Even tooth fillings, since they aren't technically part of his body, don't stay with him. He has multiple encounters with his wife, Clare, in the past (from his viewpoint on his timeline, after they're married) when she's between the ages of six and eighteen. On one visit, he tells her which dates he will appear on, and she writes them down. Later, when the two of them meet earlier in his timeline (for him at that age, the first time), she gives him the written list, which thereby becomes the source of his knowledge of their predicted meetings. So how does this list exist? As Clare says, it's a mysterious "Mobius" loop. Similarly, Henry appears to his younger self when child-Henry makes his first time leap, into a museum. Adult-Henry knows he'll need to teach child-Henry the rules of time travel because he remembers a friendly stranger doing that for him when he experienced his first leap.

HBO began airing a new series based on the book last weekend. Judging from the first episode, it's going to follow the novel closely. The book's chapters have helpful headings that state the year and how old each character is on his or her timeline in that encounter. The TV program, likewise, has captions at the beginning of each scene to indicate the ages of Henry and Clare at that point. Otherwise, viewers could get hopelessly lost.

I've never encountered another story that portrays time travel as a disability rather than a superpower (although TV Tropes mentions a few). Henry has no way of knowing whether he'll bounce back to his point of origin within minutes or remain stranded for days or more. He has to steal to survive. He frequently gets beaten up, in addition to the hazards of bad weather and the risk of landing in the middle of a street or railroad track. Small wonder that, at the age of twenty, the first occasion in his timeline when he meets Clare, he's a bit of a self-centered jerk. It takes her love, reinforced by her knowledge of the man he will become, to transform him. One of the saddest features of the novel consists of the multiple miscarriages Clare suffers because her unborn babies inherit Henry's mutant gene and spontaneously time-leap out of her womb. Another inevitable source of sorrow for Henry is knowing when he'll die and keeping that information a secret from her.

Unlike some fictional chrononauts, Henry has no problem being in the same time slot more than once. He can and often does meet other versions of himself. In Dean Koontz's LIGHTNING, the Germans who come forward from World War II into the present can't jump into a moment where they already exist, a restriction that plays a critical part in the novel's climax. Connie Willis's Oxford-based time travelers (in DOOMSDAY BOOK, TO SAY NOTHING OF THE DOG, etc.) have the same limitation. Whatever force controls the space-time continuum won't allow them to overlap themselves, just as it prevents them from getting too close to any critical historical events they might alter. For Henry, on the other hand, there's no worry about altering the past. Whatever he does in any moment he travels to is simply what he has already done. As in Robert Heinlein's THE DOOR INTO SUMMER, whose protagonist also has the ability to have two of himself in the same spatio-temporal location, anything you "change" in another time period doesn't really change the outcome but causes it to happen the way it was/is supposed to all along. While THE DOOR INTO SUMMER ends happily, with the narrator using a time machine to bring about the optimal conclusion, Heinlein's "All You Zombies—", in which every major character is the same person, whose life endlessly loops upon itself, concludes with a cry of existential despair.

The more one thinks about it, the more this aspect of Henry's time travel seems like a reason for despair. If his life is locked into a preset pattern dependent on events he has already experienced, whether in the past or in the future, what happens to free will? Yet Niffenegger manages to conclude the story on a note of love and fulfillment rather than futility.

Margaret L. Carter

Carter's Crypt

Thursday, April 21, 2022

Pregnancy Alternatives

On this season of one of my favorite TV shows, CALL THE MIDWIFE, a recently married character just suffered a miscarriage. This episode and the overall premise of the series reminded me of the ways some animals seem to have an easier time with reproduction than we do. Suppose women could resorb embryos to terminate an early pregnancy, like rats and rabbits, but consciously and at will? Or wouldn't it be more convenient if we were marsupials? Imagine giving birth painlessly to tiny, underdeveloped offspring and completing gestation in a pouch, which doubles as a cradle and food source for the growing infant. Moreover, performing mundane tasks and working at a career would be facilitated by the ability to carry babies around with us, hands-free, twenty-four-seven.

Better yet, wouldn't it be nice if fathers shared the burdens of gestation? Seahorses, of course, fertilize their mates' eggs in a pouch on the male's body where the eggs are sheltered until they hatch. TV Tropes has a page about this phenomenon in various media:

Mister Seahorse

Remember the TV series ALIEN NATION? The Tenctonese (who have three sexes, female and two types of males, but that's a different topic) transfer the pod holding the fetus from mother to father partway through gestation. The father undergoes all the typical experiences of pregnancy, including birth. If human beings had evolved this system, imagine the radical differences that might have historically existed in women's political rights and career opportunities.

Laying eggs like Dejah Thoris (John Carter's wife in Edgar Rice Burroughs's Mars series) would be a less attractive alternative. Even with high-tech incubators, parental care after hatching would be intensive and prolonged. The babies would be small and helpless, probably more so than real-life human newborns because of the limitations of an egg rather than a womb. The only advantage of oviparous over viviparous reproduction would be that both parents could share the work equally.

How about artificial wombs? In my opinion, they're never likely to become universal and replace natural reproduction as in BRAVE NEW WORLD, in the absence of some catastrophic fertility crisis. As long as the natural method remains viable, the expense and technical complications of in vitro gestation would surely far outweigh the potential convenience, except maybe for the very wealthy. Robert Heinlein's PODKAYNE OF MARS includes a less drastic technological modification of the human reproductive cycle. Some couples (those who can afford the cost, I assume) choose to go through pregnancy and birth at the optimal physiological age for healthy reproduction but bring up the children at the optimal economic stage of the parents' life. They achieve this goal by having newborn infants placed into cryogenic suspended animation until parental career and income factors reach the desired point.

Would I want to have done this, if possible? I'm not sure. Getting through college and graduate school would have been easier without babies and toddlers. On the other hand, young parents probably have more energy for chasing after kids than they would in their thirties or forties, and there's something to be said for "growing up with" one's children. Having given birth four times over the span from age nineteen to age thirty-four, I've experienced both ends of that range.

Margaret L. Carter

Carter's Crypt

Thursday, January 13, 2022

Luddites and SF

The term "Luddite" is typically applied to people who oppose technological advances. That's basically what I've always assumed the word to mean. Well, Cory Doctorow's latest LOCUS column corrects that misconception:

Science Fiction Is a Luddite Literature

Luddites were textile workers belonging to a secret society in England in the early nineteenth century, best known for destroying the newfangled equipment in textile mills. According to Doctorow, however, their primary objective wasn't the destruction of machinery. That was "their tactic, not their goal." Rather, their goal was "to challenge not the technology itself, but rather the social relations that governed its use." Doctorow details some of the local and global changes that resulted from mechanization of the textile industry. Factory owners could have used the new-model looms to improve employment conditions for skilled craftspersons. Instead, employers chose to hire fewer workers at lower wages. The Luddites imagined and agitated in favor of a different path for the industrial revolution. Doctorow gives several examples of how we, today, "are contesting the social relations surrounding our technologies."

New technology always generates social change, often with unanticipated consequences. Robert Heinlein's "The Roads Must Roll" is one story that pioneered the type of science fiction focusing not on aspects such as the technical details of how automated roads work, but on how their existence affects the society that uses them. An obvious real-world example, the automobile, had easily predictable effects on individuals' freedom of movement and the decline of passenger railroads, but few people probably foresaw the impact on courtship customs and sexual mores. With cars, the balance of power in courtship shifted from the girl, who invited the boy to "call on" her in her parents' home, to the boy, who invited the girl on a "date" outside her parents' control. And of course the automobile gave young couples more freedom for sexual experimentation than they had under the old system. Consider the telephone: In his final novel, TO SAIL BEYOND THE SUNSET, Heinlein has the narrator's father, a doctor, predict at the turn of the nineteenth to the twentieth century that telephones in the home would mean the end of house calls. When the only way to get a home doctor visit was to send someone in person to summon him, patients expected house calls only in emergencies. Once they could contact their family physician by picking up the phone, they would call for less and less urgent reasons, and doctors would soon refuse to cooperate. (This decline probably happened more slowly than Heinlein anticipated; I have a dim memory of the doctor's visiting me at home when I had measles around age five, the only time I've personally encountered a doctor's house call.)

Like the mechanical looms in the early stage of the industrial revolution, most if not all new technologies benefit some people while disadvantaging others. The ease of paying bills and performing other transactions online provides great convenience for most while disadvantaging those who can't afford a computer and internet connection or, like my ninety-year-old aunt, simply decline to adopt those "conveniences." Businesses now routinely expect customers to have internet access and hence make transactions more difficult for those who don't. Cell phones have made fast connections everywhere all the time routine, so that people are often expected to be instantly available whether they want to be or not. Moreover, as pay telephones have been used less and less, they've tended to disappear, so when anybody does need one—whether because they don't have a mobile phone, the battery has run down, or they're in a dead zone with no cell service—a phone booth is almost impossible to find. I actually "met" a person nagging me for contact information on Goodreads who accused me of lying when I said I didn't own a smart phone. (Yes, I have a cell phone for urgent or otherwise time-sensitive communication away from home, but it's a plain old "dumb" flip model.)

According to Doctorow, science fiction is a Luddite genre because both the historical movement and the fiction "concern themselves with the same questions: not merely what the technology does, but who it does it for and who it does it to."

Margaret L. Carter

Carter's Crypt

Thursday, July 16, 2020

AI and Human Workers

Cory Doctorow's latest LOCUS essay explains why he's an "AI skeptic":

Full Employment

He believes it highly unlikely that anytime in the near future we'll create "general AI," as opposed to present-day specialized "machine learning" programs. What, no all-purpose companion robots? No friendly, sentient supercomputers such as Mike in Heinlein's THE MOON IS A HARSH MISTRESS and Minerva in his TIME ENOUGH FOR LOVE? Not even the brain of the starship Enterprise?

Doctorow also professes himself an "automation-employment-crisis skeptic." Even if we achieved a breakthrough in AI and robotics tomorrow, he declares, human labor would be needed for centuries to come. Each job rendered obsolete by automation would be replaced by multiple new jobs. He cites the demands of climate change as a major driver of employment creation. He doesn't, however, address the problem of retraining the millions of workers whose jobs are superseded by technological and industrial change.

The essay broadens its scope to wider economic issues, such as the nature of real wealth and the long-term unemployment crisis likely to result from the pandemic. Doctorow advances the provocative thesis, "Governments will get to choose between unemployment or government job creation." He concludes with a striking image:

"Keynes once proposed that we could jump-start an economy by paying half the unemployed people to dig holes and the other half to fill them in. No one’s really tried that experiment, but we did just spend 150 years subsidizing our ancestors to dig hydrocarbons out of the ground. Now we’ll spend 200-300 years subsidizing our descendants to put them back in there."

Speaking of skepticism, I have doubts about the premise that begins the article:

"I don’t see any path from continuous improvements to the (admittedly impressive) 'machine learning' field that leads to a general AI any more than I can see a path from continuous improvements in horse-breeding that leads to an internal combustion engine."

That analogy doesn't seem quite valid to me. An organic process (horse-breeding), of course, doesn't evolve naturally into a technological breakthrough. Development from one kind of inorganic intelligence to a higher level of similar, although more complex, intelligence is a different kind of process. Not that I know enough of the relevant science to argue for the possibilities of general AI. But considering the present-day abilities of our car's GPS and the Roomba's tiny brain, both of them smarter than our first desktop computer of only about thirty years ago, who knows what wonders might unfold in the next fifty to a hundred years?

Margaret L. Carter

Carter's Crypt

Thursday, June 04, 2020

The Rules of Writing

Cory Doctorow's latest LOCUS column explores the issue of whether there are any truly unbreakable writing rules:

Rules for Writers

You're probably acquainted with the collection of "rules" he cites, the Turkey City Lexicon, to which he faithfully adhered for many years. It's a list of colorfully labeled errors into which writers can fall, many of them specific to science fiction:

Turkey City Lexicon

The page begins with a long introduction by Bruce Sterling about the origin and background of the Lexicon. The errors and frequently perpetrated SF tropes are divided into categories such as Words and Sentences, Plots, Common Workshop Story Types, etc. Some of the entries now familiar to most speculative fiction writers include Tom Swifties (although I prefer to think of them as "Tom Swiftlies," in keeping with the adverbial theme), e.g., "I'm not lying," Pinocchio said woodenly; "said-bookisms," substituting an outlandishly obtrusive dialogue tag for a simple "said," e.g., "No, Mr. Bond, I expect you to die," Goldfinger gloated; "call a rabbit a smeerp," sticking an exotic name on a mundane animal without changing the creature in any material way; and hand waving, "an attempt to distract the reader with dazzling prose or other verbal fireworks, so as to divert attention from a severe logical flaw."

Doctorow's article links the topic of writing rules to Sterling's nonfiction book THE HACKER CRACKDOWN, leading into the hacker's task of analyzing "which devices were likeliest to contain a fatal error." Inherent difficulty—proneness to error—according to Doctorow, is what the writing "rules" are really all about. At some point in his career, he had the epiphany that the guidelines he'd revered for so long "weren't rules at all! They're merely things that are hard to do right!" In the hands of a Heinlein or an Asimov, for example, an "expository lump" can be fascinating. The rule against exposition is better understood as a warning that "most exposition isn't good, and bad exposition is terrible."

It's sometimes said that there's only one truly unbreakable rule in writing: "Don't be boring." Excellent advice, although hardly specific enough to put into practice. It's on the level of Heinlein's rules for how to succeed as an author, which go something like this: (1) Write. (2) Finish what you write. (3) Submit it to an editor who might buy it. (4) Keep sending it out until it sells. He also advised, "Never rewrite except to editorial order," by which I can't imagine he meant one should submit rough drafts without revision. He apparently meant a writer shouldn't bother rewriting an unsuccessful piece from scratch but should devote his or her energy to producing a new work. Yet Heinlein didn't consistently follow his own advice on that point, as demonstrated by his recently released posthumous book THE PURSUIT OF THE PANKERA, which comprises the abandoned original version of the novel published as THE NUMBER OF THE BEAST. The first half of the text has some differences in detail, while the second half radically diverges. (Personally, I prefer the original draft, which reads much more like "vintage Heinlein" than the fun but meandering, self-indulgent NUMBER OF THE BEAST.)

In short, no ironclad rules, just wise guidelines.

Margaret L. Carter

Carter's Crypt