Thursday, January 11, 2024

Robotic Companions

A robotic device called ElliQ, which functions as an AI "companion" for older people, is now available for purchase by the general public at a price of only $249.99 (plus a monthly subscription fee):

Companion Robot

As shown in the brief video on this page, "she" has a light-up bobble-head but no face. Her head turns and its light flickers in rhythm with her voice, which in my opinion is pleasant and soothing. The video describes her as "empathetic." From the description of the machine, it sounds to me like a more advanced incarnation of inanimate personal assistants similar to Alexa (although I can't say for sure because I've never used one). The bot can generate displays on what looks like the screen of a cell phone. ElliQ's makers claim she "can act as a proactive tool to combat loneliness, interacting with users in a variety of ways." She can remind people about health-related activities such as exercising and taking medicine, place video calls, order groceries, engage in games, tell jokes, play music or audiobooks, and take her owner on virtual "road trips," among other services. She can even initiate conversations by asking general questions.

Here's the manufacturer's site extolling the wonders of ElliQ:

ElliQ Product Page

They call her "the sidekick for healthier, happier aging" that "offers positive small talk and daily conversation with a unique, compassionate personality." One has to doubt the "unique" label for a mass-produced, pre-programmed companion, but she does look like fun to interact with. I can't help laughing, however, at the photo of ElliQ's screen greeting her owner with "Good morning, Dave." Haven't the creators of this ad seen 2001: A SPACE ODYSSEY? Or maybe they inserted the allusion deliberately? I visualize ElliQ locking the client in the house and stripping the premises of all potentially dangerous features.

Some people have reservations about devices of this kind, naturally. Critics express concerns that dependence on bots for elder care may be "alienating" and actually increase the negative effects of isolation and loneliness. On the other hand, in my opinion, if someone has to choose between an AI companion and nothing, wouldn't an AI be better?

I wonder why ElliQ doesn't have a face. Worries about the uncanny valley effect, maybe? I'd think she could be given animated eyes and mouth without getting close enough to a human appearance to become creepy.

If this AI were combined with existing machines that can move around and fetch objects autonomously, we'd have an appliance approaching the household servant robots of Heinlein's novel THE DOOR INTO SUMMER. That book envisioned such marvels existing in 1970, a wildly optimistic notion, alas. While I treasure my basic Roomba, it does nothing but clean carpets and isn't really autonomous. I'm not at all interested in flying cars, except in SF novels or films. Can you imagine the two-dimensional, ground-based traffic problems we already live with expanded into three dimensions? Could the average driver be trusted with what amounts to a personal aircraft in a crowded urban environment? No flying car for me, thanks -- where's my cleaning robot?

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, December 21, 2023

Dangerous Gifts

The solstice is upon us! There's hope that within a few weeks darkness will stop falling at 5 p.m. Happy winter holidays!

It might seem natural that if people with arcane psychic talents existed, they would dominate the ungifted majority, whether officially or not, overtly or subtly, gently or cruelly. They might constitute a ruling class like the laran-wielding aristocrats of Marion Zimmer Bradley's Darkover, an order of official problem-solvers like the Heralds of Mercedes Lackey's Valdemar, or an autocratic clique like the sociopathic tyrants of the STAR TREK episode "Plato's Stepchildren." More often than not, however, far from holding exalted status, fictional possessors of such talents are regarded with ambiguity or hostility by their societies.

For example, the Slans in A. E. van Vogt's classic 1946 novel SLAN face relentless persecution because of their powers. Fictional vampires surely inspire deeper horror than many other imaginary monsters because of the hypnotic mind control that renders their victims helpless and even unwilling to resist. Zenna Henderson's People, refugees from a distant planet living secretly on Earth, although benign, are often confronted with suspicion or fear when ordinary earthlings discover their powers. In the Sime-Gen series by Jacqueline Lichtenberg and Jean Lorrah, Gens regard the much less numerous Simes with terror not only because they drain life-energy but because they're suspected of occult abilities such as mind-reading.

Historical romance author Mary Jo Putney recently published the first novel in a new series called "Dangerous Gifts." In this book's slightly altered version of Regency England, psychic powers are known to exist but often viewed negatively. The hero lives happily among a circle of people who share similar gifts, and he works for the Home Office using his abilities for the good of his country. As a child, though, he was brutally rejected by his father because of his wild talents. At the beginning of the story, the gifted heroine is being held prisoner by villains who keep her mind clouded as they plot to use her powers for their nefarious goals. Putney has also written a YA series about an alternate-world Britain where magic is considered a lower-class pursuit, a shameful defect if it shows up in a noble family. The magically endowed heroine's upper-class parents send her to an exclusive but very strict academy that exists to train gifted young people to suppress their powers.

In fiction, miracle workers in general often inspire fear and revulsion rather than awe. Consider Mike, the "Martian" in Heinlein's STRANGER IN A STRANGE LAND. In real life, too, such people sometimes meet violent ends.

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, November 30, 2023

Animal Facial Expressions

A study at the University of Kansas Medical Center "discovered that cats use nearly 300 distinct facial expressions to communicate with one another":

Cats' Facial Expressions

In contrast, humans have 44 different facial expressions, and I was surprised to read that dogs have only 27. Feline expressions of emotion often involve ear movements and whiskers, however, so it's not so strange that they have more "distinct" expressions than we do. I was also surprised that cats' "facial signals" play such a large part in their communications with each other. As this article points out, cats are more social than people usually assume.

Chimpanzees convey a lot of information to each other by subtle facial movements:

How Chimps Communicate with a Look

Lisa Parr, a researcher at the Yerkes National Primate Research Center, discusses how small changes in expression can communicate different emotions. Chimps were tested on how well they could distinguish and identify the significance of other chimps' facial expressions. Studying these behaviors in chimpanzees may contribute to better understanding of human nonverbal communication.

Dogs may have developed some types of facial expressions specifically to communicate with us:

How Dog Expressions Evolved

Of course, as this article mentions, a lot of canine communication occurs through body language. Maybe that's why they haven't evolved as many variations on facial expressions as we have. Also, scent plays a vital role in dogs' experiences of the world, a sensory dimension in which we're vastly outmatched by canines.

Quora features questions about why animals of the same species tend to look so much "alike," while human beings have distinct individual appearances. Some answers explain, in addition to the human-centered bias that causes us to make finer distinctions among members of our own species, that many animals have less variation in facial appearance than we do because they rely on other senses such as smell to recognize each other.

If intelligent Martians existed, we might think they all look alike, as the narrator of Heinlein's DOUBLE STAR does at the beginning of the novel. On the other hand, the Martians would probably have trouble telling Earth people apart.

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, October 19, 2023

Defining Death

I've been reading a book called WHEN THE "DEAD" ROSE IN BRITAIN, by Nicole C. Salomone. After a forty-page overview of the history of medicine in Europe and Britain, the author delves into "premature burial and the misdiagnosis of death," mostly in the eighteenth and nineteenth centuries. Among the various related topics covered, there's a chapter on European vampire legends, the main reason I bought the book. Over hundreds of years, doctors as well as clergymen and philosophers debated and analyzed in great detail the dividing line between life and death and the criteria for diagnosing death. They distinguished between apparent death (or suspended animation) and absolute death, from which no recovery was possible.

Some physicians explained the essence of aliveness as the "vital spark," rather tautologically defined as the force that maintained life in the body. Later, it was suggested that the vital spark was in fact electricity, a hypothesis seemingly validated by the fact that an electrical current sent through an animal cadaver can make its limbs move. The recognition of the absence of breath and heartbeat as probable but not certain evidence of death inspired development of techniques for resuscitation, some of which produced concrete benefits in reviving victims of drowning and eventually led to CPR as we know it today. Societies for "the Recovery of Persons Apparently Dead" were organized. Salomone seems to accept as fact most of the recorded accounts of people misdiagnosed as dead, often prepared for interment and buried or dissected. On the other hand, the lack of specific details in many of those stories (e.g., names and precisely identified locations) leads me to think a lot were what would now be called urban legends. In any case, a widespread belief in and fear of premature burial in the nineteenth century resulted in the invention of numerous models of "safety coffins."

In modern times, medicine and the law have determined that life resides in the brain. Permanent cessation of brain activity -- "brain death" -- equals the demise of the person. Robert Heinlein's very uneven brain-transplant novel, I WILL FEAR NO EVIL, includes an extended dialogue on this issue, for me the most interesting scene in the book.

If a person has apparently died and been restored to life, was he or she actually dead during the period of "apparent death"? Are "near-death experiences" genuine glimpses of the afterlife or merely the random firing of nerve impulses? Maybe such people are only "mostly dead," like the hero in THE PRINCESS BRIDE.

If science eventually develops a technique for uploading a person's consciousness into a computer, as often envisioned in speculative fiction, is a person whose body has died but whose mind is preserved in this way alive or dead?

In the Star Trek universe, given that the transporter disintegrates the transportee into component particles that are reassembled at the destination, do people being teleported survive the experience? Or, as Dr. McCoy speculates, do you die every time you step onto the transporter pad, to be replaced by an exact duplicate? If it's an exact duplicate, though, how could you tell? Your memories and personality seem unimpaired. Furthermore, what about the episodes when a transporter accident creates two of the same person? Does destroying one of them or even merging them together (or splitting a new individual generated from two people by the transporter into his component halves, as debated in one VOYAGER episode) count as murder? In the eighteenth century, when the foolproof way of determining whether someone was alive or dead was to wait until the body started to decompose, the quandary was simple by comparison.

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, August 03, 2023

Retro Futures

Watching the first few episodes of STAR TREK: STRANGE NEW WORLDS, which takes place during Captain Christopher Pike's command of the Enterprise, started me thinking about the phenomenon of science fiction set in the near future with technology that gets overtaken and surpassed by real-life inventions. "Retrofuturism" brings to mind elevator operators in Huxley's BRAVE NEW WORLD (a world that relies on reproductive tech far beyond our present capacity) or slide rules coexisting with a lunar settlement in Heinlein's HAVE SPACE SUIT -- WILL TRAVEL. It's an inescapable hazard of writing about the near future that "cutting edge" can quickly become dated. The TV Tropes site has a page about retrofuturism under the term "Zeerust":

Zeerust

The page includes examples from the Star Trek universe under "Live-Action TV." The best-known one from the original series, of course, is the communicator. To avoid having its communicators look outdated in comparison to real-life cell phones, the prequel series ENTERPRISE had to feature devices more "modern" than those shown chronologically later in-universe.

In the original series, Captain Pike appears after the accident that made him a quadriplegic. According to Wikipedia, he operates his whole-body automated chair by brain waves, a not-implausible distant-future invention, in view of the brain-computer interface devices currently in development. Captain Pike, however, can communicate only by activating Yes or No lights on his wheelchair. In our own time, the late Stephen Hawking used a computer program that allowed him to speak through an artificial voice -- although, toward the end of his life, at the rate of only about one word per minute. Thereafter, as explained on Wikipedia, an "adaptive word predictor" enhanced his ability to communicate. The system developed for him used "predictive software similar to other smartphone keyboards." Therefore, surely by two or three centuries in the future, Captain Pike could have equipment that would enable him to produce full sentences in a completely natural-sounding manner.
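
Out of curiosity, here's a rough sketch in Python of how such "predictive software" can work at its most basic level. This is purely my own toy illustration with an invented sample sentence, not anything to do with the actual system built for Hawking: the program simply counts which words tend to follow which and suggests the most frequent successors.

from collections import Counter, defaultdict

# Toy next-word predictor: count which words follow each word in a sample
# text, then suggest the most common successors for a given word.
def build_predictor(text):
    words = text.lower().split()
    successors = defaultdict(Counter)
    for current_word, next_word in zip(words, words[1:]):
        successors[current_word][next_word] += 1
    return successors

def suggest(successors, word, how_many=3):
    # Return up to `how_many` words most often seen after `word`.
    return [w for w, _ in successors[word.lower()].most_common(how_many)]

sample = "the door into summer is the door into the future"
predictor = build_predictor(sample)
print(suggest(predictor, "the"))   # ['door', 'future']
print(suggest(predictor, "into"))  # ['summer', 'the']

A real assistive keyboard trains on the user's own writing and weighs longer stretches of context, but the basic idea of ranking likely next words is similar.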

As the opposite of retrofuturism or Zeerust, much science fiction displays exaggerated optimism about the futuristic features of the near future. Heinlein, in THE DOOR INTO SUMMER, predicted that advanced household robots and commercially available cryogenic "long sleep" would exist in 1970. In the same year, he has the protagonist invent what amounts to an engineering drafting program, something we've had for decades although Heinlein's versions of robotic servants haven't materialized yet. TV Tropes references this phenomenon here:

I Want My Jet Pack

As Yogi Berra is alleged to have said, "It's tough to make predictions, especially about the future."

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, January 26, 2023

Clones as Organ Donors

I've recently read an excellent novel called NEVER LET ME GO, by Kazuo Ishiguro, author of THE REMAINS OF THE DAY. (I've seen the film adaptation of the latter, but I don't plan to watch the movie of NEVER LET ME GO. The story just strikes me as too depressing to view as a dramatization, without being filtered through the narrator's voice as in the book—and I generally LIKE sad stories.) NEVER LET ME GO traces the youth and coming-of-age of children cloned for the sole purpose of serving as organ donors. Kathy, the narrator, and her friends have always known, on some level, what their purpose and inevitable destiny are, but their vague awareness becomes more explicit as they grow to adulthood. The reader learns about their world along with them, through extended reminiscences by Kathy, who as a young adult serves as a "carer" for other donors until she eventually has to assume the latter function herself. She knows that once she progresses from carer to donor, she will probably live through three or at most four donations before she "completes," i.e., dies. The clones don't serve as donors for the specific individuals whose DNA they share (whose identities, of course, they never know) but as general organ banks. The characters we follow grow up in a sort of orphanage/boarding school, where they live a fairly good life; they later learn that theirs is one of the best group homes, whereas others treat their inmates worse. We never learn details about the other homes, the background of the cloning project, or the science underlying it. Nor do we find out how the public was induced to accept this radical development. The novel seems to take place in an alternate late twentieth century. This version of England has pre-cellphone, pre-internet technology, yet judging from the apparent ages of older donors mentioned in passing, reliable human cloning has existed for well over twenty years.

The novel focuses on the relationships among the characters, their gradual discovery of the full truth about their own status, and the ethics of treating human beings as manufactured products. Therefore, it doesn't delve into the scientific dimensions of the cloning process. Toward the end of the book, a retired guardian (as their teachers are called) mentions controversies over whether the donors have souls. Nobody brings up the obvious fact that a clone is simply an identical twin conceived at a different time, who grows like any other person and is as human, with as much of a soul (if souls exist) as anybody else. Another unanswered question raised in the story is why the characters can't have babies. There's no biological reason for clones to be infertile. Are they genetically manipulated to be that way? Surgically sterilized in childhood?

Wouldn't it be more efficient for donors to provide spare parts specifically for the people from whom they're cloned? No risk of organ rejection that way. Some of Heinlein's imagined futures include clones produced to supply organs for their originals. In these books, it's clear the cloned bodies never come alive, are never persons at all but only inert shells. One such body is used in THE NUMBER OF THE BEAST to fake the death of Lazarus Long's mother. In principle, an individual could achieve immortality by having his or her brain transplanted into a cloned body when the birth body wears out.

For most purposes, though, why grow a whole body at all? Surely it would be easier to develop cloning technology that could generate particular organs as needed. You could get a new heart, liver, kidney, or whatever with your own DNA and with none of the ethical issues involved in mass-producing live, conscious people to serve as spare-part factories.

So, although NEVER LET ME GO raises fascinating issues, and its characters' plight is deeply moving, it doesn't seem to me a likely portrayal of a realistic scenario.

Margaret L. Carter

Carter's Crypt

Thursday, January 12, 2023

Quest for Longevity

The cover story of the January 2023 NATIONAL GEOGRAPHIC, a 35-page article titled "The Science of Living Longer and Better," explores several different approaches, both theoretical and practical, to the goal of extending the human lifespan. The genetically programmed maximum age for us seems to be around 120 years. However, very few people make it that far.

Numerous drugs enable mice to live as much as 60% longer than normal. Why don't they work on people? Why do certain animals such as naked mole rats and some bats live significantly longer, in proportion to their size, than we do? Why do Greenland sharks live at least 250 years, maybe longer? Altering a single gene in a certain species of roundworms doubles their lifespan while keeping them youthfully energetic, but we're more complicated than worms. Why do people in some societies tend to enjoy longer, healthier lives than the average? Environment? Diet? Exercise? Other lifestyle factors? Some scientists have tried promising drug therapies on themselves, with mixed results. Animal studies show life extension outcomes from severe restriction of calorie intake, but, again, such a regimen hasn't produced similar effects on human subjects. Anyway, personally, if I could lengthen my lifetime by a decade or two that way, I wouldn't bother; adding on years of semi-starvation would be no fun.

Stipulating the natural human upper age limit as about 120 years suggests that the Howard Families project in Robert Heinlein's METHUSELAH'S CHILDREN couldn't work the way the novel portrays it. By the date of the novel, the 22nd century, the typical Howard Families member lives to 150, retaining the appearance and vitality of a person in the prime of life. This situation exists before rejuvenation therapies are invented later in the story. Simply interbreeding bloodlines of naturally long-lived people couldn't extend their maximum ages past the 120-year limit if genes for such extension don't already exist. Moreover, real-life super-centenarians, however vigorous, still look their age, not so youthful they have to adopt new identities to avoid unwelcome attention. The only way the "Methuselahs" of Heinlein's novel could survive and remain young-looking to the age of 150 would be if Lazarus Long had already spread the mutated gene responsible for his apparent immortality through most of the Howard population. (Given the character of Lazarus as portrayed in the later book TIME ENOUGH FOR LOVE, that hypothesis seems not unlikely.) That explanation wouldn't work for the early generations such as Lazarus's own mother and her contemporaries, though. There's no plausible way mere selective breeding for a century or so could produce human beings who live over 100 years with the appearance of well-preserved middle age.
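
That argument comes down to simple selection arithmetic: breeding can only concentrate genes that already exist in the pool. Here's a toy simulation in Python, entirely my own illustration with invented numbers rather than anything from Heinlein or the NATIONAL GEOGRAPHIC article, showing why a Howard-style breeding program creeps toward the existing ceiling but never passes it.

import random

# Toy model: each individual carries two longevity alleles, and lifespan is
# set by the better one. No allele in the founding population exceeds 120.
random.seed(1)
POP_SIZE = 1000
GENE_CAP = 120  # best longevity allele already present in the gene pool

def lifespan(genes):
    return max(genes)  # lifespan limited by the individual's best allele

# Founding population with alleles for 60- to 120-year lifespans.
population = [[random.randint(60, GENE_CAP) for _ in range(2)]
              for _ in range(POP_SIZE)]

for _ in range(30):
    # Breed only from the longest-lived tenth, Howard Foundation style.
    parents = sorted(population, key=lifespan, reverse=True)[:POP_SIZE // 10]
    next_generation = []
    for _ in range(POP_SIZE):
        mother, father = random.sample(parents, 2)
        next_generation.append([random.choice(mother), random.choice(father)])
    population = next_generation

spans = [lifespan(p) for p in population]
print(f"average: {sum(spans) / len(spans):.1f} years, longest: {max(spans)}")
# The average climbs toward 120, but nobody ever exceeds the pre-existing cap.

The numbers just restate the point above: selection can concentrate what's already there, while going beyond it would take a new mutation like Lazarus Long's or outright genetic engineering.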

So if we want lifespans like Heinlein's characters, we'll have to develop futuristic technologies similar to those speculated about in the NATIONAL GEOGRAPHIC article. Even so, surpassing the natural limit of 120 years would seem to require something radically beyond those techniques, maybe direct alteration of DNA—such as the hypothetical "cellular reprogramming" mentioned in the article.

Margaret L. Carter

Carter's Crypt

Thursday, October 27, 2022

Is Time Travel Impossible?

A character in C. S. Lewis's posthumously published novel fragment THE DARK TOWER asserts it is. (Granted, one faction within Lewis scholarship maintains THE DARK TOWER wasn't actually written by him, but I don't find that claim convincing. Anyway, the issue doesn't affect the point of the story.) He argues that physical travel to the past or future can't be done for a basic, irrefutable reason: A corporeal trip into a different time necessarily carries all the atoms in one's own body into that other time. But in the past, all those particles existed in other entities in the physical world, whether inanimate objects, living creatures, liquids, gases, whatever. In the future, those same particles will again be distributed through the environment. The only way you could materialize in a different moment would be if duplicates of each of your atoms, molecules, etc. existed in the same place at the same time. According to the laws of physics as we know them, that's impossible. Therefore, physical time travel is forever, irrevocably ruled out, unless we invoke magic rather than science.

That story is the only place where I've encountered this argument, which strikes me as highly convincing. On this hypothesis, other temporal "locations" could only be viewed, never visited. Accordingly, Lewis's character has invented a device for viewing other times, although it turns out the true situation is more complicated than he believed.

While I've come across other stories of observing rather than traveling to some non-present time, I don't remember any that offer a theoretical grounding for the impossibility of temporal travel in the flesh. It's not unusual in time-travel fiction, however, for a traveler to be unable to exist in the same location more than once in the same moment. In Dean Koontz's LIGHTNING, a traveler can't visit a place/time where he already is/was. He's automatically shunted away from that point. In Connie Willis's series about time-traveling historians from a near-future Oxford University, the same prohibition applies, but it's not clear whether the simultaneous existence of two of the same person is outright impossible or would produce a catastrophic result if it accidentally happened. In such works as the Harry Potter series, THE TIME TRAVELER'S WIFE, and Robert Heinlein's "By His Bootstraps," on the other hand, any number of you can occupy the same point in space/time at once.

To me, the former rule seems more plausible, because it makes the issue of the same material object being in two places at once less obvious, although I've enjoyed lots of fiction in the second category. One possible way to get around the problem raised in Lewis's DARK TOWER: Instead of a corporeal leap into a different time, travelers might project their consciousness and build temporary bodies in the other time by "borrowing" stray particles from the surrounding air, water, and earth. When the traveler released the borrowed matter to return to his or her point of origin, the particles would dissipate harmlessly into the environment. Another method of bypassing the problem shows up in the new QUANTUM LEAP series: The leaper's consciousness occupies the body of a person in the past, presumably suppressing the host's personality in a sort of temporary, benign possession. (The time-shift operated differently in the original series, while this version does leave unanswered the question of where the leaper's body is while his immaterial consciousness travels to multiple past eras.)

Margaret L. Carter

Carter's Crypt

Thursday, September 15, 2022

The Meaning of Money

What gives money (or any "moneylike" form of currency) its value? What makes us willing to accept it in exchange for concrete items of value? Cory Doctorow dissects this conundrum in his latest LOCUS column:

Moneylike

After an attempt to define money, he explores its origin. He rejects the familiar hypothesis of its having been invented to solve the cumbersome difficulties of barter, labeling this a "folk-tale." Instead of a "bottom-up" model of the creation of media of exchange, he describes money as a "top-down" system imposed by governments, which required the existence of currency to collect taxes in order to provision their armies. Where, then, does the money itself come from? It's generated by governments, and problems can occur if the state issues either too much or too little of it. Doctorow illustrates and analyzes this model at length in an extended parable. Items other than official currency can be "moneylike," such as gift certificates. Elaborating on the concept of "moneylike" media of exchange, he goes into detail about how cryptocurrency works, especially with reference to internet ransomware.

Robert Heinlein includes a discussion of what constitutes value in STARSHIP TROOPERS, where the narrator's high-school teacher refutes the claim that labor alone creates value. Heinlein's TIME ENOUGH FOR LOVE contains an amusing scene in which Lazarus Long, acting as the banker for a frontier planet colony, destroys a batch of paper money, to the horror of the man he's dealing with. Lazarus has to explain that money isn't a physical object with intrinsic value but a consensus people agree to honor. As long as Lazarus keeps a record of the serial numbers from the bills he gets rid of, there's no need to preserve the bills themselves (which pose a theft risk).

In one of Terry Pratchett's Discworld novels, the capital city adopts the Golem Standard. What could serve as a better backing for currency than objects that are almost impossible to steal, counterfeit, or destroy (especially since they're sapient and can defend themselves)?

In the Star Trek universe, conflicting information about the future economy appears in the various series. In the original series, Starfleet personnel must get paid somehow, as shown by Uhura's purchase of a tribble in "The Trouble with Tribbles." Outside of Starfleet, the existence of money is confirmed in "Mudd's Women" and the episode in which Spock poses as a Vulcan merchant. Supposedly by the time of STAR TREK: THE NEXT GENERATION the ubiquity of replicators has made the Federation a post-scarcity society with no need for money. Yet on the fringes (as in DEEP SPACE NINE) and outside the Federation's borders, as made clear by the Ferengi veneration of profit, money exists. Gold-pressed latinum as a medium of exchange is explained on the premise that it's one of the few substances incapable of being replicated. (We have to assume dilithium crystals must fall into the same category, or else obtaining them wouldn't be such a vital preoccupation in the original series.) It seems reasonable that luxury goods in the form of items not produced by replicators, such as the Picard family's wines, would require a medium of exchange for their sale. Or are we to assume creators of such products make them for the sheer joy of the process and give them away? Regardless of post-scarcity abundance, widespread actions like that would imply a radical change in human nature that we don't witness among the Terrans of the Star Trek universe in any other behavioral category.

Margaret L. Carter

Carter's Crypt

Thursday, June 30, 2022

Communicating with Pets

Netflix has a new series called THE HIDDEN LIVES OF PETS. Although it sounds as if it should reveal what our pets do when we're not watching, it actually deals with the intelligence, sensory perceptions, etc. of domestic animals. Dogs, cats, and birds feature prominently, of course, but also such creatures as rabbits, small rodents, turtles, and even soccer-playing goldfish. The episode about communication between people and animals includes a lot of video footage about a dog named Bunny who has become famous for learning to use electronic push-buttons to "talk." This system goes way beyond the battery-operated collars that attempt to translate canine barks and body language into verbal messages (prerecorded and linked to various dog behaviors by the owner):

Petspeak Collars

Here's an article about Bunny, who became the subject of a research project at UC San Diego after she and her mistress, Alexis Devine, amassed millions of TikTok followers:

Bunny the Talking Dog

This dog communicates by pressing buttons on a floor mat, each activating a prerecorded word. As the article mentions, this system is similar to the experiments in which apes learn to select symbols on keyboards to express their wants. At the age of 15 months (in November 2020) Bunny had mastered 70 buttons, including terms such as "scritches," "outside," "play," and "ouch." More problematic words such as "more," "now," "happy," and even "why" are included. While watching the video clips on the Netflix program, I wondered whether an animal could really grasp an abstract concept such as "why." Our dog responds appropriately to quite a few words in addition to the basic commands, such as "upstairs," "downstairs," "inside," "outside," "food," "leash," and "plate." All those refer to concrete objects or actions, though.

Scientists at the Comparative Cognition Lab "comb through" the Bunny videos rather than checking only a sample: "We want to make sure we're not just getting cherry-picked clips." They also watch for the possibility that the dog might be reacting to subtle cues from her human partner instead of recognizing what the buttons represent. And could she "understand" words at all in the sense we mean it? Even Bunny's owner believes she's "made an association between pressing a button and something happening" rather than learning language as we do. On the other hand, human infants start by simply associating sounds with objects, too. Fitting the words into the brain's inborn grammar template comes a little later.

The Petspeak collar and Bunny's button mat remind me of the "voder" the Venusian dragon in Robert Heinlein's BETWEEN PLANETS uses to "talk." Since the highly intelligent dragons don't have vocal organs suitable for human speech, the dragon character wears an electronic device that converts his communications into audible English sentences. It doesn't duplicate the STAR TREK universal translator, being programmed only for dragon-to-English conversion, but in the distant future something like it might be used to communicate with extraterrestrials.

Margaret L. Carter

Carter's Crypt

Thursday, June 09, 2022

Types of Telepathy

In reading THE SCIENCE OF STAR TREK, by Mark Brake, I'm naturally reminded of Vulcan telepathy (not discussed much if at all in this book, though). I don't recall the scope and nature of Spock's telepathic power being strictly defined in the original series. For complete access to the consciousness of another, Vulcans must perform a mind meld. From the episode with the alien Horta, we know language poses no barrier. Spock comprehends the thoughts of aliens through mind melds even if the other species aren't humanoid. However, he seems to exercise some limited form of telepathy without melding; in one later episode, we witness him silently "making a suggestion" to a humanoid antagonist who's not mentally on guard. The "Empath" episode introduces a young woman whose species, if she's typical, is mute. Rather than truly telepathic, they're empathic, sensing emotions but not thoughts. It seems unlikely that this species could have a technologically advanced culture, with no ability to communicate precise concepts, especially abstract ones.

Some theories of telepathy assume the participants must share a language for mutual understanding. Others postulate a universal mental "language" so that access to someone's thoughts automatically allows total comprehension. The title character of "The Mindworm," C. M. Kornbluth's classic psychic vampire tale, can hear the surface thoughts of everybody near him but can understand them only if the subject is mentally verbalizing in a language he knows (a limitation that proves his undoing when he clashes with Eastern European immigrants who recognize him from their native folklore).

Does a telepath "hear" only what the subject is thinking of at the moment or delve at will into all the contents of the person's mind? If the former, can you mask your secrets by deliberately thinking of something else? The telepath in Spider Robinson's VERY BAD DEATHS, so sensitive to the clamor of other people's minds that he lives as a hermit, picks up only surface thoughts. In Robert Heinlein's TIME FOR THE STARS, the telepathic twins "just talk," communicating silently in much the same way they do aloud. Trying to open themselves totally to each other's minds produces chaotic confusion, like being inside someone else's dream, so they don't bother.

On the other hand, some fictional telepaths can rummage through people's minds and quickly learn everything about the subject's past and present. Trying to conceal anything from a psychic with this power by simply thinking of pink elephants would be futile.

Here's a big question that I've never seen addressed, except implicitly in the STAR TREK "Empath" episode: Would a completely telepathic species have a language at all? It seems to me that they wouldn't have a reason to evolve it naturally. On the other hand, for any kind of advanced civilization to develop, surely they would have to invent language sooner or later. They would need a system of writing in order to keep records. They would need a way to communicate at long distance. Even if they got along without speech, surely written language would be a prerequisite for complex societies and any but the most rudimentary technology. It wouldn't evolve naturally, however. Geniuses among them would have to create it, as cultures on Earth invented mathematical notation. A first-contact premise of interstellar explorers from Earth meeting extraterrestrials whose only form of language is written, to whom audible speech is an alien concept, would make an exciting, challenging story.

Margaret L. Carter

Carter's Crypt

Thursday, May 19, 2022

Time Travel as a Curse

If you've read Audrey Niffenegger's THE TIME TRAVELER'S WIFE, you know it's a highly unusual approach to time travel. In fact, I haven't come across any other science-fiction or fantasy novel quite like it. Henry, the traveler, bounces through time uncontrollably and at random. Most often, he lands in moments related to his own life, but not always. Visiting points in the past and future in no particular order, he arrives at each destination disoriented, nauseated, and naked, for he doesn't take anything along on the temporal jaunts. Even tooth fillings, since they aren't technically part of his body, don't stay with him. He has multiple encounters with his wife, Clare, in the past (from his viewpoint on his timeline, after they're married) when she's between the ages of six and eighteen. On one visit, he tells her which dates he will appear on, and she writes them down. Later, when the two of them meet earlier in his timeline (for him at that age, the first time), she gives him the written list, which thereby becomes the source of his knowledge of their predicted meetings. So how does this list exist? As Clare says, it's a mysterious "Mobius" loop. Similarly, Henry appears to his younger self when child-Henry makes his first time leap, into a museum. Adult-Henry knows he'll need to teach child-Henry the rules of time travel because he remembers a friendly stranger doing that for him when he experienced his first leap.

HBO began airing a new series based on the book last weekend. Judging from the first episode, it's going to follow the novel closely. The book's chapters have helpful headings that state the year and how old each character is on his or her timeline in that encounter. The TV program, likewise, has captions at the beginning of each scene to indicate the ages of Henry and Clare at that point. Otherwise, viewers could get hopelessly lost.

I've never encountered another story that portrays time travel as a disability rather than a superpower (although TV Tropes mentions a few). Henry has no way of knowing whether he'll bounce back to his point of origin within minutes or remain stranded for days or more. He has to steal to survive. He frequently gets beaten up, in addition to the hazards of bad weather and the risk of landing in the middle of a street or railroad track. Small wonder that at the age of twenty-eight, when he first meets Clare in his own timeline, he's a bit of a self-centered jerk. It takes her love, reinforced by her knowledge of the man he will become, to transform him. One of the saddest features of the novel is the multiple miscarriages Clare suffers because her unborn babies inherit Henry's mutant gene and spontaneously time-leap out of her womb. Another inevitable source of sorrow for Henry is knowing when he'll die and keeping that information a secret from her.

Unlike some fictional chrononauts, Henry has no problem being in the same time slot more than once. He can and often does meet other versions of himself. In Dean Koontz's LIGHTNING, the Germans who come forward from World War II into the present can't jump into a moment where they already exist, a restriction that plays a critical part in the novel's climax. Connie Willis's Oxford-based time travelers (in DOOMSDAY BOOK, TO SAY NOTHING OF THE DOG, etc.) have the same limitation. Whatever force controls the space-time continuum won't allow them to overlap themselves, just as it prevents them from getting too close to any critical historical events they might alter. For Henry, on the other hand, there's no worry about altering the past. Whatever he does in any moment he travels to is simply what he has already done. As in Robert Heinlein's THE DOOR INTO SUMMER, whose protagonist also has the ability to have two of himself in the same spatio-temporal location, anything you "change" in another time period doesn't really change the outcome but causes it to happen the way it was/is supposed to all along. While THE DOOR INTO SUMMER ends happily, with the narrator using a time machine to bring about the optimal conclusion, Heinlein's "All You Zombies—", in which every major character is the same person, whose life endlessly loops upon itself, concludes with a cry of existential despair.

The more one thinks about it, the more this aspect of Henry's time travel seems like a reason for despair. If his life is locked into a preset pattern dependent on events he has already experienced, whether in the past or in the future, what happens to free will? Yet Niffenegger manages to conclude the story on a note of love and fulfillment rather than futility.

Margaret L. Carter

Carter's Crypt

Thursday, April 21, 2022

Pregnancy Alternatives

On this season of one of my favorite TV shows, CALL THE MIDWIFE, a recently married character just suffered a miscarriage. This episode and the overall premise of the series reminded me of the ways some animals seem to have an easier time with reproduction than we do. Suppose women could resorb embryos to terminate an early pregnancy, like rats and rabbits, but consciously and at will? Or wouldn't it be more convenient if we were marsupials? Imagine giving birth painlessly to tiny, underdeveloped offspring and completing gestation in a pouch, which doubles as a cradle and food source for the growing infant. Moreover, performing mundane tasks and working at a career would be facilitated by the ability to carry babies around with us, hands-free, twenty-four-seven.

Better yet, wouldn't it be nice if fathers shared the burdens of gestation? Male seahorses, of course, fertilize their mates' eggs in a pouch on their own bodies, where the eggs are sheltered until they hatch. TV Tropes has a page about this phenomenon in various media:

Mister Seahorse

Remember the TV series ALIEN NATION? The Tenctonese (who have three sexes, female and two types of males, but that's a different topic) transfer the pod holding the fetus from mother to father partway through gestation. The father undergoes all the typical experiences of pregnancy, including birth. If human beings had evolved this system, imagine the radical differences that might have historically existed in women's political rights and career opportunities.

Laying eggs like Dejah Thoris (John Carter's wife in Edgar Rice Burroughs's Mars series) would be a less attractive alternative. Even with high-tech incubators, parental care after hatching would be intensive and prolonged. The babies would be small and helpless, probably more so than real-life human newborns because of the limitations of an egg rather than a womb. The only advantage of oviparous over viviparous reproduction would be that both parents could share the work equally.

How about artificial wombs? In my opinion, barring some catastrophic fertility crisis, they're never likely to become universal and replace natural reproduction as in BRAVE NEW WORLD. As long as the natural method remains viable, the expense and technical complications of in vitro gestation would surely far outweigh the potential convenience, except maybe for the very wealthy. Robert Heinlein's PODKAYNE OF MARS includes a less drastic technological modification of the human reproductive cycle. Some couples (those who can afford the cost, I assume) choose to go through pregnancy and birth at the optimal physiological age for healthy reproduction but bring up the children at the optimal economic stage of the parents' life. They achieve this goal by having newborn infants placed into cryogenic suspended animation until parental career and income factors reach the desired point.

Would I want to have done this, if possible? I'm not sure. Getting through college and graduate school would have been easier without babies and toddlers. On the other hand, young parents probably have more energy for chasing after kids than they would in their thirties or forties, and there's something to be said for "growing up with" one's children. Having given birth four times over the span from age nineteen to age thirty-four, I've experienced both ends of that range.

Margaret L. Carter

Carter's Crypt

Thursday, January 13, 2022

Luddites and SF

The term "Luddite" is typically applied to people who oppose technological advances. That's basically what I've always assumed the word to mean. Well, Cory Doctorow's latest LOCUS column corrects that misconception:

Science Fiction Is a Luddite Literature

Luddites were textile workers belonging to a secret society in England in the early nineteenth century, best known for destroying the newfangled equipment in textile mills. According to Doctorow, however, their primary objective wasn't the destruction of machinery. That was "their tactic, not their goal." Rather, their goal was "to challenge not the technology itself, but rather the social relations that governed its use." Doctorow details some of the local and global changes that resulted from mechanization of the textile industry. Factory owners could have used the new-model looms to improve employment conditions for skilled craftspersons. Instead, employers chose to hire fewer workers at lower wages. The Luddites imagined and agitated in favor of a different path for the industrial revolution. Doctorow gives several examples of how we, today, "are contesting the social relations surrounding our technologies."

New technology always generates social change, often with unanticipated consequences. Robert Heinlein's "The Roads Must Roll" is one story that pioneered the type of science fiction focusing not on aspects such as the technical details of how automated roads work, but on how their existence affects the society that uses them. An obvious real-world example, the automobile, had easily predicted effects on individuals' freedom of movement and the decline of passenger railroads, but few people probably foresaw the impact on courtship customs and sexual mores. With cars, the balance of power in courtship shifted from the girl, who invited the boy to "call on" her in her parents' home, to the boy, who invited the girl on a "date" outside her parents' control. And of course the automobile gave young couples more freedom for sexual experimentation than they had under the old system. Consider the telephone: In his final novel, TO SAIL BEYOND THE SUNSET, Heinlein has the narrator's father, a doctor, predict at the turn of the nineteenth to the twentieth century that telephones in the home would mean the end of house calls. When the only way to get a home doctor visit was to send someone in person to summon him, patients expected house calls only in emergencies. Once they could contact their family physician by picking up the phone, they would call for less and less urgent reasons, and doctors would soon refuse to cooperate. (This decline probably happened more slowly than Heinlein anticipated; I have a dim memory of the doctor's visiting me at home when I had measles around age five, the only time I've personally encountered a doctor's house call.)

Like the mechanical looms in the early stage of the industrial revolution, most if not all new technologies benefit some people while disadvantaging others. The ease of paying bills and performing other transactions online provides great convenience for most while leaving behind those who can't afford a computer and internet connection or, like my ninety-year-old aunt, simply decline to adopt those "conveniences." Businesses now routinely expect customers to have internet access and hence make transactions more difficult for those who don't. Cell phones have made fast connections everywhere all the time routine, so that people are often expected to be instantly available whether they want to be or not. Moreover, as pay telephones have been used less and less, they've tended to disappear, so when anybody does need one—whether because they don't have a mobile phone, their battery has run down, or they're in a dead zone with no cell service—a phone booth is almost impossible to find. I actually "met" a person on Goodreads who nagged me for contact information and accused me of lying when I said I didn't own a smartphone. (Yes, I have a cell phone for urgent or otherwise time-sensitive communication away from home, but it's a plain old "dumb" flip model.)

According to Doctorow, science fiction is a Luddite genre because both the historical movement and the fiction "concern themselves with the same questions: not merely what the technology does, but who it does it for and who it does it to."

Margaret L. Carter

Carter's Crypt

Thursday, July 16, 2020

AI and Human Workers

Cory Doctorow's latest LOCUS essay explains why he's an "AI skeptic":

Full Employment

He believes it highly unlikely that anytime in the near future we'll create "general AI," as opposed to present-day specialized "machine learning" programs. What, no all-purpose companion robots? No friendly, sentient supercomputers such as Mike in Heinlein's THE MOON IS A HARSH MISTRESS and Minerva in his TIME ENOUGH FOR LOVE? Not even the brain of the starship Enterprise?

Doctorow also professes himself an "automation-employment-crisis skeptic." Even if we achieved a breakthrough in AI and robotics tomorrow, he declares, human labor would be needed for centuries to come. Each job rendered obsolete by automation would be replaced by multiple new jobs. He cites the demands of climate change as a major driver of employment creation. He doesn't, however, address the problem of retraining those millions of workers whose jobs are superseded by technological and industrial change.

The essay broadens its scope to wider economic issues, such as the nature of real wealth and the long-term unemployment crisis likely to result from the pandemic. Doctorow advances the provocative thesis, "Governments will get to choose between unemployment or government job creation." He concludes with a striking image:

"Keynes once proposed that we could jump-start an economy by paying half the unemployed people to dig holes and the other half to fill them in. No one’s really tried that experiment, but we did just spend 150 years subsidizing our ancestors to dig hydrocarbons out of the ground. Now we’ll spend 200-300 years subsidizing our descendants to put them back in there."

Speaking of skepticism, I have doubts about the premise that begins the article:

"I don’t see any path from continuous improvements to the (admittedly impressive) 'machine learning' field that leads to a general AI any more than I can see a path from continuous improvements in horse-breeding that leads to an internal combustion engine."

That analogy doesn't seem quite valid to me. An organic process (horse-breeding), of course, doesn't evolve naturally into a technological breakthrough. Development from one kind of inorganic intelligence to a higher level of similar, although more complex, intelligence is a different kind of process. Not that I know enough of the relevant science to argue for the possibilities of general AI. But considering the present-day abilities of our car's GPS and the Roomba's tiny brain, both of them smarter than the first desktop computer we owned only about thirty years ago, who knows what wonders might unfold in the next fifty to a hundred years?

Margaret L. Carter

Carter's Crypt

Thursday, June 04, 2020

The Rules of Writing

Cory Doctorow's latest LOCUS column explores the issue of whether there are any truly unbreakable writing rules:

Rules for Writers

You're probably acquainted with the collection of "rules" he cites, the Turkey City Lexicon, to which he faithfully adhered for many years. It's a list of colorfully labeled errors into which writers can fall, many of them specific to science fiction:

Turkey City Lexicon

The page begins with a long introduction by Bruce Sterling about the origin and background of the Lexicon. The errors and frequently perpetrated SF tropes are divided into categories such as Words and Sentences, Plots, Common Workshop Story Types, etc. Some of the entries now familiar to most speculative fiction writers include: Tom Swifties (although I prefer to think of them as "Tom Swiftlies," in keeping with the adverbial theme), e.g., "I'm not lying," Pinocchio said woodenly. "Said-bookisms," substituting an outlandishly obtrusive dialogue tag for a simple "said," e.g., "No, Mr. Bond, I expect you to die," Goldfinger gloated. "Call a Rabbit a Smeerp," sticking an exotic name on a mundane animal without changing the creature in any material way. Hand waving, "An attempt to distract the reader with dazzling prose or other verbal fireworks, so as to divert attention from a severe logical flaw."

Doctorow's article links the topic of writing rules to Sterling's nonfiction book THE HACKER CRACKDOWN, leading into the hacker's task of analyzing "which devices were likeliest to contain a fatal error." Inherent difficulty—proneness to error—according to Doctorow, is what the writing "rules" are really all about. At some point in his career, he received the epiphany that the guidelines he'd revered for so long "weren't rules at all! They're merely things that are hard to do right!" In the hands of a Heinlein or an Asimov, for example, an "expository lump" can be fascinating. The rule against exposition is better understood as a warning that "most exposition isn't good, and bad exposition is terrible."

It's sometimes said that there's only one truly unbreakable rule in writing: "Don't be boring." Excellent advice, although hardly specific enough to put into practice. It's on the level of Heinlein's rules for how to succeed as an author, which go something like this: (1) Write. (2) Finish what you write. (3) Submit it to an editor who might buy it. (4) Keep sending it out until it sells. He also advised, "Never rewrite except to editorial order," by which I can't imagine he meant one should submit rough drafts without revision. He apparently meant a writer shouldn't bother rewriting an unsuccessful piece from scratch but should devote his or her energy to producing a new work. Yet Heinlein didn't consistently follow his own advice on that point, as demonstrated by his recent posthumously released book THE PURSUIT OF THE PANKERA. It comprises the abandoned original version of the novel published as THE NUMBER OF THE BEAST. The first half of the text has some differences in detail, while the second half radically diverges. (Personally, I prefer the original draft, which reads much more like "vintage Heinlein" than the fun but meandering, self-indulgent NUMBER OF THE BEAST.)

In short, no ironclad rules, just wise guidelines.

Margaret L. Carter

Carter's Crypt

Thursday, April 02, 2020

Accessible Writing

The April 2020 issue of RWR (magazine of the Romance Writers of America) contains an article titled "The Literary Craft of Accessibility," by Rebecca Hunter. She begins by analyzing the difference between literary fiction and genre fiction, for which she focuses on level of accessibility: "Literary fiction expects the reader to come to the book, while genre fiction books come to the reader." To put it simply, literary fiction expects the reader to work harder. It would be easy to conclude that denser novels are therefore of higher quality than less "difficult" works, a "false—and harmful—hierarchy" the author warns against. I readily agree that a "literary" novel may be difficult and dense for the sheer sake of difficulty, putting unnecessary roadblocks in the reader's path from the mistaken notion that lucid prose and a clear narrative thread equate to "dumbing down." And a genre novel can include deep themes that make a reader think and challenge her established assumptions.

Hunter undercuts her cautionary reference to false hierarchies, in my opinion, by contrasting "lyrical" and "thoughtful" with "fast-paced" and "light," the latter suggesting a "more accessible style." A genre novel can be accessible, yet sedately paced and deeply emotional. Some factors she lists as contributing to degree of accessibility include length of sentences; breadth of vocabulary; the balance among action, atmosphere, and ideas; moral clarity or ambiguity; how clearly the characters and plot fulfill "expectations set in the beginning of the story"; and "use of cliches, idioms, and other familiarities." I have reservations about some items on the list. For example, I don't think a novel has to lean heavily toward "action" to be accessible. Many romance novels don't, nor do many vintage favorites in other genres. GONE WITH THE WIND is one perennial bestseller that has many more reflective and emotional scenes than action scenes in the popular sense of the word. I find the mention of "cliches" off-putting; while familiar tropes, handled well, can be welcome, an outright "cliche" is another matter. Another feature, "amount of emotional complexity spelled out for readers," sounds as if excessive telling over showing is being recommended. Every writer must balance all these elements in her own way, of course, and Hunter does address the shortcomings of cliches and "telling." She points out that "frankly, there are lots of readers who like this familiarity and clarity." So an author needs to know her target audience well. "Each reader's preferences are different. . . . there are readers for all accessibility levels." Hunter also discusses theme, which she defines as "an open-ended question our story asks," and briefly covers the possibility of increasing a work's complexity by adding further thematic layers.

Personally, I enjoy a book with a varied, challenging vocabulary and complex characters and emotions. What makes me impatient are works that appear to be confusing for the sake of confusion, such as failing to clearly distinguish characters from each other or coming to a conclusion that leaves the reader with literally no way to be sure what happened—by which I mean, not an ambiguous ending deliberately designed to allow multiple interpretations, but one in which it's impossible to puzzle out the plain sense of what transpires on the page. As Marion Zimmer Bradley used to say in her submission guidelines, "If I can't figure out what happened, I assume my readers won't care." Levels of acceptable "accessibility," of course, vary over the decades and centuries according to the fashions of the times. Long descriptive and expository passages, common in nineteenth-century novels, would meet with disapproval from most editors nowadays, no matter how well written. Something similar to the opening paragraphs of Dickens' A TALE OF TWO CITIES ("It was the best of times, it was the worst of times. . . ."), although accessible in the sense of easily understandable, probably wouldn't be accepted by most contemporary publishers. It also used to be common for authors to include untranslated passages in foreign languages, especially in nonfiction but sometimes even in fiction. Most nonfiction writers up through the early twentieth century assumed all educated readers understood Latin and Greek. Dorothy Sayers inserted a long letter in French into her Lord Peter Wimsey mystery CLOUDS OF WITNESS; the publisher insisted on having a translation added. On the other hand, to cite a contemporary example, in Barbara Hambly's Benjamin January mysteries, set in Louisiana of the 1830s, January's erudite friend Hannibal often includes Greek and Latin quotations in his speech. They add flavor to the story's atmosphere, but understanding them is rarely necessary for following the story; when it is, Hambly clues us in as needed. Readers who'd be put off by this kind of linguistic play simply don't form part of her target audience, but then, such people probably aren't fans of historical mysteries in general, which require openness to navigating an unfamiliar time and place.

Hunter's article also doesn't discuss accessibility in relation to genre conventions. For instance, Regency romance authors probably assume their target audience has some familiarity with the period, if only from reading lots of prior novels in that setting. Science fiction, in particular, expects a certain level of background knowledge from its readers. We should know about hyperdrive and other forms of FTL travel, if only enough to suspend disbelief and move on with the story. Some SF stories expect more acquaintance with the genre than others. Any viewer with a willing imagination can follow the original STAR TREK, designed to appeal to a mass audience. Near the other end of the accessibility spectrum, the new posthumous Heinlein novel, THE PURSUIT OF THE PANKERA (the previously unpublished original version of his 1980 NUMBER OF THE BEAST), envisions a reader with a considerable fannish background. The ideal reader knows, or at least has some acquaintance with, Edgar Rice Burroughs' Barsoom books and E. E. Smith's Lensman series. That reader also has a high tolerance for dialogue about the intricacies of alternate universes and the heroes' device for transiting among them, on which the text goes into considerable detail at some points. Optimally, that fan will also have read Heinlein's own previous work, at least his best-known books. This novel is not the way to introduce a new reader to Heinlein, much less to SF in general.

It seems to me that "accessibility" forms a subset of the larger topic of reader expectations. So the question of how accessible our work is (or needs to be) comes back to knowing the expectations of the target audience.

Margaret L. Carter

Carter's Crypt

Thursday, November 14, 2019

Life Not as We Know It

One episode of the BBC series PLANET EARTH: BLUE PLANET II highlights denizens of the ocean depths that thrive independently of energy from the sun. They rely on energy from other sources, and some have no need of oxygen.

Some live in methane-rich environments known as "cold seeps" or "cold vents":

Cold Seeps

These spots aren't "cold" in the absolute sense, just less hot than the hot vents referenced below. Bacteria, mussels, and tube worms live happily in the methane or hydrogen sulfide of these ecosystems. Some individual tube worms have been estimated to survive for 250 years in such locations. If similar life-forms developed on other planets in environments like these, in the absence of competition from oxygen-dependent and sunlight-dependent creatures, and eventually became intelligent, a lifespan of that length would allow them plenty of time to learn and pass on their learning to future generations.

Other organisms have evolved in the volcanically active areas around hydrothermal vents, where water can reach temperatures of several hundred degrees Fahrenheit:

Hydrothermal Vents

Like inhabitants of cold vents, life-forms around hydrothermal vents also depend on chemosynthetic bacteria for food. Crustaceans, tube worms and other types of worms, gastropods such as snails, and even eels are among the creatures that populate these locations. It's believed that life on Earth may have originated in an environment like this. Again, on a planet where this kind of environment dominated, we can imagine that hydrothermal-vent species might evolve sentience and intelligence.

So living creatures can exist right here on our planet in conditions that would be lethal to most Earth species. The quest for extraterrestrial life needn't confine itself to oxygen-rich environments. Moreover, we don't have to expect advanced beings to conform to the familiar humanoid shape. In Heinlein's HAVE SPACE SUIT, WILL TRAVEL, the teenage narrator describes the villain, an invader from a distant solar system. He's puzzled that these decidedly inhuman-looking aliens can survive in Terran environmental conditions, until he reminds himself that spiders resemble us much less, yet they live in our houses. We don't have to search beyond Earth's ecological systems to find bizarrely alien creatures.

The Wikipedia articles include some color photos of those exotic organisms. Take a look.

Margaret L. Carter

Carter's Crypt

Thursday, January 10, 2019

Robots in the Home

More new developments in household robotics:

Are Domestic Robots the Way of the Future?

One problem foregrounded by this article is people's expectation that robots should look humanoid, versus the optimal shape for efficiently performing their functions. A real-world autonomous floor cleaner, after all, doesn't take the form of "a humanoid robot with arms" able to "push a vacuum cleaner." A related problem is that our household environments, unlike factories, are designed for human beings rather than for non-humanoid machines. Researchers at Cornell University have been trying "to balance our need to be able to relate emotionally to robots with making them genuinely useful."

Dave Coplin, CEO of The Envisioners, promotes the concept of "social robotics":

Domestic Robots Are Coming in 2019

He advocates "trying to imbue emotion into communication between humans and robots," by, for example, training robots to understand human facial expressions. He even takes the rather surprising position that the household robot of the future, rather than a "slave" or "master," should be "a companion and peer to the family." According to Coplin, the better the communication between us and our intelligent machines, the more efficiently they will work for us. Potential problems need to be solved, however, such as the difficulty of a robot's learning to navigate a house designed for human inhabitants, as mentioned above. Security of data may also pose problems, because the robot of the future will need access to lots of personal information in order to do its job.

In Robert Heinlein's THE DOOR INTO SUMMER, the engineer narrator begins by creating single-task robots, rough equivalents of Roombas. Later, he invents multi-purpose robotic domestic servants with more nearly humanoid shapes, because they have to be almost as versatile as human workers. We're still a long way from the android grandmother of Ray Bradbury's classic story "I Sing the Body Electric!" Robots are, however, being designed to help with elder care in Japan. According to the article cited above, some potential customers want robots that may offer "companionship" by listening to their troubles or keeping pets company while owners are out. Now, if the robot could walk the dog, too, that would really be useful. The January NATIONAL GEOGRAPHIC mentions medical robots that can draw blood, take vital signs, and even shift bedridden patients. One snag with such machines: To have the power to lift objects of significant weight, not to mention human adults, a robot has to be inconveniently heavy (as well as expensive).

On the subject of balancing usefulness with the need for relating emotionally: In Suzette Haden Elgin's poem "Too Human by Half," an elderly woman grows so attached to her lifelike household robot that she can't bear to replace it when it starts to malfunction. "Replace JANE? . . . Just because she's getting OLD?" Therefore, when the company launches its next model, "they made every one of the units look exactly like a broom."

Margaret L. Carter

Carter's Crypt

Thursday, December 06, 2018

Alternate Timelines

One of my favorite authors, S. M. Stirling, recently launched a new alternate-history series with BLACK CHAMBER, published in July of this year. His website has begun displaying sample chapters from the first sequel, due in spring of 2019. Reading them started me thinking about the effects small or large changes might have on the historical timeline. The POD (point of departure) for the Black Chamber universe—the moment when it diverges from our reality—occurs in 1912, when President Taft dies prematurely and Theodore Roosevelt returns to the White House (instead of Woodrow Wilson becoming President). With no constitutional term limits for the presidency at that time, Roosevelt has free rein to shape the nation according to his principles. Not only the circumstances of U.S. involvement in World War I but the direction of the entire twentieth century will change. The main story line of the novel begins in 1916.

If you could go back in time and alter the twentieth century for the better, what single action would you take? Killing Hitler before he can do any damage immediately springs to mind, of course. However, aside from the ethical problem of murdering a person who hasn't yet committed evil deeds, killing Hitler never works. TV Tropes even has a page on this topic, "Hitler's Time Travel Exemption Act." One example: In an episode of the later incarnation of THE TWILIGHT ZONE, a time traveler from the future installs herself as a servant in the household of Hitler's parents. She finally manages to kill baby Adolf along with herself. The nursery maid, however, is so terrified of Herr Hitler's probable reaction to the loss of his son that she substitutes a look-alike infant taken from a beggar woman. So history still plays out with an Adolf Hitler, just not the original one. Nonviolent ways of eliminating Hitler might work, such as preventing his parents from meeting, kidnapping the baby and having him adopted by a nice English couple, or giving young Adolf a scholarship to art school. Would forestalling his political career actually prevent the war, though? Some authors speculate that, given the conditions of post-World-War-I Europe, the Nazi Party would come to power anyway with a different, possibly worse tyrant in charge.

Arguably, the most productive single thing you could do to avert the catastrophic events of the twentieth century would be to go to Sarajevo in 1914 and arrange for Archduke Franz Ferdinand's car to be re-routed so the assassin would never have a chance to shoot him. But would the erasure of the assassination definitely prevent the Great War? The nations of Europe, with their weapons development and entangled alliances, had been building toward that conflict for decades. It's not unlikely that some other spark would have set off the conflagration anyway. Various speculative fiction authors disagree about the ease of altering the timeline. Do we embrace the "Great Man" theory, where the removal of one person makes all the difference? Or do we lean toward Heinlein's position that "when it's time for railroads, people will railroad"? In Stephen King's 11/22/63, about a time traveler who tries to prevent the assassination of President Kennedy, saving Kennedy creates a major disruption in the flow of history, but not for the better.

Jo Walton's fascinating novel MY REAL CHILDREN takes a unique approach to the theme. The protagonist, as an old woman in a nursing home, remembers two different lives in two worlds (neither of them our own timeline). In one, the more prosperous and peaceful version of the twentieth and early twenty-first centuries, she suffers through an unhappy marriage. In the other timeline, which verges on dystopia, she has a generally happy life. If she has the power to make one of them definitively "real," which should she choose?

In most of Heinlein's time-travel fiction, he reveals that no change actually occurs, because the traveler's actions simply bring about what was destined to happen anyway. The past as we know it already includes whatever input we contribute—as in, for instance, THE DOOR INTO SUMMER. Some other writers postulate that history inevitably tries to repair itself when "damaged." Diana Gabaldon's Outlander series illustrates the elasticity of the timeline. Claire (a visitor to the eighteenth century from the twentieth) and her husband Jamie can make small changes, but all their attempts to prevent or mitigate Bonnie Prince Charlie's disastrous 1745 campaign fail. The ultimate example of this principle may be "The Men Who Murdered Mohammed," by Alfred Bester. The time traveler assassinates a series of successively more important personages without ever managing to make a permanent mark on the past.

The opposite approach postulates that the slightest change will have vast consequences—the "butterfly effect." Appropriately, Ray Bradbury provided the classic example of this theory in "A Sound of Thunder," when a member of a tourist group traveling to the age of the dinosaurs alters his own future by accidentally killing a butterfly. The trouble with this story, alas, is that if a small change that far back could shift the entire direction of history, by the traveler's present day the alterations would have snowballed to such an extent that his native time would become unrecognizable, not just subtly distorted toward a dystopian outcome. On the same principle, consider the many alternate-history stories whose authors introduce famous people from the past in different roles from their real-life ones. Actually, depending on how far back the POD occurs, random alterations in meetings, matings, and conceptions would ensure that most if not all of those people would never be born. But what fun for writers and readers would that be?

Margaret L. Carter

Carter's Crypt