Thursday, December 05, 2024

Welcome to the Future

Recommended for fans and writers of near-future science fiction: YOU CALL THIS THE FUTURE? THE GREATEST INVENTIONS SCI-FI IMAGINED AND SCIENCE PROMISED (2007), by Nick Sagan, Mark Frary, and Andy Walker, systematically explores fifty examples of scientific, technological, and social developments predicted in fiction, examining which have already come true and which might plausibly do so. The possibilities range from those that already exist in some form (e.g., cloning, telemedicine, AI marketing, e-books, bionic organs, space tourism) all the way to concepts that may remain flights of fancy, such as warp drive and time travel.

Coverage of each topic is divided into three parts, headed Scientific History, Sighting in Sci-Fi, and Reality. Some also include a section titled "Tech Spec," such as facts about "truth serum" and an explanation of the procedure involved in cloning the famous sheep Dolly. Inventions and developments fall under the categories of Travel and Transportation (of course including flying cars and personal jetpacks); Computers, Cyborgs, and Robots; Communications; Weapons and Security; and the very broad field of Life, Health, and Sex. If the authors hadn't, apparently deliberately, restricted each category to ten items, doubtless that last category could have included much more. The text is highly readable and informative, and it's illustrated with numerous photos and drawings. Commendably, there's also an index. In addition to the book's entertainment value, it could serve as a quick reference source for SF authors.

Although published recently enough to reflect most of the cutting-edge technology we currently have, it leaves plenty of room for speculation about science-fiction devices and techniques that don't exist yet. J. D. Robb's "In Death" mystery series, set around 2060, has featured a combination handheld computer and portable phone called a "link" since the publication of the first novel in 1995. That vision has come true way ahead of schedule. On the other hand, I'm still waiting for the household cleaning robot Robert Heinlein promised we'd have in 1970, in his 1957 novel THE DOOR INTO SUMMER.

On the third hand, consider all the wonders we enjoy that weren't even imagined just decades ago, as celebrated in Brad Paisley's upbeat song "Welcome to the Future":

Welcome to the Future

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, September 26, 2024

Quantity Versus Quality?

Are quantity and quality incompatible strategies or goals? Not according to the observations in these and other similar essays:

Quantity Leads to Quality

The Origin of a Parable

The second article concerns a ceramics class whose teacher divided students into two groups. One would be graded on the quality of the work produced, while the other would be graded solely on amount of output. "Well, came grading time and a curious fact emerged: the works of highest quality were all produced by the group being graded for quantity. It seems that while the 'quantity' group was busily churning out piles of work -- and learning from their mistakes -- the 'quality' group had sat theorizing about perfection."

Similarly, we've heard of writers who endlessly polish the first paragraph, first page, or first three chapters to perfection, but when an agent or editor requests the full manuscript, the rest of the work doesn't measure up to the meticulously crafted opening.

The "quantity vs. quality" opposition seems to underlie the contemptuous -- and invalid -- dismissal of prolific writers as "hacks," as if high productivity automatically implies mediocrity. Stephen King used to publish two or three books per year and most years still produces at least two. Nora Roberts regularly releases two "J. D. Robb" mysteries per year and at least one "Nora Roberts" romance (probably more, but I don't keep close track of her output in that genre). Those figures may sound "prolific," but consider: A professional, full-time author, living solely on writing income, probably treats that vocation like a "job," writing several hours most days. Even a slow writer can produce at least 1000 words in two hours, and a faster one more like 1000 words per hour. Postulate only three hours per day, possibly a low estimate for a bestselling pro. 3000 words per day add up to 90,000 words in a month, a draft of a typical novel (if weekends aren't included, allow five to six weeks). A producer of "doorstops" like Stephen King, at that rate, might take two months for a first draft. With that time allotment, the writer could generate three novels in six months -- presumably not continuously, but with breaks in between -- with half the year free for revising, editing, polishing, marketing, and business minutiae. This kind of schedule, of course, assumes abundantly flowing story ideas, but from what I've read, the typical professional writer never has a shortage of those.

I'm reminded of Robert Heinlein's famous rules for success as an author: (1) Write. (2) Finish what you write. (3) Send it to an editor who might publish it. (4) Repeat number 3 until somebody buys it. I don't remember whether he addressed the question of when it's time to give up on a story -- maybe when you've exhausted all possible markets? However, I clearly recall the other "rule" he sometimes added: Never rewrite except to editorial order.

His point was that your time is better used in creating new stories than obsessively revising old material. He would probably agree with the "quantity over quality" proponents who maintain that each fresh project gives you a chance to learn something new about your craft. I would allow for one exception, though -- when you're deeply emotionally invested in one piece of work and have your heart set on getting it "right." My first vampire novel, DARK CHANGELING, conceived in embryonic form when I was thirteen or fourteen, went through multiple incarnations before I felt ready to submit it. After that, rejection feedback showed me its remaining flaws. Another extensive revision finally got it published. The protagonist, half-vampire psychiatrist Roger Darvell, continues to hold a special place in my heart. On the other hand, throughout that multi-decade process, I was writing and publishing other stuff, too.

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, June 20, 2024

Do Spoilers Really Spoil?

The latest issue of SKEPTICAL INQUIRER includes an article exploring whether advance exposure to spoilers actually makes the experience of reading a book or viewing a movie (the author mainly discusses films) worse, neutral, or better:

Savoring Uncertainty

The author, Stuart Vyse, starts by analyzing the difference between stories that provide a "clear resolution" and those that end with ambiguities unresolved. He notes, "Given the chaos of everyday life, it’s understandable that people are drawn to stories that make sense and provide closure." He links this tendency to a wish to believe we live in a just universe, offering the TV series LAW AND ORDER as a typical example. There I think he's absolutely right. The traditional detective novel is the most moral of genres. It promises that problems will be solved, questions answered, justice served, and criminals punished. In rare cases when the criminal escapes the grasp of the law, it's because the detective has determined his or her crime was justified. Vyse contrasts the traditional formula with the "noir" subgenre, in which ambiguity reigns, morality comes in shades of gray, and justice is far from guaranteed.

He then discusses the connection, if any, between enjoyment of ambiguity and tolerance of spoilers. He also goes into the definition of a spoiler, which can vary according to the individual experiencing it -- e.g., someone who's naive about the particular genre, such as a small child -- and to what extent the information constitutes "common knowledge." We'd all probably agree that the statute of limitations on spoilers has run out for mentioning that Romeo and Juliet die at the end of the play, for example. For a century or more, certainly since the first movie adaptations came out, everybody has known Dr. Jekyll and Mr. Hyde inhabit the same body. The phrase "Jekyll and Hyde" has become proverbial. When the novella was first published, however, that secret came as a shocking revelation near the end. Upon the original publication of DRACULA, readers who ignored reviews could have picked up the novel without suspecting the Count's true nature. Nowadays, even elementary-school kids know "Dracula" equals "vampire."

Vyse cites research on whether spoilers decrease appreciation for a work, increase it, or have no effect. Results of various studies yield different answers. I've noticed tolerance for spoilers ranges from the zero tolerance of fans such as one of our children, who avoids even book cover blurbs if possible, to my own attitude, sympathetic to a comment I read somewhere that a story capable of being "spoiled" by knowledge of what happens isn't worth spoiling. I admit exceptions, of course, such as knowing the killer before the big reveal in a murder mystery (on first reading, at least) or works in which the climactic twist is the whole point of the thing, such as THE SIXTH SENSE. I don't at all mind knowing in advance whether a major character will live or die; in fact, I sometimes sneak a peek at the end to relieve the stress of wondering. When the series finale of FOREVER KNIGHT aired, I was glad I'd read a summary before viewing the episode. When I actually saw the devastating final scene, having braced myself for the worst allowed me to feel it wasn't quite so bad as other fans had maintained. The fact that I've reread many of my favorite books over and over demonstrates that foreknowledge of the plot doesn't bother me. With that knowledge, I can relax into the pleasure of revisiting familiar characters.

In one of C. S. Lewis's works of literary criticism, he declares that the point of a startling twist in a book or any artistic medium isn't the surprise in itself. It's "a certain surprisingness." During subsequent exposures to the work, we have the fun of anticipating the upcoming surprise and enjoying how the creator prepares us for it. In a second or later reading of a mystery, for example, we can notice the clues the author has hidden in plain sight. We realize how we should have guessed the murderer and admire the author's skill at concealing the solution while still playing fair with the reader. (Along that line, I was astonished to hear Nora Roberts remark at a convention that she doesn't plan her "In Death" novels, written under the name "J. D. Robb," in advance. How can anyone compose a detective story without detailed plotting? She must have to do an awful lot of cleanup in revision.)

Learning the general plot of a novel or film prior to reading or viewing doesn't "spoil" it for me. I read or watch for the experience of sharing the characters' problems, dangers, and joys, discovering how they navigate the challenges of the story, and getting immersed in their emotional and interpersonal growth. Once the "narrative lust" (another phrase from Lewis, referring to the drive to rush through the narrative to find out what happens next) has been satisfied by the first reading or viewing, in future ones we can take a while to savor all the satisfying details we didn't fully appreciate the first time.

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, September 09, 2021

More Futuristic Forecasts

"Prediction is hard, especially about the future." Over the past week, I've been rereading LIFE AND TIME, a 1978 collection of essays by Isaac Asimov (some of them written as early as the 1960s). In contrast to the imaginative speculations in his fiction, these articles include serious forecasts about potential developments in technology and society.

Most strikingly, he anticipated the internet, a global repository of information anybody could draw upon. He envisioned everybody on Earth having a personal "channel" just as most people now have individual telephone numbers. We sort of have that system now, considering the unique IP address of each computer as a personal channel. Also, an individual tablet or smart phone serves the same function. Incidentally, J. D. Robb's "In Death" SF mystery series anticipated today's smart phone as the pocket "link" most people in her fictional future carry with them, long before such devices became common in real life.

Asimov hailed the future possibilities of lifelong, customized learning through the worldwide computer bank. Granted, many people benefit from the internet in that way, yet the satirical lament too often holds some truth: We have a network that gives us access to the entire accumulated knowledge of humanity, and we use it mostly for political rants and pictures of cats. Asimov suggested computer learning could overcome one of the main disadvantages of our educational system, the necessity for one teacher to instruct a large group of students, making it impossible to adjust lessons to the comprehension level, interests, and learning style of each individual. Computer education could effectively give each pupil a private tutor. Although we've recently had over a year of experience with online education, it's still been mainly a group-oriented activity. Advanced AI might fulfill Asimov's vision.

He also foresaw cashless monetary transactions, electronic transmission of documents, and virtual rather than in-person business meetings, all of which exist now. Unfortunately, his expectation that these developments would greatly reduce travel and its attendant pollution hasn't come to pass yet, probably because many employers are reluctant to embrace the full potential of remote work.

On some topics, he was too pessimistic. For example, he foresaw the world population reaching seven billion by the early 21st century, a point we've already passed. However, we're not forced to survive on synthetic nourishment derived from genetically engineered microbes, as he speculated might become necessary. We still eat a lavish variety of fresh foods. He seemed to believe a population of the current level or higher would reduce humankind to universal misery; while many of the planet's inhabitants do live in abject circumstances, Earth hasn't yet become a dreary anthill.

Not surprisingly, Asimov favored genetically modified agricultural products, which already exist, although not in some of the radically altered or enhanced forms he imagined. He also focused on the hope of cleaner energy, perhaps from controlled fusion or large-scale solar power. He proposed solar collectors in orbit, beaming energy down to Earth, far from a practical solution at present. And, as everyone knows, fusion-generated power is only twenty years away—and has been for a generation or more. :) Asimov predicted autonomous cars, which are now almost commercially viable. He also discussed the potential advantages of flying cars, apparently without considering the horror of city skies thronged with thousands of individual VTOL vehicles piloted by hordes of amateurs. Maybe self-driving vehicles would solve that problem, being programmed to avoid collisions.

To save energy on cooling and heating as well as to shelter inhabitants from severe weather, he proposed moving cities underground, as in his novel THE CAVES OF STEEL. This plan might be the optimal strategy for colonizing the Moon or Mars. I doubt most Earth citizens would accept it unless it becomes the only alternative to a worldwide doom scenario. Asimov, a devoted claustrophile, seemed to underestimate the value the average person puts on sunshine, fresh air, nature, and open space.

In general, he tended to be over-pessimistic about the fate looming over us unless we solve the problem of overpopulation right now (meaning, from his viewpoint, in the 1980s). As dire as that problem is in the long run, the decades since the publication of the essays in LIFE AND TIME demonstrate that Earth is more resilient than Asimov (and many prognosticators at that time) feared. Moreover, the worldwide birthrate is declining, although the shift isn't spread evenly over the world and, for the present, global population continues to rise through sheer momentum. Asimov analyzed the issue of whether a demographic pattern of old people far outnumbering younger ones would lead to a rigid, reactionary culture. He maintained that the mental stagnation traditionally associated with aging could be prevented by an emphasis on lifelong learning and creativity. He devoted no attention to the more immediate problem of declining birthrates that some nations already face—a young workforce that isn't large enough to support its millions of retired and often infirm elders. Encouraging immigration would help. (But that's "modpol"—shorthand for modern politics on one list I subscribe to—so I'll say no more about it.) In the long run, however, if and when prosperity rises and births decline worldwide, there won't be anyplace for a supply of young workers to immigrate from.

Asimov seemed over-optimistic about the technological marvels and wondrous lifestyle we'll all enjoy IF over-population and its attendant problems are conquered. He envisioned the 21st century as a potential earthly paradise. Judging from the predictions of such optimists over many decades, just as controlled fusion is always twenty years away, utopia is always fifty years away.

Margaret L. Carter

Carter's Crypt

Thursday, June 10, 2021

Plotting and Discovery

In the June issue of LOCUS, Kameron Hurley writes about how she gets from the beginning of a story to the end:

Endings and Beginnings

I'm always interested in the techniques used by other writers, and Hurley's current procedure isn't quite like any I've come across before. She describes how her method changed from free-writing in a process of discovery all the way through a piece of fiction to a hybrid of freeform and outlining. Early in her career, she "began every story with a scene, an inciting incident, a mood, a situation, and wrote until [she] figured out what happened next." She ended up with "dozens and dozens of beginnings, a few middles, and not a lot of endings." As she points out, it's hard to sell beginnings and middles to publishers.

Now she free-writes the beginning, works on it until the characters and their motivations become clear, and then plots the rest of the book. She needs to write a story opening that establishes all the vital "problems, relationships, tensions, and setups" before she can move forward. Judging from the rest of the essay, Hurley seems to be very much a character-driven rather than plot-driven writer. She finds that, for her, it's "impossible to write an ending unless the beginning works." She concludes the essay with the principle, "Get the first part right, and you'll find the ending was staring at you all along."

This method runs contrary to the common advice to write the ending first and then work out what needs to happen to get there. Even if a writer doesn't literally compose the final scene first, it's generally assumed that for effective fiction writing the author has to know the culmination all along. On the other hand, Nora Roberts, in answer to a question at a conference session where I heard her speak, claimed she didn't outline her Eve Dallas mysteries (published under the name "J. D. Robb"). She was as surprised by the twists and turns of the murder investigations as Lt. Dallas was. The notion of writing a detective story that way boggled my mind. Imagine the backtracking and revision that must be required to make all the clues fit the solution. Yet clearly this method works for Roberts, who dependably releases two Lt. Dallas "In Death" mysteries every year in addition to the Nora Roberts romances.

I'm one of those dedicated outliners Hurley mentions, who would find her old process, if not exactly "horrifying" as she puts it, distressingly inefficient. As a novice writer, I surged forward through my narratives on waves of inspiration. In my teens, writing short pieces, I found that approach could work well enough, in the sense that I finished stories. (Whether they were any good is a different matter.) Holding a short-story or novelette plot in my head from beginning to end wasn't hard. When I started trying to create novels, though, starting at the beginning and charging forward often resulted in my not reaching the end at all, because I'd get bogged down in the middle. I realized I needed to know where the plot was going and the steps along the road. For the same reason, although I used to occasionally write scenes out of order (as Diana Gabaldon, a bestselling "pantser," does), I've long since switched to linear scene-by-scene composition following my outline. With my early novel-writing attempts, if I yielded to the temptation of writing the most "exciting" incidents first, I tended to get bored with the necessary filling-in work. Some "pantsers" find an outline too limiting. I feel just the opposite; the outline liberates me from the fear of getting stuck in the middle and losing interest in the project.

Regardless of one's favorite method of composition, one of Hurley's discoveries has general application: Plot doesn't consist of "what happened to people"; it's "how people respond to and influence the world around them."

Margaret L. Carter

Carter's Crypt

Thursday, January 14, 2021

Sufficiently Advanced Technology

As we know, "Any sufficiently advanced technology is indistinguishable from magic" (Arthur C. Clarke). Conversely, many magical events in older fiction can be duplicated today by mainstream technology. A century and a half ago, someone who witnessed a translucent human figure floating in midair and emitting eerie moans would unquestioningly recognize it as a ghost. Now we'd respond with, "Cool special effect. I wonder how they did that?" Just such an apparition appears in Jules Verne's 1892 novel THE CARPATHIAN CASTLE, on the cusp of the shift between the two probable reactions. The local people think the vision of a dead opera star at the titular castle is her spirit, when in fact it's produced by a sound recording and a projected photograph.

In George du Maurier's 1894 novel TRILBY, the villain, Svengali, uses hypnotism to transform an ordinary girl who's tone-deaf into a famous singer. She can produce exquisite melodies only in a trance. When Svengali dies, she instantly becomes unable to sing. At the time of the novel's publication, little enough was known about hypnosis that this scenario doubtless looked scientifically plausible. Now that we know hypnosis doesn't work that way, Svengali's control over Trilby seems like magic, and to us the story reads as fantasy.

Several decades ago, I read a horror story about an author who acquires a typewriter that's cursed, possessed, or something. He finds that it corrects his typos and other minor errors. Gradually, this initially benign feature becomes scary, as the machine takes over his writing to an ever greater extent. He narrates his experience in longhand, since on the typewriter he couldn't even produce an example of a misspelling. At the time of publication, this story was an impossible fantasy. Now it would be merely a cautionary tale of a word processor with an excessively proactive auto-correct feature.

From the beginning of J. D. Robb's Eve Dallas science fiction mysteries, set in the late 2050s and early 2060s, almost everybody carries a handheld "link," a combination communications device and portable computer. When the earliest books in the series were published, that device was a futuristic high-tech fantasy. Now the equivalent has become commonplace in real life. But another tool Lt. Dallas uses in her homicide investigations still doesn't exist and remains problematic. Police detectives employ a handheld instrument reminiscent of Dr. McCoy's tricorder to gather data about murder victims. One of its functions is to pinpoint the precise time of death to the minute. That capability seems to run counter to the intrinsic limits of the decomposition processes being analyzed. Therefore, the exact-time-of-death function strikes me as irreducibly quasi-magical rather than scientific, something the audience has to accept without dissecting its probability, like the universal translator in STAR TREK.

The distinction between science and magic can get fuzzy when nominal SF has a fantasy "feel." Marion Zimmer Bradley's Darkover series takes place on an alien planet inhabited partly by descendants of shipwrecked Terran colonists. Strict "hard science" readers, however, might not accept psi powers as a real-world possibility, and the common people of Darkover themselves regard laran (psi gifts) as sorcery. Anne McCaffrey's Pern series, also set on a planet colonized by migrants from Earth, features fire-breathing, empathic, teleporting, time-traveling dragons. Although these creatures have an in-universe scientific explanation, they resemble the dragons of myth and legend. Robert Heinlein's novella "Waldo" blends SF and what many if not most readers would consider fantasy. The title character lives on a private space station because of his congenital muscular weakness. Yet he overcomes his disability by learning to control his latent psychic talent under the guidance of an old Pennsylvania hex doctor who teaches Waldo how to access the "Other World." Incidentally, "Waldo" offers an example of how even a brilliant speculative author such as Heinlein can suffer a lapse of futuristic imagination. Amid the technological wonders of Waldo's orbiting home, Heinlein didn't envision either electronic books or computer games; a visitor notices paper books suspended from the bulkheads and wonders how Waldo would manage to play solitaire in zero-G.

I've heard of a story (can't recall whether I actually read it) whose background premise states that, in the recent past, the wizards who secretly control the world revealed that all technology is actually operated by magic. The alleged science behind the machines was only a smoke screen. If such an announcement were made in real life, I wouldn't have much trouble accepting it. For non-scientists, some of the fantastic facts science expects us to believe—that we and all the solid objects around us consist of mostly empty space; that the magical devices we use to communicate, research, and write are operated by invisible entities known as electrons; that starlight we see is millions of years old; that airplanes stay aloft by mystical forces called "lift" and "thrust"; that culture and technology have advanced over millennia from stone knives and bearskins to spacecraft purely through human ingenuity—require as much faith in the proclamations of authorities as any theological doctrine does.

Margaret L. Carter

Carter's Crypt

Thursday, October 12, 2017

Villainous Motives

Supervillains generally aspire to destroy or conquer a realm, whether a country, a continent, the planet, or even an entire solar system or galaxy. In a kids' cartoon series current when our children were little (I don't remember which one it was), the league of villains had one goal, "to destroy the universe for their own gain." To me, a drive for conquest purely for the sake of power makes no more sense than that. Why would anybody bother? Who'd WANT to rule the world?

In the new Marvel TV series INHUMANS, there's a society of people with Inhuman powers living secretly on the moon. The antagonist, Maximus, stages a coup to depose his brother, the king, and become the ruler himself. Maximus has several plausible reasons for this goal: As a child, he wanted the kingship, while his brother, the destined heir, had no great desire for the crown. Maximus grew up without Inhuman powers, so others looked down on him; therefore, he's driven to seize power in compensation for his "inferiority." Also, he seems to hold a sincere belief that his brother's policies are bad for Inhuman society and his own rule would benefit their people.

A three-dimensional villain needs plausible motives, especially supervillains with fantastic powers and global or cosmic ambitions. According to an often-cited principle, every villain is the hero of his own story. Why would he or she want to conquer a country, a continent, or the world? A sheer maniacal lust for power isn't enough of a motive to make a credible antagonist. Maybe the character truly believes himself or herself to be the only one who can rule wisely for the good of the country or world. Maybe the character perceives an outside threat to his or her people and preemptively expands his or her dominion before the "threat" can strike first. Or perhaps the antagonist craves power in compensation for some personal hurt suffered in the past or from a secret fear of his or her own inadequacy. If the ruler of the "threatening" country or planet happens to be a relative of the antagonist (as many of the European royal families at the time of World War I were related through Queen Victoria), family jealousies and resentments could contribute to the villain's drive for conquest. On a smaller scale, why did the evil King Ahab (in the Bible) have a neighbor framed for a fictitious crime and executed in order to seize the neighbor's vineyard? Why would a king feel the need to commit such a petty theft? Could it be that Ahab did this BECAUSE he was king and, perhaps, feared for his position when constantly challenged by the prophet Elijah? Maybe Ahab wanted to prove, "I'm the king, so I can have anything I want."

To me, a drive to become a multimillionaire doesn't feel any more credible as a motive than a craving for absolute power. One person can usefully possess only a certain number of houses, cars, or boats. Even at the most rarefied levels of wealth, there has to be an upper limit to the amount one can spend on food, drink, clothes, jewelry, or collectibles. After a certain point, money probably becomes just a means of keeping score. Billionaire Roarke in J. D. Robb's Eve Dallas series—a good guy (although a former crook) rather than a villain—seems to enjoy acquiring more money on the scorekeeping principle, as a move in a game. Also, he does productive things with his wealth; when he buys a building or a company, he makes it better. Maybe a supervillain driven by a craving for money has personal reasons to value the "score" and therefore wouldn't be satisfied even by infinite wealth. Or maybe, deep inside, he's insecure, seeking wealth to make him feel safe, and never able to accumulate enough to fulfill that need. In effect, it comes back to using money as a means to power.

Along the same line, why do rich, powerful men sexually prey on their employees, when they could find any number of women who'd gladly welcome their advances without being forced? Probably because it's the display of power in itself these men crave. It's all incomprehensible to me, so to believe in a power-hungry villain of any kind, I need to know what underlying drive produces this kind of motivation.

Margaret L. Carter

Carter's Crypt