
Thursday, January 04, 2024

AI as a Bubble

Cory Doctorow's latest LOCUS column analyzes AI as a "tech bubble." What Kind of Bubble Is AI?

Although I had a vague idea of what economists mean by "bubble," I looked it up to make sure. I thought of the phenomenon as something that expands quickly and looks pretty but will burst sooner or later. The Wikipedia definition comes fairly close to that concept: "An economic bubble (also called a speculative bubble or a financial bubble) is a period when current asset prices greatly exceed their intrinsic valuation, being the valuation that the underlying long-term fundamentals justify." The term originated with the South Sea Bubble of the early eighteenth century, involving vastly inflated stocks. The Dutch "tulip mania" of the seventeenth century offers another prominent example.

Doctorow takes it for granted that AI fits into this category. He begins his essay with, "Of course AI is a bubble. It has all the hallmarks of a classic tech bubble." He focuses on the question of what KIND of bubble it is. He identifies two types, "The ones that leave something behind, and the ones that leave nothing behind." Naturally, the first type is desirable, the second bad. He analyzes the current state of the field with numerous examples, yet always with the apparent underlying assumption that the "bubble" will eventually "pop." Conclusion: "Our policymakers are putting a lot of energy into thinking about what they’ll do if the AI bubble doesn’t pop – wrangling about 'AI ethics' and 'AI safety.' But – as with all the previous tech bubbles – very few people are talking about what we’ll be able to salvage when the bubble is over."

This article delves into lots of material new to me, since I confess I don't know enough about the field to have given it much in-depth thought. I have one reservation about Doctorow's position, however -- he discusses "AI" as if it were a single monolithic entity, despite the variety of examples he refers to. Can all possible levels and applications of artificial intelligence be lumped together as components of one giant bubble, to endure or "pop" together? Maybe those multitudes of different applications are what he's getting at when he contemplates "what we'll be able to salvage"?

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, November 23, 2023

Thanksgiving and Traditions

Happy American Thanksgiving! Ordinarily we spend the weekend after Turkey Day at ChessieCon, formerly Darkover Grand Council, which has traditionally occurred every Thanksgiving weekend for several decades. Last November they held their first in-person con since 2019. At that time it moved from the hotel where it had been held for many years to a different one in the same general area, north of Baltimore. Attendance turned out to be dismayingly low. Doubtless in part for that reason, the committee decided to cancel this year's event and take time off to regroup and rethink the con's future. On top of that, the hotel it had moved to abruptly closed a few months ago. Where will ChessieCon go next, if anywhere? Will we lose this venerable local SF/F tradition?

Thanksgiving traditions typically include the familiar turkey and its required accessories, e.g., stuffing, potatoes, and gravy. Some households, however, depart from the conventional menu for more adventurous fare. For instance, our second son and his family eschew turkey in favor of entrees such as homemade sushi. Many Americans also consider TV football essential on that day.

In mainline Christian churches, the first Sunday of Advent, the build-up to Christmas, falls on or near the first weekend of December. Most of us now accept as inevitable and proper that the winter holiday shopping and decorating season begins on the day after Thanksgiving. However, when Christmas and other winter-themed displays in stores overlap with Halloween merchandise, and internet merchants advertise "Black Friday" sales starting over a week early, many of us think the extension of the season is going too far. Commercialization of Christmas gifting, though, started almost simultaneously with the invention of the family-centered Christmas as we know it in the nineteenth century. Moreover, people have been complaining about it almost as long.

The popular film A CHRISTMAS STORY, aka the BB gun movie, set around 1940, based on Jean Shepherd's fictionalized memoir IN GOD WE TRUST: ALL OTHERS PAY CASH, illustrates how even before the middle of the twentieth century intensive holiday gift advertising and department store Santas already pervaded the Christmas-season consciousness of American children. Our parents and grandparents didn't experience some newer Christmas traditions that existed in our childhoods and those of our children, because those customs depend on new technology, mainly television. Many people watch the Macy's Thanksgiving Day parade, culminating in the arrival of Santa Claus. They also enjoy favorite Christmas specials over and over. Nowadays we don't have to wait for those treasured memories to show up in reruns; we can view them on home video media or streaming services at will. I always watch at least two versions of A CHRISTMAS CAROL every year, usually more. We can also look forward to original programs reliably appearing every December, such as one of my favorites, the annual new CALL THE MIDWIFE Christmas episode.

If our grandparents had been able to peer into the future and note these novel customs, they might have disdained them as soulless products of technology, violating the true spirit of the season. For us and our children, recurring winter holiday movies and TV shows simply became an expected part of the celebration, cherished traditions as much as the tree, the feast, and the presents.

When some earthlings live in artificial habitats on the Moon or Mars or in generation-spanning starships, what holiday traditions will they bring along, and what fresh customs will life in extraterrestrial environments demand? It seems likely that even in locations vastly distant from Earth's solstice cycles, human beings will cling to the core elements of their seasonal celebrations.

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, June 15, 2023

The Internet Knows All

This week I acquired a new HP computer to replace my old Dell, which had started unpredictably freezing up at least once per day. Installing Windows 11 didn't fix it. It had reached the point where even CTRL-ALT-DEL didn't unfreeze it; I had to turn it off manually and restart every time it failed. It feels great to have a reliable machine again.

Two things struck me about the change: First, the price of the new one, bundled with a keyboard and mouse, was about $500. Our first computer, an Apple II+ purchased as a gift at Christmas of 1982, cost over $2000 with, naturally, nowhere near the capabilities of today's devices. No hard drive, no Windows or Apple equivalent thereof, and of course no internet. And in that year $2000 was worth a whole lot more than $2000 now. Imagine spending today's equivalent in 2023 dollars for a home electronic device. Back then, it was a serious financial decision that put us into debt for a long time. Thanks to advances in technology, despite inflation some things DO get cheaper. An amusing memory: After unveiling the wondrous machine in 1982, my husband decreed, "The kids are never going to touch this." LOL. That rule didn't last long! Nowadays, in contrast, we'd be lost if we couldn't depend on our two youngest offspring (now middle-aged) for tech support.

The second thing that struck me after our daughter set up the computer: How smoothly and, to my non-tech brain, miraculously, Windows and Google Chrome remembered all my information from the previous device. Bookmarks, passwords, document files (on OneDrive), everything I needed to resume work almost as if the hardware hadn't been replaced. What a tremendous convenience. On the other hand, it's a little unsettling, too. For me, the most eerie phenomenon is the way many websites know information from other websites they have no connection to. For example, the weather page constantly shows me ads for products I've browsed on Amazon. Sometimes it seems that our future AI overlords really do see all and know all.

In response to recent warnings about the "existential threat" posed by AI, science columnist Keith Tidman champions a more optimistic view:

Dark Side to AI?

He points out the often overlooked difference between weak AI and strong AI. Weak AI, which already exists, isn't on the verge of taking over the world. Tidman, however, seems less worried about the subtle dangers of the many seductively convenient features of the current technology than most commentators are. As for strong AI, it's not here yet, and even if it eventually develops human-like intelligence, Tidman doesn't think it will try to dominate us. He reminds us, "At the moment, in some cases what’s easy for humans to do is extraordinarily hard for machines to do, while the converse is true, too." If this disparity "evens out" in the long run, he nevertheless believes, "Humans won’t be displaced, or harmed, but creative human-machine partnerships will change radically for the better."

An amusing incidental point about this article: On the two websites I found by googling for it, one page is headlined, "There Is Inevitable Dark Side to AI" and the other, "There Is No Inevitable Dark Side to AI." So even an optimistic essay can be read pessimistically! (Unless the "No" was just accidentally omitted in the first headline. But it still looks funny.)

Margaret L. Carter

Carter's Crypt

Thursday, April 13, 2023

How Will AI Transform Childhood?

According to columnist Tyler Cowen, "In the future, middle-class kids will learn from, play with and grow attached to their own personalized AI chatbots."

I read this essay in our local newspaper a couple of weeks ago. Unfortunately, I wasn't able to find the article on a site that didn't require registering for an account to read it. The essence of its claim is that "personalized AI chatbots" will someday, at a not too far distant time, become as ubiquitous as pets, with the advantage that they won't bite. Parents will be able to control access to content (until the kid learns to "break" the constraints or simply borrows a friend's less restricted device) and switch off the tablet-like handheld computers remotely. Children, Cowen predicts, will love these; they'll play the role of an ever-present imaginary friend that one can really interact with and get a response.

He envisions their being used for game play, virtual companionship, and private AI tutoring (e.g., learning foreign languages much more cheaply than from classes or individual tutors) among other applications. I'm sure our own kids would have loved a device like this, if it had been available in their childhood. I probably would have, too, back when dinosaurs roamed the Earth and similar inventions were the wild-eyed, futuristic dreams of science fiction. If "parents are okay with it" (as he concedes at one point), the customized AI companion could be a great boon—with appropriate boundaries and precautions. For instance, what about the risks of hacking?

One thing that worries me, however, isn't even mentioned in the article (if I remember correctly from the paper copy I neglected to keep): The casual reference to "middle-class kids." The "digital divide" has already become a thing. Imagine the hardships imposed on students from low-income families, who couldn't afford home computers, by the remote learning requirements of the peak pandemic year. What will happen when an unexamined assumption develops that every child will have a personal chatbot device, just as many people and organizations, especially businesses and government offices, now seem to assume everybody has a computer and/or a smart phone? (It exasperates me when websites want to confirm my existence by sending me texts; I don't own a smart phone, don't text, and don't plan to start.) Not everybody does, including some who could easily afford them, such as my aunt, who's in her nineties. Those assumptions create a disadvantaged underclass, which could only become more marginalized and excluded in the case of children who don't belong to the cohort of "middle-class kids" apparently regarded as the norm. Will school districts provide free chatbot tablets for pupils whose families fall below a specified income level? With a guarantee of free replacement if the thing gets broken, lost, or stolen?

In other AI news, a Maryland author has self-published a horror book for children, SHADOWMAN, with assistance from the Midjourney image-generating software to create the illustrations:

Shadowman

In an interview quoted in a front-page article of the April 12, 2023, Baltimore Sun, she explains that she used the program to produce art inspired by and in the style of Edward Gorey. As she puts it, "I created the illustrations, but I did not hand draw them." She's perfectly transparent about the way the images were created, and the pictures don't imitate any actual drawings by Gorey. The content of each illustration came from her. "One thing that's incredible about AI art," she says, "is that if you have a vision for what you're wanting to make it can go from your mind to being." And, as far as I know, imitating someone else's visual or verbal style isn't illegal or unethical; it's one way novice creators learn their craft. And yet. . . might this sort of thing, using software "trained" on the output of one particular creator, skate closer to plagiarism than some other uses of AI-generated prose and art?

Another AI story in recent news: Digidog, a robot police K-9 informally known as Spot, is being returned to active duty by the NYPD. The robot dog was introduced previously but shelved because some people considered it "creepy":

Robot Dog

Margaret L. Carter

Carter's Crypt

Thursday, March 31, 2022

SF Versus Fantasy

At this year's ICFA (which I wrote about last week), one of the free goodies distributed at meals was a copy of the March/April 2020 ASIMOV'S magazine. It happened to include a provocative article by David D. Levine called "Thoughts on a Definition of Science Fiction." The author takes an approach to distinguishing science fiction from fantasy that never occurred to me before.

Of course, this perennial and never-settled question has many proposed answers. And many works cross genre boundaries; SF is a "fuzzy set." Anne McCaffrey's Pern and Marion Zimmer Bradley's pre-rediscovery Darkover, although science fiction, have a fantasy "feel." S. M. Stirling's Emberverse series, beginning with DIES THE FIRE, clearly near-future or alternate-history SF, also includes something like magic. Diane Duane's Young Wizards series focuses on the protagonists' learning and using magic—which they prefer to label "wizardry" to avoid the implication that it can do anything, unbounded by rules—yet they visit distant planets and make friends with extraterrestrials. Cases like these are part of why the term "speculative fiction" is so useful.

Levine suggests that the distinction between fantasy and science fiction rests on a fundamental difference between worldviews. Science fiction arises from an Enlightenment worldview and fantasy from a pre-Enlightenment worldview. In SF, "the universe is logical, predictable, and understandable, governed by rules that are impersonal and have no moral dimension." Fantasy, on the other hand, inhabits a universe that "has a moral compass, and is governed by rules that, though they may be understandable, are not necessarily always consistent, logical, or predictable in their application." For example, fantasy contains swords that can be drawn only by the "pure in heart" (a moral dimension). As an extension of this definition, Levine focuses on the central importance of "the means by which characters affect the world," whether by technology or by magic. Using this principle, he maintains that the later Star Wars films, after the original movie, slip further and further into fantasy territory because of the way the Force becomes more powerful and less scientifically plausible (e.g., action at a distance).

While I admire his theory, it doesn't align completely with my own concept of the SF-fantasy divide. I've always seen the distinction as—perhaps too simplistically—primarily a matter of authorial intent as it appears on the surface of the text. If the text claims a scientific rationale for its phenomena, it's science fiction. If not, it's fantasy. Edgar Rice Burroughs's interplanetary adventures count as science fiction, even if most of the science is obsolete. Randall Garrett's Lord Darcy mysteries, set in an alternate-history England in a world where magic plays a commonplace role in society, are fantasy even though the rules of magic are systematic and predictable. What about works such as Madeleine L'Engle's A WRINKLE IN TIME and its sequels and spinoffs, invoking scientific principles, featuring visits to other worlds, and marketed as SF, but containing some elements of apparent magic as well as a religious worldview? Or C. S. Lewis's Space Trilogy, wherein the superhuman intelligences ruling the other planets are also identified as angels? The Wild Sorceress trilogy, co-written by my husband and me, starts as apparent fantasy, to be revealed as SF at the end of the third book. Well, that's where the flexible terms "science fantasy" and "speculative fiction" come in handy.

Margaret L. Carter

Carter's Crypt

Thursday, March 10, 2022

Big Tech Tyranny?

Cory Doctorow's March LOCUS column discusses tech tycoons from the perspective of monopoly and world domination. Well, that phrase may be a bit exaggerated but not totally inapplicable, considering his term "commercial tyrant":

Vertically Challenged

Is meritocracy a "delusion"? Are people such as Mark Zuckerberg (founder of Facebook) unique geniuses, or did they just get lucky? One might maintain that some sort of genius is required to recognize opportunities and take advantage of the "luck," but that's beside Doctorow's point. He argues against "vertical integration" and in favor of "structural separation." Fundamental antitrust principles should forbid mega-corporations from competing with the companies to which they sell services. "Amazon could offer virtual shelf space to merchants, or it could compete with those merchants by making its own goods, but not both. Apple could have an app store, or it could make apps, but not both."

It's easy to see his point. It would be better if Google could somehow be prevented from giving preference in search results to entities in which it has a financial interest. On the other hand, more ambiguous "liminal" cases exist, a point Doctorow himself does acknowledge. For example, "Amazon might say it gives preferential search results to businesses that use its warehouses because it can be sure that those items will be delivered more efficiently and reliably, but it also benefits every time it makes that call." Granting the second half of that sentence, I'm still not sure this practice is a bad thing. Given a choice between two identical products of equal price, I DO tend to choose the one labeled "Fulfilled by Amazon" for that very reliability, as well as speed of delivery. As for splitting off Amazon's publishing services, as he advocates, I'd be dubious. I like the way Kindle self-publishing currently works.

Doctorow also brings up problems that may require "structural integration" rather than separation, to prevent Big Tech from evading its legitimate responsibilities. He tentatively calls for "a requirement that the business functions that harm the rest of us when they go wrong be kept in-house, so that the liabilities from mismanaging those operations end up where they belong." Is there a simple answer to the dilemma of maintaining the conveniences we enjoy while preventing the abuses?

Margaret L. Carter

Carter's Crypt

Thursday, January 13, 2022

Luddites and SF

The term "Luddite" is typically applied to people who oppose technological advances. That's basically what I've always assumed the word to mean. Well, Cory Doctorow's latest LOCUS column corrects that misconception:

Science Fiction Is a Luddite Literature

Luddites were textile workers belonging to a secret society in England in the early nineteenth century, best known for destroying the newfangled equipment in textile mills. According to Doctorow, however, their primary objective wasn't the destruction of machinery. That was "their tactic, not their goal." Rather, their goal was "to challenge not the technology itself, but rather the social relations that governed its use." Doctorow details some of the local and global changes that resulted from mechanization of the textile industry. Factory owners could have used the new-model looms to improve employment conditions for skilled craftspersons. Instead, employers chose to hire fewer workers at lower wages. The Luddites imagined and agitated in favor of a different path for the industrial revolution. Doctorow gives several examples of how we, today, "are contesting the social relations surrounding our technologies."

New technology always generates social change, often with unanticipated consequences. Robert Heinlein's "The Roads Must Roll" is one story that pioneered the type of science fiction focusing not on aspects such as the technical details of how automated roads work, but on how their existence affects the society that uses them. An obvious real-world example, the automobile, had easily predicted effects on individuals' freedom of movement and the decline of passenger railroads, but few people probably foresaw the impact on courtship customs and sexual mores. With cars, the balance of power in courtship shifted from the girl, who invited the boy to "call on" her in her parents' home, to the boy, who invited the girl on a "date" outside her parents' control. And of course the automobile gave young couples more freedom for sexual experimentation than they had under the old system. Consider the telephone: In his final novel, TO SAIL BEYOND THE SUNSET, Heinlein has the narrator's father, a doctor, predict at the turn of the nineteenth to the twentieth century that telephones in the home would mean the end of house calls. When the only way to get a home doctor visit was to send someone in person to summon him, patients expected house calls only in emergencies. Once they could contact their family physician by picking up the phone, they would call for less and less urgent reasons, and doctors would soon refuse to cooperate. (This decline probably happened more slowly than Heinlein anticipated; I have a dim memory of the doctor's visiting me at home when I had measles around age five, the only time I've personally encountered a doctor's house call.)

Like the mechanical looms in the early stage of the industrial revolution, most if not all new technologies benefit some people while disadvantaging others. The ease of paying bills and performing other transactions online provides great convenience for most while disadvantaging those who can't afford a computer and internet connection or, like my ninety-year-old aunt, simply decline to adopt those "conveniences." Businesses now routinely expect customers to have internet access and hence make transactions more difficult for those who don't. Cell phones have made fast connections everywhere all the time routine, so that people are often expected to be instantly available whether they want to be or not. Moreover, as pay telephones have been used less and less, they've tended to disappear, so when anybody does need one—whether because he or she doesn't have a mobile phone or because the battery has run down or they're in a dead zone with no cell service—a phone booth is almost impossible to find. I actually "met" a person nagging me for contact information on Goodreads who accused me of lying when I said I didn't own a smart phone. (Yes, I have a cell phone for urgent or otherwise time-sensitive communication away from home, but it's a plain old "dumb" flip model.)

According to Doctorow, science fiction is a Luddite genre because both the historical movement and the fiction "concern themselves with the same questions: not merely what the technology does, but who it does it for and who it does it to."

Margaret L. Carter

Carter's Crypt

Thursday, September 30, 2021

Planes, Trains, and Automobiles

The October 2021 issue of NATIONAL GEOGRAPHIC features a pair of lead articles about "green" power for aircraft and cars, mainly electric. The cover optimistically proclaims, "The Revolution Is Here." The issue abounds with information about the past as well as the future of electric-powered transportation. I was surprised to learn that in 1900 electric cars held over one-third of the market. Gasoline-powered internal combustion automobiles came in third, after steam (!) and electric. Then as now, the main obstacles to widespread acceptance of electric cars were battery weight and range. On the other hand, electric vehicles are quiet and emissions-free, and they have fewer moving parts to maintain. In the early twentieth century, "cheap oil and paved roads" enabled the internal combustion engine to dominate the market by the 1930s. Now auto manufacturers are embracing EVs with fresh enthusiasm, not only the big names such as Tesla, but even Volkswagen. Driving range and charging times are improving as prices decrease to become comparable to the cost of gasoline-fueled cars. Driverless, electric-powered delivery vehicles may eventually become commonplace. Meanwhile, Amazon and FedEx are switching their fleets to EVs.

This NATIONAL GEOGRAPHIC's second article on the energy revolution deals with flight. Commercial airliners produce vast quantities of fossil-fuel pollution. France is considering a ban on all domestic flights to destinations that can be reached by train in less than two and a half hours. Implementing that policy, of course, would imply a passenger rail system adequate to efficiently serve the needs of the traveling public. In most of the U.S., a situation like that is an incredible fantasy. Peter Kalmus, a NASA climate scientist, insists on "the hard fact" that "we don't need to fly." What world does he live in? Most vacation travelers crossing the Atlantic or Pacific can't afford the cost of a cruise ship or the extra time off work for the round trip by sea. If you have to get to the opposite coast of the U.S. for an emergency such as a family funeral, you certainly do need to fly; you can't drive that distance in a day or two.

For large aircraft, electric power runs into the problem that a battery of adequate size would weigh as much as the plane itself. One type of clean airplane fuel being contemplated is liquid hydrogen. For small aircraft, however, electric engines can succeed. A California company named Wisk is one of several working on designs for "air taxis," self-flying, vertical-takeoff-and-landing small electric aircraft. In fact, our long-awaited flying car may soon become a reality, although not owned and operated by individual consumers (thank goodness, considering the typical level of driving skill on the roads).

Each proposed solution, naturally, carries problems of its own. But, as Isaac Asimov maintained, the solution to such difficulties isn't to give up on technology but to develop better technology. If you don't subscribe to NATIONAL GEOGRAPHIC, do try to pick up a copy of the October issue at the library or newsstand, especially if you're a fan and/or writer of near-future SF.

Margaret L. Carter

Carter's Crypt

Thursday, September 02, 2021

Failures of Prediction

To dispose of one point up front, of course we know the purpose of science fiction isn't literally to predict future technology and social structures. Its speculations typically explore hypothetical paths that may or may not become reality, some of which are so extreme nobody seriously expects their fulfillment. They're extrapolations that answer "What if. . . ?" or "If this goes on. . . ."

Nevertheless, it's entertaining to contemplate some of the future technological and cultural developments in older SF works that drastically missed the mark. One classic example shows up in Robert Heinlein's HAVE SPACE SUIT, WILL TRAVEL, where human colonies on the moon coexist with slide rules. In I WILL FEAR NO EVIL, the fabulously wealthy protagonist has to wait several days for the result of her pregnancy test, although at the time of the novel's publication, such a test could be completed in less than half an hour. (Ordinary patients had to wait only because of lab backlogs. Now, of course, we have instant home pregnancy tests, which ought to exist in the future setting of I WILL FEAR NO EVIL.) I don't count Heinlein's transplantation of 1950s family structures into the spacefaring future in his "juveniles" as a failure of prediction, because it's obvious he was simply bowing to the constraints of the market in those books. His posthumously published utopia FOR US, THE LIVING demonstrates how early in his career he envisioned alternative marriage and sexual customs.

Isaac Asimov did foresee the hand-held calculator, but that story imagines a future in which people have become so dependent on calculators that even scientists with advanced degrees don't know how to do arithmetic the old-fashioned way. I can't believe that's meant as a serious prediction rather than a fanciful thought experiment. I suspect the same about a story in which people aren't taught to read, since computers and robots convey all information (apparently -- it's not quite clear) in audible speech. (So what about deaf users?) It comes as an incredible revelation to the two boys in this tale that their recent ancestors could decode "squiggles" on paper.

Recently I reread a collection of Asimov's robot short stories, along with his novel ROBOTS OF DAWN, and was amused at some of the predictive "fails" perpetrated by such a visionary author. For one thing, the robots are almost all roughly humanoid-shaped, supposedly because the public would feel less wary of them in that form. The plan doesn't work; throughout the series, most Earth people (as opposed to Spacers, who tend to embrace the convenience of artificial servants) fear robots, and it's pretty clear that the crude approximation of human shape makes the animated machines more distrusted, not less. It would make more sense to design robots' bodies for maximum efficiency in performing their particular tasks, as real-life industrial robots usually are. Furthermore, to learn new information robots are shown reading books rather than having the contents uploaded directly into their positronic brains. Very odd from a present-day perspective: when astronomers in one story want to identify extrasolar planets likely to harbor life, they teach a robot to perform the analysis rather than programming a stationary computer to carry out the search. This piece, of course, is set in the distant future, yet we have methods of finding Earthlike extrasolar planets right now.

In terms of social change, Asimov's robot series includes elements that require generous suspension of disbelief. For instance, THE CAVES OF STEEL emphasizes how overcrowded Earth has become. As one consequence, personal hygiene occurs in what amount to huge communal bathhouses, called Personals. All right, if overpopulation means apartments are so small it makes more sense to centralize baths, showers, and related functions, I can accept that. But it's strongly implied that individual dwellings don't have toilet facilities, which would imply no running water! This assumption is confirmed in ROBOTS OF DAWN, where Earth investigator Elijah Baley is surprised to find one-person Personals in private homes. Asimov must not have thought this through. In a technologically advanced society hundreds of years in the future, people don't have any means of washing at home? And when "nature calls" in the middle of the night or first thing in the morning, they use -- what? Chamber pots? Family structures on the Spacer worlds, at least the two we see in the series, are also problematic. One world has developed a culture in which people abhor personal contact so deeply that they never touch or even meet in person if they can help it. Almost all contact happens holographically. Children are brought up in group care homes, where they're gradually trained out of the crude desire for physical proximity. Even spouses don't live together. They have sex only for reproduction, and most people detest that "duty," yet the obvious alternative of universal artificial insemination isn't embraced. On the planet Aurora in ROBOTS OF DAWN, casual recreational sex is commonplace, children are the only purpose of formal marriage, the young are reared in communal nurseries and may not even know the identities of their parents, and sexual jealousy allegedly doesn't exist. Asimov must have subscribed to the early and mid-20th-century belief that human nature is infinitely malleable. (For a lucid, detailed, entertainingly readable rebuttal of that notion, see Steven Pinker's THE BLANK SLATE.) Consider how recognizable to us are the portrayals of marriage, family, and sexuality in the early books of the Old Testament, thousands of years ago. Are a few more centuries and the relatively minor change of venue to different planets really likely to inspire radical changes in those areas of human interaction?

Famously, when later series in the Star Trek universe were developed, the producers had to cope with the fact that some technology in the original series had already become outdated, notably the flip-phone communicators. On the other hand, some SF works predict too ambitiously, as in the proverbial plea, "Where's my flying car?" The classic 2001: A SPACE ODYSSEY envisioned a level of routine space travel in 2001 that we haven't attained yet. Heinlein's THE DOOR INTO SUMMER promised all-purpose housecleaning robots by 1970. I wish!

Of course, many elements in current print and film SF that seem to us like cutting-edge predictions may turn out to be laughably wrong. As far as dystopian visions such as THE HANDMAID'S TALE are concerned, we can fervently hope so. However, I still want my autonomous housecleaning robot. I'm pleased with my Roomba, but it's only a start.

Margaret L. Carter

Carter's Crypt

Thursday, July 15, 2021

Monopolies and Interoperability

Another LOCUS article by Cory Doctorow on monopolies and trust-busting:

Tech Monopolies

He begins this essay by stating that his opposition to monopolies isn't driven by competition or choice as ends in themselves. He cares most about "self-determination." By this he means the individual consumer "having the final say over how you live your life." When a small handful of companies controls any given field or industry, customers have only a limited range of products or services to choose among, preselected by those companies, even if this limitation remains mostly invisible to the average consumer. Not surprisingly, Doctorow focuses on this constraint as imposed by Big Tech. He recaps the growth of "the modern epidemic of tolerance for monopolies" over the past forty years. In the present, technology giants tend to crush small competitors and merge with large ones.

To some extent, this tendency—e.g., the situation Doctorow highlights in which everybody is on Facebook because everybody else is, in a feedback loop of expansion—provides a convenience to consumers. I'm glad I can find just about anyone I want to get in touch with on Facebook. As a result of such "network effects," a system becomes more valuable the more users it has. As a reader and a bibliographer, I don't know how I'd manage nowadays if Amazon didn't list almost every book ever published. I resent the brave new broadcasting world in which I have to pay for several different streaming services to watch only a couple of desired programs on each. I LIKED knowing almost any new series I wanted to see would air on one of our hundreds of cable channels. (Yes, we're keeping our cable until they pry it out of my cold, dead remote-clicking hand.) On the other hand, I acknowledge Doctorow's point that those conveniences also leave us at the mercy of the tech moguls' whims.

Half of his article discusses interoperability as a major factor in resisting the effects of monopolies. Interoperability refers to things working together regardless of their sources of origin. All appliances can plug into all electrical outlets of the proper voltage. Any brands of light bulbs or batteries can work with any brands of lamps or electronic devices. Amazon embraces interoperability with its Kindle books by allowing customers to download the Kindle e-reading app on any device. Likewise, "all computers are capable of running all programs." For self-published writers, services such as Draft2Digital offer the capacity to get books into a wide range of sales outlets with no up-front cost. Facebook, on the other hand, forecloses interoperability by preventing users from taking their "friends" lists to other services, a problem that falls under "switching costs." If it's too much trouble to leave Facebook, similar to the way it used to be too much trouble to change cell phone providers before it became possible to keep your old phone number, consumers are effectively held hostage unless willing to pay ransom in the form of switching costs (monetary or other).
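The data-portability half of this argument can be made concrete in a few lines of code. As a purely illustrative sketch (the contacts and field names are invented, and no real service's export format is implied), storing a friends list in an open format like JSON is what lets any competing service import it:

```python
import json

# A contact list exported in an open, documented format that any
# competitor could read -- the data-interchange equivalent of a
# standardized electrical plug. (Names and fields are invented.)
friends = [
    {"name": "Alice", "handle": "@alice"},
    {"name": "Bob", "handle": "@bob"},
]

# Serialize for export.
exported = json.dumps(friends, indent=2)

# A rival service needs only the same open format to import the list;
# without an export step like this, leaving means rebuilding the list
# by hand -- which is precisely the "switching cost."
imported = json.loads(exported)
assert imported == friends
print(f"Imported {len(imported)} contacts intact")
```

The round trip itself is trivial; the point is that when the export step is forbidden, as with Facebook's friends lists, the switching cost becomes rebuilding one's data by hand.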

Doctorow concludes, however, with the statement that the fundamental remedy for "market concentration" isn't interoperability but "de-concentrating markets." Granting a certain validity to his position, though, how far would we willingly shift in that direction if we had to give up major conveniences we've become accustomed to?

Margaret L. Carter

Carter's Crypt

Thursday, April 08, 2021

Starting Afresh

Kameron Hurley's newest LOCUS column discusses making a fresh start with the turn from winter to spring:

Plotting the Way Forward

Noting that the ancient Romans marked the New Year in March rather than January, Hurley muses about the signs of spring that show up in March. This year, she finds particular hope in the change of seasons because a potential end to the COVID crisis may be in sight. She ponders what is meant by "returning to normal": What will go back to the way it was? What will have changed permanently? As she puts it, “'normal' is a shifting target. After the last year, our world will not be quite the same."

One change she welcomes is the decline of shopping malls. Here I disagree. I'm a big fan of malls, even though with online ordering I haven't frequented our local mall in recent years nearly so much as I used to (especially after its chain bookstore closed). Sure, a green-space town center with a cluster of shops, within easy walking distance of home, would be lovely. But that's not likely to sprout up out of nowhere near us (all the ground within walking distance being occupied by houses or, if one has the stamina to hike one-point-three miles to the main road, existing stores). Nor does it describe the neighborhoods where I spent the years between age eight and moving out of my parents' home to get married. We lived in the suburbs. There was nowhere to walk except other houses and, a longish trek from our home, a major highway at the entrance to the development. A very long bike ride could take us to a shopping strip with one large store and several smaller ones. When the first actual mall opened near us (in the greater Norfolk, Virginia, area), in my teens, I was thrilled about the concept of shopping at a bunch of stores in the same location, with plenty of parking, under a ROOF! That last was a big deal in one of the more rainy regions of the country. And I still think malls are a great idea in places where most people depend on cars to get anywhere, which describes every city we've lived in throughout our married life.

But I digress. Some of the changes Hurley welcomes, I can agree with. As for the ambition to "re-think our crowded buildings in crowded cities that have few to no greenspaces," that sounds desirable, but such a revolution can't occur with the simple wave of a wand. Shifting many jobs to remote work is a change I'd like to see made permanent, if only for the sake of our grown children who've benefited from it. What about universal mask-wearing? I look forward to not having to do that all the time, yet I agree with Hurley on the advantage of getting sick less often. I could embrace a custom of wearing masks out and about when suffering from a mild illness, as many people do in Japan. As a probable side effect of the COVID precautions, I haven't had a cold in over a year. Hurley also looks forward to future advances in medical science as a result of discoveries made in the course of vaccine research. Like wars, pandemics can produce occasional positive technological side effects.

I've missed attending church in person, but I hope after we resume live gatherings our church will continue to record Sunday services for availability to people who can't be present for one reason or another. The pandemic has compelled us to try many such innovations that would be helpful to hang onto. The ubiquity of restaurant meal ordering, for example—it's become easier than ever before to get home-delivered meals from a wide variety of our favorite places, on websites instead of over the phone, prepaid with a credit card. With the success of virtual conventions in the past year, maybe some of them will continue to provide an online track for fans who can't make it to the physical location. However, there's at least one minor negative about the increasing shift to electronic media, from my personal viewpoint: More and more periodicals are switching to digital-only. I like magazines I can hold in my hands and, if worth rereading, store on a shelf.

A related trend that predated COVID but may have accelerated recently is the convenience of being able to perform many activities, such as financial and government transactions, over the Web. No need to drive to the bank to transfer funds, the post office to buy stamps, or the motor vehicle office to renew a car registration. This trend is likely to continue and expand. Of course, the downside involves less convenience for people who don't have a computer (my 90-year-old aunt, for one, but many citizens lack computers and their associated functions because of poverty, not choice) or adequate internet access. As has often been pointed out recently, computers with internet connections are no longer luxuries but household necessities on a level with water, electric, and phone services.

Hurley concludes by invoking March, which heralds spring in much of the northern hemisphere, as the time "when we celebrate surviving the very worst the world could throw at us, and plot a new way forward." Or, as Brad Paisley says in his optimistic song "Welcome to the Future," highlighting modern marvels formerly enjoyed only in the realm of science fiction, "Wherever we were going, hey, we're here."

Margaret L. Carter

Carter's Crypt

Thursday, January 14, 2021

Sufficiently Advanced Technology

As we know, "Any sufficiently advanced technology is indistinguishable from magic" (Arthur C. Clarke). Conversely, many magical events in older fiction can be duplicated today by mainstream technology. A century and a half ago, someone who witnessed a translucent human figure floating in midair and emitting eerie moans would unquestioningly recognize it as a ghost. Now we'd respond with, "Cool special effect. I wonder how they did that?" Just such an apparition appears in Jules Verne's 1892 novel THE CARPATHIAN CASTLE, on the cusp of the shift between the two probable reactions. The local people think the vision of a dead opera star at the titular castle is her spirit, when in fact it's produced by a sound recording and a projected photograph.

In George du Maurier's 1894 novel TRILBY, the villain, Svengali, uses hypnotism to transform an ordinary girl who's tone-deaf into a famous singer. She can produce exquisite melodies only in a trance. When Svengali dies, she instantly becomes unable to sing. At the time of the novel's publication, little enough was known about hypnosis that this scenario doubtless looked scientifically plausible. Now that we know hypnosis doesn't work that way, Svengali's control over Trilby seems like magic, and to us the story reads as fantasy.

Several decades ago, I read a horror story about an author who acquires a typewriter that's cursed, possessed, or something. He finds that it corrects his typos and other minor errors. Gradually, this initially benign feature becomes scary, as the machine takes over his writing to an ever greater extent. He narrates his experience in longhand, since if using the typewriter he wouldn't even be able to demonstrate an example of a misspelling. At the time of publication, this story was an impossible fantasy. Now it would be merely a cautionary tale of a word processor with an excessively proactive auto-correct feature. From the beginning of J. D. Robb's Eve Dallas science fiction mysteries, set in the late 2050s and early 2060s, almost everybody carries a handheld "link," a combination communications device and portable computer. When the earliest books in the series were published, that device was a futuristic high-tech fantasy. Now the equivalent has become commonplace in real life. But another tool Lt. Dallas uses in her homicide investigations still doesn't exist and remains problematic. Police detectives employ a handheld instrument reminiscent of Dr. McCoy's tricorder to gather data about murder victims. One of its functions is to pinpoint the precise time of death to the minute. That capability would seem to run counter to the intrinsic limitations arising from the nature of the decomposition processes being analyzed. Therefore, the exact-time-of-death function strikes me as irreducibly quasi-magical rather than scientific, something the audience has to accept without dissecting its probability, like the universal translator in STAR TREK.
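The time-of-death point can be illustrated with the classic forensic rule of thumb, the Glaister equation, which estimates hours elapsed from how far the body has cooled. This sketch is illustrative only (the measured temperature and the spread of cooling rates are invented numbers); the takeaway is that real estimates come as windows of hours, never minutes:

```python
# Rough time-of-death estimate from body cooling (the classic Glaister
# equation: hours elapsed ~= (98.4 - body temp in F) / 1.5). The point
# is the error bar, not the estimate: ambient temperature, clothing,
# and body mass all perturb the cooling rate, so the answer is a
# window of hours, not a timestamp.

def hours_since_death(body_temp_f, cooling_rate=1.5):
    """Estimate hours since death from measured core temperature (F)."""
    return (98.4 - body_temp_f) / cooling_rate

temp = 92.4  # measured core temperature, degrees Fahrenheit (invented)
# Plausible spread of cooling rates (illustrative, not forensic data).
fast, slow = hours_since_death(temp, 2.0), hours_since_death(temp, 1.0)
best = hours_since_death(temp)
print(f"Best guess: {best:.1f} h ago; window: {fast:.1f}-{slow:.1f} h")
```

Even this oversimplified model leaves a window several hours wide, which is why a gadget that reads out the minute of death feels quasi-magical.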

The distinction between science and magic can get fuzzy when nominal SF has a fantasy "feel." Marion Zimmer Bradley's Darkover series takes place on an alien planet inhabited partly by descendants of shipwrecked Terran colonists, some of whom possess inherited psi powers. Strict "hard science" readers might not accept psi powers as a real-world possibility, and the common people of Darkover regard laran (psi gifts) as sorcery. Anne McCaffrey's Pern series, also set on a planet colonized by migrants from Earth, features fire-breathing, empathic, teleporting, time-traveling dragons. Although these creatures have an in-universe scientific explanation, they resemble the dragons of myth and legend. Robert Heinlein's novella "Waldo" blends SF and what many if not most readers would consider fantasy. The title character lives on a private space station because of his congenital muscular weakness. Yet he overcomes his disability by learning to control his latent psychic talent under the guidance of an old Pennsylvania hex doctor who teaches Waldo how to access the "Other World." Incidentally, "Waldo" offers an example of how even a brilliant speculative author such as Heinlein can suffer a lapse of futuristic imagination. Amid the technological wonders of Waldo's orbiting home, Heinlein didn't envision either electronic books or computer games; a visitor notices paper books suspended from the bulkheads and wonders how Waldo would manage to play solitaire in zero-G.

I've heard of a story (can't recall whether I actually read it) whose background premise states that, in the recent past, the wizards who secretly control the world revealed that all technology is actually operated by magic. The alleged science behind the machines was only a smoke screen. If such an announcement were made in real life, I wouldn't have much trouble accepting it. For non-scientists, some of the fantastic facts science expects us to believe—that we and all the solid objects around us consist of mostly empty space; that the magical devices we use to communicate, research, and write are operated by invisible entities known as electrons; that starlight we see is millions of years old; that airplanes stay aloft by mystical forces called "lift" and "thrust"; that culture and technology have advanced over millennia from stone knives and bearskins to spacecraft purely through human ingenuity—require as much faith in the proclamations of authorities as any theological doctrine does.

Margaret L. Carter

Carter's Crypt

Thursday, July 16, 2020

AI and Human Workers

Cory Doctorow's latest LOCUS essay explains why he's an "AI skeptic":

Full Employment

He believes it highly unlikely that anytime in the near future we'll create "general AI," as opposed to present-day specialized "machine learning" programs. What, no all-purpose companion robots? No friendly, sentient supercomputers such as Mike in Heinlein's THE MOON IS A HARSH MISTRESS and Minerva in his TIME ENOUGH FOR LOVE? Not even the brain of the starship Enterprise?

Doctorow also professes himself an "automation-employment-crisis skeptic." Even if we achieved a breakthrough in AI and robotics tomorrow, he declares, human labor would be needed for centuries to come. Each job rendered obsolete by automation would be replaced by multiple new jobs. He cites the demands of climate change as a major driver of employment creation. He doesn't, however, address the problem of retraining those millions of workers whose jobs become superseded by technological and industrial change.

The essay broadens its scope to wider economic issues, such as the nature of real wealth and the long-term unemployment crisis likely to result from the pandemic. Doctorow advances the provocative thesis, "Governments will get to choose between unemployment or government job creation." He concludes with a striking image:

"Keynes once proposed that we could jump-start an economy by paying half the unemployed people to dig holes and the other half to fill them in. No one’s really tried that experiment, but we did just spend 150 years subsidizing our ancestors to dig hydrocarbons out of the ground. Now we’ll spend 200-300 years subsidizing our descendants to put them back in there."

Speaking of skepticism, I have doubts about the premise that begins the article:

"I don’t see any path from continuous improvements to the (admittedly impressive) 'machine learning' field that leads to a general AI any more than I can see a path from continuous improvements in horse-breeding that leads to an internal combustion engine."

That analogy doesn't seem quite valid to me. An organic process (horse-breeding), of course, doesn't evolve naturally into a technological breakthrough. Developing one kind of inorganic intelligence into a more complex intelligence of the same general kind is a different sort of process. Not that I know enough of the relevant science to argue for the possibilities of general AI. But considering present-day abilities of our car's GPS and the Roomba's tiny brain, both of them smarter than our first desktop computer only about thirty years ago, who knows what wonders might unfold in the next fifty to a hundred years?

Margaret L. Carter

Carter's Crypt

Thursday, June 13, 2019

Inside Apollo

The June 2019 issue of SMITHSONIAN magazine includes a long article on little-known aspects of the Apollo lunar exploration project. Unfortunately, the online publication is behind a paywall. Here's a sample of the article:

What You Didn't Know About Apollo

Pick up a copy of this issue if possible. It contains some shocking revelations (shocking to me, anyway). Despite his inspirational public speeches about the race for the Moon, President Kennedy stated in private that he had no particular interest in space as such. He simply wanted to beat the Russians. A significant percentage of Americans considered the space program a waste of money. In 1968, only four weeks after the Apollo 8 flight, a Harris Poll survey revealed that only 39% of Americans favored landing a man on the Moon. When asked whether the project was worth its cost, 55% said no—even though the war in Vietnam was costing more per year than the total price of the Apollo program so far. Aside from the excitement of televised launches, most ordinary citizens didn't give much thought to the Moon project. Even scientists, polled in 1961 by Senator Paul H. Douglas, were divided on the importance of a manned Moon mission, 36% believing it would have "great" value and 35% "little" value. This attitude seems so remarkable to me as an SF fan, since I've regarded the vital importance of space exploration as obvious for most of my life. In October 1963, funding for the Apollo program was being reduced. Ironically, if Kennedy had lived longer, lunar aspirations might have faded away, whereas President Johnson "was an authentic believer in the space program."

Equally astonishing to me, as described in the SMITHSONIAN article, was the United States' level of unpreparedness for the promised goal of a man on the Moon by the end of the 1960s. When Kennedy announced that goal, "he was committing the nation to do something we simply couldn't do." As the article puts it, "We didn't have the tools or equipment" and furthermore "didn't even know what we would need." We didn't have a list of requirements; "no one in the world had a list." And yet we proceeded to do the impossible, producing along the way results such as the most advanced computers created to date, "the smallest, fastest and most nimble computer in a single package anywhere in the world." Furthermore, NASA invented "real-time computing." Not being a tech person, before reading this article I had no idea what a revolutionary development that was. Previously, the only way to get problems solved with a computer was to submit a pile of punch cards and wait hours or days for the printed results of the calculations. Clearly, the space race gave us a lot more than Tang!
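As I understand it now, the difference is that batch computing delivers an answer only after all the input is in, while real-time computing must react to each input within a deadline as it arrives. A toy sketch in Python (the telemetry values and the deadline are invented for illustration):

```python
import time

# Toy contrast between batch and real-time styles; the "telemetry"
# readings and the response deadline are invented for illustration.
readings = [101.2, 101.5, 102.1, 99.8]

# Batch style: gather everything, then compute. Fine when an answer
# hours later is acceptable, as with a deck of punch cards.
batch_average = sum(readings) / len(readings)
print(f"Batch average (computed after the fact): {batch_average}")

# Real-time style: react to each reading as it arrives, within a
# deadline, because a spacecraft can't wait for tomorrow's printout.
DEADLINE_S = 0.01  # response budget per reading (invented)
for r in readings:
    start = time.perf_counter()
    if r > 102.0:  # decide immediately on each input
        print(f"ALERT: reading {r} out of range")
    assert time.perf_counter() - start < DEADLINE_S, "deadline missed"
```

The batch program's answer is just as correct, but it arrives too late to act on; meeting the per-input deadline is what makes the second loop "real-time."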

It felt strange to read this article and realize how the groundbreaking achievements of our nation's space program, which now seem like a foregone conclusion of unique historical significance, often hung by precariously slender threads.

Margaret L. Carter

Carter's Crypt

Thursday, August 02, 2018

Replicators on the Horizon

Right here in Annapolis, a 3-D printer at the local Home Depot has been used to create a prosthetic limb for a five-year-old boy born without a hand. You can read the story and watch a video of the new hand in action here:

Prosthetic from 3-D Printer

The maker, John Longo, a staff member at the store, has produced and donated about 120 such devices over the past year and a half. One cool feature of the system is that new limbs can be printed from the same design in larger sizes as the boy grows.

Could 3-D printers be precursors of the replicators in the Star Trek universe? Currently, a wide variety of objects can be made from a generic material, spools of plastic filament. The versatility and usefulness of the technology have proven themselves in many fields; simple replacement organs such as bladders and external ears have already been transplanted into patients. Presumably, replicators, on starships and elsewhere, create items from a supply of undifferentiated, cheap mass (like those plastic filaments), not out of thin air. The basic concept could evolve from the principles behind 3-D printers. Long before the imagined era of Starfleet and the Federation, those machines might become advanced and versatile enough to make almost any product needed in everyday life, as well as in specialized fields such as medicine and industry.

What about food? While we wouldn't expect that to be crafted out of plastic (I hope!), maybe a nutritionally balanced supply of goop could be shaped and flavored to simulate almost anything the consumer would want to eat. Could replicated food someday feed the world's hungry people? To a great extent, maybe, but considering the strong resistance to GMO crops by some factions, a movement might develop to reject such "fake" food.

Of course, even in the utopian future of a genie-magic level of technology, replicated products would have costs. The energy has to come from somewhere, and the raw material, although cheap, wouldn't be free. Moreover, well-off people wouldn't be satisfied with only replicated consumer goods. Doubtless foods made from fresh ingredients would taste better, and individually crafted items would become status symbols. Still, mass-manufactured products from some device analogous to the replicator would have profound effects on the global economy. Imagine living in a world where abundance, not scarcity, becomes the default assumption.

Welcome to the future!

Margaret L. Carter

Carter's Crypt

Thursday, April 19, 2018

Sapient Hibernators?

Recently on Quora someone asked why human beings can't stay awake for a week straight and then sleep for the same amount of time, instead of alternating sleep and awake time every day. The most convincing answer is that we evolved to live in harmony with the alternation of light and darkness. We are diurnal mammals, and our (roughly) 24-hour circadian rhythm harmonizes with daily changes in sunlight levels, making us active by day and at rest by night. We need that period of dormancy every night for our brains to repair ongoing wear-and-tear and process the events of the previous day. Not all animals function the same way, of course. Some are nocturnal. Cats sleep in short stretches throughout the day, with a lot of activity around dawn and dusk (crepuscular). Dolphins, some other aquatic mammals, and some birds sleep with only half of their brains at a time, so one brain hemisphere is always awake.

It occurred to me to wonder what our lives would be like if we hibernated. I would be happy to sleep through the dreary, cold stretch from January 2 through the third week of March, when the International Conference on the Fantastic in the Arts convenes in Orlando. Aside from skipping the worst of winter, I could gorge on goodies over the holidays, then painlessly burn off the fat while asleep. I've read only one story featuring a sapient species that hibernates, Melanie Tem's unique vampire novel, DESMODUS. Her non-supernatural vampires are essentially humanoid, intelligent vampire bats. The females, the dominant sex, are the only ones who hibernate through the winter. The males migrate. With access to human culture's modern technology, they drive south every year in a convoy of huge trucks, in which the sleeping females and the young (cared for communally) are safely ensconced. How would their clan manage if they all had to hibernate, though? Wouldn't they be overly vulnerable?

Risks of vampire-slayers finding their dens and slaying them while dormant might be minimized if Desmodus had a metabolism like that of bears, which don't really "hibernate" in the strictest sense. Instead, they enter a state of torpor from which they can wake up quickly and easily if the need arises. In true hibernators, heartbeat and respiration slow drastically, and body temperatures decrease to the level of the surrounding environment. Arctic ground squirrels may even reduce their abdominal temperatures below freezing.

Could a hibernating species develop an advanced technological civilization? What would happen to their machines and infrastructure during prolonged periods of universal dormancy, with nobody available to perform upkeep and maintenance? Maybe a hibernating intelligent race might be limited to preindustrial technology. If we discovered such a race on a distant planet, we could supply them with machines and technicians to care for the equipment, but the natives would remain dependent on us for those resources. Of course, such a species might instead take their cultural evolution in a completely different direction from ours and produce a civilization that doesn't rely heavily on physical technology—biologically based, perhaps. It's hard to imagine, however, how they could achieve space flight, so I visualize their being confined to their home planet when we meet them.

Margaret L. Carter

Carter's Crypt

Thursday, August 24, 2017

Trazzles and Tweedlers

While re-shelving our books in our newly redecorated basement "library," I came across WHICH WAY TO THE FUTURE? (2001), a collection of essays from ANALOG by the long-time editor of the magazine, Stanley Schmidt. While most of the stories in ANALOG don't excite me, because I don't really get into "hard science fiction" (a term Schmidt doesn't like; he maintains that rigorously science-based SF should be called simply "science fiction"), I've always loved the editorials. My favorite article in WHICH WAY TO THE FUTURE?, "Bold and Timid Prophets," contemplates how visions of the future (in both factual predictive writings and fiction) typically measure up to the actual development of culture and technology. Often a story set in the future imagines the technology as a perfected version of the cutting-edge inventions of the present day. For example, a nineteenth-century speculative novel might envision the twentieth century as powered by highly advanced steam engines. Making an imaginative leap into a world filled with devices that do things impossible in the current state of knowledge is much harder.

Schmidt illustrates this problem by starting the essay with an ordinary letter written in the late 1990s as it would appear to a reader in the 1860s. He substitutes a nonsense word for every term that didn't exist then (or combines familiar words in ways that would have made no sense in the mid-nineteenth century, such as "answering machine"). (I think he cheated a bit with "pilot." Boats had pilots for a very long time before airplanes began to need them.) "Plane" becomes "trazzle"; "computer" becomes "tweedler." "Fooba" substitutes for "e-mail" and "zilp" for "fax." Even where the nineteenth-century reader could recognize all the words, many of the sentences would appear to express impossibilities. How could parents know the sex of a baby in utero? How could a person travel a total of 20,000 miles in only one month? How could a human heart be transplanted? How could a transatlantic trip take "just a few hours"?
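Schmidt's substitution game is simple enough to mechanize. In this toy sketch, "trazzle," "tweedler," "fooba," and "zilp" are his coinages; everything else is just illustration:

```python
import re

# Nonsense substitutions in the spirit of Schmidt's essay; the mapping
# pairs each anachronistic term with his invented replacement.
substitutions = {
    "plane": "trazzle",
    "computer": "tweedler",
    "e-mail": "fooba",
    "fax": "zilp",
}

def anachronize(text):
    """Swap out terms a mid-nineteenth-century reader wouldn't know,
    matching whole words only, case-insensitively."""
    for word, nonsense in substitutions.items():
        text = re.sub(rf"\b{re.escape(word)}\b", nonsense, text,
                      flags=re.IGNORECASE)
    return text

print(anachronize("I sent the fax from my computer before the plane left."))
# -> "I sent the zilp from my tweedler before the trazzle left."
```

Run over an ordinary modern letter, it produces exactly the effect Schmidt describes: perfectly grammatical sentences studded with words no reader of 1860 could parse.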

Doubtless the distant future will include inventions and achievements we can't currently imagine because they'll depend on discoveries and technologies unknown to us, just as the nineteenth century couldn't predict the practical applications of electromagnetic theory and quantum mechanics. Even the boldest and best of classic SF writers get things amusingly wrong when writing about the not-so-distant future. "Where's my flying car?" illustrates one well-known unfulfilled prediction. Personally, I shudder at the thought of flying cars being anything other than toys for the rich. Autonomous ground cars, which now seem just over the horizon, sound much more desirable. What I really want, however, is my housecleaning robot, which Heinlein in THE DOOR INTO SUMMER expected by 1970. Also, in HAVE SPACE SUIT, WILL TRAVEL, Heinlein envisioned a near future with a moon colony—and slide rules. The social structures portrayed in some of his juvenile novels are even less "bold" than the concept of slide rules on the moon—the families of the twenty-first century look like suburban American households of the 1950s—but, in light of his posthumously published first novel, FOR US, THE LIVING, that absence of innovation probably wasn't his fault. I suspect editors of books for teenagers in the 1950s wouldn't have accepted anything unconventional in that area.

Schmidt concludes that "well-balanced science fiction" needs "both extrapolation—things you can clearly see are possible—and innovation—the things you can't see how to do, but also can't prove impossible." That's one thing I like about J. D. Robb's Eve Dallas mysteries; their vision of the 2060s strikes me as convincingly futuristic but also plausible in terms of current technological and social trends.

WHICH WAY TO THE FUTURE? addresses a variety of other intriguing topics, such as the definitions of "intelligence" and "human," why we haven't been contacted by aliens (the Fermi Paradox), the proliferation of unrealistically exaggerated fears of marginal hazards, etc. Fortunately, Amazon offers numerous used copies of this fascinating collection.

Margaret L. Carter

Carter's Crypt

Thursday, June 22, 2017

Updating Older Books

When a book published years or decades ago and set in the "present" gets reprinted in a new edition, should the technology and cultural references in the text be updated so that the story will still feel as if it's set in the present?

My vampire romance CRIMSON DREAMS has just been re-released by my new publisher:

Crimson Dreams

At the editor's request, I revised scenes that included computers (and inserted mention of cell phones into places where they'd be expected) to avoid having readers distracted by outdated references. The story was contemporary when first published, and there was no reason it shouldn't feel contemporary now.

Diane Duane's Young Wizards series has been around for decades, beginning with SO YOU WANT TO BE A WIZARD (1983). She has self-published new editions of the earlier novels in the series, collectively labeled Millennium Editions, explicitly set in the twenty-first century, with the technology updated. She believed that the obsolete references in the original editions were confusing to the contemporary YA audience because their time period isn't far enough in the past to feel like historical fiction, just enough to feel outdated. Also, the revisions eliminate the anomaly of having the characters age only a few years over a much longer real-time span:

Diane Duane's Ebooks Direct

Some authors tacitly modernize their worlds while the characters age slowly or not at all; the BLONDIE and BEETLE BAILEY comic strips are long-running examples. The creator of FOR BETTER OR FOR WORSE took an interestingly different approach when she concluded the strip a few years ago: she started over from the beginning, reprinting the original strips with additions and revisions.

There are some works in my Vanishing Universe vampire series that I wouldn't update if I were re-publishing them, because I had a good reason for the original dating—specifically DARK CHANGELING, its immediate sequel (CHILD OF TWILIGHT), and a couple of novellas dependent on them. DARK CHANGELING had to be set in 1979 because the then forty-year-old, half-vampire protagonist had to be born in 1939 to make his backstory plausible. My quasi-Lovecraftian novel FROM THE DARK PLACES, due to be re-released by Writers Exchange E-Publishing eventually, presents a special problem. It's set in the 1970s, and its next-generation sequel (currently a work-in-progress) focuses on a twenty-one-year-old heroine who was born at the end of the previous book. How can I set the sequel in the present (to avoid confusing readers with an unnecessary 1990s setting) and have a heroine who's twenty-one when she should be middle-aged? I plan to revise FROM THE DARK PLACES to remove blatantly specific 1970s references but have it set in a sort of "indefinite past."

Do you think it's necessary or desirable to update re-released novels whose settings were contemporary when first published? Does the answer vary on a case-by-case basis for you? Authors of historical or far-future fiction have it easy in this respect. (Writers of near-future SF face a slightly different problem; their settings are soon overtaken by events and transformed into alternate history. Think of Orwell's 1984.)

Margaret L. Carter

Carter's Crypt

Thursday, April 27, 2017

Self-Driving Cars

The science column in THE MAGAZINE OF FANTASY AND SCIENCE FICTION, by Pat Murphy and Paul Doherty, is always entertaining and informative. In the May-June issue, the authors write about "Robots on the Road" -- in other words, the very-near-future advent of the self-driving car. Doherty reports on his experience of riding in such a car at the Google research facility. It differs radically from a conventional car at first glance, having no steering wheel, accelerator, or brake pedal. The vehicle demonstrated its ability to avoid a pedestrian, a bicycle, and another car. The current "street-safe" model does have a steering wheel, etc., so the human driver can take over if necessary. The ultimate goal is to produce autonomous cars that "talk" to each other, to pedestrians' cell phones, and to the road infrastructure itself. Among other questions about unanticipated consequences of populating the highways with autonomous cars, the article speculates on energy use. Will more people choose to travel by car if they can relax and watch cat videos instead of driving? On the other hand, these cars should be more fuel-efficient than conventional ones, so maybe the overall result of the change will be "a wash." Then there are the ethical problems: If a crash can't be avoided, what should the robot car be programmed to hit? An animal or another car? A concrete pillar (injuring or possibly killing the rider) or a group of pedestrians?

Like any other emergent technology, autonomous cars will pass through a transition stage when the new technology shares the environment with the old. It seems to me that this period, just before the "tipping point," will be the time of greatest hazard. When all the vehicles on the road are self-driving, with no human error to worry about, we should be much safer. If I live so long, I'll be glad to relax and enjoy the ride. During the transition, I'm not so sure.

Speaking of autonomous machines, Bill Nye the Science Guy has a fun new series, BILL NYE SAVES THE WORLD (available on Netflix). The third episode (I think) focuses on Artificial Intelligence—its current status, future prospects, and potential benefits and risks. Is a "smart" thermostat true AI? What about the personal assistant that talks to you on your cell phone? Could the entire Internet become a sentient being (or is it one already, and we just aren't aware of it)? How about those self-driving cars that communicate with each other and make decisions independent of human intervention?

In addition to "robot" ground vehicles, a Google-supported startup is also working on a flying car—sort of; it looks and performs more like an ultralight airplane:

Flying Car

According to this article, the FAA has approved the craft for use in "uncongested areas," though the page doesn't say how that term is defined. Operators won't need a pilot's license, which sounds to me like an invitation to disaster. Consider all the accidents that happen on the roads daily, and imagine all those drivers moving in three dimensions. Fortunately, for the foreseeable future such vehicles will be so expensive that we can't expect to see many of them around.

For futuristic personal transportation, I'll take the driverless car (when it's perfected) over the human-piloted flying car, thanks.

Margaret L. Carter

Carter's Crypt

Tuesday, May 13, 2014

Index to Marketing Fiction In A Changing World by Jacqueline Lichtenberg

Index to Marketing Fiction In A Changing World
by 
Jacqueline Lichtenberg


It isn't enough just to write a great story, or even to write a story that precisely fits what publishers want.

Today's changing world requires writers to do much more than write. 

Some manage this problem by marrying or partnering with someone who has the requisite skills; some hire an agent. Some get lucky and connect with the right editor.

Everyone else has to pay attention to Marketing, Markets, Publishing, video, advertising, PR, and branding -- all kinds of things that really compete with creative time. 

Self-publishing demands yet another whole set of skills, adding book design, formatting, layout, cost-effective use of various online outlets, accounting, and myriad secretarial tasks.

We have not yet covered all these requisites in this blog series, even though I've been touching on this subject since 2009.  Here is what we have so far in this series, with the newest at the top.

My series on Marketing Fiction In A Changing World:

Part 27 - The Half Hour Drama Is Back
http://aliendjinnromances.blogspot.com/2017/08/marketing-fiction-in-changing-world.html

Part 26 - Must You Compromise Your Art To Sell Big?
http://aliendjinnromances.blogspot.com/2017/06/marketing-fiction-in-changing-world.html

Part 25 - Understanding the Shifting Fiction Market
http://aliendjinnromances.blogspot.com/2017/05/marketing-fiction-in-changing-world.html

Part 24 - Writing About The Future And For The Future
http://aliendjinnromances.blogspot.com/2017/02/marketing-fiction-in-changing-world.html

Part 23 - Mastering The Narrative Line
http://aliendjinnromances.blogspot.com/2016/10/marketing-fiction-in-changing-world_31.html

Part 22 - Making A Profit At Writing In A Capitalist World
http://aliendjinnromances.blogspot.com/2016/10/marketing-fiction-in-changing-world_11.html

Part 21 - Crafting Book Links To Track Via Google
http://aliendjinnromances.blogspot.com/2016/10/marketing-fiction-in-changing-world.html

Part 20 - Guest Post by Miriam Pia
http://aliendjinnromances.blogspot.com/2016/07/marketing-fiction-in-changing-world_19.html

Part 19 - Guest Post by Deb Wunder on Non-Fiction Writing
http://aliendjinnromances.blogspot.com/2016/07/marketing-fiction-in-changing-world.html

Part 18 - Amazon Makes Some Bad Marketing Decisions
http://aliendjinnromances.blogspot.com/2016/05/marketing-fiction-in-changing-world.html

Part 17 - Fiction Writing Still Pays Less Than Minimum Wage, Considering the Hours Spent (make your living at non-fiction; see where the opportunities lurk)
http://aliendjinnromances.blogspot.com/2016/01/marketing-fiction-in-changing-world.html

Part 16 - Which Is More Science Fiction, Star Trek or Star Wars? (a question via Quora.com)
http://aliendjinnromances.blogspot.com/2015/11/marketing-fiction-in-changing-world_24.html

Part 15 - Guest Post by Kirok of L'Stok, with discussion of a new series by Jean Johnson
http://aliendjinnromances.blogspot.com/2015/11/marketing-fiction-in-changing-world.html

Part 14 - posted September 1, 2015
http://aliendjinnromances.blogspot.com/2015/09/marketing-fiction-in-changing-world.html

Part 13
http://aliendjinnromances.blogspot.com/2015/05/marketing-fiction-in-changing-world.html

Part 12
http://aliendjinnromances.blogspot.com/2015/02/marketing-fiction-in-changing-world.html

Part 11
http://aliendjinnromances.blogspot.com/2015/02/marketing-fiction-in-changing-world.html

Part 10
http://aliendjinnromances.blogspot.com/2014/11/marketing-fiction-in-changing-world.html

http://aliendjinnromances.blogspot.com/2014/07/marketing-fiction-in-changing-world.html

http://aliendjinnromances.blogspot.com/2014/03/marketing-fiction-in-changing-world_18.html


http://aliendjinnromances.blogspot.com/2014/03/marketing-fiction-in-changing-world_11.html

http://aliendjinnromances.blogspot.com/2014/03/marketing-fiction-in-changing-world.html

http://aliendjinnromances.blogspot.com/2014/02/marketing-fiction-in-changing-world_25.html

http://aliendjinnromances.blogspot.com/2014/02/marketing-fiction-in-changing-world_18.html

http://aliendjinnromances.blogspot.com/2014/02/marketing-fiction-in-changing-world.html

http://aliendjinnromances.blogspot.com/2012/03/marketing-fiction-in-changing-world.html

http://aliendjinnromances.blogspot.com/2009/05/marketing-fiction-in-changing-world.html



Jacqueline Lichtenberg
http://jacquelinelichtenberg.com