
Thursday, November 21, 2024

National Sovereignty and Free Expression

Cory Doctorow's latest LOCUS column explores the tension between the rights of nations to "establish the rule of law" and individuals' rights to freedom of expression.

Hard Cases, Bad Law

Some nations use the power of their sovereignty to protect individual rights, while some do the opposite. The internet comes into the discussion because it "crosses international borders." Currently the United Nations is working on a Cybercrime Treaty, intended to prevent "ransomware attacks and other serious crimes." The problem is that the treaty will leave it up to each country to define "cybercrime" within its borders. A dictatorship might well define it as any public criticism of the regime. Or, for example, weaponize it against a dating site that permits same-sex matches.

The essay also discusses "data localization" laws, enacted by the EU member nations and some other European countries. Beneficial effects include preventing data about internet users within these countries from being accessed by the NSA's global surveillance. A less benign provision, however, "allows sovereign nations to access and use the data stored within their borders," a power obviously vulnerable to abuse by countries such as Russia.

Encryption presents another dilemma rooted in the clash between sovereignty and individual rights. Governments would like to ban highly effective "working encryption," at least to the extent of mandating a back-door feature for investigation of criminal activity. The trouble is that it's impossible to design encryption that permits the prosecution of criminals while still protecting the data of legitimate users. Laying out the procedures that would be required to implement the kind of restrictions authorities might like, Doctorow concludes "the collateral damage to human rights from this kind of ban are gigantic."

The essay goes into considerable detail about these and other related issues of interest to anyone devoted to freedom of speech. Conclusion -- in irreconcilable clashes between national sovereignty and human rights, the latter should rule, and "we can recognize the legitimate exercise of sovereignty without using that as a pretense to ignore when sovereign power is used to undermine free expression, especially when that use is likely to kick off a cascade of ever-more-extreme measures that are progressively worse for free expression."

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, August 01, 2024

At the Mercy of Internet Services

Here's a very scary LOCUS column by Cory Doctorow about Google users arbitrarily losing their e-mail accounts and access to all their files with no explanation or recourse. He labels this possibility a "nightmare scenario," not an exaggeration in view of the two examples he describes:

Unpersoned

An author lost her works in progress, stored in Google Docs, for alleged "inappropriate" content, never specified or explained. Far worse, the victim in the other example ("Mark"), who'd been getting his e-mail, cell phone service, photo storage, document storage, and several other services through Google, lost access to literally everything in his life that relied on technology more advanced than old-style paper mail. "Google defended its decision to permanently delete all of Mark’s data and cut him off from every account for every service he’d ever signed up for (without his email, SMS, and Authenticator codes, Mark was locked out of virtually every digital service he used)."

Doctorow suggests several potential solutions to the problem of service provider overreach. His concluding summary concedes that those companies have the right to deny service to customers under some conditions:

"But when they say they want to eject some of those users and deny them forwarding service and their own data, they’re saying they should have the right to make the people they don’t like vanish. That’s more power than anyone should have — and far more power than the platforms deserve."

This essay vindicates my own established habits. The idea of depending entirely on a cloud to store my personal documents would have given me the creeps even before reading about these abuses. Of course I save everything on my own hard drive. Of course I have more than one e-mail account. And I would never consider giving up our old reliable landline phone. I regard the cell phone as a useful backup for making and receiving calls away from home, not the primary core of my electronic existence. "Mark" got in trouble because a picture he transmitted to a pediatrician from his cell was synched to his Google photo file. The only cloud storage I have anything synched to is OneDrive, for backing up my documents and pictures. And naturally, again, they're all on my hard drive, too. It's bad enough knowing any book I've bought through Kindle could be obliterated by Amazon at any time (although this has never happened to me). I ignore the suggestions on some websites to sign in with Google or Facebook rather than the password saved on the individual sites.

Yet to give up online banking and other internet services we've come to rely on would be too great an inconvenience. How can we strike a balance between the practical necessity for online access to function in daily life nowadays and the risks of having our virtual lives snatched out of our own control? At the least, it seems reckless to keep all one's electronic eggs in one omnipotent basket.

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, May 16, 2024

Why Is the Internet Getting Worse?

In Cory Doctorow's latest LOCUS column, he analyzes why everything on the internet is in his view "(suddenly, simultaneously) getting (much) worse."

The Villain of Their Own Story

He begins with how online platforms decay -- in the sense of no longer working to the benefit of users -- and proceeds to the question of why they do so. He summarizes the "progression of the disease" thus: "First, companies are good to their users. Once users are lured in and have been locked down, companies maltreat those users in order to shift value to business customers, the people who pay the platform’s bills. Once those business users are locked in, the platform starts to turn the screws on them, too." The process sounds like the decline from a Golden Age or the fall from a primal Eden. Not that I suppose Doctorow claims the internet was ever perfect.

This essay focuses on his explanation of why it's all "suddenly, simultaneously" getting worse at the present time. According to Doctorow, it's not because the companies changed from beneficent providers of services and content to greedy Scrooges. They've always used algorithms to "twiddle" the figurative digital dials in the direction of the maximum profit to themselves. It's just that there used to be "constraints" on this strategy that no longer exist, or at least no longer operate to an effective degree. The essay lists the principal constraints and explores how they've been weakened or eliminated in recent years.

As Doctorow puts it, "No one is the villain of their own story." Not even mega-corporations. "The tech bosses who once made good products told themselves they did so because they were virtuous, but much of that virtue stemmed not from their character, but from the consequences of failing to deliver good products at fair prices under ethical conditions." With those consequences eviscerated, as he sees it, the "virtue" built on pragmatic foundations has crumbled.

What, if anything, can consumers do to ameliorate this situation? We're left with an apparent scenario of inevitable decline -- a rather depressing prospect.

Doctorow elaborates on this topic in a more technically detailed post here:

Algorithms

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, April 04, 2024

Technofeudalism

Cory Doctorow advances the position that capitalism isn't evolving into socialism (as classical Marxism predicted) but into a new form of feudalism:

Capitalists Hate Capitalism

His explanation of the difference between "rents" and "profits" is new information to me (being a bear of very little brain where economic theory is concerned, anyhow). "Rent" in the technical sense used by economists means "income derived from owning something that the capitalist needs in order to realize a profit." It's passive income, so to speak. In Doctorow's example, the manager of a coffee shop has to compete actively with other shops to attract labor and customers. The landlord who owns the building, though, receives money from rent no matter who occupies the space.

In these terms, a gigantic storefront such as Amazon, to which all the individual sellers pay rent, exemplifies "the contemporary business wisdom that prefers creating the platform to selling on the platform" -- "technofeudalism." Doctorow offers several examples, e.g., draconian noncompete agreements forced on employees, the expansion of IP rights to absurd degrees such as the author who attempted to own the word "cocky," and patent trolls whose "only product is lawsuits."

One related abuse he doesn't cover in this article but discusses elsewhere is the universal software marketing practice of not selling electronic products outright but "licensing" them. A "buyer" of a Kindle book, for instance, doesn't literally own it like a hard-copy book, for Amazon can remove the text from the customer's device at any point for any random reason. Granted, this probably doesn't happen often (I haven't experienced it), but the only way to avoid that risk would be to refrain from ever connecting that device to the internet again -- hardly practical.

By Doctorow's title, "Capitalists Hate Capitalism," he means, "They don’t want to be exposed to the risks entailed by competition, and feel the goad of that insecurity. They want monopolies, or platforms, or monopoly platforms." Unlike in many of his essays, in this one he doesn't suggest hypothetical remedies but simply describes a problematic situation.

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, February 29, 2024

Intermediaries on the Internet

Another post by Cory Doctorow about how good platforms go bad and, by extension, how the internet goes bad:

Intermediation

Why didn't the internet, as promised, "disintermediate the world"? Because in many situations we NEED "middlemen." Doctorow cites publishing as an example. While some authors self-publish and accomplish all the steps of the process themselves or directly pay others to do them (such as cover artists and freelance editors), most of us prefer to have someone else handle those tasks. And even the totally independent self-publishers typically need platforms such as Amazon, Draft2Digital, etc. to sell their work; very few earn money solely by hand-selling their books one by one, like the eccentric wordsmith Doctorow describes in his essay.

"The internet did disintermediate a hell of a lot of intermediaries –- that is, 'middlemen' –- but then it created a bunch more of these middlemen, who coalesced into a handful of gatekeepers." The gatekeepers, as he sees it, are the problem. Online sales of almost anything we might want or need on a single, convenient website is a service most customers value. The problem arises when a giant internet retailer locks out its competitors and/or restricts what customers and third-party sellers can do with the products. We don't hate intermediaries as such, according to Doctorow; we hate "powerful intermediaries." His solution -- for governments to enforce competition-supportive laws.

While I can't deny monopolies are generally a bad thing, except in public service spheres such as utilities and roads, I also highly value the convenience of being able to buy almost anything from Amazon, a website that remembers my address, past purchases, and payment methods and that has been reliably trustworthy with that information so far, as well as fast and efficient. Moreover, I like the capacity to sell my self-published e-books on a site that most potential readers probably use regularly. I love knowing I can find almost any book ever published, a cherished fantasy of mine in my pre-internet childhood and youth. I'd have a hard time getting along without Amazon if it vanished. Yet doubtless the abuses of which Doctorow accuses it are real, too.

As for one area in which powerful middlemen exploit their near-monopoly to perpetrate blatant ripoffs: in its current session, the Maryland General Assembly is considering a bill that would, among other protective measures, forbid companies such as Ticketmaster from buying up most of the tickets for a high-demand event and reselling them at extortionate prices:

Ticket-Scalping Bill

Despite such abuses, I endorse Doctorow's conclusion that, overall, "A world with intermediaries is a better world." In past centuries, people "in trade," who at first glance seem to add no value to products they profit from through their own middleman activities, used to be scorned by the upper class and regarded with suspicion by their customers. (We encounter the stereotype of the cheating miller in Chaucer's CANTERBURY TALES.) But what would we do without them?

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, November 16, 2023

Bad People Versus Bad Institutions

In his latest LOCUS essay, Cory Doctorow discusses whether "all the internet services we enjoyed and came to rely upon became suddenly and irreversibly terrible – as the result of moral decay." Setting aside the question of whether "irreversibly terrible" is a bit exaggerated, he reasonably states that "it’s tempting to think that the people who gave us the old, good internet did so because they were good people," and the internet was ruined, if it was, by bad people:

Don't Be Evil

The problem isn't that simple, however, since institutions, not individuals, created the internet. On the other hand, institutions comprise many individuals, some with honorable motives and some driven solely by the quest for profit. In short, "institutional action is the result of its individuals resolving their conflicts." Can corporations as such be evil? Doctorow doesn't seem to be saying that's the case. Every institution, private or public, includes multitudes of people, with conflicting goals, some good and some bad -- both the individuals and their goals. Moreover, as he doesn't explicitly mention, some people's characters and motivations are neither all good nor all bad. Many drift along with the corporate culture from fear of the consequences of resistance or maybe just from failure to think through the full implications of what's going on. He does seem to be suggesting, however, that vast, impersonal forces can shape negative outcomes regardless of the contrary wishes of some people involved in the process. "Tech didn’t get worse because techies [workers in the field] got worse. Tech got worse because the condition of the external world made it easier for the worst techies to win arguments."

What solutions for this quandary could be tried, other than "burn them to the ground" -- "them" being the allegedly villainous "giants of the internet" such as Amazon and Google -- which strikes me as a bit too drastic? Doctorow insists, "A new, good internet is possible and worth fighting for," and lists some aspects he believes must change. Potential avenues for improvement can be summarized by the need to empower the people who mean well -- the ones Doctorow describes as "people within those institutions who pine for a new, good internet, an internet that is a force for human liberation" -- over those who disregard the concerns of their customers in single-minded greed for profit.

On the wider topic of individual responsibility for the villainous acts of institutions over which one doesn't have any personal control, one might be reminded of the contemporary issue of reparations to historically oppressed groups. Of course, one can quit a job and seek a more ethical employer, but renouncing one's nationality or ethnic ancestry would be severely problematic. However, since that subject veers into "modpol" (modern politics, as strictly banned on an e-mail list I subscribe to), I'll simply point out C. S. Lewis's essay, in a different context, about repenting of other people's sins:

Dangers of National Repentance

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, March 30, 2023

One Bite at a Time

Cory Doctorow's column for the March 2023 issue of LOCUS, for once, asserts a position I can support without reservation:

End-to-End

Concerning the many problems involved in making the internet user-friendly, a quest for perfection may result in no improvement at all. As Doctorow summarizes the situation, "The internet succeeded where other networks failed" because it didn't try to implement a "seemingly monolithic technological project" that would require all parties to agree on an ultimate solution that would deal with all difficulties once and for all. Instead, find one small element that everyone can accept. "Build that, then do it again, finding another step that everyone can get behind." In other words, figuratively speaking, eat the elephant one bite (or byte?) at a time. To quote Doctorow again, "I want a better internet now, not years down the road. I’ll happily take a smaller bite."

The main issue to which his current column applies this approach is the end-to-end principle, an older name for what's now usually called net neutrality. In brief, "when a willing speaker wants to say something to a willing listener, our technology should be designed to make a best effort to deliver the speaker’s message to the person who asked to get it." After decades of development of the internet, why don't we have this transparently obvious, user-friendly system?

When we ask Google a question, why does it prioritize results from its own services over others that might be more relevant to the questioner's needs? When we search for a specific book or other product on Amazon, why do several other products pop up at the top of the page ahead of the one we typed in the search box? Why do Facebook posts from people and organizations we actually want to hear from get drowned in a sea of sponsored posts? Well, yeah, money and profit (duh). But why are such practices legally permitted? Why is Facebook allowed to restrict our access to posts from users we've liked or followed by blackmailing them into "boosting" their posts—paying to have their material seen by people who've expressed a wish to see it? Suppose that when we tried to telephone a local business, the phone company routed the call to a rival that had paid for the privilege. Nobody would stand for that, yet the equivalent happens online all the time.

Doctorow suggests a few modest rules that internet companies should be required to follow, e.g., “The first result for a search should be the product that most closely matches the thing I searched for” and “If I subscribe to your feed, then when you publish something, it should show up in my feed.”

For a long time I was puzzled that my posts on my Facebook author page showed such low numbers of "Reach." The page doesn't have a huge throng of followers, but it certainly has a lot more than those being "reached." It was a shock to learn that in order to be read by more than a handful of followers, those posts needed to be boosted. In other words, I would have to bribe Facebook to carry out the function it purports to perform, connecting senders with willing receivers. Likewise, it's a constant, though minor, irritant that searching for a book on Amazon often lands on a page where I have to scroll halfway down to find the desired item. According to Doctorow, the volume of ads and sponsored posts is delicately designed to stay "just below the threshold where the service becomes useless to you." I fear he may be right.

Will the limited ideal of his online utopia ever become a reality? Maybe not, but it's worth discussing.

Margaret L. Carter

Carter's Crypt

Thursday, November 03, 2022

Living in Alternate Realities?

In Philip Wylie's 1951 novel THE DISAPPEARANCE, an unexplained phenomenon divides Earth into two separate, parallel versions. In one reality, all human males instantaneously vanish; in the other, all women and girls vanish. The all-male world predictably devolves into a violent dystopia, while in the parallel world women have to cope with running society in an era when, compared to today, relatively few females held high public office or were educated in other professions dominated by men.

For a while it has seemed to me that the United States split into two alternate realities in November 2020. Instead of diverging into physically different planes of existence, though, the two realities exist side by side on the same planet while nobody notices what's happened. We talk at cross-purposes to inhabitants of the alternate world under the impression that the other person lives in the same universe, and therefore we can't figure out why they don't see things that look so obvious to us.

This impression hit me afresh during a recent conversation with a person who holds political beliefs opposite from mine. The bishop of our diocese had published a message that, among other topics related to the upcoming election, warned of the possibility of violence. The person with whom I was talking dismissed the warning on the grounds that my party would have no reason to resort to violence locally because they're likely to win the majority of electoral contests in this state, as usual (which is true). And members of his party, he said, "don't riot." I inwardly gasped in disbelief. I wouldn't have said anything anyway, to avoid useless argument, but in that moment I literally could not think of a coherent answer. It seemed we were living in two distinctly different versions of this country, which somehow overlap without coinciding.

The internet and social media, of course, go a long way toward explaining how citizens can inhabit the same physical world but totally disconnected mental universes. Before the internet and cable TV, we all got our news from much the same sources. Fringe beliefs stayed on the fringe; if my memory is more or less accurate, there was a consensus about the general nature of political, historical, and social reality, regardless of vehement conflicts about details. Now, as has often been pointed out, people can stay in their own "bubbles" without ever getting undistorted exposure to opposing beliefs and concepts.

I don't have the skill to write it, but I think it would be interesting to read a science-fiction novel about a world that contains two overlapping dimensions without the inhabitants of those dimensions realizing they're not even in the same universe.

Anyway, on a brighter note, as a former co-worker of mine used to say, "Vote early, vote often."

Margaret L. Carter

Carter's Crypt

Thursday, February 10, 2022

Most Writers Are Writers

This is the name of a page on the TV Tropes site, referring to the countless works of fiction with authors, playwrights, screenwriters, journalists, or poets as protagonists, a not unreasonable consequence of the hoary precept to "write what you know":

Most Writers Are Writers

Taking this principle to its logical extreme leads to the situation satirized in a quote from SF author Joe Haldeman at the top of that trope page: "Bad books on writing and thoughtless English professors solemnly tell beginners to Write What You Know, which explains why so many mediocre novels are about English professors contemplating adultery."

Strict obedience to that "rule" would, of course, mean no fiction could be created about places or ethnicities other than the author's own, much less science fiction or fantasy. TV Tropes has another page discussing, with examples, the difficulties of writing about nonhuman protagonists such as extraterrestrials or animals. Yet even these characters have to exhibit at least some human-like traits, or readers couldn't identify with them:

Most Writers Are Human

Henry James critiques the advice that an author should write only from his or her own experience in this famous passage from his 1884 essay "The Art of Fiction" about the need for a writer to be someone "on whom nothing is lost":

"I remember an English novelist, a woman of genius, telling me that she was much commended for the impression she had managed to give in one of her tales of the nature and way of life of the French Protestant youth. She had been asked where she learned so much about this recondite being, she had been congratulated on her peculiar opportunities. These opportunities consisted in her having once, in Paris, as she ascended a staircase, passed an open door where, in the household of a pasteur, some of the young Protestants were seated at table round a finished meal. The glimpse made a picture; it lasted only a moment, but that moment was experience. She had got her impression, and she evolved her type. She knew what youth was, and what Protestantism; she also had the advantage of having seen what it was to be French; so that she converted these ideas into a concrete image and produced a reality. Above all, however, she was blessed with the faculty which when you give it an inch takes an ell, and which for the artist is a much greater source of strength than any accident of residence or of place in the social scale. The power to guess the unseen from the seen, to trace the implication of things, to judge the whole piece by the pattern, the condition of feeling life, in general, so completely that you are well on your way to knowing any particular corner of it--this cluster of gifts may almost be said to constitute experience, and they occur in country and in town, and in the most differing stages of education. If experience consists of impressions, it may be said that impressions are experience, just as (have we not seen it?) they are the very air we breathe. Therefore, if I should certainly say to a novice, 'Write from experience, and experience only,' I should feel that this was a rather tantalising monition if I were not careful immediately to add, 'Try to be one of the people on whom nothing is lost!'"

To put it more briefly, it has been said that instead of "Write what you know," the rule should be, "Know what you write." In other words, thoroughly research whatever you aren't already familiar with from personal experience or study.

I admit I've usually adhered to "write what you know" in terms of my characters' occupations. Most of my heroines work as librarians, proofreaders, bookstore clerks, college instructors, or, yes, authors. Since their work usually isn't the central focus of the story, I figure it's just as well to give them jobs I know enough about not to make blatant errors. Where the protagonist's vocation does play a major role in the plot, I default to "writer."

The internet makes research easier than ever before, provided one takes care to distinguish accurate sources from their opposite. And for in-depth exploration, reliable websites can direct the searcher to books, which can often be obtained through interlibrary loan—which can also be arranged online. A public library might even have access to that one necessary book in electronic format, eliminating the need to go out to pick it up. For example, once when I wanted to insert a few sentences about a heroine's psychic vision of a mountain trail in Afghanistan into a story, typing and clicking on a single search phrase gave me all the images I could wish for. We truly live in wondrous times for "knowing what we write."

Margaret L. Carter

Carter's Crypt

Thursday, January 27, 2022

Creative Fakelore for Fun and Enlightenment

The January-February 2022 issue of SKEPTICAL INQUIRER includes an article by statistical ecologist Charles G. M. Paxton, narrating his experiment of creating an imaginary water monster to masquerade as an authentic legend. He was inspired by an account of an eighteenth-century ghost in London that turned out to be a hoax promulgated in the 1970s. Paxton wondered whether his lake monster could gain similar credence. One intriguing thing about this experiment, to me, is that not only did his invented sightings get retold as genuine by multiple sources, but new reports of alleged historical sightings sprang up, independent of any effort on his part.

He decided to create, not a generic sea serpent like Nessie in Loch Ness, but a "monstrous aquatic humanoid." He located it in two freshwater lakes in England's Lake District that, as far as he knew, had no existing tradition of monster lore. Paxton named this creature Eachy and devised a false etymology for the word. He also invented a nonexistent book to cite as a source. After he had an article about Eachy uploaded to Wikipedia, references to the monster began to spread. Although the Wikipedia article on Eachy no longer exists, the Cryptid Wiki has a straightforward page on him or it as a real piece of folklore:

Eachy

The Cryptid Wiki piece mentions the earliest reported appearance of Eachy as having occurred in 1873, an imaginary "fact" taken directly from Paxton's material. Moreover, in 2007 the monster sneaked into an actual nonfiction book, a cryptozoology guide by Ronan Coghlan. By January of 2008, Eachy T-shirts were being sold on the internet by someone unconnected to Paxton. When the Wikipedia page on Eachy was deleted in 2019, it held the status of second-longest-surviving hoax on the site.

What do we learn from this story? Paxton proposes that "the tale of the Eachy tells us the dangers of how Wikipedia can be subject to manipulation." As he mentions, however, in more recent years Wikipedia has tightened its standards and introduced more safeguards. On a broader scale, the Eachy hoax demonstrates the danger of how easily recorded history can be distorted or even fabricated from nothing, then accepted as fact. An important caution I'd note, as Paxton also alludes to, is the hazard of uncritically believing what appear to be multiple sources when in truth they're bouncing the same "facts" around in a self-referential echo chamber, repeating what they've picked up from previous sources in endless circularity. That phenomenon can be seen in a field I'm somewhat familiar with, scholarship on Bram Stoker's DRACULA. For instance, after an early biography suggested that Stoker might have died from complications of syphilis, numerous authors since then (in both nonfiction and novels) have accepted without question the truth of the assumption, "Bram Stoker had syphilis, which influenced the writing of DRACULA." The tale of Eachy also reinforces the obvious warning not to believe everything you read on the internet or even in books.

It's fascinating to me that a legend can be invented, disseminated, and perceived as authentic so quickly. Some authorities believe the story of Sawney Bean, the alleged patriarch of a sixteenth-century Scottish cannibal family, first reported in the NEWGATE CALENDAR centuries after the supposed events and repeated as fact in numerous publications since, was just such a fictional legend. And Sawney Bean's tale became deeply rooted in the public imagination long before the internet. In our contemporary electronic age, the chilling scenario in Orwell's NINETEEN EIGHTY-FOUR comes to mind. If history is whatever is written, what happens when history becomes so easy to rewrite? That's one good reason why, even if it ever became possible to digitize and make available on the web every book in existence, we should still hang onto the physical books. Ink on paper can't be altered at whim like bytes in an electronic file.

Margaret L. Carter

Carter's Crypt

Thursday, September 09, 2021

More Futuristic Forecasts

"Prediction is hard, especially about the future." Over the past week, I've been rereading LIFE AND TIME, a 1978 collection of essays by Isaac Asimov (some of them written as early as the 1960s). In contrast to the imaginative speculations in his fiction, these articles include serious forecasts about potential developments in technology and society.

Most strikingly, he anticipated the internet, a global repository of information anybody could draw upon. He envisioned everybody on Earth having a personal "channel" just as most people now have individual telephone numbers. We sort of have that system now, considering the unique IP address of each computer as a personal channel. Also, an individual tablet or smart phone serves the same function. Incidentally, J. D. Robb's "In Death" SF mystery series anticipated today's smart phone as the pocket "link" most people in her fictional future carry with them, long before such devices became common in real life.

Asimov hailed the future possibilities of lifelong, customized learning through the worldwide computer bank. Granted, many people benefit from the internet in that way, yet the satirical lament too often holds some truth: We have a network that gives us access to the entire accumulated knowledge of humanity, and we use it mostly for political rants and pictures of cats. Asimov suggested computer learning could overcome one of the main disadvantages of our educational system, the necessity for one teacher to instruct a large group of students, making it impossible to adjust lessons to the comprehension level, interests, and learning style of each individual. Computer education could effectively give each pupil a private tutor. Although we've recently had over a year of experience with online education, it's still been mainly a group-oriented activity. Advanced AI might fulfill Asimov's vision.

He also foresaw cashless monetary transactions, electronic transmission of documents, and virtual rather than in-person business meetings, all of which exist now. Unfortunately, his expectation that these developments would greatly reduce travel and its attendant pollution hasn't come to pass yet, probably because many employers are reluctant to embrace the full potential of remote work.

On some topics, he was too pessimistic. For example, he foresaw the world population reaching seven billion by the early 21st century, a point we've already passed. However, we're not forced to survive on synthetic nourishment derived from genetically engineered microbes, as he speculated might become necessary. We still eat a lavish variety of fresh foods. He seemed to believe a population of the current level or higher would reduce humankind to universal misery; while many of the planet's inhabitants do live in abject circumstances, Earth hasn't yet become a dreary anthill.

Not surprisingly, Asimov favored genetically modified agricultural products, which already exist, although not in some of the radically altered or enhanced forms he imagined. He also focused on the hope of cleaner energy, perhaps from controlled fusion or large-scale solar power. He proposed solar collectors in orbit, beaming energy down to Earth, far from a practical solution at present. And, as everyone knows, fusion-generated power is only twenty years away—and has been for a generation or more. :) Asimov predicted autonomous cars, now almost commercially viable. He also discussed the potential advantages of flying cars, however, apparently without considering the horror of city skies thronged with thousands of individual VTOL vehicles piloted by hordes of amateurs. Maybe self-driving vehicles would solve that problem, being programmed to avoid collisions.

To save energy on cooling and heating as well as to shelter inhabitants from severe weather, he proposed moving cities underground, as in his novel THE CAVES OF STEEL. This plan might be the optimal strategy for colonizing the Moon or Mars. I doubt most Earth citizens would accept it unless it becomes the only alternative to a worldwide doom scenario. Asimov, a devoted claustrophile, seemed to underestimate the value the average person puts on sunshine, fresh air, nature, and open space.

In general, he tended to be over-pessimistic about the fate looming over us unless we solve the problem of overpopulation right now (meaning, from his viewpoint, in the 1980s). As dire as that problem is in the long run, the decades since the publication of the essays in LIFE AND TIME demonstrate that Earth is more resilient than Asimov (and many prognosticators at that time) feared. Moreover, the worldwide birthrate is declining, although the shift isn't spread evenly over the world and, for the present, global population continues to rise through sheer momentum. Asimov analyzed the issue of whether a demographic pattern of old people far outnumbering younger ones would lead to a rigid, reactionary culture. He maintained that the mental stagnation traditionally associated with aging could be prevented by an emphasis on lifelong learning and creativity. He devoted no attention to the more immediate problem of declining birthrates that some nations are already beginning to face—a young workforce that isn't large enough to support its millions of retired and often infirm elders. Encouraging immigration would help. (But that's "modpol"—shorthand for modern politics on one list I subscribe to—so I'll say no more about it.) In the long run, however, if and when prosperity rises and births decline worldwide, there won't be anyplace for a supply of young workers to immigrate from.

Asimov seemed over-optimistic about the technological marvels and wondrous lifestyle we'll all enjoy IF over-population and its attendant problems are conquered. He envisioned the 21st century as a potential earthly paradise. Judging from the predictions of such optimists over many decades, just as controlled fusion is always twenty years away, utopia is always fifty years away.

Margaret L. Carter

Carter's Crypt

Thursday, April 08, 2021

Starting Afresh

Kameron Hurley's newest LOCUS column discusses making a fresh start with the turn from winter to spring:

Plotting the Way Forward

Noting that the ancient Romans marked the New Year in March rather than January, Hurley muses about the signs of spring that show up in March. This year, she finds particular hope in the change of seasons because a potential end to the COVID crisis may be in sight. She ponders what is meant by "returning to normal": What will go back to the way it was? What will have changed permanently? As she puts it, "'normal' is a shifting target. After the last year, our world will not be quite the same."

One change she welcomes is the decline of shopping malls. Here I disagree. I'm a big fan of malls, even though with online ordering I haven't frequented our local mall in recent years nearly so much as I used to (especially after its chain bookstore closed). Sure, a green-space town center with a cluster of shops, within easy walking distance of home, would be lovely. But that's not likely to sprout up out of nowhere near us (all the ground within walking distance being occupied by houses or, if one has the stamina to hike one-point-three miles to the main road, existing stores). Nor does it describe the neighborhoods where I spent the years between age eight and moving out of my parents' home to get married. We lived in the suburbs. There was nowhere to walk except other houses and, a longish trek from our home, a major highway at the entrance to the development. A very long bike ride could take us to a shopping strip with one large store and several smaller ones. When the first actual mall opened near us (in the greater Norfolk, Virginia, area), in my teens, I was thrilled about the concept of shopping at a bunch of stores in the same location, with plenty of parking, under a ROOF! That last was a big deal in one of the more rainy regions of the country. And I still think malls are a great idea in places where most people depend on cars to get anywhere, which describes every city we've lived in throughout our married life.

But I digress. Some of the changes Hurley welcomes, I can agree with. As for the ambition to "re-think our crowded buildings in crowded cities that have few to no greenspaces," that sounds desirable, but such a revolution can't occur with the simple wave of a wand. Shifting many jobs to remote work is a change I'd like to see made permanent, if only for the sake of our grown children who've benefited from it. What about universal mask-wearing? I look forward to not having to do that all the time, yet I agree with Hurley on the advantage of getting sick less often. I could embrace a custom of wearing masks out and about when suffering from a mild illness, as many people do in Japan. As a probable side effect of the COVID precautions, I haven't had a cold in over a year. Hurley also looks forward to future advances in medical science as a result of discoveries made in the course of vaccine research. Like wars, pandemics can produce occasional positive technological side effects.

I've missed attending church in person, but I hope after we resume live gatherings our church will continue to record Sunday services for availability to people who can't be present for one reason or another. The pandemic has compelled us to try many such innovations that would be helpful to hang onto. The ubiquity of restaurant meal ordering, for example—it's become easier than ever before to get home-delivered meals from a wide variety of our favorite places, on websites instead of over the phone, prepaid with a credit card. With the success of virtual conventions in the past year, maybe some of them will continue to provide an online track for fans who can't make it to the physical location. However, there's at least one minor negative about the increasing shift to electronic media, from my personal viewpoint: More and more periodicals are switching to digital-only. I like magazines I can hold in my hands and, if worth rereading, store on a shelf.

A related trend that predated COVID but may have accelerated recently is the convenience of being able to perform many activities such as financial and government transactions over the Web. No need to drive to the bank to transfer funds, the post office to buy stamps, or the motor vehicle office to renew a car registration. This trend is likely to continue and expand. Of course, the downside involves less convenience for people who don't have a computer (my 90-year-old aunt, for one, but many citizens lack computers and their associated functions from poverty, not choice) or adequate internet access. As has often been pointed out recently, computers with internet connections are no longer luxuries but household necessities on a level with water, electric, and phone services.

Hurley concludes by invoking March, which heralds spring in much of the northern hemisphere, as the time "when we celebrate surviving the very worst the world could throw at us, and plot a new way forward." Or, as Brad Paisley says in his optimistic song "Welcome to the Future," highlighting modern marvels formerly enjoyed only in the realm of science fiction, "Wherever we were going, hey, we're here."

Margaret L. Carter

Carter's Crypt

Thursday, November 26, 2020

Thanksgiving

Happy Thanksgiving to all who celebrate it today!

We're preparing our usual turkey dinner—actually, my husband does all the hard parts, one thing I'm thankful for—although on a smaller scale than in some years. The only participant outside our household of three will be our oldest son, who lives alone.

I often remind myself to be grateful for how much better off we are than so many people in these times. Because my husband and I are retired, our lives didn't change much with the shift toward staying home more. As a writer, I can keep doing pretty much what I would be doing anyway, thanks to the internet. All four of our offspring are securely employed, three of them in positions that allow working from home. Thanks to Facebook, we can see what's new with the grandchildren. We're lucky to have many local restaurants that deliver and offer the convenience of online ordering. Anything we need that our neighborhood stores don't have, we can order from Amazon or the equivalent. Deliveries, mail, and other essential services continue to operate efficiently. Our supermarket has mostly recovered from the supply-chain problems of earlier in the year and usually stocks the things we need. And, again, if they run out, online sources can often fill the gaps.

The conventions we normally attend—ChessieCon this weekend and my International Conference on the Fantastic in the Arts in March—are able to offer virtual experiences rather than canceling altogether. Our church holds virtual services, too—experiences that would have been unimaginable a couple of decades ago.

Imagine how much more difficult this year would have been without contemporary technology and communications.

In the news, we have the hopeful prospect of three promising vaccines so far. Focusing on the positive helps me avoid sinking into depression when the news occasionally doesn't look so good. The world has survived worse; there's a light at the end, and this time it isn't an oncoming train. Best wishes to all for the upcoming holiday season, even though different from what we expected.

Margaret L. Carter

Carter's Crypt

Thursday, January 16, 2020

Freedom of Speech Online

Cory Doctorow's latest LOCUS column explores the distinction between freedom of speech in the legal sense and the pragmatic limitations encountered on the Internet:

Inaction Is a Form of Action

He focuses on the effects of the dominance exerted by tech giants such as Facebook and Google. The Constitution forbids government interference with freedom of speech, but it doesn't prevent private businesses from setting their own rules. Constructing a parable of two restaurants, one that forbids political conversation on its premises and another with no such prohibition, he acknowledges that customers who don't like the restrictions of No Politics Diner can eat at Anything Goes Bistro. But suppose No Politics Diner not only buys up all its competitors but branches out to own a variety of other kinds of businesses as well? It's theoretically possible that soon there won't be any privately owned public spaces in town where customers can discuss politics. Without any interference by government, freedom of speech has effectively been limited.

With the pithy comment that Facebook "has hostages, not users," he applies this analogy to online services. When the giants have swallowed up so many of their competitors that (in an exaggerated but still chilling quote) the Internet has become “five websites, each consisting of screenshots of text from the other four,” policies set by these companies can restrict online speech even though no state censorship is involved. Services such as Facebook make rules, followed by exceptions to the rules, then additional layers of regulations to close the loopholes created by the exceptions. The resulting incomprehensibly complex tangle of exceptions and loopholes, according to Doctorow, "will always yield up exploitable vulnerabilities to people who systematically probe it." While the trolls run rampant, the rest of us may have no means of defending ourselves against them.

He has a list of suggestions for "fixing" the Internet to transform it into an environment "that values pluralism (power diffused into many hands) and self-determination (you get to choose which tech you use and how you use it)." One thing he urges is breaking up the Big Tech monopolies. I have reservations about whether this course of action is practical (or, under current law, legal, but that's an area I don't know much of anything about). It's hard to argue with his summary of the problem, however: "When the state allows the online world to become the near-exclusive domain of a small coterie of tech execs, with the power to decide on matters of speech – to say nothing of all the other ways in which our rights are impacted by the policies on their platforms, everything from employment to education to romance to (obviously) privacy – for all the rest of us, they are making policy."

Margaret L. Carter

Carter's Crypt

Thursday, November 22, 2018

Is the Internet Revolutionary?

Happy American Thanksgiving!

Cory Doctorow's latest LOCUS column discusses whether the Internet qualifies as "revolutionary":

What the Internet Is For

His answer: The Internet runs on a revolutionary principle but is not, in itself, revolutionary. The principle, as he describes it, is "the 'end-to-end' principle, which states that any person using the internet can communicate with any other person on the internet without getting any third party’s permission." We've become so used to the capacity to do this that we forget how mind-boggling it is. He goes on to examine computers and encryption from the same perspective. Finally, he asserts that the Internet is "a necessary but insufficient factor for effecting revolution" and offers support for that view. An exciting and optimistic article, recommended reading for the detailed explanations I haven't summarized.

This weekend, as usual, ChessieCon will be held just north of Baltimore, and my husband and I will appear on the program. I'll report on the panels and other events next week. Jo Walton will be this year's Guest of Honor!

ChessieCon

Meanwhile, happy turkey day (or whatever your feast of choice may be).

Margaret L. Carter

Carter's Crypt

Thursday, July 19, 2018

Monsters in the Modern World

A recent question on Quora asked how well vampires would be able to survive in the modern world. My reaction was along the lines of "better than ever." In DRACULA, Bram Stoker envisions how the Count uses "nineteenth-century up-to-date" conveniences to move from his "ruined castle in a forgotten land" (as Van Helsing describes it) to the modern, technologically advanced environment of England. Many urban fantasy novels imagine how vampires and other traditional "monsters" might fit into the twentieth and twenty-first centuries.

To begin with, a contemporary vampire who wants to relocate can buy a plane ticket instead of having to endure a lengthy ocean voyage, with the risks of exposure in being confined to a limited space for days or weeks with a small group of oblivious human companions. An even more fundamental consideration is that, in contrast to past times and places when tight-knit communities were suspicious of strangers and people with eccentric habits, nowadays in any first-world country a vampire could blend in as just one of many representatives of diverse ethnic groups and lifestyles. For much of European and American history, failure to attend church would be seen as peculiar or downright suspicious; nowadays that behavior wouldn't raise an eyebrow. If he or she has a severe reaction to sunlight (like the undead in movies and many modern novels, although not in nineteenth-century fiction or most folklore) or simply prefers a nocturnal existence, stores and businesses with extended hours are plentiful in any decent-sized city. The Internet, of course, makes it easy to obtain most products and services without leaving home. If the vampire needs to earn money, numerous night-shift jobs are available. Never being seen eating could be attributed to allergies or some other dietary restriction. ("I'm on a liquid protein regimen.") What about nourishment? Blood banks (with, presumably, bribeable employees who could supply newly expired blood) offer an obvious source. Also, it wouldn't be hard to find potential donors with romantic notions about vampires, who would happily give blood under the impression that the alleged vampire is simply playing a role.

Computers and the Internet, in my opinion, would make transition from one lifetime to the next easier rather than harder. A competent hacker can create a new identity with supporting data planted on all the relevant websites. In the TV series FOREVER KNIGHT, one vampire makes a career of performing that very service. The issue of possible exposure by old photographs was raised on Quora, a problem that I believe is much exaggerated. What would you think if you saw a century-old photo that closely resembled a contemporary acquaintance? Would you instantly jump to the conclusion that the person must be immortal? No, most likely you would think, "What an amazing family resemblance." My husband's brother looks remarkably like a picture we have of their father in late middle age, and nobody wonders whether my brother-in-law is really his father under a new identity. :)

Werewolves could also benefit from modern conveniences. With rapid transit, on full-moon nights a werewolf could quickly travel to an isolated region where he or she could roam and hunt animal prey. If he or she suffers from the affliction of being unable to control the change or behave rationally when transformed, an electronic lock on a timer could keep the werewolf safely confined in a reinforced room during the critical period—no need to involve a fallible human helper. In case of a craving for raw meat, any big city has butcher shops where fresh meat of all kinds can be bought, then consumed in the privacy of the home. Or maybe a discerning werewolf would order exotic cuts online (venison? buffalo?). Interesting side note: Poul Anderson wrote a couple of stories about werewolves who stay rational in wolf form but need moonlight to transform. They carry flashlights that simulate moonlight, so that they can change shape by shining the artificial moonlight on themselves.

Would the Internet and social media make contact with friendly extraterrestrials easier or harder to adjust to? The news of their arrival, with visual recordings, would be transmitted around the world instantly. On the other hand, given the ease of faking photos and videos, would much of the public think it's a hoax at first? How long would it take for governments and mainstream news media to convince most of their constituents that the landing really happened? Considering we still have believers in a flat Earth and disbelievers in the moon landings, some people might never accept the existence of aliens.

Vampires on FOREVER KNIGHT worried about photographs, because they could alter human memories by hypnosis, but memory erasure didn't stick if the victims had physical evidence to reinforce their awareness of the truth. One might think social media would pose a serious danger to the anonymity of vampires, werewolves, and other monsters. Again, I think the risk isn't that high, because audiences have solid reasons to be cynical about visual "proof." Anybody who isn't already predisposed to believe in the supernatural would probably dismiss pictures or videos as staged, photoshopped, or both. If vampires WANTED to come out in public (as in the Sookie Stackhouse series and its TV adaptation, TRUE BLOOD), they might have as much trouble getting the world to believe in them, at first, as visiting aliens would.

Margaret L. Carter

Carter's Crypt

Thursday, November 09, 2017

Spoilers

Once upon a time, the only way to watch old movies was to wait for them to show up on late-night television or possibly on weekday afternoons in lieu of soap operas. And those were OLD films. TV channels didn't start airing more recent movies in prime time slots until sometime in the 1960s, if I recall correctly. (I remember what an exciting novelty the feature "Monday Night at the Movies" was.) We had three television networks (aside from the few people who went to the trouble of installing UHF reception equipment). If you didn't catch an episode of a show, you'd simply missed it and had to hope a rerun would eventually appear. I remember wanting to see the episode of the one-hour TWILIGHT ZONE featuring Hitler's ghost and being bitterly disappointed that I managed to miss it each time it was on. (About fifty years later, I finally viewed it by buying the DVD of the season.) All we knew in advance about TV shows was what we read in the newspaper TV schedule blurbs. The only prior knowledge of movies came from theater previews, studio ads, or maybe information that "leaked" in magazines for fans. So getting "spoiled" with plot details was practically impossible.

Nowadays, of course, we exist in a media environment that's the extreme opposite. Thanks to the Internet and cable, it's almost impossible to avoid spoilers. The era when an entire audience waited week by week to watch each new episode of a program at the same time has vanished. Fans view shows on demand, in some cases even before broadcast. This past Sunday, for instance, a fellow OUTLANDER fan mentioned to me that she planned to watch the latest episode during the day, several hours before its official network debut in the evening. People "binge-watch" entire seasons within a span of hours. We can buy recordings of programs and movies to watch over and over, memorizing every detail of our favorites. If we want to avoid surprises and see an episode or movie "unspoiled," simply not reading reviews isn't enough. We have to purposefully stay away from social media, online fan discussions, entertainment news sites, anything that might reveal what we don't want to know.

Some classics carry their own inherent "spoilage," because their basic premise pervades our culture, even among people who've never read the books or seen adaptations of them. Everybody knows Frankenstein created a monster and Count Dracula is a vampire. The first readers of those books upon original release didn't, unless they'd picked up reviews first. Adaptations of DR. JEKYLL AND MR. HYDE always show the doctor's fateful transformation early in the story; in Stevenson's novella, the truth about Hyde is a mystery not solved until near the end. TV Tropes has a page about this phenomenon titled, "It Was His Sled," referring to the revelation in the final scene of CITIZEN KANE that's no longer a secret to anybody with even a casual knowledge of classic films.

Personally, I don't mind being spoiled, except maybe in the case of mysteries. The first time around, I don't want to know in advance who the murderer is. Even in that genre, though, I reread and re-view my favorites. There's so much more to the enjoyment of a story than being surprised. On the second and subsequent passes, one has the pleasure of noticing the clues and how they fit together to lead up to the revelation, something we couldn't fully appreciate the first time through. As C. S. Lewis says somewhere, we're not looking so much for surprises as for "a certain surprisingness." Knowing what's coming can actually enhance the pleasure of the suspense. Sometimes I want to know just enough about the ending to be sure my favorite characters survive. When the catastrophic series finale of FOREVER KNIGHT aired, I was glad I'd read a summary of the plot in advance, because the knowledge let me brace myself for the worst; upon actually watching the episode, I was able to think, "That wasn't quite so bad as I expected." And once we're no longer consumed with the drive to find out what happens next, we can savor other aspects of the story: its themes and characters.

In AN EXPERIMENT IN CRITICISM, C. S. Lewis says that an invariable trait of what he calls "unliterary" readers (casual readers, who would find our devoted absorption in books bewildering) is that they never voluntarily read anything more than once. True book-lovers, on the other hand, often read their favorites multiple times over the years. How do you feel about being "spoiled"? Do you want to know nothing at all in advance? A tagline of TV GUIDE length? A back-cover blurb? Or do you not mind knowing some details of the plot or even a hint about the ending?

Margaret L. Carter

Carter's Crypt

Thursday, July 20, 2017

The Metamorphosis of Journalism

Earlier this year, the Toronto STAR ran an article by Catherine Wallace, winner of the 2016-2017 Atkinson Fellowship (a journalism award), about the whirlwind changes currently happening in the field of journalism:

Journalists Are Vanishing

The traditional media outlets, especially newspapers, are no longer the only source of news. For many people, they aren't the primary source, and some don't read old-fashioned newspapers at all (a practice that seems incredible to me—give up my morning papers? never!). The traditional media used to be the "gatekeepers" of information, as Wallace puts it. Now we get news and opinions from many different sources in addition to printed papers, not only broadcast programs (TV and radio) but a variety of Internet formats such as blogs, Facebook pages, Twitter feeds, and videos filmed by ordinary citizens. In Wallace's words, "My smartphone is a 24-hour news feed — a newspaper, magazine, computer, radio, TV and town square in a single mobile device." The Internet has blurred if not abolished the distinction between content providers and audience. Journalism is "no longer an industry, now an ecosystem." What have we lost or gained with the passing of the former status quo?

The Internet makes it possible for anyone to publish anything. Wallace applauds the "democratization of news and information." We can all express our opinions publicly. The news "ecosystem" has become diverse rather than monolithic. We have "countless witnesses to big events" instead of just the official line.

On the negative side, she mentions the loss of jobs in the field of journalism, a decline that endangers the objectivity we used to expect from the traditional news media. The Internet is swamped by information, but much of it is "raw." Traditionally, reporters and editors made sense of this flood of information (and misinformation). And then there's the "bubble" effect (though Wallace doesn't use that term), in which it has become too easy to surround ourselves with information and opinion sources that reinforce what we already believe. We're in danger of consuming "fake news" and "alternative facts" without checks and balances. Wallace draws particular attention to the role of traditional news sources in reporting on local community events and issues. That's one reason why I'll never drop our subscription to the local paper, even though, since it was bought by the company that owns the Baltimore SUN, the two publications print a lot of the same articles.

Wallace's long essay contains lots of thought-provoking observations and is well worth reading in its entirety.

Margaret L. Carter

Carter's Crypt

Thursday, September 15, 2016

Privacy Under Siege?

Speaking of privacy, as Rowena's recent post does: Cory Doctorow's column in the latest LOCUS delivers warnings about privacy threats from the Internet and the cutting-edge "Internet of Things."

Privacy Wars

Doctorow discusses the "absurd legal fiction" of the ubiquitous "notice and consent" requirement. You know, those policy statements and conditions of use for which we have to check "accept" before we can run software or access certain web content. As Doctorow points out, nobody can really read all that stuff. To do so in detail with every device or program would eat up most of our waking hours. Yet by checking "accept," we often give permission for all sorts of tracking software to interact with our computers and phones, without even realizing we've done so. Pokemon Go players probably realize the game "knows" where they are at all times, but they accept that knowledge as part of the cost of playing the game.
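Just how impossible is it to read all that stuff? A back-of-the-envelope calculation makes the point. The numbers below are assumptions of mine, roughly in line with published academic estimates, not measurements:

# All figures are assumptions for illustration, not measured data.
policies_per_year = 1400      # sites, apps, and devices encountered yearly
words_per_policy = 2500       # a typical privacy-policy length
reading_speed_wpm = 250       # average adult reading speed

minutes_per_policy = words_per_policy / reading_speed_wpm   # 10 minutes
hours_per_year = policies_per_year * minutes_per_policy / 60
print(f"{hours_per_year:.0f} hours per year")               # about 233
print(f"{hours_per_year / 8:.0f} eight-hour workdays")      # about 29

On those assumptions, genuinely reading everything we "accept" would swallow roughly a month of full-time workdays every year.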

I don't own a smartphone and never plan to get one (unlike my husband, who upgraded to such a device a while back). So at present my activities and movements in the physical world can't be tracked by any incarnation of Big Brother, public or private (and isn't it interesting that Orwell envisioned an all-seeing government, yet nowadays it's mainly commercial entities that observe us?). I'd direly miss the convenience of ordering from my regularly visited websites without having to enter my information every time, though. And it's a great boon, when I'm not sure whether I own a copy of a certain book, to learn from a glance at the Amazon book page whether I've bought it already. To get that convenience, we have to accept cookies and all that comes with them.

Doctorow's vision of the totally connected future takes on an apocalyptic tone, as in this paragraph:

"You will ‘interact’ with hundreds, then thou­sands, then tens of thousands of computers every day. The vast majority of these interactions will be glancing, momentary, and with computers that have no way of displaying terms of service, much less presenting you with a button to click to give your ‘consent’ to them. Every TV in the sportsbar where you go for a drink will have cameras and mics and will capture your image and process it through facial-recognition software and capture your speech and pass it back to a server for continu­ous speech recognition (to check whether you’re giving it a voice command). Every car that drives past you will have cameras that record your like­ness and gait, that harvest the unique identifiers of your Bluetooth and other short-range radio devices, and send them to the cloud, where they’ll be merged and aggregated with other data from other sources."

Do you think our digital footprints will, on a practical level, become that detailed and all-pervasive anytime in the near future? What company or agency would have the time, resources, or motivation to aggregate and make active use of so much miscellaneous data? On the other hand, I agree with Doctorow that the mere fact of having all this information unguardedly accessible SOMEWHERE is frightening.
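To be fair to Doctorow's scenario, the "merged and aggregated" step he describes is technically trivial once the data exists. Here is a minimal sketch in Python, with all identifiers, sources, and fields invented for illustration, of how sightings keyed on a device ID collapse into a movement profile:

from collections import defaultdict

# Invented sighting records from different sources (a bar's TV, passing cars).
sightings = [
    {"bt_id": "AA:BB:CC:01", "source": "sportsbar-cam-7", "time": "19:02"},
    {"bt_id": "AA:BB:CC:01", "source": "car-396-camera", "time": "19:14"},
    {"bt_id": "DD:EE:FF:02", "source": "car-112-camera", "time": "19:15"},
]

# Aggregation is just a group-by on the stable identifier.
profiles = defaultdict(list)
for s in sightings:
    profiles[s["bt_id"]].append((s["time"], s["source"]))

for device, trail in profiles.items():
    print(device, "->", trail)

The expensive part is collecting and storing the sightings, not combining them.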

Coincidentally, in an interview in the same issue of LOCUS, Charles Stross speculates on the benefits and potential hazards of living surrounded by interactive objects. He narrates an anecdote from the pioneering days of microprocessors, back in the 1970s. Someone joked that eventually the chips would become so cheap we'd put them in doorknobs. Everybody laughed. If you've stayed at a hotel lately, you've routinely encountered computerized door locks. Stross proposes the example of replacing city sidewalk pavement with stones containing chips that have "the equivalent of an iPhone 4 in computing power." Then suppose most pedestrians are wearing clothes with radio ID tags designed to interact with the washing machine for optimal cleaning—which incidentally also contain unique identifying data. If a person collapses from a heart attack, the sidewalk could summon an ambulance instantly. But a fully networked city could also track us everywhere we go.
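Purely as a thought experiment, here is a tiny Python sketch of what a networked paving stone's event handler might look like. Every name, field, and threshold below is invented; nothing here describes any real system:

from dataclasses import dataclass

@dataclass
class PaverEvent:
    paver_id: str    # which sidewalk stone took the reading
    tag_id: str      # radio ID tag read from a passerby's clothing
    timestamp: float # seconds since some epoch

movement_log = {}  # tag_id -> list of (timestamp, paver_id): the tracker
current_spot = {}  # tag_id -> (paver_id, first_seen_there): the lifesaver

def handle(event: PaverEvent) -> None:
    paver, since = current_spot.get(event.tag_id, (None, event.timestamp))
    if paver == event.paver_id:
        # Benign use: a tag that hasn't moved off one stone in two
        # minutes may mean its wearer has collapsed.
        if event.timestamp - since > 120:
            print(f"ALERT: possible collapse at paver {event.paver_id}")
    else:
        current_spot[event.tag_id] = (event.paver_id, event.timestamp)
    # Troubling use: the very same reading extends a movement history.
    movement_log.setdefault(event.tag_id, []).append(
        (event.timestamp, event.paver_id))

The point of the sketch is that there's no separate "surveillance mode": one stream of readings serves the ambulance and the tracker alike.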

Forsooth, smart technology can indeed be a mixed blessing.

Margaret L. Carter

Carter's Crypt

Tuesday, August 23, 2016

Incorrect Information About You As a Writer May Turn Up In A Search Engine

WRITERS BEWARE:

Readers used to "vet" you by looking you up in a "Who's Who" or similar paper book brought out front by a reference librarian (I'm in a lot of them).  Some of that info, printed on paper, would be out of date, and some of it never was true.

The same thing happens on the internet, only sometimes the info page is not even dated.  What you read must be kept in your tentative file and not believed until confirmed by direct contact with the individual involved.

We all know better than to believe anything found on the internet.  Except that sometimes you're in a hurry, find something that looks legitimate, and just use it as if it were true, or the whole truth.  Much of what is written about any public figure will have been written by enemies of that public figure.
Published writers are not immune to this phenomenon.

So READERS BEWARE.  

Here is an example that illustrates how a well-known name can be used as click-bait for a scheme selling information about internet figures, people known on social media to be influential.  It is possible this site collects info with bots, then resells it to advertisers.  They did not consult me before excerpting these items.

A friend of mine found my name used thusly:

http://www.zoominfo.com/p/Jacqueline-Lichtenberg/1625153

It appears some of this was lifted from classmates.com, ringsurf.com, and other subscription services -- where I often fill in forms with untrue info because it's none of their business. Some of it is accurate, scraped from my sites (Facebook, simegen.com) by some kind of "bot" that really does not know how to read!  Some of it was true once but is now out of date.  All of it is online somewhere, or was at one time or another.
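For the record, such a bot needn't be anything sophisticated. A minimal sketch of the kind of scraper I mean might look like the following Python; the URL is a placeholder, and this is a generic illustration, not how any particular site actually operates:

import requests
from bs4 import BeautifulSoup

def scrape_bio_lines(url: str) -> list[str]:
    """Naively harvest paragraph text from a public web page.

    The bot copies whatever words it finds; it has no way to know
    whether they are current, accurate, or deliberately false.
    """
    page = requests.get(url, timeout=10)
    soup = BeautifulSoup(page.text, "html.parser")
    return [p.get_text(strip=True) for p in soup.find_all("p")]

# Placeholder usage: point it at any public author-bio page.
for line in scrape_bio_lines("http://example.com/author-bio"):
    print(line)

Text gathered that way gets mixed, re-sorted, and republished with no human judgment applied anywhere along the line.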

What I'm posting here today is true as of August, 2016.

Note the copyright listings at the bottom of that page. Obviously, they are trying to cover the legal formalities.

There is true "information" on that page mixed with information that is not true or way out of date.

Yes, I write books, and those are titles of mine -- but newer information exists.

Here is where to get updated information and contact information:

You can find me on Facebook

https://www.facebook.com/jacqueline.lichtenberg

And the Sime~Gen Group, where it's easy to talk with me directly and verify information:

https://www.facebook.com/groups/SimeGen/

You can talk to me on Twitter:
https://twitter.com/JLichtenberg

You can find me on LinkedIn
https://www.linkedin.com/in/jacquelinelichtenberg

I'm on Blogger:
http://aliendjinnromances.blogspot.com/

http://out-territory.blogspot.com/

http://dushau.blogspot.com/

http://makingangelsthemovie.blogspot.com/

http://editingcircle.blogspot.com/

I'm also on a number of chat services such as WhatsApp.

The master biography/bibliography that I update is:

http://simegen.com/bios/jlbio.html

There is a Sime~Gen wiki, currently firmly edited by people who know what they're talking about, but it will eventually be open to additions and embroidery by casual readers trying to be helpful.  By then, we expect to have a paper printing of the wiki's information, a Concordance of Sime~Gen, which will be authoritative.

http://simegen.wikia.com/wiki/Main_Page

You can find my page on Amazon, where you can use the "follow" button to get notified of new titles:
http://www.amazon.com/Jacqueline-Lichtenberg/e/B000APV900/

Or focus on the Sime~Gen Series:
http://www.amazon.com/Sime-Gen-13-Book-Series/dp/B016QAFPMK/

Here is a documentary on French TV that has a few clips of me, and discusses my Star Trek series, Kraith:
http://www.france4.fr/emission/fanfiction-ce-que-lauteur-oublie-decrire

And here is Kraith for free reading:
http://www.simegen.com/fandom/startrek/kraith/

And of course, I own and update my own domain:

http://jacquelinelichtenberg.com

So if you find info on me you want to quote, check with me first.

Jacqueline Lichtenberg