
Thursday, June 08, 2023

Existential Threat?

As you may have seen in the news lately, dozens of experts in artificial intelligence have supported a manifesto claiming AI could threaten the extinction of humanity:

AI Could Lead to Extinction

Some authorities, however, maintain that this fear is overblown and "a distraction from issues such as bias in systems that are already a problem" and other "near-term harms."

Considering the "prophecies of doom" in detail, we find that the less radically alarmist doom-sayers aren't talking about Skynet, HAL 9000, or even self-aware Asimovian robots circumventing the Three Laws to dominate their human creators. More immediately realistic warnings call attention to risks posed by such things as the "deep fake" programs Rowena discusses in her recent post. In the near future, we could see powerful AI "drive an exponential increase in the volume and spread of misinformation, thereby fracturing reality and eroding the public trust, and drive further inequality, particularly for those who remain on the wrong side of the digital divide."

On the other hand, a member of an e-mail list I subscribe to has written an essay maintaining that the real existential threat of advanced AI lies not in openly scary menaces but in irresistibly appealing cuteness:

Your Lovable AI Buddy

Suppose, in the near future, everyone has a personal AI assistant, more advanced and individually programmed than present-day Alexa-type devices. This handheld, computerized friend would keep track of your schedule and appointments, preorder meals from restaurants, play music and stream videos suited to your tastes, and maybe even communicate with other people's AI buddies. Beyond that, "It knows all about you, and it just wants to make you happy and help you enjoy your life. . . . It would be like a best friend who’s always there for you, and always there. And endlessly helpful." As he mentions, present-day technology could probably create a device like that now, and soon it would be able to look much more lifelike than current robots. Users would get emotionally attached to it, more deeply than with presently available lifelike toys. What could possibly be the downside of such an ever-present, "endlessly helpful" friend or pet?

Not so fast. If we're worried about hacking and misinformation now, think of how easily our hypothetical AI best friend could subtly shape our view of reality. At the will of its designers, it could nudge us toward certain political or social viewpoints. It could provide slanted, "carefully filtered" answers to sensitive questions. This development wouldn't require "a self-aware program, just one that seems to be friendly and is capable of conversation, or close enough." Building on its vast database of information collected from the internet and from interacting with its user, "It wouldn’t just be trained to emotionally connect with humans, it would be trained to emotionally manipulate humans."

In a society where a product like that is nearly ubiquitous as well as almost omniscient, the disadvantaged folks "on the wrong side of the digital divide" who couldn't afford one might actually be better off, at least in terms of privacy and personal freedom.

Margaret L. Carter

Carter's Crypt

Thursday, October 07, 2021

Astro the Robot

Amazon has introduced a household robot called Astro, described as about the size of a small dog. It's "Alexa on wheels," but a bit more:

Amazon Robot

Astro can roll around the house with its camera raised on a 42-inch arm, enabling you to keep an eye on children from another room. Or you can view your home remotely when you're away; you might use this feature to check on a vulnerable family member who lives alone. Like a tablet, it can play videos and access the internet. Like Alexa, it can answer questions. Its screen can be used for video chatting.

It can't navigate stairs, although (like the Roomba) it knows not to fall down them. Unfortunately, it can't pick things up; I suspect that ability will come along sooner or later. It can, however, carry small objects from room to room if a human user loads them, and facial recognition allows Astro to deliver its cargo to another person on command. It could be remotely directed to take medication or a blood pressure cuff to that elderly relative who lives by herself.

Amazon's goal is for Astro to become a common household convenience within ten years. Even if you have $999 to spare, though, you can't order one right now; the device is being sold only to selected customers by invitation. Amazon's vice president of product says the robot wasn't named after the Jetsons' dog. The first namesake that occurred to me, however, was Astro Boy, the robot hero of a classic early anime series.

Considering the way people talk to their pets as if the animals can understand, I can easily imagine an owner carrying on conversations with Astro almost like an intelligently responsive housemate.

Margaret L. Carter

Carter's Crypt

Sunday, February 10, 2019

Don't Read Aloud

Why would you spend your own money to purchase and install a listening device in your home that the government can use against you, and that you cannot turn off?

If you are an author, and you have a listening device in your home, be careful what you read aloud. A government listener might not understand that you are reading a work of fiction.

For that matter, an artificial intelligence device that is designed to analyse your voice and extrapolate your mood, your frame of mind, and whether you are vulnerable to a sales pitch for catheters, chocolates, or condoms (or live ammunition) might bombard you with targeted advertisements.

It's not illegal to identify someone who might be suckered into binge shopping.
Your mood is not protected by privacy laws.

What if Alexa gets it wrong (for instance, if you are reading aloud the darkest moments of a fictional heroine) and tells the government that you are suicidal and a danger to yourself and to society? You might not get that gun permit. You might suddenly find that the local pharmacy will not allow you to fill a strong pain prescription for your sick parent.

Belle Lin, writing for The Intercept, shares a lot of scary info.

This targeted advertising might be inherently problematic. Perhaps landlords could use it to make sure that their high-end condo properties are advertised only to highly educated, natural blondes with Elizabeth Hurley accents.

If the bot that filters and triages your phone call to your bank or brokerage house tries to bully you into signing up for the ease and security of "voice recognition," think twice.

Once your voice is in a database, law enforcement can get it, too. Your voiceprint can be matched with any other conversation "you" might have anywhere at all.

This week, Apple realized that some app developers were able to capture a lot more information than they ought to have, through a screen-reading app.

As Olivia Tambini explains, there was allegedly a North American airline whose customers' passport numbers and credit card information were exposed.

Legal blogger Haim Ravia summarizes the month's top privacy news for law firm Pearl Cohen Zedek Latzer Baratz, touching on espionage by smartphone, whether law enforcement may compel suspects to use biometrics to unlock private smartphones, and whether amusement parks can be successfully sued for collecting fingerprints.


The crux of the problem with collecting biometric data without permission, and perhaps with secretly recording people in their own homes, is the Fifth Amendment (the American citizen's right not to incriminate oneself).

On the other hand, anything you say when speaking to Alexa apparently counts as talking to Amazon, and it is not protected if Amazon (as one party in the conversation) elects to reveal what you said.

All the best,
Rowena Cherry