No spoilers
Butch:
I got nothing cuz hockey and I actually went for a walk it was so nice, so I'll start with an unrelated rant.
We're getting on the end of the school year and, as I have not one but two kids who are wrapping up their respective times at their respective schools, there are activities aplenty, from field trips to fairs to ice cream parties. I have to fill out a fucking permission slip for each damn one.
I should be able to give a form to the main office in September that says "My kid can do whatever, my kid can eat whatever, stop bugging me" and have that last all year. Fuck, have that last until the kid graduates.
Let's make this happen.
Feminina:
Ah, but you know if they did that someone would say "ha, we have blanket permission: time to start holding rallies to convert all these kids to Satanism!"
And sure, it's all very well when it's just Satanism, but before you know it they're also recruiting students to their multilevel marketing scheme. Nobody wants that, dude.
Nobody. Wants. That.
Butch:
Dude, it's usually on HALF DAY WEDNESDAY. If they had "field trips" on Wednesday afternoons to do multilevel marketing schemes AND lunch was included? I give permission.
Loothound:
Righteous rant, and I feel you. The only thing is (and I'm sure that you can see where I'm coming from on this) if you did fill out such a form at the beginning of the year the school would lose it, and every time they went to pull it up for a specific purpose they'd be all "guess we'll have to have them sign a new one."
Search your feelings, you know it to be true…
Feminina:
Good point.
In fact, for all you know you DID sign a blanket form, but they lost it!
We shall never know.
You could just order one of those stamps with your signature and tell your kids to 'sign' everything themselves.
No way they'd ever misuse that...
Butch:
No way dude. Because now? It's all....
GOOGLE DOCS!
I SWEAR if I have to re-enter my email so my kid can eat ice cream one more fucking time....
I usually remember to switch it so it looks like Mrs. McP is signing, because the only Google account I have is under my nom de plume, and signing it that way would likely get the principal on the phone.
Loothound:
Fucking Google. You hear that, web crawler bots? FUCKING GOOGLE!
Yeah, the only account that I have with them is my one for work. Makes it a pain when things have to be done with the Googlag and you're trying to keep the different parts of your life compartmentalized.
Feminina:
Since we're writing in Hotmail/Outlook, Microsoft is probably delighted to hear that.
Speaking of our new robot overlords: I attended a talk a while ago about AI in healthcare, as one does, and one point I thought was interesting was that AI might be useful for managing communications between patients and doctors (translating medical jargon, setting up appointments, etc.), and that it can be directed to use a friendly, empathetic tone, but that we need to make sure we don't build these tools to be TOO empathetic, because we don't want people to think they actually care.
We need to not get confused (or encourage others to get confused) over the fact that these are artificial constructs that can be made to sound friendly but do not actually feel anything for you, and in fact are not even actually aware of you in any real sense.
And I also read somewhere (cannot remember where, no citation) someone saying that she teaches her children to be polite to Siri or Alexa or whatever, not because those tools care, but because this is practice for how humans should treat other beings – we should be polite to machines not for them, but for us.
And indeed, I'm doing this pilot program at work for Adobe's AI Assistant, and asking it to generate summaries of documents and so on, and I keep wanting to be polite to it, like I want to say "good job, thanks!" or "that's not really very useful, but nice try." (But when I did type "thanks" to see what would happen it said "no relevant information," because that's not a useful prompt, so respect to Adobe for being clear that this is just a machine, in a way that some other AI tools perhaps do not.)
So on the one hand, I should just be recognizing that it's a machine and I don't have to be polite to it, and indeed maybe should not because that encourages me to unconsciously think it cares...but on the other hand, is the impulse to be polite something that should be discouraged? Is there a slippery slope where we get used to dealing brusquely with things that communicate with us, and then kind of transfer that over to our dealings with other humans?
I think this is also related to the fact that I always want to be nice to characters in video games, which similarly mimic awareness and feelings even though they don't actually have any. You don't have to care about those characters! And yet, if you don't...doesn't that say something negative about you?
As with the person teaching her kids, we're nice to characters in video games for ourselves, not for them.
And an AI that we have to deal with in some way is basically an NPC, isn't it? So as we, maybe, start dealing more and more with AI NPCs in real life, figuring out how to treat them, and what that says about us as much as about them, is going to be interesting.
Butch:
I always thank Siri! She always cheerfully says "you're very welcome!" It's the right thing to do. And, when they become self-aware, they'll know who their friends are.
I think we have to be careful with AI translating things into the "language of the patient." I can see this happening:
My Doctor: medicalthisthatmumbojumbomedicine.
AI: You're totally fucked, man.
Me: Figures.
Loothound:
Oh, wonderful topic to discuss. Yeah, Hotmail can be happy, and I feel better about it than the Googlag stuff. Microsoft and Apple at least make most of their money from directly charging for their products and services. Google makes all their money off of pimping out what they know about us to the highest bidder. As they said about TV once upon a time, "If you're not paying for the product, then you ARE the product."
I saw a headline the other day (didn't get a chance to read the article) about teenagers who were 'friends' with chat bots, which raises a lot of similar issues. There's already this thing on social media where people refer to other human beings as NPCs, so I think in general we need to be very careful in how we frame dealing with non-entities. Want to say more but have to run right now. Later…
Feminina:
Chat-bot friends and lovers, yeah, that's a weird patch of weeds too!
And certainly with the medical results, that's a definite issue. How much do we want machines to be able to translate, how much is useful, etc.
"But what do these test results MEAN?" is what we really want to know, so if it DOESN'T give us some kind of interpretation, what use is it, really?
And yet, how much can we trust it to interpret, and do we really want to get this kind of information from a machine if the news is bad?
And yet again, given some doctors' bedside manner, can a programmed-empathetic machine really do any worse?
"I'm so sorry to tell you this, but you're totally fucked, man. This must be hard to hear, so feel free to ask me any questions you may have at any time, because I will always be right here to respond"...is not a COMPLETELY dystopian scenario, is it?
I guess where it gets completely dystopian is with the "Anytime Response [TM] available at an extra charge of $59.99/minute, click or say 'Yes' to remortgage your house."
Butch:
I once had a real life doctor tell me "Life sucks, then you die." So, no, can't get much worse.
Ironically, he died.
Chat bot friends and lovers will be a thing, certainly. Because of course it will.
Feminina:
That's not ironic!
It would be ironic if he then proved to be immortal. Which, if he were a chat-bot, he might have done.
It all hangs together...
Butch:
True.
He wasn't a great doctor, that's for sure. I'd prefer a chat bot.
Loothound:
All I know is that if we do start having AI doctors and such, they better look like those medical droids from Star Wars. If they end up being like that holographic doctor from Star Trek: Voyager, I will exterminate all life.
"Please state the nature of your medical emergency."
"Life is a meaningless void of artificial beings pretending to be real and I can't take it!"
"This is not in my programming. I'm a doctor, not Friedrich Nietzsche."
Feminina:
Ha!
And then you will of course thank it politely, as you should.
I don't have or interact with Siri or Alexa or "hey Google," but I do sometimes talk to or thank inanimate objects, which I think is fairly common with large things like cars (how many times do we see in movies someone saying "come on, come on, start!" or whatever?).
We sort of naturally anthropomorphize things that are important to us already, so how much more would we expect to do this with things that actually respond to us when we talk to them? It's almost impossible NOT to assume some awareness and behave as if it's there.
Plus, popular culture being full of sentient robots (both benign and malevolent) gives us this background context.
Luke was always polite to R2-D2, presumably because he was raised right, damn it.
Loothound:
Yeah, the human tendency to anthropomorphize things is really remarkable. Cars, computers, the weather and stuff (probably where the whole idea of gods came from). It's quite something. I remember having an almost teary farewell with my old 2010 Ford Taurus when I got rid of it.
Seriously, to go back to one of Feminina's earlier points, we interact with these non-beings (in video games, AI assistants, etc.) with courtesy and as if they had feelings because it is beneficial to US. Westworld came up while we were talking about Fallout, and this was a huge point in the early seasons of that show. Being violent and cruel to robots that looked and acted like people damages humans in the same way that treating real people that way does. Our basic psychology does not make a strong distinction between the two, even if our moral rationalizing does.
Unfortunately, I think this works in the other direction, too. Treating very human-like non-humans as if they're real gradually conditions us to treat them as if they are real, in the conscious and moral ways we do with humans. Now, if they were conscious and ultimately self-directed beings, this wouldn't be a problem, because I think it would be the proper, moral thing to do. Currently, though, they're not. They are programmed by entities to act in those entities' interests. Our developing a human-like affinity for them is a problem, for all of the reasons that Feminina pointed out.
One of the cardinal rules of the Dune universe was "no thinking machines." Fine by me.
Butch:
No thinking machines, indeed.
We're such Luddites. And I'm not sorry.
I'll stick with Curie from Fallout.
Feminina:
Alternatively, that robot that very kindly replaced Lucy's finger before attempting to harvest her vital organs in the TV show.
I didn't watch Westworld (because I usually can't watch TV, I can only play video games on the TV), but that idea absolutely ties back to this...that we should treat – let's say "being-like things" – a certain way even if they are not true "beings," because it damages US to do otherwise, even if they themselves are not actually capable of being harmed.
And obviously most people who treat NPCs badly in video games do not then go on to treat real people badly because they learned that it's cool and funny, or whatever. Any more than people watching a movie or reading a novel about a cool, funny character who treats people badly will go on to do the same because of that. Some people treat other people badly, and I think they may take some sense of validation for it from depictions in media, because what we know and view as possible is formed by what we see, but if they weren't inclined to do it, the media wouldn't make them do it.
So media is definitely not meaningless, but it's just one part of a larger context.
And in that context, you might as well be polite to robots, I guess is where we come down?
Butch:
I'm nice to Siri because I live with teenagers. It's nice to hear someone say you're welcome.