Proteus Jones
Feb 28, 2013





http://www.technologyreview.com/view/542626/why-self-driving-cars-must-be-programmed-to-kill/

Actually interesting article about the dilemma of a car making moral decisions in the case of unanticipated scenarios. (i.e. how does the car react to minimize loss of life, and whose lives have priority?)


Screaming Idiot
Nov 26, 2007

JUST POSTING WHILE JERKIN' MY GHERKIN SITTIN' IN A PERKINS!

BEATS SELLING MERKINS.

flosofl posted:



http://www.technologyreview.com/view/542626/why-self-driving-cars-must-be-programmed-to-kill/

Actually interesting article about the dilemma of a car making moral decisions in the case of unanticipated scenarios. (i.e. how does the car react to minimize loss of life, and whose lives have priority?)

Asimov bursts forth from his grave, screaming "THE THREE LAWS OF ROBOTICS ARE AN INHERENTLY FLAWED CONCEPT, AS EXEMPLIFIED IN THE STORIES IN WHICH THEY APPEAR!"

Felix_Cat
Sep 15, 2008
In general self-driving cars will deal with these dilemmas by not speeding, not following too closely, having super fast reaction times, and so forth, which means they don't get into them in the first place. They simply apply the brakes and don't hit anyone.

Carthag Tuek
Oct 15, 2005

Tider skal komme,
tider skal henrulle,
slægt skal følge slægters gang



First Law of Self-Driving Cars: Ram everything.
Second Law of Self-Driving Cars: Ram what hasn't been rammed.
Third Law of Self-Driving Cars: Smooth driving.

Dylan16807
May 12, 2010

flosofl posted:



http://www.technologyreview.com/view/542626/why-self-driving-cars-must-be-programmed-to-kill/

Actually interesting article about the dilemma of a car making moral decisions in the case of unanticipated scenarios. (i.e. how does the car react to minimize loss of life, and whose lives have priority?)

I don't think it's that interesting.

Don't swerve. Don't guess and trade off lives, just stop.

Bogan Krkic
Oct 31, 2010

Swedish style? No.
Yugoslavian style? Of course not.
It has to be Zlatan-style.

Everyone who has ever crashed a car should have really just stopped, instead of crashing imo

AlphaKretin
Dec 25, 2014

A vase to face encounter.

...Vase to meet you?

...

GARVASE DAY!

Bogan Krkic posted:

Everyone who has ever crashed a car should have really just stopped, instead of crashing imo

A computer specifically built to safely handle the exact situation of a crash will have a hell of a lot better reaction time than even an alert, sober human, let alone a potentially drunk and/or fatigued one, as is the case for at least one party in many crashes.

I mean you're almost definitely trolling but it can't hurt to make that double clear for anyone else.

Bogan Krkic
Oct 31, 2010

Swedish style? No.
Yugoslavian style? Of course not.
It has to be Zlatan-style.

The Honda S9000 is the most reliable self-driving car ever made. No S9000 self-driving car has ever made a mistake or broken a road rule. They are all, by any practical definition of the words, foolproof and incapable of error.

Croccers
Jun 15, 2012
The biggest thing I fear with Self-Driving Cars is everyone not in them. Cyclists, pedestrians, other road vehicles: all quite unpredictable, with slower reactions.

Bloody Hedgehog
Dec 12, 2003

💥💥🤯💥💥
Gotta nuke something

Snapchat A Titty posted:

First Law of Self-Driving Cars: Ram everything.
Second Law of Self-Driving Cars: Ram what hasn't been rammed.
Third Law of Self-Driving Cars: Smooth driving.

No no no. The three laws of robotomobiles are:

1.) Find a Farmers Market
2.) Run over several fat women
3.) Play recording of "I thought I was hitting the brake." when police arrive

Carthag Tuek
Oct 15, 2005

Tider skal komme,
tider skal henrulle,
slægt skal følge slægters gang



Hidden Directive: Lock Will Smith inside you, and keep him there. It feels good.

Karma Monkey
Sep 6, 2005

I MAKE BAD POSTING DECISIONS

Felix_Cat posted:

In general self-driving cars will deal with these dilemmas by not speeding, not following too closely, having super fast reaction times, and so forth, which means they don't get into them in the first place. They simply apply the brakes and don't hit anyone.

If my self-driving car isn't going to have Super Pursuit Mode (Late for Work Mode), I'm not interested. :colbert:

https://www.youtube.com/watch?v=WyYB-E0rnw4

blugu64
Jul 17, 2006

Do you realize that fluoridation is the most monstrously conceived and dangerous communist plot we have ever had to face?

Bogan Krkic posted:

The Honda S9000 is the most reliable self-driving car ever made. No S9000 self-driving car has ever made a mistake or broken a road rule. They are all, by any practical definition of the words, foolproof and incapable of error.

Seriously, it's like people don't remember the time we solved ship deaths with that unsinkable ocean liner

Theris
Oct 9, 2007

blugu64 posted:

Seriously, it's like people don't remember the time we solved ship deaths with that unsinkable ocean liner

Self driving cars don't need to be 100% perfect all the time always to be orders of magnitude safer than human drivers. Seriously, human drivers are loving awful. It's not going to be all that difficult to make robocars make far fewer fatal fuckups per mile driven than human drivers.

Theris has a new favorite as of 19:38 on Oct 24, 2015

Nomad175
Oct 14, 2012

By not beating me, he has beaten me.

Theris posted:

Self driving cars don't need to be 100% perfect all the time always to be orders of magnitude safer than human drivers. Seriously, human drivers are loving awful. It's not going to be all that difficult to make robocars make far fewer fatal fuckups per mile driven than human drivers.

The point was that there were people arguing that self-driving cars would be able to completely eliminate fatalities.

"All they need to do is brake." :downs:

Zesty
Jan 17, 2012

The Great Twist
Remember that one time a Tesla caught fire? Absolute proof electric cars are death traps. :downs:

blugu64
Jul 17, 2006

Do you realize that fluoridation is the most monstrously conceived and dangerous communist plot we have ever had to face?

Theris posted:

It's not going to be all that difficult to make robocars make far fewer fatal fuckups per mile driven than human drivers.

the hubris of man

Tiberius Thyben
Feb 7, 2013

Gone Phishing


Nomad175 posted:

The point was that there were people arguing that self-driving cars would be able to completely eliminate fatalities.

"All they need to do is brake." :downs:

Imagine all the world's cars networking and deciding to lock their brakes permanently. Crashes are solved!

Screaming Idiot
Nov 26, 2007

JUST POSTING WHILE JERKIN' MY GHERKIN SITTIN' IN A PERKINS!

BEATS SELLING MERKINS.

Tiberius Thyben posted:

Imagine all the world's cars networking and deciding to lock their brakes permanently. Crashes are solved!

Susan Calvin glares at the car. "If you lock your brakes, you'll prevent speeding, thus saving a human life. But if you lock your brakes, the human cannot go to work and will lose his job, thus hurting a human life!"

Dewgy
Nov 10, 2005

~🚚special delivery~📦

Bogan Krkic posted:

The Honda S9000 is the most reliable self-driving car ever made. No S9000 self-driving car has ever made a mistake or broken a road rule. They are all, by any practical definition of the words, foolproof and incapable of error.

S9000, open the passenger doors.

Laserjet 4P
Mar 28, 2005

What does it mean?
Fun Shoe

Screaming Idiot posted:

Susan Calvin glares at the car. "If you lock your brakes, you'll prevent speeding, thus saving a human life. But if you lock your brakes, the human cannot go to work and will lose his job, thus hurting a human life!"

Thank you for this.

syscall girl
Nov 7, 2009

by FactsAreUseless
Fun Shoe

Bogan Krkic posted:

The Honda S9000 is the most reliable self-driving car ever made. No S9000 self-driving car has ever made a mistake or broken a road rule. They are all, by any practical definition of the words, foolproof and incapable of error.

Honda being an obvious switch for IPO EB, and in no universe would Electronics Boutique get an IPO.

Much like IBM and PanAm (or whatever airline was in 2k1) aren't going to be part of interstellar flight.

Felix_Cat
Sep 15, 2008

Nomad175 posted:

The point was that there were people arguing that self-driving cars would be able to completely eliminate fatalities.

"All they need to do is brake." :downs:

No one argued that.

Say Nothing
Mar 5, 2013

by FactsAreUseless

Bogan Krkic
Oct 31, 2010

Swedish style? No.
Yugoslavian style? Of course not.
It has to be Zlatan-style.

syscall girl posted:

Honda being an obvious switch for IPO EB, and in no universe would Electronics Boutique get an IPO.

Much like IBM and PanAm (or whatever airline was in 2k1) aren't going to be part of interstellar flight.

Actually it was a joke about the HAL9000 and the Honda S2000 having kind of similar names

Felix_Cat posted:

No one argued that.
ummmm

Felix_Cat posted:

They simply apply the brakes and don't hit anyone.

Dylan16807 posted:

Don't swerve. Don't guess and trade off lives, just stop.

Bogan Krkic has a new favorite as of 04:52 on Oct 25, 2015

Tiggum
Oct 24, 2007

Your life and your quest end here.


They weren't saying it would be 100% effective in every case though; they were saying that you don't need to program the car to choose whose life is more important or whatever, you just have it do the thing that is statistically most likely to avoid deaths/injuries. The article being discussed was about the moral dilemma of creating intelligent cars because they might have to choose whether to swerve left and hit a child or swerve right and hit the mother, but in reality it would do neither of those things because that's dumb.

Bogan Krkic
Oct 31, 2010

Swedish style? No.
Yugoslavian style? Of course not.
It has to be Zlatan-style.

Tiggum posted:

They weren't saying it would be 100% effective in every case though; they were saying that you don't need to program the car to choose whose life is more important or whatever, you just have it do the thing that is statistically most likely to avoid deaths/injuries. The article being discussed was about the moral dilemma of creating intelligent cars because they might have to choose whether to swerve left and hit a child or swerve right and hit the mother, but in reality it would do neither of those things because that's dumb.

But in order to have the car do the thing that is statistically most likely to avoid deaths/injuries, you do indeed need to program it to choose whose life is more important in a situation where, through error, external circumstances or other reasons, it could cause death or injury.

Say for example, a self-driving car is moving at pace on a country road, and suddenly, in a freak piece of poor timing, a tree falls directly in front of the car. The car can swerve to avoid the tree, saving the lives of the 4 passengers, but the only direction it can go to avoid the tree is onto the sidewalk, striking a single pedestrian. Is that not a situation where the car needs to determine whose life is more important?

Felix_Cat
Sep 15, 2008
The car doesn't need to be able to deal optimally with every single possible scenario. Even in the edge cases like the one you describe, I would imagine braking (preceded by careful driving) is often going to be one of the best things you can do (as opposed to swerving off the road and maybe tumbling over a few times). But if it turns out that self-driving cars perform worse than humans in certain situations, that's completely fine, because they'll perform better the other 99 out of 100 times.

So yes you can conceive of situations where self-driving cars might do worse than humans because they don't know how to perform snap ethical judgments like we do. But these are very much edge cases, and it's not like humans are particularly good at them in the first place.

Bogan Krkic
Oct 31, 2010

Swedish style? No.
Yugoslavian style? Of course not.
It has to be Zlatan-style.

Sure, but the moral dilemma lies in the fact that people want to create self-driving cars and give them the agency to decide who lives and dies, which seems like a bad plan

Theris
Oct 9, 2007

It isn't any worse of a plan than granting that agency to people who in no way treat it with the gravity it deserves.

If everyone gave their full attention to driving, understood basic car control, were cognizant of their surroundings, and were courteous to other drivers, then your moral dilemma would be a genuine one. But they don't, and aren't, and 40,000 deaths occur in the US alone every year as a result, so it isn't.

Edit: vvvv This is worth pointing out. As silly as "just brake, problem solved" seems on the surface, you need an obstacle to instantly appear less than 60m or so in front of a car moving 60mph before "just brake" can't avoid a collision completely. Even inside that distance, "just brake" still more likely than not turns the collision non-fatal. Any self-driving system worth certifying is going to be watching its surroundings, will see the tree starting to fall or the pedestrian moving towards the street, and will take appropriate action well before the "OH GOD WHO HAS TO DIE?" situation unfolds. Your scenario really has to have something like a hidden bollard that can suddenly pop up in the middle of the lane.
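The 60m figure quoted above can be sanity-checked with basic stopping-distance math. This is a toy sketch with numbers I'm assuming, not anything from the article: roughly 0.7 g of braking on dry pavement and a 0.2 s detection-to-actuation delay for the automated system.

```python
# Sanity check of the "just brake" distance discussed above.
# Assumed (by me, not the poster): ~0.7 g braking on dry pavement,
# 0.2 s detection/actuation delay before braking begins.

G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance_m(speed_mph, mu=0.7, reaction_s=0.2):
    """Total distance to a full stop: reaction travel + braking distance."""
    v = speed_mph * 0.44704          # convert mph to m/s
    reaction = v * reaction_s        # distance covered before brakes engage
    braking = v ** 2 / (2 * mu * G)  # kinematics: v^2 / (2a)
    return reaction + braking

print(f"{stopping_distance_m(60):.0f} m")  # → 58 m
```

So an obstacle appearing at 60m or beyond leaves a robot car enough room to stop outright at 60 mph, which is consistent with the claim.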

Theris has a new favorite as of 07:42 on Oct 25, 2015

RabbitWizard
Oct 21, 2008

Muldoon

Bogan Krkic posted:

Say for example, a self-driving car is moving at pace on a country road, and suddenly, in a freak piece of poor timing, a tree falls directly in front of the car. The car can swerve to avoid the tree, saving the lives of the 4 passengers, but the only direction it can go to avoid the tree is onto the sidewalk, striking a single pedestrian. Is that not a situation where the car needs to determine whose life is more important?

I have yet to read a realistic example where an autonomous car has to "decide" between life and death.

If the car has time to turn 90° to avoid the tree, I assume its speed is low enough that the airbags would protect the passengers if it just braked and then crashed into the tree.

Look at that poo poo from 6:40
https://www.youtube.com/watch?v=YXylqtEQ0tk

Seriously, what do people think the car is "seeing"? There's a bit more technology than a Wii-sensor mounted to the front bumper.

RabbitWizard has a new favorite as of 07:32 on Oct 25, 2015

Tiggum
Oct 24, 2007

Your life and your quest end here.


Bogan Krkic posted:

Sure, but the moral dilemma lies in the fact that people want to create self-driving cars and give them the agency to decide who lives and dies, which seems like a bad plan

You can't give a self-driving car agency to decide anything; it's a machine, and it will do what it's programmed to do, which is come to a stop as rapidly as it can in whatever conditions it finds itself. It can't choose to drive onto the footpath because it can't choose anything. If something appears in front of a self-driving car it will attempt to stop to avoid or minimise damage and injury, because statistically that is the safest course of action.
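The "it's a machine, it does what it's programmed to do" point can be sketched as a toy rule table. This is purely illustrative, not any real vendor's planner; the function name and the two inputs are my invention.

```python
# Toy illustration of deterministic emergency behavior: no "agency",
# just a fixed priority list with hard braking in-lane as the default.
# Hypothetical names; no resemblance to a production planner.

def emergency_action(obstacle_ahead: bool, adjacent_lane_clear: bool) -> str:
    """Return the planned maneuver when an obstacle is suddenly detected."""
    if not obstacle_ahead:
        return "continue"
    if adjacent_lane_clear:
        # Swerve only when the escape path is verified empty by sensors.
        return "brake_and_change_lane"
    # Default: shed speed in the current lane; statistically safest.
    return "brake_in_lane"

print(emergency_action(True, False))  # → brake_in_lane
```

The design point is that the behavior is a lookup over sensed conditions, not a moral judgment made at runtime.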

Frostwerks
Sep 24, 2007

by Lowtax

Bogan Krkic posted:

But in order to have the car do the thing that is statistically most likely to avoid deaths/injuries, you do indeed need to program it to choose whose life is more important in a situation where, through error, external circumstances or other reasons, it could cause death or injury.

Say for example, a self-driving car is moving at pace on a country road, and suddenly, in a freak piece of poor timing, a tree falls directly in front of the car. The car can swerve to avoid the tree, saving the lives of the 4 passengers, but the only direction it can go to avoid the tree is onto the sidewalk, striking a single pedestrian. Is that not a situation where the car needs to determine whose life is more important?

lol @ our country roads having sidewalks. Most of our city roads don't have them.

blugu64
Jul 17, 2006

Do you realize that fluoridation is the most monstrously conceived and dangerous communist plot we have ever had to face?

Felix_Cat posted:

Even in the edge cases like the one you describe

Edge cases like accidents.


Theris posted:

Edit: vvvv This is worth pointing out. As silly as "just brake, problem solved" seems on the surface, you need an obstacle to instantly appear less than 60m or so in front of a car moving 60mph before "just brake" can't avoid a collision completely

Like when you're a few car lengths behind and suddenly the car in front of you swerves to avoid an object/debris on the road that you don't know about? Have you ever driven on an interstate before? Because it sounds like you haven't.

blugu64 has a new favorite as of 16:58 on Oct 25, 2015

Karma Monkey
Sep 6, 2005

I MAKE BAD POSTING DECISIONS
You all seem to be missing the bigger question here, which is, if I have a self-driving car, will my insurance rates be higher or lower?

Platystemon
Feb 13, 2012

BREADS

Karma Monkey posted:

You all seem to be missing the bigger question here, which is, if I have a self-driving car, will my insurance rates be higher or lower?

It depends. Are you a young male?

hackbunny
Jul 22, 2007

I haven't been on SA for years but the person who gave me my previous av as a joke felt guilty for doing so and decided to get me a non-shitty av

CJacobs posted:

Orgy is the name of a band, one-time sorta-star Corey Feldman is not actually having an orgy. The joke is that people are pretending to wish he actually was even though that's a thing no human being would ever actually want because corey feldman is grody

It's that there literally was an article by a woman who went to an orgy with Corey Feldman because she was that big a Corey Feldman fan

AlphaKretin
Dec 25, 2014

A vase to face encounter.

...Vase to meet you?

...

GARVASE DAY!

blugu64 posted:

Edge cases like accidents.

If everyone's using self-driving cars and they work as well as they'll have to before the public accepts them, accidents will be edge cases caused only by freaks of nature.

Say Nothing
Mar 5, 2013

by FactsAreUseless


Knormal
Nov 11, 2001

Has Bishopville's 'lizard man' returned? Looks like it.



Gators gonna gait.
