• Welcome to Religious Forums, a friendly forum to discuss all religions in a friendly surrounding.


Oh dear, my AI isn't working!

Mock Turtle

Oh my, did I say that!
Premium Member
Remember the Uber self-driving car that killed a woman crossing the street? The AI had no clue about jaywalkers

The self-driving Uber car that hit and killed a woman walking her bike across a street wasn’t designed to detect “jaywalking pedestrians.” That's according to an official dossier published by the US National Transportation Safety Board (NTSB) on Tuesday. The March 2018 accident was the first recorded death by a fully autonomous vehicle. On-board video footage showed the victim, 49-year-old Elaine Herzberg, pushing her bike at night across a road in Tempe, Arizona, moments before she was struck by the AI-powered SUV at 39 mph. Now, an investigation by the NTSB into the crash has pinpointed a likely major contributing factor: the code couldn't recognize her as a pedestrian, because she was not at an obvious designated crossing. Rather than correctly anticipating her movements as a person moving across the road, it ended up running right into her.

I can think of another scenario that might confuse any AI - a car-door opening (whether stationary or otherwise) - how about you? There must be quite a few out there that will likely confuse any AI and cause an accident or worse.
 

tayla

My dog's name is Tayla
I can think of another scenario that might confuse any AI - a car-door opening (whether stationary or otherwise) - how about you? There must be quite a few out there that will likely confuse any AI and cause an accident or worse.
The solution is easy. Program the cars to not hit anybody. (Why has no one thought of this?)
 

Stevicus

Veteran Member
Staff member
Premium Member
Remember the Uber self-driving car that killed a woman crossing the street? The AI had no clue about jaywalkers


I can think of another scenario that might confuse any AI - a car-door opening (whether stationary or otherwise) - how about you? There must be quite a few out there that will likely confuse any AI and cause an accident or worse.

I remember that. I suppose there are plenty of situations which a computerized AI-controlled car would be unable to anticipate. Even humans have difficulty anticipating the actions of other humans, which is part of the reason why there are so many traffic accidents.

Like, if a driver sees a ball bouncing into the street, they might be prompted to slow down, anticipating that a child could come darting out after it. Could a computerized AI driver be programmed to do the same?

I've encountered a few situations where someone was coming down the wrong way of the street, and I had to react quickly. A lot of people do some really dumb and reckless things, and a large part of defensive driving involves mentally preparing oneself and anticipating other people screwing up on the road. Not just other drivers, but pedestrians and bicyclists, too.

A computerized driver might drive very logically, and when encountering human drivers who don't drive that way, the computer might respond "Illogical! Illogical! Please explain. You are human. Only humans can explain their behavior." Their system would overload and shut down. AI's answer to road rage.
 

Mock Turtle

Oh my, did I say that!
Premium Member
The solution is easy. Program the cars to not hit anybody. (Why has no one thought of this?)

Like incorporating heat-sensors to detect these idiots crossing the road (normal for this species apparently and even legal) - woe betide any who wear very insulating clothing - all their fault! :oops:
 

Revoltingest

Pragmatic Libertarian
Premium Member
The solution is easy. Program the cars to not hit anybody. (Why has no one thought of this?)
The difficulty is in recognizing people.
They're surprisingly difficult to see, what with their different appearances &
postures...things they carry, push or pull...blending in with their background...
...lighting oddities. Even when seen, recognition can be momentarily lost
due to changes in the above. I speculate that improvements in these areas
must be augmented with predictions of human behavior & location, thus
bridging gaps in detection.
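That idea of bridging detection gaps with behavior prediction can be sketched in a few lines (a hypothetical illustration, not any vendor's actual tracker; a real system would use something like a Kalman filter with proper uncertainty handling). The point is simply to coast the last confirmed track forward so a pedestrian stays "alive" between frames:

```python
from dataclasses import dataclass

@dataclass
class Track:
    """Last confirmed position and velocity of a detected object (metres, m/s)."""
    x: float
    y: float
    vx: float
    vy: float

def predict(track: Track, dt: float) -> Track:
    """Coast the track forward when the sensors momentarily lose the object.
    Assumes roughly constant velocity over the gap -- a crude stand-in for a
    proper motion model, but enough to keep predicting a pedestrian's path."""
    return Track(track.x + track.vx * dt,
                 track.y + track.vy * dt,
                 track.vx, track.vy)

# A pedestrian last seen 10 m ahead, crossing at 1.5 m/s; where will she be
# half a second later if detection drops out?
ped = Track(x=10.0, y=-3.0, vx=0.0, vy=1.5)
print(predict(ped, 0.5))  # Track(x=10.0, y=-2.25, vx=0.0, vy=1.5)
```

The NTSB report noted the Uber system kept reclassifying Herzberg and discarding her track history, which is exactly the failure this kind of persistence is meant to prevent.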
 

icehorse

......unaffiliated...... anti-dogmatist
Premium Member
Then there is the trolley problem: if you have to hit someone, who do you choose? (I think in this sort of situation, hit the Patriots fan.)
 

bobhikes

Nondetermined
Premium Member
Remember the Uber self-driving car that killed a woman crossing the street? The AI had no clue about jaywalkers


I can think of another scenario that might confuse any AI - a car-door opening (whether stationary or otherwise) - how about you? There must be quite a few out there that will likely confuse any AI and cause an accident or worse.

For me that's a crock of @$%#. So they're saying that if a dog, deer, or bear wandered in front of your car while driving, the car would just hit the animal? Where I live in New Jersey, even in the cities, usually by parks, deer jump into the road. They never took that into consideration when designing this system. And if they did design the car for that, how is a human jaywalking different? Kids playing and running into the street? The car will just hit them. If the system is as they say it is, then they should be shut down for creating a harmful, inferior product. They're trying to shirk responsibility by blaming her for breaking the law. Hopefully they get slapped with a major fine and lose a major suit to this lady's family.
 

Mock Turtle

Oh my, did I say that!
Premium Member
For me that's a crock of @$%#. So they're saying that if a dog, deer, or bear wandered in front of your car while driving, the car would just hit the animal? Where I live in New Jersey, even in the cities, usually by parks, deer jump into the road. They never took that into consideration when designing this system. And if they did design the car for that, how is a human jaywalking different? Kids playing and running into the street? The car will just hit them. If the system is as they say it is, then they should be shut down for creating a harmful, inferior product. They're trying to shirk responsibility by blaming her for breaking the law. Hopefully they get slapped with a major fine and lose a major suit to this lady's family.

Well, the human is supposed to be in control, so perhaps they see that as their get-out clause. But as you say, the AI should be a lot better at diagnosing all the various scenarios that might occur. It makes you wonder what authority passes such AI systems as suitable for use in automobiles when there must still be plenty of situations where problems will occur. And have, besides this one.
 

A Vestigial Mote

Well-Known Member
Since hearing about it, I've never felt self-driving cars to be a good idea. Too much at stake to leave to a computerized brain. And by that I mean too much at stake to leave in the hands of a programmer who is simply not going to anticipate all contingencies.

The human brain is built to "work on the fly." Contemporary computing simply isn't. We're still mostly on the "If A then B" side of things. AI is barely more than a way to branch into the correct "If A then B" scenario for a particular situation more quickly using complex decision trees. But rigorous "rules" the machine must follow will always (and I mean ALWAYS) come up against some scenario within which the rules simply don't work. Human minds can work with this ambiguity and oftentimes flourish. Computerized "minds" - not so much.
 

Ouroboros

Coincidentia oppositorum
The solution is easy. Program the cars to not hit anybody. (Why has no one thought of this?)
First step, program a way for the computer to recognize an "anybody".

The problem isn't so much the rules as it is correctly categorizing the things the computer "sees". Is it a box? Is it a bag? Is it a bike? Is it a truck? Is it a traffic light? Is it a poster with a person on it, or an actual person standing on the sideline? There is a lot for these AIs to consider.

Personally, I do think the technology is being pushed too early. The first step should be to create more tools in the car to help the driver. Start with making the car give you information about surrounding things. For instance, our car tells us what the speed limit is by actually reading the traffic sign. Why not let it read all other signs, traffic lights, stop and x-ing markings on the road, and so on, and give that info in real time on the console? Then add warnings about someone standing behind a parked car, or traffic 10 cars ahead, or cars that might run a red light, and ... put the tools in the hands of the "car operator". And when the time is right in the future, the car could take over some decisions.

Also, another problem with the self-driving cars is the exposure to hacking. We'll most likely see "car viruses" in the near future, causing accidents and pileups.
 

beenherebeforeagain

Rogue Animist
Premium Member
Remember the Uber self-driving car that killed a woman crossing the street? The AI had no clue about jaywalkers


I can think of another scenario that might confuse any AI - a car-door opening (whether stationary or otherwise) - how about you? There must be quite a few out there that will likely confuse any AI and cause an accident or worse.
It amazes me that there was no "unknown object on collision course--slow down/change lanes to avoid." Why in the hell would it have to recognize that something crossing the road was a human being before it would react to it? Poor programming of the AI...and frankly little different than an inebriated, distracted, or otherwise inattentive human at the wheel...
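A classification-agnostic fail-safe like that can be sketched in a few lines (a hypothetical illustration; the deceleration and safety-margin figures are assumptions, not anything from Uber's actual stack):

```python
def should_brake(distance_m: float, closing_speed_mps: float,
                 max_decel_mps2: float = 7.0, margin_m: float = 5.0) -> bool:
    """Brake for ANY object on a collision course, classified or not.

    Compares the gap to the distance needed to stop (v^2 / 2a, plus a
    safety margin). Classification can refine the response later; it
    should never gate whether the vehicle reacts at all."""
    if closing_speed_mps <= 0:
        return False  # object is not getting closer
    stopping_distance = closing_speed_mps ** 2 / (2 * max_decel_mps2)
    return distance_m <= stopping_distance + margin_m

# An unidentified object 20 m ahead, closing at 17.4 m/s (~39 mph):
print(should_brake(20.0, 17.4))  # True -- stop first, classify later
```

Nothing in that rule needs to know whether the object is a pedestrian, a deer, or a shopping cart; it only needs a distance and a closing speed.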

So, AI is an improvement?
 

Mock Turtle

Oh my, did I say that!
Premium Member
It amazes me that there was no "unknown object on collision course--slow down/change lanes to avoid." Why in the hell would it have to recognize that something crossing the road was a human being before it would react to it? Poor programming of the AI...and frankly little different than an inebriated, distracted, or otherwise inattentive human at the wheel...

So, AI is an improvement?

It detected something five seconds before hitting her, so there was enough time to have at least issued a warning. Perhaps we need harsher laws to make people more attentive when using such cars.
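The rough arithmetic supports that point. At the reported 39 mph, and assuming a typical dry-road deceleration of about 7 m/s² (an assumption for illustration, not a figure from the NTSB report), five seconds of warning is far more than a full stop requires:

```python
mph = 39
v = mph * 0.44704          # vehicle speed in m/s (~17.4)
t_detect = 5.0             # seconds between first detection and impact
a = 7.0                    # assumed dry-road deceleration, m/s^2

distance_available = v * t_detect      # ground covered in the detection window
stopping_distance = v ** 2 / (2 * a)   # distance needed to brake to a stop
time_to_stop = v / a

print(f"available: {distance_available:.0f} m, "
      f"needed: {stopping_distance:.0f} m, "
      f"braking time: {time_to_stop:.1f} s")
# available: 87 m, needed: 22 m, braking time: 2.5 s
```

In other words, the car could have stopped roughly twice over in the time and distance available, even before counting a warning to the backup driver.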
 

Brickjectivity

wind and rain touch not this brain
Staff member
Premium Member
It sounds like AI is not ready yet for safe human-AI robot interaction. If you have to write a program for every conceivable situation and still can't get the AI to see people... it's too soon. :( Blind people will have to keep waiting for a better class of AI driver.
 

suncowiam

Well-Known Member
The real question is not about perfection but whether a robot can do it better than a human being.

For that scenario concerning the jaywalker, if you view the front camera footage, I'm pretty sure most will agree that humans in the same situation would have hit that jaywalker.

View and determine yourself:

Although, to my understanding, that car was equipped with some form of radar that should have detected the jaywalker even if visibility was poor.

Again, AI cars are not supposed to be perfect. They are supposed to reduce the number of accidents, which, IMO, is definitely achievable.
 

Mock Turtle

Oh my, did I say that!
Premium Member
For that scenario concerning the jaywalker, if you view the front camera footage, I'm pretty sure most will agree that humans in the same situation would have hit that jaywalker.

I'm not sure one can judge by that video. At 44 mph I think someone might have been able to brake and even avoid a collision. It's a bit difficult to determine, since light levels might not be as they appear.
 

Revoltingest

Pragmatic Libertarian
Premium Member
Since hearing about it, I've never felt self-driving cars to be a good idea.
I've long thought how dangerous it is to depend upon humans' intermittent
attention, variable judgement, & even wicked intent. (I was once run down
on purpose.) AI offers potential to be safer than humans.
Are we there yet? I don't think so.
Consider a breaking news item....
Uber self-driving car involved in fatal crash couldn't detect jaywalkers
The software wasn't designed to accommodate jaywalkers.
Really? They're all over the place.
There's a lot of work ahead for autonomous vehicle designers, eh.

I once heard that driving is the most complex activity
that most people ever do. Makes sense to me.
 

Stevicus

Veteran Member
Staff member
Premium Member
For that scenario concerning the jaywalker, if you view the front camera footage, I'm pretty sure most will agree that humans in the same situation would have hit that jaywalker.

Difficult to say, since the lighting makes it seem as if the pedestrian walking a bicycle just "appeared" out of nowhere. Human eyes might have been able to detect movement up ahead even if it can't be picked up on camera. That's another thing to consider: when it's dark, human drivers might tend to be (or at least are supposed to be) more careful and prudent about possible hazards due to poor visibility at night.

In the video, the human backup driver was clearly not paying attention.
 

beenherebeforeagain

Rogue Animist
Premium Member
Difficult to say, since the lighting makes it seem as if the pedestrian walking a bicycle just "appeared" out of nowhere. Human eyes might have been able to detect movement up ahead even if it can't be picked up on camera. That's another thing to consider: when it's dark, human drivers might tend to be (or at least are supposed to be) more careful and prudent about possible hazards due to poor visibility at night.

In the video, the human backup driver was clearly not paying attention.
I guess I was mistaken...I thought the AI vehicles were supposed to come with additional sensors to let them see potential hazards sooner than humans might under normal conditions...along with software able to use such information...but if it can't even see something that's there...
 