Iben blogs about computer graphics customization, recording music with a computer and more.
You've been warned
Published on April 9, 2019 By Iben In Everything Else

Will a self-driving car be able to see in dense fog in the daytime?
I use fog lights and slow down.
What happens when the detector can't see but the brain box thinks it still can?
Human drivers know by instinct what other human drivers might do in many
driving situations. Human drivers can't predict what self-driving cars will do,
adding another layer of uncertainty and possible chaos to a human's drive.
How many miles have the people driven who think self-driving cars are a good idea?
When given a split-second choice, will the self-driving car hit a cow or the bus?
I'll take the cow choice and aim for anything but dead center.
Will a self-driving car even know one is a bus and one is a cow, and take into
account the speed, direction, weight and possible passengers of each choice?
Cruise control for the gas is all I would use and have used; I would never feel safe in a self-driving car.
They are crazy.


Comments (Page 2)
on Jul 16, 2019

Sharon36743

Even if I had a car with self-driving capabilities, I'd turn it off.

Exactly!  Just because we can [have self-driving cars] doesn't mean we should.

There may be places where self-driving cars could possibly be viable [freeways, highways, etc] but as Dave Bax pointed out, these are not always accessible to the average/regular motorist who is just leaving home to visit friends or relatives, to go to work or go shopping.  In these instances the self-driving car is, and will be, a liability.

In this current age there are no real situations in which a self-driving car can or will be safe.... period.

In 200-300 years, maybe, but right now it's ludicrous to even think self-driving cars are a viable proposition.

on Jul 16, 2019

Show me where there is a substitute for intuition and I'll show you a self-driving person. 

on Jul 17, 2019

I don't know, you guys are very pessimistic. I guess for the first few years there will be bugs that need to be worked out. Let's compare a human's reflexes to a machine's. A computer can compute in nanoseconds, where a person would need at least 0.7 seconds to identify the problem, then at least 0.7 seconds to decide, then at least 0.7 seconds to react. That's something a computer can do in 3 nanoseconds. 
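A rough back-of-the-envelope sketch of what that gap means on the road, taking the 0.7-second figures above at face value and assuming a speed of 100 km/h (my number, purely for illustration):

# Distance covered while the driver is still reacting, using the comment's
# 0.7 s figures and an assumed 100 km/h. All numbers are illustrative.
speed_mps = 100 / 3.6                # 100 km/h is roughly 27.8 m/s
human_reaction_s = 0.7 * 3           # identify + decide + react = 2.1 s
machine_reaction_s = 3e-9            # the 3-nanosecond figure above

print(f"human:   {speed_mps * human_reaction_s:.1f} m before acting")    # ~58.3 m
print(f"machine: {speed_mps * machine_reaction_s:.1e} m before acting")  # ~8e-8 m
# In practice a real car's sensor-to-actuator latency is measured in
# milliseconds rather than nanoseconds, but the gap to 2.1 s is still enormous.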

Seeing through fog is a problem for a person, whereas a computer doesn't need eyes. Maybe it uses eyes, maybe it doesn't. Automobiles could give off signals. Technically everyone could give off signals. Even animals could get marked. 
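For what it's worth, the "give off signals" idea exists in the real world as vehicle-to-everything (V2X) communication. Below is a toy sketch of the concept, a vehicle broadcasting its position so others can "see" it without cameras; the message format and port number are invented for this example and are not any real V2X standard:

# Hypothetical position beacon: broadcast id/position/speed as JSON over UDP.
import json
import socket
import time

BEACON_PORT = 47808  # made-up port, not a real standard

def broadcast_beacon(vehicle_id, lat, lon, speed_mps):
    """Send one position beacon to the local network's broadcast address."""
    msg = json.dumps({
        "id": vehicle_id,
        "lat": lat,
        "lon": lon,
        "speed_mps": speed_mps,
        "ts": time.time(),
    }).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(msg, ("255.255.255.255", BEACON_PORT))

broadcast_beacon("bus-42", 40.7128, -74.0060, 13.4)

Real deployments use dedicated short-range radio rather than ordinary network broadcast, but the principle is the same: position comes from a message, not from a camera that fog can blind.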

Now if a car can't tell a bus is more important than a cow, then it's more of a program fault. The computer should always choose people.

What about the person who can't see to drive?

on Jul 17, 2019

admiralWillyWilber

What about the person who can't see to drive?

Then that person should never be a motorist.  Period!

I no longer drive because health issues dictate that I am not safe behind the wheel, and I choose not to drive for the greater good, the other motorists and pedestrians using the roads, etc.  I therefore suggest most vehemently that incapacitated persons do not need self-driving cars but a taxi.... somebody capable of driving for them.

on Jul 17, 2019

Providing of course the driver has a fair knowledge of HOW TO drive, then I'm all for it, but......reading about some instances where the driver and passengers were at odds with one another......I'll take my bike. 

on Jul 17, 2019

I've met some of the world's most capable drivers.  The holders of a 'Super Licence' - that means Formula One [in spite of what the locals might say...it IS the pinnacle of Motorsport] - and yet I can say quite unequivocally that the current World Champion [and thus, in theory, the best driver on the planet] is an utter dickhead.  Doing burnouts in a rental car outside the circuit on a public city street?....there are eff-wit HOONS that do that.

Provided Hamilton doesn't program the self-drive cars, there's a fair-to-middling chance they will end up superior/safer than him....

Sadly, as an FIA Observer I'm required to be impartial....but I do tend to smile a wee bit when he sticks it into a barrier...

on Aug 03, 2019


At least self-driving cars don't read mobile/cell phone text messages while driving....

I am in total agreement there. I commute on the interstate back and forth to work every day. At least a third of the drivers are distracted by something. I have had a car totaled because of a distracted driver. She totaled 2 cars but got to drive hers home. That's not right.

on Aug 03, 2019

At the end of the day, we have to look at the many ways the car has to get information. Visually there are simple cameras, multi-spectrum cameras, OCR technologies, etc. to help "see" what is around them. With tech like that, something like fog is irrelevant. When you add an advanced AI into the mix, you have the ability to make a car far safer than if a human were behind the wheel. The problem comes when the AI is required to make a decision that has potential impacts on the vehicles around it. It's the classic trolley problem. How do you teach an AI to rationalize the potential loss of life?
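To make the trolley-problem point concrete, here is a toy sketch of what "rationalizing the potential loss of life" could look like as expected-harm minimization; the maneuvers, probabilities, and harm scores are all invented for illustration, and no production system is this simple:

# Pick the maneuver with the lowest probability-weighted harm.
def expected_harm(outcomes):
    """outcomes: list of (probability, harm_score) pairs for one maneuver."""
    return sum(p * harm for p, harm in outcomes)

maneuvers = {
    "brake_straight": [(0.7, 0.0), (0.3, 5.0)],   # likely stops; may still hit
    "swerve_left":    [(0.9, 1.0), (0.1, 8.0)],   # minor damage vs. oncoming lane
    "swerve_right":   [(0.95, 2.0), (0.05, 3.0)], # ditch; low risk to others
}

best = min(maneuvers, key=lambda m: expected_harm(maneuvers[m]))
print(best, expected_harm(maneuvers[best]))  # brake_straight 1.5

The arithmetic is trivial; the ethics live entirely in the harm scores, i.e. in deciding what numbers to assign to a bus full of people versus a cow. That is exactly the question nobody has answered.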

Autonomous vehicles are something we won't escape, and in fact, the tech is growing rapidly. 

on Aug 04, 2019

That there are a few cases of bad accidents with self-driving cars is no proof of them being unsafe. In fact, when you count the number of cases, it's more of a proof of them being much safer than human drivers! And that's before figuring out whether the self-driving car is actually to blame in the reported case - most reports I've read were just about SDCs being involved, not about them causing the accident.

Mind you, I agree that there are still many situations human drivers can handle better than SDCs. Or I should say, could handle better, if they paid attention and were experienced enough to react in time and in an appropriate manner. SDCs come to the rescue when the human driver doesn't pay attention or is too inexperienced to react properly and in time.

Unfortunately, when you ask human drivers, they'll only ever consider how good they can be at the wheel under optimal conditions, and therefore won't choose to let the car drive in their stead. They won't consider the off-chance of being inattentive at the wrong time, or of not being able to handle a situation they've never experienced before. It's a psychological problem, not a technical one.

on Aug 04, 2019

MindlessMe

At the end of the day, we have to look at the many ways the car has to get information. Visually there are simple cameras, multi-spectrum cameras, OCR technologies, etc. to help "see" what is around them. With tech like that, something like fog is irrelevant. When you add an advanced AI into the mix, you have the ability to make a car far safer than if a human were behind the wheel. The problem comes when the AI is required to make a decision that has potential impacts on the vehicles around it. It's the classic trolley problem. How do you teach an AI to rationalize the potential loss of life?

Autonomous vehicles are something we won't escape, and in fact, the tech is growing rapidly. 
This is not a problem at all, because if you ask three humans, you'll get four opinions on how to handle such a situation! No matter how an autonomous system answers this question, the answer will always be unsatisfactory to someone, because we humans cannot answer the question ourselves!

Anyway, it's a constructed situation that will likely never arise for anyone in our lifetime; so even if an autonomous system won't react in the way we'd like it to - whatever that is - the sheer number of avoided accidents, injuries, and deaths in other situations will easily outweigh it. Besides, only an autonomous system would even be fast enough to estimate the probable outcome: a human will almost always have to make an uninformed decision within a fraction of a second. That's why nobody thinks of blaming a human who made a wrong decision. Why blame an autonomous system that makes a different decision based on facts and calculations? Why should we even want it not to do that?

Or, to put it another way: the chance of dying in such a situation because an autonomous system decided to sacrifice you is much lower than the chance of dying from pretty much anything else we decide on every day, e.g. taking a plane that might crash, or simply crossing a street at the wrong moment. Nobody will say afterwards that it was your fault because you made the 'wrong' decision.

You can't blame the autonomous system for making a decision, as long as it was a reasonable one at the time it was made. If everyone used one, I'd feel much safer on the streets. The off-chance of my autonomous system killing me to save other people's lives is acceptable under the premise that the autonomous systems in other cars will do the same to protect me.

on Aug 04, 2019

starkers

Besides, how lazy is the human race getting? I mean, why can't people learn to drive, be tested and licensed to drive properly, or is that too much effort and concentration?

Nope, self-driving cars are like sleeping with a cow pat under your pillow. You're better off not doing it.
If I had to decide between a human who doesn't understand the difference between laziness and safety, or who makes inadequate analogies, and an autonomous system that can make educated decisions before a human driver's receptors even recognize there is a dangerous situation, I'd take the latter.
