Driverless Cars

   / Driverless Cars #581  
How many people with autopilot have used it in a snow storm? Heavy downpour? Freezing ice? Seems to me that you can't compare autopilot vs. manual-driving statistics without knowing the details. I'm sure autopilot would do well against distracted driving, but there are lots of cases where the autopilot makes stupid mistakes too. The problem is that we are trying to make new technology work with something that was never designed for this technology.
 
   / Driverless Cars #582  
   / Driverless Cars #583  
How many people with autopilot have used it in a snow storm? Heavy downpour? Freezing ice? Seems to me that you can't compare autopilot vs. manual-driving statistics without knowing the details. I'm sure autopilot would do well against distracted driving, but there are lots of cases where the autopilot makes stupid mistakes too. The problem is that we are trying to make new technology work with something that was never designed for this technology.
AI thinks faster than you and I. Have you watched Teslas drifting on paved tracks or on snow-covered roads?
 
   / Driverless Cars #584  
That statement also applies to humans putting things into code..... and it's usually compounded by the fact that the coder (particularly at larger corporations) generally has little-to-no knowledge pertinent to what they're coding (assuming they even know/understand what it is they are coding). Though even when they do, they tend to think about it only from their own life experiences, which can be very limited (e.g. a native Floridian or southern Californian who's never left isn't going to know much about driving in snow or on ice).

Add in how so many humans believe their own life experiences are somehow universal or absolute, and it makes getting poorly designed code/systems corrected an absolute pain ....well, until either a sufficient amount of money has been lost to the ignorance/negligence - or the loss of life has been enough that the liable entity/company doesn't want the negative attention anymore.

Automated driving may eventually become more widespread, but any automated system will be limited to executing the instructions/code its designers implemented. ...and those designers are far from perfect (hence the concept/existence of "recalls").

Really, until a lot of humans choose to improve their own behavior, creating more tools/automation is just pushing the same old "imperfect humans" problem around.....

- speaking as an engineer whose career is focused on finding (& preferably preventing) the screw-ups of other engineers/scientists/coders (it can be rather eye-opening how often designers will ignore the lessons of the past ....with some choosing to do so even after their noses have been rubbed in them)
Yes, no and maybe (or undetermined).
Once coded, "it is what it is" and can be fixed/patched, etc. if/when bugs arise - once fixed ...well, I know too much about regression failures in code, so let's leave that aside for now.

The problem with humans is that every one of them learns from their own subset of all possible mistakes - to coin a phrase, it is an error-based learning system.

While not a "GOOD" driver, I rate myself as "fair", with several decades of experience in the USA, Western Europe, Ireland, England ...and a couple of vacation spots where I drove less than 100 miles.
Anyway, the risk I pose isn't how WELL I can drive, it is how BADLY!
Machines are more consistent - which makes faults identifiable and fixable.

Easy peasy - and as I have said before, 3 or 4 years ago Teslas drove better than I did then; they are improving while I am declining.
 
   / Driverless Cars
  • Thread Starter
#585  
Automation..... given how well "regulated" Boeing has been in recent years, I don't expect govts to be very effective regulating the performance of driver-less cars.

I do view this tech as inevitable on the roads, as the reality is the tech does not have to be perfect, Just Better Than.... What's out there now.....

But..... the acid test for me personally is If/When the vehicle manufacturers start accepting legal liability for autonomous accidents.

I own My Mistakes..... not big on owning somebody else's though......

Rgds, D.
 
   / Driverless Cars #586  
Automation..... given how well "regulated" Boeing has been in recent years, I don't expect govts to be very effective regulating the performance of driver-less cars.

I do view this tech as inevitable on the roads, as the reality is the tech does not have to be perfect, Just Better Than.... What's out there now.....

But..... the acid test for me personally is If/When the vehicle manufacturers start accepting legal liability for autonomous accidents.

I own My Mistakes..... not big on owning somebody else's though......

Rgds, D.
Elon Musk has stated AI makers will be legally responsible for their mistakes in the creation/implementation of AI.

My interest in Full Self Driving is to reduce my risk of death by vehicle accident and to not injure or kill others.

By 2025 I expect the new German/Texas Model Y to have been debugged and FSD functional. I never envision owning a car without a steering wheel, but having one that can safely drive me to an ER if I became unconscious would be a plus.
 
   / Driverless Cars #587  
AI thinks faster than you and I. Have you watched Teslas drifting on paved tracks or on snow-covered roads?
AI can only think about what it's programmed to think about and what its sensors tell it. Snow is just one condition, but a very difficult one, that AI will struggle with. I'm all for self driving. When the roads are designed to work with it (sensors built into the pavement, for example, cars talking to each other, etc.) it'll be nice to take a trip where you don't have to worry about the task of driving. We're not there yet, and I don't think we are as close as some want us to be.
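One common pattern for the "AI only knows what its sensors tell it" problem - sketched below with purely hypothetical names and thresholds - is a confidence gate: the automation stays engaged only while every required sensor input is trustworthy, and disengages rather than guessing when, say, the lane markings disappear under snow.

```python
# Hypothetical sketch of a sensor-confidence gate: autonomy is permitted
# only while every required input is confident enough to trust.
# The threshold and sensor names are illustrative, not any real system's.

MIN_CONFIDENCE = 0.8  # assumed cutoff, purely for illustration

def autonomy_allowed(sensor_confidence):
    """sensor_confidence: dict of sensor name -> confidence in [0, 1].

    Returns True only if every sensor clears the minimum threshold.
    """
    return all(c >= MIN_CONFIDENCE for c in sensor_confidence.values())

clear_day = {"camera_lanes": 0.97, "radar": 0.95, "gps": 0.90}
snow_storm = {"camera_lanes": 0.35, "radar": 0.88, "gps": 0.90}

print(autonomy_allowed(clear_day))   # lane detection is confident: engage
print(autonomy_allowed(snow_storm))  # lanes buried in snow: hand back control
```

The design choice worth noticing is that the gate is conjunctive: one degraded sensor is enough to disengage, which is why conditions like snow - which blinds cameras specifically - are so limiting even when the other sensors are fine.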
 
   / Driverless Cars
  • Thread Starter
#588  
Elon Musk has stated AI makers will be legally responsible for their mistakes in the creation/implementation of AI.
At least privately, even Elon may admit that getting the legal system to move at something more than its typical glacial speed makes the Mission to Mars look easy.....

Logic (digital or otherwise) means one thing to the Engineering community. On the legal side...... I found this not-old article amusing...

"Product Liability: This pertains to the liability of smart car manufacturers in terms of strict liability (despite the >>> driver <<< of the >>> driverless <<< car taking all care possible to prevent such an accident) and negligence such as design defects and product defects."


What Is a Self-Driving Car Liability?

Establishing those precedents will likely happen.... not sure it will be during my lifetime.....

There are many reasons Level5+ is attractive..... medical events where the lone-occupant is incapacitated being one.

Programming a rules-based complex system is non-trivial.

If a level5 is trying to get an incapacitated person to the ER doorway, and encounters a blocked road just before the entrance, will it choose to drive over a sidewalk to get the person there?

Rules are (usually) there for reasons..... but knowing when to bend or break them is a higher-level reasoning challenge....
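The sidewalk dilemma can be sketched as a toy rules engine - every name and value below is hypothetical, purely to show why the exceptions are the hard part: each rule is a hard constraint until a narrowly scoped exception fires, and every exception needs its own carefully justified preconditions.

```python
# Toy sketch of rule-bending in a rules-based planner (hypothetical names).
# A rule ("never use the sidewalk") is absolute until an explicit,
# narrowly scoped exception (a medical emergency) overrides it.

def choose_path(paths, emergency=False):
    """Pick the lowest-cost path that satisfies the active rules.

    paths: list of dicts like {"route": str, "uses_sidewalk": bool,
                               "blocked": bool, "cost": float}
    """
    def allowed(p):
        if p["blocked"]:
            return False
        if p["uses_sidewalk"]:
            # Baseline rule forbids sidewalks... unless the exception fires.
            return emergency
        return True

    candidates = [p for p in paths if allowed(p)]
    if not candidates:
        return None  # no legal path: stop and call for help
    return min(candidates, key=lambda p: p["cost"])

paths = [
    {"route": "main road", "uses_sidewalk": False, "blocked": True, "cost": 1.0},
    {"route": "sidewalk", "uses_sidewalk": True, "blocked": False, "cost": 2.0},
]

print(choose_path(paths))                  # normal mode: no legal path
print(choose_path(paths, emergency=True))  # emergency: sidewalk becomes legal
```

The hard engineering question isn't the five-line override itself - it's deciding, and defending in court, exactly which preconditions are allowed to flip `emergency` to True.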

Rgds, D.
 
   / Driverless Cars #589  
AI can only think about what it's programmed to think about and what its sensors tell it. Snow is just one condition, but a very difficult one, that AI will struggle with. I'm all for self driving. When the roads are designed to work with it (sensors built into the pavement, for example, cars talking to each other, etc.) it'll be nice to take a trip where you don't have to worry about the task of driving. We're not there yet, and I don't think we are as close as some want us to be.

There are similar systems already in place for aviation - but there are still considerable constraints on automated/autonomous systems for several reasons. One is the non-transmitting (aka "uncooperative") objects flying through the air, which include everything from general aviation aircraft to hang gliders, paragliders, and hot air balloons - not all of the aircraft involved are yet fully equipped to transmit their own location or flight vectors, and unmanned aircraft aren't yet capable of detecting those objects. Then of course there are also birds, which can really mess up an aircraft when they are struck.....

All of which is simpler to deal with than ground traffic, due to the greatly reduced numbers and challenges involved. Also, considering the wildlife that could be hit in the air generally does far less damage to aircraft than the animals on the ground can do to cars & trucks, the consequences can be less severe. That of course assumes the life form isn't intentionally trying to create problems - which isn't universally true, as in certain areas individuals will intentionally step in front of oncoming vehicles (for various reasons that may not include attempted suicide). Unfortunately, it's not hard to imagine how the rules of a self-driving car would/could be exploited by those engaged in criminal activities, since it'd be a lot safer to try corralling a self-driving vehicle (particularly one that refuses to drive over/through people) than one driven by a human.

Establishing those precedents will likely happen.... not sure it will be during my lifetime.....

There are many reasons Level5+ is attractive..... medical events where the lone-occupant is incapacitated being one.

Programming a rules-based complex system is non-trivial.

If a level5 is trying to get an incapacitated person to the ER doorway, and encounters a blocked road just before the entrance, will it choose to drive over a sidewalk to get the person there?

Rules are (usually) there for reasons..... but knowing when to bend or break them is a higher-level reasoning challenge....

Rgds, D.

I agree there definitely are a great many attractions, but as noted there are also a lot of challenges --- many of which may seem like "improbable/impossible" corner cases to the casual observer. However, those seemingly "impossible" cases can start becoming pretty routine/regular when applied to large fleets of vehicles; "one in a million" generally sounds pretty impossible until it's applied to a sample size of tens/hundreds of millions or billions. For consideration: the 737 MAX fleet had a fatal accident rate of ~4 per million flights, with roughly 500,000 flights flown, when the fleet was grounded.
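The fleet-scale point can be made concrete with back-of-the-envelope arithmetic: if each trip independently has a "one in a million" failure probability, the chance of seeing at least one failure is 1 - (1 - p)^n, which climbs fast with exposure.

```python
# Back-of-the-envelope: a "one in a million" per-trip failure becomes
# near-certain once enough trips accumulate across a large fleet.
p = 1e-6  # assumed per-trip failure probability, purely illustrative

for trips in (1_000, 1_000_000, 100_000_000):
    at_least_one = 1 - (1 - p) ** trips
    print(f"{trips:>11,} trips -> P(at least one failure) ~ {at_least_one:.3f}")
```

At a million trips the probability is already about 63% (the familiar 1 - 1/e), and at a hundred million trips it is effectively certain - which is why "rare" corner cases dominate fleet-level safety discussions.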

If anything, I expect we'll first see the assistive-driving technologies cause some of the same issues seen in commercial airliners: the automated systems are so good that the human crew gets complacent and stops paying as much attention .... which (as has been seen) can cause additional issues when the automation gives up (or goes wonky in some way) and hands full control back to the human, who is still legally responsible for controlling the aircraft.

So just my own opinion, but I suspect it'll still be another 5-15 years before automated aircraft are routinely permitted to fly without human supervision (even with all the considerable governmental speed/effort being applied to that problem). I suspect that will also occur at least 10-20 years before fully self-driving vehicles become commercially available.

I suspect that will be the case in large part because knowing when/how to bend/break the rules - and the liability/accountability issues involved in that sort of decision making - will be more of a necessity on the ground, since there's a greater probability of having to deal with intentional & adverse behavior by humans engaged in criminal or other anti-social activity. For example (a serious question): how do current technologies handle road-raging drivers attempting to run someone in a tech-assisted vehicle off the road? ...not something that can just be hand-waved away when designing a self-driving vehicle (the question could also be applied to occupants of another vehicle throwing things at such vehicles).

It'd be nice to be wrong, but from what I've seen I'm probably still being overly optimistic in my time estimates - and whether I am or not, I'll be pleasantly surprised if I see fully self-driving cars in my lifetime (which could extend out another 60-ish years). ...granted, it seems the more work people are relieved of doing, the more anti-social some become (I've yet to see more technology really solve the anti-social human problems of the world).
 
   / Driverless Cars #590  
Self-driving cars need infrastructure to work; some demo cities are already up and running.
GPS does not work well in canyons, natural or formed by buildings.
This infrastructure is slowly spreading, but it costs money. Car firms are footing most of the bills now, and some cities/states are chipping in. Once the mesh is working, cars will advance a lot, as most of them must "try" to do this function using only radar and cameras today.
 
 