"Although the chief acknowledges EV fires require different tactics from firefighters, Tesla and government safety data assert that traditional internal-combustion vehicles experience one fire for every 19 million miles traveled; for Tesla EVs, it's one fire for every 205 million miles traveled."
According to that "government safety data", we are roughly 10 times less likely to experience a fire in an EV than in an internal-combustion vehicle.
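As a quick sanity check on that "10 times" figure, here's a sketch using only the numbers quoted above (the variable names are mine, not from any report):

```python
# Average miles travelled per fire, from the figures quoted above.
ice_miles_per_fire = 19_000_000      # internal-combustion vehicles
tesla_miles_per_fire = 205_000_000   # Tesla EVs

# How many times farther a Tesla travels per fire, on average.
ratio = tesla_miles_per_fire / ice_miles_per_fire
print(round(ratio, 1))  # → 10.8
```

So "10 times less likely" is a slight rounding down; the quoted figures actually work out to a bit under 11.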
__________________
"No friend ever served me, and no enemy ever wronged me, whom I have not repaid in full."
If, over time, it were demonstrated that driverless vehicles were involved in fewer crashes than driver-controlled vehicles, what then?
My thoughts on the future of personal transport are that there may be small aircraft units that will get us all from A to B, a bit like we saw on the Jetsons.
Even if they do perfect driverless cars, it won't solve congestion, and by the time the technology all works, who knows what the population of the world might be.
Unless many of us perish during the testing of the driverless vehicles.
I'm just playing the devil's advocate, but aren't we already delegating control of our cars to computers to some extent?
For example, are there any cars today which don't have antilock braking systems? These gadgets modulate the braking pressure to avoid locking the wheels, in much the same way that a professional rally driver or circuit racer would do. If we didn't have these devices, then those of us who aren't rally drivers would in most cases instinctively stand on the brakes and skid off the road.
The computer in a driverless vehicle would have a 360-degree view of the environment, and would be aware of all the car's systems, including engine, electrical, suspension, braking, etc. Ideally it should be able to make the best possible decision within a far shorter reaction time than a human. I think it's just a matter of time before we get to this point. Chess computers are already outperforming grandmasters, so why should we expect that even the dumbest human will outperform a sophisticated car computer?
As for the Tesla accident, I'm seeing conflicting reports. Musk is saying that the autopilot was not engaged and that the "full self-driving" option was not purchased. Other reports are saying that the car can't be driven without a human in the driver's seat, although hackers have demonstrated how to circumvent this. So if there was no autopilot and no self-driving software, how was the car driving itself? Is there a "partial self-driving" option, and if so, what is the difference?
__________________
I'm sure we will get to the point where driverless cars are safer than human-controlled ones, but I think that is a very long way off.
The analogy with chess (or Go) is a poor one: those are simply linear computing problems, whereas driving a car is a real-time asynchronous one, and therein lies the issue. "It is impossible to fully test such software"; that statement comes from the European standard for medical software, and it's correct.
So many things (known or unknown) may occur at any time and in any sequence that it is impossible to predict and programme for them all. Consider how long it takes a human to learn high-level driving skills, and even then we still get it wrong from time to time.
And then there is the philosophical aspect which this article touches upon:
Tesla's blog reports that Teslas running on autopilot travel 4 million miles between crashes, whereas regular vehicles achieve only 500,000 miles. That seems to say you're already 8 times safer in a driverless car. In the case of that fatal Uber accident, there was a human at the wheel, so who is to blame? We may well argue that if a human had been in full control, the accident might not have happened. But we could equally argue that in 8 other cases an accident could have been avoided if a computer had been driving.
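The same arithmetic as before, applied to the crash mileages quoted from Tesla's blog (again, the variable names are mine):

```python
# Average miles travelled between crashes, from the figures quoted above.
autopilot_miles_per_crash = 4_000_000   # Teslas running on autopilot
regular_miles_per_crash = 500_000       # regular vehicles

# Relative safety factor implied by those two figures.
ratio = autopilot_miles_per_crash / regular_miles_per_crash
print(int(ratio))  # → 8
```

The 8x figure follows directly from the two quoted mileages, though of course it assumes the two fleets are otherwise comparable, which is exactly what's debated.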
As for high-level driving skills, I think they're only ever achieved by rally drivers and circuit racers. How many times have we failed to see a cyclist or a motorbike? I was almost killed by a truck driver who didn't see me in his mirrors. How often do we hear of toddlers being killed in their driveways? You can mitigate these problems with appropriate sensors that alert a human driver, but you're still limited by reaction time.
On the subject of morality, if statistics prove that you're less likely to be involved in an accident if you allow your car's computer to handle the driving, should you still turn off the autopilot? In fact, let's take that a little further. If you were involved in an accident that killed a pedestrian who jumped out in front of your car, and if it was determined that autopilot could have reacted quickly enough to save the pedestrian, would you be liable, at least morally, for being in control of your vehicle instead of your autopilot?
-- Edited by dorian on Saturday 24th of April 2021 04:28:12 AM
__________________
The big problem with computers is that they will only operate within their programmed parameters.
The human brain really has few such parameters and is always learning.
You are correct, Dorian, that motorsport does train your brain to operate on a different level and to react at a different speed, so a driver with that training can and often will get out of a situation through those quick reactions.
However, it also often comes down to being able to put the car through a fairly violent manoeuvre to avoid an accident, and those sorts of manoeuvres would not be programmed into a computer, which will always stay in the safe zone.
This was highlighted in a couple of Airbus crashes, in particular one at an airshow. The investigation showed that the plane was heading for the trees and the pilot tried to pull a hard manoeuvre that, in the investigators' view, would have saved the plane, but the computer wouldn't allow it, so the plane went into the trees.
This is why I would never want a computer to hold my life in its grip; I want to be steering the car myself.
The Airbus crash at the airshow was attributed to pilot error. In fact, the pilot was convicted of a crime, IIRC. The black box data appeared to prove that the plane was behaving as instructed by the pilot, even though the pilot claimed otherwise. He was accused of flying too close to the ground, ostensibly to impress onlookers, despite having commercial passengers onboard. He was also unfamiliar with that particular airport and didn't see the trees until he hit them. Of course, there are conspiracy theories suggesting that the black box data were tampered with, and the fact that this was Airbus's first public flight adds fuel to those suspicions. The TV programme I watched also showed footage of the black box that suggested it looked different in one shot than in another.
-- Edited by dorian on Sunday 25th of April 2021 12:49:39 PM
__________________