TOPIC: Tesla crashes, no driver, 4 hours to put out flames


Guru

Status: Offline
Posts: 3979
Date:
Tesla crashes, no driver, 4 hours to put out flames


"Although the chief acknowledges EV fires require different tactics by firefighters, Tesla and government safety data asserts that traditional internal-combustion vehicles experience one fire for every 19 million miles traveled; for Teslas EVs, it's one fire for 205 million miles traveled."

According to "government safety data", we are 10 times less likely to experience a fire in an EV than in an internal-combustion vehicle.
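For what it's worth, the "10 times" figure is just the ratio of the two quoted mileages. A minimal back-of-the-envelope sketch, using only the numbers quoted above:

```python
# Back-of-the-envelope check of the quoted figures:
# one fire per 19 million miles for internal-combustion cars,
# one fire per 205 million miles for Teslas.
ice_miles_per_fire = 19_000_000
tesla_miles_per_fire = 205_000_000

ratio = tesla_miles_per_fire / ice_miles_per_fire
print(f"Teslas travel roughly {ratio:.1f} times as far per fire")  # ~10.8
```

Strictly it works out closer to 11 times, but "about 10 times less likely" is a fair reading of those figures.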

__________________

"No friend ever served me, and no enemy ever wronged me, whom I have not repaid in full."

Lucius Cornelius Sulla - died 78 BC 

 



Guru

Status: Offline
Posts: 7640
Date:

Aluminium and electricity are a terrible mix. I've had burns from aluminium and electrical flash as a lineworker!! They used that method as flash lighting for photos years ago.


__________________
What's out there


Guru

Status: Offline
Posts: 3979
Date:

If, over time, it were to be demonstrated that driverless vehicles were involved in fewer crashes than driver-controlled vehicles, what then?


__________________

"No friend ever served me, and no enemy ever wronged me, whom I have not repaid in full."

Lucius Cornelius Sulla - died 78 BC 

 



Senior Member

Status: Offline
Posts: 386
Date:

dorian wrote:

If, over time, it were to be demonstrated that driverless vehicles were involved in fewer crashes than driver-controlled vehicles, what then?


 My thoughts on the future of personal transport are that there may be small aircraft units that will get us all from A to B, a bit like we saw on the Jetsons.

Even if they do perfect driverless cars, it won't solve congestion, and by the time the technology all works, who knows what the population of the world might be.

Unless many of us perish during the testing of the driverless vehicles.

 



__________________

Stu



Guru

Status: Offline
Posts: 3979
Date:

I'm just playing the devil's advocate, but aren't we already delegating control of our cars to computers to some extent?

For example, are there any cars today which don't have antilock braking systems? These gadgets modulate the braking pressure to avoid locking the wheels, in much the same way that a professional rally driver or circuit racer would do. If we didn't have these devices, then those of us who aren't rally drivers would in most cases instinctively stand on the brakes and skid off the road.
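To make the "modulate the braking pressure" idea concrete, here is a deliberately simplified sketch of the kind of control loop an ABS-style system runs many times a second. The slip threshold and step sizes are invented for illustration only and are not taken from any real controller:

```python
# Toy illustration of the ABS idea described above: release brake pressure
# when a wheel starts to slip too much, reapply it when grip returns.
# Purely a simplified sketch; real ABS controllers are far more involved.

def abs_step(brake_demand, wheel_speed, vehicle_speed, pressure):
    """Return the next brake pressure (0..1) for one control cycle."""
    slip = 0.0 if vehicle_speed == 0 else (vehicle_speed - wheel_speed) / vehicle_speed
    if slip > 0.2:                                  # wheel locking up: back off
        return max(0.0, pressure - 0.1)
    return min(brake_demand, pressure + 0.05)       # grip OK: work back toward demand

# Example: driver stands on the brakes (demand = 1.0) while the wheel slows
# much faster than the car, so the controller modulates the pressure.
pressure = 1.0
for wheel, vehicle in [(10.0, 25.0), (14.0, 24.0), (20.0, 23.0)]:
    pressure = abs_step(1.0, wheel, vehicle, pressure)
    print(round(pressure, 2))
```

The point is simply that the computer eases off before the wheel locks and then reapplies, far faster than a human could pump the pedal.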

The computer in a driverless vehicle would have a 360 degree view of the environment, and would be aware of all the car's systems, including engine, electrical, suspension, braking, etc. Ideally it should be able to make the best possible decision within a far shorter reaction time than a human. I think it's just a matter of time before we get to this point. Chess computers are already outperforming grand masters, so why should we expect that even the dumbest human will outperform a sophisticated car computer?

As for the Tesla accident, I'm seeing conflicting reports. Musk is saying that the autopilot was not engaged and that the "full self driving" option was not purchased. Other reports are saying that the car can't be driven without a human in the driver's seat, although hackers have demonstrated how to circumvent this. So if there was no autopilot and no self driving software, how was the car driving itself? Is there a "partial self driving" option, and if so, what is the difference?

__________________

"No friend ever served me, and no enemy ever wronged me, whom I have not repaid in full."

Lucius Cornelius Sulla - died 78 BC 

 



Guru

Status: Offline
Posts: 4706
Date:

I'm sure we will get to the point where driverless cars are safer than human-controlled ones, but I think that is a very long way off.

The analogy with chess (or Go) is a poor one: those are simply linear computing problems, whereas driving a car is a real-time, asynchronous one, and therein lies the issue. "It is impossible to fully test such software"; that statement comes from the European standard for medical software, and it is correct.

So many things (known or unknown) may occur at any time and in any sequence that it is impossible to predict and programme for them all. Consider how long it takes a human to learn high-level driving skills, and we still get it wrong from time to time.
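To put a rough number on "so many things may occur at any time and in any sequence", here is a quick sketch counting scenario combinations for a handful of coarse factors. The factors and option counts are made up purely for illustration:

```python
# Rough illustration of why exhaustive testing is hopeless: count the
# scenario combinations for a few independent driving conditions.
from math import prod

factors = {
    "weather":          5,  # clear, rain, fog, snow, glare
    "lighting":         3,  # day, dusk, night
    "road_surface":     4,  # dry, wet, icy, gravel
    "other_road_users": 6,  # cars, trucks, cyclists, pedestrians, animals, none
    "their_behaviour":  8,  # a crude bucket of manoeuvres
    "sensor_state":     3,  # clean, partly obscured, faulty
}

print(prod(factors.values()))  # 8,640 combinations from just six coarse factors
```

And that is only the combinations; the order and timing in which those things happen multiplies the space again, which is the asynchronous part of the problem.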

And then there is the philosophical aspect which this article touches upon:

Self driving cars - morality

One big problem with such systems is that computers are not afraid of death.



__________________

 

"I beseech you in the bowels of Christ think it possible you may be mistaken"

Oliver Cromwell, 3rd August 1650 - in a letter to the General Assembly of the Kirk of Scotland



Guru

Status: Offline
Posts: 3979
Date:

Tesla's blog reports that Teslas running on autopilot travel 4 million miles between crashes, whereas regular vehicles achieve only 500,000 miles. That seems to be saying that you're already 8 times safer in a driverless car. In the case of that fatal Uber accident, there was a human at the wheel, so who is to blame? We may well argue that if a human had been in full control, the accident might not have happened. But we could equally argue that in 8 other cases an accident could have been avoided if a computer had been driving.
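As with the fire statistics earlier, the "8 times" figure is simply the ratio of the two quoted mileages (a rough sketch, just dividing the numbers quoted from Tesla's blog):

```python
# Dividing the figures quoted from Tesla's blog.
autopilot_miles_per_crash = 4_000_000
average_miles_per_crash = 500_000

print(autopilot_miles_per_crash / average_miles_per_crash)  # 8.0
```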

As for high level driving skills, I think they're only ever achieved by rally drivers and circuit racers. How many times have we missed seeing a cyclist or a motorbike? I was almost killed by a truck driver who didn't see me in his mirrors. How often do we hear of toddlers being killed in their driveway? You can mitigate these problems with appropriate sensors that alert a human driver, but you're still limited by reaction time.

On the subject of morality, if statistics prove that you're less likely to be involved in an accident if you allow your car's computer to handle the driving, should you still turn off the autopilot? In fact let's take that a little bit further. If you were involved in an accident that killed a pedestrian who jumped out in front of your car, and if it was determined that autopilot could have reacted quickly enough to have saved the pedestrian, would you be liable, at least morally, if you were in control of your vehicle instead of your autopilot?



-- Edited by dorian on Saturday 24th of April 2021 04:28:12 AM

__________________

"No friend ever served me, and no enemy ever wronged me, whom I have not repaid in full."

Lucius Cornelius Sulla - died 78 BC 

 



Guru

Status: Offline
Posts: 7640
Date:

I think statistics get in the way here!! Things can be manipulated with statistics!

__________________
What's out there


Guru

Status: Offline
Posts: 1382
Date:

Well, if your head's in the oven and your bum's in the freezer, statistically you're comfortable.

__________________

I reserve the right to arm bears :)



Guru

Status: Offline
Posts: 4375
Date:

Aus-Kiwi wrote:

I think statistics get in the way here!! Things can be manipulated with statistics!


 Especially if you don't like what they say.

Cheers,

Peter



__________________

OKA196, 4x4 'C' Class, DIY, self contained motorhome. 960W of solar, 400Ah of AGMs, 310L water, 280L fuel. https://www.oka4wd.com/forum/members-vehicles-public/569-oka196-xt-motorhome
 

 



Member

Status: Offline
Posts: 11
Date:

And you guys want me to put these batteries in my caravan?

__________________


Guru

Status: Offline
Posts: 1160
Date:

The big problem with computers is that they will only operate within their programmed parameters. The human brain has few parameters, really, and is always learning.

You are correct, Dorian, that motorsport training does train your brain to operate on a different level and to react at a different speed, so a driver who has that training can, and often will, get themselves out of a situation by those quick reactions. However, it also often comes down to being able to put the car through a fairly violent manoeuvre to avoid an accident, and that sort of manoeuvre would not be programmed into a computer, which will always stay in the safe zone.

This was highlighted in a couple of Airbus crashes, in particular one at an airshow, where the investigation showed that the plane was heading for the trees and the pilot tried to pull a hard manoeuvre that the investigators thought would have saved the plane, but the computer wouldn't allow it, and so the plane went into the trees.

This is why I would never want a computer to hold my life in its grip, and why I want to be steering the car myself.
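For what it's worth, the "computer wouldn't allow it" behaviour described above is broadly what aviation calls flight-envelope protection: the computer clamps the human's demand to a preset limit. A toy sketch of that idea, with invented numbers and no relation to any real Airbus or Tesla software:

```python
# Toy sketch of envelope protection: the controller clamps the commanded
# input to a preset "safe" limit, so an extreme emergency manoeuvre the
# human asks for may simply never be executed.
# Limits and numbers are invented for illustration only.

SAFE_LIMIT = 15.0   # maximum command the envelope allows (arbitrary units)

def protected_command(pilot_demand):
    """Clamp the human's demand to the protection envelope."""
    return max(-SAFE_LIMIT, min(SAFE_LIMIT, pilot_demand))

print(protected_command(25.0))   # human asks for 25, gets 15
print(protected_command(-8.0))   # within the envelope, passed through unchanged
```

Whether that limit helps or hurts depends on whether the situation matches what the designers anticipated, which is essentially the point being made above.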

__________________

Greg O'Brien



Guru

Status: Offline
Posts: 3979
Date:

The Airbus crash at the airshow was attributed to pilot error. In fact, the pilot was convicted of a crime, IIRC. The black box data appeared to prove that the plane was behaving as instructed by the pilot, even though the pilot claimed otherwise. He was accused of flying too close to the ground, ostensibly to impress onlookers, despite having commercial passengers on board. He was also unfamiliar with that particular airport and didn't see the trees until he hit them. Of course, there are conspiracy theories which suggest that the black box data were tampered with, and the fact that this was Airbus's first public flight adds fuel to those suspicions. The TV programme I watched also showed footage of the black box which suggests that it looked different in one shot than in another.



-- Edited by dorian on Sunday 25th of April 2021 12:49:39 PM

__________________

"No friend ever served me, and no enemy ever wronged me, whom I have not repaid in full."

Lucius Cornelius Sulla - died 78 BC 

 
