I suspect liability

Story: Smartcars – dangerous or simply can’t make money out of the apps? Total Replies: 11
Author Content

May 26, 2013
9:14 AM EDT
The author brings up a lot of what-ifs and perhapses about why the auto industry is so reluctant to jump on board, but I think he overlooks plain ol' liability. Oh sure, when was the last time yer tablet failed? Well, seeing as they haven't been heavily in play for 10 yrs, who knows. I've certainly had computer components fail, yet my desktop hasn't moved 5 inches in 5 yrs. Throw in 1,000 miles of bad road, exposure to the elements, and the bottom-line boys, and yer gonna get failure, sure as water is wet and the Sun sets in the West.

So, who is gonna be responsible for that failure? Google? GM? AT&T? Caltrans? Who is gonna underwrite the insurance? Private? Govt? Corporate legal teams don't wanna even consider such a tangled legal nightmare. And you can bet that failure is gonna be BIG!! Maybe in the 1,000+ cars per incident range. Plus human injuries, plus.... Even I don't wanna think about it. In fact, I don't even want an autonomous car for myself. If a great automobile/roadway control system works anything at all like my ISP, we'll all be dead in 90 days. :)

May 26, 2013
11:45 AM EDT
Computers and computer software have been used for engine control, braking and other critical applications in cars for at least 20 years. The hardware is specifically designed to handle the environmental conditions in automobiles which are similar to military specifications that have been in place for much longer than that. In the military they are used for aircraft flight control surfaces that allow unstable aircraft designs to be flown by humans and in ways that could never be done without them.

The concerns about where the liability lies and who takes responsibility in case of failure are probably valid.

May 26, 2013
3:09 PM EDT
> The concerns about where the liability lies and who takes responsibility in case of failure are probably valid.

No doubt. It's one thing for all concerned to write off an engine stoppage halfway home during a 70-mile commute due to a computer-controlled ignition failure, where the driver can manually maneuver his vehicle to the roadside. It's quite another when the uniform spacing at 70 mph fails and hundreds of cars end up rear-ending each other, with massive injuries the result. I suspect NO ONE wants that responsibility. The single car on a test roadbed is jes another "lookee me" ego trip.


May 26, 2013
4:10 PM EDT
I worked with an oil company at one of its refineries. We had computer-controlled processing software (Advanced Process Control - APC) running on Digital VAX & Alpha computer systems with the VMS OS. That software was increasing the refinery's revenue by about $10 million a year over manual operation.

It was set up so that if the software or hardware were to fail, plant operators would be notified by an alarm to take over and keep production going until the issue was fixed. The operators hated that, not the software, but the failure, because when it happened they had to leave whatever they were doing at the time [mostly playing games :-) ] and attend to the manual control console. The hardware and OS hardly ever went down; Digital hardware & VMS were famous for their reliability and robustness. It was the APC software and its interfaces to the gateway controllers that were the culprits most of the time.
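The takeover scheme described here (automation runs until a fault, then an alarm tells human operators to take over manual control) is essentially a watchdog timer. A minimal sketch of the idea in Python; all names and the timeout value are hypothetical, not from any real APC system:

```python
import threading
import time


class Watchdog:
    """Alarm if the automatic controller stops sending heartbeats.

    The controller "pets" the watchdog on every successful control
    cycle; an independent monitor calls check() periodically, and if
    too much time passes without a heartbeat, an alarm callback tells
    the operators to take over manual control.
    """

    def __init__(self, timeout_s, on_alarm):
        self.timeout_s = timeout_s      # how long silence is tolerated
        self.on_alarm = on_alarm        # e.g. sound the control-room alarm
        self._last_beat = time.monotonic()
        self._lock = threading.Lock()

    def heartbeat(self):
        """Called by the controller on every successful control cycle."""
        with self._lock:
            self._last_beat = time.monotonic()

    def check(self):
        """Called by an independent monitor; returns True while healthy."""
        with self._lock:
            silent_for = time.monotonic() - self._last_beat
        if silent_for > self.timeout_s:
            self.on_alarm(silent_for)   # hand control back to the humans
            return False
        return True
```

The key design point, as in the refinery setup, is that the monitor is independent of the controller it watches: a failure in the automation cannot also silence the alarm.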

Now with all the new technology and better hardware, I am sure a car can be fitted with similar automation, and when something goes wrong, the driver would be alerted to take over. I am sure manufacturers would have the owner of the vehicle sign an agreement relinquishing any responsibility. Whether that is legal or not, IANAL so I don't know for sure.

Wait a minute, doesn't NASA automate most, if not all, of its spacecraft and satellite flight navigation, and doesn't it have a pretty good record in doing that?


May 26, 2013
4:32 PM EDT
I used to work on engine controllers and many friends still do. Perhaps some ECUs in the last five years have started using ARM, but I know of no car before about 2007 that had an ARM in any ECU. Yes, they might, probably do, have an ARM in the dashboard "infotainment" system, but that won't be controlling the safety-critical elements of the car at all. Such ECUs (and there may be many, looking after the engine, brakes, gearbox and more) will be buried inside the car. So the article, as far as I read, was just plain wrong.

There's no conspiracy to why cars don't have these features. It's simply cost: the high-volume models are always a generation behind. They are building stuff into a vehicle that is intended to last for 10 or more years; the life of most phones seems to be 18 months at a push. A car has been 3-5 years in the making before it hits production, so you just can't put state of the art into them easily, although modular thinking and changes of approach are making it more achievable. No conspiracy there. Cars just aren't yet a throwaway item.

May 26, 2013
7:42 PM EDT
@ notbob

> The single car on a test roadbed is jes another "lookee me" ego trip.

So far it has a much better driving record over several hundred thousand miles than a lot of humans. It doesn't text, make calls, shave, put on makeup, read, look at maps, eat, drink (or get drunk or stoned), lose concentration, go to sleep or get distracted by other goings-on in the automobile while it drives. It clearly has the potential to eliminate all the accidents caused by those human "failures" (which is pretty much all accidents), as well as ease traffic congestion and decrease commute times during heavy rush-hour traffic. The test roadbed has proved that it is feasible with today's technology. That's not to say that it is perfect now, or that there won't be some problems, but as I said before, military computers already reliably control functions far more complex and critical than driving. The only real dark cloud on this horizon is liability, and it would be a shame for that to be the only thing that holds it back.

May 27, 2013
7:45 AM EDT
I put my answer in my article "Smartcars: Dangerous if software companies would make them", which should be up in a few hours from now.

GDStewart: Please understand that when humans make mistakes, it doesn't have high financial consequences for companies; but when companies make technologies that lead to accidents, it does.

Example: I'm shaving in the car and hit a pedestrian. Too bad for my insurance company: they have to pay €300,000 to the pedestrian, maybe I'm sentenced to two months of jail if I did it by accident, a 20-line article appears on page 9 of the local newspaper, and life goes on.

I'm shaving in my Google autonomous-driving car and hit a pedestrian. Too bad for Google: they are liable and have to pay $25,000,000,000. Tons of negative news and bad publicity follow, pedestrians start fearing those autonomous cars, and Google shareholders start complaining.

Leaving aside that I live in Europe, with its less exorbitant damage awards: see the difference?

And that's apart from those few autonomous cars not being ready for mass production, because then everything is different. Ten autonomous cars really are a "hobby project" compared to the number of cars Toyota produces every year.

May 28, 2013
9:50 AM EDT
Impressive article, Hans. You might also consider the Therac-25 fiasco, in which four people died and two others were left maimed. The IEEE-published report starts here. If you want to read it, set aside an hour or two; it's a lot to digest.

There are some computer-oriented jobs I won't take: those which put at risk people's lives, limbs, or personal fortunes. And that was before I knew about the Therac-25 case.

May 28, 2013
10:42 AM EDT
As a follow-on, there's another aspect to consider: vulnerability to malefactors. With more digital communication comes more exposed security holes; SCADA and smartphones are just two current examples. I saw this coming 11 years ago:

Motorola, Enemy of Privacy

Sorry about the unrefined writing. But it's even more relevant now.

May 28, 2013
10:51 AM EDT
One of the reasons I support strict liability is not that I oppose fly-by-wire or automated pilots, but that whoever throws that switch has got to know that they are responsible for what happens after.

I've seen discussions of home-made EMP devices. What really, really scares me is some punk who thinks it's funny to set it off next to the highway to watch people crashing.

May 28, 2013
10:57 AM EDT
OTOH, if I could get my neighbor's stereo to crash...

May 28, 2013
12:40 PM EDT
gus3: Good point about vulnerability to malefactors; I hadn't thought of that yet.
