Tesla demands removal of video of cars hitting child-size mannequins

Eric

Mama's lil stinker
From the Washington Post:
SAN FRANCISCO — Tesla is demanding an advocacy group take down videos of its vehicles striking child-size mannequins, alleging the footage is defamatory and misrepresents its most advanced driver-assistance software.

In a cease-and-desist letter obtained by The Post, Tesla objects to a video commercial by anti-“Full Self-Driving” group the Dawn Project that appears to show the electric vehicles running over mannequins at speeds over 20 mph while allegedly using the technology. The commercial urges banning the Tesla Full Self-Driving Beta software, which enables cars on city and residential streets to automatically lane-keep, change lanes and steer.

The video in question
 

mac_in_tosh

Site Champ
It is arrogant to think that a piece of software can allow a car to drive by itself. Software has bugs, and the more complicated the software is, the more bugs there are; hence the operating system updates we continually have to install. A programmer cannot possibly account for every possibility and complication a car will encounter on everything from single-lane country roads to multi-lane highways, in all road and weather conditions. I can just imagine: "This update addresses an issue where, under certain rare circumstances, the car's windshield wipers are activated instead of the brakes."

From the little I've heard Musk speak, he seems prone to unrealistic predictions. And besides, why do we even need this?
 

quagmire

Site Champ
It is arrogant to think that a piece of software can allow a car to drive by itself. Software has bugs, and … A programmer cannot possibly account for all possibilities and complications that a car will encounter

The issue here, though, is that the test was likely manipulated to get the result they desired, which was Teslas killing fake kids. The guy likely had his foot on the accelerator, so even if the car saw the fake kid and wanted to stop, the human still pressing the gas pedal overrides the computer.

The guy behind the group is not unbiased. He is running for office in California on a platform of taking down Musk and Tesla.
 

Eric

Mama's lil stinker
The issue here though is that test was likely manipulated to get the results they desired … The guy behind the group is not unbiased.
Essentially, "my car hit something as I was pressing the accelerator and driving right into it." It's akin to driving head-on into a wall; no reasonable person would ever expect the AI in a car to avoid an obstacle under those conditions.
 

rdrr

Elite Member
This video was bombastic to begin with. All the test would need to show is whether the car can determine if the object ahead is a crumpled paper bag it can drive over or a rock it should stop for or steer around. Most computer "AI" cannot make the correct choice that the human brain discerns in a millisecond.
 

Eric

Mama's lil stinker
This guy does a pretty good job of explaining it. TL;DR: don't press the accelerator pedal and ignore the flashing warnings as you speed into your intended target.

 

Yoused

up
It is arrogant to think that a piece of software can allow a car to drive by itself. Software has bugs, and … A programmer cannot possibly account for all possibilities and complications that a car will encounter

Thing is, programmers are mostly not directly involved in developing self-driving systems. It is typical for a system like this to be built on pseudo-neural-network hardware, in such a way that the logic can learn dynamically through observation. Essentially, it is AI that programs itself (and is extremely difficult to debug when it goes wrong).

Obviously FSD should have redundancies (probably at least quadruple) with some form of elaborate inter-agent dispute resolution. More importantly, AI should not be trained to drive like me (or most other Americans) but adopt a more sensible, less meth-head-like approach that would provide a great deal of margin for not encountering Trolley Problem situations. In addition, any vehicle capable of FSD should have a broad variety of sensors, so that it can readily distinguish a white semi trailer from a cloudy sky, and should be designed to bring the car to a safe stop if a situation is too difficult or complex for it to handle.

In other words, FSD is realistically possible and, properly designed, might well be preferable to many of the human so-called-drivers I have had to contend with.
 

mac_in_tosh

Site Champ
As the AI can't be trained on an essentially infinite set of contingencies (given the number of possible combinations of road conditions, weather, traffic, pedestrians, flooding, etc.), it would have to be pretty much on the level of the human brain to cope with all situations. Is it anywhere near that level? Musk has been known to overstate the autonomous capabilities of his cars. He's now predicting FSD will be safer than a human by the end of 2022; I believe he had previously predicted it would arrive even earlier.

For reference, last year from the Wall Street Journal: Self-Driving Cars Could Be Decades Away, No Matter What Elon Musk Said
 

Eric

Mama's lil stinker
I would argue that if you choose to press the accelerator and the car automatically shuts down (whether it thinks it has seen an obstacle or not), that could be just as dangerous. There are enough phantom events going on as it is, and the driver has to bear some personal responsibility; anyone who accelerates toward an object or person is an idiot, regardless of onboard systems.
 

Yoused

up
I would argue that if you choose to press on the accelerator and the car automatically shut down (whether it feels it has seen an obstacle or not) that it could be just as dangerous.
My position would be that stepping on the accelerator should be treated like what cruise control does when you touch the brake: if you try to accelerate or turn, it should give a loud, bright warning and cede control, though first it should do some consciousness checks to make sure your foot did not drift onto the pedal, or your hand grab the wheel, in your dreams. It really cannot be that hard to design a carbot that behaves sensibly.
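The handoff logic described above (warn first, sanity-check the driver, then cede control) can be sketched in a few lines. This is a hypothetical illustration only; every name, threshold, and signal here is invented for the sketch and does not correspond to any real vendor's system.

```python
# Hypothetical sketch of the proposed handoff logic: driver input triggers a
# warning and an alertness check before the automation cedes control.
# All names and thresholds are invented for illustration.

from dataclasses import dataclass


@dataclass
class DriverInput:
    accelerator: float       # 0.0-1.0 pedal position
    steering_torque: float   # torque (Nm) the driver applies to the wheel
    eyes_on_road: bool       # e.g. from a driver-monitoring camera


def handle_driver_input(inp: DriverInput) -> str:
    """Return the action the automation should take for one input sample."""
    override_requested = inp.accelerator > 0.05 or abs(inp.steering_torque) > 1.0
    if not override_requested:
        return "automation_continues"
    # Consciousness check before ceding: the input may be accidental,
    # e.g. a sleeping driver's foot drifting onto the pedal.
    if not inp.eyes_on_road:
        return "warn_and_ignore_input"
    # Loud, bright warning, then hand control back to the human.
    return "warn_and_cede_control"
```

The key design choice, per the post above, is that raw pedal input alone is not treated as proof of intent; the system demands a second signal of driver alertness before it gives up control.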
 

Eric

Mama's lil stinker
My position would be that stepping on the accelerator would be comparable to what cruise does when you touch the brake: if you try to accelerate or turn, it should give a loud, bright warning and cede control …
This is exactly what the Tesla does, BTW, to the point that it's so annoying that instead of trying to push the accelerator while it's on Autopilot, I just terminate the cruise control entirely by either hitting the brakes or flipping the stalk up. In fact, if you push the accelerator while on AP, it first warns you, then cuts you off for the rest of the drive and will not let you re-enable it.

That is what's telling about this video: the driver would have had to ignore all kinds of alarms (and they're loud enough to scare the shit out of you) at the end of that run, and the car would have halted all automation at that point. They don't show any of that once the car starts toward the cones, and the last interior shot before it hits the cones has a warning on the screen.
 