Tesla Running So-Called 'Full Self-Driving' Software Splatters 'Child' In School Bus Test
-
This post did not contain any content.
Tesla: “Well, we can fix that in the next update, right?”
-
Tesla: “Well, we can fix that in the next update, right?”
Musk: “I’m 80% sure we’ve fixed the ‘knowingly hits children’ bug.”
FSD: continues hitting children
-
I want to know how this could be easily avoided with lidar.
-
Look it’s called full self driving not full self avoidance, okay?
-
That seems consistent with human drivers. What’s the issue?
-
I want to know how this could be easily avoided with lidar.
It could have been avoided by the car stopping behind the school bus while the lights were flashing.
-
Just going to point out that while I have never been in favour of damaging random people’s personal property, and I firmly believe that destroying someone’s primary mode of transport usually causes a lot more problems than it solves:
If Tesla operates robotaxis that maim and kill people during routine operation like this, then destroying them isn’t vandalism. It’s self-defense.
If you can afford a Tesla (or just decide it’s worth the cost) you can afford to lose a Tesla.
-
Obviously it could tell the child wasn’t real, since it didn’t have pale pink skin.
-
Understandable. Both the school bus and the child mannequin were not moving. Its vision is based on movement.
-
Understandable. Both the school bus and the child mannequin were not moving. Its vision is based on movement.
What bullshit. So it can hit a stopped car, a wall, a tree, or something that fell off in the middle of the road?