Hail Satan.

Mbin
Sharkey

  • 0 Posts
  • 34 Comments
Joined 1 year ago
Cake day: June 10th, 2023

  • People have been hit and killed by human drivers at much, much higher rates than by SDCs. Those aren’t hiccups, and those are deaths that shouldn’t have happened, either. The miles-driven-per-collision ratios for humans and SDCs aren’t even comparable: human drivers are an order of magnitude more dangerous, and there are an order of magnitude more human drivers than SDCs in the cities where these fleets are deployed.

    By your logic, you should agree that we should be revoking licenses and removing human drivers from the equation, because people are far more dangerous than SDCs are. If we can’t drive safely without killing people, then we shouldn’t be licensing people to drive, right?




  • They’ve already been testing on private tracks for years. There comes a point where, eventually, something new is used for the first time on a public road. Regardless, even with idiotic crashes like this one factored in, they’re still safer than human drivers.

    I say my tax-dollar-funded DMV should put forth a significantly more stringent driving test and auto-revoke the licenses of anybody who doesn’t pass before I’d want SDCs off the roads. Inattentive drivers are among the most lethal things in the world, and we all just kinda shrug our shoulders and ignore that problem, but then we somehow take issue when a literal supercomputer on wheels, with an audited safety history far exceeding any human driver’s, has two hiccups over the course of hundreds of millions of driven miles. It’s just a weird outlook, imo.


  • After an investigation, Waymo found that its software had incorrectly predicted the future movements of the pickup truck due to “persistent orientation mismatch” between the towed vehicle and the one towing it.

    Having worked at Waymo for a year troubleshooting daily builds of the software, this sounds to me like they may be trying to test riskier, “human” behaviors. Normally, the cars won’t accelerate at all if the lidar detects an object in front of them, no matter what they think the object is or what direction it’s moving in. So the fact that this failsafe was somehow overridden makes me think they’re trying to add more “What would a human driver do in this situation?” options to the car’s decision-making process. I’m guessing somebody added something along the lines of “assume the object will have started moving by the time you’re closer to that position” and forgot to set a backup safety mechanism in case the object doesn’t start moving.
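
    To be clear, this is just my guess at the shape of the bug, not anything from Waymo’s actual codebase. A toy sketch of the logic I’m describing, with made-up names, would look something like:

```python
# Hypothetical sketch of the failure mode described above -- NOT Waymo's
# actual code. Original failsafe: never accelerate while lidar reports an
# object ahead. The guessed buggy change: trust the prediction that the
# object "will have moved by then," with no fallback if it doesn't.

def may_accelerate(lidar_object_ahead: bool,
                   predicted_clear: bool,
                   use_prediction: bool) -> bool:
    if not lidar_object_ahead:
        return True  # nothing in front of us
    if use_prediction and predicted_clear:
        # Bug: the prediction overrides the lidar failsafe, and nothing
        # re-checks whether the object actually started moving before
        # the car closes the gap.
        return True
    return False  # the original hard failsafe

# Old behavior: object ahead -> never accelerate, prediction or not.
print(may_accelerate(True, True, use_prediction=False))  # False
# Guessed new behavior: prediction alone green-lights acceleration,
# even if the towed truck never moves.
print(may_accelerate(True, True, use_prediction=True))   # True
```

    The backup mechanism I’d expect is a re-check of the lidar return as the gap closes, so the prediction can only delay braking, never replace it.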

    I’m pretty sure the dev team also has safety checklists they go through before pushing out any build, to make sure every failsafe is accounted for, so that’s a pretty major fuckup to have slipped through the cracks (if my theory is even close to accurate). But luckily, it’s a very easily fixed one. They’re lucky this situation was just “comically stupid” instead of “harrowing tragedy”.