

And do the released facts say that someone here was pointing a camera at the jury and that the scolding happened as a result, or are you just inventing a hypothetical that has nothing to do with what is being discussed?


I don’t know if it was intentional marketing, but it does have that effect and was kind of pointless. I assume people have camera phones with them in the courtroom too, but possessing a device that can record doesn’t mean you intend to use it. And I doubt Meta has tampered with their glasses, so if someone did record, it would be noticeable thanks to the recording LED…


Talk about the pot calling the kettle black. You’re literally not addressing any of my points, just accusing me of things I didn’t even do. I could be arguing with an LLM and get responses that make more sense, so I’m done, thanks.


I provided a counterpoint and now you’ve moved the goalposts to just ramps. The fact is that there’s no reason to believe the roads Waymos use are generally safer than roads on average. But that doesn’t really matter anyway, because the studies that have been done on this do account for different types of environments and still point to Waymos having fewer accidents.


It is obviously false that fatal accidents would be “zero” on the roads Waymos are limited to; it’s ridiculous to even suggest such a thing. What is true is that such accidents are even rarer there, though. That’s another good reason why it makes no sense to focus solely on fatal accidents: Waymos are unlikely to be involved in them anyway because of these limits. And that’s on top of the fact that meaningful statistical analysis is simply impossible with the current vehicle miles.
Now, I’m not saying we know for certain that Waymo is much safer than a human, as the current statistics imply; that will require more rigorous studies. I would say what we’ve got is good enough to conclude that nothing points to them being particularly unsafe.
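To put rough numbers on that (the baseline is an approximate US average, and the mileage figures are purely hypothetical):

```python
# Back-of-the-envelope sketch: how many fatal crashes would a fleet
# that is exactly as safe as the average human driver be *expected*
# to have over a given mileage?
# BASELINE is an approximate US figure; the mileages are hypothetical.
BASELINE_PER_100M_MILES = 1.25  # approx. US fatalities per 100M vehicle miles

for miles_millions in (10, 50, 100, 500):
    expected = BASELINE_PER_100M_MILES * miles_millions / 100
    print(f"{miles_millions:>4}M miles -> {expected:.2f} expected fatal crashes")

# Even at human-average safety you would expect only ~1.25 fatal
# crashes per 100M miles, so an observed count of 0-2 cannot
# statistically separate "much safer than humans" from "about average".
```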


Well, Waymo isn’t assigning blame; it’s a third-party assessment based on the information released about those accidents. The strongest point remains that fatal accidents are rare enough that there simply isn’t enough data to claim any statistical significance for these events. The overall accident rate, for which the data is sufficient, remains significantly lower than the US average.


All these services have the ability for a human to step in and resolve issues if the self-driving system disengages. That doesn’t mean they’re not driving on their own most of the time, including full journeys. The remote assistance team is just ready to jump in if something unusual causes the Waymo Driver to disengage, and even then they don’t usually control the car directly; they give the driver instructions on how to resolve the situation.


We are talking about Tesla robotaxis. They certainly do drive in very limited geofenced areas as well. While Waymo now goes on freeways, though only in the Bay Area and with the option offered to just some passengers, Tesla Robotaxis currently don’t go on any freeways at all. In fact, they only have a handful of cars doing any unsupervised driving whatsoever, and those are geofenced to a small area around a single stretch of road in Austin.
Tesla Robotaxis also currently cease operations in Austin when it rains, so Waymo is definitely the more flexible of the two in less-than-perfect conditions.


When there are two deaths total, it’s pretty obvious that there just isn’t enough data yet to assess the fatal accident rate. Also, FWIW, as was said, neither of those was in any way the Waymo’s fault.
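To illustrate just how wide the uncertainty is with only two events, here’s a quick sketch using the exact Poisson interval (the only input is the count of two; no other Waymo data is assumed):

```python
# Exact (Garwood) 95% confidence interval for a Poisson count.
# With k = 2 observed fatal events, the plausible underlying mean
# spans roughly 0.24 to 7.2 -- a factor of ~30 -- so no meaningful
# rate comparison is possible yet.
from scipy.stats import chi2

def poisson_ci(k: int, alpha: float = 0.05) -> tuple[float, float]:
    """Exact two-sided CI for the mean of a Poisson count k."""
    lower = 0.0 if k == 0 else chi2.ppf(alpha / 2, 2 * k) / 2
    upper = chi2.ppf(1 - alpha / 2, 2 * (k + 1)) / 2
    return lower, upper

print(poisson_ci(2))  # -> (~0.24, ~7.22)
```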


Yeah, I’ve seen that video, and another where they went back and forth for an hour in a single unsupervised Tesla. One thing to note is that they’re all geofenced to a single, extremely limited route spanning about a 20-minute drive along Riverside Dr and S Lamar Blvd, with the ability to drive on short sections of some of the cross streets there; that’s it.


Not sure what you’re getting at, but I think this makes it seem like California regulators are doing their job, since they haven’t yet let Tesla test without someone behind the wheel, while Texas is completely derelict in its duty, even allowing unsupervised operation despite these terrible stats.


Well, I mean, if you believe it can be done safely, it’s the one thing Tesla has going for it compared to Waymo, which is otherwise way ahead of them. Personally I don’t, but I can see the sunk cost.


Tesla robotaxis don’t go anywhere near highways currently.


What does this have to do with Newsom? Tesla isn’t allowed to operate this way in California; the accidents are from the Texas data.


The unsupervised cars are very unlikely to be involved in these crashes yet, because according to the Robotaxi tracker there was only a single one operational, and only for the final week of January.
As you suggest, there’s a difference in how much the monitor can really do about FSD misbehaving compared to someone in the driver’s seat, though. On the other hand, Tesla is still required to keep the monitor behind the wheel in California, so you wouldn’t expect a difference in accident rate from that there; it would be interesting to compare.


Entirely possible, but all incidents are counted, since it would be difficult to produce reliable stats if you left some out based on some kind of assessment of blame.
Because Tesla, unlike the competition, hides most of the details, we can’t really look at a specific incident and know.


You can say that, but it’s entirely different from bringing up deliberately pointing a device’s camera at the jury. And again, there was nothing about them looking at anything in particular, or anything else suggesting an intent to film. As I said, it’s also very easy to tell whether the camera is active.