54 points by otobrglez 2 months ago
So the quote at the end is Tesla's "you're holding it wrong" moment, eh? It reminds me of stubborn developers that blow off bad UI design with "they're not supposed to do that". Well, fact of the matter is, users do do that, so account for it. Fact of the matter is, Tesla, your advertising gives users the impression that they can practically take a nap with Autopilot on (because it has the hardware for fully-automated driving, amirite?), so don't be pulling out the fine print about proper use of Autopilot.
To a more general point, when driving a car I should not have to ask, "is it going to work this time?" I hit the brake pedal, I have a high degree of confidence that the car will scrub off speed. I hit the gas, I don't worry that it will go (except in our old VW). I set Autopilot, I expect that it won't plow into the back of the car in front of it. Whoops, scratch that last one. Instead, I'll constantly be wondering if it's going to do its job.
In risk management for the medical devices I’ve worked on, we account for users and patients not following instructions during the design process. “You’re holding it wrong” is literally something we account for. From my perspective Tesla has been baffling, especially considering the highly involved CEO is aware that people use his company’s product contrary to the instructions for use.
> Well, fact of the matter is, users do do that, so account for it.
Of course they do that. The functionality is completely useless if you still have to fully engage with driving the car. Cruise control is useful for resting the driver’s leg. This doesn’t require a reduction in attention. Partial automation’s benefit comes specifically from relieving the demand for attention. To turn around and say that you still have to be fully attentive and in full control is to say that the automation is entirely useless.
Here's a video from a few years ago of Elon Musk doing exactly what they're saying not to do: https://youtu.be/MO0vdNNzwxk?t=2m
Tesla keeps repeating that 'if you use it as designed, you won't have an issue'.
But they need to be honest with themselves - in the public's mind these are 'self-driving' vehicles, and there's no doubt that Tesla has benefited from the idea that Tesla cars have this technology, and from the corresponding image of their cars being a new generation of vehicle.
It isn't helped (or it is, depending on your perspective) by naming the system 'auto-pilot', which implies to the lay person an automatic system.
Really this isn't much more than lane-assist, adaptive cruise control and emergency braking, terms which do not imply you can delegate control to a computer.
The problem isn't that Tesla isn't being reasonable, the problem is that the public doesn't have the attention span to try to understand a more nuanced picture. People just don't want to hear that they're still driving the car, they want to deal with cars that drive themselves, whether to say it's a great thing or to say it's a scourge.
There's really only so far Tesla can go in its attempts to educate the public. In fact, they've been strenuously stating, over and over again, that these aren't self-driving cars. Autopilot as a name for the technology is in line with how it's used and expected to perform in commercial airplanes. You still have to know how to fly the plane and you still need to stay vigilant.
> There's really only so far Tesla can go in its attempts to educate the public.
Thankfully, they have a long way to go before they reach the limits of how far "only so far" is.
> In fact, they've been strenuously stating, over and over again, that these aren't self-driving cars.
The first link I came to when searching "tesla fully automated hardware" was this: https://www.tesla.com/blog/all-tesla-cars-being-produced-now... I read the page twice, and didn't see a single outright statement that the cars aren't self-driving. Oh, I could infer it, but Tesla never came right out and said it, let alone "strenuously".
EDIT: because my search string was a bit loaded, in that I knew it would find the link I'm looking for, let's be more fair with:
"tesla fully automated driving" - the link above is the first listing
"tesla autonomous driving" - the link above is the 2nd link.
In summary, should I care to know more about Teslas and their autonomous driving abilities, I will likely be led to a page that never mentions that these cars can't drive themselves, but strongly implies that they can.
> In fact, they've been strenuously stating, over and over again, that these aren't self-driving cars.
They only state that in response to "Tesla car involved in crash"-style reports. Otherwise, the messaging they send out is "our cars are basically self-driving." At best, Tesla is guilty of sending out severely mixed messaging, and at worst, they're lying to people and claiming that it's not lying because there is a footnote that's telling the truth (that is designed so that no one will see it).
>In fact, they've been strenuously stating, over and over again, that these aren't self-driving cars.
The Tesla fans keep repeating this, but it flies in the face of the description of Autopilot on Tesla's website. I can't tell if they are being deliberately ignorant, or what. But the fact that Tesla claims you can press a button and summon your car, at least in my mind, says "cars that drive themselves".
In any other assisted driving system, the system deactivates when it suspects the driver is not paying attention, rather than issuing impotent warnings.
They just really shouldn't have called this autopilot. It creates all the wrong expectations.
- Advanced cruise control
- Lane Following
Doesn't sound as sexy, though. :-)
>They just really shouldn't have called this autopilot. It creates all the wrong expectations.
Have you read the marketing on the website?
All you will need to do is get in and tell your car where to go. If you don’t say anything, the car will look at your calendar and take you there as the assumed destination or just home if nothing is on the calendar. Your Tesla will figure out the optimal route, navigate urban streets (even without lane markings), manage complex intersections with traffic lights, stop signs and roundabouts, and handle densely packed freeways with cars moving at high speed. When you arrive at your destination, simply step out at the entrance and your car will enter park seek mode, automatically search for a spot and park itself. A tap on your phone summons it back to you.
So it's not like this is a case of "it's called something that it isn't really", or mismanaged expectations. Tesla is explicitly stating the car drives itself. They still claim it's "Autopilot". Is what they are selling anything close to what is described there?
That’s their pitch for future Autopilot functionality, I believe. They (probably intentionally) make future claims in a way that they can be interpreted as current functionality.
The name is technically correct. Autopilot on planes is an assistive technology. The problem is there is a common misconception among the general public that it's beyond assistance and is partial automation.
Furthermore, even if you're totally aware it's only assistance, there is a tendency to be more easily distracted. That's what a Google study found, about 5 years ago.
I am a huge Tesla fanboy and stockholder. But be careful: if you are struggling to remain vigilant, turn off Autopilot.
>The problem is there is a common misconception among the general public that it's beyond assistance and is partial automation.
Again, where do you get that this is a "misconception among the general public"? Read the marketing. Watch the demos. Listen to the claims.
"Tap your phone and the car will return to you". Where does being vigilant behind the wheel fit with that?
I think the person you're responding to was saying there's a misconception of what autopilot means in general. People think autopilot is some form of full automation for planes, but it's not. Though I could be reading the person's comment wrong.
Yes, exactly. If you ask an average person what autopilot does on a plane (with no mention of Tesla), they say it's when a plane flies itself. That is wrong. It's an assistive technology.
Autopilot on planes does not deal with obstacles _at all_. It functions in a perfectly controlled environment while being guided by multiple monitoring systems controlled by humans (i.e. ATC towers).
The metaphor of autopilot for cars just does not work and was used in error.
… or copilot.
Liquid Lane Lock ?
Cogging Cruise Control ?
That surprised even me. I would have expected Tesla's vision system to recognize a car rear end and stop. That's the one thing it's supposed to do well.
Based on the most recent NTSB report, I wouldn't have been surprised (in fact, I almost expected it since the video is setting us up for something) if the car sped up. "Woo hoo! Slow poke got out of my way, increase to Ludicrous speed!"
This is the stationary vehicle in traffic lanes problem. See any of the discussions from earlier this week or last on why detecting stationary objects in traffic lanes is hard.
It does; the car in front changes lanes at the last minute, so the Tesla doesn't have enough room to physically stop.
Correct. The right move would be to change lanes—like the lead car—but apparently Tesla Autopilot isn't programmed to take evasive action?
It doesn't even try to stop.
It does slow down and stop, but not in time—it comes to a stop before hitting a second vehicle directly in front of the first.
The video says it doesn't have room to stop before hitting the first vehicle.
I'd assume when you are going 40 mph and hit the brakes as hard as you can that you'd hear squealing and leave skid marks? Anyone have a guess for how hard the brakes were applied?
> I'd assume when you are going 40 mph and hit the brakes as hard as you can that you'd hear squealing and leave skid marks?
Not with ABS, you won't. Yes, I've tested this exact scenario. YMMV, based on the manufacturer.
Looking at the Tesla's suspension, it looks like it gradually applied the brakes. The deceleration forces would be way stronger if the car had punched the brakes.
Btw, it's probably designed to respond to a confidence level curve and apply the brakes accordingly.
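If it really does respond to a confidence-level curve, a minimal sketch of such a policy might look like this (the thresholds and the `detection_confidence` input are entirely hypothetical, not Tesla's actual logic):

```python
def brake_command(detection_confidence: float) -> float:
    """Map a detection confidence in [0, 1] to a brake fraction in [0, 1].

    Hypothetical policy: below a low threshold, do nothing (to avoid
    phantom braking); between the thresholds, ramp braking up linearly;
    above a high threshold, brake fully.
    """
    low, high = 0.3, 0.8  # made-up thresholds, for illustration only
    if detection_confidence < low:
        return 0.0
    if detection_confidence >= high:
        return 1.0
    # Linear ramp between the two thresholds
    return (detection_confidence - low) / (high - low)
```

A policy shaped like this would explain what the video shows: a late, low-confidence detection produces gradual braking rather than a full panic stop.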
Not to mention that, if the dummy car is NOT made out of metal (or at least contains some), the ~77 GHz radar would have trouble detecting it.
This issue has been discussed ad nauseam on this forum. There is a good article that was posted here a few days back that explains the issue: https://arstechnica.com/cars/2018/06/why-emergency-braking-s...
I think this article is useful because, in addition to the BBC being a pretty big media outlet, this shows that basically anyone can create the hazardous conditions.
The Ars article was not as clear-cut.
What happens next? Nothing, until somebody important dies, at which point lawyers for the family will swoop in to feed on the company involved. Meanwhile, lawmakers will preen and posture and create eventual regulations and restrictions on the concept. The media will portray this as an exaggerated risk to everyone on the planet.
I am not passing judgement, merely making an observation/prediction.
The vast majority of vehicles will have reflective surfaces of particular colors and sizes in a particular range of locations in accordance with applicable law and a reflective plate with letters and numbers. Of the ones that don't most of them will have underside guards, hitch receivers, or one of several common commercial vehicle body types.
Detecting these patterns is easier than pedestrian and bike detection because the problem space is so much more bounded. It's hard to give them a pass for not doing it.
What difference does that make? There are many road obstacles besides cars. Last Saturday, in broad daylight (3PM), I had a deer run out in front of me on a freeway onramp, then stop in the middle of the road when she realized her fawn wasn't following her.
I, and the other drivers behind me, avoided hitting the deer because we're used to dealing with them. Should we give Tesla a pass if it hits one because detecting a stationary deer is harder than detecting a stationary car?
It seems like it would actually be "better" to indicate (audio and/or visual alert) to the driver actions that should be taken rather than taking them automatically.
1. Indicate to the driver that they are drifting out of their lane
2. Warn about possible slow or stationary object ahead
If the driver knows that the car won't take action on its own, but will help them by telling them about things they can't see (or when they are distracted), then we wouldn't be blaming Tesla but the driver.
It always infuriated me that Tesla called the same tech (Mobileye) "Autopilot", while Audi (and other OEMs) have used the exact same tech but didn't go so far as to call it even semi-autonomous driving. Audi & Mercedes systems do exactly what you mentioned: the tech is used to warn the driver (audibly/visually, and even with vibrations in the seat and steering wheel) of impending danger. It's a good case of Silicon Valley's "break stuff fast, iterate quickly" mindset, which may be acceptable for the latest photo-sharing app, but is an absolutely despicable business practice when human lives are at stake.
Disclosure: I currently own an Audi vehicle with this technology that has been available since 2009.
I don't actually understand how Tesla autopilot detects cars ahead of it--but does this experiment properly simulate the stationary car, e.g. is it made of metal (and is that the material that Tesla autopilot is designed to detect)?
At the end of the day it doesn't matter if this experiment properly simulated a stationary car - unless you could argue that the Tesla was prescient enough to understand it was going to run into a car dummy. Any kind of road obstacle, be it construction, debris, or a wall, isn't necessarily going to look like a car.
No, the TL;DR is that all semi-autonomous systems ignore stationary objects, because it is difficult to precisely determine whether the object is in the lane. Highways have a ton of stationary objects, such as overhead signs, debris, and construction equipment. Trying to treat all of these as threats would lead to frequent wrongful hard braking, which in turn leads to rear-end collisions. That's why they are ignored.
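The filtering being described can be sketched as a toy model (not any vendor's actual code): a radar measures closing speed toward a target, and a return whose implied ground speed is near zero is treated as stationary clutter and dropped.

```python
def is_stationary(ego_speed_mps: float, closing_speed_mps: float,
                  tolerance_mps: float = 1.0) -> bool:
    """A radar measures range rate (closing speed) toward a target.

    If the target is closing on us at roughly our own speed, its
    ground speed is ~0, i.e. it is stationary -- a sign, a bridge,
    debris, or a stopped car. The radar alone can't tell which.
    """
    ground_speed = ego_speed_mps - closing_speed_mps
    return abs(ground_speed) <= tolerance_mps

def radar_targets_to_track(ego_speed_mps, targets):
    """Toy filter: drop stationary returns, keep moving vehicles.

    `targets` is a list of (target_id, closing_speed_mps) tuples.
    This is exactly the trade-off described above: the stopped car
    gets filtered out along with the overhead signs.
    """
    return [tid for tid, closing in targets
            if not is_stationary(ego_speed_mps, closing)]
```

With the ego car at 20 m/s, an overhead sign closes at 20 m/s and is dropped, while a car ahead doing 15 m/s closes at 5 m/s and is kept; a fully stopped car in the lane is indistinguishable from the sign and gets dropped too.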
And yet, Subaru EyeSight can detect and react to stationary objects without any of the problems you describe. Do they have some magic technology unavailable to Tesla?
Then there is no reason to market such systems and their imaginary future capabilities to be completely driverless.
I thought Tesla could detect one car ahead by bouncing off a signal under the car right in front of you.
Only a slowing car, and that too would need to be metal.
This is a stationary non metal car. This shows that Tesla's vision system cannot even recognize a car, much less any random obstruction on the road.
So why didn't Tesla stop?
Because it can't tell the difference between an object in the lane and an object overhanging the lane (for example, exit signs on highways).
Because it didn't have the space, and it apparently is not programmed to steer round obstacles (a much more tricky job to tell where you can steer to as opposed to staying in between what look like lane markings).
But did it even engage the brakes? I wouldn't be shocked if it could not stop fully, but for it to not even attempt to stop is very troubling.
Yes, in the video it clearly and immediately brakes, and comes to a stop before hitting a second dummy car directly in front of the first.
So is the presumption that the human driver would do better?
the car in front did better...
The driver in front also knew about the stationary ‘car’ well ahead of time and was instructed to avoid it by merging into another lane. An actual human driver in a real scenario may perform better than the Tesla, as well as the Tesla or worse than the Tesla — it just depends.
Tesla's algorithm is "try to slow down, but nevertheless plow into stuff if that doesn't work". It is demonstrably worse than the human algorithm of "slow down, but also steer to avoid obstacles". I can't see how you can objectively say they're equally good.
Do we want a car deciding whether to leave the lane? Leave the road? It could do more harm than good. We'll have to decide this, soon. By law hopefully - some standards for auto-drive and what it's allowed to do.
Because if we don't, software will have to code up the 'trolley problem' solution, starting now. Hit the school bus ahead? Or the guy walking beside the road? Or the brick wall (and kill its passengers)? There's no good (universally defensible) solution.
I'd argue that the car in front is at fault for changing lanes at the last possible second. This may or may not leave the drivers behind enough time to follow suit. Clearly this is a manufactured situation, but in order to draw conclusions I'd like to see tests where an unsuspecting human driver follows the leading car.
I re-watched the video and it really doesn't look like the Tesla is braking. The video says "unable to brake in time". This to me sounds misleading. It is not clear whether the Tesla tried to brake but the distance was too short, or whether it didn't attempt to brake at all. Which leads me to conclude there's an agenda here.
It can't be deduced from the footage alone whether the Model S is braking or not, but it's clear that it decelerates and stops after it impacts the dummy car.
In the U.S. at least, it's highly unlikely that the driver in front would be found legally at fault in a similar accident. In most cases of rear-end collisions, the driver who impacts the rear of the car will be found at fault.
The car in front also had unobstructed vision. If you tailgate a car that blocks your view ahead, a human driver is just as susceptible to being surprised like this.
That's up to the pilot, isn't it? Is Auto his real name?
Why wouldn't the car be programmed to follow the car in front and switch lanes? Any disadvantage in this?
Surely if the car in front has switched lanes, it's a good idea to switch lanes too, or at least to register that the car in front has changed lanes and look for the reason why. In this case, that would mean treating stationary objects ahead as more likely threats and being more aggressive in avoiding them. Before, the stationary object was probably not an issue and stopping would have been the bigger problem; but if the car in front changes lanes, the chances of the stationary object being an issue are now rather high.
If the car in front changes lanes and an unknown stationary object appears: slow down and follow the car in front into the other lane, or slow down and give some kind of alert about a possible obstruction so the driver can take the required action - at least something.
I understand the stationary object issue, but if the car is not applying at least some logic when it sees another car change lanes and then also sees a stationary object that it would otherwise ignore, there is a big problem. There are two red flags and the car is choosing to ignore both of them.
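Combining the two red flags is simple to express in principle. This is a hypothetical escalation rule, not how any shipping system works:

```python
def threat_level(lead_car_changed_lanes: bool,
                 stationary_object_ahead: bool) -> str:
    """Combine the two cues from the comments above.

    A stationary return is normally treated as low-priority clutter,
    but if the lead car just vacated the lane, the odds that the
    stationary object is a real obstacle go way up -- so escalate
    instead of ignoring it.
    """
    if lead_car_changed_lanes and stationary_object_ahead:
        return "brake_and_alert"   # both red flags present: act
    if stationary_object_ahead:
        return "monitor"           # could just be an overhead sign
    return "normal"
```

The hard part, of course, is not this rule but reliably producing its two boolean inputs from noisy sensor data without triggering phantom braking.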