This is the key. I’ve said it before, but the key to understanding Elon Musk from a business perspective is iterative development and the minimum viable product. (Also vertical integration, but set that aside.) The process is basically:
1. Identify something that would be really great (self-driving electric cars, colonizing Mars, brain-computer interfaces, insane networks of cheap underground tunnels).
2. Figure out a commercially viable product that moves a little bit towards the goal.
3. Attract engineering talent by talking hyperbolically about the goal.
4. Engineer like crazy to implement the MVP.
5. Repeat steps 2 through 4, using knowledge gained from the engineering to think of a product that’s even closer to the goal and hyping the results to get more engineering talent.
FSD is the very best example of this, because machine learning is built on data collection, so the MVP is actually a direct input to future iterations.
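Just to illustrate that loop, here’s a cartoon version (nothing here is Tesla’s actual pipeline; every function and number is made up):

```python
import random

# Cartoon of the data flywheel behind something like FSD: the shipped MVP
# collects data, the data trains a better model, the better model ships as
# the next MVP. Every function and number here is a placeholder.

def drive_fleet(model_quality: float, miles: int) -> list[float]:
    """Pretend fleet driving: return the 'hard cases' the current model struggled on."""
    failure_rate = max(0.01, 1.0 - model_quality)
    return [random.random() for _ in range(int(miles * failure_rate))]

def retrain(model_quality: float, hard_cases: list[float]) -> float:
    """Pretend training: more hard cases nudge quality up, with a hard cap."""
    return min(1.0, model_quality + 0.00005 * len(hard_cases))

quality = 0.5
for generation in range(5):
    hard_cases = drive_fleet(quality, miles=10_000)
    quality = retrain(quality, hard_cases)
    print(f"gen {generation}: quality={quality:.3f}, new training examples={len(hard_cases)}")
```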
And yeah, I hate that it looks like I’m stanning for Elon, since he’s a freak and a very weird dude. Full disclosure: I like launching rockets and space exploration, and I think Teslas are pretty neat, though I don’t own one. I also hate that he’s a COVID-denying scumbert. Life is hard.
If they’d just called it “driver assist” like any other car company, it would have been great. Instead Elon wildly overhyped what “Full Self-Driving” could do, and his stans wound up driving head-on into trucks.
Pilots are highly trained professionals working in pairs and getting paid to pay attention for hours. That’s not what consumers want, are capable of, or are going to do.
Also, pilots don’t have pedestrians jumping out in front of them with a fraction of a second to react.* They get a loud beep and at least a few seconds to think about what to do.
*I’m assuming. I have no idea what actually flying a commercial plane is like. But I can’t imagine there are many decisions like “OH SHIT HIT THE BRAKES NOW” when your Tesla starts speeding up towards a white box truck that blends in with the background or w/e.
It just seems really hard to maintain that level of alertness for something that may never happen in 1000s of hours of driving.
I guess the level of alertness will be up to the driver, beyond the alertness checks the vehicle requires every 15 seconds or so (you need to apply force to the steering wheel), combined with the in-car camera check that is, or is going to be, required to verify there’s a driver with eyes open, etc.
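For what it’s worth, the monitoring logic I’m imagining is something like this toy sketch (the thresholds, sensor reads, and escalation are all invented for illustration, not how Tesla actually does it):

```python
import time

# Toy sketch of a driver-attention monitor. All thresholds and sensor
# functions are invented for illustration; this is not Tesla's implementation.

NAG_INTERVAL_S = 15.0        # driver must apply wheel force at least this often
TORQUE_THRESHOLD_NM = 0.5    # made-up minimum steering torque that counts as "hands on"

def read_wheel_torque() -> float:
    """Hypothetical sensor read: driver-applied steering torque in newton-meters."""
    return 0.0  # stub

def eyes_on_road() -> bool:
    """Hypothetical cabin-camera check: is the driver present with eyes forward?"""
    return True  # stub

def monitor_driver() -> None:
    last_hands_on = time.monotonic()
    while True:
        now = time.monotonic()
        if read_wheel_torque() >= TORQUE_THRESHOLD_NM:
            last_hands_on = now
        if (now - last_hands_on) > NAG_INTERVAL_S or not eyes_on_road():
            print("ALERT: apply steering force / eyes on road")  # escalate, eventually disengage
        time.sleep(0.5)
```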
To me, the question comes down to:
If you are statistically far less likely to be injured or killed in an accident when Full Self Driving is engaged, even accounting for the fact that some people will operate the vehicle inattentively in this mode and expose themselves to bad outcomes, does that make it a good feature for the driving public to have?
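Just to show the shape of that question, here’s a back-of-the-envelope version with completely made-up numbers (these are not real crash statistics):

```python
# Injuries per million miles -- all numbers invented purely for illustration.
human_rate = 1.0               # hypothetical all-human baseline
fsd_attentive_rate = 0.3       # hypothetical rate for attentive FSD users
fsd_inattentive_rate = 2.0     # hypothetical rate for people who zone out
inattentive_share = 0.2        # hypothetical fraction of FSD users who zone out

# Population-level rate with FSD: weighted mix of attentive and inattentive users.
fsd_population_rate = (
    (1 - inattentive_share) * fsd_attentive_rate
    + inattentive_share * fsd_inattentive_rate
)
print(fsd_population_rate)     # 0.64 -> still below the 1.0 baseline in this toy example
```

If the blended number stays below the human baseline, the feature is a net win statistically, even though the inattentive subgroup is worse off.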
I’m just specifically taking issue with any system that demands someone be alert for something that might only happen once a year or whatever. That almost sounds like a kind of torture to me.
Maybe before we have true self-driving cars we need a world where every car, pedestrian and potential obstacle announces itself over some short-range line-of-sight frequency. So the car always sees the same thing the human does. That part doesn’t seem too technically hard. Obviously the Big Brother ramifications are bad.
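Something in the spirit of the V2X beacons people already propose. A toy sketch of what one “here I am” broadcast could look like (the schema is invented for illustration; real standards like DSRC/C-V2X basic safety messages are more involved):

```python
import json
import time

# Toy "here I am" beacon a car, phone, or roadside object might broadcast
# over short range. The message schema is invented for illustration only.
def make_beacon(entity_type: str, lat: float, lon: float,
                heading_deg: float, speed_mps: float) -> bytes:
    msg = {
        "type": entity_type,       # "car", "pedestrian", "cyclist", ...
        "lat": lat,
        "lon": lon,
        "heading_deg": heading_deg,
        "speed_mps": speed_mps,
        "timestamp": time.time(),
    }
    return json.dumps(msg).encode()

# e.g. a pedestrian's phone announcing itself a few times per second:
packet = make_beacon("pedestrian", 37.7749, -122.4194, heading_deg=90.0, speed_mps=1.4)
```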
It’s gonna be weird if we make it to a generation that doesn’t know how to drive at all.
I don’t believe there have been zero accidents, but it’s a surprisingly low number for the number of miles it has driven. Still, I think Tesla successfully showed that those drivers were using FSD incorrectly: that one guy was sleeping for some reason, and another was intentionally ignoring when he was supposed to take over or something.
Overall, FSD probably isn’t meant to take over 100% of driving, and people are mad because that’s what was promised. I would like to hear transportation experts weigh in on: 1) requiring FSD vehicles to have lights indicating they are in FSD mode, much like cars have brake lights, to reduce the anxiety of drivers and pedestrians around them; and 2) for now, restricting FSD usage to certain well-marked roads and lanes, like existing carpool and bus-only lanes. That would limit usage to where FSD does best right now and ultimately help adoption.
Yeah, I think those are all interesting points. I think we can all pretty much accept that, if there isn’t already, there will eventually be autonomous driving software that’s overall safer than letting us random yahoos drive around at 85 mph.
The regulatory stuff will be very interesting; it’s an unprecedented change to an activity that is so ubiquitous. And Elon is uniquely poorly positioned to be the person championing regulatory adoption, as he’ll just alienate everyone, like he’s doing with the FAA.
People have such a reflexive distaste for him that it’ll be hard to keep that from tainting the discourse around regulating autonomous vehicles. So we’ll get “zomg Tesla crashed into a houseboat” news articles because of shitty Elon, and it’ll poison the well for what might be incredible technology that could save truckloads of lives.
Some Wall Street analysts were frustrated by Musk’s comments on the call.
“There was no reason that he needed to double down and shout ‘supply chain’ into a crowded theater,” said Dan Ives, tech analyst at Wedbush Securities. “He gave the bears meat on the bones. That’s why the stock sold off. I’m convinced if Musk was not on the call, the stock probably would have been up on Thursday.”