Douchebag 2.0—an Elon Musk company

When I was in DC, the Metro stops near the Pentagon were plastered with ads for a particular engine choice for the F-35.

1 Like

maybe starlink subscriptions

My pony brought a link.

https://twitter.com/JoshMankiewicz/status/1591812931830026242?s=20&t=12oxeJXgD0TxSJ4oE3flhQ

11 Likes

This reminds me. I hope the Elon Musk movie is devastating.

https://twitter.com/JUNlPER/status/1591966011053465601

https://twitter.com/poafies/status/1591979669594963968

10 Likes

https://twitter.com/AltoBased/status/1591970169781452800

tech bros who make too much money and want to flex status while appearing environmentally conscious

people really like the self-driving feature but I'd never ever ever trust my life to that thing, and the fact that so many people do is kind of insane

they’re hopelessly overengineered in my opinion, I’ll take dodge’s current and future electric lineup any day

like, does anyone remember the Porsche Cayenne circa 2010-2012? same fuckin people buy the Tesla mini SUV, whatever it's called. One moment every other car in an affluent city is one of those, then they get traded in and no one buys them because they're too expensive to maintain and overpriced to begin with. That is the future of Tesla vehicles. Plus I imagine replacing the battery on that thing costs an arm and a leg.

Always guaranteed to be the douchiest assholes on the road.

1 Like

I’m closer to thinking that Teslas are basically military-grade weapons deployed in a civilian environment. They’re semi-autonomous, potentially lethal technology. I’d trust a used 1999 sedan with a gasoline engine over anything built in the last 10 years, many of which also have “driver override” AI-based safety features that, well, also kill people, especially when they fail to trigger as expected.

Perhaps the crash rates are not that different from those of human drivers, but why should an AI with a non-zero crash rate be allowed on the road at all? I mean, am I allowed to deliberately build a robot, knowing that it will “accidentally” kill 6 people, and unleash it on US roadways? Who is liable in the event that an AI-driven consumer machine kills someone?

(Suggest changing thread title to Elon Musk: How Many People Will He Kill with Algorithms?)

1 Like

Get this instead:

We won’t really know that until everything is self-driving…at which point the answer will be: orders of magnitude less. Until every single car on the road has the capability, no one will ever trust it…which means it will take a lot longer to happen than it probably should.

What we have now is AI trying to predict human behavior. Shit, humans can’t predict human behavior.

Also, there is definitely a philosophical aspect to this that AI developers (not just self driving) have been grappling with for years, and it’s basically the trolley problem IRL. In terms of self-driving cars, whose life should it prioritize? The driver’s? Every occupant of the car? Other drivers or pedestrians? At some point, the AI will need to make a decision on whether to crash into a wall, killing the car’s occupants, or take a turn too fast and hit the pedestrians on the corner. I suppose giving the car owner the choice to set the AI mode is one way to handle this.
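To make the "let the owner set the AI mode" idea concrete, here's a toy sketch (purely hypothetical, no real autopilot exposes anything like this, and real systems don't reduce to a single weighted comparison) of how an owner-selected priority setting could tilt a crash-avoidance choice:

```python
# Toy illustration of an owner-selectable "priority mode" in a
# trolley-style decision. Everything here (function names, weights,
# harm model) is made up for illustration only.

def choose_action(priority, occupants, pedestrians):
    """Pick the option with the lower weighted expected harm.

    priority: "occupants", "pedestrians", or "equal" (owner's setting)
    occupants: people in the car at risk if it hits the wall
    pedestrians: people on the corner at risk if it swerves
    """
    # Weight applied to harm to the car's own occupants.
    w_occ = {"occupants": 2.0, "pedestrians": 0.5, "equal": 1.0}[priority]
    harm_wall = occupants * w_occ   # crash into the wall
    harm_swerve = pedestrians       # take the turn too fast, hit pedestrians
    return "hit_wall" if harm_wall < harm_swerve else "swerve"

# With "equal" weighting, the car simply protects the larger group;
# an owner who sets "occupants" shifts that threshold in their favor.
print(choose_action("equal", 2, 3))      # hit_wall (2 occupants < 3 pedestrians)
print(choose_action("occupants", 2, 3))  # swerve (occupants weighted 2x)
```

The point of the sketch is only that the owner's setting changes who the same physical situation puts at risk, which is exactly where the liability question below gets sticky.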

What’s funny is a human driver could be faced with the exact same decision, and we make no complaints about trusting them. Right now the legal system will place the fault on the driver in the situation I described above. Why is it so different all of a sudden if the car owner tells the AI to make the decision a certain way?

The FSD on my car definitely still has limitations. There are some weird intersections it has issues with, and it still doesn’t handle twisty 2-lane roads well (it turns like, a quarter of a second too late going into curves…just enough to make me cringe and want to retake the wheel every time). But for traffic jams and long trips, it is a lifesaver.

One thing the media around this is good at is breathy, hyperbolic overreaction because it was a Tesla (oooooh hand wavy ooooh), so you end up hearing about one incident rather than the 20 other car fires that happened the same day to ICE cars.

2 Likes

What we have now is AI that reliably cannot read “road closed” and “do not enter” signs

2 Likes

Skynet coming any day now though. Just need to sort out those tricky road signs. Then it’s all over.

3 Likes

This is fallacious reasoning (not surprising from you). I’m obviously arguing against that.

You literally just said we don’t know this information.

No. I’m saying that, given that the AI is making decisions about who lives and who dies, it should be illegal for civilian applications.

You can just go watch The Aviator rather than waiting.

3 Likes

The movie script almost writes itself

1 Like

The wife won’t ride in her sister’s Tesla (which she puts in auto frequently) unless it clearly states it will prioritize the car’s occupants over a nun holding baby triplets.

1 Like

Dodge doesn’t sell any EVs.