This is still the case. It does an incredible imitation of knowing things and is able to break down and reformulate lexical constructs such that it looks like it understands them, but it’s a case of garbage in, garbage out. For example, here it is explaining bubble sort in the speaking style of Snoop Dogg, only this time it unfortunately gets the time complexity wrong.
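(For reference, here's a minimal sketch of plain bubble sort, just to show where the real bound comes from: the nested passes do n - 1 + n - 2 + ... + 1 comparisons in the worst case, which is O(n²), nothing factorial about it.)

```python
def bubble_sort(items):
    """Plain bubble sort. Worst case is (n-1) + (n-2) + ... + 1
    comparisons = n(n-1)/2, i.e. O(n^2) -- not n!."""
    n = len(items)
    for i in range(n - 1):
        # After each pass the largest remaining element has
        # bubbled into place, so the inner loop shrinks by one.
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```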
It will probably get to the point where asking it to explain things has a better success rate than Googling and reading stuff, which is also prone to error. But it’s not a replacement for real expertise.
As someone who is not particularly tech savvy, I thought this was a pretty interesting demo of both some of the abilities and the limitations of AI… It clearly understood things at a level beyond pure Googling, but its ability to create a workable end product was hurt by the lack of human experience and senses.
What if Slim Jesus never put out the music video with the “don’t take this seriously” warning and instead just did something with girls in it on a golf course with old white people, or even something like Dat Stick (which was super lame)? It could have been a whole different story.
TIL the inspiration for Barbie was a sexy novelty doll aimed at men. The idea of a doll with a female body for little girls was completely revolutionary and scandalous in the 50s.
This is from The Toys that Built America. Really interesting show on the History Channel. The overacting in the dramatizations is really cheesy though.
I read the article quickly and don’t know the details, but this is potentially a huge treatment modality for an increasingly problematic infection. People are already doing fecal transplants, but those rely on finding donors and are basically done on a case-by-case basis, afaik. Of course, an approved product is probably going to cost a shit-ton of money compared to a voluntary transplant (pun intended).
I don’t know what it is about Barbie but my daughter (almost 5 years old) absolutely loves Barbie stuff, especially if it’s mermaid adjacent. They’ve got a bunch of harmless Netflix shows, too, plus the movie coming out next year for the adult Barbie lover in your life.
What is it with sovereign citizens that makes them want to represent themselves? Like, I get that they’re saying the court is illegitimate or whatever, but why does that mean you shouldn’t get an expert to advise you??? Seems like there’s a huge market opportunity here for a lawyer to become the Johnnie Cochran of yellow-fringed flags or whatever.
This is the least mind-blowing thing I’ve seen. Regular expressions are extremely predictable; they’re bound by hard rules. This is like riding a bike with training wheels for an AI, and I would expect even relatively simple models to nail this.
There’s no nuance to pick up on, no ambiguity, etc. Some humans may misinterpret a pattern or construct a bad one, but an AI should absolutely be able to take a given pattern and explain what it does.
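(Just as a made-up illustration, the pattern here is mine, not one from the demo. This is the kind of mechanical explain-what-it-matches task I mean:)

```python
import re

# A hypothetical pattern of the sort you'd ask an AI to explain:
# one or more digits, a dash, then exactly four digits.
pattern = re.compile(r"^\d+-\d{4}$")

print(bool(pattern.match("123-4567")))  # True
print(bool(pattern.match("123-456")))   # False: only 3 trailing digits
print(bool(pattern.match("abc-4567")))  # False: letters, not digits
```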
I’m not entirely convinced. The problem is that the model always tries to BS its way through everything, because that’s what it was trained to do. So when it gets confused, it just goes with its best guess and sounds stupid. But it definitely seems like these models are reasoning about concepts to produce their responses. Check out this one:
Not high-level stuff, but it’s also not reciting canned discussions of paperclip-maximizing AIs that it saw on the internet. And there’s very clear reasoning going on.
EDIT: In some ways, factorial bubble sort is an impressive mistake to make, since it’s not a mistake that humans make. It either decided or remembered that the worst case time involved n steps, then n - 1 steps, then n - 2 steps, etc., and then it said, “Oh, I know that pattern, it’s a factorial!” Sure, it’s just iterated, error-ridden pattern matching, but that’s actually exactly the same as human reasoning.
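(To make the mix-up concrete, with n = 10 just as my example: the descending sequence n, n - 1, n - 2, … gives the real quadratic bound if you add the terms, and a factorial only if you multiply them, which is apparently what the model pattern-matched to.)

```python
import math

n = 10  # arbitrary example size

total_sum = sum(range(1, n + 1))   # n + (n-1) + ... + 1 = n(n+1)/2, the O(n^2) shape
total_product = math.factorial(n)  # n * (n-1) * ... * 1 = n!, what the model claimed

print(total_sum)      # 55
print(total_product)  # 3628800
```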
I don’t think they let you be or remain a lawyer if you won’t acknowledge the legitimacy of law. You’re supposed to at least pretend you believe in law.