Honestly that’s how I feel. AI is very flawed, no doubt, but it’s less flawed than most humans. I’ve got people at work who hallucinate more than the first ChatGPT model lol
I really hate the term “hallucinate” because it’s a complete misrepresentation of what is actually happening. A hallucination is a delusion that reality is different from what is objectively true, i.e. the person you are seeing and speaking to is not actually there.
When an AI “hallucinates,” it’s not because of some broken circuitry; it’s simply that its programming has locked onto an untrue piece of information in its training data. If the data set had been limited to objective facts rather than the whole internet spilled all over it, hallucinations wouldn’t be a problem.
They use the term “hallucinate” because it distances them from the responsibility of actually curating the data set, which of course they won’t do, because that would take a lot of time and then they wouldn’t be competitive with all the other tech bros releasing a new “groundbreaking” AI every 3 months. It’s an entirely self-generated problem that they’re going to hand-wave away and never fix.