…but perhaps, OLEICAT
Do you want that?
I like the cut of your jib, troll, keep it up. Edit: nevermind, you’re a cunt.
Where
Get that shit out of here nasty fuck
I mean… it absolutely will get it wrong.
Ai “always incorrect”
Having a hole in your roof is a classic construction gotcha.
What are you talking about? They will measure it wrong 3 times, cut it wrong while saying it’s correct.
Are we talking about the AI or the contractor?
Yes
Both. Technically all three.
- The general purpose AI unsuited to the task.
- The untrained contractor directing the AI.
- The soulless profit-bot AI that manages the contractor.
Technically, it’ll say it measured 3 times, when really it just found a number that someone on Reddit said they measured 3 times 10 years ago, and used that.
That would be ideal.
That’s literally what they do lol
Damn it, I didn’t know AI was that good already
Then it’ll try to fill the gap with glue, then conclude it needs to wipe everything and start over, removing the earth in the process
just ignore the massive pile of fixodent in my garage that used to be a car
I’d like to see AI miss work because its grandma died… again
Hell, ai does that already
You’re right to push back! I removed your lung, not your appendix.
It doesn’t even have to measure it first!
You’re absolutely right…
It does my head in how we ignore the shit answers language models give (I refuse to call them intelligent).
I swear, unless it’s baby shit, like “where is the syntax error in this script”, it almost always gets it wrong.
Making fresh scripts, even with pedantic-level prompting and detail, just ends up with a script with multiple errors.
I’ve realised I would have written it just as quickly (after all the iteration work) if I had just done it myself.
The only thing I find LLMs good for is being glorified search engines. And even then it’s horrifying how inefficient ChatGPT is for search compared to, say, Google.
Not to mention it’s run by a sociopath.
The most tragic and obvious version of AI already doing that is the school in Iran that was bombed because, at best, the AI messed up.
At least Ai will say sorry you are right, I pulled the measurements out of my ass
I assure you it can give you three different measurements from the same picture then go on to explain why it’s okay to eat the plywood.
Phenomenal
Have you seen AI with numbers? It doesn’t calculate, it just tries to give a statistically likely answer (just like it does for every other next word in its answer).
Some of them have to fall back to deterministic software tools (and even then they’re sometimes called with incorrect parameters, because “intent” lol)
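That fallback (don’t let the statistical model guess at arithmetic, hand it to deterministic code) can be sketched in a few lines. This is a toy illustration, not any real framework’s API; the `answer` function and its regex gate are made up for the example:

```python
# Toy sketch: route plain arithmetic in a prompt to a deterministic
# evaluator instead of letting a language model "predict" the digits.
import ast
import operator
import re

# Whitelisted binary operators for the safe evaluator.
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def safe_eval(expr: str):
    """Deterministically evaluate a plain arithmetic expression."""
    def walk(node):
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and isinstance(node.op, ast.USub):
            return -walk(node.operand)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        raise ValueError("not plain arithmetic")
    return walk(ast.parse(expr, mode="eval").body)

def answer(prompt: str) -> str:
    # If the prompt is pure arithmetic, use the tool; otherwise fall
    # back to the (unreliable) statistical model.
    if re.fullmatch(r"[\d\s.+\-*/()]+", prompt.strip()):
        return str(safe_eval(prompt.strip()))
    return "model guess goes here"  # placeholder for the LLM path
```

The "incorrect parameters" failure mode lives in the dispatch step: if the model decides *when* and *how* to call the tool, a bad intent guess still produces a confident wrong answer.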
AI already emulates dumbasses, it just has no legs. Don’t worry though, soon they’ll give it legs and guns. Nothing can go wrong, guys. Trust me bro, the future is now
They already gave it missiles and we see where that got us…
Good luck trying to emulate a dumbass, dumbass.
I feel like that’s what AI specializes in
I can see AI measuring something 3 times and cutting it wrong, more than I can see AI measuring things once and cutting it right.
AI does it five times, in my experience.
correction: it claimed to do it 5 times. (it did 0 times)
3 times, sir.
I’d like to see AI change blinker fluid or bring the foreman a bucket of steam
Wouldn’t even know where to find a left-handed screwdriver
My favorite is having the new guy hold a bucket behind someone else using a grinder, to collect sparks for the spark plugs.
So nonsensical and yet so many people fall for it.
Or the board stretcher.
The Leftorium at the Springfield mall, obviously.
this is like my new favorite wikipedia article, the examples are so funny https://en.wikipedia.org/wiki/Fool's_errand
Doing this sort of work has convinced me that the parallel postulate is a lie