Yup. Look up the calculus and linear algebra that neural networks use to train. It’s an insane amount of calculations. So many calculations that it requires hundreds of processing units to crunch at a reasonable speed. All that to get simple math questions wrong.
the other wonderful irony?
(basically) the only thing a computer can do is math.
so it’s doing a SHITLOAD of math, to do a terrible job at some very basic math.
bravo!
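To put a rough number on "a SHITLOAD of math": even one small dense layer is thousands of multiply-adds per token, and real models stack thousands of much larger layers. A toy sketch (illustrative sizes, not any specific model):

```python
# One dense layer: multiply an input vector by a weight matrix,
# counting the multiply-adds along the way. Pure Python, just to
# make the operation count visible.

def dense_layer(x, W):
    """Multiply input vector x (length n) by weight matrix W
    (m rows of length n), returning the output and the op count."""
    ops = 0
    out = []
    for row in W:
        acc = 0.0
        for xi, wi in zip(x, row):
            acc += xi * wi
            ops += 2  # one multiply + one add
        out.append(acc)
    return out, ops

x = [1.0] * 512            # toy 512-dimensional input
W = [[0.01] * 512] * 512   # toy 512x512 weight matrix
out, ops = dense_layer(x, W)
print(ops)  # 524288 multiply-adds for ONE tiny layer, ONE token
```

And that's a miniature layer; a billion-parameter model repeats this at vastly larger scale for every single token it emits.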
All that to hallucinate every response in ways that make it seem like it knows what it’s talking about.
Which it doesn’t, and LLMs never will - unless the developers hard-code some responses, which defeats the entire point.
They should just use lookup tables
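For exact arithmetic, a lookup table (or just computing the answer directly) is trivially reliable in a way a probabilistic text generator isn't. A toy sketch of the contrast:

```python
# Toy contrast: deterministic answers vs. statistical guessing.
# A dict maps questions straight to answers - no gradients, no GPUs.
answers = {
    "2 + 2": 4,
    "7 * 8": 56,
    "144 / 12": 12,
}

def lookup(question):
    # Returns the exact stored answer, or None if the table
    # doesn't cover it - it never bluffs a wrong answer.
    return answers.get(question)

print(lookup("7 * 8"))   # 56
print(lookup("9 + 10"))  # None
```

The honest failure mode (None) is the point: it either knows or says nothing, which is exactly what an LLM can't guarantee.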
I feel called out by this.