I can tell a piece of software to do the maths for me. Sometimes the results even appear to agree with reality.
People complain about LLMs hallucinating, but they have no idea how many assumptions and plain "everybody does it this way, I guess it works" shortcuts there are in scientific research.

Scientific calculations, and other approaches as well, put out garbage all the time; that is the main point of what I said above.
Some of these limitations are known, just as it is known that LLMs have the limitation of hallucinating.
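As a small illustration of the point (my own sketch, not something from the original post): even the most basic numerical machinery that scientific code relies on quietly violates ordinary algebra, and most code simply assumes it doesn't matter.

```python
# Floating-point arithmetic breaks rules we take for granted.
# Both examples use plain Python float (IEEE 754 double precision).

# 1. Decimal fractions are not represented exactly.
a = 0.1 + 0.2
print(a == 0.3)        # False: a is actually 0.30000000000000004

# 2. Adding a small number to a large one can be silently lost
#    (the spacing between representable doubles near 1e16 exceeds 1).
x = 1e16
print((x + 1) - x)     # 0.0, not 1.0
```

Every simulation that sums many small contributions into a large accumulator is making an assumption about this behavior, usually an unexamined one.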