

Yeah, plenty of opportunities to just work it into the story.
I dunno what kind of local models you can use, though. If it's a 3D game then it's fine to require a GPU, but you wouldn't want to raise the minimum requirements too high. And you wouldn't want to burn 12 gigs of VRAM on a gimmick, either.
I swear I'm gonna plug an LLM into a rather traditional solver I'm writing. I may tuck a note deep into the paper about how it's quite slow to use an LLM to mutate solutions in a genetic algorithm or a swarm solver. And in any case the non-LLM path would be the default.
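Roughly this shape, as a sketch (every name here is made up for illustration; `call_llm` is a hypothetical string-in, string-out hook, not any real API, and the default path never touches it):

```python
import random

def default_mutate(solution, rate=0.1):
    # Plain old point mutation: perturb each gene with some probability.
    return [g + random.gauss(0, 1) if random.random() < rate else g
            for g in solution]

def llm_mutate(solution, call_llm):
    # The gimmick: ask a model to propose a tweaked candidate.
    # call_llm is a hypothetical hook (str -> str); parsing its output
    # and falling back on garbage is where the pain lives.
    prompt = f"Mutate this candidate slightly: {solution}"
    try:
        return [float(x) for x in call_llm(prompt).split(",")]
    except (ValueError, AttributeError):
        return default_mutate(solution)  # fall back to the sane default

def evolve(population, fitness, generations=100, mutate=default_mutate):
    # Bog-standard GA loop, minimizing fitness; mutate is pluggable,
    # non-LLM by default.
    for _ in range(generations):
        population.sort(key=fitness)
        elite = population[: len(population) // 2]
        population = elite + [mutate(random.choice(elite)) for _ in elite]
    return min(population, key=fitness)
```

Something like `evolve([[random.random() for _ in range(4)] for _ in range(20)], fitness=sum)` runs out of the box; swapping in `mutate=lambda s: llm_mutate(s, call_llm)` trades a nanosecond Gaussian draw for a full model call per mutation, which is the whole point of the buried footnote.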
Normally I wouldn't sink that low, but I've got mouths to feed, and frankly, fuck it, they can persist in this madness for much longer than I can stay solvent.
This is as if there were a mass delusion that a pseudorandom number generator can serve as an oracle, predicting the future. Doing any kind of Monte Carlo simulation of something like weather in that world would, of course, confirm all the dumb shit.
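To stretch the analogy, the "confirmation" would look something like this sketch (names are all invented for the bit): the oracle is literally just the PRNG, and the Monte Carlo run dutifully averages the oracle's own noise back at you.

```python
import random

def prng_oracle():
    # The "oracle": a plain pseudorandom draw dressed up as a forecast.
    return random.gauss(20.0, 5.0)  # tomorrow's temperature, allegedly

def monte_carlo_forecast(n_runs=10_000):
    # Monte Carlo "validation": sample the oracle many times and report
    # the mean. All this ever confirms is the distribution we put in.
    samples = [prng_oracle() for _ in range(n_runs)]
    return sum(samples) / n_runs

print(f"Consensus forecast: {monte_carlo_forecast():.1f} degrees")
```

The simulation can only echo its own inputs, which in that world reads as the prophecy coming true.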