- cross-posted to:
- technews@radiation.party
cross-posted from: https://derp.foo/post/317313
There is a discussion on Hacker News, but feel free to comment here as well.
This specific thing is not AI, but that’s not actually relevant, because it’s still an example of the issue at hand: it’s now cheaper to throw some shitty CG in the background than to pay people to be there, and executives don’t see a problem with that. While this particular example of four or five models may not seem like much (especially with stock-ass animations like that), it won’t be long before you’re seeing scenes where fifteen or twenty background extras are replaced by AI-driven CG that behaves like someone who played a similar role five years ago, whose motions were cataloged and reused.
THAT is the crux of the issue. The studios basically want to scan and own everyone that ever appears onscreen. It’s fucking gross, and it needs to die on the vine.
CGI crowds have been a thing for literally decades. I think the last time a scene was filled entirely with real extras was the ’90s.
But this is so… janky. Usually, they put actors in the front, a couple of “layers” of extras, and then CGI where it’s harder to notice.
This is so obvious it almost looks like one of those intentionally janky CGI shorts or music videos that are meant to be humorous.
I knew some dink was going to bring up Massive style crowd simulations which is why I VERY FUCKING SPECIFICALLY quoted a small crowd size where individual actions actually matter, which is a far different thing.
Older style crowd simulations don’t really use AI as we define it now. They use preset animations that can be cycled through for various circumstances. A few dozen walk cycles, maybe thirty or forty “CHOP HOBBIT IN HALF” animations, throw in some jumping or arm waving and you’ve got yourself a crowd simulation.
That is not what we are talking about.
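For anyone curious what the older preset-animation approach described above actually amounts to, here is a minimal toy sketch. All clip names, counts, and states here are invented for illustration; real crowd tools like Massive are far more sophisticated, but the core idea of each agent cycling through canned clips is the same.

```python
import random

# Hypothetical libraries of canned animation clips: a few dozen walk
# cycles, a pile of combat moves, and some filler idle actions.
WALK_CYCLES = [f"walk_{i:02d}" for i in range(12)]
COMBAT_CLIPS = [f"chop_hobbit_{i:02d}" for i in range(30)]
IDLE_CLIPS = ["jump", "wave_arms", "cheer"]

def pick_clip(agent_state: str) -> str:
    """Select a preset animation for one crowd agent based on coarse state.

    No learning, no per-actor motion capture reuse -- just a lookup into
    fixed clip libraries, which is what distinguishes this from the
    AI-driven replacement of individual extras discussed above.
    """
    if agent_state == "moving":
        return random.choice(WALK_CYCLES)
    if agent_state == "fighting":
        return random.choice(COMBAT_CLIPS)
    return random.choice(IDLE_CLIPS)

# A "crowd" is just many independent agents, each cycling presets.
crowd = [
    pick_clip(random.choice(["moving", "fighting", "idle"]))
    for _ in range(200)
]
```

The point of the sketch: every agent draws from the same small, fixed pool of animations, so nothing about any individual performer is scanned, cataloged, or owned.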