Skanky@lemmy.world to No Stupid Questions@lemmy.world · 1 year ago

From a legal/evidence perspective, what is going to happen when it becomes impossible to tell the difference between a video generated by AI and the real thing?
BlameThePeacock@lemmy.ca · 1 year ago

No need to stream the whole video externally; you could just send the checksums every X minutes and then provide the video matching those checksums later.

It doesn’t entirely stop the problem, though, as you could still insert faked video into the stream in real time. You just couldn’t do it retroactively.
fubo@lemmy.world · 1 year ago

Sure, but robbing a store and simultaneously hacking their video feed is harder than robbing a store and retroactively creating fake footage.
BlameThePeacock@lemmy.ca · 1 year ago

I’m not particularly worried about robbery. There are far more sophisticated ways to attack an organization or person.
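For illustration, here is a minimal Python sketch of the periodic-checksum idea from the comment above: each recorded segment is hashed, the digests are chained, and only the small chain head would need to be sent to an external, timestamped log every X minutes. The `recordings/*.mp4` segment files and the idea of printing instead of submitting to a real external log are assumptions for the sketch, not part of any specific camera system.

```python
import hashlib
from pathlib import Path


def segment_digest(path: Path) -> bytes:
    """SHA-256 of one recorded video segment (e.g. one X-minute file)."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return h.digest()


def chain_segments(segments: list[Path]) -> list[bytes]:
    """Hash each segment and chain it to the previous digest, so that a
    retroactively swapped segment invalidates every later checksum."""
    chained = []
    prev = b""
    for seg in segments:
        prev = hashlib.sha256(prev + segment_digest(seg)).digest()
        chained.append(prev)
    return chained


if __name__ == "__main__":
    # Hypothetical segment files produced every X minutes by the recorder.
    segments = sorted(Path("recordings").glob("*.mp4"))
    for seg, digest in zip(segments, chain_segments(segments)):
        # In practice this digest would be sent to an external,
        # timestamped log; printing stands in for that step here.
        print(seg.name, digest.hex())
```

As noted above, this only pins down the footage from the moment each checksum is published: it rules out retroactive edits but does nothing about faked video inserted into the stream in real time.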