stopthatgirl7@kbin.social to Technology@lemmy.world · 8 months ago
OpenAI Quietly Deletes Ban on Using ChatGPT for "Military and Warfare" (theintercept.com)
103 comments · cross-posted to: worldnews@lemmy.ml
The Pentagon has its eye on the leading AI company, which this week softened its ban on military use.
Fedizen@lemmy.world · 8 months ago
I can't wait until we find out AI trained on military secrets is leaking military secrets.

AeonFelis@lemmy.world · 8 months ago
In order for this to happen, someone will have to use that AI to make a cheat bot for War Thunder.

Jknaraa@lemmy.ml · 8 months ago
I can't wait until people find out that you don't even need to train it on secrets for it to "leak" secrets.

How so?

Bezerker03@lemmy.bezzie.world · 8 months ago
I mean, even with ChatGPT Enterprise you prevent that. It's only the consumer versions that train on your data and submissions. Otherwise no legal team in the world would consider ChatGPT or Copilot.

Scribbd@feddit.nl · 8 months ago
I will say that they still store and use your data in some way. They just haven't been caught yet. Anything you have to send over the internet to a server you do not control will probably not work for an infosec-minded legal team.