• darkphotonstudio@beehaw.org · 6 months ago

    I think people would have fewer issues with AI training if it were non-profit and for the common good. And there are open source AI projects, many in fact. But yeah, deals like these by companies like this are sleazy.

        • Skull giver@popplesburger.hilciferous.nl · 6 months ago

          Up until GPT-3 they were quite open. When GPTs became good, they started claiming that sharing the models would be risky, that there were ethical problems, and that they would safekeep the technology. I believe they were even sued by one of their investors for abandoning their open mission at some point.

          The source code they would provide would be pretty useless to most people anyway, unless you have a couple million lying around to spend on GPUs.
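          To put a rough number on that "couple million", here's a back-of-envelope sketch in Python. The parameter and token counts are the published GPT-3 figures; the GPU throughput, utilization, and hourly rate are assumptions of mine for illustration, not anything OpenAI has disclosed.

          ```python
          # Back-of-envelope GPT-3 training cost estimate.
          # Parameter and token counts are from the GPT-3 paper;
          # throughput, utilization, and price are assumptions.
          params = 175e9               # GPT-3 parameter count
          tokens = 300e9               # approximate training tokens
          flops = 6 * params * tokens  # common ~6 FLOPs/param/token estimate

          gpu_flops = 312e12           # assumed A100 peak BF16 FLOP/s
          utilization = 0.4            # assumed realistic utilization
          dollars_per_gpu_hour = 2.0   # assumed cloud rate

          gpu_hours = flops / (gpu_flops * utilization) / 3600
          print(f"~{gpu_hours:,.0f} GPU-hours, roughly ${gpu_hours * dollars_per_gpu_hour:,.0f}")
          ```

          Even with those charitable assumptions it lands in the low millions of dollars before counting failed runs, experiments, and data pipelines, which is the point: the expensive part is the compute and the data, not the source code.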

          Plenty of AI companies do what OpenAI did, without ever sharing any models or writing any papers. We only hear about the open stuff. We see tons of open source AI stuff on GitHub that's mostly based on research by either Google or OpenAI. All the Llama stuff exists only because Facebook's model got out (the weights leaked after being shared with researchers). All of this stuff is mostly open, even if it's not FOSS.

          Compare that to what companies are doing internally. You bet data brokers and other shady shits are sucking up as much data as they can get their hands on to train their own specialised AI, free from the burdens of "as an LLM I can't do that".