Despite its name, the infrastructure behind the “cloud” accounts for more global greenhouse gas emissions than commercial aviation. In 2018, for instance, the 5bn YouTube views of the viral song Despacito consumed as much energy as it would take to heat 40,000 US homes for a year.

Large language models such as the ones behind ChatGPT are some of the most energy-guzzling technologies of all. Research suggests, for instance, that training GPT-3 at Microsoft’s data facilities may have consumed about 700,000 litres of water to cool the machines involved.

Additionally, as these companies aim to reduce their reliance on fossil fuels, they may opt to base their datacentres in regions with cheaper electricity, such as the southern US. Because many of these regions are also among the driest, this risks exacerbating water stress where supplies are already scarce.

Furthermore, while minerals such as lithium and cobalt are most commonly associated with batteries in the motor sector, they are also crucial for the batteries used in datacentres. Extracting them often involves significant water use and can cause pollution, undermining water security; their extraction is also frequently linked to human rights violations and poor labour standards. Pursuing one climate goal, limiting our dependence on fossil fuels, can thus compromise another: ensuring that everyone has a safe and accessible water supply.

Moreover, allocating significant energy resources to tech-related endeavours can create shortages for essential needs such as residential power supply. Recent data from the UK shows that the country’s outdated electricity network is already holding back affordable housing projects.

In other words, policy needs to be designed not to pick sectors or technologies as “winners”, but to pick the willing by providing support that is conditional on companies moving in the right direction. Making disclosure of environmental practices and impacts a condition for government support could ensure greater transparency and accountability.

  • Kilgore Trout@feddit.it · 7 months ago

    It is a little scary. Machine learning / LLMs consume insane amounts of power, and it’s happening in plain sight.

    I was shocked a few months ago to learn that the Internet, including infrastructure and end-user devices, already consumed 30% of world energy production in 2018. We are not only digging our grave, but doing it ever faster.

    • frezik@midwest.social · 7 months ago

      The Sam Altman fans also say that AI would solve climate change in a jiffy. Problem is, we already have all the tech we need to solve it. We lack the political will to do it. AI might be able to improve our tech further, but if we lack the political will now, then AI’s suggestions aren’t going to fix it. Not unless we’re willing to subsume our governmental structures to AI. Frankly, I do not trust Sam Altman or any other techbro to create an AI that I would want to be governed by.

      What we end up with is that while AI might improve things, it almost certainly isn’t worth the energy being dumped into it.

      Edit: Yes, Sam Altman does actually believe this. That’s clear from his public statements about climate change and AI. Please don’t get into endless “he didn’t say exactly those words” debates, because that’s bullshit. He justifies massive AI energy usage by saying it will totally solve climate change. Totally.

      • SlopppyEngineer@lemmy.world · 7 months ago

        Frankly, I do not trust Sam Altman or any other techbro to create an AI that I would want to be governed by.

        “Once men turned their thinking over to machines in the hope that this would set them free. But that only permitted other men with machines to enslave them.”

        ~ Frank Herbert, Dune

        • frezik@midwest.social · 7 months ago

          Thing is, I could maybe be convinced that a sufficiently advanced AI would run society in a more egalitarian and equitable way than any existing government. It’s not going to come from techbros, though. They will 100% make an AI that favors techbros.

          Edit: almost forgot this part. Frank Herbert built a world ruled by a highly stratified feudal empire. The end result of that “no thinking machines” rule isn’t that good, either. He also based it on a lot of 1960s/70s ideas about drugs expanding the human mind that are just bullshit. Great novel, but its ideas shouldn’t be taken at face value.

      • afraid_of_zombies@lemmy.world · 7 months ago

        You know, I have never once heard anyone actually say what you claim they are saying. I personally think it would be better to address the bad arguments that are actually being made, instead of ones we wish existed solely so we can argue with them.

      • AdrianTheFrog@lemmy.world · 7 months ago

        I agree that these arguments are stupid, but is anyone actually saying we should do those things?