• dreamless_day@feddit.org · 3 months ago

          It doesn’t. “Wasting water” is bullshit most of the time. What you waste is the energy powering pumps and sewage plants.

          • UnderpantsWeevil@lemmy.world · 3 months ago

            “Wasting water” is bullshit most of the time.

            Pumping water out of reserves, using it as coolant, and then discharging the hot water into local waterways, where the heat kills off the local ecology, is “waste” on several levels.

            This is a common practice for industrial cooling, as pumping water and releasing it is cheaper than cycling the water through a large ventilator and recovering it.

  • verdigris@lemmy.ml · 3 months ago

    This does misunderstand what actually costs the energy: it’s training the models that’s costly, not using the already-trained ones. Although to be fair, using them increases the incentive for new ones to be trained… But yeah, asking ChatGPT for a recipe idea isn’t burning an ounce of gasoline.

      • infinitesunrise@slrpnk.net · 3 months ago

        The comparison I like to make: a “playing a AAA game” amount of energy vs. a “running an entire data center at full blast” amount of energy.

        • FooBarrington@lemmy.world · 3 months ago

          While the order of magnitude is correct, running the bigger models is closer to playing a AAA game on 8 computers at the same time.
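
          A minimal Python sketch of that comparison, assuming roughly 400 W for a gaming PC under full load and roughly 400 W per datacenter GPU (ballpark assumptions for illustration, not figures from this thread):

          # Back-of-the-envelope power comparison (all wattages are rough assumptions)
          GAMING_PC_WATTS = 400            # assumed draw of a gaming PC running a AAA game
          GPUS_PER_INFERENCE_NODE = 8      # assumed size of one large-model inference server
          WATTS_PER_DATACENTER_GPU = 400   # assumed draw of one datacenter GPU under load

          node_watts = GPUS_PER_INFERENCE_NODE * WATTS_PER_DATACENTER_GPU
          ratio = node_watts / GAMING_PC_WATTS
          print(f"Gaming PC: ~{GAMING_PC_WATTS} W")
          print(f"8-GPU inference node: ~{node_watts} W (~{ratio:.0f}x one gaming PC)")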

  • chicken@lemmy.dbzer0.com · 3 months ago

    I found a blog post that cites a Business Insider article suggesting this claim, as formulated, is way off:

    Reported energy use implies that ChatGPT consumes about as much energy as 20,000 American homes. An average US coal plant generates enough energy for 80,000 American homes every day. This means that even if OpenAI decided to power every one of its billion ChatGPT queries per day entirely on coal, all those queries together would only need one quarter of a single coal plant. ChatGPT is not the reason new coal plants are being opened to power AI data centers.
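
    For what it’s worth, the “one quarter of a single coal plant” figure is just the ratio of the two home-equivalent numbers in that quote; a minimal Python check:

    # Ratio of the two figures quoted above
    chatgpt_equiv_homes = 20_000   # ChatGPT's reported energy use, in "American homes"
    coal_plant_homes = 80_000      # homes an average US coal plant can power
    print(f"ChatGPT uses ~{chatgpt_equiv_homes / coal_plant_homes:.2f} of one coal plant")  # 0.25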

    It goes on to argue that while it’s true that AI related electricity use is booming, it’s not because of LLM chatbots:

    AI energy use is going to be a massive problem over the next 5 years. Projections say that by 2030 US data centers could use 9% of the country’s energy (they currently use 4%, mostly due to the internet rather than AI). Globally, data centers might rise from using 1% of the global energy grid to 21% of the grid by 2030. …

    97% of the total energy used by AI as of late 2024 is not being used by ChatGPT or similar apps, it’s being used for other services. What are those services? The actual data on which services are using how much energy is fuzzy, but the activities using the most energy are roughly in this order:

    * Recommender Systems - Content recommendation engines and personalization models used by streaming platforms, e-commerce sites, social media feeds, and online advertising networks.
    
    * Enterprise Analytics & Predictive AI - AI used in business and enterprise settings for data analytics, forecasting, and decision support.
    
    * Search & Ad Targeting - The machine learning algorithms behind web search engines and online advertising networks.
    
    * Computer vision - AI tasks involving image and video analysis – often referred to as computer vision. It includes models for image classification, object detection, facial recognition, video content analysis, medical image diagnostics, and content moderation (automatically flagging inappropriate images/videos). Examples are the face recognition algorithms used in photo tagging and surveillance, the object detection in self-driving car systems (though inference for autonomous vehicles largely runs on-board, not in data centers, the training of those models is data-center based), and the vision models that power services like Google Lens or Amazon’s image-based product search.
    
    * Voice and Audio AI - AI systems that process spoken language or audio signals. The most prominent examples are voice assistants and speech recognition systems – such as Amazon’s Alexa, Google Assistant, Apple’s Siri, and voice-to-text dictation services.
    
      • chicken@lemmy.dbzer0.com · 3 months ago

        How would I have used AI here? It’s mostly quotes from the article. You’re way off anyway: it actually took me a little while to try out different ways of formatting that list in Lemmy and to get the hyperlinks in the quotes to display correctly, let alone to find the article in the first place (it was the source of a graph I’d seen somewhere and was initially thinking of posting), or to read the thing in order to pick out appropriate passages. To be clear, I have put way too much effort into writing internet comments over the years to start using LLMs for that now, and I promise you I do not.

        • TheSambassador@lemmy.world · 3 months ago

          It’s super interesting that a nicely formatted, bulleted, and quoted comment like yours was immediately accused of being AI. I know that AI tools generally use a similar format when summarizing, but that’s just because they’ve been trained on lots of people writing real summaries.

          I’m worried that the new keyboard warriors of the internet are just going to be harassing people with AI accusations. I’ve seen gamedevs accused of having their store page text written by AI, artists who’ve had incredible and personal works doubted, and authors facing organized harassment campaigns over false AI suspicions. People are getting really overzealous about being anti-AI, and it’s getting a bit irrational.

          • chicken@lemmy.dbzer0.com · 3 months ago

            I think maybe the confusion has to do with the fact that the list at the bottom is meant to be another quote rather than a summary, but since it’s in a code block it looks different from the other quotes, which might imply it isn’t a quote. Now that I’m looking at it more, in hindsight I should have done it like this:

            • list1
            • list2

            I just didn’t realize it mattered much, and I figured it cluttered the page less the first way.