• verdigris@lemmy.ml
    4 months ago

This does misunderstand what actually costs the energy – it’s training the models that’s costly, not using the already-trained ones. Although, to be fair, using them increases the incentive for new ones to be trained… But yeah, asking ChatGPT for a recipe idea isn’t burning an ounce of gasoline.

      • infinitesunrise@slrpnk.net
        4 months ago

The comparison I like to make: a playing-a-AAA-game amount of energy versus a running-an-entire-data-center-at-full-blast amount of energy.

        • FooBarrington@lemmy.world
          4 months ago

While the order of magnitude is correct, running the bigger models is closer to playing a AAA game on eight computers at the same time.