• Lem Jukes@lemm.ee · 21 hours ago

      If you learned math with a calculator, you didn’t learn math.

      • finitebanjo@lemmy.world · 21 hours ago (edited)

        Firstly, a calculator doesn’t have a double-digit percent chance of bullshitting you with made-up information.

        If you’ve ever taken a calculus course, you likely weren’t allowed to use a calculator capable of solving the problems for you, and you likely had to show all of your work on paper. So yes, that statement is correct.

    • Lumiluz@slrpnk.net · 23 hours ago

      Same vibes as “if you learned to draw with an iPad then you didn’t actually learn to draw”.

      Or in my case, I’m old enough to remember “computer art isn’t real animation/art” and also the criticism aimed at Photoshop.

      And plenty of people criticized Andy Warhol before then, too.

      Go back in history and you can read criticisms of using typewriters instead of handwriting as well.

      • endeavor@sopuli.xyz · 8 hours ago

        As an artist who is learning to code, it’s different. It’s night and day whether you have access to undo and HSV adjustment, but you still must nail color, composition, values, proportion, perspective, etc. Especially since a ton of shortcuts are also available to traditional artists, who can just paint over a projection. Besides saving tons of money and making your daily practice easier, the only other things digital art gives you are noob traps like brushes and a lack of confidence from relying on undo and similar tools. I transferred to traditional oil paints just fine, because the fundamentals are what separate the trash from the okay and above.

        It is night and day when you ask AI how to make a multiplication table versus applying what you’ve already learned to work out the logic behind making it yourself. Using AI the wrong way in programming means you don’t learn the fundamentals, i.e. you don’t learn to program. Comparing using AI to learn programming with learning to paint on an iPad is wrong; comparing it with using AI to make the art for you is more apt.
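
        For example, the logic in question is just a pair of nested loops. A minimal Python sketch, purely illustrative (the language, the function name, and the 10x10 size are arbitrary choices, not anything specified above):

            # Build an n-by-n multiplication table with two nested loops:
            # each cell is simply row * column.
            def multiplication_table(n=10):
                rows = []
                for i in range(1, n + 1):
                    # format each product to a fixed width so the columns line up
                    rows.append(" ".join(f"{i * j:4d}" for j in range(1, n + 1)))
                return "\n".join(rows)

            print(multiplication_table())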

        • Lumiluz@slrpnk.net · 5 hours ago

          You’re right, my bad. I should have worded that reply better.

          I meant that, as a tool to help you code, it’s useful, especially if you already know some coding. It can help you, say, finish a game by coding mechanics you don’t quite know how to make work, which you can then fix up yourself with the desired parameters.

          If it helps you finish your idea for a game (especially if it’s something like the first game you’ve ever made), it’s useful for learning some of the workflow involved in making a game.

      • finitebanjo@lemmy.world · 21 hours ago

        None of your examples are even close to a comparison with AI, which steals from people to generate approximate nonsense while costing massive amounts of electricity.

        • Lumiluz@slrpnk.net · 13 hours ago

          Have you ever looked at the file size of something like Stable Diffusion?

          Considering the data it’s trained on, do you think it’s:

          A) 3 Petabytes
          B) 500 Terabytes
          C) 900 Gigabytes
          D) 100 Gigabytes

          Second, what’s the electrical cost of generating a single image using Flux vs 3 minutes of Baldur’s Gate, or a similar game, on max settings?

          Surely you must have some idea on these numbers and aren’t just parroting things you don’t understand.

          • finitebanjo@lemmy.world · 10 hours ago (edited)

            What a fucking curveball joke of a question; you take a nearly impossible-to-quantify comparison and ask if it’s equivalent?

            Gaming:

            A high scenario electricity consumption figure of around 27 TWh, and a low scenario figure of 14.7 TWh

            North American gaming market is about 7% of the global total

            Then that gives us a very, very rough figure of about 210-285 TWh per annum of global electricity used by gamers.

            AI:

            The rapid growth of AI and the investments into the underlying AI infrastructure have significantly intensified the power demands of data centers. Globally, data centers consumed an estimated 240–340 TWh of electricity in 2022—approximately 1% to 1.3% of global electricity use, according to the International Energy Agency (IEA). In the early 2010s, data center energy footprints grew at a relatively moderate pace, thanks to efficiency gains and the shift toward hyperscale facilities, which are more efficient than smaller server rooms.

            That stable growth pattern has given way to explosive demand. The IEA projects that global data center electricity consumption could double between 2022 and 2026. Similarly, IDC forecasts that surging AI workloads will drive a massive increase in data center capacity and power usage, with global electricity consumption from data centers projected to double to 857 TWh between 2023 and 2028. Purpose-built AI infrastructure is at the core of this growth, with IDC estimating that AI data center capacity will expand at a 40.5% CAGR through 2027.

            Let’s just say we’re at the halfway point and it’s 600 TWh per annum, compared to 285 for gamers.

            So more than fucking double, yeah.

            And to reiterate, people generate thousands of frames in a session of gaming, vs a handful of images or maybe some emails in a session of AI.

            • Lumiluz@slrpnk.net · 10 hours ago

              But we’re not comparing the global energy use of LLMs, diffusion engines, other specialized AI (like protein folding), etc. to ONLY the American gaming market.

              The conversation was specifically about image-generative AI. You can stop moving the goalposts and building a strawman now, and while you’re at it, answer the first question too.

              • finitebanjo@lemmy.world · 9 hours ago (edited)

                Apparently you can only read 2 of the 3 lines. That estimate was a global projection of gaming cost IF the globe followed similar trends to the USA (because that’s the only available data), so the real global cost estimate for gaming might be far, far lower.

                The USA alone spent 27 TWh on gaming, not 285.

                • Lumiluz@slrpnk.net · 9 hours ago

                  That still doesn’t address that the energy use of AI in your statistics includes all AI rather than just image generation.

                  If we’re including all AI use cases, we’d have to consider all non-AI use cases on the other end too, not just gaming, such as anime production, 3D rendering, etc., which also use graphics card cycles.

                  And you’re still ignoring the very first question.

                  So, try again.

                    • finitebanjo@lemmy.world · 9 hours ago (edited)

                    LMAO wtf? I included all of gaming as opposed to all generative AI. My estimate also included the cost of production, if you check the source.

                    You’re the one who wanted to compare AI power costs to gaming costs, and now you’ve shifted the goalposts to the total power cost of everything?

                    It’s a waste. AI is a massive fucking waste. It’s going to literally kill us all with climate change alone; at the current rate it’s going to multiply our power consumption many times over in only a couple of decades, even after you account for efficiency gains. It’s beyond worthless; it’s an almost pure negative.