• Jrockwar · 6 months ago

    That middle paragraph is very misleading. It’s generative AI as a service that is actively harmful to the environment. Having a 15 W NPU do tasks like erasing objects from a photo is no more harmful to the environment than a GPU drawing the same 15 W. In fact, NPUs can be more efficient than GPUs at some tasks.

    The problem is opening your phone or browser and being able to call GPT-4 on demand, waking up a cluster of 128 Nvidia A100s that each draw around 300-400 W. At the top of that range, that’s 51.2 kW.
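
    Back-of-envelope sketch of that figure, assuming the full 400 W per card and ignoring CPU, networking and cooling overhead:

    ```python
    # Cluster power estimate: 128 A100s at an assumed 400 W each.
    gpus = 128
    watts_per_gpu = 400                       # A100s draw roughly 300-400 W under load
    cluster_kw = gpus * watts_per_gpu / 1000
    print(cluster_kw)                         # 51.2 (kW)
    ```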

    Now you can draw some positives and negatives from that figure, such as:

    • Given that the A17 Pro in an iPhone 15 Pro has a thermal design power of 8 W, GPT-4 on the server draws about 6,400 times more power than anything you can do on an iPhone. Ten seconds of GPT-4 use a similar amount of energy to an iPhone 15 Pro running flat out at maximum power for about 18 hours (rough arithmetic in the sketch after this list). Now, in those 10 seconds OpenAI says they “handle multiple user queries simultaneously”, but still - we’re feeding the machine.
    • 51.2 kW is also roughly the power a large SUV needs to hold a constant speed on a motorway. Each of those clusters uses about as much power as a single 7-seater SUV, but it serves many users at the same time. Plus, unlike cars, a large portion of their energy comes from renewables. So yes, I agree it’s a significant impact, but it’s largely overstated and we have bigger fish to fry; personal transport is a far bigger issue.
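
    For reference, here is the arithmetic behind the phone comparison (a back-of-envelope sketch, assuming the whole 51.2 kW cluster is busy for the full 10 seconds and taking the 8 W figure at face value):

    ```python
    # Rough sanity check of the iPhone comparison above.
    cluster_w = 51.2e3                     # 128 A100s at ~400 W each
    iphone_w = 8                           # claimed A17 Pro thermal design power
    power_ratio = cluster_w / iphone_w     # ~6400x the phone's peak draw
    gpt_energy_j = cluster_w * 10          # 10 s of cluster time ~= 512 kJ
    iphone_hours = gpt_energy_j / iphone_w / 3600
    print(power_ratio, round(iphone_hours, 1))   # 6400.0, ~17.8 hours
    ```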