Tesla's Musk predicts AI will be smarter than the smartest human next year

Discussion in 'Science' started by Eclectic, Apr 9, 2024.

  1. Eclectic

    Musk goes on to say that shortages of both chips and electrical power are constraints on AI model training. Each Nvidia H100 chip draws about 500 watts, so a training setup using 20,000 H100s would consume about 10 megawatts (10,000 kilowatts), and one using 100,000 H100s about 50 megawatts. Adding memory, CPUs, and other supporting circuitry, the total data center draw is probably about 100 MW. That's only about a tenth of the output of a typical nuclear power plant, so not too bad. But it's not the thing you want every little organization trying to do.
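
    A quick back-of-the-envelope sketch of that arithmetic in Python. The 500 W per-H100 figure is the one quoted above; the 2x overhead factor standing in for memory, CPUs, cooling, and other circuitry is my own assumption, chosen so the totals line up with the ~100 MW estimate:

    # Rough power math for an H100 training cluster.
    # 500 W per GPU is from the post; the 2x facility overhead
    # factor (CPUs, memory, cooling, etc.) is an assumption.

    GPU_WATTS = 500          # assumed draw per Nvidia H100
    OVERHEAD_FACTOR = 2.0    # assumed multiplier for everything else

    def cluster_power_mw(num_gpus: int, overhead: float = OVERHEAD_FACTOR) -> float:
        """Estimated total data center draw in megawatts."""
        return num_gpus * GPU_WATTS * overhead / 1e6

    for n in (20_000, 100_000):
        gpu_only_mw = n * GPU_WATTS / 1e6
        print(f"{n:>7,} H100s: ~{gpu_only_mw:.0f} MW for GPUs alone, "
              f"~{cluster_power_mw(n):.0f} MW with overhead")

    # 20,000 H100s -> ~10 MW for GPUs alone; 100,000 H100s -> ~50 MW
    # for GPUs alone, ~100 MW with the assumed overhead, i.e. roughly
    # a tenth of a ~1 GW nuclear plant's output.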
     