Only this gift is solely for the use of Donald Trump, during and even after his presidency. Trump didn't get given any jet; the gift was to the United States.
I use AI almost every day, so I do expect that the energy consumption of AI in datacenters will grow and grow each year. However, I reckon that AI offsets that energy consumption against reduced use of more power-hungry laptops and desktop PCs, because you can use AI on your phone instead of going to your laptop or desktop PC. I have a dedicated PC that I have been using to run AI locally. It's hardly in use now because I find it more convenient to go online. Compare the amount of electricity my phone uses against that of my AI PC, with 64 cores and tons of memory: that is one or two orders of magnitude in energy savings.

We did the math on AI’s energy footprint. Here’s the story you haven’t heard.
The emissions from individual AI text, image, and video queries seem small—until you add up what the industry isn’t tracking and consider where it’s heading next.
I've been running 12B-parameter AI models on a mini PC with a 16-core Ryzen 7. When idle, that machine uses 8 watts, and when working on a query it never goes above 28 watts. The period during which the machine works at 28 watts of input power might last half a minute, but let's call it a minute, which means solving the query uses about half a watt-hour at most.
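A quick sanity check of that "half a watt-hour" figure, using only the numbers stated above (28 W peak draw, roughly one minute per query):

```python
# Rough arithmetic behind the "half a watt-hour" claim.
# Figures taken from the comment above: 28 W peak draw, ~1 minute per query
# (generous -- the comment says it often finishes in ~30 seconds).
PEAK_WATTS = 28
QUERY_SECONDS = 60

# watts x hours -> watt-hours
energy_wh = PEAK_WATTS * QUERY_SECONDS / 3600
print(f"{energy_wh:.2f} Wh per query")  # ~0.47 Wh, about half a watt-hour
```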
Google announced that the number of tokens they process went up 50x in one year. However, in the same year, their Ironwood (TPUv7) offers massive leaps in compute (5x), HBM capacity (6x), and power efficiency (2x) over their 2024 Trillium (v6). Performance is estimated to be within 5% of an Nvidia B200, despite TPUv7’s single primary compute die design. The v7 delivers 4,614 TFLOP/s peak performance. In a few years, virtually everyone on earth is going to use AI daily. It will become as essential a resource as the internet has been.
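Those two numbers pull in opposite directions. As a back-of-the-envelope sketch only (it assumes energy per token simply halves with the 2x efficiency gain, ignoring other hardware improvements), the net effect would still be a large increase in total energy:

```python
# Back-of-the-envelope: how 50x token growth and a 2x power-efficiency
# gain might net out. Assumes energy per token halves with the 2x
# efficiency improvement -- a simplification, not a measured figure.
token_growth = 50     # tokens processed, year over year
efficiency_gain = 2   # power efficiency, Trillium (v6) -> Ironwood (v7)

net_energy_growth = token_growth / efficiency_gain
print(f"~{net_energy_growth:.0f}x more energy, all else equal")
```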
Each ChatGPT question is estimated to use around 10 times more electricity than a traditional Google search. According to the nonprofit research firm Electric Power Research Institute, a ChatGPT request uses 2.9 watt-hours while traditional Google queries use about 0.3 watt-hours each.
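Checking the "around 10 times" claim against the two EPRI per-query figures just quoted:

```python
# Ratio of the EPRI per-query estimates cited above.
CHATGPT_WH = 2.9        # watt-hours per ChatGPT request
GOOGLE_SEARCH_WH = 0.3  # watt-hours per traditional Google query

ratio = CHATGPT_WH / GOOGLE_SEARCH_WH
print(f"{ratio:.1f}x")  # ~9.7x, consistent with "around 10 times"
```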
Do you use Ollama, Tony? What models did you run locally?
Gemma 3 is a Google product. I find it more accurate, less prone to hallucinations, and it writes far better than many others. Llama is from Meta, and that isn't bad either, but I think the Google one is better. My son uses Perplexity Deep Research to analyse problems and write complex professional reports. It is a VERY impressive thing, but not, as far as I know, possible to run on your own local machine. As in all such work, writing the prompt carefully is essential if you want useful output.

I used Ollama for more than a year, but now mostly ChatGPT and Gemini 2.5 Pro. I use more and more Google products.
Yes - of course - nobody ever had a fire before solar flaps were invented. Fire brigades were really staffed by people who were actually unemployed, because fires never happened.

One is not even safe in hospital from the solar heat, hehe.
Flaps on fire again, flippin hell?