May 23, 2025

I use AI almost every day, so I do expect the energy consumption of AI in datacenters to grow each year. However, I reckon AI offsets some of that consumption by reducing the use of more power-hungry laptops and desktop PCs, because you can use AI on your phone instead of going to your laptop or desktop. I have a dedicated PC that I have been using to run AI locally. It's hardly in use now because I find it more convenient to go online. Compare the amount of electricity my phone uses against that of my AI PC, with its 64 cores and tons of memory: that is one or two orders of magnitude in energy savings.

Google announced that the number of tokens they process went up 50x in one year. However, in the same year, their Ironwood (TPUv7) offers massive leaps in compute (5x), HBM capacity (6x), and power efficiency (2x) over their 2024 Trillium (v6). Performance is estimated to be within 5% of an Nvidia B200, despite TPUv7's single primary compute die design. The v7 delivers 4,614 TFLOP/s peak performance. In a few years, virtually everyone on earth is going to use AI daily. It will become as essential a resource as the internet has been.

I've been running 12B-parameter AI models on a mini PC with a Ryzen 7, 16-core processor. When idle, that machine uses 8 watts, and when working on a query it never goes above 28 watts. The period where the machine is working at 28 watts input power might last half a minute, but let's call it a minute, which means solving a query uses about half a watt-hour at most. I just asked Google AI how much energy the average ChatGPT query uses, and it said 2.9 watt-hours. So my own local system is certainly less powerful, but about six times more energy efficient, except that I have to leave the machine running idle at 8 watts.
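The arithmetic in the post above can be checked directly. This is a rough sketch using only the figures the poster gives (28 W peak for about a minute per query, 8 W idle, and the 2.9 Wh ChatGPT estimate quoted further down); all the numbers are the post's own claims, not measurements.

```python
# Rough energy comparison: a local mini-PC query vs. the quoted
# 2.9 Wh ChatGPT estimate. Figures are taken from the post above.

def query_energy_wh(power_w: float, duration_min: float) -> float:
    """Energy in watt-hours for a load drawing power_w for duration_min."""
    return power_w * duration_min / 60

local_wh = query_energy_wh(28, 1)   # 28 W for a full minute: ~0.47 Wh
chatgpt_wh = 2.9                    # quoted EPRI estimate per request

print(f"local query: {local_wh:.2f} Wh")
print(f"ratio: {chatgpt_wh / local_wh:.1f}x")  # roughly 6x, as the post says

# The idle cost matters too: 8 W around the clock is 192 Wh/day,
# which is about 66 ChatGPT queries' worth of energy per day.
idle_wh_per_day = 8 * 24
print(f"idle: {idle_wh_per_day} Wh/day")
```

So the "about six times more efficient" figure holds per query, but only if you ignore the always-on idle draw.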
I run it headless, interacting with it on my phone or on my Chromebook, so there is no monitor in use on the mini PC unless I am specifically configuring something on it. I interact with it through Open WebUI or a browser extension for Chrome and Firefox called Page Assist.

By the way, the words 'traditional Google search' make the Google quote below a bit disingenuous, because there is no longer a traditional, old-time Google search. All the googling I do comes up with an AI-generated response at the top of the answer. I find this massively useful, since I generally don't need to go through screeds of marketing-oriented, paid-for responses in the Google listings.

"2.9 watt-hours. Each ChatGPT question is estimated to use around 10 times more electricity than a traditional Google search. According to the nonprofit research firm Electric Power Research Institute, a ChatGPT request uses 2.9 watt-hours while traditional Google queries use about 0.3 watt-hours each."
May 23, 2025

[quoting the post above] "I've been running 12B-parameter AI models on a mini PC ... about six times more energy efficient."

Do you use Ollama, Tony? What models did you run locally? I like Gemma 3 at 12B parameters. I would run the 27B model, but my mini PC isn't powerful enough. One of my sons runs 27B-parameter models, but the much more powerful gaming PC he runs them on uses a lot more electricity. I was doing it to learn about AI, which was useful, rather than for any other reason. It took quite a bit of effort to set it all up, but I suppose one advantage of having the models on your own machine is that in a catastrophe you would still have a vast fund of knowledge at your disposal, and you'd only need a solar panel to get it up and running if the sh*t hit the fan and everything in civilisation collapsed.
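For anyone curious about the headless setup described in this thread: once Ollama is serving a model, you can talk to it from any machine on the network over its REST API. A minimal sketch, assuming the default port 11434 and a hypothetical model tag "gemma3:12b" (substitute whatever you actually pulled):

```python
# Minimal sketch of querying a local Ollama server over its REST API.
# Host, port, and model name are assumptions; adjust to your setup.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_request(model: str, prompt: str) -> dict:
    """Payload for Ollama's /api/generate endpoint, non-streaming."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    """Send one prompt and return the model's full text response."""
    payload = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (needs a running Ollama instance with the model pulled):
# print(ask("gemma3:12b", "In one sentence, why is the sky blue?"))
```

Front-ends like Open WebUI or Page Assist are essentially wrappers around calls like this, which is why a phone or Chromebook browser is all you need once the mini PC is on the network.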
It may seem a bit nuts to speculate on Armageddon scenarios, but between a resurgent Russia meddling with power and data cables under the sea, not to say threatening nuclear war, and armies of lunatic teens knocking over major retailers, who knows when power or data centres could just disappear? Since Putin invaded Ukraine I have kept a six-month supply of food at my house in the country, a lot of it in the form of 30-kilo sacks of unground wheat. I have a mill, of course. Dull, but I would stay alive for longer.
May 23, 2025

I have used Ollama for more than a year, but now mostly ChatGPT and Gemini 2.5 Pro. I use more and more Google products.
May 23, 2025

So people are more bothered about fake air vents on an electric car than about how and why people were burnt to death in a malfunctioning electric car? Mm :-/ That's the correct vid.
May 23, 2025

One is not even safe in hospital from d Zolar heat, hehe. Flaps on fire again, flippin hell?
May 23, 2025

https://www.infowars.com/posts/global-bombshell-using-advanced-ai-algorithms-renowned-tech-inventor-covid-expert-steve-kirsch-reveals-incontrovertible-evidence-that-the-covid-19-mrna-vaccines-have-caused-mass-death-il
May 23, 2025

"I use Ollama for more than a year but now mostly chatgpt and gemini 2.5 Pro. I use more and more Google products."

Gemma 3 is a Google product. I find it more accurate, less prone to hallucinations, and it writes far better than many others. Llama is from Meta, and that isn't bad either, but I think the Google one is better. My son uses Perplexity Deep Research to analyse problems and write complex professional reports. It is a VERY impressive thing, but not, as far as I know, possible to run on your own local machine. As in all such work, writing the prompt carefully is essential if you want useful output.
https://www.perplexity.ai/hub/blog/introducing-perplexity-deep-research
https://deepmind.google/models/gemma/gemma-3/#:~:text=Welcome%20to%20Gemma%203,%2Dthe%2Dart%20open%20models.
May 23, 2025

"One is not even safe in hospital from d Zolar heat, hehe. Flaps on fire again, flippin hell?"

Yes, of course, nobody ever had a fire before solar flaps were invented. Fire brigades were really staffed by people who were actually unemployed, because fires never happened. The Great Fire of London, 1666: the whole city burned down. Los Angeles, 2024.
May 23, 2025

The council does not clean or maintain their inverters; 20 are ready to blow, and it's costing them 200 a month on electric prepay meters.
May 23, 2025

If I wanted a lot of video, audio, ebook, and website misinformation content automatically created by AI about how the earth is square, I'd have to go AMD, or shoehorn some <11GB AI model demon into my Nvidia 1080 Ti... but of course it won't be as efficient or run as fast as the newer cards with AI processing enhancements. The time may have come to build an AMD system; I've been exclusively Intel for decades. AMD CPUs used to run too hot and burn out often back then. How much square-earth content would it take to snuff out the flat earthers?

Edited May 23, 2025 by guerney
May 23, 2025

[quoting guerney's post above] "The time may have come to build an AMD system, I've been exclusively Intel for decades. AMD CPUs used to run too hot and burn out often back then."

My Ryzen 7 processor never gets above about 60°C. It has good, silent fans, though. I don't have a GPU, and even so, an average query might mean I need to wait ten seconds for it to understand and begin answering. It produces output faster than I can read it. I think it is true that Ryzen processors are less power-hungry than equivalent Intel ones.
May 23, 2025

https://www.scan.co.uk/products/amd-epyc-9655p-s-sp5-3nm-zen-5-96-core-192-thread-26ghz-45ghz-turbo-384mb-400w-cpu-oem

They're never in stock :rolleyes:
May 23, 2025

[quoting guerney's post above] "The time may have come to build an AMD system ..."

It's a waste of time running AI on a home computer unless you have the stuff that SW buys. It's time-consuming to set up and maintain a homelab. Just go to https://aistudio.google.com/ - you get several months of free usage. By the time you use up one freebie, they offer you two more to try...
May 23, 2025

youtube DS-ytunI30Y

The guy isn't right. Unlike the USA or Canada, our 280 or so oil and gas fields are so close to a dozen countries in Europe that the producers don't need to supply us first. They just sell to the highest bidders. Sure, if the oil and gas industry is nationalised, then I will agree that burning gas makes sense.
May 23, 2025

[replying to "Sure, if the oil and gas industry is nationalised then I will agree that burning gas makes sense."]

We should have done that, like the Norwegians. They have a national wealth fund of massive proportions and are, because of it, among the richest populations in the world. Stuff like the natural resources of a nation should not be pillaged by mega-corporations who then sell the people their own nation's resources back at the world market price. The Norwegian government has the largest shareholding in Equinor, 67%, and the rest is publicly traded on international stock exchanges. The capital raised by this means benefits the company, and the Norwegian state gets the lion's share of the profits. On top of all that, they recover tax and fees from the corporations they use to extract the oil and gas, and 5.5% of Norway's population is directly employed in the business of extracting and exporting oil and gas. We should have done it like that.
May 23, 2025

[quoting the post above] "Just go to https://aistudio.google.com/ - you get several months of free usage."

Privacy is the big problem there; "free" means you're the commodity. One of my projects is predicting stock prices, and years ago I made an algorithm that works well for small time increments, rather like the robotraders, which might work on the same principles. The trouble is, you need a trading desk and API access to make ultra-fast trades, so I'd like to develop something for long-term investments. Warren Buffett's brain ought to be digitised while there's still time.

https://aistudio.google.com/
Impressive but still a bit meh:

Edited May 23, 2025 by guerney
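The poster doesn't describe their actual trading algorithm, so as a stand-in here is a generic illustration of one classic short-interval signal of the kind robotraders use: a moving-average crossover. All names and figures below are illustrative only, not the poster's method and certainly not trading advice.

```python
# Generic moving-average crossover signal: go long when a fast trailing
# average rises above a slow one, exit when it falls below. Illustrative
# only; window sizes and the sample price series are made up.

def moving_average(prices, window):
    """Trailing simple moving average; None until enough data has arrived."""
    return [None if i + 1 < window
            else sum(prices[i + 1 - window:i + 1]) / window
            for i in range(len(prices))]

def crossover_signals(prices, fast=3, slow=5):
    """+1 where the fast MA crosses above the slow MA, -1 where it crosses below."""
    f, s = moving_average(prices, fast), moving_average(prices, slow)
    signals = []
    for i in range(1, len(prices)):
        if None in (f[i], s[i], f[i - 1], s[i - 1]):
            signals.append(0)                       # not enough history yet
        elif f[i - 1] <= s[i - 1] and f[i] > s[i]:
            signals.append(1)                       # bullish crossover
        elif f[i - 1] >= s[i - 1] and f[i] < s[i]:
            signals.append(-1)                      # bearish crossover
        else:
            signals.append(0)
    return signals

prices = [10, 10, 10, 10, 10, 11, 12, 13, 12, 11, 10, 9]
print(crossover_signals(prices))  # buy on the rally, sell on the fade
```

The catch the poster mentions applies here too: a signal like this is only useful if you can act on it faster than the market moves, which is exactly why short-interval strategies need API access to an execution venue.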
May 23, 2025

https://www.bullionbypost.co.uk/silver-bars/5kg-silver-bar/umicore-5-kilogram-fine-silver-bar/
May 23, 2025

[replying to "Privacy is the big problem there, 'Free' means you're the commodity."]

Google and also Amazon are after professional users who spend hundreds, if not thousands, every month. The stuff they give away free is limited but works well. Their customers aren't products.
May 23, 2025

"Their customers aren't products."

Their customers are fodder for future products.
May 23, 2025

You should check out their Veo 3. It's insanely good. Well worth the money, but it's not open to residents outside the USA.
May 23, 2025

"You should check out their Veo 3. It's insanely good."

Who owns the copyright to its creations? I can see Veo 3 and the like being useful for a business like yours, if blended with real video elements and segments for the purposes of instruction and promotion. Facial resolution is too soft sometimes, and expressions icky. I say meh again. Needs more baking.
May 23, 2025

I don't know who owns the copyright. I assume it's the customer, who pays about £250 a month for the use of the resources. Another sci-fi-like product is Google Beam. It reminds me of an old work by Isaac Asimov: you can be naked talking to your parents and they wouldn't know. All they see is a 3D you with your normal clothes on.