  #31  
Old Yesterday, 08:29 AM
Carbonita Carbonita is offline
Senior Member
 
Join Date: Aug 2023
Location: San Francisco bay area
Posts: 291
Yes. Some years ago, I bodged together a face recognition camera with a Raspberry Pi, a model, and some code. I figured that building a system helps cut through the hype (kind of like building and fixing bikes?). False/true positive probabilities for each video made it clear that it's just stats and pattern recognition.

Quote:
Originally Posted by jimoots View Post
Generative AI is not all-knowing and cannot conduct research or fact-check; it's a probability engine.
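
As a rough illustration of the kind of Raspberry Pi face-detection bodge described above (a minimal sketch, not the actual setup; it assumes OpenCV is installed, a camera on index 0, and purely illustrative detector parameters):

Code:
# Minimal Pi-style face-detection loop; counting detections per video is the
# raw material for the false/true positive stats mentioned above.
import cv2

# Haar cascade bundled with OpenCV: a cheap, classical pattern detector.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)          # first attached camera
frames, detections = 0, 0
while frames < 300:                # sample roughly ten seconds at 30 fps
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    detections += len(faces)
    frames += 1
cap.release()
print(f"{detections} face detections in {frames} frames")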
  #32  
Old Yesterday, 08:41 AM
MikeD MikeD is offline
Senior Member
 
Join Date: Jan 2015
Posts: 3,103
From the Jurassic Park movie:

Malcolm: Yeah, but your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should.
  #33  
Old Yesterday, 08:42 AM
AngryScientist AngryScientist is offline
Administrator
 
Join Date: Mar 2010
Location: northeast NJ
Posts: 33,908
Quote:
Originally Posted by xnetter View Post
It's also a mega hog of grid power for processing and cooling.
Slight thread drift, but this is something interesting to me.

Recently read this:

https://www.npr.org/2024/09/20/nx-s1...t-microsoft-ai

Re-opening a nuclear plant to power datacenters. Wow. I had no idea these things were SO power hungry.

I don't really know much about the tech; what is it that eats so many MW?
  #34  
Old Yesterday, 09:19 AM
MikeD MikeD is offline
Senior Member
 
Join Date: Jan 2015
Posts: 3,103
Quote:
Originally Posted by AngryScientist View Post
Slight thread drift, but this is something interesting to me.

Recently read this:

https://www.npr.org/2024/09/20/nx-s1...t-microsoft-ai

Re-opening a nuclear plant to power datacenters. Wow. I had no idea these things were SO power hungry.

I don't really know much about the tech; what is it that eats so many MW?
And your electric rates go up to pay for it all (add crypto and EVs to that as well).
  #35  
Old Yesterday, 09:33 AM
verticaldoug verticaldoug is online now
Senior Member
 
Join Date: Nov 2009
Posts: 3,447
Quote:
Originally Posted by AngryScientist View Post
Slight thread drift, but this is something interesting to me.

Recently read this:

https://www.npr.org/2024/09/20/nx-s1...t-microsoft-ai

Re-opening a nuclear plant to power datacenters. Wow. I had no idea these things were SO power hungry.

I don't really know much about the tech; what is it that eats so many MW?
Scale and concentration. 62% of cloud is provided by the big three (AWS, Azure, GCP).

I think AWS has 33 million sq ft of data center space, which is equivalent to about 500 football pitches packed with servers, cabling, redundancy, and cooling. State-of-the-art GPUs are packed so densely that they throw off an enormous amount of excess heat.

AI is part of it, and may have turbocharged the growth, but this has been a thing with cloud providers for a while.

The Nvidia H100 GPU consumes 700 W of power, which is probably equivalent to the average American home. Nvidia plans to ship 2 million a year, so think of it as building 2 million homes.

Zuckerberg just bought 350,000 H100s for Meta. That's 350,000 houses. A pretty good-sized American city right there (and you still haven't bought the air conditioners yet).
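
A quick back-of-envelope on those fleet numbers (a rough sketch; it uses the 700 W peak figure above and ignores cooling and other data-center overhead):

Code:
# Aggregate draw of the GPU purchase mentioned above (illustrative only).
h100_watts = 700                 # peak power per H100, per the post above
meta_gpus = 350_000              # reported Meta purchase
fleet_mw = h100_watts * meta_gpus / 1e6
print(f"~{fleet_mw:.0f} MW for the GPUs alone")   # ~245 MW before cooling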

Last edited by verticaldoug; Yesterday at 09:36 AM.
  #36  
Old Yesterday, 10:41 AM
P K P K is offline
Senior Member
 
Join Date: Dec 2013
Location: Small Lake City
Posts: 104
the future

Seems to me the real risk isn't whether AI will become self-aware and take over, but "bad actors" or nation states using AI (combined with quantum computing) in the future to carry out attacks on their adversaries. Say goodbye to internet security, banking, etc. Those office buildings in Russia and China full of people spreading disinformation are nothing compared to the chaos that "bad" AI could unleash.
__________________
Grumpy Old Shoe cycles
  #37  
Old Yesterday, 04:12 PM
reuben reuben is offline
Senior Member
 
Join Date: Jun 2020
Location: The Land of Pleasant Living
Posts: 5,335
Quote:
Originally Posted by AngryScientist View Post
Slight thread drift, but this is something interesting to me.

Recently read this:

https://www.npr.org/2024/09/20/nx-s1...t-microsoft-ai

Re-opening a nuclear plant to power datacenters. Wow. I had no idea these things were SO power hungry.

I don't really know much about the tech; what is it that eats so many MW?
CPUs and cooling.
__________________
It's not an adventure until something goes wrong. - Yvon C.
  #38  
Old Yesterday, 04:53 PM
bigbill bigbill is offline
Senior Member
 
Join Date: Feb 2006
Location: Hackberry, AZ
Posts: 3,985
Quote:
Originally Posted by verticaldoug View Post
Scale and concentration. 62% of cloud is provided by the big three (AWS, Azure, GCP).

I think AWS has 33 million sq ft of data center space, which is equivalent to about 500 football pitches packed with servers, cabling, redundancy, and cooling. State-of-the-art GPUs are packed so densely that they throw off an enormous amount of excess heat.

AI is part of it, and may have turbocharged the growth, but this has been a thing with cloud providers for a while.

The Nvidia H100 GPU consumes 700 W of power, which is probably equivalent to the average American home. Nvidia plans to ship 2 million a year, so think of it as building 2 million homes.

Zuckerberg just bought 350,000 H100s for Meta. That's 350,000 houses. A pretty good-sized American city right there (and you still haven't bought the air conditioners yet).
700 Watts is half a blow dryer. A house with 100 amp service can consume 11,000 Watts. Voltage times current equals power. You can power 16 chips for what a house consumes.
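
That arithmetic as a quick sketch (using the 110 V x 100 A figures implied above; purely illustrative):

Code:
# P = V * I, then how many 700 W GPUs fit in one house's service (sketch).
volts, amps = 110, 100
house_watts = volts * amps                 # 11,000 W, as in the post above
h100_watts = 700
print(f"~{house_watts / h100_watts:.0f} H100s per house-worth of service")  # ~16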
  #39  
Old Yesterday, 06:02 PM
marciero marciero is offline
Senior Member
 
Join Date: Jun 2014
Location: Portland Maine
Posts: 3,319
Quote:
Originally Posted by bigbill View Post
700 Watts is half a blow dryer. A house with 100 amp service can consume 11,000 Watts. Voltage times current equals power. You can power 16 chips for what a house consumes.
This power thing is a sideline, not directly an AI thing.
As verticaldoug suggested, it's all about scale, and for the most part it's the explosion of the more generic data center processing, which AFAIK is mostly on CPUs. GPU architectures are pretty specific to the array and tensor operations needed for gaming and for deep learning. As the name suggests, they were originally designed for graphics.
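
For a concrete picture of the array and tensor math mentioned above, a toy sketch (the layer sizes are made up; a real model chains thousands of these operations):

Code:
# One dense layer's worth of the matrix math GPUs are built to accelerate.
import numpy as np

x = np.random.rand(1024, 4096)    # a batch of activations (made-up sizes)
w = np.random.rand(4096, 4096)    # one weight matrix
y = x @ w                         # a single matrix multiply
ops = 2 * 1024 * 4096 * 4096      # ~2*m*n*k multiply-adds
print(y.shape, f"~{ops / 1e9:.0f} GFLOPs for this one layer")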
  #40  
Old Yesterday, 06:09 PM
redir redir is offline
Senior Member
 
Join Date: Jan 2007
Location: Mountains of Virginia
Posts: 7,093
AI is very successful with propaganda right now. I'm astounded at the stupidity of a post I'll see on Facebook that is obviously AI, and 90% of the people there respond to it as though it's real. Once it's pointed out that it is in fact not real, it doesn't matter; the AI was already successful at programming human thought. That's the real problem with AI right now, imho.
  #41  
Old Yesterday, 06:28 PM
Permanent socks Permanent socks is offline
Senior Member
 
Join Date: Dec 2022
Location: On my bike
Posts: 189
Quote:
Originally Posted by bigbill View Post
700 Watts is half a blow dryer. A house with 100 amp service can consume 11,000 Watts. Voltage times current equals power. You can power 16 chips for what a house consumes.
My annual electricity consumption is 10,000 kWh. That's 10,000,000 Wh per year, or around 27,000 Wh (27 kWh) per day.
  #42  
Old Yesterday, 09:08 PM
bigbill bigbill is offline
Senior Member
 
Join Date: Feb 2006
Location: Hackberry, AZ
Posts: 3,985
Quote:
Originally Posted by Permanent socks View Post
My annual electricity consumption is 10,000 kWh. That's 10,000,000 Wh per year, or around 27,000 Wh (27 kWh) per day.
Right, at any one time, a typical house with a 100 amp service can handle 11,000 Watts. So in theory, a house could consume 264 kWh each day. The whole issue is the capacity of the grid in the U.S. We want to shift to all-electric homes, EVs, and AI, while at the same time trying to bring renewable energy online. We don't have the capability yet.
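
The peak-vs-daily arithmetic, written out (a sketch using the same 11 kW figure from the thread):

Code:
# Peak service power vs. energy over a day (illustrative only).
peak_kw = 11.0              # 100 A service at 110 V, per the posts above
kwh_per_day = peak_kw * 24  # if the house ran flat out all day
print(kwh_per_day)          # 264 kWh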
  #43  
Old Yesterday, 09:50 PM
rkhatibi rkhatibi is offline
Senior Member
 
Join Date: Aug 2014
Location: SF, CA
Posts: 287
People seem to be confusing capacity with consumption. Here's the power worked out to kWh annually, which should be easier to compare: https://www.tomshardware.com/tech-in...he-coming-year
"This is Nvidia's H100 GPU; it has a peak power consumption of 700W," Churnock wrote in a LinkedIn post. "At a 61% annual utilization, it is equivalent to the power consumption of the average American household occupant (based on 2.51 people/household)."
At 61% annual utilization, an H100 GPU would consume approximately 3,740 kilowatt-hours (kWh) of electricity annually.
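
That 3,740 kWh figure checks out from the numbers quoted (a quick sketch):

Code:
# Reproducing the ~3,740 kWh/year figure from the quote above.
peak_watts = 700
hours_per_year = 24 * 365
utilization = 0.61
kwh_per_year = peak_watts * hours_per_year * utilization / 1000
print(f"~{kwh_per_year:.0f} kWh per year")   # ~3,741 kWh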
  #44  
Old Yesterday, 10:58 PM
HTupolev HTupolev is offline
Senior Member
 
Join Date: Apr 2018
Posts: 330
Quote:
Originally Posted by AngryScientist View Post
I don't really know much about the tech; what is it that eats so many MW?
So suppose you have the text "Once upon a ". What's the next word?

As an experienced user of English, you probably realize that there are a variety of possibilities, of which the most likely is "time".

But how do you know that the most likely is "time"?
Your mind is considering the entire preceding sentence, and perhaps also the lack of stuff preceding that. If there was additional material prior to "Once upon a ", then it's possible that some other next word would make more sense as a result of the context. But this would require recognizing patterns of meaning within that preceding text.

You know how "large language models" are often referred to as having "billions" or even "trillions" of parameters? Those are weights with complicated correlations, in an appropriate structure with sufficient complexity to perform the sorts of guessing operations described above. The weights are "trained" ("machine learning") on stuff like actual large collections of text.

When you give these models some text and ask them to predict the next token*, they infer it by running complicated matrix operations on some or all of the parameters. The number of individual mathematical operations needed to generate just a single fraction of a word is enormous.

Consider what it means when somebody says that a GPU is able to produce 60 tokens per second with some LLM. In many cases, that same modern GPU is able to render a modern video game at 60 frames per second at 4K resolution. So imagine the amount of processing power required to produce more than eight million pixels in 1/60th of a second, where the color of each pixel might involve sampling data from a large number of textures, and running a bunch of very complex lighting math... and now imagine the same processor taking just as much time to guess at the next fraction of a word.

Performing interesting inference sometimes just takes a really large amount of processing power, because it takes a ton of data and operations to capture and reproduce that inferencing.
Smaller models are often much cheaper and faster to run, but they tend to be substantially less capable.

*LLMs don't operate on letters and words as we do. Generally they're converted into a different semantic structure, where some words might be made up of multiple "tokens."
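
To put a rough number on "enormous" (a back-of-envelope sketch; the ~2 FLOPs-per-parameter-per-token rule of thumb, the 70B model size, and the fp16 weights are just illustrative assumptions):

Code:
# Ballpark cost of generating a single token with a large model.
params = 70e9                  # an illustrative 70-billion-parameter model
flops_per_token = 2 * params   # common rule of thumb: ~2 FLOPs per parameter
weight_bytes = params * 2      # fp16 weights read from memory for each token
print(f"~{flops_per_token:.1e} FLOPs per token")          # ~1.4e+11 operations
print(f"~{weight_bytes / 1e9:.0f} GB of weights touched per token")  # ~140 GB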

Last edited by HTupolev; Yesterday at 11:06 PM.
  #45  
Old Yesterday, 11:50 PM
verticaldoug verticaldoug is online now
Senior Member
 
Join Date: Nov 2009
Posts: 3,447
Quote:
Originally Posted by rkhatibi View Post
People seem to be confusing capacity with consumption. Here's the power worked out to kWh annually, which should be easier to compare: https://www.tomshardware.com/tech-in...he-coming-year
"This is Nvidia's H100 GPU; it has a peak power consumption of 700W," Churnock wrote in a LinkedIn post. "At a 61% annual utilization, it is equivalent to the power consumption of the average American household occupant (based on 2.51 people/household)."
At 61% annual utilization, an H100 GPU would consume approximately 3,740 kilowatt-hours (kWh) of electricity annually.

Using Meta's claim for their latest model, Llama 3.1 405B: they say they trained it on a cluster of 16,000 H100s, which took 30.94 million GPU-hours.

I think that works out to roughly 21-22 GWh of electricity over about 80 days.

Using the ~3,740 kWh per-occupant figure above, I think that's the annual electricity consumption of something like 5,800 average household occupants.
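
Working that estimate out (a rough sketch; it assumes every GPU drew its full 700 W for the whole run, which is an upper bound, and reuses the ~3,740 kWh per-occupant figure quoted earlier):

Code:
# Back-of-envelope for the Llama 3.1 training-energy estimate above.
gpu_hours = 30.94e6                      # reported H100-hours for the run
kwh = gpu_hours * 0.7                    # 700 W = 0.7 kW per GPU-hour
days = gpu_hours / 16_000 / 24           # spread across the 16,000-GPU cluster
print(f"~{kwh / 1e6:.1f} GWh over ~{days:.0f} days")        # ~21.7 GWh, ~81 days
print(f"~{kwh / 3740:.0f} household-occupant-years of power")  # ~5,800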

Last edited by verticaldoug; Yesterday at 11:53 PM.