Up to 600 watts of power for graphics cards with the new PCIe 5.0 power connector (igorslab.de)
86 points by jsiepkes on Oct 12, 2021 | 176 comments



That looks for all the world like a Micro-Fit 3.0 drawing, whose contacts are normally rated at 8.5A, but here they're claiming 9.2A each. Also, the largest Micro-Fit terminal is made for 18AWG, but the article mentions 16AWG.

https://www.molex.com/molex/products/family/microfit_30?pare...

I suspect what's happening is Molex is making a new version of the terminal with higher clamping force or better plating, to keep the temperature rise down at higher current, and larger wire grips, to accommodate the thicker conductor. (This will also allow more heat flow away from the contact.)

However, the contact pitch is unchanged from 3.0mm, meaning the cavities in the housing can't grow any more, so the wire insulation thickness will be limited. That's not a big deal electrically since it's only 12 volts, but it's a consideration mechanically since the wires will be less protected against abrasion, pinching, and other damage.
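
A quick sanity check on the per-contact current, assuming the 12-pin layout (6 power, 6 ground) shown in the drawing; a rough sketch, not official figures:

  # Rough per-contact current for a 600W, 12V connector with 6 power contacts (assumed)
  power_w = 600.0
  voltage_v = 12.0
  power_contacts = 6

  total_current_a = power_w / voltage_v             # 50 A
  per_contact_a = total_current_a / power_contacts  # ~8.3 A
  print(f"{total_current_a:.0f} A total, {per_contact_a:.2f} A per contact")
  # ~8.3 A per contact sits under the claimed 9.2 A rating, but above the usual
  # derated Micro-Fit figures, which is why a beefed-up terminal seems plausible.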


The current deratings are down to 5.5A for 12-circuit wire-to-board applications:

https://www.molex.com/pdm_docs/ps/PS-43045-001.pdf

But that rating allows a maximum temperature rise of 30C, it uses 18AWG wire as you say, and it's supposed to be capable of 600V AC/DC; it's not supposed to break down or leak more than 5mA when exposed to 2200V.

All of that is overkill for inside of a PC. If you're willing to limit the operating voltage to 12V and can guarantee that your board side conductors have big, multi-plane copper traces pulling heat out of the contacts to an actively cooled heat sink, you can get away with a lot that you couldn't if you were using the connector as, say, an in-line disconnect in some conduit for a 480V servo drive.


At what point would it make sense to move to a 24v or 48v power standard in PCs? The dc-dc conversion world has improved a lot since ATX was invented.


48V made sense years ago; the only reason we haven't done it is inertia. You're spot-on: the PSU no longer produces voltages that are used directly by components, with the possible exception of 5V for USB and the like. Everything else has a local DC-DC converter, and the "12-ish" rail is just an intermediate distribution standard.

Pursuant to that, the ATX12VO standard is a huge step in the right direction. It ditches the 5V and 3.3V rails from the PSU and uses 12V only. This requires only modest changes to peripherals, most of which were already drawing the bulk of their power from the 12V rail.

I think it's an excellent stepping stone on the way to an ATX48VO or something similar. That will require more changes, because a lot of popular power semiconductors are only rated to 30V or 35V, so there will be some push for a 24V compromise, but I think the efficiency gains of a higher voltage will ultimately win out.
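
To make the efficiency argument concrete, here's a rough sketch of why higher distribution voltage helps: for the same power, current drops with voltage, and resistive loss in the same cable goes as current squared (the 10 milliohm cable resistance below is an illustrative assumption):

  # I^2*R loss in the same cable at different distribution voltages
  def cable_loss_w(power_w, volts, cable_ohms=0.010):
      current_a = power_w / volts
      return current_a ** 2 * cable_ohms

  for v in (12, 24, 48):
      print(f"{v:>2} V: {cable_loss_w(600, v):5.1f} W lost in cabling")
  # 12 V: 25.0 W, 24 V: 6.3 W, 48 V: 1.6 W -- 4x the voltage, 1/16th the loss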


I've always wondered why CPU AIO coolers are so prevalent, but GPUs almost exclusively use air coolers. We're nearing a point where GPUs will have a TDP >2X the hungriest CPU (Threadripper), but the cooling used by most is far less effective.

It's especially noticeable in smaller builds like those using mini-ITX. Most GPU coolers draw air in from the bottom and exhaust out the sides, which doesn't really fit the usual PC case airflow setup of drawing cool air in at the front and exhausting out the back and top. It seems like blower-style coolers are also getting harder to find.

I'm hoping my next PC can have an AIO at the intake connected to the GPU, and a large air cooler for the CPU. That makes way more sense to me than having the AIO on the CPU and an air cooler on the GPU...


The difference is that with GPUs nearly every model has a different layout of components around the actual chip, and those components need to be cooled too, so making a universal cold plate for them can be really challenging. The AIO I bought for my last computer could be used with any CPU, AMD or Intel, released in the past decade. The water block I bought for the GPU I put in my desktop two computers ago could only be used with reference designs of the AMD R9 290 or 290x.

There are GPU models with AIOs: EVGA makes their Hydro series, Gigabyte has their Waterforce series. They just usually aren't worth the cost, since you have to design a new full-cover block not just for every generation but usually for different cards within the same generation too.

Lastly, TDP is not a defined standard and each manufacturer defines it differently. Intel, for example, ties TDP to sustained power at base frequency rather than peak draw. How that translates to actual heat output depends on architecture and process node. Intel's most recent chips all carry a TDP of just over 100 watts, but they're known to need 200+ watts of heat dissipation for sustained workloads.


A big part is that CPU sockets are very standardized, and you are only cooling the CPU, not the memory and power regulators. For graphics cards the cooler needs to cool the GPU along with the memory and power regulators, which can be in different places depending on the manufacturer. That usually means you need a block made specifically for your model of card. You can buy those blocks and do custom water cooling, but the cost is incredibly high. Finally, AIO coolers really aren't better than good air coolers: AIOs tend to be quieter and move the heat to a different location, but on pure performance they don't crush air coolers.


I’ve always thought of gaming as a virtuous low-impact form of entertainment, insofar as it displaces real-world pursuits like foreign ski holidays, gasoline-powered road trips, or land-hogging golf courses.

But burning nearly a kilowatt on a gaming rig feels like a step too far. Sure, the power could be sourced from wind or solar plants, but there’s still a level of excessive conspicuous consumption going on here that doesn’t sit right with me.


Definitely agree that a kilowatt seems very high for gaming and starts to have an environmental impact. Still, driving a car at 100kph takes around 20 kW of power, so driving somewhere is still a bigger burden on the environment than gaming.


That's peak power. You don't run at 100kph straight for hours, but you will run your gaming rig at, say, 500W for three hours in a day (particularly if you're willing to shell out money for such a gaming rig). The total energy per day might not be that different; you'd also have to account for how much is spent commuting in traffic.


My daily commute is 48.6 miles, at about 30 miles/gallon. That's 1.62 gallons per day, which, combining the energy content of the gas (38.685 kWh/gal) and the refining energy cost (4 kWh/gal), is a daily vehicle energy usage of 69.15 kilowatt-hours, plus whatever I use getting lunch. The greenhouse effect of my gaming rig is next to irrelevant.
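
For anyone who wants to check the arithmetic, a minimal sketch of the comparison, using the figures above plus an assumed 500 W rig running for 3 hours:

  # Daily commute energy vs. gaming rig energy (figures from the comment above)
  miles_per_day = 48.6
  mpg = 30.0
  gas_kwh_per_gal = 38.685       # energy content of gasoline
  refining_kwh_per_gal = 4.0     # energy spent refining it

  gallons = miles_per_day / mpg                                     # 1.62 gal
  commute_kwh = gallons * (gas_kwh_per_gal + refining_kwh_per_gal)  # ~69.1 kWh
  gaming_kwh = 0.5 * 3                                              # assumed 500 W for 3 h = 1.5 kWh
  print(f"commute: {commute_kwh:.1f} kWh/day, gaming: {gaming_kwh:.1f} kWh/day")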


Tesla's claimed energy usage is around 180Wh/km.

Running your gaming rig for three hours a day at 500W (1500 Wh) would therefore be the equivalent of driving 8km (5 miles) in a modern electric car. Still ok comparatively.
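
The same equivalence in code form, assuming the 180 Wh/km figure:

  # Gaming energy expressed as EV driving distance
  gaming_wh = 500 * 3           # 500 W for 3 hours
  ev_wh_per_km = 180            # claimed Tesla consumption
  print(f"{gaming_wh / ev_wh_per_km:.1f} km")   # ~8.3 km (~5 miles)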


You really won't run your rig at 500W all the time either. That's also peak power.

Very few people have computers capable of cooling a constant 500W.


>You really won't run your rig at 500W all the time either.

While this is true, I don't agree with your statement about cooling.

It's very possible to have sustained 500W+ draw while playing a modern game on a 3080/3090. Sometimes you might play games for a few hours with a fairly consistent 500/600W power draw - honestly I think it's cheap and easy to cool this kind of load. A few 140mm fans and a mid-level CPU cooler should be enough.


> A few 140mm fans

Yeah... nearly nobody has a few 140mm fans in their computer. Most people won't have any fan larger than 120mm. Powerful computers usually have a pair of them, maybe 3 if you count the PSU.


People who are using 3080/3090's are much more likely to have systems with larger numbers of fans. And they usually have far more than two.


That's not peak power. 20kW is around 30hp. Peak power for an average North American car is more like 150kW(!).


> You don't run at 100kph straight for hours

Plenty of people going on ski trips in Europe will run their car at 130kph for hours.


There are literally millions of kilometers of highway all over the world with a speed limit of 100km/h. I think the total number of man-hours spent at that speed or higher globally would be pretty shocking.

My commute to and from the office alone would be about 90 minutes at that speed.


Do you live 150km away from your office? A super commuter commutes >50 miles one way; that's a little over half of 90 miles (which is about 150km). Again, the point is that what matters isn't peak power or the rate of any quantity but the integral, the total. Most of the time spent commuting is at much less than 60mph.


In Denmark, the limit is 50km/h inside the city, 80km/h on normal roads outside the city, and 110-130km/h (68-80 mph) on the motorway, which is where a lot of people drive most of their commuting distance. So the average speed is likely not going to be much below 60mph.


No, I'm about 60km one direction - I live rural and work in a nearby city. Each direction is still only 40 minutes or so, so it's not as onerous as it might sound.


Stop and go traffic uses more than 20kW of power.


The grandparent gave specific examples of "foreign ski holidays, gasoline-powered road trips". Extra context for those was provided with an example benchmark by the child comment. Original hypothesis was not about everyday car use.


> That's peak power. You don't run at 100kph straight for hours

Have you been to the US? I regularly drive 4+ hours at 125kmh+ anytime I'm trying to get to the Sierra, or Utah or Arizona, which is often.

In fact, almost every weekend I drive at least 3 hours round trip, half freeway half mountain roads, if I go to the closest good climbing. 5+ if I go to my preferred locale. The distances people regularly cover are enormous.


> excessive conspicuous consumption

Energy is a proxy for accomplishment and development. Humanity's #1 goal should be to maximize energy use per capita while minimizing energy use per activity, such that every human is doing as many desirable activities as possible.

Gaming at 1kW for an hour costs 30 cents or less in electricity. As others have mentioned indirectly, going for a Sunday drive to see the changing leaves uses more energy. A space heater uses more power.


>minimizing energy use per activity

Maybe this is what feels off?

A kilowatt feels like too much for a videogame. This isn't a bitcoin mining thing, is it?


If we take a game console or PC running at 250 watts (made-up number), 4 hours is already a kilowatt-hour. A kilowatt-hour sounds like a lot, but in the grand scheme of things, it's not.


My RTX 3070 sits at like 70-80% utilization on average when I play New World. My CPU (5600X) is very low (~25%), but I probably draw ~250 watts or more, and I do not have an expensive setup (no RGB, etc.).


It's a trivial amount of power and even better, we have already electrified all the components! Computers don't require us to go out and replace all the combustion engine vehicles still around. Clean up the grid and that's it.


Why? Your home heating consumes 5-50kW. In colder climates, the power consumption will be compensated by less home heating.


Only if you compare it with an inefficient resistive electric heater. Otherwise, your gaming gear will consume far more electricity for the same heat output than, say, a heat pump.


True but doesn't change the conclusions dramatically, does it? If a heat pump is twice as efficient as electrical heating then you're still displacing half the wattage from your rig with reduced heating.

On the other hand, taking heat into account just makes matters worse in the summer, and I haven't known a gamer who takes half the year off.


An electric heater is ~100% efficient. The power generation is much less efficient.

The heat pump can be effectively more than 100% efficient, but that's because it's moving heat.
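
A tiny sketch of what "more than 100% efficient" means in practice, assuming a heat pump with a coefficient of performance (COP) of 3:

  # Electricity needed to deliver 1 kW of heat into the room
  heat_needed_kw = 1.0
  cop_heat_pump = 3.0                               # assumed COP

  resistive_or_gpu_kw = heat_needed_kw / 1.0        # 1.00 kW of electricity
  heat_pump_kw = heat_needed_kw / cop_heat_pump     # ~0.33 kW of electricity
  print(resistive_or_gpu_kw, round(heat_pump_kw, 2))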


What's an "inefficient electric heater"? Is that a device that generates more light than heat?

Surely all the energy you put into any machine is eventually going to turn into heat? Hmm - perhaps there are heaters that generate heat mechanically, and the resulting losses are sonic.


There are electric heat pumps, which move heat from the ground (or the outside air) into your space and use drastically less energy than electric heaters that convert electricity into heat directly.


Sure; but that's because a heat pump has the potential to be > 100% "efficient" (which isn't real efficiency, in the thermodynamic sense).


Even so, in warmer climates the cost is roughly doubled, since the AC has to pump that heat back out again.


5kW minimum, seriously? Not 5kWh/day?


Probably not worth judging the entire sector by the peak theoretical output of one connector that might be relegated to industrial use.


Lots of things are a huge waste of electricity. Ever use your oven? At 12% efficiency you're wasting over 1kW of electricity.

Unless energy prices rise, it's literally pennies.


This line of argumentation drives me insane, and it's so pervasive right now.

Why should I walk to our friends' house down the road? After all, I drive to work, so why not drive there too?

Why should I use less plastic? After all, I already use some plastic, so why not use more?

Why should I eat less meat? After all, I already eat some meat, why eat less?


I think the environmental problem should be approached like a budget: figure out how much we can spend, and then allocate it accordingly. We have to avoid being 'penny wise and pound foolish' with the environment. If I look at my own environmental footprint, the biggest changes I can make are in my heating, my flying, and my driving, and I am trying to change those things (working from home, and changing my holiday destinations). I will be looking into insulating my house as well. Gaming does not even register compared to those, and the same is true for most people.


It will start to matter if it costs ~3 kWh of gaming per day. Probably even more if this is meant for serious gamers; they will game more than 3 hours per day.

It doesn't matter whether it's Genshin Impact or Animal Crossing, but it definitely starts to matter when your gaming costs almost two orders of magnitude more power than the typical case.


3 kWh a day is about $0.30, or less than $10/month (in Seattle).


How can an electric oven be just 12% efficient?

I suppose:

1. You often have to pre-heat the oven, so during that time, you are not cooking food

2. The oven is rarely full, so instead of heating food, you are heating the oven

3. The oven remains hot after the food is removed, but you could argue that's low-grade heat, unsuitable for cooking

If my electric oven is just 12% efficient, I'd like to know why (and how I can get one that is more efficient). I understand that microwave ovens are much more efficient (I guess because the heat is generated inside the food). But they're no good for browning food, or baking bread.

I'm still lost on this notion that an electric oven is just 12% efficient. If that's true, then I could patent a device that harvests energy from electric ovens, and make a mint.


It's not hard to understand: it's the energy required to heat the food vs. the energy consumed by the appliance.

https://www.energy.gov/sites/prod/files/2015/06/f23/conventi...

TLDR: they measure with a metal block.


> It's not hard to understand

Please don't patronise!

My oven, on full power, draws 2kW (at least, that's what the label on my fuse box suggests). If only 12% of that energy is used to heat the food, then the oven amounts to a 200W food cooker combined with an 1800W electric heater (I rounded down to 10% to make the reckoning easier).

I live in a studio flat; my oven is in my living room. How come I can survive without A/C? But I seem to be able to roast and bake without my home warming up.

My microwave is an old cheap one, with a max. power of 700W. Assuming that's 100% efficient, then it should cook food about 3.5 times faster than the inefficient oven, I guess; and it does. So it does seem that the oven is inefficient. But why? Where is the lost energy going?
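
Working through the numbers quoted above, a rough sketch of where that 12% figure leads (these are the figures from the comments, not measurements):

  # Oven vs. microwave, using the figures quoted above
  oven_draw_w = 2000
  oven_efficiency = 0.12          # the 12% figure under discussion
  microwave_w = 700               # assumed to mostly end up in the food

  oven_into_food_w = oven_draw_w * oven_efficiency     # 240 W (200 W with the 10% rounding)
  oven_into_room_w = oven_draw_w - oven_into_food_w    # 1760 W heats air, walls, and the room
  speedup = microwave_w / oven_into_food_w             # ~2.9x (or ~3.5x with the 10% rounding)
  print(oven_into_food_w, oven_into_room_w, round(speedup, 1))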


Most of the energy produced by the oven doesn't actually go into the food; it heats up the air, which heats up your food (and also the walls of the oven). A microwave, by contrast, pretty much only heats the water molecules in your food.

To roast a chicken to 165f you'll probably set your oven to 400f. And then when it's done, the air and interior of your oven are still very hot. When you microwave food, you can touch the side-walls of your microwave right away and they won't be hot. The only excess heat in a microwave comes from the food steaming and contact with the platter.


> But burning nearly a kilowatt on a gaming rig feels like a step too far.

How much energy does a golf cart use? Because those golfers are riding a golf cart for miles when their legs would do perfectly well (and golf carts are illegal in professional play).


It's really low impact though. Even if your rig consumes a kW, you're still not driving anywhere (in a car that consumes at least an order of magnitude more), or going out to a restaurant (that's air conditioned), etc.


> It's really low impact though. Even if your rig consumes a kW, you're still not driving anywhere

If Google didn't fail me, good car mileage is equivalent to around 20kWh per 100 miles. Therefore, a 100 mile drive is loosely equivalent to a day or two of gaming with one of these rigs.


Doing the math based on the energy content of gasoline gives 96 kilowatt-hours per 100 miles for a car getting 40 miles per gallon, going up to 106 if you include the energy cost of refining a gallon of gasoline from oil.
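
The arithmetic behind that, assuming ~38.7 kWh per gallon of gasoline plus ~4 kWh/gal for refining (the same figures used elsewhere in this thread), over a 100-mile drive:

  # Energy for 100 miles in a 40 mpg gasoline car
  miles = 100
  mpg = 40
  gas_kwh_per_gal = 38.685        # energy content of gasoline
  refining_kwh_per_gal = 4.0      # energy spent refining it

  gallons = miles / mpg                                        # 2.5 gal
  print(gallons * gas_kwh_per_gal)                             # ~96.7 kWh burned as fuel
  print(gallons * (gas_kwh_per_gal + refining_kwh_per_gal))    # ~106.7 kWh including refining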


Exactly why I decided to comment, only to find you already touched the topic :)

This is getting out of control!

Nvidia/AMD: build more efficient GPUs!


If you crunch the numbers on how much solar (Desertec has a fun graphic on this for the interested) and wind power we could produce if politicians actually cared about fighting climate change, there is a nearly infinite supply of power for humanity.

This isn't even including fossil fuels, nuclear/fusion, biomass/trash, and hydro.

https://en.wikipedia.org/wiki/Desertec#/media/File:Fullneed....


Lots of sunlight, but limited amounts of manufacturing and rare earth metals for solar and limited amounts of energy storage for times when solar and wind aren't available. Solar and Wind are terrific, but they're not a free lunch. What drives down the price of energy, drives up the cost of wind/solar/batteries.


When you double battery production over 5 years, prices go down, not up. It's a function of economies of scale.

It's one of the major reasons EV ranges keep increasing: the batteries are less expensive.


Why so negative? I think gaming is the future. We can heat houses and have fun, if only we were able to store the heat. Why make better chips when the users don't care?


Rest assured that almost no gamer will have a card that uses that much power.


Any human endeavor consumes energy.


Sex is good for 100 calories per minute. Google tells me that's a bit over 5 watts.


100 kcal per minute ~= 6973 watts, so the rate you quoted works out to about 7kW, not 5 watts. Multiple sources say it's more like 100 kcal per hour, not per minute.
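
The whole unit mix-up in one place, as a small sketch:

  # calories vs. kilocalories, per minute vs. per hour
  J_PER_KCAL = 4184

  per_minute_w = 100 * J_PER_KCAL / 60      # ~6,973 W (100 kcal per minute)
  per_hour_w   = 100 * J_PER_KCAL / 3600    # ~116 W   (100 kcal per hour)
  small_cal_w  = 100 * 4.184 / 60           # ~7 W     (100 "small" calories per minute)
  print(round(per_minute_w), round(per_hour_w), round(small_cal_w, 1))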


I wonder how much energy was spent getting to that 5 watts.


Time, my friend, time. Women need time invested in them. Not money, not energy. Time.


You lost "kilo" when querying. Or, more likely, your source lost it.


Assuming you can install two GPUs using 600W, that's basically running a hair dryer in your desktop since most of the energy is converted to heat.


Why is this being downvoted? If you have a kilowatt (I wanted to type that out instead of write KW to emphasize it) coming out of the wall that energy has to go somewhere. Literally warm your room during winter and turn down your heater.


And make your AC work even harder during summer or if you live in a tropical/desert/…etc. climate.


I've idly been considering ducting my desktop's exhaust outside like a clothes dryer vent.

My previous computer didn't run very hot at all so the exhaust fan on the top blew onto the underside of my desk and then at me, it was actually uncomfortably cold in the winter. This new one is something else, though.


This got me thinking. (I must be missing something or have a wrong assumption in my high-level overview of the issue.)

Do crypto people use their GPU-generated heat as a heater, assuming they live in a place that requires heating anyway?

Doesn't that equate to free money, assuming they could actually mine crypto?


Heating with a heat pump or gas is significantly cheaper, so if you look at the big picture, it's not "free money".


Air-source heat pumps have a lower limit on their operating temperature, so they can't be used efficiently in cold climates (around < 25F), since their output and efficiency drop off sharply and many fall back to resistive heating. Geothermal heat pumps solve this, but cost more.

Gas is somewhere around half the cost of electricity. Back in the day, you would still profit with a GPU.


Yes. That's one of the benefits.

Some people use electric heating even without crypto so using GPUs to mine crypto to get the same effect is a no brainer.

Problem is that during the summer you have to get rid of the excess heat, so you need air conditioning or perhaps open air cooling.

Not gonna lie, I've used my GPU to heat up my small room before.


It's not a "no brainer" if you include the cost of the GPUs. You need to do at least a little math to figure out whether it's actually cheaper that a simple resistive heater (or a heatpump).


Pretty sure that he was implying that if you already have a GPU then you should mine during the winter instead of using conventional heating.

That's how I read it at least.


> so you need air conditioning or perhaps open air cooling.

Or icebergs - Reykjavik is a popular place to build data centres, so I have heard. They also have good geothermal in those parts.


I'm no expert, but coupled with solar panels you get more power during summer to run the AC.


Check out this video on using the heat from computing beneficially.

https://www.youtube.com/watch?v=hNytmvltsWk


All of the energy is converted to heat. Some energy from your rig is emitted as light; but that's your monitor, and your monitor has its own power supply (which means that 600W is an underestimate).


Well, all the energy is eventually converted to heat.


The RTX 3080 already uses 320W (peak?). By the next GPU generation, who knows.

Good lord, just two generations back, GPUs like the 1070 Ti consumed 180W. Even if you add a CPU of that era, like the i7-7700 with its 65W draw, that's just 245W. Now a video card alone can draw more power.


I have a 2x RTX 3090 rig under my desk. At close to full GPU load, the temperature at the desk is about 3F higher than in the rest of the room.


It helps save on the winter heating bill if you live in Wisconsin.


Most? Where does the rest go?


Why does the author/publisher feel the need to overlay their own logo over drawings they copied from someone else's datasheet?


To hide the fact he's violating Amphenol ICC's copyright by using their images and stripping their copyright notice and publishing info off of them.


If someone wants to copy and paste his article, they’ll at least have to go to the effort of finding the images themselves


But it clearly suggests it's his work. It isn't. Let the original author worry about how he protects his own work.


I understand where you’re coming from, but just finding things can be a lot of effort. If someone wants the images with no watermark, they can look them up


Oh, so I can go to your blog or science article with those nice, detailed posts, take all the photos you made yourself, and then make a shitty article on the Internet out of your hard work. Just in case, I'll slap my own watermark on them, so it will look like I did it myself, and I won't credit you as the author.

Apparently this sounds fair to you!


You're posting this on a site where the admin encourages users to post archive links to bypass paywalls. Wrong place for arguments that content creators get their fair share.


Everyone tries to be fancy with wiring exotic networking in their homes, but the real performance kings are installing NEMA 14-50R in their offices right now.


Anecdote: A relative of mine who does home remodeling told me that over the past year most of his clients plan for dedicated office rooms now, rooms that would've been planned as guest rooms and whatnot before. They're also running 12-2 wire with 20 amp breakers to these rooms now. So yeah, not quite the laundry plugs you're describing but getting closer.


It's funny because here in Europe, power plugs rated for 16A (some are 20A) are actually the norm: at 230V they can deliver 3600W/4600W!


> power plugs with 16A [...] at 230V they can deliver 3600W/4600W

That makes me a bit sad. We used to have a mess of plugs and sockets, with older plugs being either two round non-insulated pins or two equal-sized flat pins (sockets that could fit both kinds were not uncommon, but sockets which could fit only one kind were also not uncommon), and a Y with three flat pins for air conditioners, with NEMA 5-15 also being popular for computers (and we had sockets which could fit both the NEMA 5-15 "computer plug" and the other two types). We switched the whole country to a really neat new plug type (NBR 14136, see https://en.wikipedia.org/wiki/NBR_14136 for a few pictures), but unfortunately it's only 10A (or 20A, but that's mostly for higher-power appliances like air conditioners; nearly all sockets are going to be the 10A variant, even though the 20A socket is designed to work fine with 10A plugs).

Our voltage can be either 127V or 220V depending on which city you live in (yeah, both voltages use the same plug, but that was already the case even with the older plugs), and on how your building is wired (even in places with 127V you can get 220V by wiring phase-phase instead of phase-neutral), so it's not unusual to be limited to 10A at 127V which is around 1200W only.


I wonder how hard it is to do a bottom-up conversion of the USA to 230.

If the "all-electric" branch of the environmental movement are right, we'll eventually need to adopt the superior European kettle technology. Getting everyone on the same plug would be great.


Nearly every home in the US has 220 available already (240V split-phase, nominally). Common outlets use one 110V leg of it, but many large appliances (and EV chargers) already run on 220.

Worse kettle performance is unfortunate. But better safety is a nice side effect (in particular as the NEMA connector the US uses is dangerous/poorly designed/bad, even with the ground pin).


If houses in the US are 240V, then houses in Europe are 480V (or rather 400V, because it's usually three-phase).

The voltage arriving at the building is kind of irrelevant to the discussion. The question is whether the existing house wiring could even sustain running larger appliances at double the voltage it was designed for.


> The voltage arriving at the building is kind of irrelevant to the discussion.

My post talked about voltages arriving at appliances in US homes. Many appliances are designed for, and in fact receive, 220 V. There's even a common (i.e. in nearly every home) plug/wiring/breaker standard for 220 V.

You're the one trying to steer the discussion towards "arriving vs. using", whereas the first line of my post was expressly about the active use of 220 V in nearly every US home today (tumble dryers, ovens, central heating, EV chargers, shop equipment, etc.).

> The question is whether a connection could even sustain the house network at double the voltage with larger appliances than it was designed for.

I don't understand what you're trying to say. A full end-to-end circuit has to be built for a specific voltage and amperage. If you want to turn a 110 V circuit into a 220 V one, you have to re-wire it from the breaker panel; the existing breaker, outlets, and devices on the circuit aren't rated for it.



I haven’t watched the video yet, so I don’t know if he addresses this, but: considering many devices with a switching PSU support “universal” voltage, plugging a US style plug into a 220/240V source would probably be fine.


Imagine how many transformers you are talking about replacing. You'd also have to redo everyone's electrical panel.

It is substantially easier to run additional customer connections for those edge cases where an existing 200A residential service will not suffice.


I was gonna say... I didn't even know overloading a power socket was a problem people had.

Sometimes we'll run 5 gaming PCs + 5 monitors from one socket and it's always worked fine.


I've melted extension cords at LAN parties before... There is always going to be a point where the number of electrical contrivances trips a certain threshold and starts causing problems.

If you've ever been to dreamhack or one of those gigantic lan gaming conventions, you would probably have noticed the massive power distribution transformers staged every 300 feet or so on the floors.

I honestly don't know how they would manage power delivery if every single person was running a ~1kW gaming load.


This looks like some abomination of the ancient Molex drive connector crossed with part of an ATX main connector.

At which point does the GPU get its own dedicated power supply?


Why would it ever? It's much more efficient to scale up the one AC mains voltage -> 12V DC supply you already have.

You think in datacenters they are just stacking up PSUs? No, all of this stuff gets vastly more efficient and reliable if you can build just one big PSU to supply say 48V DC and distribute that everywhere.


My understanding is that most normal servers have their own PSUs, and often a hot spare. But that's not really relevant.

The question was more rhetorical than serious.


We seem to be reaching the point where I expect it'll soon make more sense to consider the GPU as the heart of the computer and the CPU as a peripheral.


Historically typical GPU power draw has exceeded CPU draw for a really long time. In terms of peak compute throughput the GPU usually has an advantage in any gaming-focused machine, and even general productivity ones. A developer workstation with a Threadripper probably evens things out more.

A GPU these days can often do general-purpose compute, but not in a way that lets it meaningfully compete with the CPU for heavily branched or random-access workloads, since its memory controller and execution model are a poor fit. I'm not sure we'd ever see GPUs fully displace CPUs for that reason.


The CPU is a peripheral ;-)

(See Slide 5): https://papers.put.as/papers/firmware/2014/2014-10_Breakpoin...


Isn't the Raspberry Pi already there? IIRC, its main core is the GPU, and the ARM cores are one of the peripherals.


I may be wrong, but on some mobile SoCs (perhaps Qualcomm?) isn't the modem the first chip to come up, with the CPU as a peripheral?


The real question is at what point do gamers start choosing lower power parts because the GPU is too power hungry? Do we get into a world where professional gamers start having 240V lines installed for their PCs?

A US power outlet is typically rated at 120VAC @ 15A; that's an 1800W budget if the only thing on the circuit is a PC.


Well, the higher power draw cards tend to be more expensive, so in practice budget-focused gamers are already buying lower power parts.

AMD and NVidia have at times competed on power usage, but generally that doesn't sell units to the degree that game support or new features do.

For mobile you start getting traction there because '<vendor>-based laptops have an hour of extra battery life' is meaningful, but gamer laptops are typically designed to be plugged in. The higher power draw typically means more heat and louder fans as well but for some reason noise doesn't seem to be a consideration for most gamers - every gaming laptop I've ever used was unbearably loud.


Will Smith (of Tested/Maximum PC) mentioned in one of his recent podcast episodes [1] that GPU vendors tried to add dedicated power supplies to cards, with their own AC connection, a while back (mid 2000s or so if I remember right). Consumer/industry pushback killed it before they hit shelves and the vendors focused on efficiency instead. I don't see the same pushback happening these days.

[1] https://techpod.content.town/


I think the mad 4-processor 3dfx Voodoo 5 6000 that was never released used an external power supply, at least on some PCB revisions.

I'm not sure if any released graphics card has used its own power supply. I think they all used some sort of internal connection to the standard power supply if the slot couldn't power them alone.



> At which point does the GPU get its own dedicated power supply?

This has been true in many machines for several years already, if you look at the machines the hard-to-find GPUs are actually in.

I wouldn’t be surprised if that is a market force behind this new PCIe standard.


Yeah, I was recently looking for an ATX motherboard extension cable to hack apart for an evening project, and all the results were for splitters to connect extra PSUs to the mobo's Power On signal for mining rigs.


Right, precisely. It's surprising how much people don't want to hear it.

It's like an acknowledgement of reality is seen as an endorsement of that reality.


I'll be honest, it's very hard for me to imagine what I would do that would demand anywhere near 600 watts for a graphics card alone. I mean my PC can draw at most 120 watts and that feels like a lot, although the PC is pretty old by now. All of this for crypto? How many games are there out there that draw anywhere near this level of wattage just for graphics?


> How many games are there out there that draw anywhere near this level of wattage just for graphics?

Pretty much any “AAA” game at 120+Hz, high resolution, with image quality cranked up.

I've had friends who managed to snag 3090s at prices they found acceptable (sic...), and had to change PSUs because theirs were a tad short (despite being theoretically OK going off of nameplate capacity, though it's also possible the rail distribution was off). They were rebuilding gaming rigs, no crypto there.

Technically the 2x8 / 1x12 setup of the 3090 is off-spec; these are not officially supported PCIe power configurations.


I do photogrammetry. That's one use that will max out the entire system (not just the graphics card) for hours on end that has nothing to do with games or cc's.

Just as an example.


What in particular are you doing with it? I have the feeling my question may be a bit light on details for you to formulate a great answer, but are you doing things like test renderings of an interpolated viewpoint given multiple integrated perspectives? Edge detection? Object "identity" detection, i.e. object (x, y, z) in perspective W is (a, b, c) in perspective D? That sort of thing?

I could see how that could end up getting very computation intensive very quickly! Building up the 3-D space, tagging spots where the math doesn't quite work out... Compensating for different lighting conditions from different pictures at different angles to get just the info you're looking for!

Or am I completely off base?


There are programs that do all of this for you. I just take the photos and use the models in other things.


600W worth of photogrammetry?


If you're doing it for work, the faster it goes the more money you make. So it's not a question of 600W of photogrammetry, but 'how fast can we make it' and the resulting electrical load.


> 600W worth of photogrammetry?

Yes, no doubt. Especially nowadays, when point clouds are all the rage.


You ask a question, you get an answer and then you doubt the answer. That's not nice.


It's also an invitation to brag about their work! It's an incredible claim with a factual answer, and it's rare you get to show off about your niche like that, right?


Whatever your problem is, if it's remotely calculation-heavy you can draw as much power as the hardware will let you (if the software handles it).


If I don't want to wait for 200 hours at a time, yes sure.


Just built a system that I managed to get a 3080Ti for at MSRP. I’ve spent a lot of effort to have it be silent (and extremely cool) with just air cooling, even when it does play games.

With a 5900X (slight overclock for max boost at 5.1GHz), some lightly overclocked 64GB of DDR4 memory, two 27” 1440p displays, a phone dock, KB, mouse, web cam and condenser mic it idles at about 150W.

General productivity, like development with a few VMs spun up, and maybe watching a 1080p video in the background? Maybe 180W.

Streaming a game. So I’m capping myself at 1080p60, which does not stress the 3080Ti with all settings maxed out. Two LED light rings also on, plus an LED pendant lamp. Now we’re squarely in 350-400W land for the entire setup.

Not streaming. 1440p, capping at 120fps (displays go to 170Hz). If settings are cranked up on a very recent title and I’m GPU bound, this is where the GPU alone can actually hit 400W by itself in some scenes (usually 280-350W) and the total power consumption for the room maxes at 600-650W.

----

There are 6 140mm PWM case fans, the CPU has two 150mm PWM fans, and the GPU has three 92mm PWM fans. The case is all steel with sound-deadening foam on both side panels. In every setting it's silent unless I'm "going for broke" at max quality to average 120fps, where it'll then produce a "light whirr" once it's heat-soaked, which I've measured at 24.8dBA at 3 feet (~92 cm). GPU peaks at 76C, GPU memory peaks at 70C, CPU is around 65C, and system memory at 35C. All air cooled.

----

Here’s the thing. Come next year, it’ll be a 32-core Threadripper with 256GB of memory and a NVMe RAID 10 array for real work with Spark without constantly paying by the hour. The CPU, storage, and system memory alone will probably pull 500W under max load factoring in the displays, KB and mouse. Still will idle close to 165W though.


Gotta think about the scale here. 4k gaming pushes 8 million pixels, 60 times per second. Those pixels are being calculated from light bouncing off of meshes, each formed of 10,000-100,000 triangles. More triangles makes your objects look smoother and rounder. Each triangle also has texture information that dictates its color, as well as some specific properties such as reflectiveness. To achieve highly-realistic video games (and CGI), you simply have to have that level of complexity. Even then, the way light is calculated in video games is not realistic at all, it is done that way for significant performance/efficiency gains. Ray tracing is now being added, which is how light behaves in the real world, but that cranks the computational complexity through the roof. That's why it takes thousands of CPU cores 25 hours to render one single frame for a Pixar movie, they go for realistic physics/light.
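
Rough scale of the pixel throughput being described, for 4K at 60 and 120 Hz:

  # Pixels shaded per second at 4K
  width, height = 3840, 2160
  pixels_per_frame = width * height                 # ~8.3 million
  for fps in (60, 120):
      print(fps, "Hz:", f"{pixels_per_frame * fps / 1e6:.0f} million pixels/s")
  # ~498 M/s at 60 Hz, ~995 M/s at 120 Hz -- before ray tracing, with every pixel
  # shaded against geometry built from tens of thousands of triangles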


Yeah, just wow. Here's what I run fine even under load behind a 600W UPS:

  * 15-20 arm64 SBCs 
  * 5 Ryzen nodes
  * a handful of HDDs and a decent number of SSDs
  * an enterpriseish 48-port PoE switch using 10GBE on some ports
  * 5 x86 SBCs (routers)

That's a crazy amount of energy for one GPU used for gaming.


I doubt it's for crypto though. Running two cheaper 300W GPUs would be more cost-effective. Also, serious miners have already moved on to ASICs anyway.


Seems to depend on the coin. There are lots of coins to mine/etc with a gpu.


Ethereum is mined with GPUs. The total mining rewards (block rewards + fees) are actually marginally higher for Ethereum than Bitcoin, with both around $51M per day


4K gaming at 120FPS with ray tracing would easily push past 300 to 400W, even on a hypothetical 5nm GPU.


Ok, this is a tangent, but this kind of stuff irks me. Is there no love anymore for making things pretty enough but actually keeping things efficient? Do game devs (even AAA people) not like the family of things like 0x5F3759DF anymore?

Efficiency aside, simply pushing the power requirements doesn't even bring home the bacon. Nintendo's "lateral thinking with withered technology" works super well; just look at the Switch compared to the other consoles of its generation. I know PC gaming is different, but even there people dig streams of Undertale over Cyberpunk (although Cyberpunk had issues...), so not everything is about pushing power requirements.

Literally, if all that matters is "realism" and "power" and pushing that boundary is an end in and of itself, well, that boundary is already being pushed by others. Maybe not with a GPU, but I work in HPC and I'm sure as hell my simulations draw more than 600W. So this is a push for power in games specifically, and that just starts to draw out my boomer side and make me wonder why people don't just go back and optimize their code first.
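
For readers who haven't run into 0x5F3759DF: it's the magic constant from the famous fast inverse square root trick. A minimal sketch of the idea in Python (the original was C bit-twiddling; this just reproduces it via struct):

  import struct

  def fast_inv_sqrt(x: float) -> float:
      # Reinterpret the 32-bit float's bits as an integer (the Quake III trick)
      i = struct.unpack('<I', struct.pack('<f', x))[0]
      i = 0x5F3759DF - (i >> 1)            # magic constant gives a first guess at 1/sqrt(x)
      y = struct.unpack('<f', struct.pack('<I', i))[0]
      return y * (1.5 - 0.5 * x * y * y)   # one Newton-Raphson step refines it

  print(fast_inv_sqrt(4.0))   # ~0.499 vs. the exact 0.5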


I think these are three different sets of questions.

1. Should games really be pushing graphics-heavy realism vs. Nintendo-style games?

Well, first, I absolutely love Nintendo and their games. But the answer is, first, that Nintendo's target market is very different from that of PS5, Xbox and PC. I am sure Battlefield fans just don't think it would work with Nintendo-style graphics. Second, it really has nothing to do with game dev and more to do with a game's priorities, i.e. the question becomes one of design / market fit / choices rather than a technical one.

2. Keeping Games Efficient.

AAA games, and in fact nearly all games, are efficient, comparatively speaking. Just look at how browsers, some of the most optimised software around, are still learning lessons from game dev. Games use as much power as they can to push the limits on graphics and physics. That is not an efficiency question, but a question of how far they can push the boundary. Remember, 4K is FOUR times the resolution of 2K, and 120fps is TWO times the normal standard of 60fps; that is 8 times the complexity alone, excluding any additional high-quality assets or modelling, and I haven't even put ray tracing into that equation. Even if you assume 600W gives you double the performance of 300W (which it doesn't), you can already see the disparity between complexity and performance improvement.

Finally, the third point I want to mention: hypothetically speaking, ray tracing will lower the cost of graphics assets, which are by far the largest cost centre for modern games, since designers no longer have to do all sorts of special tricks for effects, saving them time. Although I imagine in reality that saved cost and time will just be spent somewhere else on even better graphics.


No comment on 3.

On 1, the point is that graphics and what you call game dev aren't really that necessary to sell or to be fun. It was in response mostly to people who for some reason think they are, especially those who think that "Battlefield" or "The Last of Us" games are a different category of game. Idk, that sort of thinking is a big problem, part of why games have moved away from being fun, and what led to the rise of "press F to pay respects" as a meme. There is too much emphasis on a very American idea of what makes a game good (graphics, intricacy or complexity, and too much storytelling) versus what actually works without fail: just being fun. The thing is, making something fun is actually pretty difficult because it requires ingenuity and creativity, while the rest just requires grinding along an already-known path. Again, I said this was a tangent because it is; I'm just tired of modern games plain not being fun and being too geared towards being interactive movies with eye candy. The enabling (requiring) of 600W cards for a video game is just yet another result of this slide in the larger game dev space, which is why I bring it up.

On 2, I know you're saying comparatively, but really, just compare to games from 10 or 20 years ago. I remember game developers who needed to cut corners saying they don't worry about handling memory for strings "that are just a few megabytes anyway", and sure, maybe you don't need to worry about that today, but by an absolute scale gamedevs are neglecting things like string processing[0] whilst caring so damn much about graphics, so you can't argue that things aren't more inefficient now. I get your points about 120Hz 4K though, though honestly, if these are the power requirements, then I'm starting to think 120Hz might make gaming an activity that is bad for other reasons, particularly climate change, as is argued in other threads in this comment section.

Anyway, the general reason for this rant is, again, that AAA game devs are approaching a state similar to the JS community: instead of being caught in a cycle of churning frameworks, they're caught in an arms race to make things bigger, more "realistic", and ever more resource-consuming, whilst getting their lunch eaten by (relatively, in AAA's eyes) garbage-looking things like Fortnite, Animal Crossing, and Undertale, made on vastly lower budgets. I want them to realize that these massively successful games, which aren't in the vein of what you think of as AAA, cut through the fog and should make studios with gobs of money realize what actually makes games good, and thus what will actually make them sell.

[0] https://nee.lv/2021/02/28/How-I-cut-GTA-Online-loading-times... [1] https://www.reddit.com/r/Undertale/comments/4auo3n/putting_u...


Gamedev is still full of clever people pushing graphics cards to their limits. The kinds of tricks are different nowadays, as most of the work is done on the GPU, and the important thing is to keep it busy with highly parallel, branchless code with good memory locality.


My point is that being embarrassingly parallel instead of being efficient and smart isn't actually being clever; hence why the phrase is "embarrassingly parallel." That said, I often need to get work done (publish papers), so I opt for that over being more clever in my code.

I sort of looked up to gamedevs because they seem to accomplish a good experience without having to model everything, and it's an attitude I've fantasized about bringing to my work. It sounds, however, like they are moving in our direction, which is disappointing I guess.


4k 120hz isn't efficient? You'd see higher but we don't even have higher display connection specs yet.

You're also conflating graphics, game design, and hardware design. These are not the same groups. Every pillar is pushing forward. There's no zero sum game there.


For a game dev they'd rather release a broken game and start selling sooner than spend time and money polishing it and not selling copies. Gamers don't buy games because the code is performant.


OK, you're explaining a problem, though (Cyberpunk was definitely a problem), albeit matter-of-factly, but examples clearly show that lack of performance hurts sales. You're right, people don't buy games because they are performant; they buy them because they are fun, but if your game is unperformant enough then it won't be fun... ergo sales sink.

By virtue of how expensive 120+Hz equipment is (not just the monitors but everything else), fewer people can afford that stuff, and if you're limiting your $60 game to just those people, that also reduces sales. Again, the logic isn't there, and you won't beat Animal Crossing: New Horizons in earnings, so even the economics isn't there. The logic is totally wrong.


Nobody is limiting their game to people with absolutely high-end setups.


> All of this for crypto?

ML training


> How many games are there out there that draw anywhere near this level of wattage just for graphics?

Well, since it's the brand new, top-of-the-line connector, hopefully very few current applications will max it out.


90% of this for crypto, the rest for ML and games


This is long overdue, and something I have been questioning since 2016. (Our GPUs are severely TDP limited.)

It is also interesting that on one hand you have Apple pushing CPU and GPU integration for maximum efficiency, where Apple could make a potential 300W TDP SoC for their Mac Pro, while on the other hand you have CPUs pushing to 400W (and higher in the future) and GPUs pushing to 600W.


I wonder what the MSRP for a 3090 Ti will be, and what the actual price in stores would be. Even mid-tier GPUs like the 3070 cost a lot of money.


The 3080 Ti is north of $3k, so...even more north of that.


This is insane. Give me back the 75W GPUs instead. No extra power connector, just the slot.


APUs/integrated GPUs have taken up that role.


But I don't want a shit CPU. I can buy a 65 W 8C/16T CPU from AMD (which I need for more productive purposes). Why can't I pair it with a 75 W video card? Should be enough for 1080p gaming.


It's been a few years since the release of the GTX 1650 but it's still a pretty serviceable card. Cheap, too.


That's the point. Where is the 3050?


They'll probably only put out high-end cards until the chip shortage clears up a bit.


We're going to see laws dictating max wattages for consumer computers real soon.


California's 'Energy Consumption Tier 2' is basically that, but I believe it currently applies only to prebuilt systems.


It doesn't make any sense. Why do you need to have 600W to power a graphics card? Why do they need an independent power connector?


I'm not sure I understand your question, but I will try.

>>Why do you need to have 600W to power a graphics card?

You don't. There is no GPU currently that needs that much. But as cards are comfortably approaching 400W and more, a new connector is necessary so you don't end up with GPUs taking 3 or 4 existing PCIe 8-pin power connectors. This single 55-amp compatible connector allows for significantly easier routing of paths on circuit boards.

>>Why do they need an independent power connector?

Because the PCIe slot alone can only provide 75W of power. And if it could provide more, you'd still need to deliver that power to the motherboard, so you'd just be moving the connector from one place to another.
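
A rough sketch of the power budget math behind that, using the commonly cited limits (75 W from the slot, 150 W per 8-pin connector) as assumptions:

  # Why high-power cards sprout multiple connectors
  SLOT_W = 75           # PCIe x16 slot power limit
  EIGHT_PIN_W = 150     # per 8-pin PCIe auxiliary connector

  for card_w in (250, 400, 600):
      extra_w = card_w - SLOT_W
      connectors = -(-extra_w // EIGHT_PIN_W)    # ceiling division
      print(f"{card_w} W card: {connectors} x 8-pin on top of the slot")
  # 250 W -> 2, 400 W -> 3, 600 W -> 4 -- vs. a single new 12-pin rated for ~600 W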


This is still nowhere near as elegant as Apple’s MPX connector which eliminates the extra cabling altogether.


It's not, because if you plugged a device needing 600W using an MPX connector, it wouldn't work at all, since MPX can only do 500W.

Here's the thing. When you have a lot of amps, they create noise in wires next to them. They'll also need thick wires and will get hot. When you have a lot of volts, they'll go through shielding and start a fire.

Do you really want to place something with a lot of amps and volts next to something that's transmitting timed high-bandwidth data, like a graphics card? Or run the entire output of your big PSU box that needs a bunch of fans, through the motherboard? For a tower form factor?

The single slot for power and data is convenient for phones and laptops. Not something that's 600W and is going to power a space heater in its next kilowatt+ iteration.


Given the RF noise and the heavy-duty electrical engineering, wouldn't it be better to isolate the GPU in its own external enclosure?


I think the point is you want the power run separately from the data. And you want the data lines to be short to reduce noise and loss.


The RF noise from computer components can be taken care of by regular metallic shielding, because the currents involved in actual compute operations are low; an inch of space on its own is an effective barrier. This is very different from having a wire carrying enough amps to power a space heater located next to your data transmission lines. Even with the GPU in its own external enclosure, you would not want the data cable bundled with the power cable for anything that pulls over half a kilowatt.


That connector is really long and wouldn't fit in a number of cases (ITX PCs, servers...). For carrying lots of current, thick cables are just better than PCB traces (especially on PCBs that are mostly designed for signal integrity, not carrying power).


I would rather pay for a $10 cable than 100x that for a motherboard


600 watts? That's one bar of a two-bar electric fire; in my youth, that two-bar fire was the only thing keeping some families warm.

I mean, that power all gets turned into heat anyway in the end; so sure, if you need more heating, forget the two-bar fire, and buy a graphics controller instead - playing games with the two-bar fire isn't a good plan.

I don't get why they can't make efficient GPUs. I mean, I do get that graphics depends on computation, and that all computation has an intrinsic minimum energy cost; but half a kilowatt, to make a moving picture? That's more than the power supply for my gaming rig (which I retired, because the fan noise was excessive).


> I don't get why they can't make efficient GPUs

They do make efficient GPUs, see smartphones. Even current desktop cards are efficient considering their computational power, but they are increasingly tuned to run with higher power budgets. Turns out people care more about FPS than power consumption I guess? Also consider that winning the performance crown has a halo effect for the rest of the portfolio, so the performance / high-end models are tuned towards that.


I agree it is a little out of control.

Some may marvel at the incredible power available to consumers now, but I've been watching with disappointment at how each new generation is only very slightly better in the performance per watt metric. Most of the new compute power is coming from increased energy draw. In an energy-conscious environment it's quite hard to justify.


GPUs are efficient. That's why (some) people get benefits from throwing lots of power at them. If there were a smaller difference between a 200W part and a 300W part, fewer people would upgrade.


I do appreciate that!

It just seems seriously counter-intuitive to claim that spending 600W to put a moving colour picture on a screen the size of a bed-pillow counts as "efficient".

[Edit: obviously it doesn't take 600W to play back an HD movie - the energy is being spent on game computations, or hashing, or something like that]




