Why do PCs produce waste heat?

Discussion in 'Computer Gaming Forum' started by Conker2012, Jul 18, 2013.

  1. Conker2012

    Conker2012 Intrepid Member

    Joined:
    Oct 19, 2012
    Messages:
    687
    Likes Received:
    78
    Considering how we're all supposed to be so aware of global warming, it's always surprised me that no one ever seems to question the fact that PCs kick out so much excess heat, especially since at the moment (when the weather is *very* warm around here) the PCs are making things almost unbearable in the office...

    I mean, the excess heat from a PC (be it desktop or laptop) is waste heat. And waste heat is a sign of bad design (so I've heard): either the components draw more power than they need, which is then left to radiate outwards, or the PC produces waste heat as an inefficient side effect of its normal functioning. So why aren't PCs designed to counter this, either by drawing only the power they need or by avoiding the wasteful heat output of their components?

    I've heard that it's down to the inefficient design of the original chips, and that every revision/upgrade of the chips not only carries over all of the inefficiencies of the older chips but also adds its own new ones, since a total redesign aiming at something close to 100% efficiency would be a prohibitively large expense for the chip manufacturers, whereas building on the old chips costs much less in research and design. I've no idea if that's true or not (I know nothing of CPU design or building; it might as well be magic to me), but I'd be interested to know why such (seemingly) inefficient components are the norm nowadays.
     
  2. sonicdude10

    sonicdude10 So long AG and thanks for all the fish!

    Joined:
    Jan 17, 2012
    Messages:
    2,573
    Likes Received:
    29
    Think of it this way: ever have a wire carrying a load where the wire is a bit undersized for the amperage? What happens? It gets hot. Why? Friction. Friction causes heat even with electricity. Electricity is the movement of electrons from atom to atom, so when physical mass is moved across other physical mass (electrons have mass, after all) it causes friction. As chips get smaller and smaller, the traces shrink as well; some traces in modern chips are only a few atoms wide now. Shove as many electrons as possible across those tiny traces, multiply that by the millions or even billions (!) of them in a chip the size of my fingernail, and of course it's gonna get hot. Overclocking just adds even more current, increasing the friction and therefore the heat output.

    That's why they get so damned hot.
     
    Last edited: Jul 18, 2013
    Jord9622 likes this.
  3. AlexRMC92

    AlexRMC92 Site Supporter 2013

    Joined:
    Feb 12, 2013
    Messages:
    337
    Likes Received:
    28

    It's not bad design; we simply use more computing power than can be efficiently delivered by a single CPU/GPU. With new chip designs such as Haswell (current gen) and Broadwell (future gen), the ULV chips produce almost no heat under a small load. Chip designs are getting much better, but it's going to be a long time before you can do process-intensive tasks without making a lot of heat, not to mention that programs continually get more and more complex, which requires more and more power.

    Chip manufacturers aren't simply adding on to old designs; Intel has completely redesigned its chips many times. x86 has little to do with the structure of the chip itself, only with the instruction sets it can handle.

    Humans also produce waste heat: we are constantly heating ourselves to maintain body temperature and are constantly radiating heat, breathing out heat, even expelling heat in our waste. Global warming has little to do with waste heat; it's more about greenhouse gasses trapping the sun's heat (which delivers far more energy than all of the PCs in the world put together).
     
  4. bennydiamond

    bennydiamond Gutsy Member

    Joined:
    Aug 24, 2011
    Messages:
    476
    Likes Received:
    180
    Heat production in ICs is not so much related to electron friction as to the parasitic capacitance that plagues transistors. It's a phenomenon that affects all electronic components and makes them deviate from their theoretical models.

    In the case of a transistor switching at really high frequency (the CPU clock), a "virtual" capacitor is present between each pair of the transistor's three terminals. This parasitic capacitor accumulates and releases energy continually (like any capacitor in an oscillating circuit), and the release of this unwanted energy shows up as heat. Remove this phenomenon and you'd get almost no heat generation; parasitic resistance would then become your main concern. Removing the parasitic capacitance would also increase the overall impedance of the circuit and thus reduce current flow drastically. Power can be calculated by multiplying voltage by current, so less current directly means less power!
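    As a rough sketch of the energy bookkeeping above (illustrative, assumed numbers only, not the specs of any real chip): each time a parasitic capacitance C is charged to a voltage V and then discharged, about C·V² of energy is drawn from the supply and ends up as heat, half lost in resistance while charging and the stored half dumped on discharge.

```python
def switching_heat_per_cycle(c_farads, v_volts):
    """Energy drawn from the supply for one full charge/discharge of a
    capacitance C at voltage V; essentially all of it ends up as heat
    (1/2 CV^2 lost while charging, 1/2 CV^2 dumped on discharge)."""
    return c_farads * v_volts ** 2

# Assumed, illustrative numbers: 1 fF of parasitic capacitance per gate, 1.0 V swing.
e_per_gate = switching_heat_per_cycle(1e-15, 1.0)  # joules per switching cycle

# 50 million gates toggling every cycle at 3 GHz:
power_watts = e_per_gate * 5e7 * 3e9
print(f"{e_per_gate:.1e} J per gate per cycle, about {power_watts:.0f} W in total")
```

    Tiny per-gate energies multiplied by enormous switch counts at gigahertz rates is exactly why the total gets hot.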

    Unfortunately, that's not something chip makers are able to do. Of course, reducing trace and transistor sizes, as well as reducing the oscillation frequency, does help reduce heat generation.

    Unfortunately, sonicdude10, your theory is contradicted by a simple example. On the Xbox 360, the first generation of the console (Xenon) had a CPU and GPU built on 90nm technology, and it consumed more electricity and produced more heat than the last generation of the fat consoles (Jasper), which contained a 45nm CPU and GPU. The 45nm ICs have smaller traces and transistors but consume less energy and produce less heat: same architecture, same number of transistors, just smaller.
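    The die-shrink effect described above can be sketched with the usual dynamic-power relation P ≈ C·V²·f: shrinking the process reduces both the switched capacitance C and (usually) the supply voltage V, so the same design at the same clock burns less power. The numbers below are made-up illustrations, not actual Xenon/Jasper measurements.

```python
def dynamic_power(c_switched, v_supply, f_hz):
    """Approximate dynamic power of a CMOS chip: P = C * V^2 * f."""
    return c_switched * v_supply ** 2 * f_hz

f = 3.2e9  # same clock for both revisions

# Assumed, illustrative values only: a full-node shrink roughly halves the
# switched capacitance and allows a lower supply voltage.
p_90nm = dynamic_power(40e-9, 1.2, f)  # 90nm-style numbers
p_45nm = dynamic_power(20e-9, 1.0, f)  # 45nm-style numbers

print(f"90nm: {p_90nm:.0f} W, 45nm: {p_45nm:.0f} W")
```

    Same architecture and clock, yet the smaller process dissipates far less, which matches the Xenon-versus-Jasper observation.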
     
  5. -=FamilyGuy=-

    -=FamilyGuy=- Site Supporter 2049

    Joined:
    Mar 3, 2007
    Messages:
    3,031
    Likes Received:
    889
    While there might be some design improvements that could prevent too much excess heat, computing is impossible without 'wasting' some energy not required for the calculation itself. This is a physics principle called Landauer's principle.

    It's more Joule's first law, rather than friction, that causes current to generate heat. The law goes like this: P = R*I² ; where P is the heat generated per second (power, in watts), R is the resistance of the wire and I the current you pass through it. This means that any wire with a non-zero resistance WILL generate heat. A tiny wire gets hotter than a big one because it has a bigger resistance R than the big wire (think of it as more electrons being able to travel side by side in a big wire than in a small one). For devices more complex than wires, you replace R by the impedance Z, as bennydiamond said, but the idea is the same. The 'friction model' is used to popularise the phenomenon because it's intuitive (just rub your hands together and you'll make heat), yet it's inaccurate.
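    A quick numeric sketch of Joule's law and the wire-thickness point above, using R = ρL/A for a round copper wire (ρ ≈ 1.68e-8 Ω·m): halving the diameter quarters the cross-section, which quadruples R and therefore quadruples the heat at the same current.

```python
import math

RHO_COPPER = 1.68e-8  # resistivity of copper, ohm*m

def joule_power(current_amps, resistance_ohms):
    """Joule's first law: heat generated per second, P = R * I^2 (watts)."""
    return resistance_ohms * current_amps ** 2

def wire_resistance(length_m, diameter_m, rho=RHO_COPPER):
    """R = rho * L / A for a round wire of the given diameter."""
    area = math.pi * (diameter_m / 2) ** 2
    return rho * length_m / area

thick = wire_resistance(1.0, 2e-3)  # 1 m of 2 mm copper wire
thin = wire_resistance(1.0, 1e-3)   # 1 m of 1 mm copper wire

# Same 10 A through both: the thin wire makes 4x the heat.
print(joule_power(10, thick), joule_power(10, thin))
```

    The same current through the higher-resistance wire dissipates proportionally more power, which is why an undersized cable runs hot.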

    Now, transistors are pretty small and the current that goes through each one is also small, so their Z and I are small and yield a small heat output. But there's an awful lot (and a half) of transistors on a chip, real close to each other, and in the end the tiny heat of each adds up to a respectable total. People work hard to reduce the currents transistors need and to make smaller transistors with smaller impedances Z, but in the end, you can't do calculations without wasting some power as heat.

    Even fundamentally, if transistors created no heat at all (superconducting or something), erasing the information stored after a computation would still yield heat (more accurately, it creates entropy, but that's about equivalent). This is Landauer's principle, as stated above.
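    For scale, Landauer's principle puts a hard floor of k·T·ln 2 joules on erasing one bit at temperature T. A quick calculation (exact SI Boltzmann constant; room temperature assumed) shows how tiny that floor is compared with what real chips dissipate:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact, 2019 SI)

def landauer_limit(temp_kelvin):
    """Minimum heat released by erasing one bit: k_B * T * ln 2."""
    return K_B * temp_kelvin * math.log(2)

e_bit = landauer_limit(300)  # ~2.9e-21 J per bit at room temperature
e_gigabyte = e_bit * 8e9     # erasing a whole gigabyte: still a minuscule energy

print(f"{e_bit:.2e} J/bit, {e_gigabyte:.2e} J/GB")
# Real transistors dissipate many orders of magnitude more than this per
# switching event, so Landauer's limit is a fundamental floor, not the
# bottleneck in today's chips.
```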
     
    Last edited: Jul 18, 2013
  6. Calpis

    Calpis Champion of the Forum

    Joined:
    Mar 13, 2004
    Messages:
    5,906
    Likes Received:
    21
    Transistor leakage current is significant in modern processors with billions of transistors, but that's only half of it -- I'm surprised I didn't see switching current mentioned... When CMOS gates switch states, there's a moment at the switching threshold where both transistors conduct and there's a low-impedance path to ground. This is a big deal with high-frequency clocks, since at any instant millions of transistors are switching and dumping amps of current to ground, and because those millions of transistors need large drivers to handle the fanout/line capacitance and meet the timing constraints.
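    Switching losses like these are usually rolled into the standard dynamic-power estimate P ≈ α·C·V²·f, where the activity factor α is the fraction of the chip's switchable capacitance actually toggling on an average cycle (short-circuit current during transitions is often folded into an effective C). All numbers below are illustrative assumptions:

```python
def cmos_dynamic_power(alpha, c_total, v_supply, f_hz):
    """P = alpha * C * V^2 * f: only the fraction alpha of the total
    switchable capacitance toggles on an average clock cycle."""
    return alpha * c_total * v_supply ** 2 * f_hz

# Assumed numbers: 100 nF of total gate/wire capacitance, 1.1 V supply, 3 GHz clock.
idle = cmos_dynamic_power(0.01, 100e-9, 1.1, 3e9)  # mostly clock-gated
busy = cmos_dynamic_power(0.20, 100e-9, 1.1, 3e9)  # heavy workload

print(f"idle ~{idle:.1f} W, busy ~{busy:.1f} W")
```

    The spread between the idle and busy figures is why a PC heats the room so much more under load than at the desktop.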
     
  7. Conker2012

    Conker2012 Intrepid Member

    Joined:
    Oct 19, 2012
    Messages:
    687
    Likes Received:
    78
    OK, thanks for the information, it is very interesting.
     