I strongly suggest everyone ignore the pessimistic trolls spreading doom and gloom about Ivy Bridge based on ONE TEST that might have had a bad temp sensor, or might not have come from the best FPO batch.
Every Intel shrink made thus far has had lower temps and lower volts at the same frequency as its predecessor; there is no reason to think otherwise with Ivy Bridge.
dukenuke88: "95 TDP huh? So the truth comes out?"

TDP is not real heat dissipation, it is just a category. On top of that, Intel has been known to stick CPUs into a category they don't actually need, just to ensure and simplify cooling options for OEMs. A very good example is the Xeon W3503: a 45nm dual-core with no Turbo or HT that only works in X58 boards, yet it gets the same 130W TDP rating that the quads and 6-cores get, while it easily dissipates less than 65W at full load.
Power consumption scales roughly with die surface area: the Ivy Bridge quad-core APUs are 170mm2 while the Sandy Bridge ones are 208mm2, which means at the same frequency and core count, Ivy should draw roughly 18% less actual power. That French review from when the FX-8150 first appeared showed the power consumption of the 2600K as well: 93W at the wall at stock with the integrated GPU running. Since conversion and delivery losses for electronics haven't really changed in the past 30 years and remain around 15%, that puts the 2600K's actual dissipation near 80W; it gets the 95W TDP rating because 80W sits above the next lower bracket. Applying the power drop from the shrink, the estimated actual dissipation of the 3770K is just above 65W; it would still qualify for the 77W TDP rating, but of course Intel stuck it in a higher category than it needs.
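For anyone who wants to check the arithmetic, here is a quick back-of-envelope sketch of the estimate above. The die areas and the 93W wall figure come from the post; the 15% loss factor and the assumption that power scales linearly with die area are the post's own simplifications, not measured data:

```python
# Back-of-envelope estimate using the numbers from this post.
# Assumptions: power scales linearly with die area; ~15% of wall
# power is lost to conversion/delivery before reaching the CPU.

sandy_die_mm2 = 208   # Sandy Bridge quad-core APU die area
ivy_die_mm2 = 170     # Ivy Bridge quad-core APU die area

area_ratio = ivy_die_mm2 / sandy_die_mm2
print(f"Ivy die area savings: {(1 - area_ratio) * 100:.0f}%")   # ~18%

wall_watts_2600k = 93       # 2600K at stock, from the French review
delivery_loss = 0.15        # assumed conversion/delivery losses
cpu_watts_2600k = wall_watts_2600k * (1 - delivery_loss)
print(f"Estimated 2600K dissipation: {cpu_watts_2600k:.0f} W")  # ~79 W

# Scale by die area to estimate Ivy Bridge (3770K)
cpu_watts_3770k = cpu_watts_2600k * area_ratio
print(f"Estimated 3770K dissipation: {cpu_watts_3770k:.0f} W")  # ~65 W
```

Both estimates land below the 77W official rating, which is the whole point: the rating is a bucket, not a measurement.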
<message edited by lehpron on Wednesday, April 18, 2012 4:03 PM>