So, I was looking through an IBM presentation on the future of high-performance supercomputing and came across a slide that explains why IBM thinks a new approach to computer-chip power consumption is needed.
(A graph dramatizing the reason)