Sunday, November 3, 2013

IBM’s bionic computers bleed electronic blood

Read the original at: http://www.extremetech.com/computing/168864-ibms-bionic-computers-bleed-electronic-blood
[Image: IBM cooling]
In the 1950s, the highest priority for national defense was the Air Force ballistic missile program. The ICBM program, and by extension the entire space program and the internet, was made possible by the IBM 360 mainframe computer and its immediate predecessors. Last week at its Zurich lab, IBM gave a media tour of some of the company’s latest concepts, ones it believes will prove just as revolutionary as those 360-era machines. The key to making supermachines 10,000 times more efficient, IBM says, is to build bionic computers cooled and powered by electronic blood.
Bytes and flops are handy measures for characterizing file sizes or for gauging how long a simulation might take to give you an answer. If, however, you are budgeting a datacenter, or even just a processor or battery for a smartphone, a more useful measure is operations per joule. For something more exotic still, like designing machines to navigate the highways and byways of a circulatory system, operations per liter may be even more relevant. As a recent article astutely noted, an important element in the units we choose to describe our computing efforts is the need to capture a sense of direction and, more importantly, of progress.
IBM’s stated goal is a one-petaflop computer in a 10-liter volume; in other words, it wants to shrink a petaflop machine that today would fill a room down to something that sits on a desktop. Today’s top dog, China’s Tianhe-2, has a petabyte of memory and achieves 33.86 petaflops (quadrillion floating-point calculations per second) using 32,000 Xeon processors and 48,000 Xeon Phi accelerators. The machine runs Kylin Linux and draws some 17.8 megawatts of power. Power draws on that scale previously led IBM to develop a prototype system called Aquasar, which uses branching tubes to deliver liquid coolant exactly where it is needed. IBM spec’d this system in some unusual units: 7.9 trillion operations per second per kilogram of carbon dioxide released into the atmosphere.
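To get a feel for what these figures mean in operations per joule, here is a quick back-of-the-envelope sketch in Python. It only rearranges the numbers quoted above; reading IBM’s 10,000x efficiency claim against Tianhe-2 as the baseline is an assumption on my part, since the article does not say what that factor is measured against.

```python
# Back-of-the-envelope arithmetic using only the figures quoted in the article.
tianhe2_flops = 33.86e15      # sustained floating-point operations per second
tianhe2_power_w = 17.8e6      # reported power draw in watts (joules per second)

# Operations per joule: divide the sustained rate by the power draw.
ops_per_joule = tianhe2_flops / tianhe2_power_w
print(f"Tianhe-2: ~{ops_per_joule:.1e} flops per joule")          # ~1.9e9

# Assumption: reading IBM's "10,000 times more efficient" against this baseline.
target_ops_per_joule = ops_per_joule * 10_000
print(f"10,000x target: ~{target_ops_per_joule:.1e} flops per joule")

# IBM's stated density goal: one petaflop in a 10-liter volume.
flops_per_liter_goal = 1e15 / 10
print(f"Density goal: ~{flops_per_liter_goal:.1e} flops per liter")
```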
[Image: IBM’s Aquasar prototype]
In a previous article we suggested that resources like power and cooling for high-density computing ultimately need to be sourced by volume, rather than supplied through costly wires and power-ground planes on a board. The logical extension of that design rationale, which would deliver these resources through an increasingly fractalized system of tubes, is to immerse the entire computer. While there are technical hurdles to achieving something like that at scale, IBM has made significant advances in a technology it calls a redox flow battery. IBM’s flow-battery computer uses a microfluidic chip in which vanadium electrolytes in different oxidation states generate voltages ranging from 0.5 to 3 volts. It can deliver up to a watt of power per square centimeter of board surface, and the circulating electrolyte can potentially carry away a significant amount of heat as well.
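The quoted power density is easier to appreciate when scaled up to a whole board. The sketch below multiplies the article’s figure of one watt per square centimeter by a hypothetical board area; the 20 cm by 20 cm size is purely my own assumption for illustration.

```python
# Illustrative scaling of the flow battery's quoted power density.
power_density_w_per_cm2 = 1.0   # up to ~1 W per cm^2 of board surface (from the article)

# Hypothetical board dimensions, assumed only for this illustration.
board_width_cm = 20.0
board_height_cm = 20.0
board_area_cm2 = board_width_cm * board_height_cm

deliverable_power_w = power_density_w_per_cm2 * board_area_cm2
print(f"A {board_width_cm:.0f} x {board_height_cm:.0f} cm board could, in principle, "
      f"draw ~{deliverable_power_w:.0f} W from the electrolyte")

# The same fluid that delivers charge also carries heat away, which is the
# point of the "electronic blood" analogy: one plumbing network does both jobs.
```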

Grams of CO2 as a computer specification

IBM’s new yardstick of flops per kilogram of CO2 appears to assume that the CO2 cost of building the computer is negligible compared with the cost of running it. For cell-based life that assumption can be somewhat justified, because cells use only a little more energy when they initially divide or grow than they do in normal operation. Until we can construct computers as easily as we can inoculate a vat of broth with a rapidly replicating bacterium, however, construction costs will probably remain a major factor.
To draw a comparison, a commodity like steel may have little or no CO2 cost associated with its operation, but significant costs for its construction: melting and processing the metal for cast girders might take 1,300 megajoules and generate 235 kg of CO2 per tonne of steel. For computers, efficiency has increased as they have proliferated, but the two effects clearly have not offset one another. In other words, the average total computing need and budget per capita has grown over time, as has computing’s share of our total energy budget. Perhaps we need a better measure for computing, one based on what we actually want to do rather than on how much of something we use to do it.
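To make the construction-versus-operation comparison concrete, here is a small arithmetic sketch. The steel figures come from the paragraph above; the computer’s power draw, service life, and the grid carbon intensity are assumptions of mine, chosen only for illustration.

```python
# Construction side: the steel figures quoted above.
steel_energy_mj_per_tonne = 1300.0   # MJ to melt and process one tonne
steel_co2_kg_per_tonne = 235.0       # kg of CO2 emitted per tonne

# Operation side: an illustrative desktop machine. All three values below
# are assumptions for this sketch, not figures from the article.
power_draw_w = 100.0                 # assumed average power draw
lifetime_years = 4.0                 # assumed service life
grid_co2_kg_per_kwh = 0.5            # assumed carbon intensity of grid power

operating_kwh = (power_draw_w / 1000.0) * lifetime_years * 365 * 24
operating_co2_kg = operating_kwh * grid_co2_kg_per_kwh

print(f"Steel: {steel_co2_kg_per_tonne:.0f} kg CO2 to make a tonne, roughly zero to use it")
print(f"Assumed desktop: ~{operating_co2_kg:.0f} kg CO2 over {lifetime_years:.0f} years of operation")

# Whichever side of this ledger dominates for a given machine, a
# flops-per-kg-CO2 rating only means something once it states which
# costs it counts: construction, operation, or both.
```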
