Sunday 4 May 2014

Your computer is about to achieve transcendence

Read at: http://www.gq-magazine.co.uk/entertainment/articles/2014-04/28/nasa-quantum-artificial-intelligence-laboratory

By Louise Donovan 28 April 14

In 1969, the United States put two men on the moon. The mission required more than 3,500 IBM employees and the most sophisticated programs ever written. Today, though, a single Apple iPhone holds more computing power than any of the technology used on Apollo 11. That rapid advance can be explained by a pattern called Moore's Law: every 18 months, the number of transistors that it's possible to fit on to a one-inch-wide microchip doubles. In other words, the pace of change is geometric, not linear. That's why a laptop bought today is not nine times better than one you could buy nine years ago; it's 64 times better. The problem for innovators is that Moore's Law will, in around 15 years' time, hit a wall. There is a physical limit to how many transistors can be squeezed on to a chip.
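The doubling arithmetic behind that "64 times better" figure can be sketched in a few lines (a toy illustration of the article's numbers, not actual chip data):

```python
def doublings(years, period_months=18):
    """How many 18-month doubling periods fit in a span of years."""
    return (years * 12) // period_months

def relative_power(years):
    """Relative computing power after `years`, per Moore's Law."""
    return 2 ** doublings(years)

# Nine years is six doubling periods, so 2**6 = 64 -- geometric, not linear.
print(relative_power(9))  # prints 64
```

Linear growth over the same nine years would give only a 6x or 9x improvement; repeated doubling is what makes the gap so large.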
In the short term, the chips themselves will evolve. Graphene, the so-called "wonder material", could replace their silicon insides. Graphene conducts electricity at high speed, and it reduces interference between tightly arranged transistors.
Quantum computing, however, is the holy grail. Unlike everyday computers, which depend on "bits" of data (ones and zeroes), quantum computers use "qubits". A qubit can be a one, a zero, or both at the same time - and that makes it possible to perform millions more calculations per second than a conventional machine. The catch is that qubits are fragile: last November, physicists managed to hold a qubit's memory state simultaneously at one and zero for a record 39 minutes - and the fact that 39 minutes is a record tells you how unstable they are.
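The "one, zero, or both" idea can be illustrated with a toy model (a sketch for intuition only, not a real quantum simulator): a qubit's state is a pair of amplitudes, and measuring it collapses the superposition to a classical bit.

```python
import math
import random

class Qubit:
    """Toy qubit: amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.

    Until measured, the state is 'both' 0 and 1; measurement collapses
    it to 0 with probability |alpha|^2, or 1 with probability |beta|^2.
    """

    def __init__(self, alpha, beta):
        norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
        self.alpha, self.beta = alpha / norm, beta / norm

    def measure(self):
        """Collapse the superposition to a classical bit."""
        return 0 if random.random() < abs(self.alpha) ** 2 else 1

# A pure "zero" state always reads 0.
print(Qubit(1, 0).measure())  # prints 0

# An equal superposition reads 0 or 1 with 50/50 odds each time.
counts = [Qubit(1, 1).measure() for _ in range(10000)]
print(sum(counts))  # roughly half the trials come up 1
```

The speed-up the article alludes to comes from scale: n qubits in superposition span 2**n classical states at once, where a conventional register holds just one.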
Still, the race is on. Google and Nasa have launched the Quantum Artificial Intelligence Laboratory to harness quantum technologies in an effort to make a computer that can think. And, thanks to Edward Snowden, we know that the NSA is spending $80m on a quantum computer that can crack any encryption. Is that ethical? We'll wait for a computer to give us the definitive answer.
