2012-03-29

Panel ponders many-core ICs tripping 'the singularity'

SAN JOSE, Calif. – Will the rapidly increasing processing power enabled by many-core processors lead to machines with super-human intelligence, an event sometimes referred to as the singularity?

That was the question put to a panel of some of the best minds in multicore processor theory and design assembled on Tuesday (March 27) by analyst Jon Peddie at the Multicore DevCon, part of DESIGN West being held here this week. A small but enthusiastic audience was there to listen.

Peddie set up the debate by referencing Vernor Vinge, the science fiction writer who predicted that computing power would be equivalent to human processing power by about 2023. One aspect of the singularity concept is that once machines, individually or collectively, exceed human intelligence, there may be an explosion of advances in machine learning that humans would, by definition, be unable to fathom, making the singularity a kind of event horizon.

Another extrapolation of computing progress had 2045 as the year in which it might be possible to buy a machine with the processing power of the human brain for $2,000.

Pradeep Dubey of Intel's parallel computing lab illustrated the progress by saying that a petaflops supercomputer can already simulate a cat's brain. A human brain has 20 to 30 times more neurons and 1,000 times more synapses, he said, so a complete simulation of the human brain is only a matter of 5 or 6 years away. "Exaflops could simulate a human brain," he said.
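
Taken at face value, the quoted ratios make the arithmetic behind that claim easy to sketch. The short Python snippet below is purely illustrative: the petaflops baseline for the cat-brain simulation and the 1,000x synapse ratio are the panel's figures, while the assumption that compute demand scales with synapse count is ours, not Dubey's.

# Back-of-envelope scaling using the figures quoted on the panel.
# Assumption (ours): simulation cost scales roughly with synapse count.
CAT_BRAIN_FLOPS = 1e15   # ~1 petaflop/s, said to be enough for a cat's brain
SYNAPSE_RATIO = 1000     # human brain quoted as having ~1,000x more synapses

human_brain_flops = CAT_BRAIN_FLOPS * SYNAPSE_RATIO
print(f"Estimated requirement: {human_brain_flops:.0e} flop/s")
# Prints 1e+18 flop/s, i.e. an exaflops machine, matching Dubey's remark.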

Dubey said there are currently three approaches: simulate the process with a neuron- and synapse-level model; ignore the brain's architecture and treat the problem as a data and statistics problem; or build hardware that mimics neurons and synapses.

However, simulating the brain is not the same as thinking or generating the emotional intelligence we see in human beings, said Ian Oliver, director of Codescape development tools at processor licensor Imagination Technologies Group plc. "We probably have the wrong memory model. The human brain is non-deterministic. It operates on the edge of chaos," he said.

Oliver pointed out that using genetic algorithms to evolve FPGA designs has produced much more brain-like architectures, but that these were not readily usable in the real world; as such, computer and human intelligence appear to be distinct.
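
For readers unfamiliar with the technique Oliver referred to, the sketch below shows the bare mechanics of a genetic algorithm evolving a bitstring toward a target. It is a minimal illustration only: the target pattern, fitness function, population size and mutation rate are invented for the example, and real evolved-hardware experiments score candidates by measuring behaviour on an actual FPGA rather than comparing bits.

import random

# Minimal genetic-algorithm sketch (illustrative only).
TARGET = [1, 0, 1, 1, 0, 0, 1, 0] * 4   # hypothetical "desired configuration"
POP_SIZE, MUTATION_RATE, GENERATIONS = 50, 0.02, 200

def fitness(genome):
    # Count matching bits against the target (stand-in for a hardware test).
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome):
    # Flip each bit with a small probability.
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

def crossover(a, b):
    # Single-point crossover between two parents.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break
    parents = population[: POP_SIZE // 2]          # keep the fitter half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

print(f"best fitness {fitness(population[0])}/{len(TARGET)} after {gen + 1} generations")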

Mike Rayfield, vice president of the mobile business unit at Nvidia, argued that the number of processor cores is a red herring. But Intel's Dubey countered that more cores are better, saying that massive data engines can capture correlations if not causality. He pointed out that machines can already do some things far better than humans, which is the reason they exist. "We can build planes but we can't build birds," he said.

Tags: multicore processors, ARM, Imagination, Intel, Nvidia
