
MIT: New type of memory opens the way for brain-based computers

MIT: A new kind of computer memory may enable machines to approach the performance of the human brain in areas such as image or video interpretation.


As reported by MIT Technology Review, IBM researchers used so-called phase-change memory to create a device that processes data in a way that mimics the workings of a biological brain. Using a prototype phase-change memory chip, the researchers configured the system to operate as a network of 913 neurons with 165,000 links, or synapses, among them.

The strength of these synapses changes as the chip processes incoming data, altering how the virtual neurons affect one another. Taking advantage of this property, the scientists made the system capable of recognizing handwritten digits.
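The idea described above can be illustrated with a minimal sketch. This is not IBM's actual training method (the article does not describe it); it is a classic perceptron on tiny toy "digit" patterns, showing how nudging synaptic weights as data arrives eventually yields correct classification:

```python
# Minimal sketch of learning by weight adjustment (perceptron rule).
# Toy 3x3 patterns stand in for handwritten digits; the real system
# used far larger inputs and a different, hardware-based update scheme.

def step(x):
    return 1 if x > 0 else 0

def train(patterns, labels, epochs=20, lr=0.1):
    """Perceptron rule: weight += lr * (target - output) * input."""
    n = len(patterns[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(patterns, labels):
            y = step(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = t - y
            b += lr * err
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
    return w, b

def predict(w, b, x):
    return step(sum(wi * xi for wi, xi in zip(w, x)) + b)

# 3x3 toy "images": a vertical stroke (like a 1) vs. a ring (like a 0)
ONE  = [0, 1, 0,
        0, 1, 0,
        0, 1, 0]
ZERO = [1, 1, 1,
        1, 0, 1,
        1, 1, 1]
w, b = train([ONE, ZERO], [1, 0])
```

After training, `predict(w, b, ONE)` returns 1 and `predict(w, b, ZERO)` returns 0: the repeated small corrections to the weights are what "learning" amounts to here.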

Phase-change memory is expected to reach the market in the coming years. It can record information at high speed and pack it at a much higher density than today's memory types. A chip of this kind of memory consists of a grid of "cells" that can take one of two states to represent a digital bit of information, a 1 or a 0. In IBM's experimental system, each synapse is represented by a pair of cells that work together.
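A toy model can make the paired-cell arrangement concrete. The details here are assumptions, not from the article: this sketch follows the common scheme in which each cell holds a conductance and the synapse's signed weight is the difference between the two, so that two always-positive physical quantities can encode both positive and negative weights:

```python
# Toy model of a synapse built from two phase-change cells (assumed
# "difference of conductances" scheme; the article only says the two
# cells "work together"). Conductances are normalized to [0, 1].

G_MIN, G_MAX = 0.0, 1.0  # assumed normalized conductance range

class PCMSynapse:
    def __init__(self):
        self.g_plus = G_MIN   # one cell pulls the weight up
        self.g_minus = G_MIN  # the other pulls it down

    @property
    def weight(self):
        # Effective signed weight is the difference of two
        # non-negative conductances.
        return self.g_plus - self.g_minus

    def potentiate(self, dg=0.1):
        # Partial crystallization raises a cell's conductance in steps.
        self.g_plus = min(G_MAX, self.g_plus + dg)

    def depress(self, dg=0.1):
        self.g_minus = min(G_MAX, self.g_minus + dg)

s = PCMSynapse()
s.potentiate()
s.potentiate()
s.depress()
```

After two potentiations and one depression, `s.weight` is about 0.1: small, incremental conductance changes are what let such a memory be reprogrammed gradually as new data arrives.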

Brain-inspired computers have been a subject of research among computer scientists for some time. Such designs differ radically from today's chips, promising to make computers far more efficient at tasks that conventional systems find difficult, such as learning from experience or understanding video.

Earlier in the year, IBM announced the most advanced chip of its kind so far, built with techniques and components used in smartphone processors. The new system is not as powerful; however, IBM researcher Geoffrey Burr pointed out that what matters is that phase-change memory was used to create the 165,000 synapses. According to him, this type of memory is well suited to "neuromorphic" systems because it stores data at very high density and is also easier to reprogram. In practice, this makes it easier to build a system that is capable of learning, adjusting its behavior as it receives new data.

Previous attempts in this field were of limited scale, with 100 synapses or fewer. The new system, developed in collaboration with researchers at Pohang University of Science and Technology in Korea, is 1,000 times larger.

The paper was presented in December at the International Electron Devices Meeting in San Francisco.

Source: naftemporiki.gr
