
5.4. Basic concepts and tasks of neuron calculators

The previous sections considered examples of well-formalized tasks, i.e. tasks for which mathematical models have been created and for which algorithms based on rules of the form "if A, then B" can be applied. There are, however, tasks that are difficult to formalize, i.e. for which no clear solution algorithm can be found. Such tasks include:

  • pattern recognition, for example recognition of handwritten and printed characters during optical input into a computer, recognition of blood cell types, speech recognition. Here the object being recognized is a data array that must be assigned to one of the a priori known classes;

  • clustering of data (search for regularities). The input data must be assigned to groups (clusters) according to their "proximity", while the number of clusters is unknown a priori. The distance between data vectors, the value of the correlation coefficient, and similar measures can serve as criteria of "proximity";

  • approximation of functions: finding a function that approximates an unknown one, for example from a set of experimental data. This task arises in the design of complex systems, in the creation of control systems for complex dynamic objects, and in robust control;

  • forecasting: predicting the future behavior of a function from its previous behavior. This task is relevant for predictive control systems and for decision-making systems;

  • optimization: finding the value of a target function that is optimal subject to a set of constraints.

It should be noted that humans solve hard-to-formalize tasks well: they recognize images, classify data, make forecasts, and so on. The idea of creating artificial intelligence therefore became quite topical. For this purpose, however, extensive research into the principles of the human brain's functioning, from the standpoint of information processing, was needed.

The human brain is the most complex of the known information-processing systems. It contains about 100 billion nerve cells, or neurons, each of which has on average 10,000 connections.

A neuron is a special type of cell whose basic purpose is the operational control of the organism.

A schematic image of a neuron is given in fig. 5.6.

A neuron has a body (soma) 2, a tree of inputs (dendrites) 1, and an output (axon) 4. The dendrites branch strongly, penetrating a comparatively large space around the neuron. The initial segment of the axon is the thickened axon hillock 3, which is adjacent to the cell body (fig. 5.6).

Moving away from the cell, the axon gradually narrows, and a myelin sheath with high electrical resistance appears on it. On the soma and dendrites lie the endings of axons coming from other nerve cells. Each such ending 5 has the shape of a bulge and is called a synaptic terminal, or synapse. The input signals of the dendritic tree (postsynaptic potentials) are weighted and summed on the path to the axon hillock, where an output pulse is generated. Its presence (or intensity) is a function of the weighted sum of the input signals. The output signal travels along the branches of the axon and arrives at the synapses that connect the axon with the dendritic trees of other neurons. Through the synapses the signal is transformed into a new input signal for the adjacent neurons. This input signal can be positive or negative (excitatory or inhibitory) depending on the type of synapse. The value of the output signal generated by a synapse can differ from the value of the signal entering it; this difference determines the efficiency, or weight, of the synapse. A synapse's weight can change in the course of its functioning.

Scientists of different specialities attempted to create a mathematical model of the neuron. Biologists, for instance, tried to obtain an analytical conception of the neuron that would take into account all of its known functional behavior. However, the basic task, the transfer of information by a nerve impulse, was lost among the great number of parameters related to the physics of pulse conduction. Attempts were therefore made to replace the physical description of the neuron with a logical one, in which the nerve cell is treated as an element that passes information. In 1943 the mathematicians McCulloch and Pitts represented the neuron as a simple switching element that can be in one of two stable states, "on" or "off". Such a neuron fires if the algebraic sum of its inputs exceeds a threshold at the given moment. A neuron in this representation can be used as a computing element and makes it possible to build a network of neurons with appropriate thresholds and connections that realizes an arbitrary Boolean function or truth table. These studies led to numerous inventions of information-processing circuits, recognizers, and sensory analyzers.
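The McCulloch-Pitts threshold element described above can be sketched in a few lines of Python. This is an illustration only; the function names and the particular thresholds below are our choices, not part of the original model's notation:

```python
def mp_neuron(inputs, threshold):
    """McCulloch-Pitts neuron: outputs 1 ("on") if the sum of its
    binary inputs reaches the threshold, otherwise 0 ("off")."""
    return 1 if sum(inputs) >= threshold else 0

# Boolean functions realized purely by the choice of threshold
# (two binary inputs in both cases):
AND = lambda a, b: mp_neuron([a, b], threshold=2)
OR = lambda a, b: mp_neuron([a, b], threshold=1)
```

With threshold 2, the two-input element fires only when both inputs are 1 (logical AND); with threshold 1, it fires when at least one input is 1 (logical OR), showing how a truth table follows from thresholds and connections alone.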

At present, the model of the neuron represented in fig. 5.7 is mostly used.

The neuron has n unidirectional inputs (synapses), connected to the outputs of other neurons, and an output y (axon), through which a signal (excitation or inhibition) acts on the synapses of the next neurons. Each synapse is characterized by the value of its synaptic connection, or weight wi, which in physical terms is equivalent to electrical conductivity. Every neuron is characterized by its current state s, by analogy with the nerve cells of the brain, which can be excited or inhibited (fig. 5.7).

The current state of a neuron depends on the values of its inputs, the weights, and, possibly, the previous state. Most often the state of the neuron is determined either as the weighted sum of its inputs

s = Σ wi xi = w1x1 + w2x2 + … + wnxn (5.1)

or as the distance between the input vector and the vector of input weights

s = ‖X − W‖ = √((x1 − w1)² + (x2 − w2)² + … + (xn − wn)²) (5.2)

The neuron output y is a function of its state:

y = f(s) (5.3)

The function f(s) is called the activation function.

The most widespread activation functions are the step threshold, linear threshold, sigmoid, linear, and Gaussian functions, given in Table 5.1.

Table 5.1. Activation functions of a neuron

Step threshold: f(s) = 1 if s ≥ θ, else f(s) = 0

Linear threshold: f(s) = 0 for s ≤ 0; f(s) = ks for 0 < s < 1/k; f(s) = 1 for s ≥ 1/k

Sigmoid: f(s) = (1 + e^(−αs))^(−1)

Linear: f(s) = ks + b

Gaussian: f(s) = exp(−s²/(2σ²))
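The activation functions of Table 5.1 can be written out directly. The sketch below assumes conventional parameter names (threshold θ, slope k, steepness α, width σ) and default values chosen by us for illustration:

```python
import math

def step_threshold(s, theta=0.0):
    # 1 when the state reaches the threshold, 0 otherwise
    return 1.0 if s >= theta else 0.0

def linear_threshold(s, k=1.0):
    # linear with slope k between 0 and 1, saturated outside that band
    return min(max(k * s, 0.0), 1.0)

def sigmoid(s, alpha=1.0):
    # f(s) = (1 + e^(-alpha*s))^(-1), smooth and bounded in (0, 1)
    return 1.0 / (1.0 + math.exp(-alpha * s))

def linear(s, k=1.0, b=0.0):
    # f(s) = ks + b, unbounded
    return k * s + b

def gaussian(s, sigma=1.0):
    # bell curve centred at s = 0, maximum value 1
    return math.exp(-s * s / (2.0 * sigma * sigma))
```

Note that the sigmoid passes through 0.5 at s = 0 and the Gaussian through 1 at s = 0, which is a quick sanity check on any implementation.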

A neural network is created by connecting the outputs of neurons to the inputs of others; the neurons thereby form layers connected with each other.

A neural network is a network with a finite number of layers, consisting of elements of the same kind and of different types of connections between the layers of neurons.

The number of neurons in the layers is chosen so as to ensure the required quality of the task solution, and the number of neuron layers is kept as small as possible to reduce the solution time.

The simplest single-layer neural network, also called a simple perceptron, is represented in fig. 5.8, a. Signals arrive at n inputs and pass along synapses to three neurons, which form a single layer with output signals

yj = f(Σ wij xi), where j = 1…3.
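A forward pass through such a layer is just formula (5.1) followed by (5.3), applied once per neuron. A minimal sketch, in which the weight values are arbitrary and chosen purely for illustration:

```python
import math

def forward_layer(x, W, f):
    """One neuron layer: y_j = f(sum_i w_ji * x_i).
    W[j][i] is the weight from input i to neuron j."""
    return [f(sum(w * xi for w, xi in zip(row, x))) for row in W]

sigmoid = lambda s: 1.0 / (1.0 + math.exp(-s))

# n = 4 inputs feeding a single layer of 3 neurons
# (illustrative weight values, not taken from the text)
W = [[0.2, -0.5, 0.1, 0.4],
     [0.7, 0.0, -0.3, 0.2],
     [-0.1, 0.6, 0.5, -0.2]]
y = forward_layer([1.0, 0.0, 1.0, 1.0], W, sigmoid)  # three outputs in (0, 1)
```

Each row of W holds the synaptic weights of one neuron, so the layer produces one output per row, exactly the y1…y3 of the simple perceptron.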

The double-layer perceptron, obtained from the single-layer one by adding a second layer consisting of two neurons, is represented in fig. 5.8, b. Here the non-linearity of the activation function matters a great deal: if it were absent, the result of the functioning of any p-layer neural network with weight matrices W(i), i = 1, 2, …, p, for each layer would reduce to multiplying the input signal vector X by the matrix W(Σ) = W(1)·W(2)·…·W(p); i.e., such a p-layer neural network would in fact be equivalent to a single-layer network whose only layer has the weight matrix W(Σ): Y = X W(Σ).
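The collapse of a purely linear multilayer network into a single layer can be checked numerically. In the sketch below the 3-input, 2-neuron, 1-output weight matrices are arbitrary illustrative values; both orders of multiplication give the same output:

```python
def matmul(A, B):
    # plain (m x k) by (k x n) matrix product on nested lists
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Row vector X and two weight matrices of a 2-layer *linear* network
X = [[1.0, 2.0, -1.0]]
W1 = [[0.5, -0.2], [0.1, 0.3], [0.4, 0.0]]   # 3 inputs -> 2 neurons
W2 = [[0.7], [-0.6]]                          # 2 neurons -> 1 output

layer_by_layer = matmul(matmul(X, W1), W2)    # Y = (X W1) W2
collapsed = matmul(X, matmul(W1, W2))         # Y = X (W1 W2)
# The two results coincide: without a non-linearity between the
# layers, the network is equivalent to a single layer W1 W2.
```

Inserting a non-linear f between the two multiplications breaks this associativity, which is exactly why the activation function gives multilayer networks their extra expressive power.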

Fig. 5.8. Single-layer (a) and double-layer (b) perceptrons

Besides the number of layers and the connections between them, neural networks are classified as acyclic or cyclic. The examples shown in fig. 5.8, a, b belong to the acyclic neural networks. An example of a cyclic neural network is represented in fig. 5.9.

Fig. 5.9. Cyclic neural network

If we supplement the considered circuits (see fig. 5.8 and 5.9) with a condition on the clocking of the network (i.e. set the duration of neuron firing), we obtain hardware for specifying various data-processing algorithms by means of neural networks, which can be used to solve both formalized and hard-to-formalize tasks. In the latter case the application of neural networks is based not on the implementation of a proposed algorithm, but on the network memorizing the examples given to it at the training stage and producing results consistent with those examples at the task-solution stage.

By the type of signals, neural networks are divided into binary (digital) and analog. Digital networks operate on binary signals, and the output of every neuron can take only two values, 0 or 1. By the possibility of adaptation, one can distinguish networks that are constructed and networks that are trained. In constructed networks, the number and type of neurons, the graph of interneuron connections, and the input weights are all set in advance; in trained networks, the graph of interneuron connections and the input weights are changed during the execution of a learning algorithm.

By the learning algorithm, networks are divided into supervised, unsupervised, and mixed (hybrid). The first compare, during training, an a priori known result with the obtained one. The second learn without knowing the correct result values: they group the input data so that similar inputs produce the same network output. This approach is used, for example, in solving clustering tasks. In a mixed learning algorithm, part of the weights is determined with supervision and part without.
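The simplest supervised scheme is the classical perceptron learning rule: the a priori known target is compared with the obtained output, and the weights are corrected in proportion to the error. A sketch, in which the learning rate and epoch count are arbitrary illustrative choices:

```python
def train_perceptron(samples, lr=0.1, epochs=20):
    """Supervised learning for a single threshold neuron: nudge the
    weights by (target - output) on every labelled example."""
    n = len(samples[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, target in samples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0
            err = target - y            # known result minus obtained one
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Learning the Boolean OR function from labelled examples
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w, b = train_perceptron(data)
```

After training, the neuron reproduces all four labelled examples, which is the "results consistent with the examples" behavior described above; unsupervised algorithms instead adjust weights from the structure of the inputs alone.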
