Entropic Neuronal Summation

From the Assothink Wiki

Context

Natural intelligences, like the human brain, and artificial intelligences, like Assothink, involve numerous nodes exchanging signals.

Focusing on one node, it is assumed that this node receives a finite (discrete) number of input signals (inflows). The input signals may be described as positive real values.

These signals determine the local excitation level of the node.

The question is purely mathematical: how do the various inflow units combine into a global inflow value?

[Note: the mathematical notation used here is based on this help page.]

Principle

The first answer to the question above is 'summation'. Neuroscience documents do not discuss this point (as far as we know), but it is generally assumed that summation effects occur. It would be hard to prove that summation is the correct model for the combination of inflows, but equally hard to prove that it is not. So we can only make assumptions.

In this document, another assumption is built: the ENS (Entropic Neuronal Summation).

Obviously the model is quite similar to a simple summation model, but one point attracts our attention: the inflow coming from two different sources has more effect than the inflow coming from one single source, even when the simple sum of the values is identical. There is no demonstration of this; it is based on introspective considerations.

But then, why would it be worse than the simple summation model?

The critical point mentioned above, the difference between the simple summation model and the ENS model, is that for the ENS we want a function $f$ such that

$f(x_1, \ldots, x_n) \ge \sum_{i=1}^{n} x_i$,

with strict inequality as soon as the total inflow is spread over more than one source.
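To make this requirement concrete, here is one hypothetical candidate function, $f = S \cdot 2^{\alpha H}$, where $S$ is the plain sum, $H$ is the base-2 entropy of the relative weights, and $\alpha$ is a small gain. This exact form and the value of $\alpha$ are illustrative assumptions, not the function derived in this document; it merely shows that the required property is satisfiable: a single source gives $H = 0$ so $f$ reduces to $S$, while any split of the same total gives $f > S$.

```python
import math

def entropic_sum(xs, alpha=0.1):
    """Hypothetical ENS candidate: f = S * 2**(alpha * H).

    The form and alpha are illustrative assumptions only,
    not the ENS function defined in the article.
    """
    s = sum(xs)
    # Base-2 entropy of the relative weights p_i = x_i / s.
    h = sum(-x / s * math.log2(x / s) for x in xs if x > 0)
    return s * 2 ** (alpha * h)

print(entropic_sum([4.0]))       # single source: reduces to the plain sum, 4.0
print(entropic_sum([2.0, 2.0]))  # same total split over two sources: > 4.0
```

Any function of the form $S \cdot g(H)$ with $g(0) = 1$ and $g$ increasing would satisfy the same inequality.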

In other words, the function $f$ has, besides some properties shared with the simple summation model, some specific properties of its own.

Properties

The function $f$ shares these properties with the simple summation model:

ENS model

Let us define some notation. The arguments of the function are written $x_1, x_2, \ldots, x_n$.

The sum of the arguments (the simple summation value) is $S = \sum_{i=1}^{n} x_i$.

The relative weights of the arguments are $p_i = x_i / S$.

And we also define

In this document, all logarithms use base 2.

Let $H = -\sum_{i=1}^{n} p_i \log_2 p_i$ be the entropic uncertainty measure of the values.
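Under the definitions above, the weights $p_i$ and the entropic measure $H$ can be computed directly; the inflow values below are illustrative.

```python
import math

def weights(xs):
    """Relative weights p_i = x_i / S of positive inflow values."""
    s = sum(xs)
    return [x / s for x in xs]

def entropy(xs):
    """Entropic uncertainty H = -sum(p_i * log2(p_i)), base-2 logs."""
    return sum(-p * math.log2(p) for p in weights(xs) if p > 0)

# A single source carries no uncertainty (H = 0) ...
print(entropy([4.0]))
# ... while the same total inflow split over two equal sources gives H = 1 bit.
print(entropy([2.0, 2.0]))
```

Note that $H$ depends only on the relative weights, not on the total inflow: scaling all inputs by a constant leaves $H$ unchanged.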