Unlike most other machine learning models, neural networks are considerably more complex in the sense that they represent mathematical functions with millions of coefficients (parameters). This article designs and implements a network of artificial neurons that applies the Heaviside activation function to each neuron of the first layer and, on the single output neuron, the logical AND. Solving a system of linear inequalities with three real unknowns amounts to representing graphically, in a three-dimensional coordinate system, the set of points \(M\) of \(\mathbb{R}^3\) whose coordinates \((x_1, x_2, x_3)\) simultaneously satisfy all the inequalities of the system, where \(a_i\), \(b_i\) and \(c_i\) are the coefficients of \(x_i\) with \(1 \le i \le 3\), and \(a_0\), \(b_0\) and \(c_0\) are the independent terms. The solution set of this system is the subset of \(\mathbb{R}^3\) whose points satisfy the three inequalities simultaneously. In the neural network, the \(x_i\) are the variables, the \(a_i\), \(b_i\) and \(c_i\) are the weights associated with these variables, and \(a_0\), \(b_0\) and \(c_0\) are the biases. This model has been implemented in Python with the Keras library to solve systems of linear inequalities with three real unknowns by graphically representing the elementary solution sets in \(\mathbb{R}^3\).
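The architecture described above can be sketched in plain NumPy: one Heaviside neuron per inequality in the first layer, and an output neuron whose logical AND is itself a Heaviside applied to the sum of the first-layer outputs. The three inequalities used here are assumed purely for illustration (the paper does not list a specific system), and the paper's actual implementation uses Keras rather than raw NumPy:

```python
import numpy as np

def heaviside(z):
    # Heaviside step function: 1 where z >= 0, else 0.
    return (z >= 0).astype(float)

# Hypothetical example system (values assumed for illustration):
#   x1 + x2 + x3 <= 1
#   x1 - x2      <= 0.5
#   x3           <= 0.8
# Each inequality a.x <= a0 is rewritten as a0 - a.x >= 0, so each
# first-layer neuron fires (outputs 1) exactly when its inequality holds.
W = np.array([[-1.0, -1.0, -1.0],
              [-1.0,  1.0,  0.0],
              [ 0.0,  0.0, -1.0]])   # weights: negated coefficients
b = np.array([1.0, 0.5, 0.8])        # biases: independent terms

def satisfies(x):
    # First layer: one Heaviside neuron per inequality.
    h = heaviside(W @ x + b)
    # Output neuron: logical AND of the three indicators,
    # expressible as Heaviside(sum(h) - 3).
    return bool(heaviside(np.array([h.sum() - 3.0]))[0])

print(satisfies(np.array([0.1, 0.2, 0.3])))  # point inside the solution set
print(satisfies(np.array([2.0, 0.0, 0.0])))  # point violating inequality 1
```

The output neuron's threshold of 3 matches the number of inequalities, so it fires only when every first-layer neuron fires, which is exactly the simultaneous-satisfaction condition defining the solution set in \(\mathbb{R}^3\).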
Published in: Journal of Advances in Mathematics and Computer Science
Volume 38, Issue 9, pp. 51-64