A real math problem.

Joined
Mar 31, 2023
Messages
95
Reaction score
8
I have a problem that I've been trying to solve for a very long time. This problem can be solved in any programming language or even on a piece of paper or a whiteboard; it's a linear equation problem.

This calculation finds the two parameters of a neuron F = ae + bf, without a bias term. It's not really practical, since training this way on a huge dataset would be inconceivable. The calculation shown below is for a neuron with two inputs, one output, two parameters, and only two data points. Here's an example of usage: {'inputs': [1, 2], 'targets': 0}, {'inputs': [3, 4], 'targets': 1}, so a = 1, b = 2, c = 3, d = 4, g = 0, and h = 1.
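For readers who can't see the screenshot: with those definitions the system is ae + bf = g and ce + df = h, and the unknowns e and f have a closed-form solution. A minimal sketch using Cramer's rule (the function name is my own, and this is the standard textbook formula, not necessarily the exact derivation in the screenshot):

```python
def solve_2x2(a, b, c, d, g, h):
    """Solve a*e + b*f = g and c*e + d*f = h for (e, f) by Cramer's rule."""
    det = a * d - b * c
    if det == 0:
        raise ValueError("system is singular: no unique solution")
    e = (g * d - b * h) / det
    f = (a * h - g * c) / det
    return e, f

# Example from the post: inputs [1, 2] -> target 0, inputs [3, 4] -> target 1
e, f = solve_2x2(1, 2, 3, 4, 0, 1)
print(e, f)  # 1.0 -0.5
# Check: 1*1 + 2*(-0.5) = 0 and 3*1 + 4*(-0.5) = 1
```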

[Screenshot of the hand-written derivation: Capture d’écran 2024-01-02 130908.png]


I am looking to create an iterative version of this calculation that would let us train our neurons with as many data points, inputs, outputs, and hidden-layer neurons as we want.
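The standard iterative generalization of this is gradient descent on the squared error; it is not the poster's derivation, just the usual way to fit any number of weights to any number of data points. A sketch in plain Python (the names `train`, `lr`, and `epochs` are my own):

```python
def train(data, lr=0.01, epochs=5000):
    """Fit weights w so that sum(w_i * x_i) matches each target,
    by stochastic gradient descent on the squared error.
    Works for any number of inputs and any number of data points."""
    n = len(data[0]['inputs'])
    w = [0.0] * n
    for _ in range(epochs):
        for point in data:
            x, t = point['inputs'], point['targets']
            pred = sum(wi * xi for wi, xi in zip(w, x))
            err = pred - t
            for i in range(n):
                w[i] -= lr * err * x[i]   # dE/dw_i = err * x_i
    return w

data = [{'inputs': [1, 2], 'targets': 0}, {'inputs': [3, 4], 'targets': 1}]
w = train(data)
print(w)  # converges toward e = 1, f = -0.5
```

Unlike the closed-form solve, nothing here depends on having exactly two data points or two weights, which is what makes it usable as a training loop.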

Thanks to everyone reading this message and sharing some of their ideas!

Phro0244

PS: I did this calculation myself; you won't find anything like it online.
 
Joined
Sep 21, 2022
Messages
122
Reaction score
15
There are iterative methods to solve simultaneous equations.

Jacobi iteration is simple but doesn't work well.

Gauss-Seidel iteration is not much better.

Gaussian elimination is the way a human would do it, so writing a function would not be difficult.

Jacobi iteration performed so badly on your example that I thought my program had a bug in it.
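On the Jacobi result: Jacobi iteration is only guaranteed to converge when the coefficient matrix is strictly diagonally dominant, and the example matrix [[1, 2], [3, 4]] is not, so divergence there is expected rather than a bug. Gaussian elimination has no such restriction; a minimal sketch with partial pivoting (function name is my own):

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting.
    A is a list of row lists; A and b are modified in place."""
    n = len(A)
    for col in range(n):
        # swap in the row with the largest pivot, for numerical stability
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            m = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= m * A[col][c]
            b[r] -= m * b[col]
    # back-substitution on the resulting upper-triangular system
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(A[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (b[r] - s) / A[r][r]
    return x

print(gauss_solve([[1, 2], [3, 4]], [0, 1]))  # approximately [1.0, -0.5]
```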
 
Hello WhiteCube,

I have tested your Gaussian elimination method, but I don't understand how this technique can handle an arbitrarily large number of data points when the number of unknowns stays the same.

For example, in our scenario [(a, b), g] = data point 1 and [(c, d), h] = data point 2, but there are still only e and f, and that never changes no matter how many data points we have.
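This is exactly the situation least squares handles: with more data points than unknowns, the system usually has no exact solution, so instead we pick the e and f that minimize the total squared error. A sketch using NumPy (assuming NumPy is acceptable; the third data point below is made up and deliberately inconsistent with the first two):

```python
import numpy as np

# Three data points but still only two unknowns e and f.
X = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
targets = np.array([0.0, 1.0, 3.0])

# Solves min ||X w - targets||^2; works for any number of rows (data points).
w, residuals, rank, sv = np.linalg.lstsq(X, targets, rcond=None)
print(w)  # best-fit e, f in the least-squares sense
```

The number of columns of X (the unknowns) stays fixed while the number of rows (the data points) can grow freely, which is why this scales where exact elimination does not.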

Thank you for your attention. Best regards.

Phro0244
 
Gaussian elimination only applies when the number of data points equals the number of parameters.

I don't know how linear equations relate to your neurons.

I'm out of my depth when it comes to neural networks, sorry I can't help you out there.
 
I can explain briefly if you really want to help:
The connection between linear equations and neurons is often seen in the context of artificial neural networks (ANNs) within the field of machine learning.

In neural networks, neurons are mathematical units that receive input, apply a transformation (usually involving linear equations and activation functions), and produce an output. Linear equations play a crucial role in the transformation step.

Here's a simplified explanation:
  1. Neurons in a Neural Network:
    • Each neuron receives input values (features) from the previous layer or directly from the input data.
    • These input values are often assigned weights, which can be seen as coefficients in a linear equation.
  2. Linear Transformation:
    • The neuron performs a linear combination of its inputs and weights, usually represented by a weighted sum.
    • This process is essentially a linear equation: output = w1 * input1 + w2 * input2 + ... + wn * inputn + bias, where wi are weights, inputi are input values, and bias is a constant term.
  3. Activation Function:
    • The result of the linear transformation is then passed through an activation function.
    • The activation function introduces non-linearity to the model, allowing it to learn complex patterns and relationships in the data.
  4. Learning and Adaptation:
    • During the training process, the weights in the linear equations are adjusted based on the error between the predicted output and the actual output.
    • This adjustment allows the neural network to learn and adapt to the underlying patterns in the training data.
The use of linear equations within neurons in a neural network enables the network to model and learn from complex relationships in data, making them a fundamental component in the broader field of machine learning.
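The steps above can be tied together in a few lines. A sketch of a single neuron's forward pass, with sigmoid chosen as an illustrative activation function (the names and values are my own):

```python
import math

def neuron(inputs, weights, bias):
    """Weighted sum of inputs (the linear step), then a sigmoid
    activation (the non-linear step)."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

out = neuron([1.0, 2.0], [0.5, -0.25], bias=0.0)
print(out)  # z = 0.5*1 + (-0.25)*2 + 0 = 0, and sigmoid(0) = 0.5
```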
 
The only things I know about neural networks come from Wired magazine.

For many years it was considered a dead-end approach, until one person, working with his students, came up with a way to make the training more efficient.

The method was sold to one of the big tech companies for squillions, and NNs became popular again.

Do you have the secret sauce?
 
