Back Propagation

Joined: Jul 17, 2023 · Messages: 1 · Reaction score: 0
Hi, I'd like to know why my backpropagation function keeps producing the same pattern of weights for every neuron within a layer (except the first layer, which differs because it uses non-linear activation functions).
This is the sheet with the weights:

I don't know how to post the code here; if it's needed, I'll try to figure out how.
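Without seeing the code, one common cause of this symptom is symmetric weight initialization: if every neuron in a layer starts with identical weights (e.g. all zeros or a constant), each neuron computes the same output and receives the same gradient, so their weights stay identical after every update. Here is a minimal, hypothetical NumPy sketch of a two-layer network (not the poster's actual code) that demonstrates the effect:

```python
import numpy as np

def forward_backward(W1, W2, x, y):
    # Forward pass: sigmoid hidden layer, linear output.
    z1 = W1 @ x
    a1 = 1.0 / (1.0 + np.exp(-z1))        # sigmoid activation
    y_hat = W2 @ a1
    # Backward pass for squared error 0.5 * (y_hat - y)^2.
    d_out = y_hat - y
    dW2 = np.outer(d_out, a1)
    d_hidden = (W2.T @ d_out) * a1 * (1 - a1)
    dW1 = np.outer(d_hidden, x)
    return dW1, dW2

rng = np.random.default_rng(0)
x = rng.normal(size=3)
y = np.array([1.0])

# Constant init: every hidden neuron is an identical clone, so every
# row of the gradient dW1 comes out the same and the neurons can
# never differentiate, no matter how long you train.
W1_const = np.full((4, 3), 0.5)
W2_const = np.full((1, 4), 0.5)
dW1_const, _ = forward_backward(W1_const, W2_const, x, y)
print(dW1_const)   # all 4 rows identical

# Small random init breaks the symmetry: each neuron now gets a
# different gradient and the rows of dW1 differ.
W1_rand = rng.normal(scale=0.1, size=(4, 3))
W2_rand = rng.normal(scale=0.1, size=(1, 4))
dW1_rand, _ = forward_backward(W1_rand, W2_rand, x, y)
print(dW1_rand)    # rows differ
```

If your weights spreadsheet shows identical rows within each layer, checking how the weights are initialized would be the first thing to try; initializing each weight with a small random value is the standard fix.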
 
