So, you are talking about Adam, if I'm not wrong? It's often written as ADAM, but it isn't actually an acronym; the name comes from "adaptive moment estimation". OK, so it's an optimization algorithm built on gradient descent.
There's 1) gradient descent with momentum and 2) RMSProp.
Now, Adam = 1) + 2)
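To make that concrete, here is a minimal Python sketch of one Adam step, combining the momentum (first-moment) and RMSProp (second-moment) averages with bias correction. The function name, the scalar setting, and the hyperparameter defaults are my own choices for illustration, not from your code:

```python
import math

def adam_update(w, dw, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step for a scalar parameter w with gradient dw at step t (t starts at 1)."""
    # Momentum part: exponentially weighted average of the gradient (first moment)
    m = beta1 * m + (1 - beta1) * dw
    # RMSProp part: exponentially weighted average of the squared gradient (second moment)
    v = beta2 * v + (1 - beta2) * dw * dw
    # Bias correction, since m and v start at zero
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Combined update
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)
    return w, m, v
```

For example, minimizing f(w) = w² (so dw = 2w) by calling `adam_update` in a loop drives w toward 0. In your real code this runs once per parameter (W, b, ...) per iteration, with m, v kept per parameter.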
So, let me ask about your model. How have you built it? Do you have access to the backward derivatives? If you've coded them, they're denoted by dB, dV, etc. in the backpropagation.
I'll need access to those. In other words, I'll give you the code for Adam and you'll plug it in the middle like a box, with inputs and outputs.
Please make things clear so I can follow. Basically, explain to me how you've done the forward passes, the backward passes, and the derivatives. That's all.
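Just so we're using the same convention, here is the kind of forward/backward structure I mean, sketched in Python for a one-parameter linear model with squared-error loss. The function names and the dW/db gradient naming are placeholders; your model and notation may differ:

```python
def forward(w, b, x):
    # Forward pass: model prediction
    return w * x + b

def backward(w, b, x, y):
    # Forward pass first, caching the prediction
    y_hat = forward(w, b, x)
    # Squared-error loss: L = (y_hat - y)^2
    loss = (y_hat - y) ** 2
    # Backward pass via the chain rule
    dy_hat = 2 * (y_hat - y)   # dL/dy_hat
    dw = dy_hat * x            # dL/dw
    db = dy_hat                # dL/db
    return loss, dw, db
```

If you can show me your equivalents of `dw` and `db`, those are exactly the inputs the Adam box would consume.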
Thank you!
I'm a research assistant in a machine learning lab. I've coded ML algorithms from scratch in MATLAB, Octave, and Python. You should also tell me your platform; otherwise, I'll give you the pseudocode and explain it in detail.
The ball is in your court now.