 

Constant loss value in Adam optimization method

I have neural network code written in PyTorch. When training with the Adam optimizer, the loss value does not change during the loop, while with the LBFGS optimizer everything is fine and the loss decreases normally. How can I get rid of this issue?

    for epo in range(epo_adam):
        model.train()
        optimizer_adam.zero_grad()
        loss, momentum_loss, loss_data, loss_BC, continuity_loss = adop_loss_Weight(
            model, x, y, z, u_exact, v_exact, w_exact, p_exact,
            x_b, y_b, z_b, u_b, v_b, w_b, p_b)
        loss.backward()
        if epo % 500 == 0:
            print(f'Epoch Adam {epo}, Total Loss: {loss.item():.10f}')
        if loss.item() <= 0.15:
            print("Optimization method is switching to L-BFGS . . .")
            break


    for epoch in range(epochs):
        model.train()
        loss = optimizer.step(closure)

        if epoch % 20 == 0:
            print(f'Epoch L-BFGS {epoch}, Total Loss: {loss.item():.5f}')
            #print(f'The highest Loss is: {max(momentum_loss.item(), continuity_loss.item(), loss_data.item(), loss_bc.item()):.6f}')
            #print(time.time())

I searched a lot but couldn't find any solution. I don't think a constant loss value with Adam is reasonable!

asked Mar 12 '26 by arezayan

1 Answer

The problem here is that the Adam training loop never calls optimizer_adam.step(), so the weights are never updated after the gradients are calculated, and the loss stays constant. The LBFGS loop works because optimizer.step(closure) re-evaluates the loss and performs the parameter update internally; with Adam you have to call step() yourself after loss.backward().
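Here is a minimal sketch of the corrected Adam loop, reusing the names from the question (model, optimizer_adam, adop_loss_Weight, and its arguments are assumed to be defined as in your setup):

    for epo in range(epo_adam):
        model.train()
        optimizer_adam.zero_grad()   # clear gradients from the previous iteration
        loss, momentum_loss, loss_data, loss_BC, continuity_loss = adop_loss_Weight(
            model, x, y, z, u_exact, v_exact, w_exact, p_exact,
            x_b, y_b, z_b, u_b, v_b, w_b, p_b)
        loss.backward()              # compute gradients
        optimizer_adam.step()        # apply the update -- this call was missing
        if epo % 500 == 0:
            print(f'Epoch Adam {epo}, Total Loss: {loss.item():.10f}')
        if loss.item() <= 0.15:
            print("Optimization method is switching to L-BFGS . . .")
            break

The order matters: zero_grad() before backward() so gradients don't accumulate across iterations, and step() after backward() so the update uses the freshly computed gradients.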

answered Mar 13 '26 by Robert Long


