
0 votes
928 views
in Technique by (71.8m points)

neural network - Logistic regression implementation using NN

I have written this logistic regression code.

import numpy as np  # needed for np.exp below

# `data` is assumed to be an iterable of [x0, x1, label] rows (not shown in the post)

# training loop
w0 = -1   # initial weights
w1 = 1
b = 1     # initial bias
a = 0.5   # learning rate
epochs = 50000
for i in range(epochs):
  # loop over all the points to calculate the gradient
  dw0 = 0
  dw1 = 0
  db = 0

  for point in data:

    x0 = point[0]
    x1 = point[1]
    y = point[-1]

    z = (w0 * x0) + (w1 * x1) + b
    # print(z)
    a = 1/(1 + np.exp(-z))  # sigmoid
    # print(a)

    dz = a - y  # dz = dL/dz (derivative of loss function wrt z)

    dw0 += dz * x0
    dw1 += dz * x1
    db += dz

  dw0 /= len(data)  # average the gradients over all data points
  dw1 /= len(data)
  db /= len(data)

  # gradient descent step
  w0 = w0 - (a * dw0)
  w1 = w1 - (a * dw1)
  b = b - (a * db)

print(w0, w1, b)
print()
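For reference (this derivation is not in the original post, but it is the standard one), the dz = a - y line comes from differentiating the binary cross-entropy loss through the sigmoid:

L = -(y*log(a) + (1 - y)*log(1 - a)),  with  a = 1/(1 + exp(-z))

dL/da = (a - y) / (a*(1 - a))
da/dz = a*(1 - a)
dL/dz = (dL/da) * (da/dz) = a - y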

But the output is unaffected by a (the learning rate). I changed a to 6 and still got the same values for w0, w1 and b after 50,000 epochs. I am not sure, but I think this is because of the gradients (dw0, dw1, db)? Are they too small and not changing? Or is there some other issue with the implementation? How can I improve it? Also, what is the best way to choose initial values for w0, w1 and b?
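A likely cause, visible in the code itself: a is initialized to 0.5 as the learning rate, but the inner loop overwrites it with the sigmoid activation (a = 1/(1 + np.exp(-z))), so the update step uses the last activation computed in the epoch rather than the learning rate, and the initial value of a is never used at all. That would explain why setting a to 6 changes nothing. A minimal corrected sketch, with the learning rate renamed to lr and an illustrative toy data array (neither the name nor the data is from the original post):

import numpy as np

# illustrative toy dataset: rows of [x0, x1, label]
data = np.array([
    [0.5, 1.5, 1],
    [1.0, 1.0, 1],
    [1.5, 0.5, 0],
    [3.0, 0.5, 0],
])

# logistic regression has a convex loss, so zeros (or small
# random values) are a perfectly good starting point
w0, w1, b = 0.0, 0.0, 0.0
lr = 0.5          # learning rate, no longer shadowed by the activation
epochs = 50000

for i in range(epochs):
    dw0 = dw1 = db = 0.0
    for x0, x1, y in data:
        z = w0 * x0 + w1 * x1 + b
        a = 1 / (1 + np.exp(-z))   # sigmoid activation
        dz = a - y                 # dL/dz for cross-entropy + sigmoid
        dw0 += dz * x0
        dw1 += dz * x1
        db += dz
    dw0 /= len(data)               # average over all data points
    dw1 /= len(data)
    db /= len(data)
    w0 -= lr * dw0                 # gradient descent step uses lr
    w1 -= lr * dw1
    b -= lr * db

print(w0, w1, b)

With the shadowing removed, changing lr should visibly change how fast (and whether) the weights converge. As for initial values: because the logistic regression loss is convex, initialization mainly affects convergence speed, not the final solution, so zeros or small random numbers are both common choices.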

Question from: https://stackoverflow.com/questions/66050981/logistic-regression-implementation-using-nn


1 Answer

0 votes
by (71.8m points)
Waiting for answers
