Neural network weights are initialized randomly, so two freshly created models will generally not make exactly the same predictions. That changes if they are initialized with the same weights: if you set the random seed for the weight initializer, the results will be identical:
self.dense1 = tf.keras.layers.Dense(4, activation=tf.nn.relu,
    kernel_initializer=tf.initializers.GlorotUniform(seed=42))
Full code:
import tensorflow as tf

# Subclassed model with seeded initializers
class MyModel(tf.keras.Model):
    def __init__(self):
        super(MyModel, self).__init__()
        self.dense1 = tf.keras.layers.Dense(4, activation=tf.nn.relu,
            kernel_initializer=tf.initializers.GlorotUniform(seed=42))
        self.dense2 = tf.keras.layers.Dense(5, activation=tf.nn.softmax,
            kernel_initializer=tf.initializers.GlorotUniform(seed=42))

    def call(self, inputs):
        x = self.dense1(inputs)
        return self.dense2(x)

model = MyModel()

# Functional model with the same architecture and the same seeds
inputs = tf.keras.Input(shape=(3,))
x = tf.keras.layers.Dense(4, activation=tf.nn.relu,
    kernel_initializer=tf.initializers.GlorotUniform(seed=42))(inputs)
outputs = tf.keras.layers.Dense(5, activation=tf.nn.softmax,
    kernel_initializer=tf.initializers.GlorotUniform(seed=42))(x)
model_fun = tf.keras.Model(inputs=inputs, outputs=outputs)

model_fun.summary()
model.build((1, 3))
model.summary()

# Use float inputs so the dtypes match the float32 kernels
x = model_fun(tf.constant([[1.0, 2.0, 3.0]]))
y = model(tf.constant([[1.0, 2.0, 3.0]]))
assert (x.numpy() == y.numpy()).all()
Both models produce the same prediction:

[[0.74651927 0.00897978 0.04163173 0.00992385 0.1929454 ]]
[[0.74651927 0.00897978 0.04163173 0.00992385 0.1929454 ]]