# DeepLocalize: Fault Localization for Deep Neural Networks

Open-source name (OpenSource Name): Wardat-ISU/DeepLocalize
Open-source URL (OpenSource Url): https://github.com/Wardat-ISU/DeepLocalize
Open-source language (OpenSource Language): Python 99.8%

## Our Tool

The simplest way to build a training model is to start with a `Sequential` model:

```python
model = Sequential()
```

Layers are then added one at a time with `model.add()`. The `Dense` layer takes four arguments: `num_inputs` (number of input units), `num_outputs` (number of output units), `lr_rate` (learning rate), and `name` (name of the layer). Each activation function is added as its own layer:

```python
lr = 0.01
model.add(Dense(num_inputs=100, num_outputs=64, lr_rate=lr, name='FC1'))
model.add(ReLu())
model.add(Dense(num_inputs=64, num_outputs=10, lr_rate=lr, name='FC2'))
model.add(softmax())
```

Once you have finished building your model, you can compile it:

```python
model.compile(loss='categorical_crossentropy',
              optimizer='sgd',
              metrics=['accuracy'])
```

The core principle of our tool is to keep the training model simple while inserting instrumentation into training:

```python
model.fit(x_train, y_train, epochs=5, batch_size=32)
```

You can then start training a new model from a Terminal (macOS/Linux) or cmd (Windows).
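The layer API above is this project's own; as a rough, self-contained sketch of what a `Dense` layer with that four-argument constructor might look like (the forward/backward bodies here are illustrative assumptions, not the tool's actual implementation):

```python
import numpy as np

class Dense:
    """Minimal fully connected layer matching the constructor described
    above: num_inputs, num_outputs, lr_rate (learning rate), and name.
    Illustrative only -- not DeepLocalize's actual code."""

    def __init__(self, num_inputs, num_outputs, lr_rate, name):
        self.lr_rate = lr_rate
        self.name = name
        rng = np.random.default_rng(0)
        # Small random weights, zero biases.
        self.W = rng.normal(0.0, 0.01, size=(num_inputs, num_outputs))
        self.b = np.zeros(num_outputs)

    def forward(self, x):
        self.x = x                      # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        # Propagate the gradient, then apply a plain SGD update
        # using this layer's own learning rate.
        grad_in = grad_out @ self.W.T
        self.W -= self.lr_rate * (self.x.T @ grad_out)
        self.b -= self.lr_rate * grad_out.sum(axis=0)
        return grad_in

layer = Dense(num_inputs=100, num_outputs=64, lr_rate=0.01, name='FC1')
out = layer.forward(np.ones((32, 100)))
print(out.shape)  # (32, 64)
```

Keeping the learning rate per layer, as the constructor suggests, lets each `Dense` update itself during `backward` without a separate optimizer object.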
## Our Callback Method

To use our callback, add it as a subclass in your `keras/callbacks.py` file. The core principle of our callback is to give a view of the model's internal states and statistics during training. You can then pass the callback to `model.fit()`:

```python
callback = keras.callbacks.DeepLocalize(inputs, outputs, layer_number, batch_size, startTime)
model = keras.models.Sequential()
model.add(keras.layers.Dense(64))
model.add(keras.layers.Activation(activations.relu))
model.compile(keras.optimizers.SGD(), loss='mse')
model.fit(np.arange(100).reshape(5, 20), np.zeros(5), epochs=10, batch_size=1,
          callbacks=[callback], verbose=0)
```

## Prerequisites

Version numbers below are of confirmed working releases for this project.
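As a framework-agnostic illustration of the callback principle described above (inspecting internal statistics at the end of each batch to flag a faulty layer), here is a minimal sketch; the class name, hook signature, and threshold are assumptions for illustration, not the DeepLocalize callback itself:

```python
import numpy as np

class FaultMonitor:
    """Illustrative batch-end hook: records a layer whose outputs contain
    non-finite values or grow past a threshold. Not DeepLocalize's
    actual callback -- a hypothetical sketch of the idea."""

    def __init__(self, threshold=1e3):
        self.threshold = threshold
        self.faults = []            # (batch, layer name, symptom) tuples

    def on_batch_end(self, batch, layer_outputs):
        # layer_outputs: mapping of layer name -> activation array
        for name, values in layer_outputs.items():
            if not np.all(np.isfinite(values)):
                self.faults.append((batch, name, 'non-finite values'))
            elif np.abs(values).max() > self.threshold:
                self.faults.append((batch, name, 'exploding values'))

monitor = FaultMonitor()
monitor.on_batch_end(0, {'FC1': np.array([1.0, 2.0]),
                         'FC2': np.array([np.nan, 0.5])})
print(monitor.faults)  # [(0, 'FC2', 'non-finite values')]
```

A real Keras callback would implement the same check inside `on_train_batch_end`, reading the layer activations collected by the instrumentation.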
## BibTeX Reference

If you find this paper useful in your research, please consider citing: