
Java DenseLayer Class Code Examples


This article collects typical usage examples of the Java class org.deeplearning4j.nn.conf.layers.DenseLayer. If you are wondering what the DenseLayer class does, or how it is used in practice, the curated examples below should help.



The DenseLayer class belongs to the org.deeplearning4j.nn.conf.layers package. Twenty code examples of the class are shown below, sorted by popularity by default. You can upvote the examples you find useful; your feedback helps surface better Java code examples.
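Before the examples, here is a minimal sketch of how a DenseLayer is typically wired into a network configuration. This is illustrative only: it assumes the deeplearning4j and nd4j dependencies are on the classpath, and the class name and layer sizes (4 → 10 → 3) are arbitrary placeholders, not taken from any of the projects below.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class DenseLayerSketch {
    public static void main(String[] args) {
        // One fully connected hidden layer feeding a softmax output layer.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(12345)
                .weightInit(WeightInit.XAVIER)
                .list()
                .layer(0, new DenseLayer.Builder().nIn(4).nOut(10)
                        .activation(Activation.RELU).build())
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .activation(Activation.SOFTMAX).nIn(10).nOut(3).build())
                .build();
        System.out.println(conf.toJson()); // configurations serialize to JSON/YAML
    }
}
```

Most of the examples below follow this same builder pattern, differing mainly in layer counts, activations, and updater settings.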

Example 1: getDeepDenseLayerNetworkConfiguration

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
/** Returns the network configuration, 2 hidden DenseLayers of size 50.
 */
private static MultiLayerConfiguration getDeepDenseLayerNetworkConfiguration() {
    final int numHiddenNodes = 50;
    return new NeuralNetConfiguration.Builder()
            .seed(seed)
            .iterations(iterations)
            .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
            .learningRate(learningRate)
            .weightInit(WeightInit.XAVIER)
            .updater(Updater.NESTEROVS).momentum(0.9)
            .list()
            .layer(0, new DenseLayer.Builder().nIn(numInputs).nOut(numHiddenNodes)
                    .activation(Activation.TANH).build())
            .layer(1, new DenseLayer.Builder().nIn(numHiddenNodes).nOut(numHiddenNodes)
                    .activation(Activation.TANH).build())
            .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.MSE)
                    .activation(Activation.IDENTITY)
                    .nIn(numHiddenNodes).nOut(numOutputs).build())
            .pretrain(false).backprop(true).build();
}
 
Developer: IsaacChanghau, Project: NeuralNetworksLite, Lines: 22, Source: RegressionMathFunctions.java


Example 2: getConfiguration

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
@Override
protected MultiLayerConfiguration getConfiguration() {
    return new NeuralNetConfiguration.Builder()
            .seed(parameters.getSeed())
            .iterations(parameters.getIterations())
            .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
            .learningRate(parameters.getLearningRate())
            .l2(0.001)
            .list(4)
            .layer(0, new DenseLayer.Builder().nIn(parameters.getInputSize()).nOut(250)
                    .weightInit(WeightInit.XAVIER).updater(Updater.ADAGRAD).activation("relu").build())
            .layer(1, new DenseLayer.Builder().nIn(250).nOut(10)
                    .weightInit(WeightInit.XAVIER).updater(Updater.ADAGRAD).activation("relu").build())
            .layer(2, new DenseLayer.Builder().nIn(10).nOut(250)
                    .weightInit(WeightInit.XAVIER).updater(Updater.ADAGRAD).activation("relu").build())
            .layer(3, new OutputLayer.Builder().nIn(250).nOut(parameters.getInputSize())
                    .weightInit(WeightInit.XAVIER).updater(Updater.ADAGRAD).activation("relu")
                    .lossFunction(LossFunctions.LossFunction.MSE).build())
            .pretrain(false).backprop(true).build();
}
 
Developer: amrabed, Project: DL4J, Lines: 22, Source: AnomalyDetectionModel.java


Example 3: net

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
private static MultiLayerConfiguration net(int nIn, int nOut) {
    return new NeuralNetConfiguration.Builder()
            .seed(42)
            .iterations(1)
            .activation(Activation.RELU)
            .weightInit(WeightInit.XAVIER)
            .learningRate(0.1)
            .regularization(true).l2(1e-4)
            .list(
                    new DenseLayer.Builder().nIn(nIn).nOut(3).build(),
                    new DenseLayer.Builder().nIn(3).nOut(3).build(),
                    new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                            .activation(Activation.SOFTMAX)
                            .nIn(3)
                            .nOut(nOut)
                            .build()
            )
            .build();
}
 
Developer: wmeddie, Project: dl4j-trainer-archetype, Lines: 20, Source: Train.java


Example 4: getOriginalNet

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
public static MultiLayerNetwork getOriginalNet(int seed){
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(seed)
            .weightInit(WeightInit.XAVIER)
            .activation(Activation.TANH)
            .convolutionMode(ConvolutionMode.Same)
            .updater(new Sgd(0.3))
            .list()
            .layer(new ConvolutionLayer.Builder().nOut(3).kernelSize(2,2).stride(1,1).build())
            .layer(new SubsamplingLayer.Builder().kernelSize(2,2).stride(1,1).build())
            .layer(new ConvolutionLayer.Builder().nIn(3).nOut(3).kernelSize(2,2).stride(1,1).build())
            .layer(new DenseLayer.Builder().nOut(64).build())
            .layer(new DenseLayer.Builder().nIn(64).nOut(64).build())
            .layer(new OutputLayer.Builder().nIn(64).nOut(10).lossFunction(LossFunctions.LossFunction.MSE).build())
            .setInputType(InputType.convolutionalFlat(28,28,1))
            .build();


    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();
    return net;
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 23, Source: TestFrozenLayers.java


Example 5: getOriginalGraph

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
public static ComputationGraph getOriginalGraph(int seed){
    ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(seed)
            .weightInit(WeightInit.XAVIER)
            .activation(Activation.TANH)
            .convolutionMode(ConvolutionMode.Same)
            .updater(new Sgd(0.3))
            .graphBuilder()
            .addInputs("in")
            .layer("0", new ConvolutionLayer.Builder().nOut(3).kernelSize(2,2).stride(1,1).build(), "in")
            .layer("1", new SubsamplingLayer.Builder().kernelSize(2,2).stride(1,1).build(), "0")
            .layer("2", new ConvolutionLayer.Builder().nIn(3).nOut(3).kernelSize(2,2).stride(1,1).build(), "1")
            .layer("3", new DenseLayer.Builder().nOut(64).build(), "2")
            .layer("4", new DenseLayer.Builder().nIn(64).nOut(64).build(), "3")
            .layer("5", new OutputLayer.Builder().nIn(64).nOut(10).lossFunction(LossFunctions.LossFunction.MSE).build(), "4")
            .setOutputs("5")
            .setInputTypes(InputType.convolutionalFlat(28,28,1))
            .build();


    ComputationGraph net = new ComputationGraph(conf);
    net.init();
    return net;
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 25, Source: TestFrozenLayers.java


Example 6: testJsonComputationGraph

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
@Test
public void testJsonComputationGraph() {
    //ComputationGraph with a custom layer; check JSON and YAML config actually works...

    ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder().graphBuilder()
                    .addInputs("in").addLayer("0", new DenseLayer.Builder().nIn(10).nOut(10).build(), "in")
                    .addLayer("1", new CustomLayer(3.14159), "0").addLayer("2",
                                    new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT).nIn(10).nOut(10)
                                                    .build(),
                                    "1")
                    .setOutputs("2").pretrain(false).backprop(true).build();

    String json = conf.toJson();
    String yaml = conf.toYaml();

    System.out.println(json);

    ComputationGraphConfiguration confFromJson = ComputationGraphConfiguration.fromJson(json);
    assertEquals(conf, confFromJson);

    ComputationGraphConfiguration confFromYaml = ComputationGraphConfiguration.fromYaml(yaml);
    assertEquals(conf, confFromYaml);
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 24, Source: TestCustomLayers.java


Example 7: checkInitializationFF

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
@Test
public void checkInitializationFF() {
    //Actually create a network with a custom layer; check initialization and forward pass

    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder().list()
                    .layer(0, new DenseLayer.Builder().nIn(9).nOut(10).build()).layer(1, new CustomLayer(3.14159)) //hard-coded nIn/nOut of 10
                    .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT).nIn(10).nOut(11).build())
                    .pretrain(false).backprop(true).build();

    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();

    assertEquals(9 * 10 + 10, net.getLayer(0).numParams());
    assertEquals(10 * 10 + 10, net.getLayer(1).numParams());
    assertEquals(10 * 11 + 11, net.getLayer(2).numParams());

    //Check for exceptions...
    net.output(Nd4j.rand(1, 9));
    net.fit(new DataSet(Nd4j.rand(1, 9), Nd4j.rand(1, 11)));
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 21, Source: TestCustomLayers.java


Example 8: testMultiCNNLayer

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
@Test
public void testMultiCNNLayer() throws Exception {
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                    .optimizationAlgo(OptimizationAlgorithm.LINE_GRADIENT_DESCENT).seed(123).list()
                    .layer(0, new ConvolutionLayer.Builder().nIn(1).nOut(6).weightInit(WeightInit.XAVIER)
                                    .activation(Activation.RELU).build())
                    .layer(1, new LocalResponseNormalization.Builder().build()).layer(2,
                                    new DenseLayer.Builder()
                                                    .nOut(2).build())
                    .layer(3, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                                    .weightInit(WeightInit.XAVIER).activation(Activation.SOFTMAX).nIn(2).nOut(10)
                                    .build())
                    .backprop(true).pretrain(false).setInputType(InputType.convolutionalFlat(28, 28, 1)).build();

    MultiLayerNetwork network = new MultiLayerNetwork(conf);
    network.init();
    DataSetIterator iter = new MnistDataSetIterator(2, 2);
    DataSet next = iter.next();

    network.fit(next);
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 22, Source: LocalResponseTest.java


Example 9: getDenseMLNConfig

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
private static MultiLayerNetwork getDenseMLNConfig(boolean backprop, boolean pretrain) {
    int numInputs = 4;
    int outputNum = 3;
    long seed = 6;

    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder().seed(seed)
                    .updater(new Sgd(1e-3)).l1(0.3).l2(1e-3).list()
                    .layer(0, new org.deeplearning4j.nn.conf.layers.DenseLayer.Builder().nIn(numInputs).nOut(3)
                                    .activation(Activation.TANH).weightInit(WeightInit.XAVIER).build())
                    .layer(1, new org.deeplearning4j.nn.conf.layers.DenseLayer.Builder().nIn(3).nOut(2)
                                    .activation(Activation.TANH).weightInit(WeightInit.XAVIER).build())
                    .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                                    .weightInit(WeightInit.XAVIER).nIn(2).nOut(outputNum).build())
                    .backprop(backprop).pretrain(pretrain).build();

    MultiLayerNetwork model = new MultiLayerNetwork(conf);
    model.init();
    return model;

}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 21, Source: DenseTest.java


Example 10: getGraph

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
private ComputationGraph getGraph(int numLabels, double lambda) {
    Nd4j.getRandom().setSeed(12345);
    ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder().seed(12345)
                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                    .weightInit(WeightInit.DISTRIBUTION).dist(new NormalDistribution(0, 1)).updater(new NoOp())
                    .graphBuilder().addInputs("input1")
                    .addLayer("l1", new DenseLayer.Builder().nIn(4).nOut(5).activation(Activation.RELU).build(),
                                    "input1")
                    .addLayer("lossLayer", new CenterLossOutputLayer.Builder()
                                    .lossFunction(LossFunctions.LossFunction.MCXENT).nIn(5).nOut(numLabels)
                                    .lambda(lambda).activation(Activation.SOFTMAX).build(), "l1")
                    .setOutputs("lossLayer").pretrain(false).backprop(true).build();

    ComputationGraph graph = new ComputationGraph(conf);
    graph.init();

    return graph;
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 19, Source: CenterLossOutputLayerTest.java


Example 11: getIrisMLPSimpleConfig

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
/** Very simple back-prop config set up for Iris.
 * Learning Rate = 0.1
 * No regularization, no Adagrad, no momentum etc. One iteration.
 */
private static MultiLayerConfiguration getIrisMLPSimpleConfig(int[] hiddenLayerSizes,
                Activation activationFunction) {
    NeuralNetConfiguration.ListBuilder lb = new NeuralNetConfiguration.Builder().updater(new Sgd(0.1))
                .seed(12345L).list();

    for (int i = 0; i < hiddenLayerSizes.length; i++) {
        int nIn = (i == 0 ? 4 : hiddenLayerSizes[i - 1]);
        lb.layer(i, new DenseLayer.Builder().nIn(nIn).nOut(hiddenLayerSizes[i]).weightInit(WeightInit.XAVIER)
                        .activation(activationFunction).build());
    }

    lb.layer(hiddenLayerSizes.length,
                    new OutputLayer.Builder(LossFunction.MCXENT).nIn(hiddenLayerSizes[hiddenLayerSizes.length - 1])
                                    .nOut(3).weightInit(WeightInit.XAVIER)
                                    .activation(activationFunction.equals(Activation.IDENTITY) ? Activation.IDENTITY
                                                    : Activation.SOFTMAX)
                                    .build());
    lb.pretrain(false).backprop(true);

    return lb.build();
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 26, Source: BackPropMLPTest.java


Example 12: testJSONBasic

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
@Test
public void testJSONBasic() {
    ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder().seed(12345)
                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                    .weightInit(WeightInit.DISTRIBUTION).dist(new NormalDistribution(0, 1)).updater(new NoOp())
                    .graphBuilder().addInputs("input")
                    .addLayer("firstLayer",
                                    new DenseLayer.Builder().nIn(4).nOut(5).activation(Activation.TANH).build(),
                                    "input")
                    .addLayer("outputLayer",
                                    new OutputLayer.Builder().lossFunction(LossFunctions.LossFunction.MCXENT)
                                                    .activation(Activation.SOFTMAX).nIn(5).nOut(3).build(),
                                    "firstLayer")
                    .setOutputs("outputLayer").pretrain(false).backprop(true).build();

    String json = conf.toJson();
    ComputationGraphConfiguration conf2 = ComputationGraphConfiguration.fromJson(json);

    assertEquals(json, conf2.toJson());
    assertEquals(conf, conf2);
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 22, Source: ComputationGraphConfigurationTest.java


Example 13: getNetwork

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
private MultiLayerNetwork getNetwork() {
    int nIn = 5;
    int nOut = 6;

    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder().seed(12345).l1(0.01).l2(0.01)
            .updater(new Sgd(0.1)).activation(Activation.TANH).weightInit(WeightInit.XAVIER).list()
            .layer(0, new DenseLayer.Builder().nIn(nIn).nOut(20).build())
            .layer(1, new DenseLayer.Builder().nIn(20).nOut(30).build()).layer(2, new OutputLayer.Builder()
                    .lossFunction(LossFunctions.LossFunction.MSE).nIn(30).nOut(nOut).build())
            .build();

    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();

    return net;
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 17, Source: ModelGuesserTest.java


Example 14: testWriteCGModel

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
@Test
public void testWriteCGModel() throws Exception {
    ComputationGraphConfiguration config = new NeuralNetConfiguration.Builder()
                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT).updater(new Sgd(0.1))
                    .graphBuilder().addInputs("in")
                    .addLayer("dense", new DenseLayer.Builder().nIn(4).nOut(2).build(), "in").addLayer("out",
                                    new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT).nIn(2).nOut(3)
                                                    .build(),
                                    "dense")
                    .setOutputs("out").pretrain(false).backprop(true).build();

    ComputationGraph cg = new ComputationGraph(config);
    cg.init();

    File tempFile = File.createTempFile("tsfs", "fdfsdf");
    tempFile.deleteOnExit();

    ModelSerializer.writeModel(cg, tempFile, true);

    ComputationGraph network = ModelSerializer.restoreComputationGraph(tempFile);

    assertEquals(network.getConfiguration().toJson(), cg.getConfiguration().toJson());
    assertEquals(cg.params(), network.params());
    assertEquals(cg.getUpdater().getStateViewArray(), network.getUpdater().getStateViewArray());
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 26, Source: ModelSerializerTest.java


Example 15: testWriteCGModelInputStream

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
@Test
public void testWriteCGModelInputStream() throws Exception {
    ComputationGraphConfiguration config = new NeuralNetConfiguration.Builder()
                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT).updater(new Sgd(0.1))
                    .graphBuilder().addInputs("in")
                    .addLayer("dense", new DenseLayer.Builder().nIn(4).nOut(2).build(), "in").addLayer("out",
                                    new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT).nIn(2).nOut(3)
                                                    .build(),
                                    "dense")
                    .setOutputs("out").pretrain(false).backprop(true).build();

    ComputationGraph cg = new ComputationGraph(config);
    cg.init();

    File tempFile = File.createTempFile("tsfs", "fdfsdf");
    tempFile.deleteOnExit();

    ModelSerializer.writeModel(cg, tempFile, true);
    FileInputStream fis = new FileInputStream(tempFile);

    ComputationGraph network = ModelSerializer.restoreComputationGraph(fis);

    assertEquals(network.getConfiguration().toJson(), cg.getConfiguration().toJson());
    assertEquals(cg.params(), network.params());
    assertEquals(cg.getUpdater().getStateViewArray(), network.getUpdater().getStateViewArray());
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 27, Source: ModelSerializerTest.java


Example 16: testSmallAmountOfData

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
@Test
public void testSmallAmountOfData() {
    //Idea: Test spark training where some executors don't get any data
    //in this case: by having fewer examples (2 DataSets) than executors (local[*])

    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder().updater(new RmsProp())
                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT).list()
                    .layer(0, new org.deeplearning4j.nn.conf.layers.DenseLayer.Builder().nIn(nIn).nOut(3)
                                    .activation(Activation.TANH).build())
                    .layer(1, new org.deeplearning4j.nn.conf.layers.OutputLayer.Builder(
                                    LossFunctions.LossFunction.MSE).nIn(3).nOut(nOut).activation(Activation.SOFTMAX)
                                                    .build())
                    .build();

    SparkDl4jMultiLayer sparkNet = new SparkDl4jMultiLayer(sc, conf,
                    new ParameterAveragingTrainingMaster(true, numExecutors(), 1, 10, 1, 0));

    Nd4j.getRandom().setSeed(12345);
    DataSet d1 = new DataSet(Nd4j.rand(1, nIn), Nd4j.rand(1, nOut));
    DataSet d2 = new DataSet(Nd4j.rand(1, nIn), Nd4j.rand(1, nOut));

    JavaRDD<DataSet> rddData = sc.parallelize(Arrays.asList(d1, d2));

    sparkNet.fit(rddData);

}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 27, Source: TestSparkMultiLayerParameterAveraging.java


Example 17: testCifarDataSetIteratorReset

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
@Ignore // use when checking cifar dataset iterator
@Test
public void testCifarDataSetIteratorReset() {
    int epochs = 2;
    Nd4j.getRandom().setSeed(12345);
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                    .weightInit(WeightInit.XAVIER).seed(12345L).list()
                    .layer(0, new DenseLayer.Builder().nIn(400).nOut(50).activation(Activation.RELU).build())
                    .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                                    .activation(Activation.SOFTMAX).nIn(50).nOut(10).build())
                    .pretrain(false).backprop(true)
                    .inputPreProcessor(0, new CnnToFeedForwardPreProcessor(20, 20, 1)).build();

    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();
    net.setListeners(new ScoreIterationListener(1));

    MultipleEpochsIterator ds =
                    new MultipleEpochsIterator(epochs, new CifarDataSetIterator(10, 20, new int[] {20, 20, 1}));
    net.fit(ds);
    assertEquals(epochs, ds.epochs);
    assertEquals(2, ds.batch);
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 24, Source: MultipleEpochsIteratorTest.java


Example 18: testRemoteFull

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
@Test
@Ignore
public void testRemoteFull() throws Exception {
    //Use this in conjunction with startRemoteUI()

    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT).list()
                    .layer(0, new DenseLayer.Builder().activation(Activation.TANH).nIn(4).nOut(4).build())
                    .layer(1, new OutputLayer.Builder().lossFunction(LossFunctions.LossFunction.MCXENT)
                                    .activation(Activation.SOFTMAX).nIn(4).nOut(3).build())
                    .pretrain(false).backprop(true).build();

    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();
    StatsStorageRouter ssr = new RemoteUIStatsStorageRouter("http://localhost:9000");
    net.setListeners(new StatsListener(ssr), new ScoreIterationListener(1));

    DataSetIterator iter = new IrisDataSetIterator(150, 150);

    for (int i = 0; i < 500; i++) {
        net.fit(iter);
        //            Thread.sleep(100);
        Thread.sleep(100);
    }

}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 27, Source: TestRemoteReceiver.java


Example 19: test

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
@Test
public void test() throws IOException {

    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder().list()
                    .layer(0, new DenseLayer.Builder().nIn(10).nOut(10).build())
                    .layer(1, new OutputLayer.Builder().nIn(10).nOut(10).build()).build();

    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();


    MultiLayerNetwork net2 =
                    new TransferLearning.Builder(net)
                                    .fineTuneConfiguration(
                                                    new FineTuneConfiguration.Builder().updater(new Sgd(0.01)).build())
                                    .setFeatureExtractor(0).build();

    File f = Files.createTempFile("dl4jTestTransferStatsCollection", "bin").toFile();
    f.delete();
    net2.setListeners(new StatsListener(new FileStatsStorage(f)));

    //Previously: failed on frozen layers
    net2.fit(new DataSet(Nd4j.rand(8, 10), Nd4j.rand(8, 10)));

    f.deleteOnExit();
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 27, Source: TestTransferStatsCollection.java


Example 20: reconf

import org.deeplearning4j.nn.conf.layers.DenseLayer; // import the required package/class
public void reconf() {
    int seed = 123;
    double learningRate = 0.01;
    int numInputs = 2;
    int numOutputs = 2;
    int numHiddenNodes = 5;

    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(seed)
            .iterations(1)
            .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
            .learningRate(learningRate)
            .updater(Updater.NESTEROVS).momentum(0.9)
            .list()
            .layer(0, new DenseLayer.Builder().nIn(numInputs).nOut(numHiddenNodes)
                    .weightInit(WeightInit.XAVIER)
                    .activation("relu")
                    .build())
            .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                    .weightInit(WeightInit.XAVIER)
                    .activation("softmax")
                    .nIn(numHiddenNodes).nOut(numOutputs).build())
            .pretrain(false).backprop(true).build();

    model = new MultiLayerNetwork(conf);

    System.out.println("Ready :-)");

    if (dirty != null) {
        dirty.run();
    }
}
 
Developer: datathings, Project: greycat, Lines: 35, Source: NeuralNetAttribute.java



Note: The org.deeplearning4j.nn.conf.layers.DenseLayer examples in this article were compiled from source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects; copyright remains with the original authors. Please consult each project's license before using or distributing the code, and do not republish without permission.

