
Java Activation Class Code Examples


This article collects typical usage examples of the Java class org.nd4j.linalg.activations.Activation. If you have been wondering what the Activation class is for, how to use it, or where to find examples of it, the curated class examples below may help.



The Activation class belongs to the org.nd4j.linalg.activations package. Twenty code examples of the class are shown below, sorted by popularity by default.
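Before the examples, a minimal self-contained sketch of the pattern they all rely on: Activation is an enum whose constants resolve to activation-function implementations via getActivationFunction(). The SketchActivation enum below is a hypothetical stand-in (not the real ND4J class, which returns IActivation objects operating on INDArrays) to illustrate the enum-to-function mapping:

```java
import java.util.function.DoubleUnaryOperator;

// Hypothetical stand-in for org.nd4j.linalg.activations.Activation (not the real ND4J enum):
// each constant maps to an activation-function implementation, mirroring what the
// examples below retrieve via Activation.XXX.getActivationFunction().
enum SketchActivation {
    RELU(x -> Math.max(0.0, x)),
    TANH(Math::tanh),
    SIGMOID(x -> 1.0 / (1.0 + Math.exp(-x))),
    IDENTITY(x -> x);

    private final DoubleUnaryOperator fn;

    SketchActivation(DoubleUnaryOperator fn) {
        this.fn = fn;
    }

    // Mirrors the getActivationFunction() accessor used throughout the examples
    DoubleUnaryOperator getActivationFunction() {
        return fn;
    }
}

class ActivationSketch {
    public static void main(String[] args) {
        DoubleUnaryOperator relu = SketchActivation.RELU.getActivationFunction();
        System.out.println(relu.applyAsDouble(-3.0)); // negative inputs clamp to 0.0
        System.out.println(relu.applyAsDouble(2.5));  // positive inputs pass through
    }
}
```

In the real API, passing an Activation constant to a layer builder's .activation(...) method (as in the examples below) configures that layer's nonlinearity, while .getActivationFunction() yields the underlying IActivation implementation.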

Example 1: getDeepDenseLayerNetworkConfiguration

import org.nd4j.linalg.activations.Activation; // import the required package/class
/** Returns the network configuration, 2 hidden DenseLayers of size 50.
 */
private static MultiLayerConfiguration getDeepDenseLayerNetworkConfiguration() {
    final int numHiddenNodes = 50;
    return new NeuralNetConfiguration.Builder()
            .seed(seed)
            .iterations(iterations)
            .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
            .learningRate(learningRate)
            .weightInit(WeightInit.XAVIER)
            .updater(Updater.NESTEROVS).momentum(0.9)
            .list()
            .layer(0, new DenseLayer.Builder().nIn(numInputs).nOut(numHiddenNodes)
                    .activation(Activation.TANH).build())
            .layer(1, new DenseLayer.Builder().nIn(numHiddenNodes).nOut(numHiddenNodes)
                    .activation(Activation.TANH).build())
            .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.MSE)
                    .activation(Activation.IDENTITY)
                    .nIn(numHiddenNodes).nOut(numOutputs).build())
            .pretrain(false).backprop(true).build();
}
 
Author: IsaacChanghau | Project: NeuralNetworksLite | Lines: 22 | Source: RegressionMathFunctions.java


Example 2: runClf

import org.nd4j.linalg.activations.Activation; // import the required package/class
/**
 * Run a dummy network with the given activation function in the first layer
 *
 * @param act Activation function to test
 * @throws Exception if something goes wrong
 */
public static void runClf(IActivation act) throws Exception {
  Dl4jMlpClassifier clf = new Dl4jMlpClassifier();
  // Data
  DenseLayer denseLayer = new DenseLayer();
  denseLayer.setNOut(2);
  denseLayer.setLayerName("Dense-layer");
  denseLayer.setActivationFn(act);

  OutputLayer outputLayer = new OutputLayer();
  outputLayer.setActivationFn(Activation.SOFTMAX.getActivationFunction());
  outputLayer.setLayerName("Output-layer");

  clf.setNumEpochs(1);
  clf.setLayers(denseLayer, outputLayer);

  final Instances data = DatasetLoader.loadIris();
  clf.buildClassifier(data);
  clf.distributionsForInstances(data);
}
 
Author: Waikato | Project: wekaDeeplearning4j | Lines: 26 | Source: ActivationTest.java


Example 3: runClf

import org.nd4j.linalg.activations.Activation; // import the required package/class
private void runClf(Instances data) throws Exception {
  // Note: clf (a Dl4jMlpClassifier) and DEFAULT_NUM_EPOCHS are fields of the enclosing test class
  // Data
  DenseLayer denseLayer = new DenseLayer();
  denseLayer.setNOut(32);
  denseLayer.setLayerName("Dense-layer");
  denseLayer.setActivationFn(Activation.RELU.getActivationFunction());

  OutputLayer outputLayer = new OutputLayer();
  outputLayer.setActivationFn(Activation.SOFTMAX.getActivationFunction());
  outputLayer.setLayerName("Output-layer");

  NeuralNetConfiguration nnc = new NeuralNetConfiguration();

  clf.setNumEpochs(DEFAULT_NUM_EPOCHS);
  clf.setNeuralNetConfiguration(nnc);
  clf.setLayers(denseLayer, outputLayer);

  clf.buildClassifier(data);
  clf.distributionsForInstances(data);
}
 
Author: Waikato | Project: wekaDeeplearning4j | Lines: 21 | Source: DatasetTest.java


Example 4: net

import org.nd4j.linalg.activations.Activation; // import the required package/class
private static MultiLayerConfiguration net(int nIn, int nOut) {
    return new NeuralNetConfiguration.Builder()
            .seed(42)
            .iterations(1)
            .activation(Activation.RELU)
            .weightInit(WeightInit.XAVIER)
            .learningRate(0.1)
            .regularization(true).l2(1e-4)
            .list(
                    new DenseLayer.Builder().nIn(nIn).nOut(3).build(),
                    new DenseLayer.Builder().nIn(3).nOut(3).build(),
                    new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                            .activation(Activation.SOFTMAX)
                            .nIn(3)
                            .nOut(nOut)
                            .build()
            )
            .build();
}
 
Author: wmeddie | Project: dl4j-trainer-archetype | Lines: 20 | Source: Train.java


Example 5: getGraphConfCNN

import org.nd4j.linalg.activations.Activation; // import the required package/class
private static ComputationGraphConfiguration getGraphConfCNN(int seed, IUpdater updater) {
    Nd4j.getRandom().setSeed(seed);
    ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder()
                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                    .weightInit(WeightInit.XAVIER).updater(updater).seed(seed).graphBuilder()
                    .addInputs("in")
                    .addLayer("0", new ConvolutionLayer.Builder().nOut(3).kernelSize(2, 2).stride(1, 1)
                                    .padding(0, 0).activation(Activation.TANH).build(), "in")
                    .addLayer("1", new ConvolutionLayer.Builder().nOut(3).kernelSize(2, 2).stride(1, 1)
                                    .padding(0, 0).activation(Activation.TANH).build(), "0")
                    .addLayer("2", new OutputLayer.Builder().lossFunction(LossFunctions.LossFunction.MSE).nOut(10)
                                    .build(), "1")
                    .setOutputs("2").setInputTypes(InputType.convolutional(10, 10, 3)).pretrain(false)
                    .backprop(true).build();
    return conf;
}
 
Author: deeplearning4j | Project: deeplearning4j | Lines: 17 | Source: TestCompareParameterAveragingSparkVsSingleMachine.java


Example 6: testDataSetScore

import org.nd4j.linalg.activations.Activation; // import the required package/class
@Test
public void testDataSetScore() {

    Nd4j.getRandom().setSeed(12345);
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                    .weightInit(WeightInit.XAVIER).seed(12345L).list()
                    .layer(0, new DenseLayer.Builder().nIn(4).nOut(3).activation(Activation.SIGMOID).build())
                    .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                                    .activation(Activation.SOFTMAX).nIn(3).nOut(3).build())
                    .pretrain(false).backprop(true).build();

    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();

    INDArray in = Nd4j.create(new double[] {1.0, 2.0, 3.0, 4.0});
    INDArray out = Nd4j.create(new double[] {1, 0, 0});

    double score = net.score(new DataSet(in, out));
}
 
Author: deeplearning4j | Project: deeplearning4j | Lines: 20 | Source: MultiLayerTest.java


Example 7: testMultiCNNLayer

import org.nd4j.linalg.activations.Activation; // import the required package/class
@Test
public void testMultiCNNLayer() throws Exception {
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                    .optimizationAlgo(OptimizationAlgorithm.LINE_GRADIENT_DESCENT).seed(123).list()
                    .layer(0, new ConvolutionLayer.Builder().nIn(1).nOut(6).weightInit(WeightInit.XAVIER)
                                    .activation(Activation.RELU).build())
                    .layer(1, new LocalResponseNormalization.Builder().build()).layer(2,
                                    new DenseLayer.Builder()
                                                    .nOut(2).build())
                    .layer(3, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                                    .weightInit(WeightInit.XAVIER).activation(Activation.SOFTMAX).nIn(2).nOut(10)
                                    .build())
                    .backprop(true).pretrain(false).setInputType(InputType.convolutionalFlat(28, 28, 1)).build();

    MultiLayerNetwork network = new MultiLayerNetwork(conf);
    network.init();
    DataSetIterator iter = new MnistDataSetIterator(2, 2);
    DataSet next = iter.next();

    network.fit(next);
}
 
Author: deeplearning4j | Project: deeplearning4j | Lines: 22 | Source: LocalResponseTest.java


Example 8: incompleteMnistLenet

import org.nd4j.linalg.activations.Activation; // import the required package/class
public MultiLayerConfiguration.Builder incompleteMnistLenet() {
    MultiLayerConfiguration.Builder builder =
                    new NeuralNetConfiguration.Builder().seed(3)
                                    .optimizationAlgo(OptimizationAlgorithm.CONJUGATE_GRADIENT).list()
                                    .layer(0, new org.deeplearning4j.nn.conf.layers.ConvolutionLayer.Builder(
                                                    new int[] {5, 5}).nIn(1).nOut(20).build())
                                    .layer(1, new org.deeplearning4j.nn.conf.layers.SubsamplingLayer.Builder(
                                                    new int[] {2, 2}, new int[] {2, 2}).build())
                                    .layer(2, new org.deeplearning4j.nn.conf.layers.ConvolutionLayer.Builder(
                                                    new int[] {5, 5}).nIn(20).nOut(50).build())
                                    .layer(3, new org.deeplearning4j.nn.conf.layers.SubsamplingLayer.Builder(
                                                    new int[] {2, 2}, new int[] {2, 2}).build())
                                    .layer(4, new org.deeplearning4j.nn.conf.layers.DenseLayer.Builder().nOut(500)
                                                    .build())
                                    .layer(5, new org.deeplearning4j.nn.conf.layers.OutputLayer.Builder(
                                                    LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                                                                    .activation(Activation.SOFTMAX).nOut(10)
                                                                    .build());
    return builder;
}
 
Author: deeplearning4j | Project: deeplearning4j | Lines: 21 | Source: ConvolutionLayerSetupTest.java


Example 9: testListenersViaModel

import org.nd4j.linalg.activations.Activation; // import the required package/class
@Test
public void testListenersViaModel() {
    TestListener.clearCounts();

    MultiLayerConfiguration.Builder builder = new NeuralNetConfiguration.Builder().list().layer(0,
                    new OutputLayer.Builder(LossFunctions.LossFunction.MSE).nIn(10).nOut(10)
                                    .activation(Activation.TANH).build());

    MultiLayerConfiguration conf = builder.build();
    MultiLayerNetwork model = new MultiLayerNetwork(conf);
    model.init();

    StatsStorage ss = new InMemoryStatsStorage();
    model.setListeners(new TestListener(), new StatsListener(ss));

    testListenersForModel(model, null);

    assertEquals(1, ss.listSessionIDs().size());
    assertEquals(2, ss.listWorkerIDsForSession(ss.listSessionIDs().get(0)).size());
}
 
Author: deeplearning4j | Project: deeplearning4j | Lines: 21 | Source: TestListeners.java


Example 10: testComputeZ

import org.nd4j.linalg.activations.Activation; // import the required package/class
@Test
public void testComputeZ() {

    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder().weightInit(WeightInit.XAVIER)
                    .activation(Activation.TANH).list().layer(0, new DenseLayer.Builder().nIn(10).nOut(10).build())
                    .layer(1, new DenseLayer.Builder().nIn(10).nOut(10).build()).build();

    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();

    INDArray in = Nd4j.rand(10, 10);
    List<INDArray> preOuts = net.computeZ(in, false);

    assertEquals(3, preOuts.size()); //Includes original input
    assertEquals(in, preOuts.get(0));

    INDArray preOut0 = net.getLayer(0).preOutput(in);
    INDArray out0 = net.getLayer(0).activate(in);
    assertEquals(preOut0, preOuts.get(1));

    INDArray preOut1 = net.getLayer(1).preOutput(out0);
    INDArray out1 = net.getLayer(1).activate(out0);
    assertEquals(preOut1, preOuts.get(2));
}
 
Author: deeplearning4j | Project: deeplearning4j | Lines: 25 | Source: MultiLayerTest.java


Example 11: testRnnTimeStepWithPreprocessor

import org.nd4j.linalg.activations.Activation; // import the required package/class
@Test
public void testRnnTimeStepWithPreprocessor() {

    MultiLayerConfiguration conf =
                    new NeuralNetConfiguration.Builder()
                                    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                                    .list()
                                    .layer(0, new org.deeplearning4j.nn.conf.layers.GravesLSTM.Builder().nIn(10)
                                                    .nOut(10).activation(Activation.TANH).build())
                                    .layer(1, new org.deeplearning4j.nn.conf.layers.GravesLSTM.Builder().nIn(10)
                                                    .nOut(10).activation(Activation.TANH).build())
                                    .layer(2, new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                                                    .activation(Activation.SOFTMAX).nIn(10).nOut(10).build())
                                    .inputPreProcessor(0, new FeedForwardToRnnPreProcessor()).pretrain(false)
                                    .backprop(true).build();

    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();

    INDArray in = Nd4j.rand(1, 10);
    net.rnnTimeStep(in);
}
 
Author: deeplearning4j | Project: deeplearning4j | Lines: 23 | Source: MultiLayerTestRNN.java


Example 12: getConf

import org.nd4j.linalg.activations.Activation; // import the required package/class
private static MultiLayerConfiguration getConf() {
    MultiLayerConfiguration conf =
            new NeuralNetConfiguration.Builder().seed(12345L)
                    .list().layer(0,
                    new DenseLayer.Builder().nIn(4).nOut(3)
                            .weightInit(WeightInit.DISTRIBUTION)
                            .dist(new NormalDistribution(0,1))
                            .build())
                    .layer(1, new org.deeplearning4j.nn.conf.layers.OutputLayer.Builder(
                            LossFunctions.LossFunction.MCXENT)
                            .activation(Activation.SOFTMAX).nIn(3).nOut(3)
                            .weightInit(WeightInit.DISTRIBUTION)
                            .dist(new NormalDistribution(0, 1)).build())
                    .build();
    return conf;
}
 
Author: deeplearning4j | Project: deeplearning4j | Lines: 17 | Source: MultiLayerTest.java


Example 13: testRNG

import org.nd4j.linalg.activations.Activation; // import the required package/class
@Test
public void testRNG() {
    DenseLayer layer = new DenseLayer.Builder().nIn(trainingSet.numInputs()).nOut(trainingSet.numOutcomes())
                    .weightInit(WeightInit.UNIFORM).activation(Activation.TANH).build();

    NeuralNetConfiguration conf = new NeuralNetConfiguration.Builder().seed(123)
                    .optimizationAlgo(OptimizationAlgorithm.CONJUGATE_GRADIENT).layer(layer).build();

    int numParams = conf.getLayer().initializer().numParams(conf);
    INDArray params = Nd4j.create(1, numParams);
    Layer model = conf.getLayer().instantiate(conf, null, 0, params, true);
    INDArray modelWeights = model.getParam(DefaultParamInitializer.WEIGHT_KEY);


    DenseLayer layer2 = new DenseLayer.Builder().nIn(trainingSet.numInputs()).nOut(trainingSet.numOutcomes())
                    .weightInit(WeightInit.UNIFORM).activation(Activation.TANH).build();
    NeuralNetConfiguration conf2 = new NeuralNetConfiguration.Builder().seed(123)
                    .optimizationAlgo(OptimizationAlgorithm.CONJUGATE_GRADIENT).layer(layer2).build();

    int numParams2 = conf2.getLayer().initializer().numParams(conf2);
    INDArray params2 = Nd4j.create(1, numParams2);
    Layer model2 = conf2.getLayer().instantiate(conf2, null, 0, params2, true);
    INDArray modelWeights2 = model2.getParam(DefaultParamInitializer.WEIGHT_KEY);

    assertEquals(modelWeights, modelWeights2);
}
 
Author: deeplearning4j | Project: deeplearning4j | Lines: 27 | Source: NeuralNetConfigurationTest.java


Example 14: createNet

import org.nd4j.linalg.activations.Activation; // import the required package/class
public static ComputationGraph createNet() throws Exception {

        ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder()
                .graphBuilder()
                .addInputs("in")
                .addLayer("0", new ConvolutionLayer.Builder().nOut(3)
                        .kernelSize(2,2).stride(2,2).build(), "in")
                .addLayer("1", new ConvolutionLayer.Builder().nOut(3)
                        .kernelSize(2,2).stride(2,2).build(), "0")
                .addLayer("out", new OutputLayer.Builder().nOut(10)
                        .activation(Activation.TANH).lossFunction(LossFunctions.LossFunction.MSE)
                        .build(), "1")
                .setOutputs("out")
                .setInputTypes(InputType.convolutional(28,28,1))
                .build();

        ComputationGraph model = new ComputationGraph(conf);
        model.init();

        return model;
    }
 
Author: deeplearning4j | Project: deeplearning4j | Lines: 22 | Source: WorkspaceTests.java


Example 15: testPredict

import org.nd4j.linalg.activations.Activation; // import the required package/class
@Test
public void testPredict() throws Exception {

    Nd4j.getRandom().setSeed(12345);
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                    .weightInit(WeightInit.XAVIER).seed(12345L).list()
                    .layer(0, new DenseLayer.Builder().nIn(784).nOut(50).activation(Activation.RELU).build())
                    .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                                    .activation(Activation.SOFTMAX).nIn(50).nOut(10).build())
                    .pretrain(false).backprop(true).setInputType(InputType.convolutional(28, 28, 1)).build();

    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();

    DataSetIterator ds = new MnistDataSetIterator(10, 10);
    net.fit(ds);

    DataSetIterator testDs = new MnistDataSetIterator(1, 1);
    DataSet testData = testDs.next();
    testData.setLabelNames(Arrays.asList("0", "1", "2", "3", "4", "5", "6", "7", "8", "9"));
    String actualLabels = testData.getLabelName(0);
    List<String> prediction = net.predict(testData);
    assertNotNull(actualLabels);
    assertNotNull(prediction.get(0));
}
 
Author: deeplearning4j | Project: deeplearning4j | Lines: 26 | Source: MultiLayerTest.java


Example 16: getTestLayers

import org.nd4j.linalg.activations.Activation; // import the required package/class
public static List<Pair<? extends Layer, InputType>> getTestLayers() {
    List<Pair<? extends Layer, InputType>> l = new ArrayList<>();
    l.add(new Pair<>(new ActivationLayer.Builder().activation(Activation.TANH).build(), InputType.feedForward(20)));
    l.add(new Pair<>(new DenseLayer.Builder().nIn(20).nOut(20).build(), InputType.feedForward(20)));
    l.add(new Pair<>(new DropoutLayer.Builder().nIn(20).nOut(20).build(), InputType.feedForward(20)));
    l.add(new Pair<>(new EmbeddingLayer.Builder().nIn(1).nOut(20).build(), InputType.feedForward(20)));
    l.add(new Pair<>(new OutputLayer.Builder().nIn(20).nOut(20).build(), InputType.feedForward(20)));
    l.add(new Pair<>(new LossLayer.Builder().build(), InputType.feedForward(20)));

    //RNN layers:
    l.add(new Pair<>(new GravesLSTM.Builder().nIn(20).nOut(20).build(), InputType.recurrent(20, 30)));
    l.add(new Pair<>(new LSTM.Builder().nIn(20).nOut(20).build(), InputType.recurrent(20, 30)));
    l.add(new Pair<>(new GravesBidirectionalLSTM.Builder().nIn(20).nOut(20).build(), InputType.recurrent(20, 30)));
    l.add(new Pair<>(new RnnOutputLayer.Builder().nIn(20).nOut(20).build(), InputType.recurrent(20, 30)));

    return l;
}
 
Author: deeplearning4j | Project: deeplearning4j | Lines: 18 | Source: TestMemoryReports.java


Example 17: buildModel

import org.nd4j.linalg.activations.Activation; // import the required package/class
public void buildModel() {
    //Create the network
    int numInput = 2;
    int numOutputs = 1;
    int nHidden = 10;
    mNetwork = new MultiLayerNetwork(new NeuralNetConfiguration.Builder()
            .seed(mSeed)
            .iterations(ITERATIONS)
            .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
            .learningRate(LEARNING_RATE)
            .weightInit(WeightInit.XAVIER)
            .updater(Updater.NESTEROVS)
            .list()
            .layer(0, new DenseLayer.Builder().nIn(numInput).nOut(nHidden)
                    .activation(Activation.TANH)
                    .name("input")
                    .build())
            .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MSE)
                    .activation(Activation.IDENTITY)
                    .name("output")
                    .nIn(nHidden).nOut(numOutputs).build())
            .pretrain(false)
            .backprop(true)
            .build()
    );
    mNetwork.init();
    mNetwork.setListeners(mIterationListener);
}
 
Author: mccorby | Project: FederatedAndroidTrainer | Lines: 29 | Source: LinearModel.java


Example 18: configuration

import org.nd4j.linalg.activations.Activation; // import the required package/class
public static MultiLayerConfiguration configuration() {
	/*
	 * Regarding the .setInputType(InputType.convolutionalFlat(28,28,1)) line, this does a few things:
	 * (a) It adds preprocessors, which handle things like the transition between the convolutional/subsampling
	 *     layers and the dense layer.
	 * (b) It does some additional configuration validation.
	 * (c) Where necessary, it sets the nIn (number of input neurons, or input depth in the case of CNNs) values
	 *     for each layer based on the size of the previous layer (but it won't override values manually set by
	 *     the user).
	 * InputTypes can be used with other layer types too (RNNs, MLPs, etc.), not just CNNs. For normal images
	 * (when using ImageRecordReader) use InputType.convolutional(height,width,depth). The MNIST record reader is
	 * a special case that outputs 28x28 pixel grayscale (nChannels=1) images in a "flattened" row vector format
	 * (i.e., 1x784 vectors), hence the "convolutionalFlat" input type used here.
	 */
	return new NeuralNetConfiguration.Builder().seed(SEED).iterations(NUM_ITERATIONS).regularization(true).l2(0.0005)
			.learningRate(.01)
			// Uncomment the following for a separate bias learning rate and for learning-rate decay:
			// .biasLearningRate(0.02)
			// .learningRateDecayPolicy(LearningRatePolicy.Inverse).lrPolicyDecayRate(0.001).lrPolicyPower(0.75)
			.weightInit(WeightInit.XAVIER).optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
			.updater(Updater.NESTEROVS).momentum(0.9).list()
			.layer(0,
					new ConvolutionLayer.Builder(5, 5).nIn(NUM_CHANNELS).stride(1, 1).nOut(20).activation(Activation.IDENTITY)
							.build())
			.layer(1, new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX).kernelSize(2, 2).stride(2, 2).build())
			.layer(2, new ConvolutionLayer.Builder(5, 5).stride(1, 1).nOut(50).activation(Activation.IDENTITY).build())
			.layer(3, new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX).kernelSize(2, 2).stride(2, 2).build())
			.layer(4, new DenseLayer.Builder().activation(Activation.RELU).nOut(500).build())
			.layer(5,
					new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD).nOut(NUM_OUTPUTS)
							.activation(Activation.SOFTMAX).build())
			.setInputType(InputType.convolutionalFlat(28, 28, 1)).backprop(true).pretrain(false).build();
}
 
Author: braeunlich | Project: anagnostes | Lines: 33 | Source: ConfigurationFactory.java


Example 19: create

import org.nd4j.linalg.activations.Activation; // import the required package/class
@Override
public MultiLayerConfiguration create() {
    int width = imageTransformConfigurationResource.getScaledWidth();
    int height = imageTransformConfigurationResource.getScaledHeight();
    int channels = imageTransformConfigurationResource.getChannels();
    int outputs = networkConfigurationResource.getOutputs();
    return new NeuralNetConfiguration.Builder()
            .seed(seed)
            .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
            .iterations(1)
            .learningRate(0.0001)
            .activation(Activation.RELU)
            .weightInit(WeightInit.XAVIER)
            .updater(Updater.NESTEROVS).momentum(0.9)
            .regularization(true).l2(1e-3)
            .list()
            .layer(0, new DenseLayer.Builder()
                    .nIn(width * height * channels)
                    .nOut(1200)
                    .build())
            .layer(1, new DenseLayer.Builder()
                    .nIn(1200)
                    .nOut(600)
                    .build())
            .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                    .nIn(600)
                    .activation(Activation.SOFTMAX)
                    .nOut(outputs)
                    .build())
            .pretrain(false).backprop(true)
            .setInputType(InputType.convolutional(height, width, channels))
            .build();
}
 
Author: scaliby | Project: ceidg-captcha | Lines: 34 | Source: MultiLayerConfigurationFactoryImpl.java


Example 20: main

import org.nd4j.linalg.activations.Activation; // import the required package/class
public static void main(String[] args) throws Exception {
    final int numRows = 28;
    final int numColumns = 28;
    int seed = 123;
    int numSamples = MnistDataFetcher.NUM_EXAMPLES;
    int batchSize = 1000;
    int iterations = 1;
    int listenerFreq = iterations/5;

    log.info("Load data....");
    DataSetIterator iter = new MnistDataSetIterator(batchSize,numSamples,true);

    log.info("Build model....");
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(seed)
            .iterations(iterations)
            .optimizationAlgo(OptimizationAlgorithm.LINE_GRADIENT_DESCENT)
            .list(8)
            .layer(0, new RBM.Builder().nIn(numRows * numColumns).nOut(2000).lossFunction(LossFunctions.LossFunction.RMSE_XENT).build())
            .layer(1, new RBM.Builder().nIn(2000).nOut(1000).lossFunction(LossFunctions.LossFunction.RMSE_XENT).build())
            .layer(2, new RBM.Builder().nIn(1000).nOut(500).lossFunction(LossFunctions.LossFunction.RMSE_XENT).build())
            .layer(3, new RBM.Builder().nIn(500).nOut(30).lossFunction(LossFunctions.LossFunction.RMSE_XENT).build())
            .layer(4, new RBM.Builder().nIn(30).nOut(500).lossFunction(LossFunctions.LossFunction.RMSE_XENT).build()) 
            .layer(5, new RBM.Builder().nIn(500).nOut(1000).lossFunction(LossFunctions.LossFunction.RMSE_XENT).build())
            .layer(6, new RBM.Builder().nIn(1000).nOut(2000).lossFunction(LossFunctions.LossFunction.RMSE_XENT).build())
            .layer(7, new OutputLayer.Builder(LossFunctions.LossFunction.MSE).activation(Activation.SIGMOID).nIn(2000).nOut(numRows*numColumns).build())
            .pretrain(true).backprop(true)
            .build();

    MultiLayerNetwork model = new MultiLayerNetwork(conf);
    model.init();

    model.setListeners(new ScoreIterationListener(listenerFreq));

    log.info("Train model....");
    while(iter.hasNext()) {
        DataSet next = iter.next();
        model.fit(new DataSet(next.getFeatureMatrix(),next.getFeatureMatrix()));
    }
}
 
Author: PacktPublishing | Project: Deep-Learning-with-Hadoop | Lines: 41 | Source: DeepAutoEncoder.java



Note: the org.nd4j.linalg.activations.Activation class examples in this article were compiled from source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by various developers; copyright remains with the original authors. Please consult each project's license before using or redistributing the code, and do not republish without permission.

