
Java AsyncDataSetIterator Class Code Examples


This article collects typical usage examples of the Java class org.deeplearning4j.datasets.iterator.AsyncDataSetIterator. If you are wondering what AsyncDataSetIterator is for, how to use it, or what real-world code using it looks like, the curated class examples below should help.



The AsyncDataSetIterator class belongs to the org.deeplearning4j.datasets.iterator package. Ten code examples of the class are shown below, ordered by popularity.
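AsyncDataSetIterator wraps an existing DataSetIterator and prefetches DataSet batches on a background thread into a bounded buffer, so data loading overlaps with training. As a hedged, dependency-free sketch of that producer-consumer pattern (plain JDK only; PrefetchingIterator and its fields are illustrative, not the DL4J API):

```java
import java.util.Iterator;
import java.util.NoSuchElementException;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Minimal sketch of async prefetching: a background thread drains the
// source iterator into a bounded queue; the consumer takes elements
// without waiting on the (possibly slow) source.
public class PrefetchingIterator<T> implements Iterator<T> {
    private static final Object POISON = new Object(); // end-of-stream marker
    private final BlockingQueue<Object> queue;
    private Object head; // next element, or POISON once the source is exhausted

    public PrefetchingIterator(Iterator<T> source, int bufferSize) {
        this.queue = new ArrayBlockingQueue<>(bufferSize);
        Thread producer = new Thread(() -> {
            try {
                while (source.hasNext())
                    queue.put(source.next()); // blocks when the buffer is full
                queue.put(POISON);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.setDaemon(true);
        producer.start();
    }

    @Override
    public boolean hasNext() {
        if (head == null) {
            try {
                head = queue.take(); // blocks until the producer supplies an element
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return false;
            }
        }
        return head != POISON;
    }

    @SuppressWarnings("unchecked")
    @Override
    public T next() {
        if (!hasNext())
            throw new NoSuchElementException();
        T value = (T) head;
        head = null;
        return value;
    }
}
```

The examples below achieve the equivalent via `new AsyncDataSetIterator(underlyingIterator, queueSize)`, with the queue size chosen to balance memory use against prefetch depth.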

Example 1: main

import org.deeplearning4j.datasets.iterator.AsyncDataSetIterator; // import the required package/class
/**
 * args[0] input: word2vec file name
 * args[1] input: sentiment model name
 * args[2] input: test parent folder name
 *
 * @param args
 * @throws Exception
 */
public static void main (final String[] args) throws Exception {
  if (args.length < 3)
    System.exit(1);

  WordVectors wvec = WordVectorSerializer.loadTxtVectors(new File(args[0]));
  MultiLayerNetwork model = ModelSerializer.restoreMultiLayerNetwork(args[1],false);

  DataSetIterator test = new AsyncDataSetIterator(
      new SentimentRecurrentIterator(args[2],wvec,100,300,false),1);
  Evaluation evaluation = new Evaluation();
  while(test.hasNext()) {
    DataSet t = test.next();
    INDArray features = t.getFeatures();
    INDArray labels = t.getLabels();
    INDArray inMask = t.getFeaturesMaskArray();
    INDArray outMask = t.getLabelsMaskArray();
    INDArray predicted = model.output(features,false,inMask,outMask);
    evaluation.evalTimeSeries(labels,predicted,outMask);
  }
  System.out.println(evaluation.stats());
}
 
Author: keigohtr, Project: sentiment-rnn, Lines: 30, Source: SentimentRecurrentTestCmd.java


Example 2: testRRDSIwithAsync

import org.deeplearning4j.datasets.iterator.AsyncDataSetIterator; // import the required package/class
@Test
public void testRRDSIwithAsync() throws Exception {
    RecordReader csv = new CSVRecordReader();
    csv.initialize(new FileSplit(new ClassPathResource("iris.txt").getTempFileFromArchive()));

    int batchSize = 10;
    int labelIdx = 4;
    int numClasses = 3;

    RecordReaderDataSetIterator rrdsi = new RecordReaderDataSetIterator(csv, batchSize, labelIdx, numClasses);
    AsyncDataSetIterator adsi = new AsyncDataSetIterator(rrdsi, 8, true);
    while (adsi.hasNext()) {
        DataSet ds = adsi.next(); // drain the iterator; the test only checks it completes
    }

}
 
Author: deeplearning4j, Project: deeplearning4j, Lines: 18, Source: RecordReaderDataSetiteratorTest.java


Example 3: main

import org.deeplearning4j.datasets.iterator.AsyncDataSetIterator; // import the required package/class
/**
 * args[0] input: word2vec file name
 * args[1] input: training model name
 * args[2] input: train/test parent folder name
 * args[3] output: training model name
 *
 * @param args
 * @throws Exception
 */
public static void main (final String[] args) throws Exception {
  if (args.length < 4)
    System.exit(1);

  WordVectors wvec = WordVectorSerializer.loadTxtVectors(new File(args[0]));
  MultiLayerNetwork model = ModelSerializer.restoreMultiLayerNetwork(args[1],true);
  int batchSize   = 16;//100;
  int testBatch   = 64;
  int nEpochs     = 1;

  System.out.println("Starting online training");
  DataSetIterator train = new AsyncDataSetIterator(
      new SentimentRecurrentIterator(args[2],wvec,batchSize,300,true),2);
  DataSetIterator test = new AsyncDataSetIterator(
      new SentimentRecurrentIterator(args[2],wvec,testBatch,300,false),2);
  for( int i=0; i<nEpochs; i++ ){
    model.fit(train);
    train.reset();

    System.out.println("Epoch " + i + " complete. Starting evaluation:");
    Evaluation evaluation = new Evaluation();
    while(test.hasNext()) {
      DataSet t = test.next();
      INDArray features = t.getFeatures();
      INDArray labels = t.getLabels();
      INDArray inMask = t.getFeaturesMaskArray();
      INDArray outMask = t.getLabelsMaskArray();
      INDArray predicted = model.output(features,false,inMask,outMask);
      evaluation.evalTimeSeries(labels,predicted,outMask);
    }
    test.reset();
    System.out.println(evaluation.stats());

    System.out.println("Save model");
    ModelSerializer.writeModel(model, new FileOutputStream(args[3]), true);
  }
}
 
Author: keigohtr, Project: sentiment-rnn, Lines: 47, Source: SentimentRecurrentTrainOnlineCmd.java


Example 4: FileSplitParallelDataSetIterator

import org.deeplearning4j.datasets.iterator.AsyncDataSetIterator; // import the required package/class
public FileSplitParallelDataSetIterator(@NonNull File rootFolder, @NonNull String pattern,
                @NonNull FileCallback callback, int numThreads, int bufferPerThread,
                @NonNull InequalityHandling inequalityHandling) {
    super(numThreads);

    if (!rootFolder.exists() || !rootFolder.isDirectory())
        throw new DL4JInvalidInputException("Root folder should point to existing folder");

    this.pattern = pattern;
    this.inequalityHandling = inequalityHandling;
    this.buffer = bufferPerThread;

    String modifiedPattern = pattern.replaceAll("\\%d", ".*.");

    IOFileFilter fileFilter = new RegexFileFilter(modifiedPattern);


    List<File> files = new ArrayList<>(FileUtils.listFiles(rootFolder, fileFilter, null));
    log.debug("Files found: {}; Producers: {}", files.size(), numProducers);

    if (files.isEmpty())
        throw new DL4JInvalidInputException("No suitable files were found");

    int numDevices = Nd4j.getAffinityManager().getNumberOfDevices();
    int cnt = 0;
    for (List<File> part : Lists.partition(files, files.size() / numThreads)) {
        // discard remainder
        if (cnt >= numThreads)
            break;

        int cDev = cnt % numDevices;
        asyncIterators.add(new AsyncDataSetIterator(new FileSplitDataSetIterator(part, callback), bufferPerThread,
                        true, cDev));
        cnt++;
    }

}
 
Author: deeplearning4j, Project: deeplearning4j, Lines: 38, Source: FileSplitParallelDataSetIterator.java


Example 5: initializeIterators

import org.deeplearning4j.datasets.iterator.AsyncDataSetIterator; // import the required package/class
protected void initializeIterators(List<DataSetIterator> originals) {
    int numDevices = Nd4j.getAffinityManager().getNumberOfDevices();

    int currentDevice = Nd4j.getAffinityManager().getDeviceForCurrentThread();

    if (originals.size() % numDevices != 0)
        log.error("WARNING: number of splits doesn't match number of devices!");

    int cnt = 0;
    for (DataSetIterator iterator : originals) {
        int cDev = cnt % numDevices;
        asyncIterators.add(new AsyncDataSetIterator(iterator, bufferSizePerDevice, true, cDev));
        cnt++;
    }
}
 
Author: deeplearning4j, Project: deeplearning4j, Lines: 16, Source: JointParallelDataSetIterator.java


Example 6: testSequentialIterable

import org.deeplearning4j.datasets.iterator.AsyncDataSetIterator; // import the required package/class
@Test
public void testSequentialIterable() throws Exception {
    List<DataSet> list = new ArrayList<>();
    for (int i = 0; i < 1024; i++)
        list.add(new DataSet(Nd4j.create(new float[] {1f, 2f, 3f}), Nd4j.create(new float[] {1f, 2f, 3f})));

    int numDevices = Nd4j.getAffinityManager().getNumberOfDevices();

    ExistingDataSetIterator edsi = new ExistingDataSetIterator(list);

    MagicQueue queue = new MagicQueue.Builder().setMode(MagicQueue.Mode.SEQUENTIAL).setCapacityPerFlow(32).build();

    AsyncDataSetIterator adsi = new AsyncDataSetIterator(edsi, 10, queue);

    int cnt = 0;
    while (adsi.hasNext()) {
        DataSet ds = adsi.next();

        // making sure dataset isn't null
        assertNotEquals("Failed on round " + cnt, null, ds);

        // making sure device for this array is a "next one"
        assertEquals(cnt % numDevices, Nd4j.getAffinityManager().getDeviceForArray(ds.getFeatures()).intValue());
        assertEquals(cnt % numDevices, Nd4j.getAffinityManager().getDeviceForArray(ds.getLabels()).intValue());

        cnt++;
    }

    assertEquals(list.size(), cnt);
}
 
Author: deeplearning4j, Project: deeplearning4j, Lines: 31, Source: MagicQueueTest.java


Example 7: main

import org.deeplearning4j.datasets.iterator.AsyncDataSetIterator; // import the required package/class
public static void main(String[] args) throws Exception {

        getModelData();
        
        System.out.println("Total memory = " + Runtime.getRuntime().totalMemory());

        int batchSize = 50;
        int vectorSize = 300;
        int nEpochs = 5;
        int truncateReviewsToLength = 300;

        MultiLayerConfiguration sentimentNN = new NeuralNetConfiguration.Builder()
                .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT).iterations(1)
                .updater(Updater.RMSPROP)
                .regularization(true).l2(1e-5)
                .weightInit(WeightInit.XAVIER)
                .gradientNormalization(GradientNormalization.ClipElementWiseAbsoluteValue).gradientNormalizationThreshold(1.0)
                .learningRate(0.0018)
                .list()
                .layer(0, new GravesLSTM.Builder().nIn(vectorSize).nOut(200)
                        .activation("softsign").build())
                .layer(1, new RnnOutputLayer.Builder().activation("softmax")
                        .lossFunction(LossFunctions.LossFunction.MCXENT).nIn(200).nOut(2).build())
                .pretrain(false).backprop(true).build();

        MultiLayerNetwork net = new MultiLayerNetwork(sentimentNN);
        net.init();
        net.setListeners(new ScoreIterationListener(1));

        WordVectors wordVectors = WordVectorSerializer.loadGoogleModel(new File(GNEWS_VECTORS_PATH), true, false);
        DataSetIterator trainData = new AsyncDataSetIterator(new SentimentExampleIterator(EXTRACT_DATA_PATH, wordVectors, batchSize, truncateReviewsToLength, true), 1);
        DataSetIterator testData = new AsyncDataSetIterator(new SentimentExampleIterator(EXTRACT_DATA_PATH, wordVectors, 100, truncateReviewsToLength, false), 1);

        for (int i = 0; i < nEpochs; i++) {
            net.fit(trainData);
            trainData.reset();

            Evaluation evaluation = new Evaluation();
            while (testData.hasNext()) {
                DataSet t = testData.next();
                INDArray dataFeatures = t.getFeatureMatrix();
                INDArray dataLabels = t.getLabels();
                INDArray inMask = t.getFeaturesMaskArray();
                INDArray outMask = t.getLabelsMaskArray();
                INDArray predicted = net.output(dataFeatures, false, inMask, outMask);

                evaluation.evalTimeSeries(dataLabels, predicted, outMask);
            }
            testData.reset();

            System.out.println(evaluation.stats());
        }
    }
 
Author: PacktPublishing, Project: Machine-Learning-End-to-Endguide-for-Java-developers, Lines: 54, Source: DL4JSentimentAnalysisExample.java


Example 8: main

import org.deeplearning4j.datasets.iterator.AsyncDataSetIterator; // import the required package/class
public static void main(String[] args) throws Exception {

      downloadData();

      int batchSize = 50;     
      int vectorSize = 300;    
      int nEpochs = 5;         
      int truncateReviewsToLength = 300;   

       
      MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
              .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT).iterations(1)
              .updater(Updater.RMSPROP)
              .regularization(true).l2(1e-5)
              .weightInit(WeightInit.XAVIER)
              .gradientNormalization(GradientNormalization.ClipElementWiseAbsoluteValue).gradientNormalizationThreshold(1.0)
              .learningRate(0.0018)
              .list()
              .layer(0, new GravesLSTM.Builder().nIn(vectorSize).nOut(200)
                      .activation("softsign").build())
              .layer(1, new RnnOutputLayer.Builder().activation("softmax")
                      .lossFunction(LossFunctions.LossFunction.MCXENT).nIn(200).nOut(2).build())
              .pretrain(false)
              .backprop(true)
              .build();

      MultiLayerNetwork net = new MultiLayerNetwork(conf);
      net.init();
      net.setListeners(new ScoreIterationListener(1));
 
      WordVectors wordVectors = WordVectorSerializer.loadGoogleModel(new File(WORD_VECTORS_PATH), true, false);
      DataSetIterator train = new AsyncDataSetIterator(new SentimentExampleIterator(DATA_PATH,wordVectors,batchSize,truncateReviewsToLength,true),1);
      DataSetIterator test = new AsyncDataSetIterator(new SentimentExampleIterator(DATA_PATH,wordVectors,100,truncateReviewsToLength,false),1);

      System.out.println("Starting training");
      for( int i=0; i<nEpochs; i++ ){
          net.fit(train);
          train.reset();
          System.out.println("Epoch " + i + " complete. Starting evaluation:");

          
          Evaluation evaluation = new Evaluation();
          while(test.hasNext()){
              DataSet t = test.next();
              INDArray features = t.getFeatureMatrix();
              INDArray labels = t.getLabels();
              INDArray inMask = t.getFeaturesMaskArray();
              INDArray outMask = t.getLabelsMaskArray();
              INDArray predicted = net.output(features,false,inMask,outMask);

              evaluation.evalTimeSeries(labels,predicted,outMask);
          }
          test.reset();

          System.out.println(evaluation.stats());
          System.out.println(evaluation.stats());
      }
  }
 
Author: PacktPublishing, Project: Deep-Learning-with-Hadoop, Lines: 61, Source: RNN.java


Example 9: main

import org.deeplearning4j.datasets.iterator.AsyncDataSetIterator; // import the required package/class
/**
 * args[0] input: word2vec file name
 * args[1] input: train/test parent folder name
 * args[2] output: output directory name
 *
 * @param args
 * @throws Exception
 */
public static void main (final String[] args) throws Exception {
  if (args.length < 3)
    System.exit(1);

  WordVectors wvec = WordVectorSerializer.loadTxtVectors(new File(args[0]));
  int numInputs   = wvec.lookupTable().layerSize();
  int numOutputs  = 2; // FIXME positive or negative
  int batchSize   = 16;//100;
  int testBatch   = 64;
  int nEpochs     = 5000;
  int thresEpochs = 10;
  double minImprovement = 1e-5;
  int listenfreq  = 10;

  MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
      .seed(7485)
      //.updater(Updater.RMSPROP)
      .updater(Updater.ADADELTA)
      //.learningRate(0.001) //RMSPROP
      //.rmsDecay(0.90) //RMSPROP
      .rho(0.95) //ADADELTA
      .epsilon(1e-5) //1e-8 //ALL
      .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
      .weightInit(WeightInit.XAVIER)
      .gradientNormalization(GradientNormalization.ClipElementWiseAbsoluteValue)
      .gradientNormalizationThreshold(1.0)
      //.regularization(true)
      //.l2(1e-5)
      .list()
      .layer(0, new GravesLSTM.Builder()
          .nIn(numInputs).nOut(numInputs)
          .activation("softsign")
          .build())
      .layer(1, new RnnOutputLayer.Builder()
          .lossFunction(LossFunctions.LossFunction.MCXENT)
          .activation("softmax")
          .nIn(numInputs).nOut(numOutputs)
          .build())
      .pretrain(false).backprop(true).build();

  MultiLayerNetwork model = new MultiLayerNetwork(conf);
  model.init();
  model.setListeners(new ScoreIterationListener(listenfreq));
  //model.setListeners(new HistogramIterationListener(listenfreq)); //FIXME error occurs


  LOG.info("Starting training");
  DataSetIterator train = new AsyncDataSetIterator(
      new SentimentRecurrentIterator(args[1],wvec,batchSize,300,true),2);
  DataSetIterator test = new AsyncDataSetIterator(
      new SentimentRecurrentIterator(args[1],wvec,testBatch,300,false),2);

  EarlyStoppingModelSaver<MultiLayerNetwork> saver = new LocalFileModelSaver(args[2]);//new InMemoryModelSaver<>();
  EarlyStoppingConfiguration<MultiLayerNetwork> esConf = new EarlyStoppingConfiguration.Builder<MultiLayerNetwork>()
      .epochTerminationConditions(
          new MaxEpochsTerminationCondition(nEpochs),
          new ScoreImprovementEpochTerminationCondition(thresEpochs,minImprovement))
      .scoreCalculator(new DataSetLossCalculator(test, true))
      .modelSaver(saver)
      .build();

  IEarlyStoppingTrainer<MultiLayerNetwork> trainer = new EarlyStoppingTrainer(esConf,model,train);
  EarlyStoppingResult<MultiLayerNetwork> result = trainer.fit();
  LOG.info("Termination reason: " + result.getTerminationReason());
  LOG.info("Termination details: " + result.getTerminationDetails());
  LOG.info("Total epochs: " + result.getTotalEpochs());
  LOG.info("Best epoch number: " + result.getBestModelEpoch());
  LOG.info("Score at best epoch: " + result.getBestModelScore());

  //LOG.info("Save model");
  //MultiLayerNetwork best = result.getBestModel();
  //ModelSerializer.writeModel(best, new FileOutputStream(args[2]+"/sentiment.rnn.es.model"), true);

}
 
Author: keigohtr, Project: sentiment-rnn, Lines: 82, Source: SentimentRecurrentTrainEarlyStopCmd.java
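Example 9 stops training once the score has failed to improve by at least minImprovement for thresEpochs consecutive epochs (ScoreImprovementEpochTerminationCondition). A hedged, dependency-free sketch of that patience rule (PatienceTracker is illustrative, not the DL4J API):

```java
// Minimal sketch of score-improvement early stopping: stop when the best
// score has not improved by at least minImprovement for `patience`
// consecutive epochs. Lower scores are assumed to be better (loss).
public class PatienceTracker {
    private final int patience;
    private final double minImprovement;
    private double bestScore = Double.POSITIVE_INFINITY;
    private int epochsSinceImprovement = 0;

    public PatienceTracker(int patience, double minImprovement) {
        this.patience = patience;
        this.minImprovement = minImprovement;
    }

    /** Record one epoch's score; returns true if training should stop. */
    public boolean shouldStop(double score) {
        if (bestScore - score >= minImprovement) {
            bestScore = score;            // meaningful improvement: reset patience
            epochsSinceImprovement = 0;
        } else {
            epochsSinceImprovement++;     // no meaningful improvement this epoch
        }
        return epochsSinceImprovement >= patience;
    }
}
```

With the example's settings (thresEpochs = 10, minImprovement = 1e-5) this would tolerate up to ten stagnant epochs before terminating.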


Example 10: main

import org.deeplearning4j.datasets.iterator.AsyncDataSetIterator; // import the required package/class
/**
 * args[0] input: word2vec file name
 * args[1] input: train/test parent folder name
 * args[2] output: training model name
 *
 * @param args
 * @throws Exception
 */
public static void main (final String[] args) throws Exception {
  if (args.length < 3)
    System.exit(1);

  WordVectors wvec = WordVectorSerializer.loadTxtVectors(new File(args[0]));
  int numInputs   = wvec.lookupTable().layerSize();
  int numOutputs  = 2; // FIXME positive or negative
  int batchSize   = 16;//100;
  int testBatch   = 64;
  int nEpochs     = 5000;
  int listenfreq  = 10;

  MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
      .seed(7485)
      .updater(Updater.RMSPROP) //ADADELTA
      .learningRate(0.001) //RMSPROP
      .rmsDecay(0.90) //RMSPROP
      //.rho(0.95) //ADADELTA
      .epsilon(1e-8) //ALL
      .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
      .weightInit(WeightInit.XAVIER)
      .gradientNormalization(GradientNormalization.ClipElementWiseAbsoluteValue)
      .gradientNormalizationThreshold(1.0)
      //.regularization(true)
      //.l2(1e-5)
      .list()
      .layer(0, new GravesLSTM.Builder()
          .nIn(numInputs).nOut(numInputs)
          .activation("softsign")
          .build())
      .layer(1, new RnnOutputLayer.Builder()
          .lossFunction(LossFunctions.LossFunction.MCXENT)
          .activation("softmax")
          .nIn(numInputs).nOut(numOutputs)
          .build())
      .pretrain(false).backprop(true).build();

  MultiLayerNetwork model = new MultiLayerNetwork(conf);
  model.init();
  model.setListeners(new ScoreIterationListener(listenfreq));


  LOG.info("Starting training");
  DataSetIterator train = new AsyncDataSetIterator(
      new SentimentRecurrentIterator(args[1],wvec,batchSize,300,true),2);
  DataSetIterator test = new AsyncDataSetIterator(
      new SentimentRecurrentIterator(args[1],wvec,testBatch,300,false),2);
  for( int i=0; i<nEpochs; i++ ){
    model.fit(train);
    train.reset();

    LOG.info("Epoch " + i + " complete. Starting evaluation:");
    Evaluation evaluation = new Evaluation();
    while(test.hasNext()) {
      DataSet t = test.next();
      INDArray features = t.getFeatures();
      INDArray labels = t.getLabels();
      INDArray inMask = t.getFeaturesMaskArray();
      INDArray outMask = t.getLabelsMaskArray();
      INDArray predicted = model.output(features,false,inMask,outMask);
      evaluation.evalTimeSeries(labels,predicted,outMask);
    }
    test.reset();
    LOG.info(evaluation.stats());

    LOG.info("Save model");
    ModelSerializer.writeModel(model, new FileOutputStream(args[2]), true);
  }
}
 
Author: keigohtr, Project: sentiment-rnn, Lines: 78, Source: SentimentRecurrentTrainCmd.java



Note: The org.deeplearning4j.datasets.iterator.AsyncDataSetIterator examples in this article were collected from open-source projects hosted on GitHub, MSDocs, and similar source-code and documentation platforms. Copyright of each snippet remains with its original author; consult the corresponding project's license before redistributing or reusing the code.

