
Java Long2LongOpenHashMap Class Code Examples


This article collects typical usage examples of the Java class it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap. If you are wondering what Long2LongOpenHashMap is for and how to use it, the curated class examples below should help.



The Long2LongOpenHashMap class belongs to the it.unimi.dsi.fastutil.longs package. Twenty code examples of the class are presented below, sorted by popularity by default. You can upvote the examples you like or find useful; your votes help the system recommend better Java code examples.
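
Before diving into the collected examples, here is a minimal, self-contained quick-start sketch of the class's core primitive API; the class name and the keys/values below are illustrative only and do not come from any of the projects cited later.

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap;

public class Long2LongQuickStart {
  public static void main(String[] args) {
    Long2LongOpenHashMap map = new Long2LongOpenHashMap();
    map.defaultReturnValue(-1L);      // value reported by get() for absent keys

    map.put(42L, 1L);                 // primitive put/get, no boxing
    map.addTo(42L, 1L);               // in-place increment, handy for counting

    System.out.println(map.get(42L)); // 2
    System.out.println(map.get(7L));  // -1 (the default return value)
  }
}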

Example 1: buildHashMap

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
/**
 * Builds a hash map that maps row indices (a primitive long) to their position in the
 * {@link SortedPartition} (also a primitive long) if the row index is part of an equivalence
 * class with more than one element.
 */
public void buildHashMap() {
  if (this.rowIndexToPosition != null) {
    return;
  }
  this.rowIndexToPosition = new Long2LongOpenHashMap();
  this.rowIndexToPosition.defaultReturnValue(POSITION_NOT_PRESENT);
  for (long i = 0; i < this.orderedEquivalenceClasses.size64(); i++) {
    final LongOpenHashBigSet equivalenceClass = this.orderedEquivalenceClasses.get(i);
    if (equivalenceClass.size64() == 1) {
      continue;
    }
    for (final long rowIndex : equivalenceClass) {
      this.rowIndexToPosition.put(rowIndex, i);
    }
  }
}
 
Developer: HPI-Information-Systems, Project: metanome-algorithms, Lines: 24, Source: SortedPartition.java


Example 2: ConcurrentSparkList

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
/**
 * Constructor
 *
 * @param _maxSize set the maximal size
 */
public ConcurrentSparkList(final int _maxSize) {

    // start spark node
    SharedService.getInstance();

    // check the parameter
    if (_maxSize <= 2) {
        throw new IllegalArgumentException("maxSize must be greater than 2");
    }

    atomicInteger = new AtomicInteger(0);

    // set the parameters
    this.maxSize = _maxSize;

    data = new Long2LongOpenHashMap(maxSize);
    item2ReadCount = SharedService.parallelizePairs(data);
    item2timeStampData = SharedService.parallelizePairs(data);
    numPartitions = item2ReadCount.context().defaultParallelism();

}
 
Developer: jasjisdo, Project: spark-newsreel-recommender, Lines: 28, Source: ConcurrentSparkList.java


Example 3: PMDSink

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
public PMDSink(File file) throws IOException {
    version       = new PMD.Version();
    seconds       = new PMD.Seconds();
    orderAdded    = new PMD.OrderAdded();
    orderExecuted = new PMD.OrderExecuted();
    orderCanceled = new PMD.OrderCanceled();
    orderDeleted  = new PMD.OrderDeleted();
    brokenTrade   = new PMD.BrokenTrade();

    currentSecond = 0;

    instrument = new Long2LongOpenHashMap();

    side = new Long2ByteOpenHashMap();

    buffer = ByteBuffer.allocate(BUFFER_CAPACITY);

    writer = BinaryFILEWriter.open(file);
}
 
Developer: jvirtanen, Project: nasdaq-tools, Lines: 20, Source: PMDSink.java


Example 4: ChunkedHashStore

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
/** Creates a chunked hash store with given transformation strategy, hash width and progress logger.
 *
 * @param transform a transformation strategy for the elements.
 * @param tempDir a temporary directory for the store files, or {@code null} for the current directory.
 * @param hashWidthOrCountValues if positive, no associated data is saved in the store: {@link Chunk#data(long)} will return this many lower bits
 * of the first of the three hashes associated with the key; if zero, values are stored; if negative, values are stored and a map from values
 * to their frequency is computed.
 * @param pl a progress logger, or {@code null}.
 */

public ChunkedHashStore(final TransformationStrategy<? super T> transform, final File tempDir, final int hashWidthOrCountValues, final ProgressLogger pl) throws IOException {
	this.transform = transform;
	this.pl = pl;
	this.tempDir = tempDir;

	this.hashMask = hashWidthOrCountValues <= 0 ? 0 : -1L >>> Long.SIZE - hashWidthOrCountValues;
	if (hashWidthOrCountValues < 0) value2FrequencyMap = new Long2LongOpenHashMap();

	file = new File[DISK_CHUNKS];
	writableByteChannel = new WritableByteChannel[DISK_CHUNKS];
	byteBuffer = new ByteBuffer[DISK_CHUNKS];
	// Create disk chunks
	for(int i = 0; i < DISK_CHUNKS; i++) {
		byteBuffer[i] = ByteBuffer.allocateDirect(BUFFER_SIZE).order(ByteOrder.nativeOrder());
		writableByteChannel[i] = new FileOutputStream(file[i] = File.createTempFile(ChunkedHashStore.class.getSimpleName(), String.valueOf(i), tempDir)).getChannel();
		file[i].deleteOnExit();
	}

	count = new int[DISK_CHUNKS];
}
 
Developer: vigna, Project: Sux4J, Lines: 31, Source: ChunkedHashStore.java
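
To make the three modes of hashWidthOrCountValues concrete, here is a hedged sketch, assuming Sux4J and its dsiutils dependency (for TransformationStrategies) are on the classpath; per the Javadoc above, a null tempDir means the current directory and a null ProgressLogger disables logging.

import it.unimi.dsi.bits.TransformationStrategies;
import it.unimi.dsi.sux4j.io.ChunkedHashStore;

import java.io.IOException;

public class ChunkedHashStoreModes {
  public static void main(String[] args) throws IOException {
    // positive: keep only the 16 lower bits of the first hash, no values stored
    ChunkedHashStore<CharSequence> hashBits =
        new ChunkedHashStore<>(TransformationStrategies.utf16(), null, 16, null);
    // zero: store the associated values
    ChunkedHashStore<CharSequence> values =
        new ChunkedHashStore<>(TransformationStrategies.utf16(), null, 0, null);
    // negative: store values and additionally build a value-to-frequency map
    ChunkedHashStore<CharSequence> valuesWithFrequencies =
        new ChunkedHashStore<>(TransformationStrategies.utf16(), null, -1, null);
    hashBits.close();
    values.close();
    valuesWithFrequencies.close();
  }
}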


Example 5: testLengthLimitedHuffManGeom

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
@Test
public void testLengthLimitedHuffManGeom() {
	final int size = 20;
	final long[] symbols = new long[size];
	final long[] frequency = new long[size];
	for (int i = 0; i < size; i++) {
		symbols[i] = i;
		frequency[i] = 1 << i;
	}

	final Huffman huffman = new Codec.Huffman(5);
	final Long2LongOpenHashMap frequencies = new Long2LongOpenHashMap(symbols, frequency);
	final Coder coder = huffman.getCoder(frequencies);
	final Decoder decoder = coder.getDecoder();
	for (final long l: frequencies.keySet()) {
		final long encoded = coder.encode(l);
		final long longEncoded = Long.reverse(encoded) >>> 64 - coder.maxCodewordLength();
		final long decoded = decoder.decode(longEncoded);
		assertEquals(l, decoded);
	}
}
 
Developer: vigna, Project: Sux4J, Lines: 22, Source: CodecTest.java


Example 6: getMostFrequentLabel

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
protected long getMostFrequentLabel(Node node) {
	Long2LongMap commMap = new Long2LongOpenHashMap();
	Iterable<Relationship> relationships = relType == null ? node.getRelationships() : node.getRelationships(relType);

	for (Relationship r : relationships) {
		Node other = r.getOtherNode(node);
		long otherCommunity = (long) other.getProperty(attName);
		// tally the community label of each neighbor (not the neighbor's id)
		long count = commMap.getOrDefault(otherCommunity, 0L);
		commMap.put(otherCommunity, count + 1);
	}

	long mostFrequentLabel = -1;
	long mostFrequentLabelCount = -1;
	for (Entry<Long, Long> e : commMap.entrySet()) {
		if (e.getValue() > mostFrequentLabelCount) {
			mostFrequentLabelCount = e.getValue();
			mostFrequentLabel = e.getKey();
		}
	}
	return mostFrequentLabel;
}
 
Developer: besil, Project: Neo4jSNA, Lines: 23, Source: LabelPropagation.java
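
A side note on the counting loop above: fastutil's primitive maps also offer addTo, which performs the get-then-put increment in one call (absent keys start from the map's default return value, 0 unless changed). A minimal sketch of the same tally, with illustrative labels:

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap;

public class AddToTallySketch {
  public static void main(String[] args) {
    Long2LongOpenHashMap commMap = new Long2LongOpenHashMap();
    for (long otherCommunity : new long[] { 1L, 2L, 1L }) {
      commMap.addTo(otherCommunity, 1L); // increment the label's count in place
    }
    System.out.println(commMap); // e.g. {1=>2, 2=>1}
  }
}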


Example 7: calculateConditions

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
public List<LongArrayList> calculateConditions(PositionListIndex partialUnique,
                                               PositionListIndex PLICondition,
                                               int frequency,
                                               List<LongArrayList> unsatisfiedClusters) {
  List<LongArrayList> result = new LinkedList<>();
  Long2LongOpenHashMap uniqueHashMap = partialUnique.asHashMap();
  LongArrayList touchedClusters = new LongArrayList();
  for (LongArrayList cluster : PLICondition.getClusters()) {
    int unsatisfactionCount = 0;
    touchedClusters.clear();
    for (long rowNumber : cluster) {
      if (uniqueHashMap.containsKey(rowNumber)) {
        if (touchedClusters.contains(uniqueHashMap.get(rowNumber))) {
          unsatisfactionCount++;
        } else {
          touchedClusters.add(uniqueHashMap.get(rowNumber));
        }
      }
    }
    if (unsatisfactionCount == 0) {
      result.add(cluster);
    } else {
      // unlike SimpleConditionTraverser (see example 10), every unsatisfied
      // cluster is kept here without a frequency check
      unsatisfiedClusters.add(cluster);
    }
  }
  return result;
}
 
Developer: HPI-Information-Systems, Project: metanome-algorithms, Lines: 31, Source: OrConditionTraverser.java


Example 8: ConditionTask

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
public ConditionTask(int uniqueCluster, LongArrayList conditionClusters,
                     LongArrayList removedClusters, long size, long frequency,
                     Long2LongOpenHashMap andJointCluster) {
  this.uniqueClusterNumber = uniqueCluster;
  this.conditionClusters = conditionClusters.clone();
  this.removedConditionClusters = removedClusters.clone();
  this.size = size;
  this.frequency = frequency;
  this.andJointCluster = andJointCluster;
}
 
Developer: HPI-Information-Systems, Project: metanome-algorithms, Lines: 11, Source: OrConditionTraverser.java


Example 9: remove

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
public boolean remove(long conditionClusterNumber, ConditionEntry entryToRemove) {
  if (entryToRemove.condition.getSetBits().size() < 2) {
    if (((this.size - entryToRemove.cluster.size()) + this.andJointCluster.size())
        >= this.frequency) {
      this.size = this.size - entryToRemove.cluster.size();
      this.conditionClusters.remove(conditionClusterNumber);
      this.removedConditionClusters.add(conditionClusterNumber);
      return true;
    } else {
      return false;
    }
  } else {
    Long2LongOpenHashMap newAndJointCluster = andJointCluster.clone();
    for (long row : entryToRemove.cluster) {
      if (newAndJointCluster.containsKey(row)) {
        long previousValue = newAndJointCluster.get(row);
        previousValue--;
        if (0 == previousValue) {
          newAndJointCluster.remove(row);
        } else {
          newAndJointCluster.put(row, previousValue);
        }
      } else {
        // row not present in the joint cluster; nothing to decrement
      }
    }
    if (this.size + newAndJointCluster.size() >= this.frequency) {
      this.andJointCluster = newAndJointCluster;
      this.conditionClusters.remove(conditionClusterNumber);
      this.removedConditionClusters.add(conditionClusterNumber);
      return true;
    } else {
      return false;
    }
  }
}
 
Developer: HPI-Information-Systems, Project: metanome-algorithms, Lines: 37, Source: OrConditionTraverser.java


Example 10: calculateConditions

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
public List<LongArrayList> calculateConditions(PositionListIndex partialUnique,
                                               PositionListIndex PLICondition,
                                               int frequency,
                                               List<LongArrayList> unsatisfiedClusters) {
  List<LongArrayList> result = new LinkedList<>();
  Long2LongOpenHashMap uniqueHashMap = partialUnique.asHashMap();
  LongArrayList touchedClusters = new LongArrayList();
  for (LongArrayList cluster : PLICondition.getClusters()) {
    if (cluster.size() < frequency) {
      continue;
    }
    int unsatisfactionCount = 0;
    touchedClusters.clear();
    for (long rowNumber : cluster) {
      if (uniqueHashMap.containsKey(rowNumber)) {
        if (touchedClusters.contains(uniqueHashMap.get(rowNumber))) {
          unsatisfactionCount++;
        } else {
          touchedClusters.add(uniqueHashMap.get(rowNumber));
        }
      }
    }
    if (unsatisfactionCount == 0) {
      result.add(cluster);
    } else {
      if ((cluster.size() - unsatisfactionCount) >= frequency) {
        unsatisfiedClusters.add(cluster);
      }
    }
  }
  return result;
}
 
Developer: HPI-Information-Systems, Project: metanome-algorithms, Lines: 34, Source: SimpleConditionTraverser.java


Example 11: pliHashmapTest

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
@Test
public void pliHashmapTest()
        throws CouldNotReceiveResultException, UnsupportedEncodingException, FileNotFoundException,
        InputGenerationException, InputIterationException, AlgorithmConfigurationException {
  //Setup
  AbaloneFixture fixture = new AbaloneFixture();
  RelationalInput input = fixture.getInputGenerator().generateNewCopy();
  PLIBuilder builder = new PLIBuilder(input);
  List<PositionListIndex> pliList = builder.getPLIList();

  List<PositionListIndex> pliListCopy = new ArrayList<>();
  for (int i = 0; i < pliList.size(); i++) {
    pliListCopy.add(i, new PositionListIndex(pliList.get(i).getClusters()));
  }

  for (int i = 0; i < pliList.size(); i++) {
    PositionListIndex pli = pliList.get(i);
    PositionListIndex pliCopy = pliListCopy.get(i);

    Long2LongOpenHashMap pliHash = pli.asHashMap();
    Long2LongOpenHashMap pliCopyHash = pliCopy.asHashMap();

    assertEquals(pliHash, pliCopyHash);
    assertEquals(pliHash.keySet(), pliCopyHash.keySet());

    for (long row : pliHash.keySet()) {
      assertEquals(pliHash.get(row), pliCopyHash.get(row));
    }
  }
}
 
Developer: HPI-Information-Systems, Project: metanome-algorithms, Lines: 31, Source: DcuccSimpleTest.java


Example 12: pliHashmapTest2

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
@Test
public void pliHashmapTest2()
        throws CouldNotReceiveResultException, UnsupportedEncodingException, FileNotFoundException,
        InputGenerationException, InputIterationException, AlgorithmConfigurationException {
  //Setup
  AbaloneFixture fixture = new AbaloneFixture();
  RelationalInput input = fixture.getInputGenerator().generateNewCopy();
  PLIBuilder builder = new PLIBuilder(input);
  List<PositionListIndex> pliList = builder.getPLIList();

  List<Long2LongOpenHashMap> pliListCopy = new ArrayList<>();
  for (int i = 0; i < pliList.size(); i++) {
    pliListCopy.add(i, pliList.get(i).asHashMap());
  }

  for (int i = 0; i < pliList.size(); i++) {
    PositionListIndex pli = pliList.get(i);

    Long2LongOpenHashMap pliHash = pli.asHashMap();
    Long2LongOpenHashMap pliCopyHash = pliListCopy.get(i);

    assertEquals(pliHash, pliCopyHash);
    assertEquals(pliHash.keySet(), pliCopyHash.keySet());

    for (long row : pliHash.keySet()) {
      assertEquals(pliHash.get(row), pliCopyHash.get(row));
    }
  }
}
 
Developer: HPI-Information-Systems, Project: metanome-algorithms, Lines: 30, Source: DcuccSimpleTest.java


Example 13: testGamma

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
@Test
public void testGamma() {
	final Codec.Gamma gamma = new Codec.Gamma();
	final Long2LongOpenHashMap frequencies = new Long2LongOpenHashMap(new long[] { 6, 9, 1, 2, 4, 5, 3, 4, 7, 10000000 }, new long[] { 64, 32, 16, 1, 8, 4, 20, 2, 1, 10 });
	final Coder coder = gamma.getCoder(frequencies);
	final Decoder decoder = coder.getDecoder();
	for (int i = 0; i < 10000000; i++) {
		final long encoded = coder.encode(i);
		final long longEncoded = Long.reverse(encoded) >>> 64 - coder.maxCodewordLength();
		final long decoded = decoder.decode(longEncoded);
		assertEquals(i, decoded);
	}
}
 
Developer: vigna, Project: Sux4J, Lines: 14, Source: CodecTest.java


Example 14: testHuffman

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
@Test
public void testHuffman() {
	final Long2LongOpenHashMap frequencies = new Long2LongOpenHashMap(new long[] { 6, 9, 1, 2, 4, 5, 3, 4, 7, 1000 }, new long[] { 64, 32, 16, 1, 8, 4, 20, 2, 1, 10 });
	final Huffman huffman = new Codec.Huffman();
	final Coder coder = huffman.getCoder(frequencies);
	final Decoder decoder = coder.getDecoder();
	for (final long l: frequencies.keySet()) {
		final long encoded = coder.encode(l);
		final long longEncoded = Long.reverse(encoded) >>> 64 - coder.maxCodewordLength();
		final long decoded = decoder.decode(longEncoded);
		assertEquals(l, decoded);
	}
}
 
Developer: vigna, Project: Sux4J, Lines: 14, Source: CodecTest.java


Example 15: testLengthLimitedLengthHuffman

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
@Test
public void testLengthLimitedLengthHuffman() {
	final Long2LongOpenHashMap frequencies = new Long2LongOpenHashMap(new long[] { 6, 9, 1, 2, 4, 5, 3, 4, 7, 1000 }, new long[] { 64, 32, 16, 1, 8, 4, 20, 2, 1, 10 });
	final Huffman huffman = new Codec.Huffman(2);
	final Coder coder = huffman.getCoder(frequencies);
	final Decoder decoder = coder.getDecoder();
	for (final long l: frequencies.keySet()) {
		final long encoded = coder.encode(l);
		final long longEncoded = Long.reverse(encoded) >>> 64 - coder.maxCodewordLength();
		final long decoded = decoder.decode(longEncoded);
		assertEquals(l, decoded);
	}
}
 
Developer: vigna, Project: Sux4J, Lines: 14, Source: CodecTest.java


Example 16: testUnary

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
@Test
public void testUnary() {
	final Codec.Unary unary = new Codec.Unary();
	final Long2LongOpenHashMap frequencies = new Long2LongOpenHashMap(new long[] { 6, 9, 1, 2, 4, 5, 3, 4, 7, 61 }, new long[] { 64, 32, 16, 1, 8, 4, 10, 2, 1, 1 });
	final Coder coder = unary.getCoder(frequencies);
	final Decoder decoder = coder.getDecoder();
	for (int i = 0; i < 62; i++) {
		final long encoded = coder.encode(i);
		final long longEncoded = Long.reverse(encoded) >>> 64 - coder.maxCodewordLength();
		final long decoded = decoder.decode(longEncoded);
		assertEquals(i, decoded);
	}
}
 
Developer: vigna, Project: Sux4J, Lines: 14, Source: CodecTest.java


Example 17: testBinary

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
@Test
public void testBinary() {
	final Binary binary = new Codec.Binary();
	final Long2LongOpenHashMap frequencies = new Long2LongOpenHashMap(new long[] { 6, 9, 1, 2, 4, 5, 3, 4, 7, 1000 }, new long[] { 64, 32, 16, 1, 8, 4, 10, 2, 1, 1 });
	final Coder coder = binary.getCoder(frequencies);
	final Decoder decoder = coder.getDecoder();
	for (int i = 0; i <= 1000; i++) {
		final long encoded = coder.encode(i);
		final long decoded = decoder.decode(Long.reverse(encoded) >>> 64 - coder.maxCodewordLength());
		assertEquals(i, decoded);
	}
}
 
Developer: vigna, Project: Sux4J, Lines: 13, Source: CodecTest.java


Example 18: rebuildToCapacity

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
private void rebuildToCapacity(long newCapacity) {
    Long2LongOpenHashMap newNode2count = new Long2LongOpenHashMap(MAP_INITIAL_SIZE, MAP_LOAD_FACTOR);
    
    // rebuild to newCapacity.
    // This means that our current tree becomes a leftmost subtree
    // of the new tree.
    // E.g. when rebuilding a tree with logCapacity = 2
    // (i.e. storing values in 0..3) to logCapacity = 5 (i.e. 0..31):
    // node 1 => 8 (+= 7 = 2^0*(2^3-1))
    // nodes 2..3 => 16..17 (+= 14 = 2^1*(2^3-1))
    // nodes 4..7 => 32..35 (+= 28 = 2^2*(2^3-1))
    // This is easy to see if you draw it on paper.
    // Process the keys by "layers" in the original tree.
    long scaleR = newCapacity / capacity - 1;
    Long[] keys = node2count.keySet().toArray(new Long[node2count.size()]);
    Arrays.sort(keys);
    long scaleL = 1;
    
    for (long k : keys) {
        while (scaleL <= k / 2) {
            scaleL <<= 1;
        }
        newNode2count.put(k + scaleL * scaleR, node2count.get(k));
    }
    
    node2count = newNode2count;
    capacity = newCapacity;
    compressFully();
}
 
Developer: mayconbordin, Project: streaminer, Lines: 30, Source: QDigest.java
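
One detail worth noting in the example above: copying the keys into a Long[] boxes every key before the sort. As a hedged alternative sketch, fastutil's primitive key set can produce a long[] directly (node2count here is a local stand-in for the field used in the example):

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap;

import java.util.Arrays;

public class PrimitiveKeysSketch {
  public static void main(String[] args) {
    Long2LongOpenHashMap node2count = new Long2LongOpenHashMap();
    node2count.put(3L, 1L);
    node2count.put(1L, 2L);

    long[] keys = node2count.keySet().toLongArray(); // primitive copy, no boxing
    Arrays.sort(keys);
    System.out.println(Arrays.toString(keys)); // [1, 3]
  }
}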


Example 19: combineClusterIntoResult

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
protected void combineClusterIntoResult(ColumnCombinationBitset partialUnique)
    throws AlgorithmExecutionException {
  LongArrayList touchedCluster = new LongArrayList();
  Long2LongOpenHashMap partialUniqueHash = this.algorithm.getPLI(partialUnique).asHashMap();
  Set<ColumnCombinationBitset> startPoints = this.getConditionStartPoints();
  for (ColumnCombinationBitset minimalConditionStartPoint : startPoints) {
    if (minimalConditionStartPoint.getSetBits().size() != 1) {
      minimalConditionStartPoint =
          minimalConditionStartPoint.getContainedOneColumnCombinations().get(0);
    }

    List<ConditionEntry> satisfiedCluster = new ArrayList<>();
    Long2ObjectOpenHashMap<LongArrayList> intersectingCluster = new Long2ObjectOpenHashMap<>();
    int clusterNumber = 0;
    //build intersecting cluster
    for (ConditionEntry singleCluster : this.singleConditions.get(minimalConditionStartPoint)) {
      satisfiedCluster.add(singleCluster.setClusterNumber(clusterNumber));
      touchedCluster.clear();
      for (long rowNumber : singleCluster.cluster) {
        if (partialUniqueHash.containsKey(rowNumber)) {
          touchedCluster.add(partialUniqueHash.get(rowNumber));
        }
      }
      for (long partialUniqueClusterNumber : touchedCluster) {
        if (intersectingCluster.containsKey(partialUniqueClusterNumber)) {
          intersectingCluster.get(partialUniqueClusterNumber).add(clusterNumber);
        } else {
          LongArrayList newConditionClusterNumbers = new LongArrayList();
          newConditionClusterNumbers.add(clusterNumber);
          intersectingCluster.put(partialUniqueClusterNumber, newConditionClusterNumbers);
        }
      }
      clusterNumber++;
    }
    intersectingCluster = purgeIntersectingClusterEntries(intersectingCluster);
    //convert into list
    List<LongArrayList> intersectingClusterList = new ArrayList<>();
    for (long partialUniqueCluster : intersectingCluster.keySet()) {
      intersectingClusterList.add(intersectingCluster.get(partialUniqueCluster));
    }

    Object2FloatArrayMap<List<ConditionEntry>>
        clustergroups =
        this.combineClusters(this.algorithm.frequency, satisfiedCluster,
                             intersectingClusterList);

    for (List<ConditionEntry> singleCondition : clustergroups.keySet()) {
      ResultSingleton.getInstance().addConditionToResult(partialUnique, singleCondition,
                                                         clustergroups.get(singleCondition));
    }
  }
}
 
Developer: HPI-Information-Systems, Project: metanome-algorithms, Lines: 53, Source: OrConditionTraverser.java


Example 20: uploadGraph

import it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap; // import the required package/class
@Override
public void uploadGraph(Graph graph) throws Exception {
	LOG.info("Importing graph \"{}\" into a Neo4j database", graph.getName());

	String databasePath = Paths.get(dbPath, graph.getName()).toString();

	InputStream propertiesStream = getClass().getResourceAsStream(PROPERTIES_PATH);
	Map<String, String> properties = MapUtil.load(propertiesStream);
	BatchInserter inserter = BatchInserters.inserter(databasePath, properties);

	LOG.debug("- Inserting vertices");

	Long2LongMap vertexIdMap = new Long2LongOpenHashMap((int)graph.getNumberOfVertices());
	try (BufferedReader vertexData = new BufferedReader(new FileReader(graph.getVertexFilePath()))) {
		Map<String, Object> propertiesCache = new HashMap<>(1, 1.0f);
		for (String vertexLine = vertexData.readLine(); vertexLine != null; vertexLine = vertexData.readLine()) {
			if (vertexLine.isEmpty()) {
				continue;
			}

			long vertexId = Long.parseLong(vertexLine);
			propertiesCache.put(ID_PROPERTY, vertexId);
			long internalVertexId = inserter.createNode(propertiesCache, (Label)Vertex);
			vertexIdMap.put(vertexId, internalVertexId);
		}
	}

	LOG.debug("- Inserting edges");

	try (BufferedReader edgeData = new BufferedReader(new FileReader(graph.getEdgeFilePath()))) {
		for (String edgeLine = edgeData.readLine(); edgeLine != null; edgeLine = edgeData.readLine()) {
			if (edgeLine.isEmpty()) {
				continue;
			}

			String[] edgeLineChunks = edgeLine.split(" ");
			if (edgeLineChunks.length != 2) {
				throw new IOException("Invalid data found in edge list: \"" + edgeLine + "\"");
			}

			inserter.createRelationship(vertexIdMap.get(Long.parseLong(edgeLineChunks[0])),
					vertexIdMap.get(Long.parseLong(edgeLineChunks[1])), EDGE, null);
		}
	}

	inserter.createDeferredSchemaIndex(Vertex).on(ID_PROPERTY).create();

	inserter.shutdown();

	LOG.debug("- Graph \"{}\" imported successfully", graph.getName());
}
 
Developer: atlarge-research, Project: graphalytics-platforms-neo4j, Lines: 52, Source: Neo4jPlatform.java



Note: the it.unimi.dsi.fastutil.longs.Long2LongOpenHashMap examples in this article were collected from open-source projects hosted on GitHub, MSDocs, and similar source-code and documentation platforms. The snippets remain the copyright of their original authors; when using or redistributing them, please follow the license of the corresponding project. Do not republish this article without permission.

