The combiner does not reduce a list of 0's and 1's. When the stream is not run in parallel, the combiner is not used at all, and the reduction is equivalent to the following loop:
U result = identity;
for (T element : this stream)
    result = accumulator.apply(result, element);
return result;
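As a minimal, self-contained sketch (the 0/1 list below is made up for illustration), the sequential three-argument reduce produces the same result as that loop:

import java.util.List;

public class SequentialReduceDemo {
    public static void main(String[] args) {
        // hypothetical 0/1 values, as in the question's mapping
        List<Integer> flags = List.of(1, 0, 0, 1, 0, 0);

        // identity = 0, accumulator adds each element; on a sequential
        // stream the combiner (Integer::sum) is never invoked
        int reduced = flags.stream().reduce(0, Integer::sum, Integer::sum);

        // the equivalent explicit loop
        int looped = 0;
        for (int flag : flags) {
            looped = looped + flag;
        }

        System.out.println(reduced + " " + looped); // 2 2
    }
}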
When you run the stream in parallel, the work is split across multiple threads: the data in the pipeline is partitioned into chunks that are evaluated independently, each producing a partial result, and the combiner is then used to merge those results.
So you won't see a whole list being reduced; the combiner only ever receives two values, either the identity value or a partial result computed by another task, and adds them together. For example, if you add a print statement in the combiner
(i1, i2) -> {System.out.println("Merging: "+i1+"-"+i2); return i1+i2;});
you could see something like this:
Merging: 0-0
Merging: 0-0
Merging: 1-0
Merging: 1-0
Merging: 1-1
This would be helpful in debugging more complex situations, which I'm sure I'll come across eventually.
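To see that merging end to end, here is a self-contained sketch (the word list is hypothetical) that runs the reduction on a parallel stream with the printing combiner; the exact "Merging" lines vary between runs:

import java.util.List;

public class ParallelCombinerDemo {
    public static void main(String[] args) {
        // hypothetical input; each chunk is reduced independently,
        // then the combiner merges the partial results
        List<String> words = List.of("Apple", "Banana", "Avocado", "Cherry", "Apricot", "Date");

        int count = words.parallelStream().reduce(0,
                (sum, word) -> sum + (word.charAt(0) == 'A' ? 1 : 0),
                (i1, i2) -> {System.out.println("Merging: " + i1 + "-" + i2); return i1 + i2;});

        System.out.println("Count: " + count); // Count: 3
    }
}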
More generally, if you want to inspect the data flowing through the pipeline, you can use peek
(or a debugger could also help). Applied to your example:
long countOfAWords = result.stream().map(s -> s.charAt(0) == 'A' ? 1 : 0).peek(System.out::print).mapToLong(l -> l).sum();
which can output:
100100
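Since result isn't shown above, here is a self-contained version of that pipeline, with a hypothetical list standing in for it:

import java.util.List;

public class PeekDemo {
    public static void main(String[] args) {
        // hypothetical stand-in for the question's result list
        List<String> result = List.of("Apple", "Banana", "Cherry", "Avocado", "Date", "Fig");

        long countOfAWords = result.stream()
                .map(s -> s.charAt(0) == 'A' ? 1 : 0)
                .peek(System.out::print)   // prints each 0/1 as it passes through the pipeline
                .mapToLong(l -> l)
                .sum();

        System.out.println();
        System.out.println(countOfAWords); // prints 100100, then 2
    }
}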
[Disclaimer: I realize this is probably not the best way to write this code; it's just for practice!]
The idiomatic way to achieve your task would be to filter the stream and then simply use count:
long countOfAWords = result.stream().filter(s -> s.charAt(0) == 'A').count();
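For completeness, a runnable sketch of the filter-and-count approach with the same hypothetical list; it avoids the 0/1 mapping entirely:

import java.util.List;

public class FilterCountDemo {
    public static void main(String[] args) {
        // hypothetical stand-in for the question's result list
        List<String> result = List.of("Apple", "Banana", "Cherry", "Avocado", "Date", "Fig");

        // keep only the words starting with 'A', then count them
        long countOfAWords = result.stream()
                .filter(s -> s.charAt(0) == 'A')
                .count();

        System.out.println(countOfAWords); // 2
    }
}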
Hope it helps! :)