Stream
Call the `stream()` method on a collection to obtain a Stream object.
```java
new ArrayList<>().stream()
```
Filter
`filter` takes a predicate function: if the function returns true for an element, the element is kept; otherwise it is discarded.
```java
stringCollection.stream()
    .filter((s) -> s.startsWith("a"))
    .forEach(System.out::println);
```
Foreach
`forEach` traverses the stream and consumes each element.
```java
stringCollection.stream()
    .filter((s) -> s.startsWith("a"))
    .forEach(System.out::println);
```
Map
`map` also traverses the stream, but it transforms each element and returns a new stream, so further calls can be chained after it. `forEach`, by contrast, returns nothing: it is purely for consumption, so nothing can be chained after it.
```java
stringCollection.stream()
    .map(String::toUpperCase)
    .sorted(Comparator.reverseOrder())
    .forEach(System.out::println);
```
Sorted
`sorted` sorts the stream. It takes a Comparator; called without arguments it uses the natural ordering.
```java
stringCollection.stream()
    .sorted((x, y) -> y.length() - x.length())
    .filter((s) -> s.startsWith("a"))
    .forEach(System.out::println);
```
Match
Returns true or false depending on whether the stream's elements satisfy the given predicate. The variants are:
- allMatch
- anyMatch
- noneMatch
```java
boolean anyStartsWithA = stringCollection.stream()
    .anyMatch((s) -> s.startsWith("a"));

boolean allStartsWithA = stringCollection.stream()
    .allMatch((s) -> s.startsWith("a"));

boolean noneStartsWithZ = stringCollection.stream()
    .noneMatch((s) -> s.startsWith("z"));
```
count
Counts the number of elements in the stream.
```java
long startsWithB = stringCollection.stream()
    .filter((s) -> s.startsWith("b"))
    .count();
```
reduce
`reduce` combines the elements of the stream pairwise: each call receives the previous result and the next element taken from the stream. When no identity value is supplied, the first call uses the first and second elements by default.
A simple way to picture it: the first step combines element 0 and element 1; the result of that step becomes the first argument of the next step, which takes element 2 as its second argument, and so on.
```java
Optional<String> reduced = stringCollection.stream()
    .sorted()
    .reduce((s1, s2) -> s1 + "#" + s2);
```
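To make the pairwise combination concrete, here is a minimal, self-contained sketch with hypothetical sample data (the class and helper names are mine, not from the original):

```java
import java.util.List;

public class ReduceDemo {
    // Joins sorted elements with '#', mirroring the reduce above.
    static String join(List<String> values) {
        // sorted() orders the elements first; reduce then combines pairwise:
        // step 1: "aaa" + "#" + "bbb"      -> "aaa#bbb"
        // step 2: "aaa#bbb" + "#" + "ccc"  -> "aaa#bbb#ccc"
        return values.stream()
                .sorted()
                .reduce((s1, s2) -> s1 + "#" + s2)
                .orElse("");
    }

    public static void main(String[] args) {
        System.out.println(join(List.of("bbb", "aaa", "ccc")));  // aaa#bbb#ccc
    }
}
```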
parallelStream
Parallel streams process elements concurrently, which can be more efficient. Traversing a sequential stream with `forEach` raises no thread-safety problem, but with `parallelStream` any external mutable state the pipeline touches, such as a collection being written to, must be thread-safe, otherwise you get race conditions. To stay safe, accumulate results through `reduce` or `collect` instead of mutating shared state, even though that takes more effort.
```java
long count = values.parallelStream().sorted().count();
```
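A minimal sketch of the safe pattern described above (the class and helper names are mine): instead of adding to a shared ArrayList from a parallel stream, let `collect` give each worker its own container and merge them, so no synchronization is needed.

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class ParallelCollectDemo {
    // Collects the even numbers below n from a parallel stream.
    // collect supplies each worker thread its own container and merges
    // the partial results, so there is no shared mutable state.
    static List<Integer> evensBelow(int n) {
        return IntStream.range(0, n)
                .parallel()
                .filter(i -> i % 2 == 0)
                .boxed()
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Integer> evens = evensBelow(1000);
        System.out.println(evens.size());  // 500, in encounter order despite parallelism
    }
}
```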
IntStream.range(a,b)
Generates the integers from a to b directly, following the usual programming convention: the start is inclusive and the end is exclusive.
```java
IntStream.range(0, 10).forEach(System.out::println);
```
The output is 0 through 9, each on its own line.
new Random().ints()
Produces a stream of random values. The stream is infinite, so bound it with `limit`.
```java
new Random().ints().limit(10).forEach(System.out::println);
```
Supplier
```java
Supplier<String> stringSupplier = String::new;
stringSupplier.get();
```
Supplier is a functional interface with a single abstract method, `get()`, which takes no arguments and directly returns an instance of the generic type T. It behaves like a no-argument constructor.
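A common use of Supplier in the stream API is `Stream.generate`, which calls `get()` once per element; a small sketch (the class and helper names are mine):

```java
import java.util.List;
import java.util.function.Supplier;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class SupplierDemo {
    // Repeats the supplier's value n times via Stream.generate.
    // The generated stream is infinite until bounded with limit().
    static List<String> repeat(Supplier<String> supplier, int n) {
        return Stream.generate(supplier).limit(n).collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Supplier<String> stringSupplier = String::new;
        System.out.println(stringSupplier.get().isEmpty());  // true: like a no-arg constructor

        System.out.println(repeat(() -> "hi", 3));  // [hi, hi, hi]
    }
}
```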
Consumer
1. accept method
The only abstract method of this functional interface; it takes one argument and returns nothing.
2. andThen method
Runs the caller's `accept` first, then the `accept` of the Consumer passed in.
```java
import java.util.function.Consumer;

public class ConsumerTest {
    public static void main(String[] args) {
        Consumer<Integer> consumer = (x) -> {
            int num = x * 2;
            System.out.println(num);
        };
        Consumer<Integer> consumer1 = (x) -> {
            int num = x * 3;
            System.out.println(num);
        };
        consumer.andThen(consumer1).accept(10);
    }
}
```
`consumer.accept(10)` runs first and prints 20, then `consumer1.accept(10)` runs and prints 30.
ifPresent
If the Optional holds a value, execute the given action; otherwise do nothing.
```java
IntStream.builder()
    .add(1).add(3).add(5).add(7).add(11)
    .build()
    .average()
    .ifPresent(System.out::println);
```
The result of `average()` is an optional, which is why `ifPresent` applies here.
Collect
It has two overloads:
```java
<R> R collect(Supplier<R> supplier,
              BiConsumer<R, ? super T> accumulator,
              BiConsumer<R, R> combiner);

<R, A> R collect(Collector<? super T, A, R> collector);
```
The following mainly introduces the use of these two methods:
1. Function
The signature of the first overload is as follows:
```java
<R> R collect(Supplier<R> supplier,
              BiConsumer<R, ? super T> accumulator,
              BiConsumer<R, R> combiner);
```
- The supplier parameter provides a container. The overall result of the collect operation is a value of type R, and the supplier must return an R, so what it returns is the container into which elements are collected.
- The accumulator parameter receives the R container together with a T element and must put that element into the container; this is the step that adds elements.
- The combiner parameter receives two containers and defines how containers are merged when several are produced, as in parallel execution.
A simple example:
```java
String concat = stringStream.collect(StringBuilder::new,
                                     StringBuilder::append,
                                     StringBuilder::append).toString();

// Equivalent to the above, spelled out with lambdas for clarity
String concat = stringStream.collect(() -> new StringBuilder(),
                                     (l, x) -> l.append(x),
                                     (r1, r2) -> r1.append(r2)).toString();
```
2. Collector interface
The second overload is the more advanced usage, built on the Collector interface:
```java
<R, A> R collect(Collector<? super T, A, R> collector);
```
As you can see, it returns an R-type value, i.e. a container.
The Collector interface is what makes the collect operation powerful. Most operations decompose into its main steps: provide the initial container -> add elements to the container -> merge containers under concurrency -> transform the aggregated result.
```java
static class CollectorImpl<T, A, R> implements Collector<T, A, R> {
    private final Supplier<A> supplier;
    private final BiConsumer<A, T> accumulator;
    private final BinaryOperator<A> combiner;
    private final Function<A, R> finisher;
    private final Set<Characteristics> characteristics;

    CollectorImpl(Supplier<A> supplier,
                  BiConsumer<A, T> accumulator,
                  BinaryOperator<A> combiner,
                  Function<A, R> finisher,
                  Set<Characteristics> characteristics) {
        this.supplier = supplier;
        this.accumulator = accumulator;
        this.combiner = combiner;
        this.finisher = finisher;
        this.characteristics = characteristics;
    }

    CollectorImpl(Supplier<A> supplier,
                  BiConsumer<A, T> accumulator,
                  BinaryOperator<A> combiner,
                  Set<Characteristics> characteristics) {
        this(supplier, accumulator, combiner, castingIdentity(), characteristics);
    }

    @Override
    public BiConsumer<A, T> accumulator() { return accumulator; }

    @Override
    public Supplier<A> supplier() { return supplier; }

    @Override
    public BinaryOperator<A> combiner() { return combiner; }

    @Override
    public Function<A, R> finisher() { return finisher; }

    @Override
    public Set<Characteristics> characteristics() { return characteristics; }
}
```
You can construct a CollectorImpl directly and pass these functions in. A simpler route is `Collector.of(...)`, which accepts the same functions and is equivalent to creating a CollectorImpl.
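As a sketch of that route, the string-joining collector from earlier can be built with `Collector.of`, passing the same four functions a CollectorImpl would receive (the class name is mine):

```java
import java.util.stream.Collector;
import java.util.stream.Stream;

public class CollectorOfDemo {
    // A joining-style collector built with Collector.of:
    static final Collector<CharSequence, StringBuilder, String> JOINER =
            Collector.of(
                    StringBuilder::new,        // supplier: provide the container
                    StringBuilder::append,     // accumulator: add an element
                    StringBuilder::append,     // combiner: merge two containers
                    StringBuilder::toString);  // finisher: produce the result

    public static void main(String[] args) {
        String joined = Stream.of("a", "b", "c").collect(JOINER);
        System.out.println(joined);  // abc
    }
}
```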
3. Utility collectors
1. toList()
- Container: ArrayList::new
- Add to container: List::add
- Merge containers: left.addAll(right); return left;
```java
public static <T> Collector<T, ?, List<T>> toList() {
    return new CollectorImpl<>((Supplier<List<T>>) ArrayList::new,
                               List::add,
                               (left, right) -> { left.addAll(right); return left; },
                               CH_ID);
}
```
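In everyday use, toList() is reached through Collectors; a minimal sketch with hypothetical sample data (the class and helper names are mine):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class ToListDemo {
    // Filters a stream into a List via Collectors.toList().
    static List<String> startingWithA(Stream<String> words) {
        return words.filter(s -> s.startsWith("a"))
                    .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(startingWithA(Stream.of("apple", "banana", "avocado")));
        // [apple, avocado]
    }
}
```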
2. joining()
- Container: StringBuilder::new
- Add to container: StringBuilder::append
- Merge containers: r1.append(r2); return r1;
- Finisher applied after aggregation: StringBuilder::toString
```java
public static Collector<CharSequence, ?, String> joining() {
    return new CollectorImpl<CharSequence, StringBuilder, String>(
            StringBuilder::new,
            StringBuilder::append,
            (r1, r2) -> { r1.append(r2); return r1; },
            StringBuilder::toString,
            CH_NOID);
}
```
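Usage sketch: the no-argument form simply concatenates, and the overload `joining(delimiter, prefix, suffix)` inserts a separator and wraps the result (the class name is mine):

```java
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class JoiningDemo {
    public static void main(String[] args) {
        // No-argument form: plain concatenation
        String plain = Stream.of("a", "b", "c").collect(Collectors.joining());
        System.out.println(plain);    // abc

        // Overload with delimiter, prefix and suffix
        String wrapped = Stream.of("a", "b", "c")
                .collect(Collectors.joining(", ", "[", "]"));
        System.out.println(wrapped);  // [a, b, c]
    }
}
```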
3. groupingBy()
groupingBy is a more capable relative of toMap; it makes up for the fact that toMap cannot apply diversified collection operations to the values. For example, producing a `Map<T, List<E>>` is awkward with toMap. The focus of groupingBy is the processing and packaging of keys and values. In the signature below, classifier derives the key from each element, mapFactory supplies the concrete Map type used as the container, and downstream is the collector applied to the values.
```java
public static <T, K, D, A, M extends Map<K, D>>
Collector<T, ?, M> groupingBy(Function<? super T, ? extends K> classifier,
                              Supplier<M> mapFactory,
                              Collector<? super T, A, D> downstream) {
    .......
}
```
A simple example
```java
// Primitive form
Lists.<Person>newArrayList().stream()
    .collect(() -> new HashMap<Integer, List<Person>>(),
             (h, x) -> {
                 List<Person> value = h.getOrDefault(x.getType(), Lists.newArrayList());
                 value.add(x);
                 h.put(x.getType(), value);
             },
             HashMap::putAll);

// groupingBy form
Lists.<Person>newArrayList().stream()
    .collect(Collectors.groupingBy(Person::getType, HashMap::new, Collectors.toList()));

// Because the values go through a downstream collector, they can be transformed flexibly
Lists.<Person>newArrayList().stream()
    .collect(Collectors.groupingBy(Person::getType, HashMap::new,
                                   Collectors.mapping(Person::getName, Collectors.toSet())));

// There is also a simpler form: pass a single argument and group by the key
Map<Integer, List<Person>> personsByAge = persons.stream()
    .collect(Collectors.groupingBy(p -> p.age));
```
4. reducing()
reducing collects to a single value: the result is not a collection but a single value of type T.
- Container: boxSupplier(identity), which wraps the identity in an Object[] of length 1; the wrapper is needed because the accumulated value itself is immutable and must be replaced in place.
- Add to container: a[0] = op.apply(a[0], t)
- Merge containers: a[0] = op.apply(a[0], b[0]); return a;
- Finisher applied after aggregation: unwrap the array, a -> a[0]
- Characteristics field: CH_NOID
```java
public static <T> Collector<T, ?, T> reducing(T identity, BinaryOperator<T> op) {
    return new CollectorImpl<>(
            boxSupplier(identity),
            (a, t) -> { a[0] = op.apply(a[0], t); },
            (a, b) -> { a[0] = op.apply(a[0], b[0]); return a; },
            a -> a[0],
            CH_NOID);
}
```
In short, this does the same thing as reduce: identity is the initial value of the reduction, just wrapped in an array of length 1.
```java
// Hand-rolled equivalent
final Integer[] integers = Lists.newArrayList(1, 2, 3, 4, 5).stream()
    .collect(() -> new Integer[]{0},
             (a, x) -> a[0] += x,
             (a1, a2) -> a1[0] += a2[0]);

// reducing collector
final Integer collect = Lists.newArrayList(1, 2, 3, 4, 5).stream()
    .collect(Collectors.reducing(0, Integer::sum));

// Of course, Stream also provides the reduce operation directly
final Integer sum = Lists.newArrayList(1, 2, 3, 4, 5).stream()
    .reduce(0, Integer::sum);
```