Counting word occurrences with Flink's aggregate fails with the error: Aggregate does not support grouping with KeySelector functions, yet.
val word01 = environment.fromCollection(List("flink", "spark", "spark", "hadoop"))
word01.map((_, 1))
  .groupBy(_._1)                 // grouping with a KeySelector function triggers the error
  .aggregate(Aggregations.SUM, 1)
  .print()
The fix is to group by the tuple's field position instead of a KeySelector function:
word01.map((_, 1))
  .groupBy(0)                    // field position index instead of a KeySelector
  .aggregate(Aggregations.SUM, 1)
  .print()
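For context, a self-contained sketch of the whole program using Flink's DataSet API is shown below; the imports, object name, and environment setup are assumptions, since the original snippet omits them:

```scala
import org.apache.flink.api.scala._
import org.apache.flink.api.java.aggregation.Aggregations

// Hypothetical wrapper object; the original snippet shows only the pipeline.
object WordCount {
  def main(args: Array[String]): Unit = {
    val environment = ExecutionEnvironment.getExecutionEnvironment
    val word01 = environment.fromCollection(List("flink", "spark", "spark", "hadoop"))
    word01.map((_, 1))
      .groupBy(0)                    // group by tuple field 0 (the word)
      .aggregate(Aggregations.SUM, 1) // sum tuple field 1 (the count)
      .print()
  }
}
```

Note that `aggregate` requires position-based grouping because it operates on tuple fields by index; a KeySelector function hides the field positions, which is why Flink rejects that combination.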
Copyright notice: This is an original article by qq_39570355, licensed under CC 4.0 BY-SA. Please include a link to the original source and this notice when reposting.