Writing WordCount in spark-shell

xiaoxiao2021-02-28  116

First start the Master:

    ./sbin/start-master.sh

Then start a Worker, pointing it at the Master (note the spark:// URL scheme):

    ./sbin/start-slave.sh spark://master:7077

Launch the Spark shell against the cluster:

    ./bin/spark-shell --master spark://master:7077

In the Spark shell, run the following commands (local paths need the full file:/// prefix):

    val line = sc.textFile("file:///home/spark/spark-2.2.0-bin-hadoop2.7/README.md")
    val result = line.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
    result.saveAsTextFile("file:///home/spark/spark-2.2.0-bin-hadoop2.7/out")
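The flatMap/map/reduceByKey pipeline above can be sketched with plain Scala collections, so the transformation is easy to follow without a running cluster. This is a local sketch, not Spark itself: the sample input lines are made up, and reduceByKey is approximated with groupBy plus a sum, which is equivalent for this purpose on in-memory collections.

```scala
// Local sketch of the word-count pipeline; sample input is hypothetical.
object WordCountSketch {
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split(" "))   // split each line into words
      .map((_, 1))             // pair each word with a count of 1
      .groupBy(_._1)           // reduceByKey(_ + _) ~ groupBy + sum locally
      .map { case (word, pairs) => (word, pairs.map(_._2).sum) }

  def main(args: Array[String]): Unit = {
    val result = wordCount(Seq("hello spark", "hello world"))
    println(result.toList.sortBy(_._1)) // prints List((hello,2), (spark,1), (world,1))
  }
}
```

In Spark the same chain runs distributed over RDD partitions, with reduceByKey combining counts per key across the cluster before the result is written out.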
Please credit the original source when reposting: https://www.6miu.com/read-52265.html
