How to execute spark script saved in a text file

I have written a word-count Scala script in a text file and saved it in my home directory.
How can I call and execute wordcount.txt?

spark-submit wordcount.txt is not working.

wordcount.txt


val text = sc.textFile("/data/mr/wordcount/big.txt")
val counts = text.flatMap(line => line.split(" "))
  .map(word => (word.toLowerCase(), 1))
  .reduceByKey(_ + _)
  .sortBy(_._2, ascending = false)
counts.saveAsTextFile("count_output")
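As an aside, the transformation chain itself can be sanity-checked on plain Scala collections, independent of how the script is launched. The sketch below mirrors the flatMap/map/reduce/sort steps; the sample input lines are made up for illustration, and groupBy stands in for Spark's reduceByKey:

```scala
object WordCountCheck {
  // Same flatMap/map/reduce/sort chain as the Spark script,
  // expressed on local collections (groupBy stands in for reduceByKey).
  def countWords(lines: Seq[String]): Seq[(String, Int)] =
    lines
      .flatMap(_.split(" "))
      .map(word => (word.toLowerCase, 1))
      .groupBy(_._1)
      .map { case (w, ps) => (w, ps.map(_._2).sum) }
      .toSeq
      .sortBy(-_._2) // descending by count, like sortBy(_._2, false)

  def main(args: Array[String]): Unit = {
    // Hypothetical sample input standing in for big.txt
    println(countWords(Seq("To be or not to be", "to see or not to see")))
  }
}
```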

One solution I found is executing, inside spark-shell:

:load wordcount.txt

or

spark-shell -i wordcount.txt

Is there any other way as well?

Hi Uday,

Python scripts can be run the way you were trying, just by keeping the extension as .py, but Scala scripts can’t.

You will have to create a jar using sbt or other mechanisms. Please see the example project: https://github.com/cloudxlab/bigdata/tree/master/spark/projects/apache-log-parsing_sbt
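For reference, a minimal build.sbt for such a project might look like the sketch below; the project name and the Scala and Spark versions are assumptions, so match them to your cluster:

```scala
// build.sbt -- minimal sketch; versions here are examples only
name := "wordcount"
version := "0.1"
scalaVersion := "2.11.12"

// "provided" because spark-submit supplies Spark at runtime
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.8" % "provided"
```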

You will have to keep your script in src/main after wrapping it in the main method:

package com.cloudxlab.logparsing

import org.apache.spark._
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD
import org.apache.spark.SparkContext._

object EntryPoint {
    def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("WordCount")
        val sc = new SparkContext(conf)
        sc.setLogLevel("WARN")

        // This is where your code will be

        sc.stop()
    }
}
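Once the project compiles, the usual workflow is to package the jar and submit it with spark-submit. The commands below are a sketch: the class name matches the EntryPoint object above, while the exact jar path depends on your project name and Scala version (sbt prints it during packaging):

```shell
# Build the jar from the project root (requires sbt)
sbt package

# Submit the application; adjust the jar path to what sbt produced
spark-submit --class com.cloudxlab.logparsing.EntryPoint \
    target/scala-2.11/wordcount_2.11-0.1.jar
```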