I do not have Java 1.8 installed on my machine, and yet I am receiving this error. Other Stack Overflow answers suggest compiling the code with Java 1.8, but I want to understand the reason for the following error:
Exception in thread "main" java.lang.UnsupportedClassVersionError: com/typesafe/config/ConfigValue : Unsupported major.minor version 52.0
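For reference, the major version in that message maps directly to a JDK release: 50 is Java 6, 51 is Java 7, and 52 is Java 8. So the class com/typesafe/config/ConfigValue was compiled for Java 8, while the JVM running the job is older. Below is a minimal sketch (not part of the original post) that reads those version numbers straight out of a .class file extracted from the offending jar; the ClassVersionCheck name and the path argument are hypothetical:

import java.io.DataInputStream
import java.nio.file.{Files, Paths}

object ClassVersionCheck {
  def main(args: Array[String]): Unit = {
    // args(0): path to an extracted .class file (hypothetical; adjust for your setup)
    val in = new DataInputStream(Files.newInputStream(Paths.get(args(0))))
    try {
      // Every class file starts with the magic number 0xCAFEBABE
      require(in.readInt() == 0xCAFEBABE, "not a valid class file")
      val minor = in.readUnsignedShort()
      val major = in.readUnsignedShort() // 50 = Java 6, 51 = Java 7, 52 = Java 8
      println(s"major.minor = $major.$minor")
    } finally in.close()
  }
}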
Here is my code:
pom.xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.5.1</version>
</dependency>
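Presumably spark-core_2.10 1.5.1 pulls in com.typesafe:config transitively; running mvn dependency:tree can confirm the exact version on the classpath. As a further check, the sketch below (the WhichJar name is hypothetical) locates the jar that provides ConfigValue without loading the class; getResource only finds the file, so it cannot itself throw UnsupportedClassVersionError:

object WhichJar {
  def main(args: Array[String]): Unit = {
    // Resolve the class file as a classpath resource instead of loading the class
    val url = getClass.getResource("/com/typesafe/config/ConfigValue.class")
    println(Option(url).map(_.toString).getOrElse("com.typesafe.config is not on the classpath"))
  }
}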
Scala Code
import org.apache.spark.{SparkConf, SparkContext}

object SparkWithHbase {
  def main(args: Array[String]) {
    System.out.println("Java Version: " + System.getProperty("java.version"))

    // Initiate the Spark context with the Spark master URL. Modify the URL for your environment.
    val sparkConf = new SparkConf().setAppName("Spark Hbase").setMaster("spark://10.41.50.126:7077")
    val sc = new SparkContext(sparkConf) // Failing at this line
  }
}
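As a defensive measure, a guard like the sketch below (the JavaVersionGuard name is hypothetical, and this assumes your own code is compiled for the JVM you actually run on) fails fast with a readable message before the SparkContext is constructed:

object JavaVersionGuard {
  // A minimal sketch: fail fast instead of hitting an
  // UnsupportedClassVersionError deep inside a dependency
  def requireJava8(): Unit = {
    val spec = System.getProperty("java.specification.version") // e.g. "1.7", "1.8"
    val ok = spec.split('.') match {
      case Array("1", minor) => minor.toInt >= 8
      case Array(major, _*)  => major.toInt >= 8 // Java 9+ reports "9", "11", ...
    }
    require(ok, s"Java 8 or newer is required, but this JVM is $spec")
  }
}

Calling JavaVersionGuard.requireJava8() as the first line of main turns the opaque class-version error into an explicit message about the running JVM.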