ClassNotFoundException in Spark while submitting the jar file

First, I wrote a Spark application program in Scala.

ClassNotFoundExceptions in Spark:

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

object sparkDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Deploy").setMaster("local[*]")
    val sc = new SparkContext(conf)
    // Read the input log file and keep only the lines containing "Population"
    val data = sc.textFile("file:///home//Spark//Test//Input.log")
    val file_data = data.filter(x => x.contains("Population"))
    println("Filtered Data")
    file_data.collect().foreach(println)
    sc.stop()
  }
}
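To package the program above into a submittable jar, a minimal sbt build could look like the sketch below. The version numbers are illustrative assumptions; they must match the Spark installation you submit to.

```scala
// build.sbt (sketch — adjust scalaVersion and the Spark version to your environment)
name := "sparkDemo"
version := "0.0.1-SNAPSHOT"
scalaVersion := "2.12.18"

libraryDependencies ++= Seq(
  // "provided" keeps Spark itself out of the jar, since the cluster supplies it
  "org.apache.spark" %% "spark-core" % "3.3.2" % "provided",
  "org.apache.spark" %% "spark-sql"  % "3.3.2" % "provided"
)
```

Running `sbt package` then produces the `sparkDemo-0.0.1-SNAPSHOT.jar` used in the commands below.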

While submitting the Spark application jar file with spark-submit from the command line, I got the error below:

spark-submit --class sparkDemo.sparkDemo --master local[*] file:///home//Spark//Output/sparkDemo-0.0.1-SNAPSHOT.jar

Error 1:

Exception in thread "main" java.lang.ClassNotFoundException: Failed to find data source: kafka.
Caused by: java.lang.ClassNotFoundException: kafka.DefaultSource


bin/spark-submit --class sparkDemo.Main --master local[*] file:///home//Spark//Output/sparkDemo-0.0.1-SNAPSHOT.jar

After submitting the Spark jar using the above command, it worked.
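If the `kafka.DefaultSource` error persists even with the correct class name, the usual cause is that the Kafka connector jar is not on the classpath. One common fix, sketched below, is to pass it with `--packages`; the artifact coordinates are an assumption and must match your Scala (here 2.12) and Spark (here 3.3.2) versions.

```shell
# Sketch: pull in the Kafka connector at submit time (versions are illustrative)
bin/spark-submit \
  --class sparkDemo.Main \
  --master local[*] \
  --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.3.2 \
  file:///home//Spark//Output/sparkDemo-0.0.1-SNAPSHOT.jar
```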

Error 2:

java.lang.ClassNotFoundException: Failed to find data source: org.apache.spark.sql.avro.AvroFileFormat. Please find packages at http://spark.apache.org/third-party-projects.html
  at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:438)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.avro.AvroFileFormat.DefaultSource


The above error is an Avro file format exception caused by a version incompatibility with Spark. So first check that the Spark Core and Spark SQL versions are compatible with each other.

After fixing the compatibility issue between Spark Core and Spark SQL, add the matching Spark-Avro dependency.
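Once the Spark Core and Spark SQL versions line up, the Avro connector can be added at the same version. The sketch below shows an sbt dependency line; the version shown is an assumption and should be replaced with the one matching your Spark build.

```scala
// build.sbt addition (sketch) — keep spark-avro at the same version as spark-core/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-avro" % "3.3.2"
```

Alternatively, the same connector can be supplied at submit time with `--packages org.apache.spark:spark-avro_2.12:3.3.2` (again, versions must match your setup).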

Summary: Here we resolved two ClassNotFoundException errors: one during Spark jar file submission and another from the Spark-Avro file format. The exact solutions are shown above.