Spark with Cassandra Integration Error

I am getting an error while connecting Spark to Cassandra on top of a Hadoop cluster. In this article, we explain the Spark with Cassandra integration connection error and provide a solution for it.

Spark with Cassandra Integration Connection Error:

Server access error at url https://repo1.maven.org/maven2/org/scala-lang/scala-reflect/2.11.12/scala-reflect-2.11.12.jar (javax.net.ssl.SSLException: Received fatal alert: protocol_version)
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: commons-beanutils#commons-beanutils;1.8.0: not found, unresolved dependency: org.joda#joda-convert;1.2: not found, unresolved dependency: joda-time#joda-time;2.3: not found, unresolved dependency: io.netty#netty-all;4.0.33.Final: not found, unresolved dependency: com.twitter#jsr166e;1.1.0: not found, unresolved dependency: org.scala-lang#scala-reflect;2.11.12: not found]
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1076)
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:294)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:158)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
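
The key line is the javax.net.ssl.SSLException: Received fatal alert: protocol_version. It means the JVM tried to negotiate an old TLS version (1.0 or 1.1) that repo1.maven.org no longer accepts, so every dependency download failed and Ivy reported all of them as unresolved. The error typically appears when launching spark-shell or spark-submit with the connector passed as a package, for example (the connector coordinates below are illustrative and should match your Spark and Scala versions):

spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.11:2.3.2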

Solution:

After getting the above error, I executed the command below.

echo 'export JAVA_TOOL_OPTIONS="-Dhttps.protocols=TLSv1.2"' >> ~/.bashrc
source ~/.bashrc
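
JAVA_TOOL_OPTIONS is read by every JVM that starts afterwards, so the -Dhttps.protocols=TLSv1.2 property forces HTTPS connections, including the Ivy downloads behind --packages, to use TLS 1.2. If you prefer not to modify ~/.bashrc, the same property can be set for a single run; as before, the connector coordinates are illustrative:

JAVA_TOOL_OPTIONS="-Dhttps.protocols=TLSv1.2" spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.11:2.3.2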

After the command completes (and the shell is reopened so the setting takes effect), I am able to connect from the Spark cluster to Cassandra using the connector.
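
Once the shell starts cleanly, the connection can be verified with a short read through the connector API. A minimal sketch, assuming Cassandra runs locally (the host, keyspace, and table names below are placeholders):

spark-shell --conf spark.cassandra.connection.host=127.0.0.1 --packages com.datastax.spark:spark-cassandra-connector_2.11:2.3.2

Then, inside the shell:

import com.datastax.spark.connector._
val rows = sc.cassandraTable("test_keyspace", "test_table")  // placeholder keyspace and table
println(rows.count())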


Summary: I tried to integrate Spark and Cassandra on top of a Hadoop cluster. After opening spark-shell and running a Spark-Cassandra connector command, I got the above error. I first tried replacing the Cassandra jar files, but the error persisted, so I checked whether the Spark and Cassandra connector jar versions were compatible. Everything there was fine; the issue was only resolved once I added the JAVA_TOOL_OPTIONS export shown above. This kind of connection error is not specific to Cassandra: other components can hit the same problem when integrating with Spark, for example Spark with MySQL, as illustrated below. In most such cases the cause is connection related, such as incompatible jar files or Java/TLS version issues. Here we provided a simple solution for this error in the Spark environment. In a Big Data environment we work with large data sets that need to be processed with Spark and other tools, so during data processing Spark has to integrate with tools like Cassandra, Sqoop, and Kafka.
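
For instance, fetching the MySQL JDBC driver through --packages fails with the same protocol_version alert on an old JVM, and the same TLS setting resolves it (the driver coordinates are illustrative):

JAVA_TOOL_OPTIONS="-Dhttps.protocols=TLSv1.2" spark-shell --packages mysql:mysql-connector-java:5.1.47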