Common Hadoop Errors in Hive | HDFS | MapReduce for Developers & Admins





While writing Hive queries and MapReduce programs in a Hadoop cluster, developers commonly hit errors at both processing time and storage time, such as permission denied while accessing folders, out-of-memory issues, and so on. This post collects the most common Hadoop errors along with their resolutions.

Hadoop common errors:

HDFS errors:

1. Fatal error: java.lang.ArrayIndexOutOfBoundsException

This typically occurs when large data files stored in HDFS exceed the expected size limit, so processing runs past the end of an array.

2. mkdir: Permission denied: user=admin, access=READ

This permission-denied error appears when accessing NameNode or DataNode folders in Cloudera, Hortonworks, or MapR. Granting the user the required Read/Write access on the node or folder resolves the issue.
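As a sketch of that fix, the commands below grant a user ownership of an HDFS directory. The directory path and user name are assumptions for illustration; adjust them to your cluster.

```shell
# Hypothetical example: give the 'admin' user access to /user/admin in HDFS.
TARGET_DIR=/user/admin
TARGET_USER=admin

# Run only where the HDFS client is installed.
if command -v hdfs >/dev/null 2>&1; then
    # Create the directory and hand ownership to the user (as the 'hdfs' superuser).
    sudo -u hdfs hdfs dfs -mkdir -p "$TARGET_DIR"
    sudo -u hdfs hdfs dfs -chown "$TARGET_USER" "$TARGET_DIR"
    # Grant read/write/execute to the owner, read/execute to group and others.
    sudo -u hdfs hdfs dfs -chmod 755 "$TARGET_DIR"
fi
```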

3. hadoop -version: Could not find or load main class

Error: No command named '-version' was found. Perhaps you meant 'hadoop version'. The version sub-command takes no leading dash, so run hadoop version instead.
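The correct invocation looks like this (guarded so it only runs where the Hadoop CLI is installed):

```shell
# 'hadoop -version' is not a valid sub-command; the version flag takes no dash.
HADOOP_VERSION_CMD="hadoop version"

if command -v hadoop >/dev/null 2>&1; then
    # Prints the Hadoop release, build info, and checksum.
    $HADOOP_VERSION_CMD
fi
```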

4. Hadoop: Could not find or load main class org.apache.hadoop.util.VersionInfo
5. Hadoop: hdfs namenode -format: Could not find or load main class
6. hadoop fs -put command error in the Hadoop cluster
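For the hadoop fs -put error, a common cause is that the destination directory does not exist in HDFS. A minimal sketch of a safer upload, assuming hypothetical file and directory names:

```shell
# Assumed names for illustration; replace with your own file and HDFS path.
SRC_FILE=sample.txt
DEST_DIR=/user/admin/input

if command -v hadoop >/dev/null 2>&1; then
    # -test -d returns non-zero when the directory is missing, so create it first.
    hadoop fs -test -d "$DEST_DIR" || hadoop fs -mkdir -p "$DEST_DIR"
    hadoop fs -put "$SRC_FILE" "$DEST_DIR"/
fi
```

If the command still fails, also check that the NameNode is out of safe mode and that the user has write permission on the destination directory.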

MapReduce errors:




1. Hadoop MapReduce: Could not find or load main class com.sun.tools.javac.Main
2. Hadoop: MapReduce WordCount program error
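The com.sun.tools.javac.Main error usually means Hadoop cannot see the JDK compiler. A common fix, assuming JDK 8 where the compiler lives in tools.jar, is to point HADOOP_CLASSPATH at it before compiling the job:

```shell
# Make the JDK compiler visible to Hadoop (JDK 8 layout assumed).
export HADOOP_CLASSPATH="$JAVA_HOME/lib/tools.jar"

if command -v hadoop >/dev/null 2>&1; then
    # Compile the job source; WordCount.java is a placeholder file name.
    hadoop com.sun.tools.javac.Main WordCount.java
fi
```

On JDK 9 and later there is no tools.jar; the compiler is part of the JDK modules, so this particular fix only applies to JDK 8 installations.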

Hive errors:

1. Hive: Return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
2. Caused by: org.apache.hadoop.hive.serde2.SerDeException: org.codehaus.jackson.JsonParseException
3. FAILED: Error in Metadata: java.lang.RuntimeException: Unable to connect to org.apache.hadoop.hive.metastore.HiveMetaStoreClient
4. JDBC connect string for a JDBC metastore error while connecting to the database
5. Caused by: java.net.ConnectException: Connection refused
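The metastore and connection-refused errors above usually mean the Hive metastore service is not running or is listening on a different host or port. A quick check, assuming the common default metastore port 9083 (these values are assumptions, not taken from any specific cluster):

```shell
# Default metastore location; adjust for your deployment.
METASTORE_HOST=localhost
METASTORE_PORT=9083

if command -v nc >/dev/null 2>&1; then
    # -z just probes the port without sending data.
    nc -z "$METASTORE_HOST" "$METASTORE_PORT" \
        && echo "metastore reachable" \
        || echo "metastore not reachable - start it with: hive --service metastore"
fi
```

Also verify that javax.jdo.option.ConnectionURL in hive-site.xml points at the correct metastore database.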





Summary: This post collected common Hadoop errors, with resolutions, for HDFS, Hive, and MapReduce in a Hadoop cluster. Because the Hadoop ecosystem is made up of many services, errors can come from any of them. In HDFS the typical problems are out-of-memory and folder permission-denied issues; in Hive they are mostly metastore and connection-refused issues; and in MapReduce a classic example is the WordCount program failing while executing in the cluster environment.