After installing HBase on a Hadoop cluster and starting the HBase services, I got the error below.
Here is the full error:
WARN [RpcServer.FifoWFPBQ.default.handler=27,queue=0,port=17471] io.FSDataInputStreamWrapper: Failed to invoke 'unbuffer' method in class org.apache.hadoop.fs.FSDataInputStream. So there may be a TCP socket connection left open in CLOSE_WAIT state.
java.lang.reflect.InvocationTargetException
	at sun.reflect.GeneratedMethodAccessor114.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.hbase.io.FSDataInputStreamWrapper.unbuffer(FSDataInputStreamWrapper.java:273)
	at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderImpl.unbufferStream(HFileBlock.java:1403)
Caused by: java.lang.UnsupportedOperationException: this stream does not support unbuffering.
	at org.apache.hadoop.fs.FSDataInputStream.unbuffer(FSDataInputStream.java:534)
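To see why the warning looks the way it does: HBase invokes unbuffer() on its input streams via reflection, so when the underlying stream does not support the operation, the UnsupportedOperationException comes back wrapped in an InvocationTargetException, as in the trace above. The following is a minimal, self-contained sketch of that mechanism; LegacyStream and its message are hypothetical stand-ins, not real Hadoop classes.

```java
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;

// Hypothetical stand-in for a stream that, like some FSDataInputStream
// wrappers, does not implement unbuffer() support.
class LegacyStream {
    public void unbuffer() {
        throw new UnsupportedOperationException("this stream does not support unbuffering.");
    }
}

public class UnbufferDemo {
    public static void main(String[] args) {
        LegacyStream in = new LegacyStream();
        try {
            // HBase calls unbuffer() reflectively, so the failure surfaces
            // as an InvocationTargetException wrapping the original
            // UnsupportedOperationException seen in the log.
            Method m = LegacyStream.class.getMethod("unbuffer");
            m.invoke(in);
        } catch (InvocationTargetException e) {
            System.out.println("Failed to invoke unbuffer: " + e.getCause().getMessage());
        } catch (ReflectiveOperationException e) {
            System.out.println("Reflection failed: " + e.getMessage());
        }
    }
}
```

In HBase itself this is only a warning: the stream keeps working, but the socket may linger in CLOSE_WAIT because unbuffering was skipped.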
This type of error is quite common in HBase on Hadoop clusters.
Step 1: Move the HBase and HBase-temp directories to the temp folder (as a backup, to be reverted later).
Step 2: Once that is done, stop all AMS (Ambari Metrics Service) services.
Step 3: After a few minutes, start all AMS services again.
Step 4: Revert the directories to their original locations.
Summary: In this article, we explain an HBase error and its resolution with simple steps in a big data environment. In a Hadoop cluster, HBase is used for random, real-time access to large data sets. HBase is open source and runs on top of the Hadoop Distributed File System (HDFS), which provides a fault-tolerant way of storing huge data sets within the Hadoop cluster. This post mostly deals with resolving the error below:
"Caused by java.lang.UnsupportedOperationException: this stream does not support unbuffered. "
The solution we provided is very simple: move the HBase and HBase-temp folders to the Hadoop temp folder as a backup, then stop all AMS services. This type of warning/error appears in Hortonworks clusters. Sometimes the error comes down to connection issues in the Ambari services, especially the HBase service, so first simply restart the HBase service. If the same error still appears after that, check the log files of all the Ambari services, then proceed with the steps above; otherwise, restart the entire Hadoop/Hortonworks cluster.