Latest Hadoop Admin interview Questions and Answers[Updated]

1. How to check the Hive service in a Hadoop environment?

Ans: Note that the command below reports on HDFS health (live DataNodes, capacity used), which confirms that the storage layer Hive depends on is up; it does not check the Hive service itself.

hadoop dfsadmin -report

To check Hive directly, log in to the edge node and type "hive". If the Hive CLI prompt appears, the service is working; otherwise it is not.
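The Hive check from the edge node can be sketched as below; the HiveServer2 hostname and port are illustrative placeholders:

```shell
# Quick check via the Hive CLI: if this returns a database list, Hive works.
hive -e "show databases;"

# Or check HiveServer2 with Beeline (hostname/port are assumptions,
# adjust to your environment; 10000 is the common default port).
beeline -u jdbc:hive2://hs2.example.com:10000 -e "show databases;"
```

If either command hangs or fails to connect, the Hive metastore or HiveServer2 process is likely down.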

2. How to copy data from one cluster to another cluster?

Ans: Copy data between clusters with DistCp, run from a terminal session (e.g. PuTTY) on an edge node:

hadoop distcp hdfs://<cluster1-namenode>/<source path> hdfs://<cluster2-namenode>/<target path>
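A fuller DistCp invocation for incremental syncs might look like the sketch below; the hostnames, ports, and paths are placeholders, not values from this article:

```shell
# -update copies only files that are missing or changed at the target,
# -p preserves permissions and timestamps,
# -m caps the number of parallel map tasks.
hadoop distcp -update -p -m 20 \
  hdfs://nn1.example.com:8020/data/logs \
  hdfs://nn2.example.com:8020/data/logs
```

DistCp runs as a MapReduce job, so the copy is parallelized across the cluster rather than funneled through a single machine.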

3. While accessing Hive you get an "authorization access issue". What is meant by authorization access, and how do you resolve it?

Ans: An authorization access issue means your user does not have permission on the Hive warehouse directory ("/user/hive/warehouse"). To resolve it, raise an RC so the admin team grants your user the required access.
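On the admin side, diagnosing and granting that access could be sketched as follows; the user name "analyst1" is an illustrative assumption, and HDFS ACLs must be enabled (dfs.namenode.acls.enabled):

```shell
# Check the current owner, group, and permissions on the warehouse directory.
hdfs dfs -ls /user/hive/warehouse

# Grant the user access via an HDFS ACL entry (user name is hypothetical).
hdfs dfs -setfacl -m user:analyst1:rwx /user/hive/warehouse

# Verify the new ACL entry is in place.
hdfs dfs -getfacl /user/hive/warehouse
```

In clusters secured with Ranger or Sentry, the grant would instead be made through a policy in that tool.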

4. How to add new data nodes in Cloudera Manager?

Ans: First, complete the DataNode prerequisites on the new hosts (OS settings, JDK, passwordless SSH, and so on). Then open Cloudera Manager, go to the "Add Hosts" wizard, enter the new hostnames, and assign the DataNode role to them.

In a manually managed cluster, add the new hostname to the workers file (called slaves in older Hadoop versions) in the Hadoop configuration files.
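For the manually managed case, the steps can be sketched as below; the hostname and $HADOOP_HOME path are illustrative assumptions:

```shell
# Register the new host in the workers file (slaves file on Hadoop 2.x).
echo "datanode4.example.com" >> $HADOOP_HOME/etc/hadoop/workers

# After starting the DataNode daemon on the new host, tell the NameNode
# to re-read its host lists, then confirm the node shows up as live.
hdfs dfsadmin -refreshNodes
hdfs dfsadmin -report
```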

5. How to set a replication factor of two on the data nodes?

Ans: Set the dfs.replication property in hdfs-site.xml, changing the value from the default 3 to 2; this applies to newly written files. For files that already exist, use the command below:

hdfs dfs -setrep -w 2 /<path>
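The hdfs-site.xml change is a small config fragment like the following, rolled out to the cluster and followed by a NameNode/DataNode restart:

```xml
<!-- hdfs-site.xml: default replication factor for newly created files -->
<property>
  <name>dfs.replication</name>
  <value>2</value>
</property>
```

The -w flag on setrep makes the command wait until every affected block actually reaches the new replication factor, which is useful in scripts.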

6. Have you used Ambari in Hortonworks? Explain how to install it and how it works.

Ans: Yes. Ambari is used to provision, monitor, and maintain services such as Spark, Hive, and HBase in a Hortonworks (HDP) cluster. First, install the Ambari server and Ambari agents by following the Hortonworks documentation. Once the installation is done, log in with the default username and password, add the NameNode and all DataNodes through the cluster-install wizard, and configure security as well.
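A minimal sketch of the server-side installation on a yum-based OS is below; repository setup and package names vary by OS version, so treat this as an outline rather than exact steps:

```shell
# Install the Ambari server package from the Hortonworks repository.
yum install -y ambari-server

# Interactive setup: choose the JDK, backing database, and run-as user.
ambari-server setup

# Start the server; the web UI then listens on port 8080 by default.
ambari-server start
```

The default login is admin/admin; it should be changed immediately after the first login, before running the cluster-install wizard.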

7. Can you give a brief idea of your project architecture and your daily work?

Ans: Explain your project architecture and day-to-day responsibilities clearly, with a well-structured presentation.