Daily-use Linux commands for the Hadoop Admin role in a Big Data environment

In this article, we explain daily-use Linux commands for the Hadoop Admin role, with real-time examples.

Linux commands for Hadoop Admin.





I) Permissions & change of ownership

1. mkdir -p hadoop_jobs

2. chmod 777 hadoop_jobs

3. chown hadoop:hadoop hadoop_jobs
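Putting the three steps together as one sketch (the chown line assumes a `hadoop` user and group already exist and that you have root privileges, so it is left commented out):

```shell
# Create the job directory (no error if it already exists).
mkdir -p hadoop_jobs
# Grant read/write/execute to owner, group, and others (use 777 sparingly).
chmod 777 hadoop_jobs
# Verify the permission bits (prints 777).
stat -c '%a' hadoop_jobs
# Hand ownership to the hadoop user/group (needs root and an existing 'hadoop' user):
# chown hadoop:hadoop hadoop_jobs
```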

II) Remove old large log files

Create a new directory, move the old files from the original folder into it, then remove it.

1. mkdir -p old_log_files

2. mv spark_log old_log_files

3. rm -rf old_log_files
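When there are many old logs, a `find` one-liner can move them in bulk. This is a sketch that assumes the log files end in `.log` and treats "old" as more than 30 days since last modification:

```shell
# Archive directory for old logs.
mkdir -p old_log_files
# Move regular *.log files last modified more than 30 days ago.
find . -maxdepth 1 -type f -name '*.log' -mtime +30 -exec mv {} old_log_files/ \;
# After confirming nothing important was caught, remove the archive.
rm -rf old_log_files
```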

III) Copy commands:




1. Normal file copy from one folder to another within the same machine.

cp folder1/file1 folder2/file2
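A quick way to confirm the copy succeeded is to compare the two files byte for byte; the folder and file names below are just placeholders:

```shell
# Set up a sample source file.
mkdir -p folder1 folder2
echo "sample data" > folder1/file1
# Copy it, then compare source and destination byte for byte.
cp folder1/file1 folder2/file2
cmp folder1/file1 folder2/file2 && echo "copies match"
```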

2. Securely copy files from one machine to another using the scp command below.

scp -r username@your_server_fqdn:/path/to/files /home/user/Downloads

The above scp command recursively copies files from a remote machine to a local folder.

3. How to copy data from one cluster to another?

Here is a simple command for Hadoop Admins:

hadoop distcp hdfs://cloudera_manager:xxxx/datasets/ s3a://production/backup_datasets/

IV) Status commands:

java -version

javac -version

service cloudera-scm-server status

service cloudera-scm-agent status

V) Grep commands

How to check the last-used commands in the CLI:

history

To find a specific command in the history:

history | grep cloudera

How to check whether a service is running:

ps -ef | grep cloudera
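One common refinement: plain `ps -ef | grep cloudera` also matches the grep process itself. Bracketing the first letter of the pattern avoids that. The sketch below uses a throwaway `sleep` process as a stand-in for a real service:

```shell
# Start a sample long-running process.
sleep 60 &
pid=$!
# '[s]leep' matches the 'sleep' process but not the grep command line itself.
ps -ef | grep '[s]leep'
# Clean up the sample process.
kill "$pid"
```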



