Error while running Sqoop jobs in Hadoop Cluster





Apache Sqoop: Sqoop is a tool for bulk import/export of large datasets between HDFS and an RDBMS, in both directions.
Talend: Talend is an open-source ETL tool that provides data integration across Big Data environments and cloud storage; nowadays it is among the most popular tools for data integration and Big Data.
While running Talend jobs in the Hadoop cluster environment, we get a Sqoop connection error like the one below.

Caused by: java.sql.SQLRecoverableException: IO Error: Connection reset
at oracle.jdbc.driver.T4CConnection.login(T4CConnection.java:498)
at oracle.jdbc.driver.PhysicalConnection.<init>(PhysicalConnection.java:553)
at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:528)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:302)
... 46 more
Caused by: java.net.SocketException: Connection reset
at java.net.SocketInputStream.read(SocketInputStream.java:196)

The above issue is a connection (network) problem between the integration tool and the database management system.

Resolution 1:

Step 1: Make sure the correct JDBC (Java Database Connectivity) driver is used.
Step 2: Then configure DNS resolution on both systems.
Step 3: Restart the database and the tools in the Hadoop cluster.

Resolution 2:

Step 1: Open $JAVA_HOME/jre/lib/security/java.security
Step 2: Change the entropy source to securerandom.source=file:/dev/urandom

After completing all the steps, restart Apache Sqoop.

Permission Denied error in Hive while creating a database in the Hadoop ecosystem

I installed the Hive service on top of the Hadoop ecosystem, then tried to create a database, but I got the error below and found a solution as well.



Permission Denied Error in Hive:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
hive> set hive.auto.convert.join.noconditionaltask = false;
hive> create database myhive;
FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=hadoop, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:224)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:149)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4891)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:669)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
hive>

The above error is an HDFS permission issue affecting the Hive component.

Resolution:

To resolve the permission issue, grant the hadoop user the required access in HDFS. Follow the steps below, which simply change ownership and permissions using chmod/chown commands (a quick verification follows the steps):

Step 1: Log in as hduser, then execute the commands below one by one.
Step 2: sudo -u hdfs hadoop fs -mkdir /user/hive/warehouse
Step 3: sudo -u hdfs hadoop fs -chmod g+w /tmp
Step 4: sudo -u hdfs hadoop fs -chmod g+w /user/hive/warehouse
Step 5: sudo -u hdfs hadoop fs -chown -R hadoop /user/hive/warehouse
Step 6: sudo chmod 777 /var/lib/hive/metastore
Step 7: cd /var/lib/hive/metastore/metastore_db/
Step 8: sudo rm *.lck
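
After the changes, the new ownership and permissions can be verified before retrying the create database statement (a quick sanity check using standard HDFS commands):

sudo -u hdfs hadoop fs -ls -d /tmp /user/hive/warehouse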

Summary: I tried the above resolution, and Hive now works fine with the above error gone.

Opera Failed to Uninstall on Windows 10





The Opera browser was installed automatically on the Windows 10 operating system, so now I am trying to uninstall it from Windows 10.

Opera Browser Uninstall on Windows 10

Click the Windows button; it shows the Opera browser as below:

While uninstalling, I got the error below (snapshot included for clarity):

Opera failed to uninstall: Unable to uninstall Opera.exe.
Please make sure Opera is not running and try again.

Resolutions:

Step 1: Click the Windows button, search for Opera, then right-click the application and select the Uninstall option. This redirects to the Control Panel's Uninstall a program page.

Step 2: If the above option is not available, go directly to Control Panel > Uninstall a program, right-click the Opera browser application, and select Uninstall. Check "Delete my Opera user data" and click the Uninstall button.






Step 3: After clicking the Uninstall button, a Yes/No window appears. Choose Yes and Opera will be completely uninstalled from the Windows 10 operating system. If it is not uninstalled, restart Windows 10 and start again from scratch. If the uninstaller keeps reporting that Opera is running, close its processes first, as shown below.
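
Since the error says Opera is still running, its processes can be force-closed from an elevated Command Prompt before retrying the uninstall (a quick sketch; opera.exe is the browser's default process name):

taskkill /F /T /IM opera.exe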

How to Remove (Uninstall) the WebDiscover Browser on Windows [Virus/Malware], with Pictures




What is WebDiscover Browser in Windows?

It is an unwanted browser that comes bundled with other software (downloaders for Facebook, YouTube, etc.) and is downloaded from the internet automatically. It is a customized Google Chromium browser and it changes the search engine automatically. On some Windows 7 operating systems it shows at the top of the desktop window, as in the image below:
Picture 1:

How to Remove WebDiscover Browser

Here is the step-by-step process to uninstall the WebDiscover browser, with pictures.
Step 1: Open “Control Panel” in your operating system, whether it is Windows 7 or Windows 10. I am going with Windows 10.

Step 2: After opening Control Panel, go to the Programs option and select Uninstall a program to remove WebDiscover.

Step 3: Select the WebDiscover Browser, right-click on it, and simply uninstall the software.

Step 4: After the uninstall completes, go to the program files folder on Local Disk (C:) and delete the WebDiscover folder. If the folder does not delete completely, restart the system and delete the folder again (see the command after these steps).

Step 5: If you have antivirus software, scan the entire system.
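
If Explorer refuses to delete the leftover folder, it can also be removed from an elevated Command Prompt (a sketch; the exact folder name under Program Files is an assumption and may differ on your machine):

rd /S /Q "C:\Program Files (x86)\WebDiscoverBrowser"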

When the WebDiscover browser installs itself automatically on a Windows operating system, some common changes appear on your machine. Mostly it changes the web browser's homepage to the WebDiscover homepage (as in Picture 1) and changes the search engine as well. The new-tab functionality launches a modified search portal, and it adds extra loads to the Firefox add-ons or Chrome extensions.




Summary: The WebDiscover browser is just another browser for searching, but it installs itself by default without human interaction, so we uninstall the browser from the Windows operating system with these simple steps.

Unable to Integrate Hive with Spark and different resolutions




How to integrate (connect) Hive and Spark:

Here we provide solutions for how to integrate (connect) the Hive DB with Spark in Hadoop development.
The first time we tried to connect Hive and Spark, we got the error below; we found different types of resolutions for different modes.

Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke
the "BONECP" plugin to create a ConnectionPool gave an error: The specified
datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please
check your CLASSPATH specification, and the name of the driver.

Different types of solutions for the above error:

Resolution 1:

1. Download the MySQL Connector/J jar file from the official Maven repository, e.g. the link below:
https://mvnrepository.com/artifact/mysql/mysql-connector-java/5.1.21
2. Paste the jar file into the jars folder inside the Spark installation directory.
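
From a shell, this is just a download and a copy (a sketch; the jar version and the $SPARK_HOME location are assumptions that may differ on your cluster):

wget https://repo1.maven.org/maven2/mysql/mysql-connector-java/5.1.21/mysql-connector-java-5.1.21.jar
cp mysql-connector-java-5.1.21.jar $SPARK_HOME/jars/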

Resolution 2:

Without the JDBC driver:

1. Go to hive-site.xml and set the hive.metastore.uris property in that XML file (a sample entry follows this list).
2. Import org.apache.spark.sql.hive.HiveContext, since it can perform SQL queries over Hive tables, then define the sqlContext val as in the code below:
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
3. Finally, verify the tables in Spark SQL.
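
For reference, a minimal hive-site.xml entry might look like this (metastore-host is a placeholder for your metastore machine; 9083 is the usual default metastore port), followed by a quick verification in spark-shell:

<property>
  <name>hive.metastore.uris</name>
  <value>thrift://metastore-host:9083</value>
</property>

sqlContext.sql("show tables").show()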

Resolution 3:





Use Beeline for the Hive and Spark connection instead of the Hive CLI. Beeline provides higher security and connects directly to a remote server; check the two commands below for connecting Beeline to HiveServer2.

Step 1: ./bin/beeline
Step 2: !connect jdbc:hive2://remote_hive:10000

Hadoop job (YARN staging) error while executing a simple job

In a Hadoop ecosystem, a number of jobs execute within a fraction of time. I was trying to execute a Hive job for data validation on the Hive server in production, and while executing it from the Hive command line I got this type of error.



at org.apache.hadoop.ipc.Client.call(Client.java:1468)
at org.apache.hadoop.ipc.Client.call(Client.java:1399)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy10.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1532)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1349)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:588)
22:33:33 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging//.staging/job_1562044010976_0003
Exception in thread "main" org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/hadoop-yarn/staging//.staging/job_1562044010976_0003/job.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)

The above error indicates a connection problem with the DataNode while the code was executing; at that time the DataNode was not running properly, so find the resolution for this issue below.

Stop and start all services:

stop-all.sh
start-all.sh

This restarts all services, including the NameNode, Secondary NameNode, DataNodes, and the remaining services like Hive, Spark, etc.

If it still shows this type of error, start the distributed file system:

start-dfs.sh

Check all the Hadoop daemons (NameNode, Secondary NameNode, DataNode, ResourceManager, NodeManager, etc.) by using the command below:

jps

Then check all node information by using "hadoop dfsadmin -report" to see the status of each DataNode, i.e. whether it is running fine or not.
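
For reference, on a healthy single-node setup the jps output looks something like this (the PIDs are illustrative and will differ):

2311 NameNode
2450 DataNode
2630 SecondaryNameNode
2791 ResourceManager
2903 NodeManager
3120 Jps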

The above steps apply to local (standalone) and pseudo-distributed modes of the Hadoop ecosystem only.

For the Cloudera, Hortonworks, and MapR distributions, simply restart the DataNodes and services like Hive, Spark, etc.




Summary: In a Big Data environment we execute many jobs (Hadoop/Spark/Hive) to produce results, but sometimes the above error shows up and we get stuck. Here is the simple solution for that error.

MongoDB Error: The Program can’t start because MSVCP140.dll is missing from your computer.

Error:





"The Program can’t start because MSVCP140.dll is missing from your computer. Try reinstalling the program to fix this problem." This error appears while installing MongoDB on the Windows operating system.

Resolutions:

Solution 1:

Step 1: Uninstall MongoDB from your Windows machine.

Step 2: Clean the junk files from Windows (using CCleaner, etc.).

Step 3: Remove all remaining MongoDB files from your system.

Step 4: Download the latest version of MongoDB from the official website. If you need the Robo 3T studio as well, download it too.

Step 5: Try to install the .exe file using Run as Administrator. After the MongoDB installation completes, restart the Windows machine.
If the error still appears after these steps, try the second solution.

Solution 2:

This error means DLL (Dynamic Link Library) files are missing from your Windows machine. Some applications depend on DLL files because external libraries sync up with these files, so restoring the missing file fixes this issue.

Step 1: Download the missing DLL file from the internet and copy the file into the proper location (C:\Windows\System32).

Step 2: After installing the missing DLL file on your local machine, try to install MongoDB or the other application again.

If it is still not working, go with the solution below.

Solution 3:

Step 1: Run the built-in System File Checker tool to repair corrupted or missing files in the Windows operating system (the exact command is shown after these steps).

Step 2: Try to repair or reinstall MongoDB, or the runtime that supplies MSVCP140.dll (the Microsoft Visual C++ 2015 Redistributable, which also ships with Visual Studio).

Step 3: Or copy the DLL file from another Windows operating system, restore it on your computer, and follow up by re-registering the DLL files on your computer.
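
For Step 1, the System File Checker is run from an elevated Command Prompt (a standard Windows command; it scans and repairs protected system files):

sfc /scannow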




Summary: On the Windows operating system, most applications cannot run without a number of supporting files. If Windows or the software cannot find a required DLL file because it is missing or corrupted, you will receive this type of error: The Program can’t start because MSVCP140.dll is missing from your computer.

IntelliJ IDEA: Failed to load JVM DLL

I am trying to solve this error on the Windows operating system. While launching IntelliJ IDEA to develop code, some conflicts came into the picture.



ERROR:

Failed to load JVM DLL C:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2019.1.1\jre64\bin\server\jvm.dll

If you already have a 64-bit JDK installed, define a JAVA_HOME variable in

Computer > System Properties > System Settings > Environment Variables.

However, the Java path was already defined correctly on the Windows operating system.

Resolutions:

Solution 1:

Set the JAVA_HOME path, including the jvm.dll path.

Find the path below on your local machine and copy that path into JAVA_HOME.

Step 1: Go to the JDK path and copy the path up to jvm.dll:

C:\Program Files\Java\jdk1.8.0_181\jre\bin\server

Step 2: Set JAVA_HOME in the environment variables and add the entry below to Path:

%JAVA_HOME%\bin

Step 3: If it is still not working, simply remove the following path from your System variables; it may be overriding JAVA_HOME (a command-line alternative for setting JAVA_HOME follows these steps):

C:\ProgramData\Oracle\Java\javapath
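
For reference, JAVA_HOME can also be set from an elevated Command Prompt instead of the System Properties dialog (a sketch; the JDK path is the one above, and /M writes a system-wide variable):

setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_181" /M
REM Then add %JAVA_HOME%\bin to the Path variable via the Environment Variables dialog,
REM since setx truncates values longer than 1024 characters.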

Solution 2:

It may sometimes be a problem with version compatibility, so try to launch the 64-bit version. The problem is due to running the 32-bit version on 64-bit Windows, so create a shortcut to the 64-bit IDEA launcher on your desktop.

Note: If you are still facing this type of issue, try the solution below.

Solution 3:

Step 1: Download the latest version of JDK 1.8 and install it.

Step 2: Set Path in the user variables and JAVA_HOME in the system variables, with the full path in each.

Step 3: Download the latest 64-bit version of IntelliJ IDEA and try to launch it on 64-bit Windows.

The above resolutions should solve your issue with IntelliJ IDEA on the Windows operating system while installing the JetBrains IntelliJ IDEA or Eclipse IDEs on your local machine.




Summary: On the Windows operating system, an Integrated Development Environment plays a major role in development. Almost all IDEs are Java-based, so you need to install the JDK. After installing the JDK, set the environment variables so it is accessible anywhere on the system.

How to Fix the Windows Defender (Microsoft Security Essentials) Error Code: 0x80073b01

Windows Defender (Microsoft Security Essentials):

In the Windows operating system, Windows Defender is a security essential that protects Windows from malware and viruses. If a virus corrupts its files, Windows Defender or Microsoft Security Essentials reports error code 0x80073b01.

This error generally appears due to a corrupted installation of Windows Defender or Microsoft Security Essentials on the Windows operating system.

How to Resolve Error Code 0x80073b01

Find below the simple steps to solve the error in Windows Defender on Windows XP/7/8/8.1/10:

=> Edit and Repair the Registry

Step 1: Open the Windows Registry Editor: press Windows Key+R to open Run.

Step 2: Type regedit and hit Enter.

Step 3: Navigate to "HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Image File Execution Options\"

Step 4: After that, delete the entry "msseces.exe"

Step 5: Navigate to "HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer\DisallowRun"

Step 6: And delete the entry "msseces.exe"

Step 7: After completing the above steps, simply restart Windows XP/7/8/8.1/10. The same registry edits can also be made from the command line, as shown below.
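
The Image File Execution Options entry from Steps 3 and 4 can also be removed from an elevated Command Prompt (a sketch; back up the registry first):

reg delete "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Image File Execution Options\msseces.exe" /f
REM The DisallowRun entry from Steps 5 and 6 is easier to remove in Regedit,
REM because its value name can vary.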

Check whether the error code is removed or not. In case it is not removed, use a PC repair tool:

Step 1: Download and install a PC repair tool.

Step 2: After that, click the “Scan” button to diagnose the PC.

Step 3: Click “Fix Errors” to fix the errors in Windows Defender.

Finally, Windows Defender is antivirus software that protects your system from virus attacks, and it is a powerful anti-spyware tool for removing malware.

How to resolve Google Chrome Crashes

At present, Google Chrome sometimes shows the “Aw, Snap!” error; here is how to solve these errors simply.

  1. Check for crashes in Google Chrome.

  2. Type this in the browser address bar: chrome://crashes

If no crashes are shown, it looks like this:

  3. If there are crashes in your Google Chrome, they show like this:

Please check the above; if errors are there in your Google Chrome, start or stop automatic reporting of errors and crashes by following the steps below:

  1. Open your Google Chrome.
  2. At the top right, click More > Settings.
  3. After that, at the bottom, click Show advanced settings.
  4. Under “Privacy”, check or uncheck the box for “Automatically send usage statistics and crash reports to Google”.

(OR)

  1. Using the Chrome Cleanup Tool is a simple way to stop the Google Chrome crashes.