Problem Description: after the Hadoop environment is deployed, the datanode process does not appear when running jps on the slave machine.
Solution: on each slave, delete all contents of the datanode folder configured in hdfs-site.xml (the dfs.data.dir parameter), then re-initialize the namenode by running
hadoop namenode -format
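The steps above can be sketched as a shell session. On a real cluster you would run the Hadoop commands shown in the comments; here a temporary directory stands in for the dfs.data.dir path (the example location in the comment is an assumption, not a path from this article):

```shell
# 1. Stop HDFS first so no daemon holds the storage directory open:
#      stop-dfs.sh
# 2. Clear the datanode storage directory on every slave.
#    DATA_DIR below is a stand-in for the dfs.data.dir value
#    configured in hdfs-site.xml (e.g. /opt/hadoop/data/dfs/data).
DATA_DIR=$(mktemp -d)            # mock directory for this sketch
mkdir -p "$DATA_DIR/current"     # mimic leftover state from an old format
rm -rf "${DATA_DIR:?}"/*         # the :? guard aborts if DATA_DIR is unset
# 3. Re-initialize the namenode:
#      hadoop namenode -format
# 4. Restart HDFS and check that the DataNode process is back:
#      start-dfs.sh
#      jps | grep -i DataNode
ls -A "$DATA_DIR"                # prints nothing: the directory is empty
```

The `${DATA_DIR:?}` expansion is a small safety net: if the variable were empty, `rm -rf /*` would otherwise be executed.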
Reason: the namenode has been formatted more than once, but the old initialization data in the datanode folders on the slaves was never cleared, so the IDs recorded in the namenode and datanode folders no longer match. Once the stale data in the datanode folder is deleted, the namenode initialization takes effect: start Hadoop again and jps will show the datanode process.
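The mismatch can be confirmed by comparing the clusterID recorded on the two sides. On a real cluster the files live at `<dfs.name.dir>/current/VERSION` and `<dfs.data.dir>/current/VERSION` (the exact paths come from hdfs-site.xml); the mock VERSION files below are a sketch that reproduces the inconsistent state described above, with made-up clusterID values:

```shell
# Mock namenode and datanode storage directories with conflicting IDs.
NN_DIR=$(mktemp -d); DN_DIR=$(mktemp -d)
mkdir -p "$NN_DIR/current" "$DN_DIR/current"
echo 'clusterID=CID-aaaa-1111' > "$NN_DIR/current/VERSION"   # from the latest format
echo 'clusterID=CID-bbbb-2222' > "$DN_DIR/current/VERSION"   # stale, from an old format

# Extract and compare the clusterID lines.
nn_id=$(grep '^clusterID=' "$NN_DIR/current/VERSION")
dn_id=$(grep '^clusterID=' "$DN_DIR/current/VERSION")
if [ "$nn_id" != "$dn_id" ]; then
  echo "clusterID mismatch: the datanode will refuse to start"
fi
```

If the two lines differ, clearing the datanode folder (as described above) lets the datanode pick up the namenode's current clusterID on the next start.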