There is a problem when running Hadoop HDFS code from IDEA. The error is as follows:

org.apache.hadoop.security.AccessControlException: Permission denied: user=XXXX, access=WRITE, inode="/":root:supergroup:drwxr-xr-x

The error is reported because the user name on the local machine differs from the user that owns the HDFS directories on the Linux system.
On the Linux system, go to the directory where Hadoop is installed, open etc/hadoop/hdfs-site.xml, and add the following property:

<property>
    <name>dfs.permissions.enabled</name>
    <value>false</value>
    <description>
        If "true", enable permission checking in HDFS. If "false", permission
        checking is turned off, but all other behavior is unchanged. Switching
        from one parameter value to the other does not change the mode, owner
        or group of files or directories.
    </description>
</property>
After adding the configuration above, restart the cluster, and the operation succeeds.
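The restart can be done with the standard scripts shipped in Hadoop's sbin directory; a minimal sketch, assuming HADOOP_HOME points at your Hadoop install:

    # Stop and restart HDFS so the new hdfs-site.xml is picked up.
    $HADOOP_HOME/sbin/stop-dfs.sh
    $HADOOP_HOME/sbin/start-dfs.sh

Note that dfs.permissions.enabled=false turns off permission checking cluster-wide, which is acceptable for a local development setup but not recommended in production.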
The simplest fix is a one-line solution: add System.setProperty("HADOOP_USER_NAME", "root") to the Java code to set the user name the client uses when operating on HDFS. That is all it takes.
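A minimal sketch of the client-side fix. The property must be set before the first FileSystem instance is created, because the client reads HADOOP_USER_NAME once at initialization. The NameNode URI and the /test path below are assumptions for illustration; replace them with your cluster's values.

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsAsRoot {

    /** Sets the user the HDFS client operates as; call before creating any FileSystem. */
    static void setHdfsUser(String user) {
        System.setProperty("HADOOP_USER_NAME", user);
    }

    public static void main(String[] args) throws Exception {
        setHdfsUser("root");

        Configuration conf = new Configuration();
        // hdfs://localhost:9000 is a placeholder NameNode address.
        FileSystem fs = FileSystem.get(new URI("hdfs://localhost:9000"), conf);

        // With HADOOP_USER_NAME=root, this write under "/" no longer fails
        // with AccessControlException.
        fs.mkdirs(new Path("/test"));
        fs.close();
    }
}
```

Alternatively, the overload FileSystem.get(URI, Configuration, String) accepts the user name directly, which avoids relying on a JVM-wide system property.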