HDFS Java API operation error (user permission)

Problem Description:

When running an HDFS client program from IDEA, Hadoop reports the following error:

org.apache.hadoop.security.AccessControlException: Permission denied: user=XXXX, access=WRITE, inode="/":root:supergroup:drwxr-xr-x

The error is reported because the user name on the local machine is different from the user that owns the directory on the Linux system running HDFS.

Solution:

On the Linux system, go to the directory where Hadoop is installed and open etc/hadoop/hdfs-site.xml. Add the following property to that file:

<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
  <description>
    If "true", enable permission checking in HDFS.
    If "false", permission checking is turned off,
    but all other behavior is unchanged.
    Switching from one parameter value to the other does not change the mode,
    owner or group of files or directories.
  </description>
</property>

After adding this property, restart the cluster and the client operation will run without the permission error.
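For reference, a minimal client sketch of the kind of HDFS write that triggers this error; the NameNode address hdfs://localhost:9000, the /demo path, and the class name are placeholders for your own setup. With dfs.permissions.enabled set to false, it runs without the AccessControlException:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Replace with the fs.defaultFS of your cluster
        FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000"), conf);
        // With permission checking disabled, this write under "/" succeeds even though
        // the local user name differs from the directory owner (root)
        fs.mkdirs(new Path("/demo"));
        fs.close();
    }
}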

Alternative solution:

The simplest fix is a single line of code. Add System.setProperty("HADOOP_USER_NAME", "root") to the Java code to set the user that the client uses when operating on HDFS.
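A sketch of where that line goes (again, the NameNode address and path are placeholders); the property must be set before the FileSystem instance is created:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUserDemo {
    public static void main(String[] args) throws Exception {
        // Act as "root", the owner of "/" in the error message;
        // must be set before FileSystem.get() is called
        System.setProperty("HADOOP_USER_NAME", "root");

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000"), conf);
        fs.mkdirs(new Path("/demo"));
        fs.close();
    }
}

An equivalent option is the overload FileSystem.get(uri, conf, "root"), which passes the user name directly.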
