Error message:

```
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000:
Failed to open new session:
java.lang.RuntimeException:
org.apache.hadoop.security.AccessControlException:
Permission denied: user=anonymous, access=EXECUTE, inode="/tmp":root:supergroup:drwx------
```
After trying to add an account and password to Hive, it turned out the problem is in the last line: the connection is made as the `anonymous` user, and the permissions on the `/tmp` directory are `drwx------`.
The permission string breaks down as follows:

- The first character: `-` means a regular file, `d` a directory, and `l` a symbolic link.
- The remaining nine characters are read in groups of three:
  - the first three (`rwx` here): the owner's permissions
  - the middle three (`---`): permissions for users in the same group
  - the last three (`---`): permissions for all other users
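The breakdown above can be sketched in code. This is a minimal illustrative Python helper (`decode_mode` is a hypothetical name, not part of Hadoop) that splits a mode string like the `drwx------` from the error into its four parts:

```python
def decode_mode(mode: str) -> dict:
    """Split a 10-character mode string into type, owner, group, and other parts."""
    kinds = {"-": "file", "d": "directory", "l": "symlink"}
    return {
        "type": kinds.get(mode[0], "other"),
        "owner": mode[1:4],   # first triple: owner permissions
        "group": mode[4:7],   # middle triple: same-group permissions
        "other": mode[7:10],  # last triple: everyone else
    }

print(decode_mode("drwx------"))
# {'type': 'directory', 'owner': 'rwx', 'group': '---', 'other': '---'}
```

Decoded this way, `/tmp` grants full access to its owner and nothing to anyone else.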
| Permission | Letter | Value | Binary | Specific role |
|---|---|---|---|---|
| read | r | 4 | 00000100 | the user can read the file's contents and browse the directory |
| write | w | 2 | 00000010 | the user can add or modify file contents, and delete or move files within the directory |
| execute | x | 1 | 00000001 | the user can execute the file and enter the directory |
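The numeric values in the table explain why `777` grants everything: each triple of letters sums r=4, w=2, x=1 into one octal digit. A small hypothetical helper (`octal_digits`, written for illustration) makes the arithmetic concrete:

```python
def octal_digits(perms: str) -> str:
    """Convert a 9-character rwx string into three octal digits (r=4, w=2, x=1)."""
    values = {"r": 4, "w": 2, "x": 1, "-": 0}
    return "".join(
        str(sum(values[c] for c in perms[i:i + 3]))
        for i in (0, 3, 6)
    )

print(octal_digits("rwx------"))  # "700" -- the current /tmp: only the owner has access
print(octal_digits("rwxrwxrwx"))  # "777" -- read, write, and execute for everyone
```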
In other words, the session connects to hive2 as `anonymous`, which is neither the owner (`root`) nor a member of `supergroup`, so it falls under "other users" (`---`) and the EXECUTE access is denied.
solution:
Change the permissions on this file or directory in HDFS and relax them. The syntax for changing permissions mirrors Linux `chmod`:

```
hdfs dfs -chmod -R 777 /tmp
```

Run this while logged in with the HDFS superuser account.