Spark reports "Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState'"

Problem description

Recently I switched computers and moved my Spark Streaming code from the old machine to the new one. When I ran it in IDEA, it reported an error:

Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState'

The root scratch dir: /tmp/hive on HDFS should be writable

After investigating, I found that /tmp/hive had insufficient permissions. Note that on Windows this HDFS-style path resolves against the drive of the current working directory, which is why the commands below operate on F:\tmp\hive.


First, make sure Hadoop is installed on the computer and the HADOOP_HOME environment variable is configured, so that %HADOOP_HOME%\bin\winutils.exe exists (winutils.exe can be found online if it is missing). Then run the following commands in CMD, where F: is the drive my code runs from:

%HADOOP_HOME%\bin\winutils.exe ls F:\tmp\hive
%HADOOP_HOME%\bin\winutils.exe chmod 777 F:\tmp\hive
%HADOOP_HOME%\bin\winutils.exe ls F:\tmp\hive

After this change, the Spark Streaming job runs normally.
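To confirm the fix, a minimal Spark application with Hive support enabled is enough, since the HiveSessionState error is thrown during Hive initialization. This is a sketch, not the original job: the object name is hypothetical, and it assumes Spark 2.x with the spark-hive dependency on the classpath.

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical verification app: enabling Hive support forces Spark to
// create and use the /tmp/hive scratch directory.
object HivePermissionCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("HivePermissionCheck")
      .enableHiveSupport() // triggers the Hive scratch-dir permission check
      .getOrCreate()

    // If F:\tmp\hive is now writable, this runs instead of failing with
    // "Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState'"
    spark.sql("SHOW DATABASES").show()
    spark.stop()
  }
}
```

If this small app starts cleanly, the original streaming job should no longer hit the HiveSessionState error.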
