When Kettle (Spoon) writes a file to HDFS, the job can fail with a permissions-related error:

Caused by: org.apache.commons.vfs2.FileSystemException: Could not create file
First method: set the Hadoop user name in Spoon.bat.
Edit the Spoon.bat file and add the following options on line 119:
"-DHADOOP_USER_NAME=xxx" "-Dfile.encoding=UTF-8"
Note: replace xxx with a user name that has write permission on HDFS.
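For reference, the edited line in Spoon.bat ends up looking roughly like the sketch below. The exact contents of the `set OPT=` line vary between Kettle versions, and `hdfs` is only an assumed user name:

```bat
REM Spoon.bat, around line 119 (surrounding options vary by version).
REM Append the two quoted options to the end of the existing set OPT=... line:
set OPT=%OPT% "-Djava.library.path=%LIBSPATH%" "-DHADOOP_USER_NAME=hdfs" "-Dfile.encoding=UTF-8"
```

Restart Spoon after saving so the new JVM options take effect.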
Second method: relax the HDFS permissions.
hdfs dfs -chmod 777 /
Note: this grants every user full access to the HDFS root directory, which is convenient for testing but unsafe in production; prefer the first method, or limit the chmod to the directory Kettle actually writes to.
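A less drastic variant of the second method changes permissions only on the target directory rather than the whole file system. This is a sketch; /user/kettle/output is a hypothetical path, so substitute the directory your transformation writes to:

```
# Grant write access only on the directory Kettle writes to
# (/user/kettle/output is a hypothetical path -- use your own).
hdfs dfs -mkdir -p /user/kettle/output
hdfs dfs -chmod -R 777 /user/kettle/output

# Alternatively, hand ownership of the directory to the Kettle user:
# hdfs dfs -chown -R xxx /user/kettle/output
```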