The cause of this problem is that stale metadata was not deleted when HBase was reinstalled, so the fix is to delete HBase's metadata from ZooKeeper and restart HBase.
Solution:
Go to ZooKeeper's bin directory and start the CLI:
zkCli.sh -server localhost:2181
If the command is not found, run it with an explicit path:
./zkCli.sh -server localhost:2181
Then delete all HBase-related znodes (by default they live under /hbase)
Restart HBase and open the shell:
hbase shell
Now you can create the table successfully!
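The steps above can be sketched as a shell session. This is a sketch, not a verbatim transcript: it assumes ZooKeeper runs locally on the default port 2181, that HBase uses the default znode parent /hbase (configurable via zookeeper.znode.parent), and that ZooKeeper is 3.5 or newer, where deleteall replaced the older rmr command.

```shell
# Open the ZooKeeper CLI (adjust host/port if ZooKeeper is not local)
./zkCli.sh -server localhost:2181

# Inside the CLI: inspect and remove HBase's znodes
ls /               # should show a /hbase node
deleteall /hbase   # ZooKeeper 3.5+; on older versions use: rmr /hbase
quit

# Back in the OS shell: restart HBase and verify
stop-hbase.sh
start-hbase.sh
hbase shell
```

Deleting /hbase only removes coordination state in ZooKeeper; the table data in HDFS is untouched, and the master rebuilds the znodes on the next startup.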