BrokerLoad statement
LOAD
LABEL gaofeng_broker_load_HDD
(
DATA INFILE("hdfs://eoop/user/coue_data/hive_db/couta_test/ader_lal_offline_0813_1/*")
INTO TABLE ads_user
)
WITH BROKER "hdfs_broker"
(
"dfs.nameservices"="eadhadoop",
"dfs.ha.namenodes.eadhadoop" = "nn1,nn2",
"dfs.namenode.rpc-address.eadhadoop.nn1" = "h4:8000",
"dfs.namenode.rpc-address.eadhadoop.nn2" = "z7:8000",
"dfs.client.failover.proxy.provider.eadhadoop" = "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider",
"hadoop.security.authentication" = "kerberos","kerberos_principal" = "ou3.CN",
"kerberos_keytab_content" = "BQ8uMTYzLkNPTQALY291cnNlXgAAAAFfVyLbAQABAAgCtp0qmxxP8QAAAAE="
);
Error reported:
Task cancelled
type:ETL_QUALITY_UNSATISFIED; msg:quality not good enough to cancel
Solution:
Generally there is a deeper cause behind this error. You can find the URL field of the BrokerLoad task with SHOW LOAD, then run

SHOW LOAD WARNINGS ON '{URL}'

or open the URL directly in a browser to see why rows were rejected, for example an inconsistent number of fields. The root cause is usually that the number of fields in some rows of the file being imported does not match the number of columns in the table, or that a field in some rows exceeds the size limit of the corresponding table column. Either way, the data quality problem should be fixed at the source.
If you want to ignore the error rows instead, add the configuration parameter "max_filter_ratio" = "1" to the load statement:
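To spot such rows before re-running the load, a quick local check can be sketched. This is a minimal Python sketch under assumed conditions (a tab-separated source file, a known column count, and a single size limit for all columns, all of which are illustrative, not taken from the task above):

```python
import csv
import io

def find_bad_rows(lines, expected_fields, max_field_len, delimiter="\t"):
    """Return (line_number, reason) pairs for rows that would fail the
    quality check: wrong field count, or a field longer than the
    corresponding table column allows."""
    bad = []
    for lineno, row in enumerate(csv.reader(lines, delimiter=delimiter), start=1):
        if len(row) != expected_fields:
            bad.append((lineno, f"expected {expected_fields} fields, got {len(row)}"))
        elif any(len(field) > max_field_len for field in row):
            bad.append((lineno, "field exceeds column size limit"))
    return bad

# Example: a 3-column table whose columns hold at most 5 characters
sample = io.StringIO("a\tb\tc\n"            # ok
                     "1\t2\n"               # only 2 fields
                     "toolongvalue\tx\ty\n")  # first field too long
for lineno, reason in find_bad_rows(sample, expected_fields=3, max_field_len=5):
    print(lineno, reason)
```

Running such a check on a sample of the HDFS files narrows down which rows trip the ETL_QUALITY_UNSATISFIED error.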
LOAD
LABEL gaofeng_broker_load_HDD
(
DATA INFILE("hdfs://eoop/user/coue_data/hive_db/couta_test/ader_lal_offline_0813_1/*")
INTO TABLE ads_user
)
WITH BROKER "hdfs_broker"
(
"dfs.nameservices"="eadhadoop",
"dfs.ha.namenodes.eadhadoop" = "nn1,nn2",
"dfs.namenode.rpc-address.eadhadoop.nn1" = "h4:8000",
"dfs.namenode.rpc-address.eadhadoop.nn2" = "z7:8000",
"dfs.client.failover.proxy.provider.eadhadoop" = "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider",
"hadoop.security.authentication" = "kerberos","kerberos_principal" = "ou3.CN",
"kerberos_keytab_content" = "BQ8uMTYzLkNPTQALY291cnNlXgAAAAFfVyLbAQABAAgCtp0qmxxP8QAAAAE="
)
PROPERTIES
(
"max_filter_ratio" = "1"
);