1. Error scenario
Loading data with Stream Load:
$ curl --location-trusted -u root:aba -T 'data_2022-04-11.csv' -H "label:testdoris201" -H "column_separator:," -XPUT http://10.22.33.69:8030/api/os/ws/_stream_load
2. Error response
{
    "TxnId": 831311,
    "Label": "testdoris201",
    "Status": "Fail",
    "Message": "too many filtered rows",
    "NumberTotalRows": 1,
    "NumberLoadedRows": 0,
    "NumberFilteredRows": 1,
    "NumberUnselectedRows": 0,
    "LoadBytes": 1050,
    "LoadTimeMs": 85,
    "BeginTxnTimeMs": 0,
    "StreamLoadPutTimeMs": 1,
    "ReadDataTimeMs": 0,
    "WriteDataTimeMs": 57,
    "CommitAndPublishTimeMs": 0,
    "ErrorURL": "http://192.168.1.6:8040/api/_load_error_log?file=__shard_32/error_log_insert_stmt_b84753506aa7e6e9-434467e6d4a8a49e_b84753506aa7e6e9_434467e6d4a8a49e"
}
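The ErrorURL field in the response points at the detailed error log on the BE node. Fetching it shows why each row was filtered (for example a column mismatch or a row that falls outside every partition), which tells you which of the fixes below applies:

curl "http://192.168.1.6:8040/api/_load_error_log?file=__shard_32/error_log_insert_stmt_b84753506aa7e6e9-434467e6d4a8a49e_b84753506aa7e6e9_434467e6d4a8a49e"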
3. Solutions
1. The loaded data does not match the columns of the target Doris table (column count, types, or separator). Adjust the data or the table schema so they are consistent.
2. The target Doris table has no partition covering the loaded data. Adjust the data, or add the missing partition (see the sketch below).
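A minimal sketch of how to verify both causes from a MySQL client. The database os and table ws are taken from the load URL; the FE query port 9030, the credentials, and the partition name/range are assumptions for illustration:

# check that the CSV columns line up with the table schema
mysql -h 10.22.33.69 -P 9030 -uroot -p -e "DESC os.ws"
# list existing partitions; rows outside every partition range get filtered
mysql -h 10.22.33.69 -P 9030 -uroot -p -e "SHOW PARTITIONS FROM os.ws"
# if 2022-04-11 is not covered, add a partition for it (name and range are examples)
mysql -h 10.22.33.69 -P 9030 -uroot -p -e "ALTER TABLE os.ws ADD PARTITION p20220411 VALUES [('2022-04-11'), ('2022-04-12'))"

After fixing the data or the table, it is safest to retry the stream load with a fresh label, since reusing a label such as testdoris201 can trigger a Label Already Exists error.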