Question
When importing data from Oracle with Sqoop, the job fails with the following error:
INFO mapreduce.Job: Task Id : attempt_1646802944907_15460_m_000000_1, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:275)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:568)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
Caused by: java.sql.SQLException: ORA-24920: column size too large for client
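For reference, a typical Sqoop table import of this kind looks like the following; the connection details, credentials, table and target directory below are placeholders, not the original command:

# Hypothetical Sqoop import against the new Oracle 19 database;
# host, service name, user, table and target directory are placeholders.
sqoop import \
  --connect jdbc:oracle:thin:@//oracle19-host:1521/ORCLPDB1 \
  --username SCOTT \
  --password '******' \
  --table SOME_TABLE \
  --target-dir /user/data/some_table \
  -m 1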
Reason
Sqoop imports from other databases had always worked; the problem only appeared when importing from a newly added database. Comparing the two, the first difference found was the Oracle version: the old database is Oracle 11, while the new one is Oracle 19, which looked like a likely cause.
Searching for the ORA-24920 error online, the usual advice is to upgrade the Oracle client, which further suggested the Oracle JDBC driver was the problem.
Checking Sqoop's lib directory, the Oracle JDBC driver in use was ojdbc6.jar, which does not match Oracle 19.
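A quick way to confirm which driver jar Sqoop will load is to list the JDBC jars under its lib directory; the $SQOOP_HOME path below is an assumption, so substitute your actual installation path:

# List the Oracle JDBC drivers bundled with Sqoop (path is installation-specific)
ls -l $SQOOP_HOME/lib/ | grep -i ojdbc
# In this case the only hit was ojdbc6.jar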
You can check which Oracle JDBC driver version corresponds to each Oracle database version on this page:
https://www.oracle.com/database/technologies/faq-jdbc.html#02_03
The download page for the JDBC drivers is:
https://www.oracle.com/database/technologies/appdev/jdbc-downloads.html
Solution:
Based on the compatibility table, ojdbc8.jar was downloaded and uploaded into Sqoop's lib directory, after which the data import was run again.
The original ojdbc6.jar must be deleted or moved out of lib first; otherwise the import still fails, presumably because the old driver gets loaded when both versions are present.
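A minimal sketch of the driver swap, assuming a standard Sqoop layout (the $SQOOP_HOME path, the backup location and the download path are placeholders):

# Move the old driver out of lib so it cannot be picked up instead of the new one
mv $SQOOP_HOME/lib/ojdbc6.jar /tmp/ojdbc6.jar.bak

# Drop in the newly downloaded driver that matches Oracle 19
cp /path/to/ojdbc8.jar $SQOOP_HOME/lib/

# Then re-run the same sqoop import command as before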