Sqoop import from MySQL to HDFS with lzop compression fails with NullPointerException

2022/6/18 2:50:12

This post describes a NullPointerException that occurs when using Sqoop to import data from MySQL into HDFS with lzop compression, and how to resolve it.


The full error output is as follows:

Error: java.lang.NullPointerException
        at com.hadoop.mapreduce.LzoSplitRecordReader.initialize(LzoSplitRecordReader.java:63)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:560)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:798)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
 
2021-12-21 11:14:28,253 INFO mapreduce.Job: Task Id : attempt_1639967851440_0006_m_000000_1, Status : FAILED
Error: java.lang.NullPointerException
        at com.hadoop.mapreduce.LzoSplitRecordReader.initialize(LzoSplitRecordReader.java:63)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:560)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:798)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
 
2021-12-21 11:14:31,265 INFO mapreduce.Job: Task Id : attempt_1639967851440_0006_m_000000_2, Status : FAILED
Error: java.lang.NullPointerException
        at com.hadoop.mapreduce.LzoSplitRecordReader.initialize(LzoSplitRecordReader.java:63)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:560)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:798)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
 
2021-12-21 11:14:36,283 INFO mapreduce.Job:  map 100% reduce 0%
2021-12-21 11:14:37,292 INFO mapreduce.Job: Job job_1639967851440_0006 failed with state FAILED due to: Task failed task_1639967851440_0006_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0 killedMaps:0 killedReduces: 0
 
2021-12-21 11:14:37,344 INFO mapreduce.Job: Counters: 9
        Job Counters
                Failed map tasks=4
                Launched map tasks=4
                Other local map tasks=3
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=11344
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=5672
                Total vcore-milliseconds taken by all map tasks=5672
                Total megabyte-milliseconds taken by all map tasks=5808128
2021-12-21 11:14:37,345 ERROR lzo.DistributedLzoIndexer: DistributedIndexer job job_1639967851440_0006 failed.

The NullPointerException is thrown in LzoSplitRecordReader.initialize, typically because Hadoop cannot resolve a codec for the .lzo files when the LZO codecs are not registered in the cluster configuration. Solution: add the following LZO compression configuration to core-site.xml (and distribute the file to every node in the cluster).

<configuration>
    <property>
        <name>io.compression.codecs</name>
        <value>
            org.apache.hadoop.io.compress.GzipCodec,
            org.apache.hadoop.io.compress.DefaultCodec,
            org.apache.hadoop.io.compress.BZip2Codec,
            org.apache.hadoop.io.compress.SnappyCodec,
            com.hadoop.compression.lzo.LzoCodec,
            com.hadoop.compression.lzo.LzopCodec
        </value>
    </property>
 
    <property>
        <name>io.compression.codec.lzo.class</name>
        <value>com.hadoop.compression.lzo.LzoCodec</value>
    </property>
</configuration>
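After the configuration is in place, the import and the LZO indexing step can be re-run. The commands below are only a hypothetical sketch: the hostname hadoop102, database gmall, table order_info, target directory, password, and the hadoop-lzo jar path/version are all assumed values not taken from the original post and must be adjusted to your environment.

# hypothetical example: import with lzop compression
sqoop import \
--connect jdbc:mysql://hadoop102:3306/gmall \
--username root \
--password '******' \
--table order_info \
--target-dir /origin_data/gmall/db/order_info \
--delete-target-dir \
--num-mappers 1 \
--fields-terminated-by '\t' \
--compress \
--compression-codec com.hadoop.compression.lzo.LzopCodec

# then index the generated .lzo files so they become splittable
hadoop jar /opt/module/hadoop/share/hadoop/common/hadoop-lzo-0.4.20.jar \
com.hadoop.compression.lzo.DistributedLzoIndexer /origin_data/gmall/db/order_info

You can confirm the new value is being picked up with hdfs getconf -confKey io.compression.codecs. If the indexer still fails, also check that the hadoop-lzo jar and the native lzo libraries are present on every NodeManager, since the failing map tasks run there.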

This post is excerpted from https://blog.csdn.net/jackfeng86/article/details/122062976
