Timeout exception when MapReduce reads a large amount of data from HBase


Error description:

16/05/06 19:56:13 INFO mapreduce.Job: Task Id : attempt_1461653563167_0008_m_000001_2, Status : FAILED
Error: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Fri May 06 19:55:05 CST 2016, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=60302: row '' on table 'solway_data_pv' at region=solway_data_pv,\x00\x12\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00,1462414827453.d8f26eae36431b4aa0a1c834dc8fd9d8., hostname=solway-online10,60020,1462432265703, seqNum=2753
        at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.
        at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.
        at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.
        at org.apache.hadoop.hbase.mapreduce.TableRecordReaderImpl.nextKeyValue(TableRecordReaderImpl.
        at org.apache.hadoop.hbase.mapreduce.TableRecordReader.nextKeyValue(TableRecordReader.
        at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase$1.nextKeyValue(TableInputFormatBase.
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.
        at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.
        at org.apache.hadoop.mapred.MapTask.run(MapTask.
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.
        ...
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=60302: row '' on table 'solway_data_pv' at region=solway_data_pv,\x00\x12\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00,1462414827453.d8f26eae36431b4aa0a1c834dc8fd9d8., hostname=solway-online10,60020,1462432265703, seqNum=2753
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.
        at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.
        ...
Caused by: java.io.IOException: Call to solway-online10/60.12.10.20:60020 failed on local exception: org.apache.hadoop.hbase.ipc.CallTimeoutException: Call id=5, waitTime=60001, operationTimeout=60000 expired.
        at org.apache.hadoop.hbase.ipc.RpcClientImpl.wrapException(RpcClientImpl.
        at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.

Cause: HBase has several timeout mechanisms; this error was most likely triggered by an RPC timeout.

Solution: when an HBase scan read hits a timeout like this, the following parameters can be raised as appropriate:

set mapreduce.task.timeout=1200000
set hbase.client.scanner.timeout.period=600000
set hbase.rpc.timeout=600000

Note that hbase.client.scanner.timeout.period and hbase.rpc.timeout should preferably be set to the same value.
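If the scan is driven from a Java MapReduce job rather than through the set syntax above, the same keys can be placed on the job Configuration before the table mapper is initialized. The sketch below is only an illustration, not the original poster's code: the timeout values and table name solway_data_pv come from the post, while the RowCountMapper class and the caching settings are assumptions added for a runnable example.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.NullOutputFormat;

public class ScanTimeoutJob {

    // Minimal mapper that just counts rows; placeholder for the real mapper logic.
    static class RowCountMapper extends TableMapper<NullWritable, NullWritable> {
        @Override
        protected void map(ImmutableBytesWritable row, Result value, Context context)
                throws IOException, InterruptedException {
            context.getCounter("scan", "rows").increment(1);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();

        // The three timeouts from the post, in milliseconds.
        conf.setLong("mapreduce.task.timeout", 1200000L);              // MR task liveness timeout
        conf.setLong("hbase.client.scanner.timeout.period", 600000L); // client-side scanner timeout
        conf.setLong("hbase.rpc.timeout", 600000L);                    // HBase RPC timeout, kept equal to the scanner timeout

        Job job = Job.getInstance(conf, "scan solway_data_pv");
        job.setJarByClass(ScanTimeoutJob.class);

        Scan scan = new Scan();
        scan.setCacheBlocks(false); // recommended for MapReduce scans over large tables
        scan.setCaching(100);       // smaller batches per RPC also reduce the chance of a scanner timeout (assumption, not in the post)

        TableMapReduceUtil.initTableMapperJob(
                "solway_data_pv", scan, RowCountMapper.class,
                NullWritable.class, NullWritable.class, job);

        job.setOutputFormatClass(NullOutputFormat.class);
        job.setNumReduceTasks(0);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Setting the keys on the Configuration only affects the client side of the scan; if the region servers are genuinely too slow (hot regions, GC pauses), raising timeouts merely hides the symptom and the scan caching or region layout may also need attention.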
