Note: take care not to make mistakes in the configuration files.
Machine A: avro-client ==> Machine B: avro-source ==> channel (memory) ==> sink (logger). Machine A transfers logs to machine B. Configuration files below.
Agent file on machine B (avro.conf):

a1.sources = r1
a1.sinks = k1
a1.channels = c1

a1.sources.r1.type = avro
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 44444

a1.channels.c1.type = memory

a1.sinks.k1.type = logger
a1.sinks.k1.channel = c1
a1.sources.r1.channels = c1

Start the agent on machine B:

./flume-ng agent \
--name a1 \
--conf $FLUME_HOME/conf \
--conf-file $FLUME_HOME/conf/avro.conf \
-Dflume.root.logger=INFO,console

On machine A, run (if A and B are separate hosts, point --host at machine B's address rather than 0.0.0.0):

./flume-ng avro-client --host 0.0.0.0 --port 44444 --filename /home/hadoop/data/input.txt

Note: this approach transfers the file only once, after which the client exits. That is clearly not acceptable in production, so what can we do? Below we introduce another approach.
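The input file must exist before the client runs. For a quick test you can create one first; the path comes from the command above, and the content is purely illustrative:

echo "hello flume" > /home/hadoop/data/input.txt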
As described in the official documentation, we can implement continuous log delivery with Flume's log4j appender.
Configuration. Define the agent file (avro_source.conf, matching the start command below):

b1.sources = r1
b1.sinks = k1
b1.channels = c1

b1.sources.r1.type = avro
b1.sources.r1.bind = 0.0.0.0
b1.sources.r1.port = 44444

b1.channels.c1.type = memory

b1.sinks.k1.type = logger
b1.sinks.k1.channel = c1
b1.sources.r1.channels = c1

Add the dependencies:

<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.7.5</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.flume.flume-ng-clients/flume-ng-log4jappender -->
<dependency>
    <groupId>org.apache.flume.flume-ng-clients</groupId>
    <artifactId>flume-ng-log4jappender</artifactId>
    <version>1.6.0</version>
</dependency>

Create log4j.properties. In the project, create a new resources directory; open the project structure dialog (to the left of the search box), choose Modules, locate the resources directory, and select Mark as Resources. Then create log4j.properties under resources and add the following:

log4j.rootCategory=INFO, console, flume

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname = hadoop
log4j.appender.flume.Port = 44444
log4j.appender.flume.UnsafeMode = true

Start command:

./flume-ng agent \
--name b1 \
--conf $FLUME_HOME/conf \
--conf-file $FLUME_HOME/conf/avro_source.conf \
-Dflume.root.logger=INFO,console

Run the main function and check the result.
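The main function can be any class that logs through log4j. Below is a minimal sketch, assuming the setup above; the class name Flume_LogApp matches the logger name visible in the output further down, but the counter loop and the one-second interval are illustrative assumptions.

import org.apache.log4j.Logger;

// Minimal driver sketch: logs an incrementing counter forever.
// With the log4j.properties above, each event goes to both the
// console appender and the Flume avro source on hadoop:44444.
public class Flume_LogApp {

    private static final Logger logger = Logger.getLogger(Flume_LogApp.class);

    public static void main(String[] args) throws InterruptedException {
        int index = 0;
        while (true) {
            logger.info("current value is:" + index++);
            Thread.sleep(1000); // one event per second (illustrative)
        }
    }
}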
Note: at this point the run may fail with the following error:

log4j:ERROR Could not find value for key log4j.appender.flume.layout
18/05/03 11:24:45 WARN NettyAvroRpcClient: Using default maxIOWorkers
log4j:ERROR RPC client creation failed! NettyAvroRpcClient { host: hadoop, port: 44444 }: RPC connection error
18/05/03 11:24:46 INFO Flume_LogApp: current value is:0
log4j:ERROR Cannot Append to Appender! Appender either closed or not setup correctly!
18/05/03 11:24:47 INFO Flume_LogApp: current value is:1
log4j:ERROR Cannot Append to Appender! Appender either closed or not setup correctly!
18/05/03 11:24:48 INFO Flume_LogApp: current value is:2
log4j:ERROR Cannot Append to Appender! Appender either closed or not setup correctly!

Cause: the RPC connection failed, most likely because the local machine has no IP-to-hostname mapping for the agent host. Fix: in log4j.appender.flume.Hostname = hadoop, replace the hostname hadoop with the agent machine's IP address.
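If you prefer to keep the hostname in log4j.properties, an alternative is to add a hosts mapping on the machine running the application; the IP below is illustrative, substitute the agent machine's actual address:

# /etc/hosts on the machine running the application
192.168.1.100   hadoop

Separately, the "Could not find value for key log4j.appender.flume.layout" line is only a complaint about a missing layout; it can be silenced by giving the flume appender one, for example log4j.appender.flume.layout = org.apache.log4j.PatternLayout.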
