Flume: importing files from a directory into HDFS

xiaoxiao · 2021-02-28

# Agent name and the names of its source, channel, and sink
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# Source: watch a spooling directory for new files
a1.sources.r1.type = spooldir
a1.sources.r1.spoolDir = /root/logs

# Channel: buffer events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000
a1.channels.c1.transactionCapacity = 100

# Sink: write events to HDFS
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://192.168.8.110:9000/flume/%Y%m%d
a1.sinks.k1.hdfs.filePrefix = events-
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.hdfs.useLocalTimeStamp = true
# Do not roll files based on event count
a1.sinks.k1.hdfs.rollCount = 0
# Roll to a new file when the current one reaches 128 MB
a1.sinks.k1.hdfs.rollSize = 134217728
# Roll to a new file every 60 seconds
a1.sinks.k1.hdfs.rollInterval = 60

# Wire the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
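Two quick sanity checks on the values above can be done from the shell: rollSize is exactly 128 MB expressed in bytes, and the %Y%m%d escape in hdfs.path expands to an eight-digit date directory (which is why useLocalTimeStamp is set to true: without a timestamp, the escape cannot be resolved). A minimal sketch, using date(1) only to illustrate the strftime-style format:

```shell
# 128 MB in bytes -- matches the rollSize value above
echo $((128 * 1024 * 1024))

# %Y%m%d is a strftime-style pattern, the same one date(1) accepts,
# so the daily HDFS directory name looks like this (eight digits):
date +%Y%m%d
```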

=====================================

bin/flume-ng agent -c ./conf/ -f conf/a1.conf -Dflume.root.logger=INFO,console -n a1

==========================================================

bin/flume-ng agent -n a4 -c conf -f conf/a4.conf -Dflume.root.logger=INFO,console

1 Install the JDK and put it on the PATH:

cd /usr/java
vim /etc/profile
source /etc/profile
chmod +x jdk-6u2-linux-i586-rpm.bin

==============================

2 Unpack Flume

3 vim flume-env.sh

4 Add the environment variables:

export JAVA_HOME=/root/jdk1.6.0_45
export HADOOP_HOME=/usr/hadoop/hadoop-2.2.0
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin

==============

export JAVA_HOME=/root/jdk1.6.0_45
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin

5 Edit the configuration:

cd conf/

6 cp /root/a4.conf .   # copy the agent config into the current directory

7 Start Flume (-n: agent name; -c: configuration directory; -f: the file defining the source, channel, and sink):

bin/flume-ng agent -n a4 -c conf -f conf/a4.conf -Dflume.root.logger=INFO,console

bin/flume-ng agent -n a2 -c conf -f conf/a2.conf -Dflume.root.logger=INFO,console

8 Relevant directories (Hadoop jars, Flume lib and conf):

cd /usr/hadoop/hadoop-2.2.0/share/hadoop/common
cd /itcast/apache-flume-1.5.0-bin/lib
/itcast/apache-flume-1.5.0-bin/conf

9 Ctrl+Shift+T — look up which jar a class lives in (Eclipse shortcut)

10 Copy the Hadoop config files to the Flume host:

scp /itcast/hadoop-2.2.0/etc/hadoop/{core-site.xml,hdfs-site.xml} 192.168.8.129:/itcast/apache-flume-1.5.0-bin/conf
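One detail in step 10: brace expansion only works when there is no space after the comma. Writing `{core-site.xml, hdfs-site.xml}` (with a space) passes the braces to scp literally, and the files are not found. A quick way to preview what a pattern expands to before running scp (assumes bash; the paths here are just placeholders):

```shell
# Correct: no space after the comma -- expands to two separate paths
echo etc/hadoop/{core-site.xml,hdfs-site.xml}

# Wrong: the space stops brace expansion; the braces are passed literally
echo etc/hadoop/{core-site.xml, hdfs-site.xml}
```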

Please credit the original article when reposting: https://www.6miu.com/read-57842.html
