Flume - Stream log file from Windows to HDFS in Linux


How do I stream a log file from Windows 7 to HDFS on Linux?

Flume on Windows is giving an error.

I have installed 'flume-node-0.9.3' on Windows 7 (node 1). The 'flumenode' service is running, and localhost:35862 is accessible.
On Windows, the log file is located at 'C:/logs/weblogic.log'.
The Flume agent on CentOS Linux (node 2) is also running.

  1. On the Windows machine, the JAVA_HOME variable is set to "C:\Program Files\Java\jre7"
  2. The java.exe file is located at "C:\Program Files\Java\jre7\bin\java.exe"
  3. The Flume node is installed at "C:\Program Files\Cloudera\Flume 0.9.3" (a quick check of this setup is sketched after this list)
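
For reference, here is a minimal sanity check of the setup above, run from a Windows command prompt; the paths are simply the ones assumed in this post:

    :: verify JAVA_HOME and that java.exe is reachable
    echo %JAVA_HOME%
    "%JAVA_HOME%\bin\java.exe" -version
    :: confirm the log file that Flume will tail actually exists
    dir c:\logs\weblogic.log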

Here is the flume-src.conf file, placed inside the 'conf' folder of Flume on Windows 7 (node 1):

    source_agent.sources = weblogic_server
    source_agent.sources.weblogic_server.type = exec
    source_agent.sources.weblogic_server.command = tail -f c:/logs/weblogic.log
    source_agent.sources.weblogic_server.batchSize = 1
    source_agent.sources.weblogic_server.channels = memoryChannel
    source_agent.sources.weblogic_server.interceptors = itime ihost itype
    source_agent.sources.weblogic_server.interceptors.itime.type = timestamp
    source_agent.sources.weblogic_server.interceptors.ihost.type = host
    source_agent.sources.weblogic_server.interceptors.ihost.useIP = false
    source_agent.sources.weblogic_server.interceptors.ihost.hostHeader = host
    source_agent.sources.weblogic_server.interceptors.itype.type = static
    source_agent.sources.weblogic_server.interceptors.itype.key = log_type
    source_agent.sources.weblogic_server.interceptors.itype.value = apache_access_combined

    source_agent.channels = memoryChannel
    source_agent.channels.memoryChannel.type = memory
    source_agent.channels.memoryChannel.capacity = 100

    source_agent.sinks = avro_sink
    source_agent.sinks.avro_sink.type = avro
    source_agent.sinks.avro_sink.channel = memoryChannel
    source_agent.sinks.avro_sink.hostname = 10.10.201.40
    source_agent.sinks.avro_sink.port = 41414

I tried to run the above file by executing the following command inside the Flume folder:

    C:\Program Files\Cloudera\Flume 0.9.3>"C:\Program Files\Java\jre7\bin\java.exe" -Xmx20m -Dlog4j.configuration=file:///%cd%\conf\log4j.properties -cp "C:\Program Files\Cloudera\Flume 0.9.3\lib\*" org.apache.flume.node.Application -f C:\Program Files\Cloudera\Flume 0.9.3\conf\flume-src.conf -n source_agent

but it gives the following message:

    Error: Could not find or load main class Files\Cloudera\Flume

Here is the trg-node.conf file running on CentOS (node 2). The CentOS node is working fine:

    collector.sources = avroin
    collector.sources.avroin.type = avro
    collector.sources.avroin.bind = 0.0.0.0
    collector.sources.avroin.port = 41414
    collector.sources.avroin.channels = mc1 mc2

    collector.channels = mc1 mc2
    collector.channels.mc1.type = memory
    collector.channels.mc1.capacity = 100
    collector.channels.mc2.type = memory
    collector.channels.mc2.capacity = 100

    collector.sinks = hadoopout
    collector.sinks.hadoopout.type = hdfs
    collector.sinks.hadoopout.channel = mc2
    collector.sinks.hadoopout.hdfs.path = /user/root
    collector.sinks.hadoopout.hdfs.callTimeout = 150000
    collector.sinks.hadoopout.hdfs.fileType = DataStream
    collector.sinks.hadoopout.hdfs.writeFormat = Text
    collector.sinks.hadoopout.hdfs.rollSize = 0
    collector.sinks.hadoopout.hdfs.rollCount = 10000
    collector.sinks.hadoopout.hdfs.rollInterval = 600
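
For completeness, a minimal sketch of how the collector side can be started on CentOS; the flume-ng launcher and the conf directory location are assumptions here, so adjust them to your install:

    flume-ng agent --conf ./conf -f ./conf/trg-node.conf -n collector -Dflume.root.logger=INFO,console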

The problem is caused by the whitespace between "Program" and "Files" in the path:

    C:\Program Files\Cloudera\Flume 0.9.3

Because %cd% expands to this path and the -Dlog4j.configuration and -f arguments are unquoted, the command line splits at the space, and Java ends up treating "Files\Cloudera\Flume" as the main class name. Consider installing Flume in a path without whitespace; it will work like a charm.
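
As a sketch of the fix, assuming Flume is reinstalled to a hypothetical whitespace-free path such as C:\Flume (the java.exe path may keep its spaces because it is quoted):

    cd C:\Flume
    "C:\Program Files\Java\jre7\bin\java.exe" -Xmx20m ^
      -Dlog4j.configuration=file:///%cd%\conf\log4j.properties ^
      -cp "C:\Flume\lib\*" org.apache.flume.node.Application ^
      -f C:\Flume\conf\flume-src.conf -n source_agent

Now %cd% and the -f argument contain no spaces, so nothing on the command line splits unexpectedly.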

