Monitoring with big-data technologies - Part Three

In this post ...
Retrieve and query the remote Linux machine's message log through a visualization platform


Previous post: Retrieve and query the remote linux machine message logging
Next step ... ;)
Github repository


$ --> execute the command as cloudera user
# --> execute the command as root user
Remote server = Big Data server = localhost. In this example the remote machine to monitor is the Big Data server itself, in order to simplify the architecture.

Install elasticsearch

  1. Download elasticsearch version 1.7.3. I chose this version to avoid library conflicts between the Cloudera Virtual Machine and the elasticsearch dependencies.

  2. Extract elasticsearch software


    $ unzip ~cloudera/Downloads/elasticsearch-1.7.3.zip -d ~cloudera/
    $ mv ~cloudera/elasticsearch-1.7.3 ~cloudera/elasticsearch

  3. Start elasticsearch


    $ ~cloudera/elasticsearch/bin/elasticsearch

  4. Run a test with the following command


    $ curl -X GET http://localhost:9200/
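A healthy node answers that GET with a small JSON banner. As a sketch (assuming the default localhost:9200 bind address), you can also check just the HTTP status code from the shell before moving on:

```shell
# Probe elasticsearch and report its HTTP status code (200 means the node is up;
# 000 means no connection). localhost:9200 is the default bind address - adjust
# it if you changed elasticsearch.yml.
code=$(curl -s -o /dev/null -w '%{http_code}' http://localhost:9200/)
if [ "$code" = "200" ]; then
  echo "elasticsearch is up"
else
  echo "not ready yet (HTTP $code)"
fi
```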

Install kibana

  1. Download kibana version 4.1.1

  2. Extract kibana


    $ cd ~cloudera
    $ tar -xvf ~cloudera/Downloads/kibana-4.1.1-linux-x64.tar.gz
    $ mv ~cloudera/kibana-4.1.1-linux-x64 ~cloudera/kibana

  3. Start Kibana


    $ ~cloudera/kibana/bin/kibana

  4. Open http://quickstart:5601 in the browser
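If the browser cannot reach the page, a quick reachability check from the shell helps. quickstart is the VM hostname used throughout this tutorial and 5601 is Kibana's default port; both are assumptions if your setup differs:

```shell
# Print the HTTP status Kibana answers with (000 indicates a connection problem);
# a lookup/connection failure also prints a hint.
curl -s -o /dev/null -w 'Kibana answered with HTTP %{http_code}\n' http://quickstart:5601/ \
  || echo 'Kibana is not reachable yet - check that bin/kibana is still running'
```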

Set up Flume agent

  1. Create a working directory under the user's home directory and change into it


    $ mkdir ~cloudera/monitoring-with-big-data-technologies-part-three
    $ cd ~cloudera/monitoring-with-big-data-technologies-part-three

  2. Copy the flume agent files from the previous post


    $ cp ../monitoring-with-big-data-technologies-part-two/step-3/* ./

  3. Rename the flume agent and its paths in flume.properties and log4j.properties


    $ sed -i 's/syslogAgentStep3/syslogAgentPartThree/g' flume.properties
    $ sed -i 's/flume-step3.log/flume-part-three.log/g' log4j.properties
    $ sed -i 's/monitoring-with-big-data-technologies-part-two\/step-3/monitoring-with-big-data-technologies-part-three/g' *.*
    $ sed -i 's/monitoring-with-big-data-technologies-part-two/monitoring-with-big-data-technologies-part-three/g' log4j.properties
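Substitutions like these are easy to get wrong, so it can be worth dry-running a pattern first: without -i, sed prints the result instead of editing in place. A minimal sketch on a scratch file with made-up sample lines:

```shell
# Dry-run the agent rename on a scratch file; the real files are left untouched.
tmp=$(mktemp)
printf 'syslogAgentStep3.sources = r1\nsyslogAgentStep3.channels = c1\n' > "$tmp"
# Prints both lines with the agent renamed to syslogAgentPartThree.
sed 's/syslogAgentStep3/syslogAgentPartThree/g' "$tmp"
rm -f "$tmp"
```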

  4. Add an elasticsearch sink to the flume agent by replacing lines 74-76 of the flume.properties file with the following lines:


    # Define ElasticSearch (es) as a new flume sink
    syslogAgentPartThree.sinks = es
    syslogAgentPartThree.sinks.es.type = elasticsearch
    # Comma separated list of hostname:port
    syslogAgentPartThree.sinks.es.hostNames = localhost:9200,localhost:9300
    # The name of the index which the date will be appended to
    syslogAgentPartThree.sinks.es.indexName = monitoring
    # The type to index the document to
    syslogAgentPartThree.sinks.es.indexType = syslog
    # Default elastic search cluster
    syslogAgentPartThree.sinks.es.clusterName = elasticsearch
    # Number of events to be written per txn
    syslogAgentPartThree.sinks.es.batchSize = 10
    # TTL in days
    syslogAgentPartThree.sinks.es.ttl = 10d
    syslogAgentPartThree.sinks.es.serializer = org.apache.flume.sink.elasticsearch.ElasticSearchDynamicSerializer

    # Binding source/channel/sink
    syslogAgentPartThree.sources.r1.channels = c1
    syslogAgentPartThree.sinks.s1.channel = c1
    syslogAgentPartThree.sinks.es.channel = c1

  5. Start elasticsearch (if it is not already running from the installation step)


    $ ~cloudera/elasticsearch/bin/elasticsearch

  6. Add the following line to flume-env.sh to include the elasticsearch libraries in the flume classpath


    export FLUME_CLASSPATH=$FLUME_CLASSPATH:`find ~cloudera/elasticsearch/lib/ -type f -name '*.jar' | tr "\n" ":"`
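The back-quoted pipeline builds a colon-separated list of every jar under the lib directory. To see the shape of the result without touching the real installation, the same expression can be exercised against a scratch directory (the jar names below are made up for the sketch):

```shell
# Build a colon-separated jar list the same way flume-env.sh does,
# using a throwaway directory with dummy jar files.
libdir=$(mktemp -d)
touch "$libdir/elasticsearch-1.7.3.jar" "$libdir/lucene-core.jar"
jars=$(find "$libdir" -type f -name '*.jar' | tr '\n' ':')
echo "$jars"   # absolute jar paths separated (and terminated) by ':'
rm -rf "$libdir"
```

Note that the glob must be quoted ('*.jar'); left unquoted, the shell would expand it against the current directory before find ever sees it.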

  7. Start the Flume agent with the following command


    $ flume-ng agent --conf "./" --conf-file "./flume.properties" -n syslogAgentPartThree -Dlog4j.configuration=file:./log4j.properties

  8. Monitor the Flume agent log from another tab with the following command


    $ tail -F ~cloudera/monitoring-with-big-data-technologies-part-three/log/flume-part-three.log

  9. From another tab, send a test message using the logger command


    $ logger -t test 'Testing Flume with rsyslog!'
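A single message makes for a sparse dashboard; a short loop gives Kibana's histogram something to show. logger only writes to the local syslog socket, so this assumes rsyslog is still forwarding to the flume source as configured in the previous post:

```shell
# Send a handful of tagged test lines through the local syslog socket.
# The '|| echo' guard only reports a failure (e.g. no syslog socket); it does not retry.
for i in 1 2 3 4 5; do
  logger -t test "Testing Flume with rsyslog, message $i" || echo "logger failed (no syslog socket?)"
done
```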

  10. Retrieve the syslog messages from elasticsearch via curl


    $ curl 'http://localhost:9200/_search?pretty=true'
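That unscoped _search hits every index on the node. Since the sink's indexName is monitoring and flume appends the day's date to it, the tutorial's events can be listed and queried more precisely; the index pattern and query string below are assumptions based on that naming:

```shell
# List the indices so the date-suffixed name (monitoring-YYYY-MM-DD) is visible ...
curl -s 'http://localhost:9200/_cat/indices?v' || echo 'elasticsearch not reachable'
# ... then run a URI search scoped to the tutorial's indices only.
curl -s 'http://localhost:9200/monitoring-*/_search?q=Testing&pretty' || echo 'elasticsearch not reachable'
```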

Create a simple dashboard

I prepared a video to show how to create a dashboard.
Please visit the elasticsearch website for more information.



___---*** That's all ***---___

Happy coding ;)

