Overview of the system components:

Kibana: open-source web UI for displaying and searching the logs

Elasticsearch: open-source search engine framework; it can be run as a multi-node cluster for better performance, indexes the log data that the Logstash indexer pulls from Redis, and serves it to Kibana

Logstash: a tool for collecting and forwarding system logs; it ships with a wide range of log plugins and noticeably improves the efficiency of log querying and analysis

Logstash shipper: collects logs and forwards them to Redis for storage

Logstash indexer: reads data from Redis and forwards it to Elasticsearch

Redis: the database in the middle; the Logstash shipper pushes logs into Redis, where they are stored until the indexer picks them up
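
In short, the data flows: nginx log files -> Logstash shipper -> Redis (list "logstash") -> Logstash indexer -> Elasticsearch -> Kibana.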

Deployment:

1、JDK

Install path: /usr/local/jdk7

cat /etc/profile

export JAVA_HOME=/usr/local/jdk7
export PATH=$JAVA_HOME/bin:$PATH

export REDIS_HOME=/usr/local/redis-2.6.12

export ES_HOME=/usr/local/elasticsearch

export ES_CLASSPATH=$ES_HOME/config
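
To have these variables take effect in the current shell and to confirm the JDK is actually found (assuming the JDK really is unpacked at /usr/local/jdk7), something like:

source /etc/profile
java -version
echo $ES_HOME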

2、ElasticSearch

wget https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-0.20.2.tar.gz
tar xvf elasticsearch-0.20.2.tar.gz
mv elasticsearch-0.20.2 elasticsearch

cd /usr/local/elasticsearch/config

vim elasticsearch.yml

cluster.name: elasticsearch

node.name: "litong"

path.conf: /usr/local/elasticsearch

path.data: /usr/local/elasticsearch/data

path.work: /usr/local/elasticsearch/tmp

path.logs: /usr/local/elasticsearch/logs

bootstrap.mlockall: true
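
One note on bootstrap.mlockall: it only takes effect if the user running Elasticsearch is allowed to lock memory. A common way to allow that (an assumption about a PAM-based Linux setup, adjust to your distribution) is to raise the memlock limit in /etc/security/limits.conf:

* soft memlock unlimited
* hard memlock unlimited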

mkdir -p /usr/local/elasticsearch/data /usr/local/elasticsearch/tmp /usr/local/elasticsearch/logs

3、Configure Java Service Wrapper

Get the service wrapper

wget http://github.com/elasticsearch/elasticsearch-servicewrapper/archive/master.zip
unzip master.zip
mv elasticsearch-servicewrapper-master/service/ /usr/local/elasticsearch/bin/
rm -rf elasticsearch-servicewrapper-master/

vim service/elasticsearch.conf

set.default.ES_HOME=/usr/local/elasticsearch
set.default.ES_HEAP_SIZE=1024 # memory
wrapper.java.additional.10=-Des.max-open-files=true
wrapper.logfile.maxsize=5m
wrapper.logfile.maxfiles=5

Install as a service (/etc/init.d/elasticsearch):

bin/service/elasticsearch install

service elasticsearch start
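
If the node started correctly, a quick check against the HTTP API should return the node info and cluster health (this assumes the default HTTP port 9200 and the 192.168.0.235 address used throughout this guide):

curl 'http://192.168.0.235:9200/'
curl 'http://192.168.0.235:9200/_cluster/health?pretty=true'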

4、ElasticSearch Head

bin/plugin -install mobz/elasticsearch-head
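
Once installed, the head UI should be reachable in a browser at http://192.168.0.235:9200/_plugin/head/ (again assuming the default HTTP port).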

5、Redis

Install Redis server

wget http://redis.googlecode.com/files/redis-2.6.12.tar.gz
tar xzf redis-2.6.12.tar.gz
mv redis-2.6.12 /usr/local/redis
cd /usr/local/redis
make

make install

Configure Redis – ‘cp redis.conf 6379.conf’

vim 6379.conf

daemonize yes
pidfile /var/run/redis/redis_6379.pid
port 6379
timeout 300
tcp-keepalive 60
logfile /var/log/redis/redis_6379.log

Add REDIS home to root user’s ‘.bash_profile’

# Redis
export REDIS_HOME=/usr/local/redis

Copy Redis init script

cp utils/redis_init_script /etc/init.d/redis_6379

Configure Redis init script

# chkconfig: - 85 15
# description: Redis is a persistent key-value database
# processname: redis
REDISPORT=6379
EXEC=/usr/local/redis/src/redis-server
CLIEXEC=/usr/local/redis/src/redis-cli
PIDFILE=/var/run/redis/redis_6379.pid
CONF="/usr/local/redis/6379.conf"

Activate Redis service

mkdir /var/run/redis /var/log/redis
cd /etc/init.d
chkconfig --add redis_6379

Start

service redis_6379 start
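
A quick way to confirm Redis is up and listening ('make install' puts redis-cli on the PATH):

redis-cli -p 6379 ping

It should reply PONG.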

6、Logstash

mkdir /usr/local/logstash
cd /usr/local/logstash
wget https://logstash.objects.dreamhost.com/release/logstash-1.1.9-monolithic.jar

Indexer configuration – indexer.conf:

input {
  redis {
    host => "192.168.0.235"
    port => "6379"
    type => "redis-input"
    data_type => "list"
    key => "logstash"
    format => "json_event"
  }
}

output {
  stdout { debug => true debug_format => "json" }

  elasticsearch {
    host => "192.168.0.235"
    port => "9300"
    cluster => "elasticsearch"
  }
}
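
Note that 9300 is the Elasticsearch transport port rather than the 9200 HTTP port, and the cluster value must match the cluster.name set in elasticsearch.yml above, otherwise the indexer cannot join the cluster.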

Shipper configuration – shipper.conf:

input {
  file {
    type => "nginx"
    path => ["/usr/local/nginx/logs/*.log"]
    exclude => ["*.gz"]
    tags => ["nginx"]
  }
}

output {
  stdout { debug => true debug_format => "json" }

  redis {
    host => "192.168.0.235"
    data_type => "list"
    key => "logstash"
  }
}
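
Once both agents are running (started below), two quick checks can confirm the pipeline end to end; both assume the addresses and key name used above. The Redis list only queues events, so its length may stay near zero while the indexer keeps up:

redis-cli -h 192.168.0.235 llen logstash
curl 'http://192.168.0.235:9200/_search?pretty=true'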

java -jar logstash-1.1.9-monolithic.jar agent -f indexer.conf &
java -jar logstash-1.1.9-monolithic.jar agent -f shipper.conf &

7、Kibana

Setup Ruby

yum install ruby ruby-devel ruby-ri ruby-rdoc rubygems

wget http://production.cf.rubygems.org/rubygems/rubygems-2.0.3.zip
unzip rubygems-2.0.3.zip
ruby rubygems-2.0.3/setup.rb

Get Kibana

wget https://github.com/rashidkpc/Kibana/archive/v0.2.0.zip
unzip v0.2.0.zip
cd Kibana-0.2.0
gem install bundler
bundle install

Configure KibanaConfig.rb:

Elasticsearch = "192.168.0.235:9200"
KibanaPort = 80
KibanaHost = '192.168.0.235'
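
Since KibanaPort is set to 80, the Ruby process needs permission to bind a privileged port, so either run it as root or choose a port above 1024.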

Run Kibana

bundle exec ruby kibana.rb
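
If everything is wired up, the Kibana dashboard should now be reachable at http://192.168.0.235/ (per the KibanaHost and KibanaPort values above), querying Elasticsearch on 192.168.0.235:9200.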