Collecting logs with Logstash, writing them to Kafka, and shipping them into an ES cluster
2021/10/7 6:13:18
This article walks through collecting logs with Logstash, writing them into Kafka, and then consuming them from Kafka into an Elasticsearch cluster, with the full configuration for each step shown below.
Prerequisite:
A running Kafka (and ZooKeeper) environment.
Architecture (diagram omitted): web1 → Logstash → Kafka cluster → Logstash → Elasticsearch cluster → Kibana
Environment preparation
172.31.2.101   es1 + kibana
172.31.2.102   es2
172.31.2.103   es3
172.31.2.104   logstash1
172.31.2.105   logstash2
172.31.2.41    zookeeper + kafka
172.31.2.42    zookeeper + kafka
172.31.2.43    zookeeper + kafka
172.31.2.107   web1
Start ZooKeeper first
[root@mq1 ~]# /usr/local/zookeeper/bin/zkServer.sh restart
[root@mq2 ~]# /usr/local/zookeeper/bin/zkServer.sh restart
[root@mq3 ~]# /usr/local/zookeeper/bin/zkServer.sh restart
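Before starting Kafka it is worth confirming that the ZooKeeper ensemble actually formed a quorum; a quick check, assuming the same installation path as above:

[root@mq1 ~]# /usr/local/zookeeper/bin/zkServer.sh status
# One node should report "Mode: leader", the other two "Mode: follower".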
Start Kafka
[root@mq1 ~]# /apps/kafka/bin/kafka-server-start.sh -daemon /apps/kafka/config/server.properties
[root@mq2 ~]# /apps/kafka/bin/kafka-server-start.sh -daemon /apps/kafka/config/server.properties
[root@mq3 ~]# /apps/kafka/bin/kafka-server-start.sh -daemon /apps/kafka/config/server.properties
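Logstash can usually auto-create the topics used below if the broker allows it, but creating them explicitly makes the partition and replication settings deliberate. A sketch, assuming Kafka 2.2+ (older releases take --zookeeper instead of --bootstrap-server); the partition and replication counts here are illustrative:

[root@mq1 ~]# /apps/kafka/bin/kafka-topics.sh --create \
    --bootstrap-server 172.31.2.41:9092 \
    --topic long-linux21-accesslog \
    --partitions 3 --replication-factor 2
[root@mq1 ~]# /apps/kafka/bin/kafka-topics.sh --create \
    --bootstrap-server 172.31.2.41:9092 \
    --topic long-linux21-errorlog \
    --partitions 3 --replication-factor 2
# List topics to confirm both exist:
[root@mq1 ~]# /apps/kafka/bin/kafka-topics.sh --list --bootstrap-server 172.31.2.41:9092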
Check the listening port
[root@mq1 ~]# ss -tanl | grep 9092
LISTEN   0   50   [::ffff:172.31.2.41]:9092   *:*
Configure Logstash on the web server to write to Kafka
[root@es-web1 ~]# cat /etc/logstash/conf.d/kafka-nginx-es.conf
input {
  file {
    path => "/var/log/nginx/access.log"
    start_position => "beginning"
    stat_interval => 3
    type => "nginx-accesslog"
    codec => "json"
  }
  file {
    path => "/apps/nginx/logs/error.log"
    start_position => "beginning"
    stat_interval => 3
    type => "nginx-errorlog"
  }
}
output {
  if [type] == "nginx-accesslog" {
    kafka {
      bootstrap_servers => "172.31.2.41:9092"
      topic_id => "long-linux21-accesslog"
      codec => "json"
    }
  }
  if [type] == "nginx-errorlog" {
    kafka {
      bootstrap_servers => "172.31.2.41:9092"
      topic_id => "long-linux21-errorlog"
      #codec => "json"
    }
  }
}
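Note that the access-log input sets codec => "json", which assumes nginx itself writes the access log as JSON lines. A minimal nginx log_format that would satisfy that assumption (the field names are illustrative, not from the original article):

log_format json_access '{"@timestamp":"$time_iso8601",'
                       '"clientip":"$remote_addr",'
                       '"request":"$request",'
                       '"status":$status,'
                       '"size":$body_bytes_sent}';
access_log /var/log/nginx/access.log json_access;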
Restart Logstash
root@long:~# systemctl restart logstash
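Once Logstash is running again, one way to confirm events are reaching Kafka is to consume a few messages straight off the topic with the console consumer shipped with Kafka:

[root@mq1 ~]# /apps/kafka/bin/kafka-console-consumer.sh \
    --bootstrap-server 172.31.2.41:9092 \
    --topic long-linux21-accesslog \
    --from-beginning --max-messages 5
# Each line should be one JSON-encoded nginx access-log event.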
Configure the Logstash servers to consume from Kafka and write to Elasticsearch
[root@logstash1 ~]# cat /etc/logstash/conf.d/kafka-to-es.conf
input {
  kafka {
    bootstrap_servers => "172.31.2.41:9092,172.31.2.42:9092,172.31.2.43:9092"
    topics => "long-linux21-accesslog"
    codec => "json"
  }
  kafka {
    bootstrap_servers => "172.31.2.41:9092,172.31.2.42:9092,172.31.2.43:9092"
    topics => "long-linux21-errorlog"
    codec => "json"
  }
}
output {
  if [type] == "nginx-accesslog" {
    elasticsearch {
      hosts => ["172.31.2.101:9200"]
      index => "n19-long-kafka-nginx-accesslog-%{+YYYY.MM.dd}"
    }
  }
  if [type] == "nginx-errorlog" {
    elasticsearch {
      hosts => ["172.31.2.101:9200"]
      index => "n17-long-kafka-nginx-errorlog-%{+YYYY.MM.dd}"
    }
  }
}
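One point worth knowing with two Logstash servers (logstash1 and logstash2) reading the same topics: the kafka input's group_id defaults to "logstash", so consumers in the same group split the partitions between them rather than each receiving every message, which is usually what you want for load balancing. If you need to tune this, a sketch of the relevant input settings (values illustrative):

kafka {
  bootstrap_servers => "172.31.2.41:9092,172.31.2.42:9092,172.31.2.43:9092"
  topics => "long-linux21-accesslog"
  codec => "json"
  group_id => "logstash"    # same group on both servers => partitions are shared
  consumer_threads => 3     # roughly one thread per partition
}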
Test the configuration
[root@logstash1 ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/kafka-to-es.conf -t
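If the file parses cleanly, the check should end with output along these lines:

Configuration OK
[INFO ] ... Config Validation Result: OK. Exiting Logstash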
Restart Logstash
root@long:~# systemctl restart logstash
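With both pipelines running, the daily indices should start showing up in Elasticsearch; the _cat/indices API is an easy way to confirm before moving on to Kibana:

[root@logstash1 ~]# curl -s 'http://172.31.2.101:9200/_cat/indices?v' | grep kafka-nginx
# Expect one n19-long-kafka-nginx-accesslog-* and one n17-long-kafka-nginx-errorlog-* index.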
Kafka tool
Adding the indices in Kibana
Omitted.
That covers collecting logs with Logstash, writing them to Kafka, and shipping them on to an Elasticsearch cluster. I hope this article is helpful.