Logstash: writing logs to Kafka and reading them back from Kafka
This post walks through shipping logs into Kafka with Logstash and reading them back out of Kafka again. If you are wiring up a similar pipeline, follow along!
Collecting nginx logs into Kafka
First, change the nginx log format so access logs are written as JSON: [modifying the nginx log format](https://blog.51cto.com/9025736/2373483). The shipper-side Logstash config then tails the nginx access log and /var/log/messages, tags each with a type, and publishes them to separate Kafka topics:

```
input {
  file {
    type           => "nginx-access"
    path           => "/data/wwwlogs/access_nginx.log"
    start_position => "beginning"
    codec          => json    # nginx writes JSON, so decode it while reading
  }
  file {
    type           => "system-log-252"
    path           => "/var/log/messages"
    start_position => "beginning"
  }
}

output {
  if [type] == "nginx-access" {
    kafka {
      bootstrap_servers => "192.168.1.252:9092"   # Kafka broker address
      topic_id          => "252nginx-accesslog"
      batch_size        => 5
      codec             => "json"   # write as JSON so the structured event survives the trip through Kafka
    }
  }
  if [type] == "system-log-252" {
    kafka {
      bootstrap_servers => "192.168.1.252:9092"
      topic_id          => "system-log-252"
      batch_size        => 5
      codec             => "json"   # same: keep the event as JSON
    }
  }
}
```
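For reference, the nginx side needs a JSON `log_format` along the lines of the sketch below. The exact field set lives in the linked post; the names here are only illustrative:

```
# nginx.conf (sketch): emit access logs as one JSON object per line.
# Field names are illustrative, not the canonical set from the linked post.
log_format json_log '{"@timestamp":"$time_iso8601",'
                    '"host":"$server_addr",'
                    '"client_ip":"$remote_addr",'
                    '"request":"$request",'
                    '"status":"$status",'
                    '"bytes_sent":"$body_bytes_sent"}';
access_log /data/wwwlogs/access_nginx.log json_log;
```

Once the shipper is running, you can confirm that events are reaching the topic with Kafka's console consumer (assuming a stock Kafka install with its usual bin/ layout):

```
bin/kafka-console-consumer.sh --bootstrap-server 192.168.1.252:9092 \
  --topic 252nginx-accesslog --from-beginning
```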
Configuring Logstash to read the logs back from Kafka
On the consumer side, Logstash reads both topics from Kafka and routes each stream into its own daily Elasticsearch index:

```
input {
  kafka {
    bootstrap_servers => "192.168.1.252:9092"   # Kafka broker address
    topics            => ["252nginx-accesslog"]
    codec             => "json"   # events were written as JSON, so decode them back into fields
    group_id          => "252nginx-access-log"
    consumer_threads  => 1
    decorate_events   => true
  }
  kafka {
    bootstrap_servers => "192.168.1.252:9092"
    topics            => ["system-log-252"]
    codec             => "json"
    consumer_threads  => 1
    decorate_events   => true
  }
}

output {
  # The json codec restores the original event, including the type field set by the shipper.
  if [type] == "nginx-access" {
    elasticsearch {
      hosts => ["192.168.1.252:9200"]
      index => "252nginx-accesslog-%{+YYYY.MM.dd}"
    }
  }
  if [type] == "system-log-252" {
    elasticsearch {
      hosts => ["192.168.1.252:9200"]
      index => "system-log-252-%{+YYYY.MM.dd}"
    }
  }
}
```
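With both pipelines running, a quick way to verify the full round trip is to list the daily indices in Elasticsearch and check that their document counts are growing:

```
curl '192.168.1.252:9200/_cat/indices?v'
```

The index names follow the patterns configured above, e.g. 252nginx-accesslog-2022.10.03 and system-log-252-2022.10.03.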
That wraps up this introduction to writing logs to Kafka and reading them back with Logstash. We hope it helps, and thanks for supporting 为之网!