Logstash: writing logs to Kafka and reading them back from Kafka
2022/10/3 2:16:48
This article shows how to use Logstash to write logs into Kafka and to read them back out of Kafka; it should be a useful reference for anyone setting up a similar pipeline.
Collecting nginx logs into Kafka
First adjust the nginx log format so it emits JSON: [changing the nginx log format](https://blog.51cto.com/9025736/2373483). Then configure Logstash to tail the files and ship each event to a Kafka topic chosen by its `type` field:

```
input {
  file {
    type => "nginx-access"
    path => "/data/wwwlogs/access_nginx.log"
    start_position => "beginning"
    codec => json
  }
  file {
    type => "system-log-252"
    path => "/var/log/messages"
    start_position => "beginning"
  }
}
output {
  if [type] == "nginx-access" {
    kafka {
      bootstrap_servers => "192.168.1.252:9092"   # Kafka broker address
      topic_id => "252nginx-accesslog"
      batch_size => 5
      codec => "json"   # write as JSON, since Logstash events are JSON internally
    }
  }
  if [type] == "system-log-252" {
    kafka {
      bootstrap_servers => "192.168.1.252:9092"
      topic_id => "system-log-252"
      batch_size => 5
      codec => "json"   # write as JSON, since Logstash events are JSON internally
    }
  }
}
```
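The routing in the `output` block above can be sketched in plain Python (hypothetical helper names; this only mirrors how an event's `type` field selects the topic and how the JSON codec serializes the event, not Logstash's actual internals):

```python
import json

# Map an event's "type" field to its Kafka topic, mirroring the
# if [type] == ... conditionals in the Logstash output block.
TOPIC_BY_TYPE = {
    "nginx-access": "252nginx-accesslog",
    "system-log-252": "system-log-252",
}

def route_event(event: dict) -> tuple[str, bytes]:
    """Return the (topic, payload) pair for one event.

    The payload is JSON-encoded, matching codec => "json" in the config.
    """
    topic = TOPIC_BY_TYPE[event["type"]]
    return topic, json.dumps(event).encode("utf-8")

topic, payload = route_event({"type": "nginx-access", "message": "GET / 200"})
```

An event whose `type` matches neither branch would raise a `KeyError` here; in the Logstash config it simply matches no conditional and is dropped from that output.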
Configuring Logstash to read logs from Kafka
A second Logstash instance consumes both topics and indexes the events into Elasticsearch. Because the events were produced with the JSON codec, decoding them as JSON restores the original `type` field, which is what the output conditionals test:

```
input {
  kafka {
    bootstrap_servers => "192.168.1.252:9092"   # Kafka broker address
    topics => ["252nginx-accesslog"]
    codec => "json"   # events were written as JSON, so decode them as JSON
    group_id => "252nginx-access-log"
    consumer_threads => 1
    decorate_events => true
  }
  kafka {
    bootstrap_servers => "192.168.1.252:9092"
    topics => ["system-log-252"]
    codec => "json"
    consumer_threads => 1
    decorate_events => true
  }
}
output {
  if [type] == "nginx-access" {
    elasticsearch {
      hosts => ["192.168.1.252:9200"]
      index => "252nginx-accesslog-%{+YYYY.MM.dd}"
    }
  }
  if [type] == "system-log-252" {
    elasticsearch {
      hosts => ["192.168.1.252:9200"]
      index => "system-log-1512-%{+YYYY.MM.dd}"
    }
  }
}
```
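The `%{+YYYY.MM.dd}` suffix in the `index` settings is a Logstash sprintf date pattern: it expands from the event's `@timestamp` (in UTC), producing one index per day. A minimal Python sketch of that expansion, with a hypothetical helper name:

```python
from datetime import datetime, timezone

def daily_index(prefix: str, ts: datetime) -> str:
    """Expand a Logstash-style %{+YYYY.MM.dd} index suffix into a
    concrete daily index name for the given event timestamp."""
    return f"{prefix}-{ts.strftime('%Y.%m.%d')}"

# An nginx access event stamped 2019-03-01 lands in this index:
name = daily_index("252nginx-accesslog", datetime(2019, 3, 1, tzinfo=timezone.utc))
```

Daily indices keep each index small and make retention simple: old days can be deleted as whole indices instead of by deleting individual documents.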
That concludes this article on writing logs to Kafka and reading them back with Logstash; I hope it is helpful.