Filebeat and Logstash in Practice
Background
We have used the ELK stack before and covered Logstash in detail as the CDC transport for moving data from MySQL into Elasticsearch; see the earlier post ElasticStack - Logstash.
This time we will collect a log file with Filebeat, filter and process it with Logstash, extract the fields we need, and write them into MongoDB.
Only lines containing "A large volume of broadcast packets has been detected" are collected; from those lines we extract the required fields and store them.
Sample data:
```
2021-12-01 00:00:07.115 [HUB "hub_dkwbj"] Session "SID-BRIDGE-5": A large volume of broadcast packets has been detected. There are cases where packets are discarded based on the policy. The source MAC address is 50-9A-4C-27-F9-D3, the source IP address is fe80::e8d3:8281:e69e:afda, the destination IP address is ff02::1:3. The number of broadcast packets is equal to or larger than 32 items per 1 second (note this information is the result of mechanical analysis of part of the packets and could be incorrect).
```
Create a Docker network
```bash
docker network create --driver bridge leiqin
```
Build a Logstash image that includes logstash-output-mongodb
Create a Dockerfile, logstash.dockerfile, that starts from the official image and installs logstash-output-mongodb (pinned to 3.1.5, see the tip below):
```dockerfile
FROM docker.elastic.co/logstash/logstash:7.13.0
RUN bin/logstash-plugin install --version=3.1.5 logstash-output-mongodb
```
Build the custom Logstash image:
```bash
docker build -f logstash.dockerfile -t dakewe/logstash:1.0 .
```
docker-compose definition
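A minimal sketch of what the compose file can look like, assuming Logstash runs from the dakewe/logstash:1.0 image built above, Filebeat runs alongside it, and both join the pre-created leiqin network; the service names, port, and volume paths here are assumptions, not the original file:

```yaml
version: '3.0'

services:
  logstash:
    image: dakewe/logstash:1.0          # image built above, plugin already installed
    container_name: logstash
    volumes:
      - ./logstash/pipeline:/usr/share/logstash/pipeline   # pipeline .conf files (assumed host path)
    ports:
      - "5044:5044"                     # Beats input
    networks:
      - leiqin

  filebeat:
    image: docker.elastic.co/beats/filebeat:7.13.0
    container_name: filebeat
    volumes:
      - ./filebeat/filebeat.yml:/usr/share/filebeat/filebeat.yml:ro   # Filebeat config (assumed host path)
      - ./logs:/logs:ro                 # log files to collect (assumed host path)
    networks:
      - leiqin

networks:
  leiqin:
    external: true                      # created earlier with docker network create
```

MongoDB is not defined here; it only needs to be reachable from the Logstash container, either as another service on leiqin or as an external instance.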
Installing logstash-output-mongodb (when using the official image)
Tip: if you run the official image instead of the custom one, exec into the container after it starts and install logstash-output-mongodb there. Do not install the newer 3.1.6 release; pin it to 3.1.5. The pitfall is described in the author's reply on GitHub.
```bash
bin/logstash-plugin install --version=3.1.5 logstash-output-mongodb
```
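From the host, the same thing can be done with docker exec; the container name logstash below is an assumption, adjust it to your own service name:

```bash
# "logstash" is the assumed container name
docker exec -it logstash bin/logstash-plugin install --version=3.1.5 logstash-output-mongodb
# restart so the newly installed plugin is loaded
docker restart logstash
```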
Filebeat configuration
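A minimal filebeat.yml sketch; the log path /logs/*.log and the logstash:5044 endpoint are assumptions and should match your volume mounts and the Beats port used in the pipelines below. include_lines keeps Filebeat from shipping anything other than the broadcast-detection lines:

```yaml
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /logs/*.log            # assumed mount point of the log files
    # only ship the lines we actually want to store
    include_lines: ['A large volume of broadcast packets has been detected']

output.logstash:
  hosts: ["logstash:5044"]     # assumed Logstash service name and Beats port
```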
Logstash: print output to the terminal
First, let Filebeat ship the file to Logstash and simply print every event to the terminal, so we can see what arrives before adding any filtering.
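A minimal pipeline sketch for this step, assuming a Beats input on port 5044; it simply dumps every incoming event to the terminal:

```conf
input {
  beats {
    port => 5044
  }
}

output {
  stdout {
    codec => rubydebug      # pretty-print each event for inspection
  }
}
```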
Logstash: output to MongoDB
Now filter in Logstash, extract the fields we need, and write them to MongoDB.
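A sketch of the full pipeline for this step: keep only the broadcast-detection lines, grok the timestamp, hub, session, and MAC/IP addresses out of the message shown in the sample data, and write the result to MongoDB. The grok pattern, field names, and the MongoDB URI, database, and collection are assumptions to adapt to your environment:

```conf
input {
  beats {
    port => 5044
  }
}

filter {
  # belt and braces: drop anything that is not the broadcast warning
  # (Filebeat's include_lines should already have filtered these out)
  if "A large volume of broadcast packets has been detected" not in [message] {
    drop { }
  }

  # extract the fields we care about from the raw message
  grok {
    match => {
      "message" => '%{TIMESTAMP_ISO8601:log_time} \[HUB "%{DATA:hub}"\] Session "%{DATA:session}": .* The source MAC address is %{DATA:src_mac}, the source IP address is %{DATA:src_ip}, the destination IP address is %{DATA:dst_ip}\.'
    }
  }

  # use the log's own timestamp as the event time
  date {
    match => ["log_time", "yyyy-MM-dd HH:mm:ss.SSS"]
    target => "@timestamp"
  }
}

output {
  mongodb {
    uri => "mongodb://mongo:27017"       # assumed MongoDB address
    database => "logs"                   # assumed database name
    collection => "broadcast_packets"    # assumed collection name
    isodate => true                      # store @timestamp as a native ISODate
  }
}
```

With isodate => true the event time is stored as a BSON date, which keeps time-range queries in MongoDB simple.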
Summary
Fairly straightforward; combined with the full ELK stack it works even better.
Series index
ElasticStack - Installation
ElasticStack - Elasticsearch
ElasticStack - Logstash
Elasticsearch mapping notes
Elasticsearch analyzers: an introduction
Elasticsearch analyzers: practice notes
Elasticsearch custom synonym analyzer in practice
Docker ELK cluster in practice
Filebeat and Logstash in practice
Filebeat pipeline in practice
Elasticsearch 7.x Platinum cracking in practice
ELK alerting: research and practice