ELK Logstash: the KV Filter Plugin

Contributed by a community user · 2022-11-02



Filter Plugins: Common Configuration Fields

• add_field: if this filter succeeds, add a field to the event
• add_tag: if this filter succeeds, add any number of tags to the event
• remove_field: if this filter succeeds, remove arbitrary fields from the event
• remove_tag: if this filter succeeds, remove arbitrary tags from the event

Description

This filter helps automatically parse messages (or specific event fields) which are of the foo=bar variety.

For example, if you have a log message which contains ip=1.2.3.4 error=REFUSED, you can parse those automatically by configuring:

filter { kv { } }

The above will result in a message of ip=1.2.3.4 error=REFUSED having the fields:

ip: 1.2.3.4
error: REFUSED

This is great for postfix, iptables, and other types of logs that tend towards key=value syntax.
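The kv plugin itself is implemented in Ruby inside Logstash, but its default behavior (split the message on whitespace, then split each chunk on "=") can be sketched in a few lines of Python. The helper name kv_parse and its parameters are hypothetical, purely for illustration:

```python
import re

def kv_parse(message, field_split=" ", value_split="="):
    """Minimal sketch of the kv filter's default behavior:
    split on the field_split character class, then split each
    chunk into key and value at the first value_split character."""
    fields = {}
    for chunk in re.split("[" + re.escape(field_split) + "]+", message):
        if value_split in chunk:
            key, _, value = chunk.partition(value_split)
            fields[key] = value
    return fields

print(kv_parse("ip=1.2.3.4 error=REFUSED"))
# {'ip': '1.2.3.4', 'error': 'REFUSED'}
```

The real plugin handles many more cases (quoted values, trimming, target fields), but the core idea is exactly this two-level split.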

You can configure any arbitrary strings to split your data on, in case your data is not structured using = signs and whitespace. For example, this filter can also be used to parse query parameters like foo=bar&baz=fizz by setting the field_split parameter to &.

Filter Plugin: KV

The KV plugin takes key-value data, parses it with a specified delimiter into the Logstash event's data structure, and places the resulting fields at the top level of the event.

Common options:

• field_split: the delimiter between key-value pairs; the default is a single space

field_split

Value type is string
Default value is " "

A string of characters to use as single-character field delimiters for parsing out key-value pairs.

These characters form a regex character class and thus you must escape special regex characters like [ or ] using \.
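To see why the escaping matters, here is an illustrative Python sketch (not the plugin's code) of what happens when the field_split string is dropped into a regex character class. The sample input a=1[b=2]c=3 is made up for the demonstration:

```python
import re

# field_split is inserted into a regex character class "[...]",
# so metacharacters like "[" and "]" must be escaped with "\".
escaped_field_split = r"\[\]"          # what you would pass to the kv filter
pattern = "[" + escaped_field_split + "]"   # yields the class [\[\]]

# Splitting on "[" and "]" separates the three key=value chunks.
print(re.split(pattern, "a=1[b=2]c=3"))
# ['a=1', 'b=2', 'c=3']
```

With the unescaped string "[]" the resulting pattern "[[]]" would not mean "split on [ or ]" at all, which is exactly the pitfall the documentation warns about.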

Example with URL Query Strings

For example, to split out the args from a url query string such as ?pin=12345~0&d=123&e=foo@bar.com&oq=bobo&ss=12345:

filter { kv { field_split => "&?" } }

The above splits on both & and ? characters, giving you the following fields:

pin: 12345~0
d: 123
e: foo@bar.com
oq: bobo
ss: 12345
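The same splitting can be traced step by step in a short Python sketch (an illustration of the behavior, not the plugin itself): field_split => "&?" becomes the character class [&?], and each resulting chunk is split at its first "=".

```python
import re

query = "?pin=12345~0&d=123&e=foo@bar.com&oq=bobo&ss=12345"

# field_split => "&?" corresponds to the character class [&?].
fields = {}
for chunk in re.split(r"[&?]", query):
    if "=" in chunk:                   # skip the empty chunk before "?"
        key, _, value = chunk.partition("=")
        fields[key] = value

print(fields)
# {'pin': '12345~0', 'd': '123', 'e': 'foo@bar.com', 'oq': 'bobo', 'ss': '12345'}
```

Note that "~" and "@" survive untouched: only the characters listed in field_split act as delimiters.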

A full example:

If your logs are stored as key-value pairs, this plugin makes parsing them convenient: just specify the delimiter.

[root@localhost ~]# cat /usr/local/logstash/conf.d/test.conf
input {
  file {
    path => "/var/log/nginx/*.log"
    exclude => "error.log"
    start_position => "beginning"
    tags => ["web", "nginx"]
    type => "access"
    add_field => {
      "project" => "microservice"
      "app" => "product"
    }
  }
}
filter {
  kv {
    field_split => "&?"
  }
}
output {
  elasticsearch {
    hosts => ["192.168.179.102:9200","192.168.179.103:9200","192.168.179.104:9200"]
    index => "test-%{+YYYY.MM.dd}"
  }
}

After writing the configuration, have Logstash reload it and check the output for errors.

If the fields are not split out, you can only search inside the raw message field.

That is inflexible and rules out multi-dimensional queries; visualizations are built on specific fields extracted during parsing. Fields therefore matter, and key-value parsing is how you extract them.

