The plugins described in this section are useful for deserializing data into Logstash events.
Reads serialized Avro records as Logstash events. This plugin deserializes individual Avro records. It is not for reading Avro files. Avro files have a unique format that must be handled upon input.
The following config deserializes input from Kafka:
input {
  kafka {
    codec => avro {
      schema_uri => "/tmp/schema.avsc"
    }
  }
}
...
Parses comma-separated value data into individual fields. By default, the filter autogenerates field names (column1, column2, and so on), or you can specify a list of names. You can also change the column separator.
The following config parses CSV data into the field names specified in the columns field:

filter {
  csv {
    separator => ","
    columns => [ "Transaction Number", "Date", "Description", "Amount Debit", "Amount Credit", "Balance" ]
  }
}
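As an illustration, a row like the following (a hypothetical bank-export line, not from the source) would be parsed into one event field per configured column name:

```
Input line:
  1042,2023-04-01,Coffee Shop,4.50,,1200.00

Resulting event fields (sketch):
  "Transaction Number" => "1042"
  "Date"               => "2023-04-01"
  "Description"        => "Coffee Shop"
  "Amount Debit"       => "4.50"
  "Amount Credit"      => ""
  "Balance"            => "1200.00"
```

Without the columns setting, the same row would instead produce autogenerated names (column1, column2, and so on).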
Reads the Fluentd msgpack schema.
The following config decodes logs received from fluent-logger-ruby:

input {
  tcp {
    codec => fluent
    port => 4000
  }
}
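For reference, a minimal sender using the fluent-logger-ruby gem might look like the following sketch; the tag and record contents are assumptions for illustration, and the port matches the tcp input above:

```ruby
require 'fluent-logger'

# Point the logger at the Logstash tcp input configured above.
log = Fluent::Logger::FluentLogger.new(nil, host: 'localhost', port: 4000)

# Post a record under a hypothetical tag; it arrives as a Logstash event.
log.post('myapp.access', 'agent' => 'example')
```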
Decodes (via inputs) and encodes (via outputs) JSON formatted content, creating one event per element in a JSON array.
The following config decodes the JSON formatted content in a file:
input {
  file {
    path => "/path/to/myfile.json"
    codec => "json"
  }
}
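Because the codec creates one event per array element, a file containing the following (hypothetical) content would yield two events, each with its own status field:

```
[ { "status": "ok" }, { "status": "error" } ]
```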
Reads protobuf encoded messages and converts them to Logstash events. Requires the protobuf definitions to be compiled as Ruby files. You can compile them by using the ruby-protoc compiler.
The following config decodes events from a Kafka stream:
input {
  kafka {
    zk_connect => "127.0.0.1"
    topic_id => "your_topic_goes_here"
    codec => protobuf {
      class_name => "Animal::Unicorn"
      include_path => ['/path/to/protobuf/definitions/UnicornProtobuf.pb.rb']
    }
  }
}
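The referenced .pb.rb file can be generated from a .proto definition with the ruby-protoc compiler. The commands below are a sketch; the source file name unicorn.proto is an assumption chosen to match the example above:

```shell
# Install the compiler (shipped with the ruby-protocol-buffers gem),
# then compile the hypothetical unicorn.proto into a Ruby .pb.rb file.
gem install ruby-protocol-buffers
ruby-protoc unicorn.proto
```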
Parses XML into fields.
The following config parses the whole XML document stored in the message field:

filter {
  xml {
    source => "message"
  }
}
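For example, with the config above, a message field containing the following (hypothetical) document would be parsed so that its child elements (id, status) become fields on the event; the exact field layout depends on the filter's other options:

```
<order><id>42</id><status>shipped</status></order>
```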