logstash
Logstash is a data collector. A pipeline is made up of three stages, input -> filter -> output, and supports complex operations such as sending email on certain events.
input: configures data sources and simple transformations
filter: configures data extraction, usually with grok
output: configures data destinations and simple transformations
Run: logstash -f /etc/logstash.conf
-f specifies a configuration file
-e passes the configuration as a string on the command line (useful for quick console tests)
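A minimal end-to-end configuration sketch showing the three stages (the log path, grok pattern, and Elasticsearch address are assumptions for illustration):

```conf
# /etc/logstash.conf — one block per pipeline stage
input {
  file {
    path => "/var/log/app/app.log"   # assumed log location
    start_position => "beginning"
  }
}

filter {
  grok {
    # extract timestamp, level, and message from lines like
    # "2017-01-01 12:00:00 INFO something happened"
    match => { "message" => "%{TIMESTAMP_ISO8601:ts} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]      # assumed ES address
  }
}
```

For a quick console test without a file, the same structure can be passed with -e, e.g. `logstash -e 'input { stdin {} } output { stdout { codec => rubydebug } }'`.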
See the official site for detailed configuration options:
https://www.elastic.co/products/logstash
Centralize, Transform & Stash Your Data
input
Elastic supported plugins
These plugins are maintained and supported by Elastic.
- Receives events from the Elastic Beats framework
- Streams events from CouchDB's …
- Reads query results from an Elasticsearch cluster
- Streams events from files
- Reads GELF-format messages from Graylog2 as events
- Generates random log events for test purposes
- Reads metrics from the …
- Generates heartbeat events for testing
- Receives events over HTTP or HTTPS
- Decodes the output of an HTTP API into events
- Creates events from JDBC data
- Reads events from a Kafka topic
- Reads events over a TCP socket from a Log4j …
- Receives events using the Lumberjack protocol
- Pulls events from a RabbitMQ exchange
- Reads events from a Redis instance
- Streams events from files in an S3 bucket
- Pulls events from an Amazon Web Services Simple Queue Service queue
- Reads events from standard input
- Reads syslog messages as events
- Reads events from a TCP socket
- Reads events from the Twitter Streaming API
- Reads events over UDP
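Several of the inputs above can be combined in one input block; for example beats and tcp together (the port numbers and JSON codec are assumptions):

```conf
input {
  beats {
    port => 5044        # Filebeat and friends ship here
  }
  tcp {
    port => 5000        # raw events over a TCP socket
    codec => json       # assuming the sender emits JSON lines
  }
}
```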
Community supported plugins
These plugins are maintained and supported by the community. These plugins have met the Logstash development & testing criteria for integration. Contributors include Community Maintainers, the Logstash core team at Elastic, and the broader community.
- Pulls events from the Amazon Web Services CloudWatch API
- Retrieves watchdog log events from Drupal installations with DBLog enabled
- Pulls events from the Windows Event Log
- Captures the output of a shell command as an event
- Reads Ganglia packets over UDP
- Pushes events to a GemFire region
- Reads events from a GitHub webhook
- Streams events from the logs of a Heroku app
- Reads mail from an IMAP server
- Reads events from an IRC server
- Retrieves metrics from remote Java applications over JMX
- Receives events through an AWS Kinesis stream
- Captures the output of command line tools as an event
- Streams events from a long-running command pipe
- Receives facts from a Puppet server
- Receives events from a Rackspace Cloud Queue service
- Receives RELP events over a TCP socket
- Captures the output of command line tools as an event
- Creates events based on a Salesforce SOQL query
- Creates events based on SNMP trap messages
- Creates events based on rows in an SQLite database
- Creates events received with the STOMP protocol
- Reads events over a UNIX socket
- Reads from the …
- Reads events from a websocket
- Creates events based on the results of a WMI query
- Receives events over the XMPP/Jabber protocol
- Reads Zenoss events from the fanout exchange
- Reads events from a ZeroMQ SUB socket
filter
Elastic supported plugins
These plugins are maintained and supported by Elastic.
- Aggregates information from several events originating with a single task
- Replaces field values with a consistent hash
- Parses comma-separated value data into individual fields
- Parses dates from fields to use as the Logstash timestamp for an event
- Computationally expensive filter that removes dots from a field name
- Extracts unstructured event data into fields using delimiters
- Performs a standard or reverse DNS lookup
- Drops all events
- Fingerprints fields by replacing values with a consistent hash
- Adds geographical information about an IP address
- Parses unstructured event data into fields
- Parses JSON events
- Parses key-value pairs
- Merges multiple lines into a single event
- Performs mutations on fields
- Executes arbitrary Ruby code
- Sleeps for a specified time span
- Splits multi-line messages into distinct events
- Parses the …
- Throttles the number of events
- Replaces field contents based on a hash or YAML file
- Decodes URL-encoded fields
- Parses user agent strings into fields
- Adds a UUID to events
- Parses XML into fields
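A sketch chaining the commonly used grok, date, and mutate filters (assuming the incoming lines are Apache access logs):

```conf
filter {
  grok {
    # COMBINEDAPACHELOG is a stock pattern; it yields fields such as
    # clientip, verb, request, response, and timestamp
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    # promote the parsed timestamp field to the event's @timestamp
    match => ["timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
  }
  mutate {
    remove_field => ["message"]   # drop the raw line once parsed
  }
}
```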
Community supported plugins
These plugins are maintained and supported by the community. These plugins have met the Logstash development & testing criteria for integration. Contributors include Community Maintainers, the Logstash core team at Elastic, and the broader community.
- Performs general alterations to fields that the …
- Checks IP addresses against a list of network blocks
- Applies or removes a cipher to an event
- Duplicates events
- Collates events by time or count
- Calculates the elapsed time between a pair of events
- Copies fields from previous log events in Elasticsearch to current events
- Stores environment variables as metadata sub-fields
- Extracts numbers from a string
- Removes special characters from a field
- Serializes a field to JSON
- Adds arbitrary fields to an event
- Takes complex events containing a number of metrics and splits these up into multiple events, each holding a single metric
- Aggregates metrics
- Parses OUI data from MAC addresses
- Prunes event data based on a list of fields to blacklist or whitelist
- Strips all non-punctuation content from a field
- Checks that specified fields stay within given size or length limits
- Replaces the contents of the default message field with whatever you specify in the configuration
- Takes an existing field that contains YAML and expands it into an actual data structure within the Logstash event
- Sends an event to ZeroMQ
output
Elastic supported plugins
These plugins are maintained and supported by Elastic.
- Writes events to disk in a delimited format
- Stores logs in Elasticsearch
- Sends email to a specified address when output is received
- Writes events to files on disk
- Writes metrics to Graphite
- Sends events to a generic HTTP or HTTPS endpoint
- Writes events to a Kafka topic
- Sends events using the …
- Pushes events to a RabbitMQ exchange
- Sends events to a Redis queue using the …
- Sends Logstash events to the Amazon Simple Storage Service
- Prints events to the standard output
- Writes events over a TCP socket
- Sends events over UDP
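Outputs can be combined and made conditional. A sketch writing everything to Elasticsearch while echoing error events to the console (the index naming and the `level` field are assumptions):

```conf
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "app-%{+YYYY.MM.dd}"   # daily indices, assumed naming scheme
  }
  if [level] == "ERROR" {
    stdout { codec => rubydebug }   # print error events for debugging
  }
}
```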
Community supported plugins
These plugins are maintained and supported by the community. These plugins have met the Logstash development & testing criteria for integration. Contributors include Community Maintainers, the Logstash core team at Elastic, and the broader community.
- Sends annotations to Boundary based on Logstash events
- Sends annotations to Circonus based on Logstash events
- Aggregates and sends metric data to AWS CloudWatch
- Sends events to DataDogHQ based on Logstash events
- Sends metrics to DataDogHQ based on Logstash events
- Stores logs in Elasticsearch using the …
- Runs a command for a matching event
- Writes metrics to Ganglia's …
- Generates GELF formatted output for Graylog2
- Writes events to Google BigQuery
- Writes events to Google Cloud Storage
- Sends metric data on Windows
- Writes events to HipChat
- Writes metrics to InfluxDB
- Writes events to IRC
- Writes structured JSON events to JIRA
- Pushes messages to the Juggernaut websockets server
- Sends metrics, annotations, and alerts to Librato based on Logstash events
- Ships logs to Loggly
- Writes metrics to MetricCatcher
- Writes events to MongoDB
- Sends passive check results to Nagios
- Sends passive check results to Nagios using the NSCA protocol
- Sends Logstash events to New Relic Insights as custom events
- Writes metrics to OpenTSDB
- Sends notifications based on preconfigured services and escalation policies
- Pipes events to another program's standard input
- Sends events to a Rackspace Cloud Queue service
- Creates tickets using the Redmine API
- Writes events to the Riak distributed key/value store
- Sends metrics to Riemann
- Sends events to Amazon's Simple Notification Service
- Stores and indexes logs in Solr
- Pushes events to an Amazon Web Services Simple Queue Service queue
- Sends metrics using the …
- Writes events using the STOMP protocol
- Sends events to a …
- Sends Logstash events to HDFS using the …
- Publishes messages to a websocket
- Posts events over XMPP
- Sends events to a Zabbix server
- Writes events to a ZeroMQ PUB socket