Logstash Always Keeps One Message in the Pipeline


I am using Logstash to read and parse logs from a file and send them to a REST-based API. My shipper is working fine, but I am experiencing strange behavior.



When the Logstash shipper parses the first log entry, it does not send it; it keeps it in the pipeline. When it parses the second log entry, it sends the first one to the API. Hence one message always remains in the pipeline and is never sent to my API.

Whenever I stop my Logstash shipper process, it sends the last remaining message as well. So, in a sense, no message is lost, but the shipper is always one message behind.

Question: Why is Logstash unable to flush its pipeline and send each message to the API as soon as it is received?


You should paste your Logstash config and log format in order to get a precise answer, but from what you have described you seem to be using the multiline codec. Starting with Logstash 2.2, the multiline codec supports an auto_flush_interval option. It can be set to a number of seconds; if the multiline codec does not receive any new log line within that many seconds, it flushes the event pending in the pipeline to your API.
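The reason for the one-message lag can be sketched as follows. This is not Logstash source code, just a minimal Python illustration (with a hypothetical start-of-event pattern) of how a multiline codec buffers: an event is only known to be complete when a line matching the start-of-event pattern arrives, so the last event stays buffered until the next event begins, a timeout such as auto_flush_interval fires, or the process shuts down.

```python
import re

# Hypothetical pattern marking the start of a new log event.
START = re.compile(r"^\[\d{4}-\d{2}-\d{2}")

def multiline_events(lines):
    """Group raw lines into multiline events, as a multiline codec would."""
    buffered = []
    for line in lines:
        if START.match(line) and buffered:
            # A new event is starting, so the buffered one must be
            # complete: only now can it be emitted downstream.
            yield "\n".join(buffered)
            buffered = []
        buffered.append(line)
    # Without auto_flush_interval, the final event sits here until the
    # process stops; this last flush mimics the shutdown you observed.
    if buffered:
        yield "\n".join(buffered)

lines = [
    "[2016-01-01 10:00:00] first error",
    "  stack trace line",
    "[2016-01-01 10:00:05] second error",
]
events = list(multiline_events(lines))
# The second event is only emitted by the final (shutdown) flush.
```

auto_flush_interval removes that dependency on a "next line": after the configured number of idle seconds, the buffered event is emitted even though no new line has arrived.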

For an example and more information, see the following:

input {
  file {
    path => "$LogstashFilePathValue"
    type => "DemandwareError"
    tags => "$EnvironmentName"
    start_position => "beginning"
    sincedb_path => "NUL"
    codec => multiline {
      pattern => "\A\[%{TIMESTAMP_ISO8601:demandware_timestamp} GMT\]"
      negate => true
      what => "previous"
      auto_flush_interval => 10
    }
  }
}

The example is from this issue: https://github.com/elastic/logstash/issues/1482. For more information on auto_flush_interval, visit: https://www.elastic.co/guide/en/logstash/current/plugins-codecs-multiline.html#plugins-codecs-multiline-auto_flush_interval
