From the MaxMind website (https://dev.maxmind.com/geoip/geoip2/geoip2-city-country-csv-databases/), I gathered that postal codes for the UK are returned as only the first 2-4 characters.
Unfortunately, this sends all of our parsed entries to the dead letter queue, as the geoip plugin seems to deem them to be in an invalid format.
Version: 6.4.1
Operating System: CentOS 7.4
Config File (if you have sensitive info, please remove it):
We parse our webserver logs and use the geoip filter on the clientip field. All of our webserver log entries end up in the dead letter queue, because the geoip plugin appears to be throwing this error:
Could not index event to Elasticsearch. status: 400, action: ["index", {:_id=>nil, :_index=>"filebeat-6.4.1-2018.12.12", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x2e831888>], response: {"index"=>{"_index"=>"filebeat-6.4.1-2018.12.12", "_type"=>"doc", "_id"=>"pnGhoWcBhV0O84dhLCIa", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [geoip.postal_code]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"EC2V\""}
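For context, the relevant filter block looks roughly like the following (a minimal sketch; the field name and our actual grok parsing are omitted, so treat this as illustrative rather than our exact config):

```
filter {
  geoip {
    # "clientip" is the field our grok filter extracts from the access log
    source => "clientip"
  }
}
```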
The error is coming back from your Elasticsearch output, not from the GeoIP filter.
Another user experienced this error and discovered that Elasticsearch had auto-detected the field as a "date", because the first entry it received for the day happened to look like a date. They proposed explicitly adding the field to the index template used by the Elasticsearch output plugin: logstash-plugins/logstash-output-elasticsearch#788
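For anyone else hitting this, an explicit mapping along these lines pins the field to a string type so date detection never applies. This is only a sketch: the template name and index pattern are assumptions, and the single `doc` mapping type matches the 6.x index shown in the error above:

```json
PUT _template/filebeat-geoip-postal-code
{
  "index_patterns": ["filebeat-*"],
  "mappings": {
    "doc": {
      "properties": {
        "geoip": {
          "properties": {
            "postal_code": { "type": "keyword" }
          }
        }
      }
    }
  }
}
```

Note that the mapping only takes effect for newly created indices; existing daily indices keep whatever type was auto-detected when they were created.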
@yaauie Thank you so much! Indeed, this seems to be exactly what's going on. Funnily enough, in today's index the field is of type text and I get no failures, but Kibana shows the field with conflicting types for the past few months. Now it makes sense why we didn't notice this on the initial rollout.
I will go ahead and define the mapping in our index template.