Upload Amazon Seller Central reports into Elastic (ELK stack). Currently supports the following reports:
- Amazon Fulfilled Shipments Report: https://www.amazon.com/gp/help/customer/display.html?ie=UTF8&nodeId=200592790
- All Orders Report: https://www.amazon.com/gp/help/customer/display.html/ref=hp_left_sib?ie=UTF8&nodeId=200592830
Download and install the ELK stack components:

- https://www.elastic.co/downloads/elasticsearch
- https://www.elastic.co/downloads/logstash
- https://www.elastic.co/downloads/kibana
To install the translate plugin, stop the Logstash service and run the following command on your Logstash host:

```
/usr/share/logstash/bin/logstash-plugin install logstash-filter-translate
```
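You can confirm the plugin is present by listing the installed plugins:

```
/usr/share/logstash/bin/logstash-plugin list | grep translate
```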
To be able to map buyers' addresses to a geo_point on the Kibana world map, we need to translate zip codes to geo locations. This is done in the filter section of the Logstash config file, using the translate filter together with a few mutate blocks.
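The lookup step itself is not spelled out here; the following is a minimal sketch of it, assuming a hypothetical YAML dictionary at `/etc/logstash/zip_to_geo.yml` that maps zip codes to `"lat,lon"` strings (the `ship-postal-code` field name comes from the report itself):

```
filter {
  translate {
    # look up the buyer's zip code in a local dictionary file
    field            => "ship-postal-code"
    destination      => "translation"
    dictionary_path  => "/etc/logstash/zip_to_geo.yml"  # hypothetical file, e.g.  "49404": "43.08,-85.95"
  }
  mutate {
    # split the "lat,lon" string into an array so [translation][0]/[1] can be referenced below
    split => { "translation" => "," }
  }
}
```

The array entries are then copied into explicit latitude/longitude fields, converted to floats, and renamed into a nested location object: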
```
mutate {
  add_field => ["latitude", "%{[translation][0]}"]
  add_field => ["longitude", "%{[translation][1]}"]
}
mutate {
  convert => {
    "longitude" => "float"
    "latitude" => "float"
  }
}
mutate {
  rename => {
    "longitude" => "[location][lon]"
    "latitude" => "[location][lat]"
  }
}
```
The new location field is then mapped as geo_point by an index template inside Elasticsearch. For Logstash to be able to upload this template into Elasticsearch, the template path must be defined in the output section of the Logstash config; the template is uploaded whenever Logstash starts or restarts. The relevant part of the template looks like this:

```
"properties": {
  "@version": { "type": "string", "index": "not_analyzed" },
  "location": { "type": "geo_point" }
}
```
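A minimal sketch of that output section, assuming Elasticsearch runs locally and the daily index naming scheme shown below (both assumptions; the template paths and name are taken from this setup):

```
output {
  elasticsearch {
    hosts              => ["localhost:9200"]
    # assumed index naming scheme; match it to your own setup
    index              => "mws-collector-reports-fulfillment-%{+YYYY.MM.dd}"
    # template settings so Logstash uploads the mapping on startup
    template           => "/etc/logstash/templates/mws-collector-reports-fulfillment.json"
    template_name      => "mws-collector-reports-fulfillment"
    template_overwrite => true
  }
}
```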
To manually import this template into Elasticsearch, run:

```
curl -XPUT 'http://localhost:9200/_template/mws-collector-reports-fulfillment' -d@/etc/logstash/templates/mws-collector-reports-fulfillment.json
```
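To check that the template was accepted, you can fetch it back:

```
curl -XGET 'http://localhost:9200/_template/mws-collector-reports-fulfillment?pretty'
```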
Everything should now be ready for the Amazon Seller Central CSV data sets. First generate the Fulfilled Shipments report in Seller Central and download it. Copy this file to the path defined in the input section of the Logstash config file:
```
input {
  file {
    path => "/tmp/fulfillment/*"
    type => "mws-collector-reports-fulfillment"  # matches the type seen in the logs below
  }
}
```
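Note that the report columns must be parsed into fields before the zip-code lookup can work. That step is not shown above; a minimal sketch using the csv filter, assuming the downloaded report is tab-separated with a header row (adjust the separator if your export differs):

```
filter {
  csv {
    # literal tab character between the quotes
    separator => "	"
    # take field names from the report's header row
    autodetect_column_names => true
  }
}
```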
Now tail the Logstash logs and check that the data is flowing correctly through Logstash:

```
tail -f /var/log/logstash/logstash-plain.log
```

```
[2017-03-01T09:38:50,385][DEBUG][logstash.pipeline ] output received {"event"=>{"bill-country"=>nil, "amazon-order-item-id"=>"45646456", "type"=>"mws-collector-reports-fulfillment", "tracking-number"=>"xxxxxxxxxxxxx", "path"=>"/tmp/fulfillment/AMAZON_FULFILLED_SHIPMENTS_DATA.csv", "amazon-order-id"=>"xxxxxxx", "item-promotion-discount"=>0.0, "estimated-arrival-date"=>"2017-02-07T04:00:00+00:00", "ship-postal-code"=>49404,
```
In Kibana, select Management->Index Patterns->Add New and create an index pattern for the new index.
Make sure that location is defined as type: geo_point (if not, you need to upload the Elasticsearch template).
The template should look like this inside Elasticsearch:
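One way to double-check the live mapping from the command line (the index pattern here assumes the naming scheme from the output sketch above):

```
curl -XGET 'http://localhost:9200/mws-collector-reports-fulfillment-*/_mapping?pretty'
```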
Next, create the map with Visualize->Tile Map, then select your index with location as the geo_point field.
The map should show the data like this: