Local ES and Kibana for development

It can be handy to have a local Kibana and Elasticsearch instance running while developing the Spark ETL.

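As a quick smoke test once the containers are up (assuming the default ports and credentials described below), Elasticsearch can be queried on its published port:

```shell
# Verify the local Elasticsearch instance responds on its published port,
# using the stack's default credentials (elastic/changeme).
curl -u elastic:changeme http://127.0.0.1:9200
```

A successful response is a small JSON document with the cluster name and version; a connection refused error usually means the container is still starting.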
Change-Id: I7ceb54b4bd2ef8223dd4fd0f6565b33ef787a19e
diff --git a/README.md b/README.md
index 41f06b0..34a959a 100644
--- a/README.md
+++ b/README.md
@@ -22,4 +22,19 @@
     if not provided no ES export will be performed
 - -o --out folder location for storing the output as JSON files
     if not provided data is saved to </tmp>/analytics-<NNNN> where </tmp> is
-    the system temporary directory
\ No newline at end of file
+    the system temporary directory
+
+## Development environment
+
+A Docker Compose file is provided to spin up an instance of Elasticsearch with Kibana locally.
+Just run `docker-compose up`.
+
+Kibana will run on port `5601` and Elasticsearch on port `9200`.
+
+### Default credentials
+
+The Elasticsearch default user is `elastic` and the default password is `changeme`.
+
+### Caveats
+
+If Elasticsearch dies with `exit code 137`, you might have to give Docker more memory ([check this article for more details](https://github.com/moby/moby/issues/22211)).
\ No newline at end of file
diff --git a/docker-compose.yaml b/docker-compose.yaml
new file mode 100644
index 0000000..7c646e1
--- /dev/null
+++ b/docker-compose.yaml
@@ -0,0 +1,29 @@
+version: '3'
+
+services:
+
+  elasticsearch:
+    image: docker.elastic.co/elasticsearch/elasticsearch:5.5.2
+    ports:
+      - "9200:9200"
+      - "9300:9300"
+    environment:
+      - ES_JAVA_OPTS=-Xmx256m -Xms256m
+      - http.host=0.0.0.0
+      - http.publish_host=127.0.0.1
+    networks:
+      - ek
+
+  kibana:
+    image: docker.elastic.co/kibana/kibana:5.5.2
+    ports:
+      - "5601:5601"
+    networks:
+      - ek
+    depends_on:
+      - elasticsearch
+
+networks:
+
+  ek:
+    driver: bridge