This is a project to make a data visualization with the open data of Lille. All the data were found at https://opendata.lillemetropole.fr/ and exported to CSV. You can find my exports in the data directory.
Install:
- Java 8 (http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html)
- Kibana (https://www.elastic.co/downloads/kibana), or on macOS run:
brew install kibana
- Elasticsearch (https://www.elastic.co/downloads/elasticsearch), or on macOS run:
brew install elasticsearch && brew info elasticsearch
- Logstash (https://www.elastic.co/downloads/logstash), or on macOS run:
brew install logstash
- Start the services (Elasticsearch, Logstash, Kibana).
- Create a file logstash_filename.conf and run it in your shell (see the "Get started with Logstash" part).
- Go to Kibana (http://localhost:5601) in your browser.
- In the Kibana navigation, go to Dev Tools and add the code from the "Get started with Kibana" part.
- Run the request GET items/_count and check the resulting count.
- In the Kibana navigation, go to Management and Create Index Pattern. The index name must match the one defined in logstash_filename.conf. You can then see your data in Discover.
- In the Kibana navigation, go to Visualize and click the + button to create a new visualization. Choose the Coordinate Map, select your index pattern, choose the Geohash aggregation and the geo point column, then click run.
- It should be working! Enjoy.
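The Coordinate Map buckets your documents by geohash, a short string that encodes a latitude/longitude pair at a chosen precision. As a sketch of what those aggregation keys look like, here is a minimal encoder for the standard geohash algorithm (this is illustrative code, not Kibana's internals; the Lille coordinates below are an example of my own):

```python
# Minimal geohash encoder: interleave longitude/latitude range-halving
# bits, then pack them 5 at a time into the geohash base32 alphabet.
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"

def geohash(lat: float, lon: float, precision: int = 11) -> str:
    lat_lo, lat_hi = -90.0, 90.0
    lon_lo, lon_hi = -180.0, 180.0
    bits = []
    use_lon = True  # a geohash starts with a longitude bit
    while len(bits) < precision * 5:
        if use_lon:
            mid = (lon_lo + lon_hi) / 2
            if lon >= mid:
                bits.append(1)
                lon_lo = mid
            else:
                bits.append(0)
                lon_hi = mid
        else:
            mid = (lat_lo + lat_hi) / 2
            if lat >= mid:
                bits.append(1)
                lat_lo = mid
            else:
                bits.append(0)
                lat_hi = mid
        use_lon = not use_lon
    # Every 5 bits become one base32 character.
    return "".join(
        BASE32[int("".join(map(str, bits[i:i + 5])), 2)]
        for i in range(0, precision * 5, 5)
    )

# Shorter precision = bigger map bucket:
print(geohash(50.63297, 3.05858, 5))  # a 5-character bucket around Lille
```

Points whose geohash strings share a prefix fall into the same map bucket, which is exactly how the Coordinate Map groups nearby post offices together at low zoom levels.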
Open the Kibana configuration file:
- kibana.yml
Uncomment the directives defining the Kibana port and the Elasticsearch instance:
server.port: 5601
elasticsearch.url: "http://localhost:9200"
Then start Kibana:
$ brew services start kibana
Go to http://localhost:5601 in your browser.
In Kibana, click on the Dev Tools
view and add this code if you want to index and view your data:
PUT items
{
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "doc": {
      "properties": {
        "geo_point_2d": { "type": "geo_point" }
      }
    }
  }
}
Search all documents:
GET items/_search
{
  "query": {
    "match_all": {}
  }
}
Count the documents:
GET items/_count
Delete the index if you need to start over:
DELETE items
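Each Dev Tools line is just an HTTP method plus a path on your Elasticsearch host, so you can send the same requests with any HTTP client. A small sketch translating console request lines into full URLs (the host is an assumption matching the default local setup):

```python
# Translate a Kibana Dev Tools request line (e.g. "GET items/_count")
# into the HTTP method and full Elasticsearch URL it stands for.
# ES_HOST is assumed to be the default local instance.
ES_HOST = "http://localhost:9200"

def console_to_http(line: str) -> tuple:
    method, path = line.split(None, 1)
    return method, "%s/%s" % (ES_HOST, path.lstrip("/"))

for req in ("PUT items", "GET items/_search", "GET items/_count", "DELETE items"):
    print(console_to_http(req))
```

For example, GET items/_count becomes a GET request to http://localhost:9200/items/_count, which you could equally send with curl.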
Run this command:
$ brew services start logstash
Create the file logstash_filename.conf
in your shell with this command:
sudo vim logstash_filename.conf
Add this code to your Logstash config file (you can customize the filter object according to your needs):
input {
  stdin {
    type => "stdin-type"
  }
  file {
    path => [ "/PATH_DIR/dataviz-lille/data/bureaux-de-poste.csv" ]
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ";"
    columns => [
      "geometry",
      "commune",
      "libelle",
      "code-postal",
      "objectid",
      "geo_point_2d"
    ]
  }
  # mutate is a separate filter plugin, so it goes next to csv, not inside it:
  # mutate {
  #   convert => {"geo_point_2d" => "float"}
  # }
}
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "NAME_OF_YOUR_INDEX_PATTERN"
  }
  stdout { codec => rubydebug }
}
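To see what the csv filter above does to each row, here is a Python sketch of the same parsing step (the sample line is invented for illustration; real rows in bureaux-de-poste.csv may differ):

```python
# Emulate Logstash's csv filter: split a semicolon-separated row and
# attach each value to the corresponding configured column name.
COLUMNS = ["geometry", "commune", "libelle", "code-postal", "objectid", "geo_point_2d"]

def parse_row(line: str) -> dict:
    # geo_point_2d stays a "lat,lon" string, which Elasticsearch
    # accepts directly for a geo_point-mapped field.
    return dict(zip(COLUMNS, line.split(";")))

# Hypothetical row shaped like the configured columns:
sample = "POINT(3.05 50.63);LILLE;Bureau de poste;59000;1;50.63,3.05"
print(parse_row(sample)["commune"])       # LILLE
print(parse_row(sample)["geo_point_2d"])  # 50.63,3.05
```

Because geo_point_2d is already a "lat,lon" string, the commented-out mutate conversion is not needed for the map visualization; it is left in the config only as a customization hook.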
Run the Logstash config file in your shell:
$ cd dataviz-lille
$ logstash -f logstash/filename.conf
Run this command:
$ brew services start elasticsearch
Visit http://localhost:9200 in your browser.
Verify that all your services are started with this command:
$ brew services list
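As a sketch of what to look for in that output, here is a small Python check that all three services report started (the sample listing below is invented; the exact format of brew services list can vary across Homebrew versions):

```python
# Check that every required service shows "started" in
# `brew services list`-style output (Name / Status / ... columns).
REQUIRED = {"elasticsearch", "kibana", "logstash"}

def started_services(listing: str) -> set:
    statuses = {}
    for row in listing.strip().splitlines()[1:]:  # skip the header row
        name, status = row.split()[:2]
        statuses[name] = status
    return {name for name in REQUIRED if statuses.get(name) == "started"}

# Hypothetical listing, shaped like brew's output:
sample = """Name          Status  User Plist
elasticsearch started dany /usr/local/...
kibana        started dany /usr/local/...
logstash      started dany /usr/local/..."""
print(started_services(sample) == REQUIRED)  # True
```

If any of the three names is missing from the returned set, start it again with brew services start before retrying the Kibana steps.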
- Dany PHENGSIAROUN