
dockerfiles's People

Contributors

3vivekb, dianashk, jengeb, missinglink, orangejulius, tigerlily-he, trescube

dockerfiles's Issues

How to configure pelias.json if multiple regions/cities need to be imported?

Assuming that I need to import the datasets for Idaho and Oregon, I would adjust the imports in pelias.json. For example, I would set the importPlace id to Idaho's for the whosonfirst dataset as follows and run the build script.

    "whosonfirst": {
      "datapath": "/data/whosonfirst",
      "importVenues": false,
      "importPostalcodes": true,
      "importPlace": "85688657",
      "api_key": "your-api-key"
    }

After this I would like to import the dataset for Oregon with the importPlace id 85688513. What's the best way to do that? Changing the pelias.json file and running the build script again for each region?
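One way to script this (a sketch, not an official Pelias workflow — the sed pattern and the throwaway config file are assumptions for illustration) is to loop over the WOF ids, rewriting importPlace before each import run:

```shell
#!/bin/sh
# Sketch: loop over WOF region ids, rewriting importPlace before each run.
# A throwaway pelias.json stands in for the real one here.
set -eu

PELIAS_JSON="$(mktemp)"
cat > "$PELIAS_JSON" <<'EOF'
{
  "imports": {
    "whosonfirst": {
      "datapath": "/data/whosonfirst",
      "importPlace": "85688657"
    }
  }
}
EOF

# Replace the importPlace id in pelias.json in place.
set_import_place() {
  sed -i.bak "s/\"importPlace\": \"[0-9]*\"/\"importPlace\": \"$1\"/" "$PELIAS_JSON"
}

# Idaho (85688657) and Oregon (85688513)
for id in 85688657 85688513; do
  set_import_place "$id"
  # then re-run just the importers rather than the whole build, e.g.:
  #   docker-compose run --rm whosonfirst npm start
  echo "would import region $id"
done
```

Re-running only the importer containers for each region avoids repeating the full build for every id.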

Data Load / WOF issues

I'm having problems running build.sh with recent (August 2017) commits to the dockerfiles project. The problems seem to stem from WOF data not being available to the OA and OSM loaders.

A bit of history: back on August 1st, I was able to successfully run build.sh, and see a fully loaded Pelias system running (via the example pelias.json, thus data for just the Portland area). I was using dockerfiles commit 1e68574 from master.

On August 17th, after a git update (now on commit 77ba1be), I tried to re-run build.sh, which failed. On this same server, I nuked all of the Docker images and volumes and tried re-running build.sh: pretty much the same failures, with the OA and OSM loaders hung because (seemingly) there was no WOF data available in $DATA_DIR/whosonfirst (the meta directory and meta/whosonfirst_bundle_index.txt are there, but nothing else).

Yesterday I provisioned a fresh server and ran the dockerfiles again from master (commit 77ba1be). I am again seeing similar issues, with WOF data missing.

I'm attaching two console logs. The first, fail-1.txt, is the initial run of build.sh on the fresh server. The second, fail-2.txt, is from nuking the Pelias Docker stuff from step one and running build.sh a second time.

BTW, here's what my $DATA_DIR looks like:

var/data/elasticsearch/:
elasticsearch

var/data/interpolation/:

var/data/openaddresses/:
README.txt us

var/data/openstreetmap/:
portland_oregon.osm.pbf

var/data/placeholder/:
graph.json store.sqlite3 wof.extract

var/data/tiger/:
downloads shapefiles

var/data/whosonfirst/:
meta

And docker ps shows the following running after nuking and re-running build.sh:

CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
71c969760a99 pelias/openstreetmap "npm start" 19 hours ago Up 19 hours dockerfiles_openstreetmap_run_1
de4c0170d5bd pelias/openaddresses "npm start" 19 hours ago Up 19 hours dockerfiles_openaddresses_run_1
68ddcf0aafd7 pelias/elasticsearch "/docker-entrypoint.s" 19 hours ago Up 19 hours 0.0.0.0:9200->9200/tcp, 0.0.0.0:9300->9300/tcp pelias_elasticsearch

libpostal error on build

I'm trying to use the Docker version of Pelias: I cloned the repository and ran ./build, and after an hour of work I finally get this log with errors:

FAIL: test_libpostal
============================================================================
Testsuite summary for libpostal 1.0.0
============================================================================
# TOTAL: 1
# PASS:  0
# SKIP:  0
# XFAIL: 0
# FAIL:  1
# XPASS: 0
# ERROR: 0
============================================================================
See test/test-suite.log
============================================================================
make[3]: *** [test-suite.log] Error 1
Makefile:794: recipe for target 'test-suite.log' failed
make[3]: Leaving directory '/code/libpostal/test'
make[2]: *** [check-TESTS] Error 2
make[1]: *** [check-am] Error 2
Makefile:900: recipe for target 'check-TESTS' failed
make[2]: Leaving directory '/code/libpostal/test'
Makefile:973: recipe for target 'check-am' failed
make[1]: Leaving directory '/code/libpostal/test'
make: *** [check-recursive] Error 1
Makefile:454: recipe for target 'check-recursive' failed
ERROR: Service 'libpostal_baseimage' failed to build: The command '/bin/sh -c ./bootstrap.sh &&     ./configure --datadir=/usr/share/libpostal &&     make && make check && make install &&     ldconfig' returned a non-zero code: 2

real	63m32.221s
user	0m10.204s
sys	0m1.200s

The process ends, and when I run docker-compose ps only the following are running:

pelias_baseimage       /bin/bash                        Exit 0                                                      
pelias_elasticsearch   /docker-entrypoint.sh elas ...   Up           0.0.0.0:9200->9200/tcp, 0.0.0.0:9300->9300/tcp 
pelias_pip             npm start -- /data/whosonfirst   Restarting                                                  
pelias_placeholder     npm start                        Up           0.0.0.0:4100->4100/tcp 

So maybe I need to do a manual installation instead?

Interpolation not working as of November 7, 2017

I am again seeing interpolation not working properly (November 7, 2017). I had seen similar issues previously (see #25). This is on a system cleaned of older Docker containers and data, running the dockerfiles' build.sh with minimal changes (the only change is adding the API key to pelias.json).

This call should show an interpolated address:
https://ws-st.trimet.org/pelias/v1/search?text=888%20SE%20Lambert%20St

But instead, I get this (screenshot attached):

I'm guessing these build.sh errors are part of the problem:
info: [update_tiger] Downloaded tl_2016_41071_addrfeat.zip
Starting pelias_baseimage
Traceback (most recent call last):
  File "/usr/bin/docker-compose", line 9, in <module>
    load_entry_point('docker-compose==1.9.0', 'console_scripts', 'docker-compose')()
  File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 65, in main
    command()
  File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 117, in perform_command
    handler(command, command_options)
  File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 712, in run
    run_one_off_container(container_options, self.project, service, options)
  File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 998, in run_one_off_container
    pty.start(sockets)
  File "/usr/lib/python2.7/site-packages/dockerpty/pty.py", line 334, in start
    self._hijack_tty(pumps)
  File "/usr/lib/python2.7/site-packages/dockerpty/pty.py", line 373, in _hijack_tty
    pump.flush()
  File "/usr/lib/python2.7/site-packages/dockerpty/io.py", line 378, in flush
    raise e
OSError: [Errno 9] Bad file descriptor
Starting pelias_baseimage

Here's the full log from running dockerfiles/build.sh:
build.log

Importers Error

I've changed the importer's source to S3. Now I'm encountering an npm error while executing build.sh:

info: [download] Attempting to download selected data files: https://s3.amazonaws.com/data.openaddresses.io/openaddr-collected-global.zip
info: [download] Downloading https://s3.amazonaws.com/data.openaddresses.io/openaddr-collected-global.zip
info: [download_data_filtered] Getting parent ids
error: [download_data_filtered] { Error: write EPROTO 140614379554624:error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure:../deps/openssl/openssl/ssl/s23_clnt.c:802:

Any ideas?

Missing WOF data?

There seem to be references to missing data in the ./build.sh script. $DATA_DIR is defined and has sufficient free space (50GB+). Running Fedora 25.

I get the following, followed by a hang after "Build completed!". At that point, docker-compose ps shows the output in the attached screenshot (2017-10-23 14-22-40).

Suggestions?

events.js:141
      throw er; // Unhandled 'error' event
      ^

Error: ENOENT: no such file or directory, open '/data/whosonfirst/meta/wof-ocean-latest.csv'
    at Error (native)

npm ERR! Linux 4.12.9-200.fc25.x86_64
npm ERR! argv "/usr/local/bin/node" "/usr/local/bin/npm" "start"
npm ERR! node v4.8.4
npm ERR! npm  v3.10.10
npm ERR! code ELIFECYCLE
npm ERR! [email protected] start: `node import.js`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the [email protected] start script 'node import.js'.
npm ERR! Make sure you have the latest version of node.js and npm installed.
npm ERR! If you do, this is most likely a problem with the pelias-whosonfirst package,
npm ERR! not with npm itself.
npm ERR! Tell the author that this fails on your system:
npm ERR!     node import.js
npm ERR! You can get information on how to open an issue for this project with:
npm ERR!     npm bugs pelias-whosonfirst
npm ERR! Or if that isn't available, you can get their info via:
npm ERR!     npm owner ls pelias-whosonfirst
npm ERR! There is likely additional logging output above.

npm ERR! Please include the following file with any support request:
npm ERR!     /code/pelias/whosonfirst/npm-debug.log

The same events.js:141 / ENOENT pair then repeats (interleaved across the importer processes) for each of the other meta files, most of them several times: wof-borough-latest.csv, wof-country-latest.csv, wof-county-latest.csv, wof-dependency-latest.csv, wof-localadmin-latest.csv, wof-locality-latest.csv, wof-macrocounty-latest.csv, wof-macroregion-latest.csv, wof-neighbourhood-latest.csv and wof-region-latest.csv. The log ends with:

- archiving street database
- conflating openaddresses
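Before kicking off the import again, it may save an hour to verify up front that the meta CSVs the importer opens are actually present. A small sketch (the placetype list is copied from the ENOENT errors above; the directory layout is an assumption), demonstrated here against a throwaway directory:

```shell
#!/bin/sh
# Count which wof-*-latest.csv meta files are missing under a WOF data dir.
set -u

check_wof_meta() {
  dir="$1"
  missing=0
  for p in ocean country dependency macroregion region macrocounty county \
           localadmin locality borough neighbourhood; do
    f="$dir/meta/wof-$p-latest.csv"
    if [ ! -f "$f" ]; then
      echo "missing: $f"
      missing=$((missing + 1))
    fi
  done
}

# Demo against a throwaway directory containing only one of the files:
demo="$(mktemp -d)"
mkdir -p "$demo/meta"
touch "$demo/meta/wof-country-latest.csv"
check_wof_meta "$demo"
echo "$missing file(s) missing"
```

Pointing `check_wof_meta` at $DATA_DIR/whosonfirst before running the importers would flag the missing-download situation described in this issue.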



env var DATA_DIR not set

Maybe a dumb question, but even though I created the /tmp/data folder, upon running ./build.sh I'm getting an "env var DATA_DIR not set" error.
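build.sh reads the DATA_DIR environment variable, so creating the directory isn't enough: the variable has to be exported in the shell that runs the script. A sketch mirroring that kind of check (the guard function is an illustration, not build.sh's exact code):

```shell
#!/bin/sh
# Illustration of a DATA_DIR guard; export the variable before build.sh.
require_data_dir() {
  if [ -z "${DATA_DIR:-}" ]; then
    echo "env var DATA_DIR not set" >&2
    return 1
  fi
  echo "using DATA_DIR=$DATA_DIR"
}

unset DATA_DIR
require_data_dir || status_unset=$?   # fails: the variable is not set

export DATA_DIR=/tmp/data             # the fix: export it, then run ./build.sh
require_data_dir && status_set=$?
```

In other words: `export DATA_DIR=/tmp/data` (or add it to your shell profile) before invoking ./build.sh.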

Docker npm error

Thank you for creating this Pelias Docker container! I've been following along with the "How to setup" guide and I came across an error after running the docker-compose run --rm whosonfirst npm run download command.

user@ubuntu:~/osGeo/pelias/dockerfiles$ docker-compose run --rm whosonfirst npm run download
Starting pelias_baseimage ... 
Starting pelias_baseimage ... done

> [email protected] download /code/pelias/whosonfirst
> node ./utils/download_data.js

/code/pelias/whosonfirst/node_modules/mergeable/lib/Mergeable.js:70
    throw new Error( 'failed to merge config from path:' + path );
    ^

Error: failed to merge config from path:/code/pelias.json
    at Config._requirePath (/code/pelias/whosonfirst/node_modules/mergeable/lib/Mergeable.js:70:11)
    at Config.deepMergeFromPath (/code/pelias/whosonfirst/node_modules/mergeable/lib/Mergeable.js:20:21)
    at getConfig (/code/pelias/whosonfirst/node_modules/pelias-config/index.js:45:37)
    at Object.generate (/code/pelias/whosonfirst/node_modules/pelias-config/index.js:23:18)
    at Object.<anonymous> (/code/pelias/whosonfirst/utils/download_data.js:1:105)
    at Module._compile (module.js:409:26)
    at Object.Module._extensions..js (module.js:416:10)
    at Module.load (module.js:343:32)
    at Function.Module._load (module.js:300:12)
    at Function.Module.runMain (module.js:441:10)

npm ERR! Linux 4.4.0-97-generic
npm ERR! argv "/usr/local/bin/node" "/usr/local/bin/npm" "run" "download"
npm ERR! node v4.8.4
npm ERR! npm  v3.10.10
npm ERR! code ELIFECYCLE
npm ERR! [email protected] download: `node ./utils/download_data.js`
npm ERR! Exit status 1
npm ERR! 
npm ERR! Failed at the [email protected] download script 'node ./utils/download_data.js'.
npm ERR! Make sure you have the latest version of node.js and npm installed.
npm ERR! If you do, this is most likely a problem with the pelias-whosonfirst package,
npm ERR! not with npm itself.
npm ERR! Tell the author that this fails on your system:
npm ERR!     node ./utils/download_data.js
npm ERR! You can get information on how to open an issue for this project with:
npm ERR!     npm bugs pelias-whosonfirst
npm ERR! Or if that isn't available, you can get their info via:
npm ERR!     npm owner ls pelias-whosonfirst
npm ERR! There is likely additional logging output above.

npm ERR! Please include the following file with any support request:
npm ERR!     /code/pelias/whosonfirst/npm-debug.log

My pelias.json file is the same as the one in the "How to setup" guide.

    {
      "esclient": {
        "hosts": [{ "host": "elasticsearch" }]
      },
      "api": {
        "textAnalyzer": "libpostal",
        "services": {
          "placeholder": {
            "url": "http://placeholder:4100"
          }
        }
      },
      "imports": {
        "whosonfirst": {
          "datapath": "/data/whosonfirst",
          "importVenues": false,
          "importPostalcodes": true,
          "importPlace": "85950361",
          "api_key": "your-mapzen-api-key"
        }
      }
    }

Any help on this error would be much appreciated!
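"failed to merge config from path:/code/pelias.json" generally means the file could not be read or parsed as JSON. One common cause when the config is copied out of the setup guide PDF is invisible zero-width spaces (U+200B) ending up in the file, which break parsing while looking identical to a valid file. A detection and cleanup sketch (the sample file is fabricated for illustration):

```shell
#!/bin/sh
# Detect and strip zero-width spaces (U+200B, bytes e2 80 8b) from a file.
set -eu

has_zero_width_space() {
  grep -q "$(printf '\342\200\213')" "$1"
}

# Fabricated sample containing one zero-width space after the brace:
bad="$(mktemp)"
printf '{\342\200\213 "esclient": {} }\n' > "$bad"

clean="$(mktemp)"
if has_zero_width_space "$bad"; then
  found=yes
  # tr -d deletes the three constituent bytes, removing the character:
  tr -d '\342\200\213' < "$bad" > "$clean"
fi
```

Running the check against /code/pelias.json (or the pelias.json in your checkout) would confirm or rule out this cause; retyping the file by hand also works.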

REVISION=filtered-download

The docker-compose.yaml file contains references to branches, such as: args: [ "REVISION=filtered-download" ]

@dianashk are we safe to switch these to master? I can't recall whether all the associated branch PRs have been merged.
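If they have been merged, switching the args back could be done with a one-line sed over docker-compose.yaml; a sketch against a canned fragment (the real file has more context around these lines):

```shell
#!/bin/sh
# Rewrite REVISION build args to master in a docker-compose.yaml fragment.
set -eu

compose="$(mktemp)"
cat > "$compose" <<'EOF'
    build:
      context: .
      args: [ "REVISION=filtered-download" ]
EOF

sed -i.bak 's/REVISION=[A-Za-z0-9_-]*/REVISION=master/' "$compose"
cat "$compose"
```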

The link to sign up and generate a Mapzen API key doesn't work

The how-to guide states that you need an API key to configure the whosonfirst import. The sign-up link doesn't work right now. I am aware that Mapzen is shutting down its services. :-( So how will this be handled from now on?

I ask because it seems necessary to use an API key in pelias.json:

    "whosonfirst": {
      "datapath": "/data/whosonfirst",
      "importVenues": false,
      "importPostalcodes": true,
      "importPlace": "101715829",
      "api_key": "your-api-key"
    },

Plans to allow for CI builds, local development

Now that the Dockerfiles in this repository are fairly complete and becoming quite useful, we want to think a bit more about how we organize them to handle all the ways we want to use them and how we want them to be built.

Here are some things we want to be able to do

  • Build Docker images in a CI environment: so that builds can be done in a controlled and repeatable way. We also want to avoid tying up developer machines, since some builds can take a good amount of time.
  • Use Docker images with docker-compose to easily set up Pelias. For this use case we want images tagged on a public image repository (like Docker Hub) in a way that behaves like a git branch and updates over time, so users are always on the latest stable code without having to constantly modify docker-compose.yml
  • Use Docker images in Mapzen Search and other production environments via Kubernetes. For this case, and other production cases, we want images with very specific tags that never change, except when we choose to deploy new code. This allows confidence and control when working in production environments.
  • Set up local environments for development primarily using Docker images. This requires developers to be able to modify bits and pieces of their setup, but mostly follow a standardized system, configured with docker-compose.yml

After some discussion, we think we can handle all these cases. Of course we'll probably have to continue to tweak things over time, but we have a starting plan.

Here's what we'll do

  • Move Dockerfiles from this repository to individual repos. This change appears to be necessary, as there's really no good way to regularly build individual images via CI tools if the Dockerfile doesn't live in the same repository as the code it builds from. Fortunately this is also pretty standard, and it also solves our current problem of having two Dockerfiles for some parts of Pelias.
  • Modify Dockerfiles to copy local files, rather than using git. Otherwise, we can't build local changes or have branches.
  • Continue to use TravisCI, with no changes, to run unit/functional tests quickly for feedback when merging branches.
  • Use CircleCI to build Docker containers and publish them to Docker Hub (and possibly other container registries in the future) for every commit, with fully versioned tags in a format like $branch-$date-$git_hash, so that they are unique, never change, and are suitable for production use. We also publish "branch" tags that follow the branch name and change over time. It's worth using both CI tools so that the Travis code we use for determining when branches are ready to merge stays unchanged and stable.
  • Use Docker Hub's built-in image building utility to make the "branches" (images with tags in the format $branch that change over time). It turns out enabling these builds on existing projects is not possible, and we can easily replicate the functionality in CircleCI builds.

Possible Problems:

  • Some repos are standalone; can we ensure Dockerfiles for standalone use and general Pelias use co-exist? Maybe we will simply need several docker-compose files to handle different cases.
  • Where should the baseimages live? They could easily be in repositories of their own, or stay in this repo with some custom code to handle building them.
  • Running docker-compose build will no longer be possible. However, building all the images takes quite a long time and should not normally be needed.
  • Without docker-compose build, people will have to do more work to manage their local Docker images, as the Dockerfiles will no longer be known to the docker-compose file in this repo. Is it reasonable to expect newcomers, or any developer, to properly manage tagging their own builds for development? The current workflow might look like this:
  1. work on some code
  2. run docker build . -t pelias/project in that code directory
  3. in another terminal window, run docker-compose up or similar command to restart changed parts of the local environment
  4. repeat

Is that too much? It would be a few steps more than managing services with npm start as we currently do.
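For concreteness, a sketch of how the proposed $branch-$date-$git_hash tags could be assembled (the date format, the stand-in values, and the git commands in the comments are assumptions):

```shell
#!/bin/sh
# Assemble an immutable versioned tag plus a mutable branch tag.
set -eu

branch="master"       # e.g. "$(git rev-parse --abbrev-ref HEAD)"
date="2018-03-09"     # e.g. "$(date +%Y-%m-%d)"
git_hash="77ba1be"    # e.g. "$(git rev-parse --short HEAD)"

versioned_tag="$branch-$date-$git_hash"   # never changes: safe for production
branch_tag="$branch"                      # follows the branch over time

echo "pelias/api:$versioned_tag"
echo "pelias/api:$branch_tag"
```

Production deployments would pin the versioned tag, while docker-compose users would track the branch tag.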

cannot restart container

When trying to restart the containers, I typed "docker-compose down" as instructed in the wiki and got the errors below. Of course, "docker-compose up" doesn't start successfully either, and even after I ran kill -9 on the Pelias node process, build.sh doesn't work anymore.

Stopping pelias_api ... error
Stopping pelias_pip ... error
Stopping pelias_placeholder ... error
Stopping pelias_interpolation ... error
Stopping pelias_elasticsearch ... error

ERROR: for pelias_elasticsearch Cannot stop container 734892b101eed17790955b12c9d5576e8d42ccf4a3aaa145eb6e7ac3fad3f912: Cannot kill container 734892b101eed17790955b12c9d5576e8d42ccf4a3aaa145eb6e7ac3fad3f912: rpc error: code = 14 desc = grpc: the connection is unavailable

ERROR: for pelias_pip Cannot stop container 82644549ecf8026fc16cc02b56dc9377a4506e42a2e49e520bf0725ba894f645: Cannot kill container 82644549ecf8026fc16cc02b56dc9377a4506e42a2e49e520bf0725ba894f645: rpc error: code = 14 desc = grpc: the connection is unavailable

ERROR: for pelias_interpolation Cannot stop container 48f17d45306fcc2f1531da4261db6c0c297dc8db60a10d16febe25404d831a4f: Cannot kill container 48f17d45306fcc2f1531da4261db6c0c297dc8db60a10d16febe25404d831a4f: rpc error: code = 14 desc = grpc: the connection is unavailable

ERROR: for pelias_placeholder Cannot stop container cab862063e92632b604cec222dd312130824efde3d5d1b53f87039ad59054220: Cannot kill container cab862063e92632b604cec222dd312130824efde3d5d1b53f87039ad59054220: rpc error: code = 14 desc = grpc: the connection is unavailable

ERROR: for pelias_api Cannot stop container abf8d30cb4e26077bc18a3fc51e5dc79ee00cf3d5de7837592ca56799370851d: Cannot kill container abf8d30cb4e26077bc18a3fc51e5dc79ee00cf3d5de7837592ca56799370851d: rpc error: code = 14 desc = grpc: the connection is unavailable

Looking for paid installation service

Sorry, this is not the right place to post this.

Feel free to delete this issue after keeping it for a few days!

I need a self-hosted Pelias geocoder for our taxi booking app, but I am finding it quite difficult to install it by myself!

Can anyone help me install the Pelias geocoder on a DigitalOcean VPS? You can ask for payment!

If you help me install Pelias, how much would you charge for the service?

Wall of errors during successful setup, not sure what is wrong

Toward the end of the Docker setup of Pelias, I got trapped in what looked like an infinite loop printing many errors over and over. I hit Ctrl+C to stop this part of the process. Everything else installed cleanly and the geocoder is working on some addresses. Any idea what glitched here?

Elasticsearch ERROR: 2018-02-27T21:34:03Z
  Error: Request error, retrying
  POST http://elasticsearch:9200/_bulk => socket hang up
      at Log.error (/code/pelias/openaddresses/node_modules/elasticsearch/src/lib/log.js:225:56)
      at checkRespForFailure (/code/pelias/openaddresses/node_modules/elasticsearch/src/lib/transport.js:258:18)
      at HttpConnector.<anonymous> (/code/pelias/openaddresses/node_modules/elasticsearch/src/lib/connectors/http.js:157:7)
      at ClientRequest.bound (/code/pelias/openaddresses/node_modules/elasticsearch/node_modules/lodash/dist/lodash.js:729:21)
      at emitOne (events.js:77:13)
      at ClientRequest.emit (events.js:169:7)
      at Socket.socketCloseListener (_http_client.js:245:9)
      at emitOne (events.js:82:20)
      at Socket.emit (events.js:169:7)
      at TCP._onclose (net.js:490:12)

Elasticsearch WARNING: 2018-02-27T21:34:03Z
  Unable to revive connection: http://elasticsearch:9200/

No living connections

error: [dbclient] esclient error Error: No Living connections
    at sendReqWithConnection (/code/pelias/openaddresses/node_modules/elasticsearch/src/lib/transport.js:225:15)
    at next (/code/pelias/openaddresses/node_modules/elasticsearch/src/lib/connection_pool.js:213:7)
    at nextTickCallbackWith0Args (node.js:489:9)
    at process._tickCallback (node.js:418:13)
error: [dbclient] invalid resp from es bulk index operation

info: [dbclient] retrying batch [429]
error: [dbclient] [429] type=es_rejected_execution_exception, reason=rejected execution of org.elasticsearch.transport.TransportService$4@167c6af0 on EsThreadPoolExecutor[bulk, queue capacity = 50, org.elasticsearch.common.util.concurrent.EsThreadPoolExecutor@43c5e68[Running, pool size = 4, active threads = 4, queued tasks = 50, completed tasks = 110]]
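The [429] es_rejected_execution_exception above means the Elasticsearch bulk thread pool queue (capacity 50 here) overflowed; the importer retries, so occasional 429s are survivable, but if batches keep being dropped the usual options are throttling the importer or raising the queue size. A sketch of the elasticsearch.yml change (an assumption about your setup, not the project's shipped config; the setting name shown is the Elasticsearch 2.x one, renamed to thread_pool.bulk.queue_size in 5.x+):

```yaml
# elasticsearch.yml (Elasticsearch 2.x syntax)
threadpool.bulk.queue_size: 200
```

A larger queue only buys headroom; if the node is genuinely saturated, slowing the bulk writers is the more robust fix.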

Early Errors ... build hangs on ES startup

Not a lot has changed in the dockerfiles project since Sept 29 (last week); see the comments at the bottom of #25. But now I'm seeing some early errors, followed by the build hanging when trying to start Elasticsearch. See hung-es.txt.

Again, this is from a clean Docker system with all cached data, images, volumes, etc... nuked.

whosonfirst does not start with default pelias.json configuration

Hi everybody.
I'm trying to install and run a Pelias instance for testing and learning purposes. I'm on macOS Sierra.

I (think I) correctly followed the guidelines provided in the README, but I'm facing an issue related to the whosonfirst postal codes import.

Configuration
I used the default configuration (except that I put in my Mapzen API key!).

(screenshot of the configuration attached)

It means that:

  • only data from the Portland area will be imported
  • postal codes will be imported

The problem I have
The download phase (docker-compose run --rm whosonfirst npm run download) works fine. In particular, 26 postal code files were downloaded.

Below is an extract of the log (screenshot attached).

Later on, during the import phase (docker-compose run --rm whosonfirst npm start), I got an error when the program tries to import the postal codes data.

Here are the logs (screenshot attached).

My understanding is that, even if the scope is reduced to Portland ("importPlace": "101715829" is set in the pelias.json properties file), the program doesn't take this configuration into account and assume that all postal codes need to be indexed, and thus all postal codes files should be available.

Am I correct? Or am I missing something?

After some googling, I found whosonfirst issue #230. It seems to confirm my assumptions.

So, if this is correct, it would be worth:

  • changing the default pelias.json file to set "importPostalcodes": false
  • adding an explanation related to postal codes in the README
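That first change would look like this in pelias.json (fragment only; the other keys, taken from the default config quoted earlier in this page, stay as they are):

```json
    "whosonfirst": {
      "datapath": "/data/whosonfirst",
      "importVenues": false,
      "importPostalcodes": false,
      "importPlace": "101715829",
      "api_key": "your-api-key"
    }
```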

What do you think?

Best,
Philippe

error when executing "docker-compose run --rm whosonfirst npm run download" error:14077410:SSL routines:SSL23_GET_SERVER_HELLO

Hello colleagues, I tried to follow the instructions in the how_to_guide.pdf document, but I am stuck on this error that I could not solve:

PS C:\Users\andres.castillo\dockerfiles> docker-compose run --rm whosonfirst npm run download
Starting pelias_baseimage ... done

[email protected] download /code/pelias/whosonfirst
node ./utils/download_data.js

2018-03-09T16:53:06.032Z - info: [download_data_filtered] Getting parent ids
2018-03-09T16:53:06.304Z - error: [download_data_filtered] { Error: write EPROTO 140406295983936:error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure:../deps/openssl/openssl/ssl/s23_clnt.c:802:

at exports._errnoException (util.js:1020:11)
at WriteWrap.afterWrite (net.js:812:14) code: 'EPROTO', errno: 'EPROTO', syscall: 'write' }

2018-03-09T16:53:06.307Z - error: [download_data_filtered] "failed to get place info"

I did the same procedure on Linux and the same error appeared. :(

This is my pelias.json:

    {
      "esclient": {
        "hosts": [{ "host": "elasticsearch" }]
      },
      "api": {
        "textAnalyzer": "libpostal",
        "services": {
          "placeholder": {
            "url": "http://placeholder:4100"
          }
        }
      },
      "imports": {
        "whosonfirst": {
          "datapath": "/data/whosonfirst",
          "importVenues": false,
          "importPostalcodes": true,
          "importPlace": "85950361", // costa rica
          "api_key": "xxxx" // my apikey
        }
      }
    }

Interpolation and Polylines issues

Hi Diana,

I just tried rebuilding the example pelias.json in the dockerfiles directory. I'm not seeing polylines get processed, and as a result interpolation doesn't build. Below is a listing of the data directory (note that the interpolation folder is empty, and that there isn't a polylines directory):

[pelias_docker@cs-dv-mapgeo01 dockerfiles]$ ls ~/data/*
/srv/pelias_docker/data/elasticsearch:
elasticsearch

/srv/pelias_docker/data/interpolation:

/srv/pelias_docker/data/openaddresses:
README.txt us

/srv/pelias_docker/data/openstreetmap:
portland_oregon.osm.pbf

/srv/pelias_docker/data/placeholder:
graph.json store.sqlite3 wof.extract

/srv/pelias_docker/data/tiger:
downloads shapefiles

/srv/pelias_docker/data/whosonfirst:
data meta

Here's the failure from the log file:

[email protected] build /code/pelias/interpolation
./script/build.sh

  • importing polylines
    poyline line not found

npm ERR! Linux 3.10.0-514.26.2.el7.x86_64
npm ERR! argv "/usr/local/bin/node" "/usr/local/bin/npm" "run" "build"
npm ERR! node v4.8.4
npm ERR! npm v2.15.11
npm ERR! code ELIFECYCLE
npm ERR! [email protected] build: ./script/build.sh
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the [email protected] build script './script/build.sh'.
npm ERR! This is most likely a problem with the pelias-interpolation package,
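One thing to check: the interpolation build only imports polylines if a polyline extract was actually downloaded and configured. As a sketch (the datapath and the extract filename below are assumptions based on the dockerfiles defaults, not verified against this particular setup), pelias.json would need an import section along these lines for /data/polylines to be populated:

```json
{
  "imports": {
    "polyline": {
      "datapath": "/data/polylines",
      "files": [ "extract.0sv" ]
    }
  }
}
```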

mem_limit setting in docker-compose.yml was causing imports to fail

I am leaving an issue here for posterity and google-ability in case anyone encounters this.

Our custom importer consistently failed around 3.75M rows, with errors about lost socket connections... it would retry a number of times and eventually resume, but pelias-dbclient's logging reported that several batches had been lost.

After trying everything under the sun, the mem_limit setting turned out to be the cause. Once we removed it and rebuilt the container, the issue did not recur.
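For anyone searching later, the setting in question was a per-service memory cap in docker-compose.yml, shaped roughly like this (the service name and the 2g value are illustrative, not taken from the actual file):

```yaml
services:
  elasticsearch:
    # removing this cap resolved the lost-socket / dropped-batch errors
    mem_limit: 2g
```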

Interpolation maxes out one core & the other processes & cores are idle

In my pelias.json file I've configured it to import the state of Florida (USA). The virtual machine has 4 cores but only 1 seems to be utilized.

I see multiple threads running but most of them are idle. So this seems like a bug but perhaps it's expected behavior that hasn't been documented (or I didn't see it in the docs).

Is this expected, or can I make a change that will distribute the load across all threads and cores? I'd love to use 36 or 72 cores and shrink this processing window as much as possible.

Thank you


Libpostal errors

I've been trying a deployment on Ubuntu 16 (using the default config) and the build process around libpostal shows a lot of errors. The container itself builds successfully, but at runtime the error messages indicate that libpostal is missing.

Docker volume configuration

Not sure if this is a pelias or a docker issue, but in docker-compose.yml file you can see:

version: '2'
networks:
  pelias:
    driver: bridge
volumes:
  libpostaldata:   <-- what's the real path?
    driver: local
    ...

The configuration references libpostaldata, but I cannot figure out what absolute path on the host machine the libpostaldata volume actually maps to.
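Named volumes with the `local` driver are managed by Docker itself, so there is no path inside the project directory. On a stock Linux install the data typically lands under `/var/lib/docker/volumes/<volume-name>/_data`, and `docker volume inspect <volume-name>` prints the exact mountpoint (note that Compose usually prefixes the name with the project, e.g. `dockerfiles_libpostaldata`). If a fixed host path is preferred, the local driver can bind-mount a directory instead; the `/srv/libpostal` path below is just an example:

```yaml
volumes:
  libpostaldata:
    driver: local
    driver_opts:
      type: none
      o: bind
      device: /srv/libpostal
```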

The process hangs after downloads with message "All Done!"

When I do a fresh install, everything goes well until the data downloads finish, and then the process hangs forever:

Creating pelias_baseimage ... done

> [email protected] create_index /code/pelias/schema
> node scripts/create_index


--------------
 create index 
--------------

[put mapping] 	 pelias { acknowledged: true } 

Starting pelias_baseimage ... 
Starting pelias_baseimage
Starting pelias_baseimage ... done
Starting pelias_baseimage ... done
Creating pelias_libpostal_baseimage ... 
Starting pelias_baseimage ... 
Starting pelias_baseimage
Creating pelias_libpostal_baseimage ... done

> [email protected] download /code/pelias/openstreetmap
> node util/download_data.js


> [email protected] download /code/pelias/whosonfirst
> node ./utils/download_data.js


> [email protected] download /code/pelias/openaddresses
> node utils/download_data.js

info: [download] Downloading sources: http://download.geofabrik.de/south-america/colombia-latest.osm.pbf
info: [download_data_filtered] Getting parent ids
info: [download] Attempting to download all data
info: [download_data_filtered] Getting descendants
info: [download_data_filtered] Generating metafiles
info: [download] All done!
info: [download_data_filtered] Finished generating metafile for continent
info: [download_data_filtered] Finished generating metafile for country
info: [download_data_filtered] Finished generating metafile for dependency
info: [download_data_filtered] Finished generating metafile for disputed
info: [download_data_filtered] Finished generating metafile for macroregion
info: [download_data_filtered] Finished generating metafile for region
info: [download_data_filtered] Finished generating metafile for macrocounty

> [email protected] download-tiger /code/pelias/interpolation
> node script/js/update_tiger.js

info: [update_tiger] Downloaded tl_2016_41001_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41003_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41005_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41007_addrfeat.zip
info: [download_data_filtered] Finished generating metafile for county
info: [download_data_filtered] Finished generating metafile for localadmin
info: [update_tiger] Downloaded tl_2016_41009_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41011_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41013_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41015_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41017_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41019_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41021_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41023_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41025_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41027_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41029_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41031_addrfeat.zip
info: [download_data_filtered] Finished generating metafile for locality
info: [download_data_filtered] Finished generating metafile for borough
info: [download_data_filtered] Finished generating metafile for macrohood
info: [update_tiger] Downloaded tl_2016_41033_addrfeat.zip
info: [download_data_filtered] Finished generating metafile for neighbourhood
info: [update_tiger] Downloaded tl_2016_41035_addrfeat.zip
info: [download_data_filtered] Finished generating metafile for postalcode
info: [download_data_filtered] Finished generating all metafiles
info: [download_data_filtered] downloading continent
info: [update_tiger] Downloaded tl_2016_41037_addrfeat.zip
info: [download_data_filtered] downloading country
info: [download_data_filtered] downloading dependency
info: [download_data_filtered] downloading disputed
info: [download_data_filtered] downloading macroregion
info: [download_data_filtered] downloading region
info: [update_tiger] Downloaded tl_2016_41039_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41041_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41043_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41045_addrfeat.zip
info: [download_data_filtered] downloading macrocounty
info: [download_data_filtered] downloading county
info: [update_tiger] Downloaded tl_2016_41047_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41049_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41051_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41053_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41055_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41057_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41059_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41061_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41063_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41065_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41067_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41069_addrfeat.zip
info: [update_tiger] Downloaded tl_2016_41071_addrfeat.zip
info: [download_data_filtered] downloading localadmin
info: [download_data_filtered] downloading locality
info: [download_data_filtered] downloading borough
info: [download_data_filtered] downloading macrohood
info: [download_data_filtered] downloading neighbourhood
info: [download_data_filtered] downloading postalcode
info: [download_data_filtered] downloaded 1 continent records
info: [download_data_filtered] downloaded 1 country records
info: [download_data_filtered] downloaded 0 dependency records
info: [download_data_filtered] downloaded 2 disputed records
info: [download_data_filtered] downloaded 0 macroregion records
info: [download_data_filtered] downloaded 34 region records
info: [download_data_filtered] downloaded 0 macrocounty records
info: [download_data_filtered] downloaded 1122 county records
info: [download_data_filtered] downloaded 2 localadmin records
info: [download_data_filtered] downloaded 1093 locality records
info: [download_data_filtered] downloaded 0 borough records
info: [download_data_filtered] downloaded 0 macrohood records
info: [download_data_filtered] downloaded 111 neighbourhood records
info: [download_data_filtered] downloaded 9 postalcode records
All done!

Using top, I cannot see any related process running.

