

MusicBrainz mirror server with search and replication


This repo contains everything needed to run a MusicBrainz mirror server with search and replication in Docker.


Prerequisites

Recommended hardware/VM

  • CPU: 16 threads (or 2 without indexed search), x86-64 architecture
  • RAM: 16 GB (or 4 without indexed search)
  • Disk Space: 250 GB (or 100 without indexed search)

Required software

  • Docker
  • Docker Compose
  • Git

If you use Docker Desktop on macOS, you may need to increase the amount of memory available to containers from the default of 2GB:

  • Preferences > Resources > Memory

If you use Ubuntu 19.10 or later, the required software can be installed by running:

sudo apt-get update && \
sudo apt-get install docker.io docker-compose git && \
sudo systemctl enable --now docker.service

If you use UFW to manage your firewall:

  • ufw-docker or any other way to fix the Docker and UFW security flaw (by default, ports published by Docker bypass UFW rules); a command sketch follows below.
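For example, a minimal sketch assuming you go with the ufw-docker script; the container name is hypothetical (use the actual name from docker ps) and the exact subcommands may differ between ufw-docker versions, so check its documentation first:

sudo ufw-docker install
sudo ufw-docker allow musicbrainz-docker_musicbrainz_1 5000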

External documentation

Components version

  • Current MB Branch: v-2024-05-13-schema-change
  • Current DB_SCHEMA_SEQUENCE: 29
  • Postgres Version: 16 (can be changed by setting the environment variable POSTGRES_VERSION)
  • MB Solr search server: 3.4.2 (can be changed by setting the environment variable MB_SOLR_VERSION; see the .env example after this list)
  • Search Index Rebuilder: 3.0.1
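For example, both versions can be pinned from the .env file before building the images (a sketch; the values shown are just the current defaults listed above):

POSTGRES_VERSION=16
MB_SOLR_VERSION=3.4.2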

Installation

This section is about installing a MusicBrainz mirror server with locally indexed search and automatically replicated data.

Download this repository and change the current working directory with:

git clone https://github.com/metabrainz/musicbrainz-docker.git
cd musicbrainz-docker

If you want to mirror the Postgres database only (neither the website nor the web API), change the base configuration with the following command (run it as a first step: the with subcommand replaces any existing configuration rather than adding to it):

admin/configure with alt-db-only-mirror

Build Docker images

Docker images for composed services should be built once using:

sudo docker-compose build

Create database

⚙️ Postgres shared buffers are set to 2GB by default. Before running this step, you should consider modifying your memory settings (see Modify memory settings below) in order to give your database a sufficient amount of RAM, otherwise your database could run very slowly.

Download latest full data dumps and create the database with:

sudo docker-compose run --rm musicbrainz createdb.sh -fetch

Build materialized tables

This is an optional step.

MusicBrainz Server makes use of materialized (or denormalized) tables in production to improve the performance of certain pages and features. These tables duplicate primary table data and can take up several additional gigabytes of space, so they're optional but recommended. If you don't populate these tables, the server will generally fall back to slower queries in their place.

If you wish to build the materialized tables, you can run:

sudo docker-compose exec musicbrainz bash -c 'carton exec -- ./admin/BuildMaterializedTables --database=MAINTENANCE all'

Start website

Make the local website available at http://localhost:5000 with:

sudo docker-compose up -d
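To quickly check that the website and the web API respond (assuming the default host/port and that you did not choose the database-only mirror configuration), you can query them from the host; the MBID below is Nirvana's, which should be present in the full data dump:

curl -s -o /dev/null -w '%{http_code}\n' http://localhost:5000/
curl -s 'http://localhost:5000/ws/2/artist/5b11f4ce-a62d-471e-81fc-a69a8278c7da?fmt=json'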

At this point the local website will show data loaded from the dumps only. For indexed search and replication, keep going!

Set up search indexes

Depending on your available resources in CPU/RAM vs. bandwidth:

  • Either build search indexes manually from the installed database:

    sudo docker-compose exec indexer python -m sir reindex

    ⚙️ Java heap for Solr is set to 2GB by default. Before running this step, you should consider modifying your memory settings (see Modify memory settings below) in order to give your search server a sufficient amount of RAM, otherwise your search server could run very slowly.

    (This option is known to take 4½ hours with 16 CPU threads and 16 GB RAM.)

    To index cores individually, rather than all at once, add --entity-type CORE (any number of times) to the command above. For example sudo docker-compose exec indexer python -m sir reindex --entity-type artist --entity-type release

  • Or download pre-built search indexes based on the latest data dump:

    sudo docker-compose run --rm musicbrainz fetch-dump.sh search
    sudo docker-compose run --rm search load-search-indexes.sh

    (This option downloads about 30GB of Zstandard-compressed archives from the download server.)

⚠️ Search indexes are not included in replication. You will have to rebuild the search indexes regularly to keep them up to date. This can be done manually with the commands above, with live indexing (see below), or with a scheduled cron job. Here's an example cron job that can be added to your server's /etc/crontab file:

0 1 * * 7 YOUR_USER_NAME cd ~/musicbrainz-docker && /usr/bin/docker-compose exec -T indexer python -m sir reindex

At this point indexed search works on the local website/webservice. For replication, keep going!

Enable replication

Set replication token

First, copy your MetaBrainz access token (see the instructions for generating a token) and paste it when prompted by the following command:

admin/set-replication-token

The token will be written to the file local/secrets/metabrainz_access_token.

Then, grant access to the token for replication with:

admin/configure add replication-token
sudo docker-compose up -d

Run replication once

Run replication script once to catch up with latest database updates:

sudo bash -c 'docker-compose exec musicbrainz replication.sh &' && \
sudo docker-compose exec musicbrainz /usr/bin/tail -f mirror.log

Schedule replication

Enable replication as a cron job of the root user in the musicbrainz service container with:

admin/configure add replication-cron
sudo docker-compose up -d

By default, it replicates data every day at 3 am UTC. To change that, see advanced configuration.

You can view the replication log file while it is running with:

sudo docker-compose exec musicbrainz tail --follow mirror.log

You can view the replication log file after it is done with:

sudo docker-compose exec musicbrainz tail mirror.log.1

Enable live indexing

⚠️ Live updating of search indexes for mirror servers is not stable yet and should be considered an experimental feature. Do not use it if you don't want to get your hands dirty.

  1. Disable replication cron job if you enabled it:

    admin/configure rm replication-cron
    sudo docker-compose up -d
  2. Make the indexer go through the AMQP setup with:

    sudo docker-compose exec indexer python -m sir amqp_setup
    admin/create-amqp-extension
    admin/setup-amqp-triggers install
  3. Build search indexes if they either have not been built or are outdated.

  4. Make the indexer watch for reindex messages with:

    admin/configure add live-indexing-search
    sudo docker-compose up -d
  5. Re-enable the replication cron job if you disabled it in step 1.

    admin/configure add replication-cron
    sudo docker-compose up -d

Advanced configuration

Local changes

You should preferably not locally change any file tracked by Git. Check that your working tree is clean with:

git status

Git is set to ignore the following, which you are encouraged to write to:

  • .env file,
  • any new file under local directory.

Docker environment variables

There are many ways to set environment variables in Docker Compose, the most convenient here is probably to edit the hidden file .env.

You can then check values to be passed to containers using:

sudo docker-compose config

Finally, make Compose pick up configuration changes with:

sudo docker-compose up -d

Customize web server host:port

By default, the web server listens at http://localhost:5000

This can be changed using the two Docker environment variables MUSICBRAINZ_WEB_SERVER_HOST and MUSICBRAINZ_WEB_SERVER_PORT.

If MUSICBRAINZ_WEB_SERVER_PORT is set to 80 (http), then the port number will not appear in the base URL of the web server.

If set to 443 (https), then the port number will not appear either, but a separate reverse proxy is required to handle HTTPS correctly.
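For example, a sketch of the relevant .env lines for serving the mirror at a custom address (the host name is hypothetical; apply with sudo docker-compose up -d afterwards, and with port 443 put your own reverse proxy in front to terminate HTTPS, as noted above):

MUSICBRAINZ_WEB_SERVER_HOST=mb-mirror.example.org
MUSICBRAINZ_WEB_SERVER_PORT=443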

Customize the number of processes for MusicBrainz Server

By default, MusicBrainz Server uses 10 plackup processes at once.

This number can be changed using the Docker environment variable MUSICBRAINZ_SERVER_PROCESSES.

Customize download server

By default, data dumps and pre-built search indexes are downloaded from https://data.metabrainz.org/pub/musicbrainz.

The download server can be changed using the Docker environment variable MUSICBRAINZ_BASE_DOWNLOAD_URL.

For backwards compatibility reasons an FTP server can be specified using the MUSICBRAINZ_BASE_FTP_URL Docker environment variable. Note that support for this variable is deprecated and will be removed in a future release.

See the list of download servers for alternative download sources.
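For example, to download from an alternative server, set it in .env (the URL below is hypothetical; substitute a real entry from the list of download servers):

MUSICBRAINZ_BASE_DOWNLOAD_URL=https://mirror.example.org/pub/musicbrainz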

Customize replication schedule

By default, there is no crontab file in the musicbrainz service container.

If you followed the steps to schedule replication, then the crontab file used by the musicbrainz service is bound to default/replication.cron.

This can be changed by creating a custom crontab file under the local/ directory, and then setting the Docker environment variable MUSICBRAINZ_CRONTAB_PATH to its path.
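For example, a sketch of pointing the service at a custom crontab (it assumes the same format and command as default/replication.cron, and that a path relative to the project root is accepted; check that file for the exact fields and logging before adapting it):

cp default/replication.cron local/replication.cron
# edit local/replication.cron, e.g. to change the replication time, then:
echo MUSICBRAINZ_CRONTAB_PATH=./local/replication.cron >> .env
sudo docker-compose up -d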

Customize search indexer configuration

By default, the configuration file used by the indexer service is bound to default/indexer.ini.

This can be changed by creating a custom configuration file under the local/ directory, and then setting the Docker environment variable SIR_CONFIG_PATH to its path.
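For example (a sketch following the same pattern as the crontab above, with the same assumption about relative paths):

cp default/indexer.ini local/indexer.ini
# edit local/indexer.ini as needed, then:
echo SIR_CONFIG_PATH=./local/indexer.ini >> .env
sudo docker-compose up -d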

Customize backend Postgres server

By default, the services indexer and musicbrainz try to connect to the host db (for both the read-only and the read-write host), but the hosts can be customized using the MUSICBRAINZ_POSTGRES_SERVER and MUSICBRAINZ_POSTGRES_READONLY_SERVER environment variables.
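For example, to point both hosts at an external Postgres server, set the two variables in .env (the address below is hypothetical):

MUSICBRAINZ_POSTGRES_SERVER=10.0.0.42
MUSICBRAINZ_POSTGRES_READONLY_SERVER=10.0.0.42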

Notes:

  • After switching to another Postgres server:
    • If you are not transferring data, you will need to create the database again.
    • For live indexing, the RabbitMQ server still has to be reachable from the Postgres server.
  • The helper scripts check-search-indexes and create-amqp-extension won’t work anymore.
  • The service db will still be up even if unused.

Customize backend RabbitMQ server

By default, the services db, indexer and musicbrainz try to connect to the host mq, but the host can be customized using the MUSICBRAINZ_RABBITMQ_SERVER environment variable.

Notes:

  • After switching to another RabbitMQ server:
    • Live indexing requires going through the AMQP setup again.
    • If you are not transferring data, you might need to rebuild the search indexes.
  • The helper script purge-message-queues won’t work anymore.
  • The service mq will still be up even if unused.

Customize backend Redis server

By default, the service musicbrainz tries to connect to the host redis, but the host can be customized using the MUSICBRAINZ_REDIS_SERVER environment variable.

Notes:

  • After switching to another Redis server:
    • If not transferring data, MusicBrainz user sessions will be reset.
  • The service redis will still be running even if unused.

Docker Compose overrides

In Docker Compose, it is possible to override the base configuration using multiple Compose files.

Some overrides are available under compose directory. Feel free to write your own overrides under local directory.

The helper script admin/configure is able to:

  • list available compose files, with a descriptive summary
  • show the value of COMPOSE_FILE variable in Docker environment
  • set/update COMPOSE_FILE in .env file with a list of compose files
  • set/update COMPOSE_FILE in .env file with added or removed compose files

Try admin/configure help for more information.

Publish ports of all services

To publish the ports of the services db, mq, redis and search (in addition to musicbrainz) on the host, simply run:

admin/configure add publishing-all-ports
sudo docker-compose up -d

If you are running a database only mirror, run this instead:

admin/configure add publishing-db-port
sudo docker-compose up -d

Modify memory settings

By default, each of the db and search services has about 2GB of RAM. You may want to set more or less memory for any of these services, depending on your available resources or on your priorities.

For example, to set 4GB to each of db and search services, create a file local/compose/memory-settings.yml as follows:

version: '3.1'

# Description: Customize memory settings

services:
  db:
    command: postgres -c "shared_buffers=4GB" -c "shared_preload_libraries=pg_amqp.so"
  search:
    environment:
      - SOLR_HEAP=4g

See postgres for more configuration parameters and options to pass to the db service, and solr.in.sh for more environment variables to pass to the search service.

Then enable it by running:

admin/configure add local/compose/memory-settings.yml
sudo docker-compose up -d

Test setup

If you just need a small server with sample data to test your own SQL queries and/or MusicBrainz Web Service calls, you can run the below commands instead of following the above installation:

git clone https://github.com/metabrainz/musicbrainz-docker.git
cd musicbrainz-docker
admin/configure add musicbrainz-standalone
sudo docker-compose build
sudo docker-compose run --rm musicbrainz createdb.sh -sample -fetch
sudo docker-compose up -d

The two differences are:

  1. Sample data dump is downloaded instead of full data dumps,
  2. MusicBrainz Server runs in standalone mode instead of mirror mode.

The Set up search indexes and Enable live indexing steps are the same.

Replication is not applicable to test setup.

Development setup

Required disk space is much less than for the normal setup: 15GB to be safe.

The below sections are optional depending on which service(s) you are coding.

Local development of MusicBrainz Server

For local development of MusicBrainz Server, you can run the below commands instead of following the above installation:

git clone https://github.com/metabrainz/musicbrainz-server.git
MUSICBRAINZ_SERVER_LOCAL_ROOT=$PWD/musicbrainz-server
git clone https://github.com/metabrainz/musicbrainz-docker.git
cd musicbrainz-docker
echo MUSICBRAINZ_DOCKER_HOST_IPADDRCOL=127.0.0.1: >> .env
echo MUSICBRAINZ_SERVER_LOCAL_ROOT="$MUSICBRAINZ_SERVER_LOCAL_ROOT" >> .env
admin/configure add musicbrainz-dev
sudo docker-compose build
sudo docker-compose run --rm musicbrainz createdb.sh -sample -fetch
sudo docker-compose up -d

The main differences are:

  1. Sample data dump is downloaded instead of full data dumps,
  2. MusicBrainz Server runs in standalone mode instead of mirror mode,
  3. Development mode is enabled (but not Catalyst debug),
  4. JavaScript and resources are automatically recompiled on file changes,
  5. MusicBrainz Server is automatically restarted on Perl file changes,
  6. MusicBrainz Server code is in the musicbrainz-server/ directory,
  7. Ports are published to the host only (through MUSICBRAINZ_DOCKER_HOST_IPADDRCOL).

After changing code in musicbrainz-server/, restart the server to run it:

sudo docker-compose restart musicbrainz

The Set up search indexes and Enable live indexing steps are the same.

Replication is not applicable to development setup.

Simply restart the container when checking out a new branch.

Local development of Search Index Rebuilder

This is very similar to the above, but for Search Index Rebuilder (SIR); a command sketch follows the list:

  1. Set the variable SIR_LOCAL_ROOT in the .env file
  2. Run admin/configure add sir-dev
  3. Run sudo docker-compose up -d
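Put together, a sketch of the commands (it assumes SIR's repository is at https://github.com/metabrainz/sir and that SIR_LOCAL_ROOT expects a host path, like MUSICBRAINZ_SERVER_LOCAL_ROOT above):

git clone https://github.com/metabrainz/sir.git ../sir
echo SIR_LOCAL_ROOT=$PWD/../sir >> .env
admin/configure add sir-dev
sudo docker-compose up -d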


Local development of MusicBrainz Solr

The situation is quite different for this service as it doesn't depend on any other; its development rather relies on the schema. See mb-solr and mmd-schema.

However, other services depend on it, so it is useful to run a local version of mb-solr in search service for integration tests:

  1. Run build.sh from your mb-solr local working copy, which will build an image of metabrainz/mb-solr with a local tag reflecting the working tree status of your local clone of mb-solr.
  2. Set MB_SOLR_VERSION in .env to this local tag.
  3. Run sudo docker-compose up -d

Helper scripts

There are two directories with helper scripts:

  • admin/ contains helper scripts to be run from the host. For more information, use the --help option:

    admin/check-search-indexes --help
    admin/delete-search-indexes --help

    See also:

  • build/musicbrainz/scripts/ contains helper scripts to be run from the container attached to the service musicbrainz. Most of these scripts are not for direct use, except createdb.sh and the below-documented recreatedb.sh.

Recreate database

If you need to recreate the database, you will need to enter the postgres password set in postgres.env:

  • sudo docker-compose run --rm musicbrainz recreatedb.sh

or to fetch new data dumps before recreating the database:

  • sudo docker-compose run --rm musicbrainz recreatedb.sh -fetch

Recreate database with indexed search

If you need to recreate the database with indexed search, run:

admin/configure rm replication-cron # if replication is enabled
sudo docker-compose stop
sudo docker-compose run --rm musicbrainz fetch-dump.sh both
admin/purge-message-queues
sudo docker-compose run --rm search load-search-indexes.sh --force
sudo docker-compose run --rm musicbrainz recreatedb.sh
sudo docker-compose up -d
admin/setup-amqp-triggers install
admin/configure add replication-cron
sudo docker-compose up -d


Update

Check your working tree is clean with:

git status

Check your currently checked out version:

git describe --dirty

Check releases for update instructions.

Cleanup

Each time you rebuild a new image, for either updating to a new release or applying some configuration changes, the previous image is not removed. On the one hand, this is convenient as it allows you to quickly restore it in case the new image has critical issues. On the other hand, it fills your disk by a few gigabytes over time. It is thus recommended to do a regular cleanup as follows.

⚠️ If you are using Docker for anything other than this Compose project, the below command will also remove all unused images.

sudo docker system prune --all

Removal

Removing the directory isn’t enough: the Docker objects (images, containers, volumes) have to be removed too for a complete removal.

Before removing the directory where you cloned this repository, run the following command from that directory.

sudo docker-compose down --remove-orphans --rmi all --volumes

It will output what has been removed so that you can check it. Only once it has finished should you remove the directory.

Issues

If anything doesn't work, check the troubleshooting page.

If you still don’t have a solution, please create an issue with version information:

echo MusicBrainz Docker: `git describe --always --broken --dirty --tags` && \
echo Docker Compose: `docker-compose version --short` && \
sudo docker version -f 'Docker Client/Server: {{.Client.Version}}/{{.Server.Version}}'


musicbrainz-docker's Issues

Error when Indexing the database

After cloning this repo and following all the steps, when I have to index the database I run
sudo docker-compose run --rm indexer /home/search/index.sh

But I always get the same error

ubuntu@ip-172-00-00-0:~/docker_server/musicbrainz-docker$ sudo docker-compose run --rm indexer /home/search/index.sh
Index Builder Started:21:07:35
org.postgresql.util.PSQLException: FATAL: database "musicbrainz_db" does not exist
    at org.postgresql.core.v3.ConnectionFactoryImpl.readStartupMessages(ConnectionFactoryImpl.java:444)
    at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:99)
    at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:66)
    at org.postgresql.jdbc2.AbstractJdbc2Connection.<init>(AbstractJdbc2Connection.java:124)
    at org.postgresql.jdbc3.AbstractJdbc3Connection.<init>(AbstractJdbc3Connection.java:30)
    at org.postgresql.jdbc4.AbstractJdbc4Connection.<init>(AbstractJdbc4Connection.java:29)
    at org.postgresql.jdbc4.Jdbc4Connection.<init>(Jdbc4Connection.java:24)
    at org.postgresql.Driver.makeConnection(Driver.java:386)
    at org.postgresql.Driver.connect(Driver.java:260)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:208)
    at org.musicbrainz.search.index.IndexOptions.getMainDatabaseConnection(IndexOptions.java:52)
    at org.musicbrainz.search.index.IndexBuilder.main(IndexBuilder.java:110)
Exception in thread "main" java.lang.NullPointerException
    at org.musicbrainz.search.index.DatabaseIndex.readReplicationInformationFromDatabase(DatabaseIndex.java:112)
    at org.musicbrainz.search.index.IndexBuilder.main(IndexBuilder.java:147)

I've been trying to see where the database is created and change the name with no luck.
I'm pretty sure it must be a pretty simple fix.

Could someone please help me out with this one?
Thanks!!!

Use Debian for base image?

How would you feel about moving from ubuntu:xenial to debian:jessie as the base image? This would probably result in smaller images with less unnecessary software installed.

update to postgres 9.5

Latest replication data requires migration to postgres 9.5

(and I just got it all working too, le sigh)

Specific lookup errors with new SOLR search server

The new SOLR search server works fine now (one known exception: annotations).

During my actual tagging session with Picard v2.1.3 and the local SOLR search server, I have found a few lookup queries that return only the following error:

Oops, something went wrong!

Error message: (No details about this error are available)

Time: 2019-08-15 16:56:17 UTC

Host: 72e66089454d

Interface language: en

URL: http://192.168.1.61:5000/taglookup?artist=Various%20Artists&release=Time-Life%20AM%20Gold%201963&track=&tracknum=&duration=&filename=&tport=8000

Request data:

$VAR1 = {
'body_parameters' => {},
'query_parameters' => {
'track' => '',
'duration' => '',
'artist' => 'Various Artists',
'filename' => '',
'tracknum' => '',
'release' => 'Time-Life AM Gold 1963',
'tport' => '8000'
}
};

We're terribly sorry for this problem. Please wait a few minutes and repeat your request — the problem may go away.

Other queries from the same Picard to the same SOLR work perfectly. Only a very few produce the above error. Maybe someone could have a look at the syntax or other possible reasons?
Hint: The same lookups against "musicbrainz.org" instead of "192.168.1.61:5000" run without error.

Examples for above error:
http://192.168.1.61:5000/taglookup?artist=Various%20Artists&release=Time-Life%20AM%20Gold%201963&track=&tracknum=&duration=&filename=&tport=8000

#2 Example:
http://192.168.1.61:5000/search/textsearch?limit=25&type=release&query=time%20life%2080s%20music%20explosion&tport=8000

#3 Example:
http://192.168.1.61:5000/taglookup?artist=Various&release=Body%20%26%20Soul%20(disc%201)&track=&tracknum=&duration=&filename=&tport=8000

Null pointer exception when building search indexes

I'm getting this error when trying to build the database indexes:

[ec2-user@ip-172-30-2-201 musicbrainz-docker]$ docker-compose run --rm indexer /home/search/index.sh
Index Builder Started:18:28:22
recording: Unable to get replication information
tmp_artistcredit:Started at:18:28:23
tmp_artistcredit:Finished:0.0 secs
tmp_artistcredit:Created Indexes:0.0 secs
tmp_release :Started at:18:28:23
tmp_release :Finished:0.0 secs
tmp_release :Created Indexes:0.0 secs
tmp_release_event :Started at:18:28:23
tmp_release_event :Finished:0.0 secs
tmp_release_event :Created Indexes:0.0 secs
tmp_track :Started at:18:28:23
tmp_track :Finished:0.0 secs
tmp_track :Created Indexes0.0 secs
recording:Started at 18:28:23
Exception in thread "main" java.lang.NullPointerException
at org.musicbrainz.search.index.DatabaseIndex.addMetaInformation(DatabaseIndex.java:151)
at org.musicbrainz.search.index.IndexBuilder.buildDatabaseIndex(IndexBuilder.java:265)
at org.musicbrainz.search.index.IndexBuilder.main(IndexBuilder.java:164)
[ec2-user@ip-172-30-2-201 musicbrainz-docker]$

I don't know what to do; any answer would be appreciated.

Thank you
Nico

Logging for server

Where do logs for the server get written to? I'd like to see those using docker logs but the only logging I see is the server startup logs.

Unable to index DB

Here's the output when attempting to index:

musicbrainz:~/musicbrainz-docker$ sudo docker-compose run --rm indexer /home/search/index.sh
ERROR: No such service: indexer

In the latest docker-compose.yml, the indexer section was removed from the file but the "index.sh" script doesn't appear to be in the search container.

Unable to build latest container -- Error in /musicbrainz-server/script/compile_resources.sh

Running: docker-compose run --rm musicbrainz /createdb.sh -fetch

Error:

Step 22 : RUN /musicbrainz-server/script/compile_resources.sh
 ---> Running in f6099aa17eda
Use of uninitialized value $ENV{"DB_ENV_POSTGRES_USER"} in string at /musicbrainz-server/admin/../lib/DBDefs.pm line 51.
Use of uninitialized value $ENV{"DB_ENV_POSTGRES_PASSWORD"} in string at /musicbrainz-server/admin/../lib/DBDefs.pm line 51.
Use of uninitialized value $ENV{"DB_PORT_5432_TCP_ADDR"} in string at /musicbrainz-server/admin/../lib/DBDefs.pm line 51.
Use of uninitialized value $ENV{"DB_PORT_5432_TCP_PORT"} in string at /musicbrainz-server/admin/../lib/DBDefs.pm line 51.
Use of uninitialized value $ENV{"DB_ENV_POSTGRES_USER"} in string at /musicbrainz-server/admin/../lib/DBDefs.pm line 51.
Use of uninitialized value $ENV{"DB_ENV_POSTGRES_PASSWORD"} in string at /musicbrainz-server/admin/../lib/DBDefs.pm line 51.
Use of uninitialized value $ENV{"DB_PORT_5432_TCP_ADDR"} in string at /musicbrainz-server/admin/../lib/DBDefs.pm line 51.
Use of uninitialized value $ENV{"DB_PORT_5432_TCP_PORT"} in string at /musicbrainz-server/admin/../lib/DBDefs.pm line 51.
Use of uninitialized value $ENV{"DB_ENV_POSTGRES_USER"} in string at /musicbrainz-server/admin/../lib/DBDefs.pm line 51.
Use of uninitialized value $ENV{"DB_ENV_POSTGRES_PASSWORD"} in string at /musicbrainz-server/admin/../lib/DBDefs.pm line 51.
Use of uninitialized value $ENV{"DB_PORT_5432_TCP_ADDR"} in string at /musicbrainz-server/admin/../lib/DBDefs.pm line 51.
Use of uninitialized value $ENV{"DB_PORT_5432_TCP_PORT"} in string at /musicbrainz-server/admin/../lib/DBDefs.pm line 51.
Attribute (port) does not pass the type constraint because: Validation failed for 'Int' with value "" at constructor MusicBrainz::Server::Database::new (defined at /musicbrainz-server/admin/../lib/MusicBrainz/Server/Database.pm line 53) line 52
        MusicBrainz::Server::Database::new('MusicBrainz::Server::Database', 'HASH(0x3c62b00)') called at /musicbrainz-server/admin/../lib/MusicBrainz/Server/DatabaseConnectionFactory.pm line 19
        MusicBrainz::Server::DatabaseConnectionFactory::register_databases('MusicBrainz::Server::DatabaseConnectionFactory', 'READWRITE', 'HASH(0x3c55f00)', 'TEST', 'HASH(0x3c560b0)', 'READONLY', 'HASH(0x3c62b00)', 'SYSTEM', 'HASH(0x3c627a0)') called at /musicbrainz-server/admin/../lib/DBDefs.pm line 51
        require DBDefs.pm at admin/ShowDBDefs line 38
        main::BEGIN at /musicbrainz-server/admin/../lib/DBDefs.pm line 0
        eval {...} at /musicbrainz-server/admin/../lib/DBDefs.pm line 0
Compilation failed in require at admin/ShowDBDefs line 38.
BEGIN failed--compilation aborted at admin/ShowDBDefs line 38.
/musicbrainz-server/root/static/gulpfile.js:4
  throw new Error('error: DEVELOPMENT_SERVER should be set to either 0 or 1');
  ^

Error: error: DEVELOPMENT_SERVER should be set to either 0 or 1
    at Object.<anonymous> (gulpfile.js:2:9)
    at Module._compile (module.js:413:34)
    at loader (/musicbrainz-server/node_modules/babel-register/lib/node.js:130:5)
    at Object.require.extensions.(anonymous function) [as .js] (/musicbrainz-server/node_modules/babel-register/lib/node.js:140:7)
    at Module.load (module.js:357:32)
    at Function.Module._load (module.js:314:12)
    at Module.require (module.js:367:17)
    at require (internal/module.js:20:19)
    at Object.<anonymous> (/musicbrainz-server/gulpfile.js:2:1)
    at Module._compile (module.js:413:34)

docker-compose.yml:

postgresqldata:
  build: data-dockerfile
  restart: always
  volumes:
   - /datadump:/media/dbdump:rw

postgresql:
  build: postgres-dockerfile
  restart: always
  env_file:
   - ./postgres-dockerfile/postgres.env
  expose:
   - "5432"

musicbrainz:
  build: musicbrainz-dockerfile
  ports:
   - "5000:5000"
  volumes_from:
   - postgresqldata
  restart: always
  environment:
   - MUSICBRAINZ_USE_PROXY=1
   - DEVELOPMENT_SERVER=1
  expose:
   - "5000"
  links:
   - postgresql:db
$ docker --version
Docker version 1.11.1, build 5604cbe

$ docker-compose --version
docker-compose version 1.7.1, build 0a9ab35

DBDEFs.pm has sub DEVELOPMENT_SERVER { 0 }

ERROR: database "musicbrainz" already exists / createdb.sh

I followed your instructions, but whatever I do, I can't get past the "Create Database" step:

ubuntu@ubuntu:~/musicbrainz-docker$ sudo docker-compose run --rm musicbrainz /createdb.sh
found existing dumps
Tue Aug  1 19:37:07 2017 : InitDb.pl starting
Tue Aug  1 19:37:07 2017 : Creating database 'musicbrainz'
Failed query:
	'CREATE DATABASE musicbrainz WITH OWNER = musicbrainz TEMPLATE template0 ENCODING = 'UNICODE' LC_CTYPE='C' LC_COLLATE='C''
	()
42P04 DBD::Pg::st execute failed: ERROR:  database "musicbrainz" already exists [for Statement "CREATE DATABASE musicbrainz WITH OWNER = musicbrainz TEMPLATE template0 ENCODING = 'UNICODE' LC_CTYPE='C' LC_COLLATE='C'"]
 at /musicbrainz-server/admin/../lib/Sql.pm line 115.
	Sql::catch {...} (MusicBrainz::Server::Exceptions::DatabaseError=HASH(0x2d33d40)) called at /usr/share/perl5/Try/Tiny.pm line 115
	Try::Tiny::try(CODE(0x2c49120), Try::Tiny::Catch=REF(0x2b556c8)) called at /musicbrainz-server/admin/../lib/Sql.pm line 116
	Sql::do(Sql=HASH(0x2afdb10), "CREATE DATABASE musicbrainz WITH OWNER = musicbrainz TEMPLATE"...) called at /musicbrainz-server/admin/InitDb.pl line 218
	main::Create("MAINTENANCE") called at /musicbrainz-server/admin/InitDb.pl line 523
Tue Aug  1 19:37:07 2017 : InitDb.pl failed
ubuntu@ubuntu:~/musicbrainz-docker$ 

Where should I change which lines to import the existing data dumps?
Why does the database already exist if I have not executed the "createdb.sh" before?

Error when creating the database with createdb script

Hi,

When I attempt to create the database with sudo docker-compose run --rm musicbrainz /createdb.sh -fetch I get the following error:

Error in tempdir() using /media/dbdump/tmp/MBImport-XXXXXXXX: Parent directory (/media/dbdump/tmp) does not exist at /musicbrainz-server/admin/MBImport.pl line 132.

Someone experiencing the same issue?

Thanks!

Some database setup output statements

When setting up the database and running the server against it, I am seeing some interesting output. Wondering if it's an issue or not...

PostgreSQL init process complete; ready for start up.

LOG:  database system was shut down at 2015-12-21 19:33:08 UTC
LOG:  MultiXact member wraparound protections are now enabled
LOG:  database system is ready to accept connections
LOG:  autovacuum launcher started
ERROR:  could not open extension control file "/usr/share/postgresql/9.4/extension/plperlu.control": No such file or directory
STATEMENT:  CREATE EXTENSION "plperlu";

Seen once during start

...

LOG:  checkpoints are occurring too frequently (4 seconds apart)
HINT:  Consider increasing the configuration parameter "checkpoint_segments".
LOG:  checkpoints are occurring too frequently (3 seconds apart)
HINT:  Consider increasing the configuration parameter "checkpoint_segments".
LOG:  checkpoints are occurring too frequently (3 seconds apart)
HINT:  Consider increasing the configuration parameter "checkpoint_segments".
LOG:  checkpoints are occurring too frequently (4 seconds apart)
HINT:  Consider increasing the configuration parameter "checkpoint_segments".
LOG:  checkpoints are occurring too frequently (6 seconds apart)

Repeated multiple times

...

ERROR:  constraint "dbmirror_pendingdata_SeqId" of relation "dbmirror_pendingdata" does not exist
STATEMENT:  ALTER TABLE dbmirror_pendingdata DROP CONSTRAINT "dbmirror_pendingdata_SeqId" CASCADE
WARNING:  SET TRANSACTION can only be used in transaction blocks
WARNING:  SET CONSTRAINTS can only be used in transaction blocks

Repeated multiple times

Container not running error

Any ideas on how to fix this? (Running in Win10pro 64)

src\musicbrainz-docker>docker-compose -f docker-compose.yml -f docker-compose.public.yml up -d
musicbrainz-docker_redis_1 is up-to-date
musicbrainz-docker_search_1 is up-to-date
musicbrainz-docker_db_1 is up-to-date
musicbrainz-docker_musicbrainz_1 is up-to-date
Starting musicbrainz-docker_indexer_1 ... done                                                                          
C:\Users\rob\src\musicbrainz-docker>docker exec musicbrainzdocker_musicbrainz_1 /set-token.sh MY_TOKEN
**Error: No such container: musicbrainzdocker_musicbrainz_1**

Database tuning

tweaking of shared_buffers and other postgresql settings can often be useful; the VM image does some of this.

JSON.pm include error when running ./admin/replication/LoadReplicationChanges

I'm trying to update the db to schema 24 and I'm getting the following error when running both ./admin/replication/LoadReplicationChanges and ./update.sh from within the musicbrainz_docker_musicbrainz_1 container:

Can't locate JSON.pm in @INC (you may need to install the JSON module) (@INC contains: /musicbrainz-server/admin/replication/../../lib /etc/perl /usr/local/lib/x86_64-linux-gnu/perl/5.22.1 /usr/local/share/perl/5.22.1 /usr/lib/x86_64-linux-gnu/perl5/5.22 /usr/share/perl5 /usr/lib/x86_64-linux-gnu/perl/5.22 /usr/share/perl/5.22 /usr/local/lib/site_perl /usr/lib/x86_64-linux-gnu/perl-base .) at /musicbrainz-server/admin/replication/../../lib/DBDefs/Default.pm line 32.
BEGIN failed--compilation aborted at /musicbrainz-server/admin/replication/../../lib/DBDefs/Default.pm line 32.
Compilation failed in require at /usr/share/perl/5.22/parent.pm line 20.
BEGIN failed--compilation aborted at /musicbrainz-server/admin/replication/../../lib/DBDefs.pm line 29.
Compilation failed in require at ./admin/replication/LoadReplicationChanges line 32.
BEGIN failed--compilation aborted at ./admin/replication/LoadReplicationChanges line 32.

./upgrade.sh

Can't locate JSON.pm in @INC (you may need to install the JSON module) (@INC contains: lib /etc/perl /usr/local/lib/x86_64-linux-gnu/perl/5.22.1 /usr/local/share/perl/5.22.1 /usr/lib/x86_64-linux-gnu/perl5/5.22 /usr/share/perl5 /usr/lib/x86_64-linux-gnu/perl/5.22 /usr/share/perl/5.22 /usr/local/lib/site_perl /usr/lib/x86_64-linux-gnu/perl-base .) at lib/DBDefs/Default.pm line 32.
BEGIN failed--compilation aborted at lib/DBDefs/Default.pm line 32.
Compilation failed in require at /usr/share/perl/5.22/parent.pm line 20.
BEGIN failed--compilation aborted at lib/DBDefs.pm line 29.
Compilation failed in require at -e line 1.
BEGIN failed--compilation aborted at -e line 1.

Is there a step missing in the setup, or does something need to be added to the PATH env var?

Thanks,
Greg

Accessing database from the terminal

I run the Docker setup, and I can successfully access the MusicBrainz web service through localhost:5000.

I can also connect to the database through the terminal by using:

psql -h localhost -U musicbrainz -p 5432

However, \dt shows "Did not find any relations". Am I missing some step in accessing the database which runs in Docker?

Failed to Create Database

Running Docker on Ubuntu 14.04.1 LTS (Trusty) host system.

$ docker --version
Docker version 1.4.1, build 5bc2ff8

After completing the steps to create the musicbrainz and postgresql containers, creating the database with the command "./createdb.sh" within the MusicBrainz container failed with the following error output:

...
mbdump/work
mbdump/work_alias
mbdump/work_alias_type
mbdump/work_attribute
mbdump/work_attribute_type
mbdump/work_attribute_type_allowed_value
mbdump/work_gid_redirect
mbdump/work_type
Mon Jan 19 02:25:02 2015 : Validating snapshot
Mon Jan 19 02:25:02 2015 : Snapshot timestamp is 2015-01-17 02:40:29.631492+00
Mon Jan 19 02:25:02 2015 : This snapshot corresponds to replication sequence #83381
Mon Jan 19 02:25:02 2015 : starting import
Table Rows est% rows/sec
No data file found for 'artist_rating_raw', skipping
No data file found for 'artist_tag_raw', skipping
Mon Jan 19 02:25:03 2015 : load cdtoc_raw
cdtoc_raw 0 0% 0Error loading /tmp/MBImport-oLaO27uh/mbdump/cdtoc_raw: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 1.
No data file found for 'edit', skipping
No data file found for 'edit_area', skipping
No data file found for 'edit_artist', skipping
No data file found for 'edit_event', skipping
No data file found for 'edit_instrument', skipping
No data file found for 'edit_label', skipping
No data file found for 'edit_note', skipping
No data file found for 'edit_place', skipping
No data file found for 'edit_recording', skipping
No data file found for 'edit_release', skipping
No data file found for 'edit_release_group', skipping
No data file found for 'edit_series', skipping
No data file found for 'edit_url', skipping
No data file found for 'edit_work', skipping
No data file found for 'event_tag_raw', skipping
No data file found for 'label_rating_raw', skipping
No data file found for 'label_tag_raw', skipping
No data file found for 'place_tag_raw', skipping
No data file found for 'recording_rating_raw', skipping
No data file found for 'recording_tag_raw', skipping
No data file found for 'release_group_rating_raw', skipping
No data file found for 'release_group_tag_raw', skipping
No data file found for 'release_tag_raw', skipping
Mon Jan 19 02:25:03 2015 : load release_raw
release_raw 0 0% 0Error loading /tmp/MBImport-oLaO27uh/mbdump/release_raw: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 2.
Mon Jan 19 02:25:03 2015 : load series
series 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/series: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 3.
Mon Jan 19 02:25:03 2015 : load series_alias
series_alias 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/series_alias: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 4.
Mon Jan 19 02:25:03 2015 : load series_alias_type
series_alias_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/series_alias_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 5.
Mon Jan 19 02:25:03 2015 : load series_annotation
series_annotation 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/series_annotation: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 6.
Mon Jan 19 02:25:03 2015 : load series_gid_redirect
series_gid_redirect 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/series_gid_redirect: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 7.
Mon Jan 19 02:25:03 2015 : load series_ordering_type
series_ordering_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/series_ordering_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 8.
Mon Jan 19 02:25:03 2015 : load series_type
series_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/series_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 9.
Mon Jan 19 02:25:03 2015 : load series_tag
series_tag 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/series_tag: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 10.
No data file found for 'series_tag_raw', skipping
Mon Jan 19 02:25:03 2015 : load track_raw
track_raw 0 0% 0Error loading /tmp/MBImport-oLaO27uh/mbdump/track_raw: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 11.
No data file found for 'vote', skipping
No data file found for 'work_rating_raw', skipping
No data file found for 'work_tag_raw', skipping
Mon Jan 19 02:25:03 2015 : load annotation
annotation 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/annotation: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 12.
No data file found for 'application', skipping
Mon Jan 19 02:25:03 2015 : load area
area 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/area: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 13.
Mon Jan 19 02:25:03 2015 : load area_type
area_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/area_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 14.
Mon Jan 19 02:25:03 2015 : load area_alias
area_alias 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/area_alias: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 15.
Mon Jan 19 02:25:03 2015 : load area_alias_type
area_alias_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/area_alias_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 16.
Mon Jan 19 02:25:03 2015 : load area_annotation
area_annotation 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/area_annotation: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 17.
Mon Jan 19 02:25:03 2015 : load area_gid_redirect
area_gid_redirect 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/area_gid_redirect: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 18.
Mon Jan 19 02:25:03 2015 : load area_tag
area_tag 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/area_tag: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 19.
No data file found for 'area_tag_raw', skipping
Mon Jan 19 02:25:03 2015 : load country_area
country_area 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/country_area: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 20.
Mon Jan 19 02:25:03 2015 : load iso_3166_1
iso_3166_1 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/iso_3166_1: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 21.
Mon Jan 19 02:25:03 2015 : load iso_3166_2
iso_3166_2 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/iso_3166_2: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 22.
Mon Jan 19 02:25:03 2015 : load iso_3166_3
iso_3166_3 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/iso_3166_3: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 23.
Mon Jan 19 02:25:03 2015 : load artist
artist 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/artist: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 24.
Mon Jan 19 02:25:03 2015 : load artist_alias
artist_alias 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/artist_alias: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 25.
Mon Jan 19 02:25:03 2015 : load artist_alias_type
artist_alias_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/artist_alias_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 26.
Mon Jan 19 02:25:03 2015 : load artist_annotation
artist_annotation 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/artist_annotation: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 27.
Mon Jan 19 02:25:03 2015 : load artist_credit
artist_credit 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/artist_credit: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 28.
Mon Jan 19 02:25:03 2015 : load artist_credit_name
artist_credit_name 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/artist_credit_name: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 29.
Mon Jan 19 02:25:03 2015 : load artist_gid_redirect
artist_gid_redirect 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/artist_gid_redirect: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 30.
Mon Jan 19 02:25:03 2015 : load artist_ipi
artist_ipi 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/artist_ipi: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 31.
Mon Jan 19 02:25:03 2015 : load artist_isni
artist_isni 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/artist_isni: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 32.
Mon Jan 19 02:25:03 2015 : load artist_meta
artist_meta 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/artist_meta: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 33.
Mon Jan 19 02:25:03 2015 : load artist_tag
artist_tag 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/artist_tag: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 34.
Mon Jan 19 02:25:03 2015 : load artist_type
artist_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/artist_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 35.
No data file found for 'autoeditor_election', skipping
No data file found for 'autoeditor_election_vote', skipping
Mon Jan 19 02:25:03 2015 : load cdtoc
cdtoc 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/cdtoc: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 36.
No data file found for 'editor', skipping
No data file found for 'editor_oauth_token', skipping
No data file found for 'editor_preference', skipping
No data file found for 'editor_language', skipping
No data file found for 'editor_sanitised', skipping
No data file found for 'editor_subscribe_artist', skipping
No data file found for 'editor_subscribe_collection', skipping
No data file found for 'editor_subscribe_editor', skipping
No data file found for 'editor_subscribe_label', skipping
No data file found for 'editor_subscribe_series', skipping
No data file found for 'editor_watch_artist', skipping
No data file found for 'editor_watch_preferences', skipping
No data file found for 'editor_watch_release_group_type', skipping
No data file found for 'editor_watch_release_status', skipping
Mon Jan 19 02:25:03 2015 : load event
event 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/event: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 37.
Mon Jan 19 02:25:03 2015 : load event_alias
event_alias 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/event_alias: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 38.
Mon Jan 19 02:25:03 2015 : load event_alias_type
event_alias_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/event_alias_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 39.
Mon Jan 19 02:25:03 2015 : load event_annotation
event_annotation 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/event_annotation: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 40.
Mon Jan 19 02:25:03 2015 : load event_gid_redirect
event_gid_redirect 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/event_gid_redirect: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 41.
Mon Jan 19 02:25:03 2015 : load event_tag
event_tag 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/event_tag: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 42.
Mon Jan 19 02:25:03 2015 : load event_type
event_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/event_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 43.
Mon Jan 19 02:25:03 2015 : load gender
gender 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/gender: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 44.
Mon Jan 19 02:25:03 2015 : load instrument
instrument 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/instrument: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 45.
Mon Jan 19 02:25:03 2015 : load instrument_alias
instrument_alias 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/instrument_alias: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 46.
Mon Jan 19 02:25:03 2015 : load instrument_alias_type
instrument_alias_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/instrument_alias_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 47.
Mon Jan 19 02:25:03 2015 : load instrument_annotation
instrument_annotation 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/instrument_annotation: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 48.
Mon Jan 19 02:25:03 2015 : load instrument_gid_redirect
instrument_gid_redirect 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/instrument_gid_redirect: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 49.
Mon Jan 19 02:25:03 2015 : load instrument_type
instrument_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/instrument_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 50.
Mon Jan 19 02:25:03 2015 : load instrument_tag
No data file found for 'instrument_tag_raw', skipping
Mon Jan 19 02:25:03 2015 : load isrc
isrc 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/isrc: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 51.
Mon Jan 19 02:25:03 2015 : load iswc
iswc 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/iswc: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 52.
Mon Jan 19 02:25:03 2015 : load l_area_area
l_area_area 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_area_area: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 53.
Mon Jan 19 02:25:03 2015 : load l_area_artist
Mon Jan 19 02:25:03 2015 : load l_area_event
l_area_event 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_area_event: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 54.
Mon Jan 19 02:25:03 2015 : load l_area_instrument
l_area_instrument 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_area_instrument: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 55.
Mon Jan 19 02:25:03 2015 : load l_area_label
Mon Jan 19 02:25:03 2015 : load l_area_place
Mon Jan 19 02:25:03 2015 : load l_area_recording
l_area_recording 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_area_recording: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 56.
Mon Jan 19 02:25:03 2015 : load l_area_release
l_area_release 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_area_release: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 57.
Mon Jan 19 02:25:03 2015 : load l_area_release_group
Mon Jan 19 02:25:03 2015 : load l_area_series
Mon Jan 19 02:25:03 2015 : load l_area_url
l_area_url 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_area_url: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 58.
Mon Jan 19 02:25:03 2015 : load l_area_work
l_area_work 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_area_work: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 59.
Mon Jan 19 02:25:03 2015 : load l_artist_artist
l_artist_artist 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_artist_artist: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 60.
Mon Jan 19 02:25:03 2015 : load l_artist_event
l_artist_event 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_artist_event: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 61.
Mon Jan 19 02:25:03 2015 : load l_artist_instrument
Mon Jan 19 02:25:03 2015 : load l_artist_label
l_artist_label 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_artist_label: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 62.
Mon Jan 19 02:25:03 2015 : load l_artist_place
l_artist_place 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_artist_place: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 63.
Mon Jan 19 02:25:03 2015 : load l_artist_recording
l_artist_recording 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_artist_recording: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 64.
Mon Jan 19 02:25:03 2015 : load l_artist_release
l_artist_release 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_artist_release: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 65.
Mon Jan 19 02:25:03 2015 : load l_artist_release_group
l_artist_release_group 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_artist_release_group: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 66.
Mon Jan 19 02:25:03 2015 : load l_artist_series
l_artist_series 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_artist_series: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 67.
Mon Jan 19 02:25:03 2015 : load l_artist_url
l_artist_url 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_artist_url: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 68.
Mon Jan 19 02:25:03 2015 : load l_artist_work
l_artist_work 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_artist_work: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 69.
Mon Jan 19 02:25:03 2015 : load l_event_event
l_event_event 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_event_event: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 70.
Mon Jan 19 02:25:03 2015 : load l_event_instrument
Mon Jan 19 02:25:03 2015 : load l_event_label
Mon Jan 19 02:25:03 2015 : load l_event_place
l_event_place 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_event_place: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 71.
Mon Jan 19 02:25:03 2015 : load l_event_recording
l_event_recording 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_event_recording: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 72.
Mon Jan 19 02:25:03 2015 : load l_event_release
l_event_release 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_event_release: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 73.
Mon Jan 19 02:25:03 2015 : load l_event_release_group
l_event_release_group 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_event_release_group: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 74.
Mon Jan 19 02:25:03 2015 : load l_event_series
l_event_series 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_event_series: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 75.
Mon Jan 19 02:25:03 2015 : load l_event_url
l_event_url 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_event_url: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 76.
Mon Jan 19 02:25:03 2015 : load l_event_work
Mon Jan 19 02:25:03 2015 : load l_instrument_instrument
l_instrument_instrument 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_instrument_instrument: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 77.
Mon Jan 19 02:25:03 2015 : load l_instrument_label
Mon Jan 19 02:25:03 2015 : load l_instrument_place
Mon Jan 19 02:25:03 2015 : load l_instrument_recording
Mon Jan 19 02:25:03 2015 : load l_instrument_release
Mon Jan 19 02:25:03 2015 : load l_instrument_release_group
Mon Jan 19 02:25:03 2015 : load l_instrument_series
Mon Jan 19 02:25:03 2015 : load l_instrument_url
l_instrument_url 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_instrument_url: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 78.
Mon Jan 19 02:25:03 2015 : load l_instrument_work
Mon Jan 19 02:25:03 2015 : load l_label_label
l_label_label 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_label_label: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 79.
Mon Jan 19 02:25:03 2015 : load l_label_place
Mon Jan 19 02:25:03 2015 : load l_label_recording
l_label_recording 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_label_recording: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 80.
Mon Jan 19 02:25:03 2015 : load l_label_release
l_label_release 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_label_release: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 81.
Mon Jan 19 02:25:03 2015 : load l_label_release_group
Mon Jan 19 02:25:03 2015 : load l_label_series
Mon Jan 19 02:25:03 2015 : load l_label_url
l_label_url 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_label_url: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 82.
Mon Jan 19 02:25:03 2015 : load l_label_work
l_label_work 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_label_work: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 83.
Mon Jan 19 02:25:03 2015 : load l_place_place
l_place_place 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_place_place: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 84.
Mon Jan 19 02:25:03 2015 : load l_place_recording
l_place_recording 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_place_recording: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 85.
Mon Jan 19 02:25:03 2015 : load l_place_release
l_place_release 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_place_release: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 86.
Mon Jan 19 02:25:03 2015 : load l_place_release_group
Mon Jan 19 02:25:03 2015 : load l_place_series
Mon Jan 19 02:25:03 2015 : load l_place_url
l_place_url 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_place_url: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 87.
Mon Jan 19 02:25:03 2015 : load l_place_work
l_place_work 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_place_work: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 88.
Mon Jan 19 02:25:03 2015 : load l_recording_recording
l_recording_recording 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_recording_recording: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 89.
Mon Jan 19 02:25:03 2015 : load l_recording_release
l_recording_release 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_recording_release: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 90.
Mon Jan 19 02:25:03 2015 : load l_recording_release_group
Mon Jan 19 02:25:03 2015 : load l_recording_series
l_recording_series 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_recording_series: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 91.
Mon Jan 19 02:25:03 2015 : load l_recording_url
l_recording_url 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_recording_url: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 92.
Mon Jan 19 02:25:03 2015 : load l_recording_work
l_recording_work 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_recording_work: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 93.
Mon Jan 19 02:25:03 2015 : load l_release_group_release_group
l_release_group_release_group 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_release_group_release_group: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 94.
Mon Jan 19 02:25:03 2015 : load l_release_group_series
l_release_group_series 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_release_group_series: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 95.
Mon Jan 19 02:25:03 2015 : load l_release_group_url
l_release_group_url 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_release_group_url: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 96.
Mon Jan 19 02:25:03 2015 : load l_release_group_work
Mon Jan 19 02:25:03 2015 : load l_release_release
l_release_release 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_release_release: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 97.
Mon Jan 19 02:25:03 2015 : load l_release_release_group
Mon Jan 19 02:25:03 2015 : load l_release_series
l_release_series 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_release_series: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 98.
Mon Jan 19 02:25:03 2015 : load l_release_url
l_release_url 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_release_url: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 99.
Mon Jan 19 02:25:03 2015 : load l_release_work
Mon Jan 19 02:25:03 2015 : load l_series_series
l_series_series 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_series_series: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 100.
Mon Jan 19 02:25:03 2015 : load l_series_url
l_series_url 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_series_url: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 101.
Mon Jan 19 02:25:03 2015 : load l_series_work
l_series_work 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_series_work: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 102.
Mon Jan 19 02:25:03 2015 : load l_url_url
Mon Jan 19 02:25:03 2015 : load l_url_work
l_url_work 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_url_work: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 103.
Mon Jan 19 02:25:03 2015 : load l_work_work
l_work_work 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/l_work_work: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 104.
Mon Jan 19 02:25:03 2015 : load label
label 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/label: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 105.
Mon Jan 19 02:25:03 2015 : load label_alias
label_alias 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/label_alias: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 106.
Mon Jan 19 02:25:03 2015 : load label_alias_type
label_alias_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/label_alias_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 107.
Mon Jan 19 02:25:03 2015 : load label_annotation
label_annotation 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/label_annotation: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 108.
Mon Jan 19 02:25:03 2015 : load label_gid_redirect
label_gid_redirect 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/label_gid_redirect: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 109.
Mon Jan 19 02:25:03 2015 : load label_ipi
label_ipi 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/label_ipi: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 110.
Mon Jan 19 02:25:03 2015 : load label_isni
label_isni 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/label_isni: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 111.
Mon Jan 19 02:25:03 2015 : load label_meta
label_meta 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/label_meta: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 112.
Mon Jan 19 02:25:03 2015 : load label_tag
label_tag 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/label_tag: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 113.
Mon Jan 19 02:25:03 2015 : load label_type
label_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/label_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 114.
Mon Jan 19 02:25:03 2015 : load language
language 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/language: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 115.
Mon Jan 19 02:25:03 2015 : load link
link 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/link: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 116.
Mon Jan 19 02:25:03 2015 : load link_attribute
link_attribute 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/link_attribute: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 117.
Mon Jan 19 02:25:03 2015 : load link_attribute_credit
link_attribute_credit 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/link_attribute_credit: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 118.
Mon Jan 19 02:25:03 2015 : load link_attribute_text_value
link_attribute_text_value 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/link_attribute_text_value: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 119.
Mon Jan 19 02:25:03 2015 : load link_attribute_type
link_attribute_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/link_attribute_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 120.
Mon Jan 19 02:25:03 2015 : load link_creditable_attribute_type
link_creditable_attribute_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/link_creditable_attribute_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 121.
Mon Jan 19 02:25:03 2015 : load link_text_attribute_type
link_text_attribute_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/link_text_attribute_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 122.
Mon Jan 19 02:25:03 2015 : load link_type
link_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/link_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 123.
Mon Jan 19 02:25:03 2015 : load link_type_attribute_type
link_type_attribute_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/link_type_attribute_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 124.
No data file found for 'editor_collection', skipping
No data file found for 'editor_collection_event', skipping
No data file found for 'editor_collection_release', skipping
Mon Jan 19 02:25:03 2015 : load editor_collection_type
editor_collection_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/editor_collection_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 125.
Mon Jan 19 02:25:03 2015 : load medium
medium 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/medium: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 126.
Mon Jan 19 02:25:03 2015 : load medium_cdtoc
medium_cdtoc 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/medium_cdtoc: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 127.
Mon Jan 19 02:25:03 2015 : load medium_format
medium_format 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/medium_format: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 128.
Mon Jan 19 02:25:03 2015 : load orderable_link_type
orderable_link_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/orderable_link_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 129.
Mon Jan 19 02:25:03 2015 : load place
place 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/place: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 130.
Mon Jan 19 02:25:03 2015 : load place_alias
place_alias 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/place_alias: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 131.
Mon Jan 19 02:25:03 2015 : load place_alias_type
place_alias_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/place_alias_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 132.
Mon Jan 19 02:25:03 2015 : load place_annotation
place_annotation 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/place_annotation: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 133.
Mon Jan 19 02:25:03 2015 : load place_gid_redirect
place_gid_redirect 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/place_gid_redirect: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 134.
Mon Jan 19 02:25:03 2015 : load place_tag
place_tag 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/place_tag: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 135.
Mon Jan 19 02:25:03 2015 : load place_type
place_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/place_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 136.
Mon Jan 19 02:25:03 2015 : load recording
recording 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/recording: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 137.
Mon Jan 19 02:25:03 2015 : load recording_annotation
recording_annotation 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/recording_annotation: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 138.
Mon Jan 19 02:25:03 2015 : load recording_gid_redirect
recording_gid_redirect 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/recording_gid_redirect: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 139.
Mon Jan 19 02:25:03 2015 : load recording_meta
recording_meta 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/recording_meta: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 140.
Mon Jan 19 02:25:03 2015 : load recording_tag
recording_tag 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/recording_tag: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 141.
Mon Jan 19 02:25:03 2015 : load release
release 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/release: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 142.
Mon Jan 19 02:25:03 2015 : load release_annotation
release_annotation 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/release_annotation: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 143.
Mon Jan 19 02:25:03 2015 : load release_country
release_country 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/release_country: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 144.
Mon Jan 19 02:25:03 2015 : load release_gid_redirect
release_gid_redirect 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/release_gid_redirect: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 145.
Mon Jan 19 02:25:03 2015 : load release_group
release_group 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/release_group: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 146.
Mon Jan 19 02:25:03 2015 : load release_group_annotation
release_group_annotation 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/release_group_annotation: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 147.
Mon Jan 19 02:25:03 2015 : load release_group_gid_redirect
release_group_gid_redirect 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/release_group_gid_redirect: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 148.
Mon Jan 19 02:25:03 2015 : load release_group_meta
release_group_meta 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/release_group_meta: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 149.
Mon Jan 19 02:25:03 2015 : load release_group_tag
release_group_tag 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/release_group_tag: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 150.
Mon Jan 19 02:25:03 2015 : load release_group_primary_type
release_group_primary_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/release_group_primary_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 151.
Mon Jan 19 02:25:03 2015 : load release_group_secondary_type
release_group_secondary_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/release_group_secondary_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 152.
Mon Jan 19 02:25:03 2015 : load release_group_secondary_type_join
release_group_secondary_type_j 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/release_group_secondary_type_join: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 153.
Mon Jan 19 02:25:03 2015 : load release_label
release_label 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/release_label: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 154.
Mon Jan 19 02:25:03 2015 : load release_meta
release_meta 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/release_meta: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 155.
No data file found for 'release_coverart', skipping
Mon Jan 19 02:25:03 2015 : load release_packaging
release_packaging 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/release_packaging: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 156.
Mon Jan 19 02:25:03 2015 : load release_status
release_status 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/release_status: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 157.
Mon Jan 19 02:25:03 2015 : load release_tag
release_tag 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/release_tag: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 158.
Mon Jan 19 02:25:03 2015 : load release_unknown_country
release_unknown_country 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/release_unknown_country: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 159.
Mon Jan 19 02:25:03 2015 : load replication_control
replication_control 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/replication_control: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 160.
Mon Jan 19 02:25:03 2015 : load script
script 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/script: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 161.
Mon Jan 19 02:25:03 2015 : load tag
tag 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/tag: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 162.
Mon Jan 19 02:25:03 2015 : load tag_relation
tag_relation 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/tag_relation: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 163.
Mon Jan 19 02:25:03 2015 : load track
track 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/track: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 164.
Mon Jan 19 02:25:03 2015 : load track_gid_redirect
track_gid_redirect 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/track_gid_redirect: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 165.
Mon Jan 19 02:25:03 2015 : load medium_index
medium_index 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/medium_index: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 166.
Mon Jan 19 02:25:03 2015 : load url
url 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/url: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 167.
Mon Jan 19 02:25:03 2015 : load url_gid_redirect
url_gid_redirect 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/url_gid_redirect: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 168.
Mon Jan 19 02:25:03 2015 : load work
work 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/work: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 169.
Mon Jan 19 02:25:03 2015 : load work_alias
work_alias 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/work_alias: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 170.
Mon Jan 19 02:25:03 2015 : load work_alias_type
work_alias_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/work_alias_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 171.
Mon Jan 19 02:25:03 2015 : load work_annotation
work_annotation 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/work_annotation: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 172.
Mon Jan 19 02:25:03 2015 : load work_attribute
work_attribute 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/work_attribute: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 173.
Mon Jan 19 02:25:03 2015 : load work_attribute_type
work_attribute_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/work_attribute_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 174.
Mon Jan 19 02:25:03 2015 : load work_attribute_type_allowed_value
work_attribute_type_allowed_va 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/work_attribute_type_allowed_value: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 175.
Mon Jan 19 02:25:03 2015 : load work_gid_redirect
work_gid_redirect 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/work_gid_redirect: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 176.
Mon Jan 19 02:25:03 2015 : load work_meta
work_meta 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/work_meta: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 177.
Mon Jan 19 02:25:03 2015 : load work_tag
work_tag 0 0% 0Error loading /tmp/MBImport-zRwvw2_J/mbdump/work_tag: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 178.
Mon Jan 19 02:25:03 2015 : load work_type
work_type 0 0% 0Error loading /tmp/MBImport-NByCepaZ/mbdump/work_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 179.
Mon Jan 19 02:25:03 2015 : load cover_art_archive.art_type
cover_art_archive.art_type 0 0% 0Error loading /tmp/MBImport-Qv5G7pag/mbdump/cover_art_archive.art_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 180.
Mon Jan 19 02:25:03 2015 : load cover_art_archive.image_type
cover_art_archive.image_type 0 0% 0Error loading /tmp/MBImport-Qv5G7pag/mbdump/cover_art_archive.image_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 181.
Mon Jan 19 02:25:03 2015 : load cover_art_archive.cover_art
cover_art_archive.cover_art 0 0% 0Error loading /tmp/MBImport-Qv5G7pag/mbdump/cover_art_archive.cover_art: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 182.
Mon Jan 19 02:25:03 2015 : load cover_art_archive.cover_art_type
cover_art_archive.cover_art_ty 0 0% 0Error loading /tmp/MBImport-Qv5G7pag/mbdump/cover_art_archive.cover_art_type: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 183.
Mon Jan 19 02:25:04 2015 : load cover_art_archive.release_group_cover_art
cover_art_archive.release_grou 0 0% 0Error loading /tmp/MBImport-Qv5G7pag/mbdump/cover_art_archive.release_group_cover_art: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 184.
No data file found for 'documentation.l_area_area_example', skipping
No data file found for 'documentation.l_area_artist_example', skipping
No data file found for 'documentation.l_area_instrument_example', skipping
No data file found for 'documentation.l_area_label_example', skipping
No data file found for 'documentation.l_area_place_example', skipping
No data file found for 'documentation.l_area_recording_example', skipping
No data file found for 'documentation.l_area_release_example', skipping
No data file found for 'documentation.l_area_release_group_example', skipping
No data file found for 'documentation.l_area_series_example', skipping
No data file found for 'documentation.l_area_url_example', skipping
No data file found for 'documentation.l_area_work_example', skipping
No data file found for 'documentation.l_artist_artist_example', skipping
No data file found for 'documentation.l_artist_instrument_example', skipping
No data file found for 'documentation.l_artist_label_example', skipping
No data file found for 'documentation.l_artist_recording_example', skipping
No data file found for 'documentation.l_artist_release_example', skipping
No data file found for 'documentation.l_artist_release_group_example', skipping
No data file found for 'documentation.l_artist_place_example', skipping
No data file found for 'documentation.l_artist_series_example', skipping
No data file found for 'documentation.l_artist_url_example', skipping
No data file found for 'documentation.l_artist_work_example', skipping
No data file found for 'documentation.l_instrument_instrument_example', skipping
No data file found for 'documentation.l_instrument_label_example', skipping
No data file found for 'documentation.l_instrument_place_example', skipping
No data file found for 'documentation.l_instrument_recording_example', skipping
No data file found for 'documentation.l_instrument_release_example', skipping
No data file found for 'documentation.l_instrument_release_group_example', skipping
No data file found for 'documentation.l_instrument_series_example', skipping
No data file found for 'documentation.l_instrument_url_example', skipping
No data file found for 'documentation.l_instrument_work_example', skipping
No data file found for 'documentation.l_label_label_example', skipping
No data file found for 'documentation.l_label_recording_example', skipping
No data file found for 'documentation.l_label_release_example', skipping
No data file found for 'documentation.l_label_release_group_example', skipping
No data file found for 'documentation.l_label_place_example', skipping
No data file found for 'documentation.l_label_series_example', skipping
No data file found for 'documentation.l_label_url_example', skipping
No data file found for 'documentation.l_label_work_example', skipping
No data file found for 'documentation.l_place_place_example', skipping
No data file found for 'documentation.l_place_recording_example', skipping
No data file found for 'documentation.l_place_release_example', skipping
No data file found for 'documentation.l_place_release_group_example', skipping
No data file found for 'documentation.l_place_series_example', skipping
No data file found for 'documentation.l_place_url_example', skipping
No data file found for 'documentation.l_place_work_example', skipping
No data file found for 'documentation.l_recording_recording_example', skipping
No data file found for 'documentation.l_recording_release_example', skipping
No data file found for 'documentation.l_recording_release_group_example', skipping
No data file found for 'documentation.l_recording_series_example', skipping
No data file found for 'documentation.l_recording_url_example', skipping
No data file found for 'documentation.l_recording_work_example', skipping
No data file found for 'documentation.l_release_group_release_group_example', skipping
No data file found for 'documentation.l_release_group_series_example', skipping
No data file found for 'documentation.l_release_group_url_example', skipping
No data file found for 'documentation.l_release_group_work_example', skipping
No data file found for 'documentation.l_release_release_example', skipping
No data file found for 'documentation.l_release_release_group_example', skipping
No data file found for 'documentation.l_release_series_example', skipping
No data file found for 'documentation.l_release_url_example', skipping
No data file found for 'documentation.l_release_work_example', skipping
No data file found for 'documentation.l_series_series_example', skipping
No data file found for 'documentation.l_series_url_example', skipping
No data file found for 'documentation.l_series_work_example', skipping
No data file found for 'documentation.l_url_url_example', skipping
No data file found for 'documentation.l_url_work_example', skipping
No data file found for 'documentation.l_work_work_example', skipping
No data file found for 'documentation.link_type_documentation', skipping
Mon Jan 19 02:25:04 2015 : load statistics.statistic
statistics.statistic 0 0% 0Error loading /tmp/MBImport-h4xoWG14/mbdump/statistics.statistic: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 185.
Mon Jan 19 02:25:04 2015 : load statistics.statistic_event
statistics.statistic_event 0 0% 0Error loading /tmp/MBImport-h4xoWG14/mbdump/statistics.statistic_event: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 186.
Mon Jan 19 02:25:04 2015 : load wikidocs.wikidocs_index
wikidocs.wikidocs_index 0 0% 0Error loading /tmp/MBImport-1vm1e4pT/mbdump/wikidocs.wikidocs_index: DBD::Pg::db pg_putcopydata failed: no COPY in progress at /musicbrainz-server/admin/MBImport.pl line 289, line 187.
Mon Jan 19 02:25:04 2015 : ensuring editor information is present
Mon Jan 19 02:25:04 2015 : import finished
Loaded 0 tables (0 rows) in 1 seconds

Failed to import dataset.
Mon Jan 19 02:25:05 2015 : InitDb.pl failed

Possible issue with MBImport.pl

From a local merge of 34a4862

$ docker-compose run --rm musicbrainz /createdb.sh
no dumps found or dumps are incomplete
Mon May 30 18:56:05 2016 : InitDb.pl starting
Mon May 30 18:56:05 2016 : Creating database 'musicbrainz_db'
createlang: language "plpgsql" is already installed in database "musicbrainz_db"

Failed to create language plpgsql -- it's likely to be already installed, continuing.
createlang: language installation failed: ERROR:  could not open extension control file "/usr/share/postgresql/9.4/extension/plperlu.control": No such file or directory

Failed to create language plperlu -- it's likely to be already installed, continuing.
CREATE SCHEMA
CREATE SCHEMA
CREATE SCHEMA
CREATE SCHEMA
CREATE SCHEMA
CREATE SCHEMA
CREATE SCHEMA
Mon May 30 18:56:07 2016 : Installing extensions (Extensions.sql)
Mon May 30 18:56:08 2016 : Creating tables ... (CreateTables.sql)
Mon May 30 18:56:15 2016 : Creating tables ... (caa/CreateTables.sql)
Mon May 30 18:56:15 2016 : Creating documentation tables ... (documentation/CreateTables.sql)
Mon May 30 18:56:17 2016 : Creating tables ... (report/CreateTables.sql)
Mon May 30 18:56:18 2016 : Creating sitemaps tables ... (sitemaps/CreateTables.sql)
Mon May 30 18:56:18 2016 : Creating statistics tables ... (statistics/CreateTables.sql)
Mon May 30 18:56:18 2016 : Creating wikidocs tables ... (wikidocs/CreateTables.sql)
Usage: MBImport.pl [options] FILE ...

        --help            show this help
        --fix-broken-utf8 replace invalid UTF-8 byte sequences with a
                          special U+FFFD codepoint (UTF-8: 0xEF 0xBF 0xBD)
    -i, --ignore-errors   if a table fails to import, continue anyway
    -t, --tmp-dir DIR     use DIR for temporary storage (default: /tmp)
        --skip-editor     do not guarantee editor rows are present (useful when
                          importing single tables).
        --update-replication-control whether or not this import should
                          alter the replication control table. This flag is
                          internal and is only be set by MusicBrainz scripts
        --delete-first    If set, will delete from non-empty tables immediately
                          before importing

FILE can be any of: a regular file in Postgres "copy" format (as produced
by ExportAllTables --nocompress); a gzip'd or bzip2'd tar file of Postgres
"copy" files (as produced by ExportAllTables); a directory containing
Postgres "copy" files; or a directory containing an "mbdump" directory
containing Postgres "copy" files.

If any "tar" files are named, they are firstly all
decompressed to temporary directories (under the directory named by
--tmp-dir).  These directories are removed on exit.

This script then proceeds through all of the MusicBrainz known table names,
and processes each as follows: firstly the file to load for that table
is identified, by considering each named argument in turn to see if it
provides a file for this table; if no file is available, processing of this
table ends.

Then, if the database table is not empty and delete-first is not set, a warning
is generated, and processing of this table ends.  Otherwise, the file is loaded
into the table.  (Exception: the "moderator_santised" file, if present, is
loaded into the "moderator" table).

Note: The --fix-broken-utf8 is usefull when upgrading a database to
      Postgres 8.1.x and your old database includes byte sequences that are
      invalid in UTF-8. It does not really fix the data, because the
      original encoding can't be determined automatically. Instead it
      replaces the affected byte sequence with the special Unicode "replacement
      character" U+FFFD. A warning is printed on every such replacement.
Mon May 30 18:56:19 2016 : Creating primary keys ... (CreatePrimaryKeys.sql)
Mon May 30 18:56:28 2016 : Creating CAA primary keys ... (caa/CreatePrimaryKeys.sql)
Mon May 30 18:56:28 2016 : Creating documentation primary keys ... (documentation/CreatePrimaryKeys.sql)
Mon May 30 18:56:30 2016 : Creating statistics primary keys ... (statistics/CreatePrimaryKeys.sql)
Mon May 30 18:56:31 2016 : Creating wikidocs primary keys ... (wikidocs/CreatePrimaryKeys.sql)
Mon May 30 18:56:31 2016 : Creating search configuration ... (CreateSearchConfiguration.sql)
Mon May 30 18:56:31 2016 : Creating functions ... (CreateFunctions.sql)
Mon May 30 18:56:31 2016 : Creating CAA functions ... (caa/CreateFunctions.sql)
Mon May 30 18:56:31 2016 : Creating indexes ... (CreateIndexes.sql)
Mon May 30 18:56:39 2016 : Creating CAA indexes ... (caa/CreateIndexes.sql)
Mon May 30 18:56:39 2016 : Creating sitemaps indexes ... (sitemaps/CreateIndexes.sql)
Mon May 30 18:56:39 2016 : Creating statistics indexes ... (statistics/CreateIndexes.sql)
Mon May 30 18:56:40 2016 : Creating slave-only indexes ... (CreateSlaveIndexes.sql)
Mon May 30 18:56:40 2016 : Setting raw initial sequence values ... (SetSequences.sql)
Mon May 30 18:56:42 2016 : Setting raw initial statistics sequence values ... (statistics/SetSequences.sql)
Mon May 30 18:56:42 2016 : Creating views ... (CreateViews.sql)
Mon May 30 18:56:42 2016 : Creating CAA views ... (caa/CreateViews.sql)
Mon May 30 18:56:42 2016 : Creating search indexes ... (CreateSearchIndexes.sql)
Mon May 30 18:56:43 2016 : Setting up replication ... (ReplicationSetup.sql)
Mon May 30 18:56:43 2016 : Optimizing database ...
VACUUM
Mon May 30 18:56:44 2016 : Initialized and imported data into the database.
Mon May 30 18:56:44 2016 : InitDb.pl succeeded

Dependencies failed in installation

I run the command sudo docker-compose up -d and get this error:

! Installing the dependencies failed: Module 'Test::XPath' is not installed, Module 'HTML::HTML5::Sanity' is not installed, Module 'HTML::HTML5::Parser' is not installed, Module 'LWP::Protocol::https' is not installed
! Bailing out the installation for ..
319 distributions installed
ERROR: Service 'musicbrainz' failed to build: The command '/bin/sh -c cd /musicbrainz-server/ &&     eval $( perl -Mlocal::lib) && cpanm --installdeps --notest .' returned a non-zero code: 1

eval $( perl -Mlocal::lib ) may have no effect

Please correct me if I'm wrong about any of the following - this is just based on my understanding of how Docker works, which may be incorrect!

I haven't yet tested this, but line 24, where eval $( perl -Mlocal::lib ) is appended to the ~/.bashrc file, may have no effect on subsequent RUN commands. And the . ~/.bashrc part of that command will only affect the current RUN command; since nothing else executes after it in that command, it seems to be redundant.

https://github.com/jsturgis/musicbrainz-docker/blob/master/musicbrainz-dockerfile/Dockerfile#L24

RUN uses /bin/sh, which may not be set to use bash as the default shell.

I believe that this command is actually unnecessary - if its purpose is to prevent the Perl modules installed for MB from polluting the global Perl installation by directing the libraries into a separate folder (as it seems to be), that doesn't seem too important in a Docker image.
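
If it helps, a quick way to check whether local::lib is actually in effect in the built image is to compare the environment with and without the eval (a rough sketch, assuming the compose service is called musicbrainz as in the commands quoted on this page):

sudo docker-compose run --rm musicbrainz bash -c 'env | grep -i perl'
sudo docker-compose run --rm musicbrainz bash -c 'eval $( perl -Mlocal::lib ) && env | grep -i perl'

If the first command already shows PERL5LIB and the related PERL_* variables, the .bashrc line is indeed redundant; if only the second does, those variables are not reaching later RUN commands.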

Again, if I'm wrong about any of this, please let me know so that I can improve my Docker knowledge!

"Nice" CSS will not be loaded

I followed your instructions step by step.
After creating the search index I rebooted my Ubuntu machine and restarted the container with
sudo docker-compose up

I can access the local webserver at 192.168.1.35:5000, but I get text-only output; there is no "nice" CSS or color.

Somewhere I read that I have to change the line
sub WEB_SERVER { "localhost:5000" }
to
sub WEB_SERVER { "192.168.1.35:5000" }

Is this manual step still necessary?

Where exactly do I have to do that?
Inside the running "musicbrainzdocker_musicbrainz_1" container?
Inside the file /musicbrainz-server/lib/DBDefs.pm?
Anywhere else?
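
For what it's worth, here is a minimal sketch of how that change could be applied from the host, assuming the container name and file path mentioned above (musicbrainzdocker_musicbrainz_1 and /musicbrainz-server/lib/DBDefs.pm):

sudo docker exec musicbrainzdocker_musicbrainz_1 grep -n 'WEB_SERVER' /musicbrainz-server/lib/DBDefs.pm
sudo docker exec musicbrainzdocker_musicbrainz_1 sed -i 's|sub WEB_SERVER { "localhost:5000" }|sub WEB_SERVER { "192.168.1.35:5000" }|' /musicbrainz-server/lib/DBDefs.pm
sudo docker-compose restart musicbrainz

Note that an edit made this way lives only inside that container and will be lost when the container is recreated; a persistent fix needs to go into the image or the configuration used at build time.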

Issue when running recreatedb script

Hi,

I'm experiencing issues with the latest changes when running sudo docker-compose run --rm musicbrainz /recreatedb.sh

I'm getting this output in every run:

psql: warning: extra command-line argument "postgres" ignored
psql: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?

I tried different Postgres configuration changes but had no luck.
Is anyone else experiencing the same issue?

Thanks!

Strange log messages

I got strange log messages. Is this normal?

postgresql_1      | LOG:  autovacuum: found orphan temp table "pg_temp_3"."tmp_release" in database "musicbrainz_db"
postgresql_1      | LOG:  autovacuum: found orphan temp table "pg_temp_3"."tmp_release_event" in database "musicbrainz_db"
postgresql_1      | LOG:  autovacuum: found orphan temp table "pg_temp_3"."tmp_artistcredit" in database "musicbrainz_db"
postgresql_1      | LOG:  autovacuum: found orphan temp table "pg_temp_3"."tmp_release" in database "musicbrainz_db"
postgresql_1      | LOG:  autovacuum: found orphan temp table "pg_temp_3"."tmp_release_event" in database "musicbrainz_db"
postgresql_1      | LOG:  received smart shutdown request
postgresql_1      | LOG:  autovacuum launcher shutting down
postgresql_1      | LOG:  database system was interrupted; last known up at 2016-10-21 17:28:07 UTC
postgresql_1      | LOG:  database system was not properly shut down; automatic recovery in progress
postgresql_1      | LOG:  invalid record length at 2/84B511B0
postgresql_1      | LOG:  redo is not required
postgresql_1      | LOG:  MultiXact member wraparound protections are now enabled
postgresql_1      | LOG:  database system is ready to accept connections
postgresql_1      | LOG:  autovacuum launcher started

Use the latest DBDefs.pm file

We should use the version of DBDefs.pm that is pulled down during the image build step and update it to have the correct values. This will prevent DBDefs.pm from getting out of sync.
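
A rough sketch of that idea, run at build time inside the image (the sample file name and the settings to patch are assumptions based on the musicbrainz-server layout):

cp /musicbrainz-server/lib/DBDefs.pm.sample /musicbrainz-server/lib/DBDefs.pm
grep -n 'WEB_SERVER\|REPLICATION_TYPE' /musicbrainz-server/lib/DBDefs.pm   # locate the values to override for this setup

Only the handful of values this setup actually needs would then be edited in place, so everything else keeps tracking upstream.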

Windows 10 Docker issue

When I run docker-compose up -d

I get this:
musicbrainzdocker_postgresql_1 is up-to-date
musicbrainzdocker_search_1 is up-to-date
musicbrainzdocker_postgresqldata_1 is up-to-date
Starting musicbrainzdocker_indexer_1
Starting musicbrainzdocker_musicbrainz_1

ERROR: for musicbrainz Cannot start service musicbrainz: oci runtime error: exec: "/start.sh": permission denied
ERROR: Encountered errors while bringing up the project.

I'm in Windows 10 pro, docker 1.12.1 build 23cf638

Built search index will not be used

I have created the search index with this command:
sudo docker-compose run --rm indexer /home/search/index.sh

But if I access the local webserver with
http://192.168.1.35:5000/search?query=madonna&type=artist&method=indexed
I only get this error:

HTTP ERROR 500

Problem accessing /. Reason:

    Index is currently not available for resource type ARTIST

Powered by Jetty:// 9.3.10.v20160621

What do I have to check or change or restart or move after the index building process?
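
One thing worth trying (a guess based on how the services are named elsewhere on this page): restart the search service so it reopens the freshly built indexes, then repeat the query:

sudo docker-compose restart search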

[FR] Possibility to choose source (FTP, http)?

Currently, the official MusicBrainz FTP servers are painfully slow.
Could you please let the user (optionally) choose the source for the needed "searchserver.war" and the full dump *.bz2 files?

I am thinking of something like
createdb.sh -fetch "ftp://myown.fast.sourceserver" or "http://evenfaster.webserver"

Error setting up a new EC2 instance

I am experiencing issues setting up a new instance with the newly applied changes. I ran sudo docker-compose up -d without errors, but the indexer container exits with code 1. When I attempted to index, I got the following error:

Index Builder Started:20:33:19
recording: Unable to get replication information
tmp_artistcredit:Started at:20:33:19
Exception in thread "main" org.postgresql.util.PSQLException: ERROR: relation "artist_credit_name" does not exist
        at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:1592)
        at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1327)
        at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:192)
        at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:451)
        at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:336)
        at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:328)
        at org.musicbrainz.search.index.CommonTables.createArtistCreditTableUsingDb(CommonTables.java:65)
        at org.musicbrainz.search.index.CommonTables.createTemporaryTables(CommonTables.java:197)
        at org.musicbrainz.search.index.IndexBuilder.main(IndexBuilder.java:151)

When I run sudo docker exec musicbrainzdocker_musicbrainz_1 ./admin/replication/LoadReplicationChanges I get:

Can't locate JSON.pm in @INC (you may need to install the JSON module) (@INC contains: /musicbrainz-server/admin/replication/../../lib /etc/perl /usr/local/lib/x86_64-linux-gnu/perl/5.22.1 /usr/local/share/perl/5.22.1 /usr/lib/x86_64-linux-gnu/perl5/5.22 /usr/share/perl5 /usr/lib/x86_64-linux-gnu/perl/5.22 /usr/share/perl/5.22 /usr/local/lib/site_perl /usr/lib/x86_64-linux-gnu/perl-base .) at /musicbrainz-server/admin/replication/../../lib/DBDefs/Default.pm line 32.
BEGIN failed--compilation aborted at /musicbrainz-server/admin/replication/../../lib/DBDefs/Default.pm line 32.
Compilation failed in require at /usr/share/perl/5.22/parent.pm line 20.
BEGIN failed--compilation aborted at /musicbrainz-server/admin/replication/../../lib/DBDefs.pm line 29.
Compilation failed in require at ./admin/replication/LoadReplicationChanges line 32.
BEGIN failed--compilation aborted at ./admin/replication/LoadReplicationChanges line 32.

Can someone help me out?
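
For the second error, a hedged workaround sketch is to install the missing Perl module inside the running container (tool and module names as they appear in the output above); this does not address the missing artist_credit_name relation, which points at an incomplete database import:

sudo docker exec musicbrainzdocker_musicbrainz_1 bash -c 'eval $( perl -Mlocal::lib ) && cpanm --notest JSON'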

Docker image for oracle jdk no longer available

The JDK docker image that the indexer uses is no longer available in the docker hub repository.
You should switch to using openjdk docker images instead.

Modify indexer-dockerfile/Dockerfile and change the first line to use the latest openjdk java 8:
FROM openjdk:8-jdk
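
After editing that Dockerfile, the indexer image needs to be rebuilt; a minimal sketch, assuming the service is named indexer as in the container names shown above:

sudo docker-compose build --no-cache indexer
sudo docker-compose up -d indexer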

Replace old LUCENE-based with new SOLR-based search server

Motivation

  • Old LUCENE-based search server is likely to be discontinued by the end of 2018;
  • New search index rebuilder allows for incremental update of search indexes.

References

/recreatedb.sh can't connect..

Following the directions to get MusicBrainz set up, I got as far as downloading and creating the DB, then got a failure.
I was going to try to recreate the DB and try again, but...

xxx@docker01:/musicbrainz-docker$ sudo docker-compose run --rm musicbrainz /recreatedb.sh
psql: warning: extra command-line argument "postgres" ignored
psql: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
xxx@docker01:/musicbrainz-docker$

Postgres is running.
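
A quick sketch of how to check that the database container is up and reachable from the musicbrainz container before running the script (service, user and database names are taken from output elsewhere on this page and may differ in your setup):

sudo docker-compose up -d postgresql
sudo docker-compose run --rm musicbrainz psql -h postgresql -U musicbrainz -d musicbrainz_db -c 'SELECT 1'   # may prompt for the database password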

Search Error

After freshly setting this up, I get the following error with search

HTTP ERROR 500

Problem accessing /. Reason:

    Index is currently not available for resource type ARTIST

Powered by Jetty:// 9.3.10.v20160621

It doesn't help to use another resource type; all indexes are unavailable.

To set this up I did everything pretty much per instructions:

$ git clone && cd
# sudo docker-compose up -d
# (change another service away from port 5000)
# sudo docker-compose up -d
# sudo docker exec musicbrainzdocker_musicbrainz_1 /set-token.sh <token>
# sudo docker-compose run --rm musicbrainz /createdb.sh -fetch
# sudo docker-compose run --rm indexer /home/search/index.sh

After the search failed to operate, I ran a "sudo docker-compose restart" but that did not help.
Any help would be much appreciated.

Database creation fails

I tried the latest update today, and it runs into the following error:

Sun May 19 20:31:38 2019 : InitDb.pl starting
ERROR: schema "musicbrainz" already exists
ERROR: schema "cover_art_archive" already exists
ERROR: schema "documentation" already exists
ERROR: schema "json_dump" already exists
ERROR: schema "report" already exists
ERROR: schema "sitemaps" already exists
ERROR: schema "statistics" already exists
ERROR: schema "wikidocs" already exists
Sun May 19 20:31:38 2019 : Installing extensions (Extensions.sql)
Sun May 19 20:31:38 2019 : Creating tables ... (CreateTables.sql)
Sun May 19 20:31:39 2019 : psql:/musicbrainz-server/admin/sql/CreateTables.sql:15: ERROR: relation "alternative_release" already exists
Error during CreateTables.sql at /musicbrainz-server/admin/InitDb.pl line 117.
Sun May 19 20:31:39 2019 : InitDb.pl failed
Sun May 19 20:31:39 2019 : InitDb.pl starting
Sun May 19 20:31:39 2019 : Creating database 'musicbrainz_db'
Failed query:
'CREATE DATABASE musicbrainz_db WITH OWNER = musicbrainz TEMPLATE template0 ENCODING = 'UNICODE' LC_CTYPE='C' LC_COLLATE='C''
()
42P04 DBD::Pg::st execute failed: ERROR: database "musicbrainz_db" already exists [for Statement "CREATE DATABASE musicbrainz_db WITH OWNER = musicbrainz TEMPLATE template0 ENCODING = 'UNICODE' LC_CTYPE='C' LC_COLLATE='C'"]
at /musicbrainz-server/admin/../lib/Sql.pm line 115.
Sql::catch {...} (MusicBrainz::Server::Exceptions::DatabaseError=HASH(0x3fd7d88)) called at /usr/share/perl5/Try/Tiny.pm line 115
Try::Tiny::try(CODE(0x3eea798), Try::Tiny::Catch=REF(0x369f7c0)) called at /musicbrainz-server/admin/../lib/Sql.pm line 116
Sql::do(Sql=HASH(0x3d9dac0), "CREATE DATABASE musicbrainz_db WITH OWNER = musicbrainz TEMPL"...) called at /musicbrainz-server/admin/InitDb.pl line 218
main::Create("MAINTENANCE") called at /musicbrainz-server/admin/InitDb.pl line 526
Sun May 19 20:31:39 2019 : InitDb.pl failed
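
The "already exists" errors suggest a previous, partly initialized database is still in place. A hedged sketch of one way back to a clean state, using the scripts quoted elsewhere on this page (adjust the script paths to whatever your checkout actually uses):

sudo docker-compose run --rm musicbrainz /recreatedb.sh
sudo docker-compose run --rm musicbrainz /createdb.sh -fetch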

DB Schema Name

The latest commit breaks some of the scripts due to a database name change. The new one is "musicbrainz_db", yet some of the bash scripts are still using "musicbrainz".

Here is the commit -- ec77259

Some scripts that I have had to modify:

indexer-dockerfile/index.sh
musicbrainz-dockerfile/scripts/recreatedb.sh
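
A quick way to find any remaining scripts that still reference the old name (a sketch; run from the repository root, and expect some matches that legitimately refer to the service rather than the database):

grep -rn --include='*.sh' -w 'musicbrainz' indexer-dockerfile musicbrainz-dockerfile/scripts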

slave.log is rotated as soon as replication is done

Hello,
The README shows how to watch the replication status, but that is only possible while replication is running, since slave.log is rotated as soon as replication is done. And replication runs nightly, so it is unlikely we will be watching while it runs!
Should the documentation mention that?
Or (even better) should the log rotation occur before the replication, so that we have the full day to read it?

README.md : musicbrainz docker image name

In README.md the default MusicBrainz docker container name is given as musicbrainzdocker_musicbrainz_1, but when executing all the commands, the actual name is musicbrainz-docker_musicbrainz_1.

Consider not installing redis

Consider not installing Redis (only memcached) and configuring it to have instantaneous
timeouts; this is done by changing DATASTORE_REDIS_ARGS in DBDefs;
you can pass new arguments to Redis->new using the redis_new_args bit
(look at the default in lib/DBDefs/Default.pm to see what I mean).
Anything in https://metacpan.org/pod/Redis#new can be passed, but
cnx_timeout, read_timeout, and write_timeout are presumably the most
relevant. Redis is only used for sessions (irrelevant on a slave) and
for configurable banner messages (also irrelevant on a slave, but not
reducing the timeouts means pages take ages to load).
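
To see what would need overriding, the shipped defaults can be inspected in the running container (a sketch; the service name matches the other commands on this page):

sudo docker-compose exec musicbrainz grep -n -A 5 'DATASTORE_REDIS_ARGS' /musicbrainz-server/lib/DBDefs/Default.pm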

Postgres Dockerfile doesn't follow best practices

Couple of issues here, based on https://docs.docker.com/engine/userguide/eng-image/dockerfile_best-practices

  1. https://github.com/jsturgis/musicbrainz-docker/blob/master/postgres-dockerfile/Dockerfile#L5 and https://github.com/jsturgis/musicbrainz-docker/blob/master/postgres-dockerfile/Dockerfile#L7 should be merged so that the layer created by line 5 isn't incorrectly cached.
  2. I'd split the "RUN cd ... & ..." into separate WORKDIR and RUN commands, as advised in https://docs.docker.com/engine/userguide/eng-image/dockerfile_best-practices/#workdir - mainly for readability. This isn't a huge thing, so it's not that important if you feel otherwise.

I've updated my own fork to resolve these things, so I could make a PR for this if you like.
