
migrate-mongo's Introduction

migrate-mongo database migration tool for Node.js

migrate-mongo is a database migration tool for MongoDB running in Node.js

Installation

$ npm install -g migrate-mongo
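
migrate-mongo can also be installed locally in your project and run through npx (standard npm behavior; adjust to your own setup):

$ npm install --save-dev migrate-mongo
$ npx migrate-mongo --help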

CLI Usage

$ migrate-mongo
Usage: migrate-mongo [options] [command]


  Commands:

    init                  initialize a new migration project
    create [description]  create a new database migration with the provided description
    up [options]          run all unapplied database migrations
    down [options]        undo the last applied database migration
    status [options]      print the changelog of the database

  Options:

    -h, --help     output usage information
    -V, --version  output the version number

Basic Usage

Initialize a new project

Make sure you have Node.js 10 (or higher) installed.

Create a directory where you want to store the migrations for your mongo database (e.g. 'albums' here) and cd into it:

$ mkdir albums-migrations
$ cd albums-migrations

Initialize a new migrate-mongo project

$ migrate-mongo init
Initialization successful. Please edit the generated migrate-mongo-config.js file

The above command did two things:

  1. created a sample 'migrate-mongo-config.js' file
  2. created a 'migrations' directory

Edit the migrate-mongo-config.js file. The module can export a config object directly or a promise that resolves to one. Make sure you change the mongodb url:

// In this file you can configure migrate-mongo

module.exports = {
  mongodb: {
    // TODO Change (or review) the url to your MongoDB:
    url: "mongodb://localhost:27017",

    // TODO Change this to your database name:
    databaseName: "YOURDATABASENAME",

    options: {
      useNewUrlParser: true // removes a deprecation warning when connecting
      //   connectTimeoutMS: 3600000, // increase connection timeout to 1 hour
      //   socketTimeoutMS: 3600000, // increase socket timeout to 1 hour
    }
  },

  // The migrations dir can be a relative or absolute path. Only edit this when really necessary.
  migrationsDir: "migrations",

  // The mongodb collection where the applied changes are stored. Only edit this when really necessary.
  changelogCollectionName: "changelog",

  // The file extension to use when creating migrations and when searching the migrations dir
  migrationFileExtension: ".js",

  // Enable the algorithm to create a checksum of the file contents and use that in the comparison to determine
  // whether the file should be run. Requires that scripts are coded so they can safely be run multiple times.
  useFileHash: false
};

Alternatively, you can also encode your database name in the url (and leave out the databaseName property):

        url: "mongodb://localhost:27017/YOURDATABASE",

Creating a new migration script

To create a new database migration script, just run the migrate-mongo create [description] command.

For example:

$ migrate-mongo create blacklist_the_beatles
Created: migrations/20160608155948-blacklist_the_beatles.js

A new migration file is created in the 'migrations' directory:

module.exports = {
  up(db, client) {
    // TODO write your migration here. Return a Promise (and/or use async & await).
    // See https://github.com/seppevs/migrate-mongo/#creating-a-new-migration-script
    // Example:
    // return db.collection('albums').updateOne({artist: 'The Beatles'}, {$set: {blacklisted: true}});
  },

  down(db, client) {
    // TODO write the statements to rollback your migration (if possible)
    // Example:
    // return db.collection('albums').updateOne({artist: 'The Beatles'}, {$set: {blacklisted: false}});
  }
};

Edit this content so it actually performs changes to your database. Don't forget to write the down part as well. The db object contains the official MongoDB db object. The client object is a MongoClient instance (which you can omit from the signature if you don't use it).

There are 3 options to implement the up and down functions of your migration:

  1. Return a Promise
  2. Use async-await
  3. Call a callback (DEPRECATED!)

Always make sure the implementation matches the function signature:

  • function up(db, client) { /* */ } should return Promise
  • async function up(db, client) { /* */ } should contain await keyword(s) and return Promise
  • function up(db, client, next) { /* */ } should callback next

Example 1: Return a Promise

module.exports = {
  up(db) {
    return db.collection('albums').updateOne({artist: 'The Beatles'}, {$set: {blacklisted: true}});
  },

  down(db) {
    return db.collection('albums').updateOne({artist: 'The Beatles'}, {$set: {blacklisted: false}});
  }
};

Example 2: Use async & await

Async & await is especially useful if you want to perform multiple operations against your MongoDB in one migration.

module.exports = {
  async up(db) {
    await db.collection('albums').updateOne({artist: 'The Beatles'}, {$set: {blacklisted: true}});
    await db.collection('albums').updateOne({artist: 'The Doors'}, {$set: {stars: 5}});
  },

  async down(db) {
    await db.collection('albums').updateOne({artist: 'The Doors'}, {$set: {stars: 0}});
    await db.collection('albums').updateOne({artist: 'The Beatles'}, {$set: {blacklisted: false}});
  },
};

Example 3: Call a callback (deprecated)

Callbacks are supported for backwards compatibility. New migration scripts should be written using Promises and/or async & await, which are easier to read and write.

module.exports = {
  up(db, callback) {
    return db.collection('albums').updateOne({artist: 'The Beatles'}, {$set: {blacklisted: true}}, callback);
  },

  down(db, callback) {
    return db.collection('albums').updateOne({artist: 'The Beatles'}, {$set: {blacklisted: false}}, callback);
  }
};

Overriding the sample migration

To override the content of the sample migration that will be created by the create command, create a file sample-migration.js in the migrations directory.
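
For example, a minimal sample-migration.js without the TODO and example comments could look like this (any valid migration template works, as long as the up/down signatures match the ones described above):

module.exports = {
  async up(db, client) {
    // write your migration here
  },

  async down(db, client) {
    // write the statements to rollback your migration here (if possible)
  }
};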

Checking the status of the migrations

At any time, you can check which migrations are applied (or not)

$ migrate-mongo status
┌─────────────────────────────────────────┬────────────┐
│ Filename                                │ Applied At │
├─────────────────────────────────────────┼────────────┤
│ 20160608155948-blacklist_the_beatles.js │ PENDING    │
└─────────────────────────────────────────┴────────────┘

Migrate up

This command will apply all pending migrations

$ migrate-mongo up
MIGRATED UP: 20160608155948-blacklist_the_beatles.js

If an error occurs, it will stop and won't continue with the rest of the pending migrations.

If we check the status again, we can see the last migration was successfully applied:

$ migrate-mongo status
┌─────────────────────────────────────────┬──────────────────────────┐
│ Filename                                │ Applied At               │
├─────────────────────────────────────────┼──────────────────────────┤
│ 20160608155948-blacklist_the_beatles.js │ 2016-06-08T20:13:30.415Z │
└─────────────────────────────────────────┴──────────────────────────┘

Migrate down

With this command, migrate-mongo will revert (only) the last applied migration

$ migrate-mongo down
MIGRATED DOWN: 20160608155948-blacklist_the_beatles.js

If we check the status again, we see that the reverted migration is pending again:

$ migrate-mongo status
┌─────────────────────────────────────────┬────────────┐
│ Filename                                │ Applied At │
├─────────────────────────────────────────┼────────────┤
│ 20160608155948-blacklist_the_beatles.js │ PENDING    │
└─────────────────────────────────────────┴────────────┘

Advanced Features

Using a custom config file

All actions (except init) accept an optional -f or --file option to specify a path to a custom config file. By default, migrate-mongo will look for a migrate-mongo-config.js config file in the current directory.

Example:

$ migrate-mongo status -f '~/configs/albums-migrations.js'
┌─────────────────────────────────────────┬────────────┐
│ Filename                                │ Applied At │
├─────────────────────────────────────────┼────────────┤
│ 20160608155948-blacklist_the_beatles.js │ PENDING    │
└─────────────────────────────────────────┴────────────┘

Using npm packages in your migration scripts

You can use Node.js modules (or require other modules) in your migration scripts. It's even possible to use npm modules; just provide a package.json file in the root of your migration project:

$ cd albums-migrations
$ npm init --yes

Now you have a package.json file, and you can install your favorite npm modules that might help you in your migration scripts. For example, one of the very useful promise-fun npm modules.
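
For example, a migration could use such a helper module once it is listed in your package.json (lodash is used here purely as an illustration; any installed module works the same way):

// assumes you ran: npm install lodash
const _ = require('lodash');

module.exports = {
  async up(db) {
    const albums = await db.collection('albums').find({}).toArray();
    // group the documents per artist and log a quick summary
    const perArtist = _.countBy(albums, 'artist');
    console.log('albums per artist:', perArtist);
  },

  async down(db) {
    // nothing to roll back for this read-only example
  }
};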

Using ESM (ECMAScript Modules) instead of CommonJS

Since migrate-mongo 7.0.0, it's possible to use ESM instead of CommonJS.

Using ESM when initializing a new project

Pass the -m esm option to the init action:

$ migrate-mongo init -m esm

It's also required to have a package.json file in the root of your project with "type": "module". If you don't have one yet, create it:

$ npm init --yes

Then edit this package.json file, and add:

"type": "module"

When you create migration files with migrate-mongo create, they will be prepared for you in ESM style.
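
An ESM-style migration file uses export statements instead of module.exports, roughly like this (the exact generated template may differ):

export const up = async (db, client) => {
  // TODO write your migration here
};

export const down = async (db, client) => {
  // TODO write the statements to rollback your migration (if possible)
};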

Please note that CommonJS is still the default module loading system.

Using MongoDB's Transactions API

You can make use of the MongoDB Transaction API in your migration scripts.

Note: this requires both:

  • MongoDB 4.0 or higher
  • migrate-mongo 7.0.0 or higher

migrate-mongo will call your migration up and down function with a second argument: client. This client argument is a MongoClient instance; it gives you access to the startSession function.

Example:

module.exports = {
  async up(db, client) {
    const session = client.startSession();
    try {
        await session.withTransaction(async () => {
            await db.collection('albums').updateOne({artist: 'The Beatles'}, {$set: {blacklisted: true}}, {session});
            await db.collection('albums').updateOne({artist: 'The Doors'}, {$set: {stars: 5}}, {session});
        });
    } finally {
      await session.endSession();
    }
  },

  async down(db, client) {
    const session = client.startSession();
    try {
        await session.withTransaction(async () => {
            await db.collection('albums').updateOne({artist: 'The Beatles'}, {$set: {blacklisted: false}}, {session});
            await db.collection('albums').updateOne({artist: 'The Doors'}, {$set: {stars: 0}}, {session});
        });
    } finally {
      await session.endSession();
    }
  },
};

Using a file hash algorithm to enable re-running updated files

There are use cases where it may make sense not to treat scripts as immutable items. An example would be a simple collection with lookup values, where you can just wipe and recreate the entire collection in one go.

useFileHash: true

Setting this config value to true enables tracking a hash of the file contents; a file with the same name will be run again as long as its contents have changed. This flag changes the behavior for every script, so each script needs to be written in a manner where it can safely be re-run. A script with the same name and hash will not be executed again; it only runs when the hash changes.
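
A typical re-runnable script wipes and recreates the data it owns, so it produces the same end state no matter how often it runs. A minimal sketch (the collection and values are only an example):

module.exports = {
  async up(db) {
    // safe to re-run: wipe the lookup collection and insert the current values
    await db.collection('countries').deleteMany({});
    await db.collection('countries').insertMany([
      { code: 'BE', name: 'Belgium' },
      { code: 'NL', name: 'Netherlands' }
    ]);
  },

  async down(db) {
    await db.collection('countries').deleteMany({});
  }
};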

Now the status will also include the file hash in the output

┌────────────────────────────────────────┬──────────────────────────────────────────────────────────────────┬──────────────────────────┐
│ Filename                               │ Hash                                                             │ Applied At               │
├────────────────────────────────────────┼──────────────────────────────────────────────────────────────────┼──────────────────────────┤
│ 20160608155948-blacklist_the_beatles.js│ 7625a0220d552dbeb42e26fdab61d8c7ef54ac3a052254588c267e42e9fa876d │ 2021-03-04T15:40:22.732Z │
└────────────────────────────────────────┴──────────────────────────────────────────────────────────────────┴──────────────────────────┘

Version

To know which version of migrate-mongo you're running, just pass the version option:

$ migrate-mongo version

API Usage

const {
  init,
  create,
  database,
  config,
  up,
  down,
  status
} = require('migrate-mongo');

init() → Promise

Initialize a new migrate-mongo project

await init();

The above command did two things:

  1. created a sample migrate-mongo-config.js file
  2. created a migrations directory

Edit the migrate-mongo-config.js file. Make sure you change the mongodb url.

create(description) → Promise<fileName>

For example:

const fileName = await create('blacklist_the_beatles');
console.log('Created:', fileName);

A new migration file is created in the migrations directory.

database.connect() → Promise<{db: MongoDb, client: MongoClient}>

Connect to a mongo database using the connection settings from the migrate-mongo-config.js file.

const { db, client } = await database.connect();

config.read() → Promise<JSON>

Read connection settings from the migrate-mongo-config.js file.

const mongoConnectionSettings = await config.read();

config.set(yourConfigObject)

Tell migrate-mongo NOT to use the migrate-mongo-config.js file, but instead use the config object passed as the first argument of this function. When using this feature, please do this at the very beginning of your program.

Example:

const { config, up } = require('../lib/migrate-mongo');

const myConfig = {
    mongodb: {
        url: "mongodb://localhost:27017/mydatabase",
        options: { useNewUrlParser: true }
    },
    migrationsDir: "migrations",
    changelogCollectionName: "changelog",
    migrationFileExtension: ".js"
};

config.set(myConfig);

// then, use the API as you normally would, eg:
await up();

up(MongoDb, MongoClient) → Promise<Array<fileName>>

Apply all pending migrations

const { db, client } = await database.connect();
const migrated = await up(db, client);
migrated.forEach(fileName => console.log('Migrated:', fileName));

If an error occurs, the promise rejects and the remaining pending migrations are not applied.

down(MongoDb, MongoClient) → Promise<Array<fileName>>

Revert (only) the last applied migration

const { db, client } = await database.connect();
const migratedDown = await down(db, client);
migratedDown.forEach(fileName => console.log('Migrated Down:', fileName));

status(MongoDb) → Promise<Array<{ fileName, appliedAt }>>

Check which migrations are applied (or not).

const { db } = await database.connect();
const migrationStatus = await status(db);
migrationStatus.forEach(({ fileName, appliedAt }) => console.log(fileName, ':', appliedAt));

client.close() → Promise

Close the database connection

const { db, client } = await database.connect();
await client.close();
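
Putting the API together, a minimal migration runner could look like this (error handling kept deliberately simple):

const { database, up } = require('migrate-mongo');

async function runMigrations() {
  const { db, client } = await database.connect();
  try {
    // apply all pending migrations and report them
    const migrated = await up(db, client);
    migrated.forEach(fileName => console.log('Migrated:', fileName));
  } finally {
    // always close the connection, even when a migration fails
    await client.close();
  }
}

runMigrations().catch(err => {
  console.error(err);
  process.exit(1);
});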

migrate-mongo's Issues

Ability to change the migrations dir

If I specify a config file (using -f), I should be able to specify in there what directory my migrations are in.

I see that currently it is hardcoded in ./env/migrationsDir to be migrations

SyntaxError: Unexpected token import

cannot use import syntax in config file

/config.js:1
(function (exports, require, module, __filename, __dirname) { import { mongo } from '../src/config'
^^^^^^

SyntaxError: Unexpected token import
    at createScript (vm.js:80:10)
    at Object.runInThisContext (vm.js:139:10)
    at Module._compile (module.js:599:28)
    at Object.Module._extensions..js (module.js:646:10)
    at Module.load (module.js:554:32)

`create` command misses --file option

The create command requires the default configuration file to be present in the current project root directory. Does it really need it? How do you specify a custom config file for create? It's missing the -f option.

Do not allow to migrate downwards all the ran migrations.

It's a great package and we are using it in production, but I found an issue. Currently, if you run migrate-mongo down even days later, it reverts the previously applied migrations.

How I thought it would work:

migrate-mongo up runs the new migrations and keeps track of the migrations it just applied. Then, if I do migrate-mongo down, it only reverts those.

Later, if I do migrate-mongo down again (in my case I did so accidentally), it should not revert the next available migration but instead show the message "No migrations ran".

What I'm saying is this package should remember the most recently applied migrations and only revert those. This way, even doing migrate-mongo down after days won't create data discrepancies.

If you think this should be fixed I would love to work on this 😛

Can't connect to mongodb

I'm having a very hard time connecting to mongo. It's running locally on my computer and I've tried both localhost and 127.0.0.1 for the URL, but nothing is working; I keep getting a 'failed to connect to server' error.

Do I need something else to connect? Do I need mongoose somewhere? I appreciate any help that points me in the right direction. Thank you.

Forced to pass the databaseName

Describe the bug
I have multiple environments, and I shouldn't have to manually select a database, because I put my database name in the connection string, like so:

mongodb://localhost:27017/my_db

So I tried removing the warning, and it works, thanks to the MongoDB client:

/**
 …
 * @param {string} [dbName] The name of the database we want to use. If not provided, use database name from connection string.
…
 */
MongoClient.prototype.db = function(dbName, options) {
…

So the database name shouldn't be required.

Expected behavior
I should be able to use a connection string to select the database, without having the cli popping an error and stopping execution.

BSON field 'update.updates.u' is the wrong type 'array', expected type 'object'

Describe the bug
I am trying to update my collection using a mongo update query with an aggregation pipeline, which is supported by Mongo version 4.0 and higher. I tried to run the migration using this tool, but I got this error:

ERROR: Could not migrate down 20200504124656-remove_deleted_messsage.js: BSON field 'update.updates.u' is the wrong type 'array', expected type 'object'

Migration script

module.exports = {
  async up(db, client) {
	await db.collection('messages').updateMany(
		{ is_deleted: true, archived: { $exists: false } },
		[
			{"$set": {"archived.content": "$content", "archived.url": "$url"}},
			{"$unset": ["content", "url"]  }
		])
  },

  async down(db, client) {

  }
};

To Reproduce

  • Create new migration with above script
  • Run migrate up command to reproduce the error

Expected behavior

  • As suggested in the MongoDB documentation, we can use an aggregation pipeline with an update query


Additional context
I am able to run the same script from the mongo client shell, but it gives an error with the migration tool.

Error: final argument to `executeOperation` must be a callback

This is the migration I wrote:

'use strict';

module.exports = {

  up(db, next) {
    db.collection('projects')
    .updateMany({}, { $set: { tags: [] }}, false, true)
    .then(next);
  },

  down(db, next) {
    db.collection('projects')
    .updateMany({}, { $unset: { tags: [] }}, false, true)
    .then(next);
  }

};

Very simple and I am following the directions exactly. But when I run the up command, I keep getting an error that says:
ERROR: Could not migrate up 20180406191421-add_project_tags.js: final argument to 'executeOperation' must be a callback

What is going wrong here? Any help is appreciated. Thank you.

How to get the client connection?

It would be really interesting to use mongo transactions in your migration module, so that a migration cannot fail halfway through.

Describe the solution you'd like
Besides getting the db connection to a mongo db as a param in the migration files' up(db) or down(db), it would be great to have the MongoClient connection too.

Is it possible to do this already? Or would you consider adding it?

Thank you

working with objectId

We used to use a simple Node.js script with the raw MongoDB driver for our migrations. Something like:

const MongoClient = require('mongodb').MongoClient;
const assert = require('assert');
const USERS= require('./models/entities').USERS;
const ObjectId = require('bson').ObjectId;
require('dotenv').config();

const client = new MongoClient(mongoDbConnectionString);
client.connect(async function(err) {
  assert.equal(null, err);
  console.log("Connected successfully to server");

  const db = client.db(MONGODB_DATABASE);

  for (const user of USERS) {
    const result = await db.collection(user).find({}, {_id: 1, credentialsId: 1 }).snapshot().toArray();

    for (entity of result) {
      const update = await db.collection('credentials').updateOne({ _id: ObjectId('123456789abc') }, { $set: { blacklisted: true } });
    }
  }

  client.close();
});

It works with no problems. Due to our recent needs, we decided to switch to a great migration tool like migrate-mongo, with up and down flows.

Here is our script transcription:

const ObjectId = require('bson').ObjectId;
const USERS= require('../models/entities').USERS;

module.exports = {
  async up(db) {
    for (const user of USERS) {
      const result = await db.collection(user).find({}, {_id: 1, credentialsId: 1 }).snapshot().toArray();
  
      for (entity of result) {
        const update = await db.collection('credentials').updateOne({ _id: ObjectId('123456789abc') }, { $set: { blacklisted: true } });
        console.log(update.result);
      }
    }
  },

  down(db) { }
};

Unfortunately, this doesn't work. We don't get any error reported in the console or thrown by migrate-mongo, but the database is not updated at all.

Could this come from ObjectId? We tried defining it as a simple string, with no more success.

Config option to remove TODO and Example in migration file?

Is there a way to remove the comments (the TODO and Example) in every single migration?

module.exports = {
  async up(db, client) {
    // TODO write your migration here.  <=== Getting rid of this part
    // See https://github.com/seppevs/migrate-mongo/#creating-a-new-migration-script
    // Example:
    // await db.collection('albums').updateOne({artist: 'The Beatles'}, {$set: {blacklisted: true}});
  },

  async down(db, client) {
    // TODO write the statements to rollback your migration (if possible)  <=== Getting rid of this part
    // Example:
    // await db.collection('albums').updateOne({artist: 'The Beatles'}, {$set: {blacklisted: false}});
  }
};

Flag -f doesn't work for the create command

The -f flag doesn't work for the create command, so it throws an error if it is passed.

I think it might be more consistent to accept this flag, but ignore the config file content if it is not needed.

Usage: I defined an npm script that aliases migrate-mongo to migrate-mongo -f migrate-mongo-config.js, so I want to use it for all commands. However, it crashes for the create command because the -f flag doesn't work with it.

Working with Typescript ?

I want to use migrate-mongo with NestJS, which uses TypeScript, and I would like to ask whether it works with TS. Another thing: can I use a mongo model in a migration?

TypeError when no migration is left

I want to run migrate-mongo up on every deployment. But I found out that when there are no more scripts left to run, it blows up:

/usr/local/opt/backend/node_modules/mongodb/lib/utils.js:132
      throw err;
      ^

TypeError: undefined is not a function
    at Function.apply (/usr/local/opt/backend/node_modules/shifting/index.js:93:11)

I also get the same TypeError after applying a migration.

Wrong readme for connect method

According to the API docs, database.connect returns a promise resolving to a MongoDB object which can be passed directly to the up, down and status functions. But it actually returns a promise resolving to an object with two keys, db (MongoDB object) and client (MongoClient), and these need to be passed as separate params to the above functions.

9 high vulnerabilities

Thanks for this module!
However npm cries out about 9 high vulnerabilities after installation. Please fix them if possible.

--dry-run option

Is your feature request related to a problem? Please describe.
When writing up and down functions I have to run migrate-mongo up and then migrate-mongo down in order to test my implementation.

Describe the solution you'd like
It would be nice if there was an option migrate-mongo up --dry-run that would let me test my code but wouldn't actually apply the migration.

Can't access the mongodb module

In my migrations I need access to other NodeJs MongoDB objects like ObjectId and DBRef

I am trying to use:

var ObjectID = require('mongodb').ObjectID;
var DBRef = require('mongodb').DBRef;

but I get
Cannot find module 'mongodb'

How can I access these objects?

Version 4 upgrade

Hi. I've noticed that a new version has been published. However, there is no changelog. Could you please describe what's changed since 3.0.5 and whether it's safe to upgrade, or whether there were breaking changes?

Setting order/dependencies of migrations

I use migrate mongo both for migrations and for seeding test data into mongo. Some seeds/migrations rely on the fact that other seeds/migrations have been applied beforehand.

So my question/request is whether there is some built-in way that I haven't considered, or whether a seed/migration can depend on another seed/migration being executed before it tries to execute.

This would of course work in environments where we only add new migrations but we quite often run this locally to clean/setup the environment and then they are all new/pending.

I know that, for example, Django (on the Python stack) has this built into its migrations framework, which otherwise works very similarly to this project.

Thanks! // Johan

Add support for multiple databases

It would be great if this library supported multiple mongo urls in single configuration file, and ran the migrations against all of the provided databases.

Update dependencies

Hi Sebastian, thanks for your work. Please, upgrade mongodb dependency to version 3.1.1. Version 3.0.7 doesn’t support MongoDB 4.0.0.

default to more specific config path

First, thank you for such a helpful project!

This was pointed out in #3 and #17, but I think it's pretty important: config.js is a very generic name.

The project I'm including this package in has 10 third-party config files in the root directory and that's not including internal ones. It doesn't make sense that config.js is reserved for just a single (though important!) portion of the application.

I realize we can use the --file flag, but it would be far more convenient if migrate-mongo would first look for a more specific config (e.g. migrations_config.js, mm-config.js, .migrationrc.js, etc.).

I suspect this would be as simple as changing how DEFAULT_CONFIG_FILE_NAME is checked, and would easily be made backward compatible as long as the new name is pretty unique. I'm also happy to contribute if you're cool with this approach.

Allow passing of extra arguments

Is your feature request related to a problem? Please describe.
I really like the API of this library over other migration frameworks that support Mongoose, but I would still like to use the Mongoose library in certain situations. Furthermore, there are other operations where it would be helpful if custom arguments (utilities, metadata, etc.) could be passed as an extra argument to down/up scripts.

Describe the solution you'd like
I would like to include a user-defined extra argument in the up/down functions.

Describe alternatives you've considered
Possible solution:

module.exports = {
	up(db, extra) {
		const { mongoose, utils, meta } = extra;
		utils.logger('Updating The Beatle\'s blacklist value');
		return mongoose.Albums.updateMany({
			artist: 'The Beatles',
		}, {
			$set: {
				blacklisted: meta.defaultBlacklistValue,
			},
		});
	},
	...

This extra argument could be defined in the config file when using the CLI or passed as an extra argument when using the Node API:

// migrate-mongo-config.js
module.exports = {
	...

	extra: {
		mongoose: mongoose.connect('mongodb://localhost/music'),
		utils: {
			logger: (...msgs) => console.log(...msgs)
		},
		meta: {
			defaultBlacklistValue: false,
		}
	}
};

// node API
const mm = require('migrate-mongo');

...

mm.up(db, extra)

Additional context
n/a

Add Programmatic API

This migration package looks great (and kudos for the 100% test coverage!).

Would you consider adding a programmatic API? For instance, something like the following:

const Migrate = require('migrate-mongo');
const migrationConfig = require('path/to/config.js');
const migrator = new Migrate(migrationConfig);

migrator.up().then(/* etc */)
migrator.down().then(/* etc */)

be able to organize migrations scripts in sub-level directory structure

Presently migrate-mongo only reads files directly under the "migrationsDir" config value. It would be helpful to be able to organize migration files in subdirectories, for example by version, with the following layout:

`

   |____ v1.0.0
   |        |____  213233-script.js
   |____ v1.1.0
            |___ ... other scripts

`

Just a question

When executing the following migration script, I'm getting this message. That seems strange since I'm using insertMany as recommended; do you see something wrong in my script? The collection does not exist before this command.

up(db, next) {
db.collection('physical-type')
.insertMany([
{physicalTypeCode: 10, minLimit: -3.4E+38, maxLimit: 3.4E+38, dbDataUnit:'Seconds' ,supportDifferential: true},
{physicalTypeCode: 12, minLimit: -3.4E+38, maxLimit: 3.4E+38, dbDataUnit:'°C' ,supportDifferential: true}
]);
next();
},

DeprecationWarning: collection.insert is deprecated. Use insertOne, insertMany or bulkWrite instead.

Thanks

Issue probably with the connection or connection pool

ISSUE DESCRIPTION:
When we run the migration the first time, it works fine. But when running it a second time, after having deleted the database, I got this error:

(node:3403) UnhandledPromiseRejectionWarning: MongoError: a collection 'test.physical-type' already exists.

REPRODUCTION:
1- git clone this repository https://github.com/stherrienaspnet/migrate-mongo-issue.git
2- enter command: 'npm run deploy'
3- check the db test, you should see correct result
4- delete the test db using compass or Robo 3T
5- enter command: 'npm run deploy', you should see the error message.

ADDITIONAL INFO:
I added the file deployme.js to use only the native mongodb driver and running the same script and deleting the test db several times and this is working perfectly.

Hope you will find the root cause, because I would prefer to use your migration project instead of creating my own from scratch.

Many thanks

Feature to store the _id's of documents affected during a migration

I'm willing to give effort on a PR for this feature

Currently there isn't a way to track the affected documents:

module.exports = {

    up(db, next) {
        return db.collection('users').updateMany({ status: 'active' }, { $set: { status: 'inactive' } }, next);
    },

    down(db, next) {
        // TODO write the statements to rollback your migration (if possible)
        // It's not possible to query for only the documents affected in the 'up' command
        next();
    }

};

This could be accomplished by saving an array of document _id's affected and passing it to the down() method:

module.exports = {

    up(db, saveAffectedIdList, next) {
      const query = { status: 'active' }

      db.collection('users')
        .find(query, { _id: 1 })
        .toArray()
        .then(affectedDocs => {
          const affectedIdList = affectedDocs.map(doc => doc._id)
          return db.collection('users')
            .updateMany(query, { $set: { status: 'inactive' } })
            // adds the list to the object that will be inserted into the migrations collection
            .then(() => saveAffectedIdList(affectedIdList))
        })
        .then(next)
    },

    down(db, idList, next) {
        return db.collection('users').updateMany({ _id: { $in: idList } }, { $set: { status: 'active' } }, next);
    }

};

Ability to set migration directories or use patterns

When setting up a project with many different models it is nice to have migrations next to each model.

Inside of the configuration, it would be great If I could use an array or regex to map to migrations directories.

The alternate approach here would be to simply have one giant migrations folder.

Im attempting to use this inside of nestjs which is a dependency management system. It looks as though I am going to end up using migrations in a procedural way, without dependency injection.

Inconsistent issue in pushing values to empty arrays

Describe the bug
Sometimes, when running multiple $push updates on an empty array, the first $push statement is missed/skipped resulting in a single element missing from the array field

To Reproduce
Let's say there is a mongo collection called "sample" with an empty array field like this:

{_id: ObjectId('000000000000000000'), name: 'sample1', an_array_field: []}

Now, I want to make a migration to add some elements

db.collection('sample').update({name: 'sample1'}, {$push: {an_array_field: "test val 1"}}, next)
db.collection('sample').update({name: 'sample1'}, {$push: {an_array_field: "test val 2"}}, next)
db.collection('sample').update({name: 'sample1'}, {$push: {an_array_field: "test val 3"}}, next)

Expected behavior
After running this migration, I would expect the document to looks like this:

{_id: ObjectId('000000000000000000'), name: 'sample1', an_array_field: ["test val 1", "test val 2", "test val 3"]}

However, what sometimes (inconsistently) happens is this:

{_id: ObjectId('000000000000000000'), name: 'sample1', an_array_field: ["test val 2", "test val 3"]}

Notice how the first value is missing.

Additional context
This only happens when the updated field is an empty array, and it does not always happen. As an example of the inconsistent behavior, this worked as expected locally and on a dev server, but failed to include the first value in the production environment. Same migration script, same versions of all dependencies, even the same docker container. The only difference is the configured mongo database host and the time it ran.

Furthermore, we have had the same script work correctly the first time, but skip the first element the second time after rolling back and running again, then work the third time after rolling back and running again.

Migrations don't apply to all the documents of a collection when using nested queries

Describe the bug
Migrations don't apply to all the documents of a collection when using nested queries. The example code is shown below.

To Reproduce

module.exports = {
  up(db) {
    return db.collection('users').find().snapshot().forEach(
      function (e) {
        // update document, using its own properties
        db.collection('views').find({ user: e._id, verified: true }).snapshot().toArray(function (err, views) {
          if(err) {
            e.viewCount = 0;
            delete e.views;          
            // save the updated document
            db.collection('users').save(e);
          } else {
            e.viewCount = views.length;
            delete e.views;
            console.log(e);
            // save the updated document
            db.collection('users').save(e, function(err){
              if(err){
                console.log(err);
              }
            });
          }
        });
      }
    )
  },

  down(db) {
    return db.collection('users').find().snapshot().forEach(
      function (e) {
        // update document, using its own properties
        e.views = 0;
        delete e.viewCount;
        // save the updated document
        db.collection('users').save(e);
      }
    )
  }
};


Expected behavior
The function should be applied to all the documents; instead it is randomly applied to only some.

Unexpected token { because ES10 syntax in fs-extra

I am currently using node 8.11.3, and I am facing an "unexpected token {" issue when I use migrate-mongo init.

How to reproduce
node 8.11.3 and migrate-mongo init

node@b531b9ebb1eb:/opt/reaction/src$ npm run migrate

> [email protected] migrate /opt/reaction/src
> migrate-mongo init

/opt/reaction/src/node_modules/migrate-mongo/node_modules/fs-extra/lib/mkdirs/make-dir.js:86
      } catch {
              ^

SyntaxError: Unexpected token {
    at createScript (vm.js:80:10)
    at Object.runInThisContext (vm.js:139:10)
    at Module._compile (module.js:616:28)
    at Object.Module._extensions..js (module.js:663:10)
    at Module.load (module.js:565:32)
    at tryModuleLoad (module.js:505:12)
    at Function.Module._load (module.js:497:3)
    at Module.require (module.js:596:17)
    at require (internal/module.js:11:18)
    at Object.<anonymous> (/opt/reaction/src/node_modules/migrate-mongo/node_modules/fs-extra/lib/mkdirs/index.js:3:44)

fs-extra seems to be using optional catch binding, which is only supported in ES10 (ES2019).

Auth user+pass setting separate from URI

We would like to separate the connection URI from the database user and password while configuring migrate-mongo.

It would be appreciated if we could set the database address and the authentication parameters separately (not having to merge all info into a single string), for example in the options part of the config file, using user, pass... options.

Transactions with migrate-mongo

Is your feature request related to a problem? Please describe.
It seems that at the moment it is not possible to use cross-document transactions with migrate-mongo.

up and down functions accept a single parameter db which is an instance of the Db class. But to start a transaction we need to call the startSession method on the MongoClient class.

Describe the solution you'd like

Not sure what would be the best solution, but maybe up and down methods could receive two parameters: (db, client)

Describe alternatives you've considered
There does not seem to be a comprehensive alternative solution.

Allow to set config via API

In order to use the migrate-mongo API (not as an npm script), it should allow specifying the config through the API.
Currently it tries to read the config from the default file and there's no way to change that behaviour.

So it would be good if it allowed reading the config from a file, or just passing it as a JS object into the database function, for example. Or setting it some other way.

That's also important for Docker environments, where the npm package is usually not even installed on the system (to reduce image size), so there's no option to run npm scripts.

Support of repeatable migrations, that are always executed, if checksum is changed

Besides standard migrations, we have some data in the database that serves as a complex configuration. We do not want to track changes to this configuration as a sequence of migrations in our repository. At the moment we delete and re-create this data during every deployment.

As a user of migrate-mongo, I would like to add repeatable migrations that are executed as soon as their checksum has changed, regardless of the database version, so that the actual state of this data is reflected in both the database and the repository.
The execution of these baseline scripts should be ordered by filename.

Describe alternatives you've considered
Using a wrapper around migrate-mongo which executes the baseline scripts at the beginning and then calls the migrate-mongo API, but for that I would need to duplicate the db configuration.

Additional context
There is similar functionality in Flyway: https://flywaydb.org/documentation/migrations#repeatable-migrations

Update mongodb version

Describe the bug
Update the mongodb driver dependency from 3.1.x to 4.0.x. Many features are missing because the MongoDB driver version is outdated.


Thank you 👏

After spending hours test-driving migration libraries and searching for one that fits my needs, I finally stumbled across yours. It's clean, slick, amazing and simply the best there is. You cannot believe how much this project saved my day! Thank you very much 👏

TypeError: client.s.dbCache is not iterable

Describe the bug

When I call await db.close() at the end of my up(db) {...} function this error is thrown in the console:

/Users/coryrobinson/.nvm/versions/node/v10.13.0/lib/node_modules/migrate-mongo/node_modules/mongodb/lib/operations/close.js:19
      for (const item of client.s.dbCache) {
                                  ^

TypeError: client.s.dbCache is not iterable
    at completeClose (/Users/coryrobinson/.nvm/versions/node/v10.13.0/lib/node_modules/migrate-mongo/node_modules/mongodb/lib/operations/close.js:19:35)
    at client.topology.close.err (/Users/coryrobinson/.nvm/versions/node/v10.13.0/lib/node_modules/migrate-mongo/node_modules/mongodb/lib/operations/close.js:35:9)

Additional context

I'm using a fresh install of the package (npm i -g migrate-mongo), node 10.15.x, npm 6.9.x.

Missing documentation of optional options

Describe the bug
Currently the options are not mentioned in the docs but shown in the CLI.
See https://github.com/seppevs/migrate-mongo/blob/master/bin/migrate-mongo.js#L45

To Reproduce
Run migrate-mongo --help

Usage: migrate-mongo [options] [command]

Options:
  -V, --version                   output the version number
  -h, --help                      output usage information

Commands:
  init                            initialize a new migration project
  create [options] [description]  create a new database migration with the provided description
  up [options]                    run all pending database migrations
  down [options]                  undo the last applied database migration
  status [options]                print the changelog of the database

Expected behavior
The options should be mentioned / described in the docs.

Additional context

migrate-mongo --version
6.0.0

Error requiring migration file obfuscated by library's error handling

Describe the bug

The error handler for up tries to log the migrations that couldn't be run. This is done in /bin/migrate-mongo on L72.

program
  .command("up")
  .description("run all pending database migrations")
  .option("-f --file <file>", "use a custom config file")
  .action(options => {
    global.options = options;
    migrateMongo.database
      .connect()
      .then(db => migrateMongo.up(db))
      .then(migrated => {
        printMigrated(migrated);
        process.exit(0);
      })
      .catch(err => {
        printMigrated(err.migrated);
        handleError(err);
      });
  });

But if err.migrated is undefined, then a new error is thrown before the original error can be logged:

TypeError: Cannot read property 'forEach' of undefined
    at printMigrated (node_modules/migrate-mongo/bin/migrate-mongo.js:12:12)
    at database.connect.then.then.catch.err (node_modules/migrate-mongo/bin/migrate-mongo.js:74:9)
    at process._tickCallback (internal/process/next_tick.js:68:7)

Unfortunately, some errors are thrown that don't have the migrated property, so it's difficult to get visibility into them.

To Reproduce

  • write a migration that throws an error when required
    // requiring this file will error due to using await inside non-async fn
    module.exports = {
      up(db) {
        await somePromise
      },
    
      down(db) { }
    };
  • use migrate-mongo up

Other

Also if inserting the migration to the db collection fails, then you'll get an invalid error there as well:

try {
  await collection.insertOne({ fileName, appliedAt });
} catch (err) {
  throw new Error(`Could not update changelog: ${err.message}`);
}

config.js fallback name/path

I want to use migrate-mongo in my project.

In my project all the system configuration is managed with the very popular config package. It allows managing configuration per environment. Of course I also want to manage the migrate-mongo configuration there. In this case my migrate-mongo configuration file config.js will be:

'use strict';
const config = require('config');

module.exports = config.get('migrations');

  1. The config package looks for configuration files in the ./config directory.
  2. The migrate-mongo package expects a configuration file config.js in the current directory.

So:

  1. If I run migrate-mongo commands from my project's root directory, I will have to put the migrate-mongo configuration file config.js in my project's root directory (and config.js is too general a name to be there).

  2. If I put the migrate-mongo configuration file config.js in an inner directory (e.g. ./migrations/), then I have to run migrate-mongo commands from there, and then the config package will not find its configuration files, unless I put a copy of its config directory under the migrations directory, which is obviously wrong.

I would like it to try to use config.js from the current path (for backward compatibility) and, if there is no such file, to fall back to checking, for example, ./migrations/config.js, or even just another name (e.g. migrations_config.js). I would prefer the first option (./migrations/config.js), to keep my project's root directory as clean as possible.

forward console.log in console

Sometimes it can be useful to get some intermediate results when writing a migration script, for logging purposes in our migration runner (to be parsed by Elasticsearch, for example), or to catch some errors.

Is there any option to get the console.log output and thrown errors forwarded back?
