
migrate-mongo's Issues

Config option to remove TODO and Example in migration file?

Is there a way to remove the comments (the TODO and Example) in every single migration?

module.exports = {
  async up(db, client) {
    // TODO write your migration here.  <=== Getting rid of this part
    // See https://github.com/seppevs/migrate-mongo/#creating-a-new-migration-script
    // Example:
    // await db.collection('albums').updateOne({artist: 'The Beatles'}, {$set: {blacklisted: true}});
  },

  async down(db, client) {
    // TODO write the statements to rollback your migration (if possible)  <=== Getting rid of this part
    // Example:
    // await db.collection('albums').updateOne({artist: 'The Beatles'}, {$set: {blacklisted: false}});
  }
};
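
As far as I know there is no built-in config option for this. One workaround is a small post-create script that strips the comment lines from the freshly generated migration file. This is only a sketch; the file name strip-template-comments.js and the filter regex are my own, not part of migrate-mongo:

// strip-template-comments.js -- hypothetical helper, run right after `migrate-mongo create <name>`
const fs = require('fs');
const path = require('path');

const migrationsDir = path.join(__dirname, 'migrations');

// pick the most recently modified migration file
const newest = fs.readdirSync(migrationsDir)
  .filter(name => name.endsWith('.js'))
  .map(name => path.join(migrationsDir, name))
  .sort((a, b) => fs.statSync(b).mtimeMs - fs.statSync(a).mtimeMs)[0];

// drop the TODO / Example / commented-out sample lines, keep everything else
const cleaned = fs.readFileSync(newest, 'utf8')
  .split('\n')
  .filter(line => !/^\s*\/\/\s*(TODO|Example|See |await db\.collection)/.test(line))
  .join('\n');

fs.writeFileSync(newest, cleaned);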

Just a question

When executing the migration below, I'm getting the following message. That seems strange since I'm using insertMany as recommended. Do you see something wrong in my script? The collection does not exist before this command.

up(db, next) {
  db.collection('physical-type')
    .insertMany([
      { physicalTypeCode: 10, minLimit: -3.4E+38, maxLimit: 3.4E+38, dbDataUnit: 'Seconds', supportDifferential: true },
      { physicalTypeCode: 12, minLimit: -3.4E+38, maxLimit: 3.4E+38, dbDataUnit: '°C', supportDifferential: true }
    ]);
  next();
},

DeprecationWarning: collection.insert is deprecated. Use insertOne, insertMany or bulkWrite instead.
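
For reference, the snippet above calls next() before insertMany has resolved. A version that only signals completion once the insert finishes (a sketch: it drops the next callback and returns the promise instead, which migrate-mongo also accepts) would be:

up(db) {
  // returning the promise lets migrate-mongo wait for the insert to complete
  return db.collection('physical-type').insertMany([
    { physicalTypeCode: 10, minLimit: -3.4E+38, maxLimit: 3.4E+38, dbDataUnit: 'Seconds', supportDifferential: true },
    { physicalTypeCode: 12, minLimit: -3.4E+38, maxLimit: 3.4E+38, dbDataUnit: '°C', supportDifferential: true }
  ]);
},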

Thanks

Flag -f doesn't work for the create command

The -f flag doesn't work for the create command, so it throws an error when it is passed.

I think it might be more consistent to accept this flag, but ignore the config file content if it is not needed.

Usage: I defined an npm script that aliases migrate-mongo to migrate-mongo -f migrate-mongo-config.js, so I want to use it for all commands. However, it crashes for the create command because the -f flag doesn't work there.

TypeError when no migration is left

I want to run migrate-mongo up on every deployment, but I found that when there are no more scripts left to run, it blows up:

/usr/local/opt/backend/node_modules/mongodb/lib/utils.js:132
      throw err;
      ^

TypeError: undefined is not a function
    at Function.apply (/usr/local/opt/backend/node_modules/shifting/index.js:93:11)

I also get the same TypeError after applying a migration.

Issue probably with the connection or connection pool

ISSUE DESCRIPTION:
When we run the migration the first time, it works fine. But when running it a second time, after having deleted the database, I get this error:

(node:3403) UnhandledPromiseRejectionWarning: MongoError: a collection 'test.physical-type' already exists.

REPRODUCTION:
1- git clone this repository https://github.com/stherrienaspnet/migrate-mongo-issue.git
2- enter command: 'npm run deploy'
3- check the test db, you should see the correct result
4- delete the test db using compass or Robo 3T
5- enter command: 'npm run deploy', you should see the error message.

ADDITIONAL INFO:
I added the file deployme.js, which uses only the native MongoDB driver to run the same script; deleting the test db and re-running it several times works perfectly there.

Hope you will find the root cause, because I would prefer to use your migration project instead of creating my own from scratch.

Many thanks

TypeError: client.s.dbCache is not iterable

Describe the bug

When I call await db.close() at the end of my up(db) {...} function this error is thrown in the console:

/Users/coryrobinson/.nvm/versions/node/v10.13.0/lib/node_modules/migrate-mongo/node_modules/mongodb/lib/operations/close.js:19
      for (const item of client.s.dbCache) {
                                  ^

TypeError: client.s.dbCache is not iterable
    at completeClose (/Users/coryrobinson/.nvm/versions/node/v10.13.0/lib/node_modules/migrate-mongo/node_modules/mongodb/lib/operations/close.js:19:35)
    at client.topology.close.err (/Users/coryrobinson/.nvm/versions/node/v10.13.0/lib/node_modules/migrate-mongo/node_modules/mongodb/lib/operations/close.js:35:9)

Additional context

I'm using a fresh install of the package (npm i -g migrate-mongo), Node 10.15.x, npm 6.9.x.

Unexpected token { because ES10 syntax in fs-extra

I am currently using Node 8.11.3, and I am facing an unexpected token { issue when I use migrate-mongo init.

How to reproduce
node 8.11.3 and migrate-mongo init

node@b531b9ebb1eb:/opt/reaction/src$ npm run migrate

> [email protected] migrate /opt/reaction/src
> migrate-mongo init

/opt/reaction/src/node_modules/migrate-mongo/node_modules/fs-extra/lib/mkdirs/make-dir.js:86
      } catch {
              ^

SyntaxError: Unexpected token {
    at createScript (vm.js:80:10)
    at Object.runInThisContext (vm.js:139:10)
    at Module._compile (module.js:616:28)
    at Object.Module._extensions..js (module.js:663:10)
    at Module.load (module.js:565:32)
    at tryModuleLoad (module.js:505:12)
    at Function.Module._load (module.js:497:3)
    at Module.require (module.js:596:17)
    at require (internal/module.js:11:18)
    at Object.<anonymous> (/opt/reaction/src/node_modules/migrate-mongo/node_modules/fs-extra/lib/mkdirs/index.js:3:44)

fs-extra seems to use optional catch binding, which is only supported in ES2019 (ES10), i.e. Node 10+.

be able to organize migrations scripts in sub-level directory structure

Presently migrate-mongo reads only the files directly under the "migrationsDir" config value. It would be helpful to be able to organize migration files in subdirectories, for example by version, with the following layout:

migrations/
   |____ v1.0.0
   |        |____ 213233-script.js
   |____ v1.1.0
            |____ ... other scripts

Migrations don't apply to all the documents of a collection when using nested queries

Describe the bug
Migrations don't apply to all the documents of a collection when using nested queries. The example code is shown below.

To Reproduce

module.exports = {
  up(db) {
    return db.collection('users').find().snapshot().forEach(
      function (e) {
        // update document, using its own properties
        db.collection('views').find({ user: e._id, verified: true }).snapshot().toArray(function (err, views) {
          if(err) {
            e.viewCount = 0;
            delete e.views;          
            // save the updated document
            db.collection('users').save(e);
          } else {
            e.viewCount = views.length;
            delete e.views;
            console.log(e);
            // save the updated document
            db.collection('users').save(e, function(err){
              if(err){
                console.log(err);
              }
            });
          }
        });
      }
    )
  },

  down(db) {
    return db.collection('users').find().snapshot().forEach(
      function (e) {
        // update document, using its own properties
        e.views = 0;
        delete e.viewCount;
        // save the updated document
        db.collection('users').save(e);
      }
    )
  }
};


Expected behavior
The function should be applied to all the documents; instead it is randomly applied to only some of them.
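
For comparison, a sketch of the same migration written with async/await so that every document is processed before up() resolves. Collection and field names are taken from the report; replacing save() with an explicit updateOne is my own assumption:

module.exports = {
  async up(db) {
    // load the users first so the cursor isn't consumed while updates are still in flight
    const users = await db.collection('users').find().toArray();

    for (const user of users) {
      const views = await db.collection('views')
        .find({ user: user._id, verified: true })
        .toArray();

      // set the computed count and drop the old field, one awaited update per document
      await db.collection('users').updateOne(
        { _id: user._id },
        { $set: { viewCount: views.length }, $unset: { views: '' } }
      );
    }
  },

  async down(db) {
    // restore the old shape for every document
    await db.collection('users').updateMany(
      {},
      { $set: { views: 0 }, $unset: { viewCount: '' } }
    );
  }
};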

Allow passing of extra arguments

Is your feature request related to a problem? Please describe.
I really like the API of this library over other migration frameworks that support Mongoose, but I would still like to use the Mongoose library in certain situations. Furthermore, there are other operations where it would be helpful if custom arguments (utilities, metadata, etc.) could be passed as an extra argument to down/up scripts.

Describe the solution you'd like
I would like to include a user-defined extra argument in the up/down functions.

Describe alternatives you've considered
Possible solution:

module.exports = {
	up(db, extra) {
		const { mongoose, utils, meta } = extra;
		utils.logger('Updating The Beatle\'s blacklist value');
		return mongoose.Albums.updateMany({
			artist: 'The Beatles',
		}, {
			$set: {
				blacklisted: meta.defaultBlacklistValue,
			},
		});
	},
	...

This extra argument could be defined in the config file when using the CLI or passed as an extra argument when using the Node API:

// migrate-mongo-config.js
module.exports = {
	...

	extra: {
		mongoose: mongoose.connect('mongodb://localhost/music'),
		utils: {
			logger: (...msgs) => console.log(...msgs)
		},
		meta: {
			defaultBlacklistValue: false,
		}
	}
};

// node API
const mm = require('migrate-mongo');

...

mm.up(db, extra)

Additional context
n/a

Support of repeatable migrations, that are always executed, if checksum is changed

Besides standard migrations, we have some data in the database that serves as a complex configuration. We do not want to track changes to this configuration as a sequence of migrations in our repository. At the moment we delete and re-create this data during every deployment.

As a user of migrate-mongo, I would like to add repeatable migrations that are executed as soon as their checksum has changed, regardless of the database version, so that the actual state of this data is reflected in both the database and the repository.
The execution of these baseline scripts should be ordered by filename.

Describe alternatives you've considered
Using a wrapper around migrate-mongo that executes the baseline scripts first and then calls the migrate-mongo API, but for that I would need to duplicate the db configuration.

Additional context
There is similar functionality in Flyway: https://flywaydb.org/documentation/migrations#repeatable-migrations
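
A sketch of how a wrapper could implement this today, outside of migrate-mongo: hash each repeatable script, store the hash in a dedicated collection, and re-run the script only when the hash changes. The collection name repeatable_changelog and the directory repeatable/ are assumptions, not existing conventions:

const crypto = require('crypto');
const fs = require('fs');
const path = require('path');

async function runRepeatableMigrations(db) {
  const dir = path.join(__dirname, 'repeatable');
  const changelog = db.collection('repeatable_changelog');

  // ordered by filename, as requested above
  for (const fileName of fs.readdirSync(dir).sort()) {
    const filePath = path.join(dir, fileName);
    const checksum = crypto.createHash('sha256').update(fs.readFileSync(filePath)).digest('hex');

    const previous = await changelog.findOne({ fileName });
    if (previous && previous.checksum === checksum) continue; // unchanged, skip

    await require(filePath).up(db);
    await changelog.updateOne(
      { fileName },
      { $set: { checksum, appliedAt: new Date() } },
      { upsert: true }
    );
  }
}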

`create` command misses --file option

The create command requires the default configuration file to be present in the current project root directory. Does it really need it? How do I specify a custom config file for create? It's missing the -f option.

--dry-run option

Is your feature request related to a problem? Please describe.
When writing up and down functions I have to run migrate-mongo up and then migrate-mongo down in order to test my implementation.

Describe the solution you'd like
It would be nice if there was an option migrate-mongo up --dry-run that would let me test my code but wouldn't actually apply the migration.

Setting order/dependencies of migrations

I use migrate mongo both for migrations and for seeding test data into mongo. Some seeds/migrations rely on the fact that other seeds/migrations have been applied beforehand.

So my question/request is whether there is some built-in way that I haven't considered, or whether a seed/migration can be made to rely/depend on another seed/migration being executed before it runs.

This would of course work in environments where we only add new migrations, but we quite often run this locally to clean/set up the environment, and then all migrations are new/pending.

I know that for example django (on the python stack) has this built in to their migrations framework that otherwise work very similar to this project.

Thanks! // Johan

Feature to store the _id's of documents affected during a migration

I'm willing to put effort into a PR for this feature.

Currently there isn't a way to track the affected documents:

module.exports = {

    up(db, next) {
        return db.collection('users').updateMany({ status: 'active' }, { $set: { status: 'inactive' } }, next);
    },

    down(db, next) {
        // TODO write the statements to rollback your migration (if possible)
        // It's not possible to query for only the documents affected in the 'up' command
        next();
    }

};

This could be accomplished by saving an array of document _id's affected and passing it to the down() method:

module.exports = {

    up(db, saveAffectedIdList, next) {
        const query = { status: 'active' };

        db.collection('users')
            .find(query, { projection: { _id: 1 } })
            .toArray()
            .then(docs => docs.map(doc => doc._id))
            .then(affectedIdList =>
                db.collection('users')
                    .updateMany(query, { $set: { status: 'inactive' } })
                    // adds the list to the object that will be inserted into the migrations collection
                    .then(() => saveAffectedIdList(affectedIdList))
            )
            .then(() => next())
            .catch(next);
    },

    down(db, idList, next) {
        return db.collection('users').updateMany(
            { _id: { $in: idList } },
            { $set: { status: 'active' } },
            next
        );
    }

};

forward console.log in console

Sometimes it can be useful to get intermediate results while writing a migration script: for logging purposes in our migration runner (to be parsed by Elasticsearch, for example), or to catch some errors.

Is there any option to get console.log output and thrown errors forwarded back?

Forced to pass the databaseName

Describe the bug
I have multiple environments, and I shouldn't have to select a database manually, because I put my database name in the connection string, like so:

mongodb://localhost:27017/my_db

So I tried removing the warning, and it works, thanks to the MongoDB client:

/**
 …
 * @param {string} [dbName] The name of the database we want to use. If not provided, use database name from connection string.
…
 */
MongoClient.prototype.db = function(dbName, options) {
…

So the database name shouldn't be required.

Expected behavior
I should be able to use a connection string to select the database, without the CLI popping up an error and stopping execution.
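
A sketch of how the connection could be built so the database is taken from the connection string, relying on the driver behaviour quoted above (this is not the library's current code):

const { MongoClient } = require('mongodb');

async function connect(url) {
  const client = await MongoClient.connect(url, { useNewUrlParser: true });
  // with no explicit name, db() falls back to the database in the connection string
  const db = client.db();
  return { db, client };
}

// usage: connect('mongodb://localhost:27017/my_db')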

Error requiring migration file obfuscated by library's error handling

Describe the bug

The error handler for up tries to log the migrations that couldn't be run. This is done in /bin/migrate-mongo on L72.

program
  .command("up")
  .description("run all pending database migrations")
  .option("-f --file <file>", "use a custom config file")
  .action(options => {
    global.options = options;
    migrateMongo.database
      .connect()
      .then(db => migrateMongo.up(db))
      .then(migrated => {
        printMigrated(migrated);
        process.exit(0);
      })
      .catch(err => {
        printMigrated(err.migrated);
        handleError(err);
      });
  });

But if err.migrated is undefined, then a new error is thrown before the original error can be logged:

TypeError: Cannot read property 'forEach' of undefined
    at printMigrated (node_modules/migrate-mongo/bin/migrate-mongo.js:12:12)
    at database.connect.then.then.catch.err (node_modules/migrate-mongo/bin/migrate-mongo.js:74:9)
    at process._tickCallback (internal/process/next_tick.js:68:7)

Unfortunately, some errors are thrown that don't have the migrated property, so it's difficult to get visibility into them.
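
A defensive version of printMigrated (just a sketch, not the maintainer's actual code) would default to an empty list so the original error still reaches handleError:

// default to an empty array so a missing err.migrated doesn't mask the real error
function printMigrated(migrated = []) {
  migrated.forEach(migratedItem => {
    console.log(`MIGRATED UP: ${migratedItem}`); // log format illustrative
  });
}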

To Reproduce

  • write a migration that throws an error when required
    // requiring this file will error due to using await inside non-async fn
    module.exports = {
      up(db) {
        await somePromise
      }
    
      down(db) { }
    };
  • use migrate-mongo up

Other

Also, if inserting the migration record into the changelog collection fails, you'll get a similarly masked error there as well:

try {
  await collection.insertOne({ fileName, appliedAt });
} catch (err) {
  throw new Error(`Could not update changelog: ${err.message}`);
}

working with objectId

We used to use a simple Node.js script using the raw MongoDB driver for our migrations. Something like:

const MongoClient = require('mongodb').MongoClient;
const assert = require('assert');
const USERS= require('./models/entities').USERS;
const ObjectId = require('bson').ObjectId;
require('dotenv').config();

const client = new MongoClient(mongoDbConnectionString);
client.connect(async function(err) {
  assert.equal(null, err);
  console.log("Connected successfully to server");

  const db = client.db(MONGODB_DATABASE);

  for (const user of USERS) {
    const result = await db.collection(user).find({}, {_id: 1, credentialsId: 1 }).snapshot().toArray();

    for (entity of result) {
      const update = await db.collection('credentials').updateOne({ _id: ObjectId('123456789abc') }, { $set: { blacklisted: true } });
    }
  }

  client.close();
});

It works well with no problems. Due to our recent needs, we decided to use a proper migration tool like migrate-mongo, with up and down flows.

Here is our script, transcribed as a migration:

const ObjectId = require('bson').ObjectId;
const USERS= require('../models/entities').USERS;

module.exports = {
  async up(db) {
    for (const user of USERS) {
      const result = await db.collection(user).find({}, {_id: 1, credentialsId: 1 }).snapshot().toArray();
  
      for (entity of result) {
        const update = await db.collection('credentials').updateOne({ _id: ObjectId('123456789abc') }, { $set: { blacklisted: true } });
        console.log(update.result);
      }
    }
  },

  down(db) { }
};

Unfortunately, this doesn't work. No error is reported in the console or thrown by migrate-mongo, but the database is not updated at all.

Could this come from ObjectId? We tried defining it as a plain string, with no more success.

Ability to change the migrations dir

If I specify a config file (using -f), I should be able to specify in there what directory my migrations are in.

I see that it is currently hardcoded to migrations in ./env/migrationsDir.

Error: final argument to `executeOperation` must be a callback

This is the migration I wrote:

'use strict';

module.exports = {

  up(db, next) {
    db.collection('projects')
    .updateMany({}, { $set: { tags: [] }}, false, true)
    .then(next);
  },

  down(db, next) {
    db.collection('projects')
    .updateMany({}, { $unset: { tags: [] }}, false, true)
    .then(next);
  }

};

Very simple and I am following the directions exactly. But when I run the up command, I keep getting an error that says:
ERROR: Could not migrate up 20180406191421-add_project_tags.js: final argument to 'executeOperation' must be a callback

What is going wrong here? Any help is appreciated. Thank you.
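
For context (a guess, not an official answer): false and true look like the mongo shell's upsert/multi flags. The Node driver's updateMany signature is (filter, update, options, callback), so the trailing true ends up being treated as the callback, which is not a function. A version without the extra positional arguments would look like:

up(db, next) {
  db.collection('projects')
    .updateMany({}, { $set: { tags: [] } })
    .then(() => next())
    .catch(next);
},

down(db, next) {
  db.collection('projects')
    .updateMany({}, { $unset: { tags: '' } })
    .then(() => next())
    .catch(next);
}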

Inconsistent issue in pushing values to empty arrays

Describe the bug
Sometimes, when running multiple $push updates on an empty array, the first $push statement is missed/skipped, resulting in a single element missing from the array field.

To Reproduce
Let's say there is a mongo collection called "sample" with an empty array field like this:

{_id: ObjectId('000000000000000000'), name: 'sample1', an_array_field: []}

Now, I want to make a migration to add some elements

db.collection('sample').update({name: 'sample1'}, {$push: {an_array_field: "test val 1"}}, next)
db.collection('sample').update({name: 'sample1'}, {$push: {an_array_field: "test val 2"}}, next)
db.collection('sample').update({name: 'sample1'}, {$push: {an_array_field: "test val 3"}}, next)

Expected behavior
After running this migration, I would expect the document to look like this:

{_id: ObjectId('000000000000000000'), name: 'sample1', an_array_field: ["test val 1", "test val 2", "test val 3"]}

However, what sometimes (inconsistently) happens is this:

{_id: ObjectId('000000000000000000'), name: 'sample1', an_array_field: ["test val 2", "test val 3"]}

Notice how the first value is missing.

Additional context
This only happens when the updated field is an empty array, and it does not always happen. As an example of the inconsistent behavior, the same script worked as expected locally and on a dev server, but failed to include the first value in the production environment. Same migration script, same versions of all dependencies, even the same Docker container; the only differences were the configured mongo database host and the time it was run.

Furthermore, we have had the same script work correctly the first time, but skip the first element the second time after rolling back and running again, and then work the third time after rolling back and running again.
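
For what it's worth, a sketch of a deterministic alternative: the three updates above run concurrently and share the same next callback, so a single atomic $push with $each (or awaiting each update in sequence) sidesteps the race entirely. This does not explain the missing element, it only avoids the pattern:

async up(db) {
  // one atomic update instead of three concurrent ones sharing the same callback
  await db.collection('sample').updateOne(
    { name: 'sample1' },
    { $push: { an_array_field: { $each: ['test val 1', 'test val 2', 'test val 3'] } } }
  );
},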

Add Programmatic API

This migration package looks great (and kudos for the 100% test coverage!).

Would you consider adding a programmatic API? For instance, something like the following:

const Migrate = require('migrate-mongo');
const migrationConfig = require('path/to/config.js');
const migrator = new Migrate(migrationConfig);

migrator.up().then(/* etc */)
migrator.down().then(/* etc */)

config.js fallback name/path

I want to use migrate-mongo in my project.

In my project, all the system configuration is managed with the very popular config package, which allows managing configuration per environment. Of course, I want to manage the migrate-mongo configuration there as well. In this case my migrate-mongo configuration file config.js would be:

'use strict';
const config = require('config');

module.exports = config.get('migrations');

  1. The config package looks for configuration files in the ./config directory.
  2. The migrate-mongo package expects the configuration file config.js in the current directory.

So:

  1. If I run migrate-mongo commands from my project's root directory, I will have to put the migrate-mongo configuration file config.js in my project's root directory (and config.js is too generic a name to live there).

  2. If I put the migrate-mongo configuration file config.js in an inner directory (e.g. ./migrations/), then I have to run migrate-mongo commands from there, and the config package will not find its configuration files unless I put a copy of its config directory under the migrations directory, which is obviously wrong.

I would like it to try config.js from the current path first (for backward compatibility) and, if there is no such file, fall back to checking for example ./migrations/config.js, or even just another name (e.g. migrations_config.js). I would prefer the first option (./migrations/config.js), to keep my project's root directory as clean as possible.

Version 4 upgrade

Hi. I've noticed that a new version has been published. However, there is no changelog. Could you please describe what's changed since 3.0.5? Is it safe to upgrade, or were there breaking changes?

Missing documentation of optional options

Describe the bug
Currently the per-command options are not mentioned in the docs, but they are shown in the CLI.
See https://github.com/seppevs/migrate-mongo/blob/master/bin/migrate-mongo.js#L45

To Reproduce
Run migrate-mongo --help

Usage: migrate-mongo [options] [command]

Options:
  -V, --version                   output the version number
  -h, --help                      output usage information

Commands:
  init                            initialize a new migration project
  create [options] [description]  create a new database migration with the provided description
  up [options]                    run all pending database migrations
  down [options]                  undo the last applied database migration
  status [options]                print the changelog of the database

Expected behavior
The options should be mentioned / described in the docs.

Additional context

migrate-mongo --version
6.0.0

Update mongodb version

Describe the bug
Update the mongodb driver dependency from 3.1.x to 4.0.x. Many features are missing because the bundled MongoDB driver version is outdated.


SyntaxError: Unexpected token import

I cannot use import syntax in the config file:

/config.js:1
(function (exports, require, module, __filename, __dirname) { import { mongo } from '../src/config'
                                                               ^^^^^^

SyntaxError: Unexpected token import
    at createScript (vm.js:80:10)
    at Object.runInThisContext (vm.js:139:10)
    at Module._compile (module.js:599:28)
    at Object.Module._extensions..js (module.js:646:10)
    at Module.load (module.js:554:32)
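
The config file is loaded with require(), so it has to be plain CommonJS unless a transpiler is involved. A CommonJS equivalent of the import above would be, roughly (the shape of ../src/config is assumed here):

// config.js -- CommonJS form of the same import
const { mongo } = require('../src/config');

module.exports = {
  mongodb: {
    url: mongo.url,                 // assumed shape of ../src/config; adjust to the real export
    databaseName: mongo.databaseName,
    options: { useNewUrlParser: true }
  },
  migrationsDir: 'migrations',
  changelogCollectionName: 'changelog'
};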

Ability to set migration directories or use patterns

When setting up a project with many different models, it is nice to have migrations next to each model.

Inside the configuration, it would be great if I could use an array or a regex to map to multiple migration directories.

The alternative approach here would be to simply have one giant migrations folder.

I'm attempting to use this inside of NestJS, which is built around dependency injection. It looks as though I am going to end up writing migrations in a procedural way, without dependency injection.

Do not allow to migrate downwards all the ran migrations.

It's a great package and we are using it in production, but I found an issue. Currently, if you run migrate-mongo down even days later, it downgrades the previously run migrations.

How I thought it would work was as follows:

migrate-mongo up runs the new migrations and keeps track of which migrations it just ran. Then, if I do migrate-mongo down, it only downgrades that most recent batch.

Later, if I do migrate-mongo down again (in my case I did so accidentally), it should not downgrade the next available migration; instead it should show the message "No migrations ran".

What I'm saying is that this package should remember the most recently run migrations and only downgrade those. This way, even running migrate-mongo down after days won't create data discrepancies.

If you think this should be fixed I would love to work on this 😛

Add support for multiple databases

It would be great if this library supported multiple mongo URLs in a single configuration file, and ran the migrations against all of the provided databases.

default to more specific config path

First, thank you for such a helpful project!

This was pointed out in #3 and #17, but I think it's pretty important: config.js is a very generic name.

The project I'm including this package in has 10 third-party config files in the root directory and that's not including internal ones. It doesn't make sense that config.js is reserved for just a single (though important!) portion of the application.

I realize we can use the --file flag, but it would be far more convenient if migrate-mongo would first look for a more specific config (e.g. migrations_config.js, mm-config.js, .migrationrc.js, etc.).

I suspect this would be as simple as changing how DEFAULT_CONFIG_FILE_NAME is checked, and would easily be made backward compatible as long as the new name is pretty unique. I'm also happy to contribute if you're cool with this approach.

Wrong readme for connect method

According to the API docs, database.connect() returns a promise resolving to a MongoDB db object that can be passed directly to the up, down, and status functions. It actually returns a promise resolving to an object with two keys, db (the Db object) and client (the MongoClient), and these need to be passed as separate params to the above functions.
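
In other words, the actual usage looks roughly like this (a sketch based on the behaviour described above):

const { database, up } = require('migrate-mongo');

async function run() {
  // connect() resolves to { db, client }, not to a bare db object
  const { db, client } = await database.connect();
  const migrated = await up(db, client);
  migrated.forEach(fileName => console.log('Migrated:', fileName));
  await client.close();
}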

Transactions with migrate-mongo

Is your feature request related to a problem? Please describe.
It seems that at the moment it is not possible to use multi-document transactions with migrate-mongo.

The up and down functions accept a single parameter, db, which is an instance of the Db class. But to start a transaction, we need to call the startSession method on the MongoClient class.

Describe the solution you'd like

Not sure what would be the best solution, but maybe up and down methods could receive two parameters: (db, client)

Describe alternatives you've considered
There does not seem to be a comprehensive alternative solution.
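
With a (db, client) signature, a migration using a transaction could look like this. This is only a sketch, assuming MongoDB 4.0+ on a replica set and a driver that supports withTransaction; the collections and fields are made up for illustration:

module.exports = {
  async up(db, client) {
    const session = client.startSession();
    try {
      await session.withTransaction(async () => {
        // both writes commit or roll back together
        await db.collection('albums').updateOne(
          { artist: 'The Beatles' },
          { $set: { blacklisted: true } },
          { session }
        );
        await db.collection('artists').updateOne(
          { name: 'The Beatles' },
          { $set: { flagged: true } },
          { session }
        );
      });
    } finally {
      await session.endSession();
    }
  },

  async down(db, client) {
    // symmetric rollback, omitted here
  }
};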

9 high vulnerabilities

Thanks for this module!
However, npm reports 9 high-severity vulnerabilities after installation. Please fix them if possible.

Update dependencies

Hi Sebastian, thanks for your work. Please upgrade the mongodb dependency to version 3.1.1; version 3.0.7 doesn't support MongoDB 4.0.0.

Can't connect to mongodb

I'm having a very hard time connecting to Mongo. It's running locally on my computer, and I've tried both localhost and 127.0.0.1 in the URL, but nothing is working; I keep getting a 'failed to connect to server' error.

Do I need something else to connect? Do I need mongoose somewhere? I appreciate any help that points me in the right direction. Thank you.

Allow to set config via API

In order to use the migrate-mongo API (not as an npm script), it should allow specifying the config through the API.
Right now it tries to read the config from the default file, and there's no way to change that behaviour.

So it would be good if it allowed reading the config from a file, or passing it as a JS object into the database function, for example. Or setting it some other way.

That's also important for Docker environments, where the npm package is often not even installed on the system (to reduce image size), so there's no option to run npm scripts.
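
A sketch of what such an API could look like (the config.set call here is hypothetical, illustrating the request rather than documented behaviour):

const { config, database, up } = require('migrate-mongo');

async function migrate() {
  // hypothetical: provide the configuration as a plain object instead of a config file on disk
  config.set({
    mongodb: {
      url: process.env.MONGO_URL,
      databaseName: process.env.MONGO_DB,
      options: { useNewUrlParser: true }
    },
    migrationsDir: 'migrations',
    changelogCollectionName: 'changelog'
  });

  const { db, client } = await database.connect();
  await up(db, client);
  await client.close();
}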

Thank you 👏

After spending hours test-driving migration libraries and searching for one that fits my needs, I finally stumbled across yours. It's clean, slick, amazing and simply the best there is. You cannot believe how much this project saved my day! Thank you very much 👏

BSON field 'update.updates.u' is the wrong type 'array', expected type 'object'

Describe the bug
I am trying to update my collection using a Mongo update query with an aggregation pipeline, which is supported by MongoDB 4.2 and higher. I tried to run the migration using this tool, but I got this error:

ERROR: Could not migrate down 20200504124656-remove_deleted_messsage.js: BSON field 'update.updates.u' is the wrong type 'array', expected type 'object'

Migration script

module.exports = {
  async up(db, client) {
    await db.collection('messages').updateMany(
      { is_deleted: true, archived: { $exists: false } },
      [
        { $set: { 'archived.content': '$content', 'archived.url': '$url' } },
        { $unset: ['content', 'url'] }
      ]
    );
  },

  async down(db, client) {

  }
};

To Reproduce

  • Create new migration with above script
  • Run migrate up command to reproduce the error

Expected behavior

  • As suggested in the MongoDB documentation, we can use an aggregation pipeline with an update query


Additional context
I am able to run the same update from the mongo shell, but it gives an error with the migration tool.

Can't access the mongodb module

In my migrations I need access to other Node.js MongoDB driver objects like ObjectId and DBRef.

I am trying to use:

var ObjectID = require('mongodb').ObjectID;
var DBRef = require('mongodb').DBRef;

but I get
Cannot find module 'mongodb'

How can I access these objects?
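
If the mongodb driver is added as a dependency of the project that holds the migrations (for example with npm install mongodb), the require calls resolve. A sketch of using it inside a migration (the collection names and ids are hypothetical):

const { ObjectID, DBRef } = require('mongodb'); // requires mongodb in the project's own dependencies

module.exports = {
  async up(db) {
    await db.collection('articles').updateOne(
      { _id: new ObjectID('5c8a7f2e9d1e8a0012345678') },   // hypothetical id
      { $set: { author: new DBRef('users', new ObjectID('5c8a7f2e9d1e8a0012345679')) } }
    );
  },

  async down(db) {}
};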

How to get the client connection?

It would be really interesting to use Mongo transactions in your migration module, so that a migration cannot fail half-way through.

Describe the solution you'd like
Besides getting the db connection as a param in the migration files' up(db) or down(db), it would be great to have the MongoClient connection too.

Is it possible to do this already? Or would you consider adding it?

Thank you

Working with Typescript ?

I want to use migrate-mongo with NestJS, which uses TypeScript, and I would like to ask whether it works with TS. Another thing: can I use a Mongo model in a migration?

Auth user+pass setting separate from URI

We would like to separate the connection URI from the database user and password while configuring migrate-mongo.

It would be appreciated if we could set the database address and the authentication parameters separately (not having to merge all info into a single string), for example in the options part of the config file, using user, pass... options.
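
A sketch of what this could look like in the config file, under the assumption that the options block is handed to MongoClient unchanged. The auth object is the driver's way of taking credentials separately from the URL; the exact keys differ between driver versions:

// migrate-mongo-config.js
module.exports = {
  mongodb: {
    url: 'mongodb://db.example.com:27017',   // address only, no credentials in the string
    databaseName: 'my_db',
    options: {
      useNewUrlParser: true,
      auth: {
        user: process.env.DB_USER,            // driver 3.x uses user/password; 4.x uses username/password
        password: process.env.DB_PASS
      }
    }
  },
  migrationsDir: 'migrations',
  changelogCollectionName: 'changelog'
};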
