
loopback-connector-elastic-search's Introduction

loopback-connector-elastic-search

Join the chat at https://gitter.im/strongloop-community/loopback-connector-elastic-search

Elasticsearch (versions 6.x and 7.x) datasource connector for LoopBack 3.x.

Table of Contents

Overview

  1. lib directory has the entire source code for this connector
    1. this is what gets downloaded to your node_modules folder when you run npm install loopback-connector-esv6 --save --save-exact
  2. examples directory has a loopback app which uses this connector
    1. this is not published to NPM, it is only here for demo purposes
      1. it will not be downloaded to your node_modules folder!
      2. similarly the examples/server/datasources.json file is there for this demo app to use
      3. you can copy their content over to <yourApp>/server/datasources.json or <yourApp>/server/datasources.<env>.js if you want and edit it there, but don't start editing the files inside examples/server itself and expect the changes to take effect in your app!
  3. test directory has unit tests
    1. it does not reuse the loopback app from the examples folder
    2. instead, loopback and ES/datasource are built and injected programmatically
    3. this directory is not published to NPM.
      1. Refer to .npmignore if you're still confused about what's part of the published connector and what's not.
  4. You will find the datasources.json files in this repo mention various configurations:
    1. elasticsearch-ssl
    2. elasticsearch-plain
    3. db
    4. You don't need them all! They are just examples showing the various ways you can configure a datasource. Delete the ones you don't need and keep the one you want. For example, most people will start off with elasticsearch-plain and then move on to configuring the additional properties exemplified in elasticsearch-ssl. You can mix & match if you'd like to have mongo, es and memory, all three! These are basics of the "connector" framework in LoopBack and not something we added.
  5. Don't forget to edit your model-config.json file and point the models at the dataSource you want to use.
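For example, a model-config.json that points different models at different datasources might contain (the model and datasource names here are illustrative):

```json
{
  "User": { "dataSource": "elasticsearch-plain" },
  "AccessToken": { "dataSource": "db" },
  "ACL": { "dataSource": "db" }
}
```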

Install this connector in your loopback app

cd <yourApp>
npm install loopback-connector-esv6 --save --save-exact

Configuring connector

Important Note

  • This connector will only connect to one index per datasource.
  • This package supports Elasticsearch v6.x and v7.x only.
  • A docType property is automatically added to the mapping properties; it is required to differentiate documents stored in the index by loopback model, and it stores the loopback modelName value. docType: { type: "keyword", index: true }

Required

  • name: name of the connector.
  • connector: Elasticsearch driver 'esv6'.
  • configuration: Elasticsearch client configuration object, which includes nodes, authentication and SSL configuration. Please refer to the official Elasticsearch client documentation for more information on configuration.
  • index: name of the Elasticsearch index, e.g. shakespeare.
  • version: the major version of the Elasticsearch nodes you will be connecting to. Supported versions: [6, 7], e.g. version: 7.
  • mappingType: mapping type for the provided index. Defaults to basedata. Required only for version: 6.
  • mappingProperties: an object with properties for the above-mentioned mappingType.

Optional

  • indexSettings: optional settings object used when creating the index.
  • defaultSize: search size limit. Defaults to 50.

Sample

1. Edit datasources.json and set (note: the // comments below are annotations only and must be removed from real JSON):

  "elastic-search-ssl": {
  "name": "elasticsearch-example-index-datasource",
  "connector": "esv6",
  "version": 7,
  "index": "example-index",
  "configuration": { // Elastic client configuration
    "node": "http://localhost:9200",
    "requestTimeout": 30000,
    "pingTimeout": 3000,
    "auth": {
      "username": "test",
      "password": "test"
    },
    "ssl": {
      "rejectUnauthorized": true
    }
  },
  "defaultSize": 50,
  "indexSettings": { // Elastic index settings
    "number_of_shards": 2,
    "number_of_replicas": 1
  },
  "mappingType": "basedata", // not required for version: 7, will be ignored
  "mappingProperties": {
    "docType": {
      "type": "keyword",
      "index": true
    },
    "id": {
      "type": "keyword",
      "index": true
    },
    "seq": {
      "type": "integer",
      "index": true
    },
    "name": {
      "type": "keyword",
      "index": true
    },
    "email": {
      "type": "keyword",
      "index": true
    },
    "birthday": {
      "type": "date",
      "index": true
    },
    "role": {
      "type": "keyword",
      "index": true
    },
    "order": {
      "type": "integer",
      "index": true
    },
    "vip": {
      "type": "boolean",
      "index": true
    },
    "objectId": {
      "type": "keyword",
      "index": true
    },
    "ttl": {
      "type": "integer",
      "index": true
    },
    "created": {
      "type": "date",
      "index": true
    }
  }
}

2. You can peek at examples/server/datasources.json for more hints.
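If you prefer environment-specific configuration, a datasources.local.js along these lines can build the same settings from environment variables. This is a sketch; the ES_* variable names and their defaults are illustrative, not something the connector defines:

```javascript
// datasources.local.js -- a sketch; the ES_* env var names are illustrative
const config = {
  'elastic-search-ssl': {
    name: 'elasticsearch-example-index-datasource',
    connector: 'esv6',
    version: 7,
    index: process.env.ES_INDEX || 'example-index',
    configuration: {
      node: process.env.ES_NODE || 'http://localhost:9200',
      requestTimeout: 30000,
      pingTimeout: 3000,
      auth: {
        username: process.env.ES_USER || 'test',
        password: process.env.ES_PASS || 'test'
      }
    },
    defaultSize: 50
  }
};

module.exports = config;
```

LoopBack 3 merges datasources.local.js over datasources.json, so only the datasource you want to override needs to appear here.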

Elasticsearch SearchAfter Support

  • The search_after feature of Elasticsearch is supported in the loopback filter.
  • For this, you need to create a property in the model called _search_after with loopback type ["any"]. This field cannot be updated through the connector.
  • The Elasticsearch sort value is returned in this field.
  • Pass the _search_after value in the searchafter key of the loopback filter.
  • Example filter query for find.
{
  "where": {
    "username": "hello"
  },
  "order": "created DESC",
  "searchafter": [
    1580902552905
  ],
  "limit": 4
}
  • Example result.
[
  {
    "id": "1bb2dd63-c7b9-588e-a942-15ca4f891a80",
    "username": "test",
    "_search_after": [
      1580902552905
    ],
    "created": "2020-02-05T11:35:52.905Z"
  },
  {
    "id": "fd5ea4df-f159-5816-9104-22147f2a740f",
    "username": "test3",
    "_search_after": [
      1580902552901
    ],
    "created": "2020-02-05T11:35:52.901Z"
  },
  {
    "id": "047c0adb-13ea-5f80-a772-3d2a4691d47a",
    "username": "test4",
    "_search_after": [
      1580902552897
    ],
    "created": "2020-02-05T11:35:52.897Z"
  }
]
  • This is useful for pagination. To go to the previous page, reverse the sort order.
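The bullets above can be sketched as a tiny helper that derives the next-page filter from the previous page's results. This is a sketch; nextPageFilter is a hypothetical name (not part of the connector), and the result shape follows the examples above:

```javascript
// Build the loopback filter for the next page from the last result's
// _search_after value. Returns null when there are no results to page from.
function nextPageFilter(baseFilter, results) {
  if (!results || results.length === 0) return null; // no further pages
  const last = results[results.length - 1];
  return Object.assign({}, baseFilter, { searchafter: last._search_after });
}

// Usage sketch with the example data above:
const baseFilter = { order: 'created DESC', limit: 4 };
const page1 = [
  { id: '1bb2dd63-c7b9-588e-a942-15ca4f891a80', _search_after: [1580902552905] },
  { id: 'fd5ea4df-f159-5816-9104-22147f2a740f', _search_after: [1580902552901] }
];
const filter2 = nextPageFilter(baseFilter, page1);
// pass filter2 to Model.find(...) to fetch the next page
```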

TotalCount Support for search

  • The total value from Elasticsearch for search queries is now supported in the loopback response.
  • For this, you need to create a property in the model called _total_count with loopback type "number". This field cannot be updated through the connector.
  • Example response for find.
[
  {
    "id": "1bb2dd63-c7b9-588e-a942-15ca4f891a80",
    "username": "test",
    "_search_after": [
      1580902552905
    ],
    "_total_count": 3,
    "created": "2020-02-05T11:35:52.905Z"
  },
  {
    "id": "fd5ea4df-f159-5816-9104-22147f2a740f",
    "username": "test3",
    "_search_after": [
      1580902552901
    ],
    "_total_count": 3,
    "created": "2020-02-05T11:35:52.901Z"
  },
  {
    "id": "047c0adb-13ea-5f80-a772-3d2a4691d47a",
    "username": "test4",
    "_search_after": [
      1580902552897
    ],
    "_total_count": 3,
    "created": "2020-02-05T11:35:52.897Z"
  }
]

About the example app

  1. The examples directory contains a loopback app which uses this connector.
  2. You can point this example at your own elasticsearch instance or use the quick instances provided via docker.

Troubleshooting

  1. Do you have both elasticsearch-ssl and elasticsearch-plain in your datasources.json file? You just need one of them (not both), based on how you've set up your ES instance.
  2. Did you forget to set model-config.json to point at the datasource you configured? Maybe you are using a different or misspelled name than what you thought you had!
  3. Make sure to configure the major version of Elasticsearch in version.
  4. Maybe the version of ES you are using isn't supported by the client that this project uses. Try removing the elasticsearch sub-dependency from <yourApp>/node_modules/loopback-connector-esv6/node_modules folder and then install the latest client:
    1. cd <yourApp>/node_modules/loopback-connector-esv6/node_modules
    2. then remove es6 && es7 folder
      1. unix/mac quickie: rm -rf es6 es7
    3. npm install
    4. go back to yourApp's root directory
      1. unix/mac quickie: cd <yourApp>
    5. And test that you can now use the connector without any issues!
    6. These changes can easily get washed away for several reasons. So for a more permanent fix that adds the version you want to work on into a release of this connector, please look into Contributing.

Contributing

  1. Feel free to contribute via PR or open an issue for discussion or jump into the gitter chat room if you have ideas.
  2. I recommend that project contributors who are part of the team:
    1. should merge master into develop ... if they are behind, before starting the feature branch
    2. should create feature branches from the develop branch
    3. should merge feature into develop then create a release branch to:
      1. update the changelog
      2. close related issues and mention release version
      3. update the readme
      4. fix any bugs from final testing
      5. commit locally and run npm-release x.x.x -m "<some comment>"
      6. merge release into both master and develop
      7. push master and develop to GitHub
  3. For those who use forks:
    1. please submit your PR against the develop branch, if possible
    2. if you must submit your PR against the master branch ... I understand and I can't stop you. I only hope that there is a good reason like develop not being up-to-date with master for the work you want to build upon.
  4. npm-release <versionNumber> -m <commit message> may be used to publish. Publishing to NPM should happen from the master branch. It should ideally only happen when there is something release-worthy. There's no point in publishing just because of changes to the test or examples folders, or any other entities that aren't part of the "published module" (refer to .npmignore) to begin with.

FAQs

  1. How do we enable or disable the logs coming from the underlying elasticsearch client? There may be a need to debug/troubleshoot at times.
    1. Use the env variable DEBUG=elasticsearch for Elasticsearch client logs.
  2. How do we enable or disable the logs coming from this connector?
    1. By default if you do not set the following env variable, they are disabled: DEBUG=loopback:connector:elasticsearch
  3. What are the tests about? Can you provide a brief overview?
    1. Tests are prefixed with 01 or 02 etc. in order to run them in that order by leveraging default alphabetical sorting.
    2. The 02.basic-querying.test.js file uses two models to test various CRUD operations that any connector must provide, like find(), findById(), findByIds(), updateAttributes() etc.
      1. the two models are User and Customer
      2. their ES mappings are laid out in test/resource/datasource-test.json
      3. their loopback definitions can be found in the first before block that performs setup in 02.basic-querying.test.js file ... these are the equivalent of a MyModel.json in your real loopback app.
        1. naturally, this is also where we define which property serves as the id for the model and if its generated or not
  4. How do we get elasticsearch to take over ID generation?
    1. An automatically generated id-like field that is maintained by ES is _id. Without some sort of es-field-level-scripting-on-index (if that is possible at all) ... I am not sure how we could ask elasticsearch to take over auto-generating an id-like value for any arbitrary field! So the connector is set up such that adding id: {type: String, generated: true, id: true} will tell it to use _id as the actual field backing the id ... you can keep using the model.id abstraction and in the background _id values are mapped to it.
    2. Will this work for any field marked as with generated: true and id: true?
      1. No! The connector isn't coded that way right now ... while it is an interesting idea to couple any such field with ES's _id field inside this connector ... I am not sure if this is the right thing to do. If you had objectId: {type: String, generated: true, id: true} then you won't find a real objectId field in your ES documents. Would that be ok? Wouldn't that confuse developers who want to write custom queries and run 3rd party app against their ES instance? Don't use objectId, use _id would have to be common knowledge. Is that ok?
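Concretely, a model definition along these lines (an illustrative sketch) makes the connector back the id with Elasticsearch's _id:

```json
{
  "name": "User",
  "base": "PersistedModel",
  "properties": {
    "id": { "type": "string", "id": true, "generated": true },
    "name": { "type": "string" }
  }
}
```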

loopback-connector-elastic-search's People

Contributors

aquid, bharathkontham, dhmlau, drakerian, dtomasi, fabien, gitter-badger, harshadyeola, kamal0808, lrossy, lukewendling, pulkitsinghal, slorf, wolfgang-s, yagobski


loopback-connector-elastic-search's Issues

How to run boot scripts when Es connector is not initialised

If we try to run a boot script now without our connector being initialised, we get back an error saying the index doesn't exist.

here's the error message

Unhandled rejection [index_not_found_exception] no such index, with { resource.type=index_or_alias resource.id=blubox-elastic index=blubox-elastic } :: {"path":"/blubox-elastic/Bluboxer/_search","query":{"size":1,"from":0},"body":"{\"sort\":[\"_uid\"],\"query\":{\"bool\":{\"must\":[{\"match\":{\"email\":\"[email protected]\"}}]}}}","statusCode":404,"response":"{\"error\":{\"root_cause\":[{\"type\":\"index_not_found_exception\",\"reason\":\"no such index\",\"resource.type\":\"index_or_alias\",\"resource.id\":\"blubox-elastic\",\"index\":\"blubox-elastic\"}],\"type\":\"index_not_found_exception\",\"reason\":\"no such index\",\"resource.type\":\"index_or_alias\",\"resource.id\":\"blubox-elastic\",\"index\":\"blubox-elastic\"},\"status\":404}"}
  at respond (/Users/shahwarcoder/Blubox/node_modules/elasticsearch/src/lib/transport.js:289:15)
  at checkRespForFailure (/Users/shahwarcoder/Blubox/node_modules/elasticsearch/src/lib/transport.js:248:7)
  at HttpConnector.<anonymous> (/Users/shahwarcoder/Blubox/node_modules/elasticsearch/src/lib/connectors/http.js:164:7)
  at IncomingMessage.wrapper (/Users/shahwarcoder/Blubox/node_modules/lodash/index.js:3095:19)
  at emitNone (events.js:72:20)
  at IncomingMessage.emit (events.js:166:7)
  at endReadableNT (_stream_readable.js:913:12)
  at nextTickCallbackWith2Args (node.js:442:9)
  at process._tickDomainCallback (node.js:397:17)
  at process.fallback (/Users/shahwarcoder/Blubox/node_modules/async-listener/index.js:482:15)

cc @pulkitsinghal

Consecutive POST/PUT then GET results in empty response

When a create or update is done, it takes some time for the document to be saved or updated in the ES index. Due to this, fetching the same document in an immediately following request results in "no document found".

I got this issue when I was trying to login. It may be a possible duplicate of #60 , but I think the reason mentioned in that issue is different. I get no errors in ES logs, whereas #60 has errors in logs.

When calling the login API, an access token is created by loopback and returned to the client. If we then immediately call an API that requires the access token, it results in "Access Token not found".
For example,

User.login(...).$promise
.then(function(response){
    return User.profile(response).$promise;
})
.then(function(...){
...
})
.catch(function(err){
   console.log(err);  
});

The above code will result in an error if User.profile() requires the access token created by User.login(). But it works if we set a timeout of around 2s in between.

Is it because the access token created needs to be indexed first before it can be fetched, which takes time?
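Likely yes: Elasticsearch is near-real-time, so newly indexed documents only become searchable after the next refresh (refresh_interval defaults to 1s). Rather than a fixed 2s timeout, one workaround is a small retry helper. This is a sketch, and fetchToken below is a hypothetical stand-in for whatever read is failing:

```javascript
// Retry an async lookup a few times with a short delay, to ride out
// Elasticsearch's refresh window (default refresh_interval: 1s).
function retry(fn, { attempts = 5, delayMs = 500 } = {}) {
  return fn().catch((err) => {
    if (attempts <= 1) throw err; // out of attempts: surface the error
    return new Promise((resolve) => setTimeout(resolve, delayMs))
      .then(() => retry(fn, { attempts: attempts - 1, delayMs }));
  });
}

// Usage sketch (fetchToken is hypothetical -- any read that can 404
// right after a write):
// retry(() => fetchToken(id)).then((token) => { /* ... */ });
```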

support for multi_match queries

is it possible for this connector to support multi_match queries, e.g.:

{
  "multi_match" : {
    "query":    "this is a test", 
    "fields": [ "subject", "message" ] 
  }
}

I see in the notes that there is a mention of multi_field mapping:

"name" : {
    "type" : "multi_field",
    "fields" : {
        "name" : {"type" : "string", "index" : "not_analyzed"},
        "native" : {"type" : "string", "index" : "analyzed"}
    }
}
...
// this will treat 'George Harrison' as 'George Harrison' in a search
User.find({order: 'name'}, function (err, users) {..});
// this will treat 'George Harrison' as two tokens: 'george' and 'harrison' in a search
User.find({order: 'name', where: {'name.native': 'Harrison'}}, function (err, users) {..});

But I don't see how to do a multi_match query. When I run a query from the Explorer I see:

{
    "sort": [
      "_id"
    ],
    "query": {
      "bool": {
        "must": [
          {
            "match": {
              "question.text": "theory"
            }
          }
        ]
      }
    }
  }

Any pointers gratefully appreciated -- I'm writing this up for a webinar and would like to be able to use some of Elasticsearch's more interesting features with LoopBack.

user login does not work

During login, an unexpected/strange query to the elasticsearch (ES) is made and it fails because ACL doesn't have any entries in ES and even though the result set should be empty, there isn't any _uid to sort on and that errors out in ES:

server:middleware:accessLogger DEBUG 127.0.0.1-X-X req +21s POST /api/UserModels/login

Elasticsearch DEBUG: 2016-11-01T18:30:28Z
  starting request { method: 'POST',
    path: '/test1/ACL/_search',
    body: { sort: [ '_uid' ], query: { bool: [Object] } },
    query: { size: 50 } }
  

Elasticsearch TRACE: 2016-11-01T18:30:28Z
  -> POST http://localhost:9200/test1/ACL/_search?size=50
  {
    "sort": [
      "_uid"
    ],
    "query": {
      "bool": {
        "must": [
          {
            "match": {
              "model": "UserModel"
            }
          }
        ],
        "should": [
          {
            "match": {
              "property": "login"
            }
          },
          {
            "match": {
              "property": "*"
            }
          },
          {
            "match": {
              "accessType": "EXECUTE"
            }
          },
          {
            "match": {
              "accessType": "*"
            }
          }
        ]
      }
    }
  }
  <- 400
  {
    "error": "SearchPhaseExecutionException[Failed to execute phase [query],
                   all shards failed;
                   ...
                   Parse Failure [No mapping found for [_uid] in order to sort on",
    "status": 400
  }

I ran this on mongodb connector and apparently this unexpected query exists there too:

server:middleware:accessLogger DEBUG 127.0.0.1-X-X req +16s POST /api/UserModels/login

loopback:connector:mongodb all +35ms ACL { where: 
   { model: 'UserModel',
     property: { inq: [Object] },
     accessType: { inq: [Object] } } }

loopback:connector:mongodb MongoDB: model=ACL command=find +1ms [ { model: 'UserModel',
    property: { '$in': [Object] },
    accessType: { '$in': [Object] } },
  [Function] ]

loopback:connector:mongodb all +5ms ACL { where: 
   { model: 'UserModel',
     property: { inq: [Object] },
     accessType: { inq: [Object] } },
  order: [ 'id' ] } null []

But mongo doesn't barf on it! It tolerates it and returns an empty result set.

Next Steps

  1. I need to make ES more tolerant via the connector somehow
  2. @raymondfeng or @bajtos - what is this no-op query all about? Why does it exist at all? Checking ACL for user model via connector for EXECUTE permissions on login method seems like a no-op to me, since ACL table/collection is never created, even in other connectors like mongo.

Add the `include` filter

Notes to learn from mongodb's implementation:

  1. https://github.com/strongloop/loopback-connector-mongodb/blob/367bc033a546130b9f834d17a9d4f769aaefef92/lib/mongodb.js#L730
  2. https://github.com/strongloop/loopback-connector-mongodb/blob/2a90951c482508bdc06d04e5a5b87e8e54f1ae6f/test/imported.test.js#L9
  3. loopback-datasource-juggler/test/include.test.js

In the meantime a workaround is to use the native filter instead of include, writing the query in ES's DSL instead of loopback's. Of course you need to have already set up the parent-child mappings in the datasource correctly no matter what.

Optimize index and mapping creation

We must make sure that the following optimization is possible for most if not all versions of ES.

        /* 
         *       1. create a data structure from `self.settings.mappings`
         *          that has all mappings grouped under a unique index name
         *       {
         *         shakespeare : {
         *           index: "shakespeare",
         *           body: {
         *             mappings: {
         *               modelNameOrExplicitTypeName_A: {
         *                 properties: {...}
         *               },
         *               modelNameOrExplicitTypeName_B: {
         *                 properties: {...}
         *               }
         *             }
         *           }
         *         },
         *         juju : {
         *           index: "juju",
         *           body: {
         *             mappings: {
         *               modelNameOrExplicitTypeName_C: {
         *                 properties: {...}
         *               },
         *               modelNameOrExplicitTypeName_D: {
         *                 properties: {...}
         *               }
         *             }
         *           }
         *         }
         *       }
         *
         *       2. Then for each entry, optimize mapping and indexing to happen in one atomic request to ES
         */

References:

  1. ES 0.90
  2. ES v1.x
  3. ES v2.x
  4. GitHub code search indices.create mappings extension:js
    1. https://github.com/MennaDarwish/DSP/blob/e6156f3ea41294daeefc73b9a00fb92d232213ba/lib/elasticsearch/indices_initializer.js#L11

Automigrate when index or type does not exist yet

Perhaps it's because I'm running an older ES version (1.2.1), but it seems that without any proper pre-checks and handling (indices.delete and indices.existsType), you're unable to run automigration in order to create a new index from scratch. @pulkitsinghal is this correct?

Here's my current monkey-patched solution:

var automigrate = dataSource.connector.automigrate;
dataSource.connector.automigrate = function(models, cb) {
    var self = this;
    var params = { index: self.settings.index };
    this.db.indices.delete(params, function(err, info) {
        if (err && err.message.indexOf('IndexMissingException') === -1) return cb(err);
        self.db.indices.create(params, function(err, info) {
            if (err) return cb(err);
            automigrate.call(self, models, cb);
        });
    });
};

var removeMappings = dataSource.connector.removeMappings;
dataSource.connector.removeMappings = function(modelNames, callback) {
    var self = this;
    var settings = this.settings;
    if (_.isFunction(modelNames)) {
        callback = modelNames;
        modelNames = _.pluck(settings.mappings, 'name');
    }
    var mappingsToRemove = _.filter(settings.mappings, function(mapping){
        return !modelNames || _.include(modelNames, mapping.name);
    });

    modelNames = _.pluck(mappingsToRemove, 'name');

    support.async.filter(modelNames, function(type, next) {
        self.db.indices.existsType({ index: settings.index, type: type }, next);
    }, function(filtered) {
        removeMappings.call(self, filtered, callback);
    });
};

deleteByQuery should use a filter instead of a query for much better performance

Currently, destroyAll issues a deleteByQuery operation with a query. Using a query causes ES to generate scores as part of the process and as a result has additional overhead.

As a general rule, filters should be used instead of queries:
for binary yes/no searches
for queries on exact values

https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-filters.html

As a general rule, queries should be used instead of filters:
for full text search
where the result depends on a relevance score

https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl.html

Filters are very handy since they perform an order of magnitude better than plain queries since no scoring is performed and they are automatically cached.

https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl.html
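For the ES 1.x-era DSL this issue targets, the destroyAll request body could wrap the condition in a filtered query so that no scoring is performed. An illustrative sketch of such a body (the field and value are made up; this is not the connector's current output):

```json
{
  "query": {
    "filtered": {
      "filter": {
        "term": { "vip": true }
      }
    }
  }
}
```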

Update attributes doesn't work for string IDs

For an API URL call like http://localhost:3000/api/StoreConfigModels/XjEuI-pbRKWeSNFNxJRQsA the id isn't passed through to the updateAttributes method and it fails:

{
  "error": {
    "name": "Error",
    "status": 500,
    "message": "id not set!",
    "stack": "Error: id not set!
  at [object Object].updateAttrs [as updateAttributes] (/Users/pulkitsinghal/dev/shoppinpal/loopback-connector-elasticsearch/lib/esConnector.js:887:15)
  at ModelConstructor.<anonymous> (/Users/pulkitsinghal/dev/w2/node_modules/loopback-datasource-juggler/lib/dao.js:1907:27)
  at ModelConstructor.trigger (/Users/pulkitsinghal/dev/w2/node_modules/loopback-datasource-juggler/lib/hooks.js:65:12)
  at ModelConstructor.<anonymous> (/Users/pulkitsinghal/dev/w2/node_modules/loopback-datasource-juggler/lib/dao.js:1893:14)
  at ModelConstructor.trigger (/Users/pulkitsinghal/dev/w2/node_modules/loopback-datasource-juggler/lib/hooks.js:65:12)
  at /Users/pulkitsinghal/dev/w2/node_modules/loopback-datasource-juggler/lib/dao.js:1892:12
  at ModelConstructor.<anonymous> (/Users/pulkitsinghal/dev/w2/node_modules/loopback-datasource-juggler/lib/validations.js:460:11)
  at ModelConstructor.next (/Users/pulkitsinghal/dev/w2/node_modules/loopback-datasource-juggler/lib/hooks.js:75:12)
  at ModelConstructor.<anonymous> (/Users/pulkitsinghal/dev/w2/node_modules/loopback-datasource-juggler/lib/validations.js:457:23)
  at ModelConstructor.trigger (/Users/pulkitsinghal/dev/w2/node_modules/loopback-datasource-juggler/lib/hooks.js:65:12)
  at ModelConstructor.Validatable.isValid (/Users/pulkitsinghal/dev/w2/node_modules/loopback-datasource-juggler/lib/validations.js:433:8)
  at /Users/pulkitsinghal/dev/w2/node_modules/loopback-datasource-juggler/lib/dao.js:1886:10
  at doNotify (/Users/pulkitsinghal/dev/w2/node_modules/loopback-datasource-juggler/lib/model.js:595:49)
  at doNotify (/Users/pulkitsinghal/dev/w2/node_modules/loopback-datasource-juggler/lib/model.js:595:49)
  at doNotify (/Users/pulkitsinghal/dev/w2/node_modules/loopback-datasource-juggler/lib/model.js:595:49)
  ..."
  }
}

Settings not pushed by create index

I'm using an Elasticsearch solution for search-as-you-type that requires a custom analyzer. For this to work, a custom analyzer/tokenizer has to be set when creating the index. I'm assuming this connector supports that, since in the config example an analyzer is defined in settings too. However, when I try to do the same, no analyzer is created. So when I try to create the mapping using this custom analyzer, it obviously fails, as the analyzer does not exist.

Here's what I got.

datasource.json:
"mappings": [
  {
    "name": "term",
    "properties": {
      "term": {
        "type": "string",
        "analyzer": "autocomplete",
        "search_analyzer": "standard"
      }
    }
  }
],
"settings": {
  "analysis": {
    "filter": {
      "autocomplete_filter": {
        "type": "edge_ngram",
        "min_gram": 1,
        "max_gram": 20
      }
    },
    "analyzer": {
      "autocomplete": {
        "type": "custom",
        "tokenizer": "standard",
        "filter": ["lowercase", "autocomplete_filter"]
      }
    }
  }
},

Output:

api_1 | Elasticsearch DEBUG: 2016-10-19T10:21:23Z
api_1 | starting request { method: 'HEAD',
api_1 | castExists: true,
api_1 | path: '/superbuddy',
api_1 | query: {} }
api_1 |
api_1 |
api_1 | Elasticsearch TRACE: 2016-10-19T10:21:23Z
api_1 | -> HEAD http://search:9200/superbuddy
api_1 |
api_1 | <- 404
api_1 |
api_1 |
api_1 | Elasticsearch DEBUG: 2016-10-19T10:21:23Z
api_1 | Request complete
api_1 |
api_1 | Elasticsearch DEBUG: 2016-10-19T10:21:23Z
api_1 | starting request { method: 'POST', path: '/superbuddy', query: {} }
api_1 |
api_1 |
api_1 | Elasticsearch TRACE: 2016-10-19T10:21:23Z
api_1 | -> POST http://search:9200/superbuddy
api_1 |
api_1 | <- 200
api_1 | {
api_1 | "acknowledged": true
api_1 | }
api_1 |
api_1 | Elasticsearch DEBUG: 2016-10-19T10:21:23Z
api_1 | Request complete
api_1 |
api_1 | Elasticsearch DEBUG: 2016-10-19T10:21:23Z
api_1 | starting request { method: 'PUT',
api_1 | path: '/superbuddy/_mapping/term',
api_1 | body: { properties: { term: [Object] } },
api_1 | query: {} }
api_1 |
api_1 |
api_1 | Elasticsearch TRACE: 2016-10-19T10:21:23Z
api_1 | -> PUT http://search:9200/superbuddy/_mapping/term
api_1 | {
api_1 | "properties": {
api_1 | "term": {
api_1 | "type": "string",
api_1 | "analyzer": "autocomplete",
api_1 | "search_analyzer": "standard"
api_1 | }
api_1 | }
api_1 | }
api_1 | <- 400
api_1 | {
api_1 | "error": {
api_1 | "root_cause": [
api_1 | {
api_1 | "type": "mapper_parsing_exception",
api_1 | "reason": "analyzer [autocomplete] not found for field [term]"
api_1 | }
api_1 | ],
api_1 | "type": "mapper_parsing_exception",
api_1 | "reason": "analyzer [autocomplete] not found for field [term]"
api_1 | },
api_1 | "status": 400
api_1 | }
api_1 |
api_1 | Elasticsearch DEBUG: 2016-10-19T10:21:23Z
api_1 | Request complete
api_1 |
api_1 | Connection fails: [mapper_parsing_exception] analyzer [autocomplete] not found for field [term] :: {"path":"/superbuddy/_mapping/term","query":{},"body":"{"properties":{"term":{"type":"string","analyzer":"autocomplete","search_analyzer":"standard"}}}","statusCode":400,"response":"{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"analyzer [autocomplete] not found for field [term]"}],"type":"mapper_parsing_exception","reason":"analyzer [autocomplete] not found for field [term]"},"status":400}"}
api_1 | It will be retried for the next request.
api_1 | Unhandled rejection Error: [mapper_parsing_exception] analyzer [autocomplete] not found for field [term]
api_1 | at respond (/project/node_modules/elasticsearch/src/lib/transport.js:289:15)
api_1 | at checkRespForFailure (/project/node_modules/elasticsearch/src/lib/transport.js:248:7)
api_1 | at HttpConnector. (/project/node_modules/elasticsearch/src/lib/connectors/http.js:164:7)
api_1 | at IncomingMessage.wrapper (/project/node_modules/lodash/index.js:3095:19)
api_1 | at emitNone (events.js:91:20)
api_1 | at IncomingMessage.emit (events.js:185:7)
api_1 | at endReadableNT (_stream_readable.js:974:12)
api_1 | at _combinedTickCallback (internal/process/next_tick.js:74:11)
api_1 | at process._tickDomainCallback (internal/process/next_tick.js:122:9)

Naturally, those same settings work fine when I insert them manually. When I do this I push the settings in the body of the POST request for making the index. If I do this, this connector can also make the mapping without issue, giving me a workaround for now. But I'll be damned if I have to create the index manually when this connector can do it automatically.

Improve the example

  1. have a branch which only takes you through a README to set up a loopback project that uses ES
  2. have a branch that already has a loopback project ready to go
    1. demo faceting on top of it
    2. demo kibana on top of it

"and" & "inq" loopback filter

Hello, I found the following comment in lib/esConnector.js:
// TODO: or Logical OR operator
// TODO: and Logical AND operator
// {"where":{"and":[{"id":{"inq":["1","2","3","4"]}},{"vip":true}]}}

I would like to ask whether this is planned for the near future, and whether I can help in some way.

Thank you so much!
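For reference, a filter like the one in that comment could translate into an Elasticsearch bool query roughly as follows (a minimal sketch, not the connector's actual code; `inq` maps naturally to a `terms` clause):

```javascript
// Hypothetical translation of the LoopBack filter
// {"where":{"and":[{"id":{"inq":["1","2","3","4"]}},{"vip":true}]}}
// into an Elasticsearch bool query. Only handles "and" + "inq" + equality.
function buildBoolQuery(where) {
  const must = where.and.map((clause) => {
    const [field, condition] = Object.entries(clause)[0];
    if (condition !== null && typeof condition === 'object' && 'inq' in condition) {
      return { terms: { [field]: condition.inq } }; // inq -> terms
    }
    return { match: { [field]: condition } };       // plain equality -> match
  });
  return { query: { bool: { must } } };
}

const esQuery = buildBoolQuery({
  and: [{ id: { inq: ['1', '2', '3', '4'] } }, { vip: true }],
});
console.log(JSON.stringify(esQuery));
```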

datasources.json location in sample app project?

Where should I keep the datasources.json file for my app? When I read the README/docs, I find it at examples/server/datasources.json.
When I install this module with npm install, the source goes into the node_modules directory of my app.
So should I edit the datasources.json file inside my app's node_modules?

Fail fast for mismatched field values

boot:create-model-instances (41) created +2ms
    TeamModel { ownerId: '[email protected]', memberId: NaN, id: 1 }

As the log statement shows, currently life goes on (memberId: NaN), and errors that crop up because of a mismatch between field value and field type are not caught on the spot but only during user/end-to-end/functional testing.

Let's do something about this and fail fast to ensure that the developers know right away when something is wrong.
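A fail-fast check along these lines (a sketch, not the connector's code) would surface the mismatch at write time instead of letting NaN through:

```javascript
// Sketch of failing fast when a value cannot be coerced to the declared
// property type, instead of silently storing NaN (as in the log above).
function coerceStrict(field, value, type) {
  if (type === Number) {
    const coerced = Number(value);
    if (Number.isNaN(coerced)) {
      throw new TypeError(`${field}: cannot coerce ${JSON.stringify(value)} to Number`);
    }
    return coerced;
  }
  return value;
}

// memberId declared as Number but given a non-numeric string: throw right away.
try {
  coerceStrict('memberId', 'not-a-number', Number);
} catch (err) {
  console.log(err.message);
}
```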

Cannot login, access token mapping broken

Log shared by @aquid:

Unhandled error for request GET /api/MyModel/profile?id=AVlqFgCzgZDS4o0PX3Uf: AssertionError: token.created must be a valid Date
    at ModelConstructor.AccessToken.validate (/Users/xxx/yyy/node_modules/loopback/common/models/access-token.js:137:7)

Multi Index usage

The goal is to implement and document how to use this connector with more than one index.

Incompatible with latest loopback-datasource-juggler v2.27.0 or above

I get the following fatal errors when used in conjunction with loopback-datasource-juggler v2.27.0 or above.

/devenv/build/node_modules/loopback-datasource-juggler/lib/include.js:526
            obj.__cachedRelations[relationName].push(target);
               ^
TypeError: Cannot read property '__cachedRelations' of undefined
    at linkManyToOne (/devenv/build/node_modules/loopback-datasource-juggler/lib/include.js:526:16)
    at /devenv/build/node_modules/loopback-datasource-juggler/node_modules/async/lib/async.js:162:20
    at /devenv/build/node_modules/loopback-datasource-juggler/node_modules/async/lib/async.js:230:13
    at _arrayEach (/devenv/build/node_modules/loopback-datasource-juggler/node_modules/async/lib/async.js:81:9)
    at _each (/devenv/build/node_modules/loopback-datasource-juggler/node_modules/async/lib/async.js:72:13)
    at Object.async.forEachOf.async.eachOf (/devenv/build/node_modules/loopback-datasource-juggler/node_modules/async/lib/async.js:229:9)
    at Object.async.forEach.async.each (/devenv/build/node_modules/loopback-datasource-juggler/node_modules/async/lib/async.js:206:22)
    at targetLinkingTask (/devenv/build/node_modules/loopback-datasource-[nodemon] app crashed - waiting for file changes before starting...

Implement support for replaceById operation

Error: The connector elasticsearch does not support replaceById operation. This is not a bug in LoopBack. Please contact the authors of the connector, preferably via GitHub issues.

Get rid of this error for POST /Model/{id}/replace ("Replace attributes for a model instance and persist it into the data source").

1.0.5 init errors - Cannot initialize connector "es": Maximum call stack size exceeded

Updating from 1.0.4 to 1.0.5 throws this error when I try to start the app.

TypeError: Cannot create data source "elasticsearch-dev": Cannot initialize connector "es": Invalid apiVersion "2.x", expected a function or one of master, 1.x, 1.3, 1.2, 1.1, 1.0, 0.90
    at Object._.funcEnum (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/elasticsearch/src/lib/utils.js:356:11)
    at new Client (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/elasticsearch/src/lib/client.js:66:29)
    at ESConnector.connect (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-connector-es/lib/esConnector.js:103:19)
    at Object.module.exports.initialize (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-connector-es/lib/esConnector.js:29:30)
    at DataSource.setup (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-datasource-juggler/lib/datasource.js:337:19)
    at new DataSource (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-datasource-juggler/lib/datasource.js:114:8)
    at Registry.createDataSource (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback/lib/registry.js:349:12)
    at dataSourcesFromConfig (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback/lib/application.js:434:19)
    at EventEmitter.app.dataSource (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback/lib/application.js:233:14)
    at /Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-boot/lib/executor.js:178:9
    at /Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-boot/lib/executor.js:269:5
    at Array.forEach (native)
    at forEachKeyedObject (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-boot/lib/executor.js:268:20)
    at setupDataSources (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-boot/lib/executor.js:173:3)
    at execute (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-boot/lib/executor.js:32:3)
    at bootLoopBackApp (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-boot/index.js:140:3)
    at Object.<anonymous> (/Users/juan/Documents/Projects/bkn-buildings-api/server/server.js:21:1)
    at Module._compile (module.js:399:26)
    at Object.Module._extensions..js (module.js:406:10)
    at Module.load (module.js:345:32)
    at Function.Module._load (module.js:302:12)
    at Object.<anonymous> (/Users/juan/.nvm/versions/v5.2.0/lib/node_modules/strongloop/node_modules/strong-supervisor/bin/sl-run.js:77:19)
    at Module._compile (module.js:399:26)
    at Object.Module._extensions..js (module.js:406:10)
    at Module.load (module.js:345:32)
    at Function.Module._load (module.js:302:12)
    at /Users/juan/.nvm/versions/v5.2.0/lib/node_modules/strongloop/lib/command.js:28:23
    at CommandLoader.run (/Users/juan/.nvm/versions/v5.2.0/lib/node_modules/strongloop/lib/loader.js:126:3)

Removing apiVersion from the datasource config bypasses the error.

But after fixing that I get this error:

Cannot initialize connector "es": Maximum call stack size exceeded

The only solution I have for now is to revert to 1.0.4.

Sanity Check example for sorting related changes

.... if username was registered with id:true for UserModel or its base User, would the connector be able to pick up on that fact and alter its ES queries accordingly, or is that work still pending?

Add tests to verify underlying driver's compatibility with CLS

  1. LoopBack uses https://github.com/othiym23/node-continuation-local-storage for implicit context propagation across the async invocation paths.
  2. It has been speculated that ... some of the drivers (especially the ones with connection pooling) are not friendly with CLS
  3. CLS seems tricky; I've even seen comments that, despite fixing the previous problem, there may be other undiscovered issues

So some robust testing is in order here.

Adding a test that somehow manages to use the CLS feature should help verify both the stability of the connector and the underlying elasticsearch driver.

After the test is ready, we should be mindful to use it in a way that really hits the connector hard (e.g. with Apache ab) and verify the responses to make sure the data didn't get mixed up.

support for a dynamic index

Steven @onstrike07 Aug 14 17:34
I have a question about the index setting in datasource.json for this connector. Is it possible to make the index name dynamic? I mean, I want to pass the index name as a parameter from the frontend for each query. Our situation is that our log data, which is very large, is split into a different index in ES for each hour; for example, the log_2016_08_10_02 index holds log data generated between 2:00 am and 3:00 am on August 10, 2016. The frontend code determines which index to query based on the time window users specify.

cc @onstrike07
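A small helper along these lines (hypothetical; not part of the connector) could derive the hourly index name described above on the frontend:

```javascript
// Derive the hourly index name, e.g. log_2016_08_10_02 for log data
// generated between 2:00 am and 3:00 am UTC on August 10, 2016.
function hourlyIndexName(date) {
  const pad = (n) => String(n).padStart(2, '0');
  return [
    'log',
    date.getUTCFullYear(),
    pad(date.getUTCMonth() + 1), // getUTCMonth() is zero-based
    pad(date.getUTCDate()),
    pad(date.getUTCHours()),
  ].join('_');
}

console.log(hourlyIndexName(new Date(Date.UTC(2016, 7, 10, 2, 30))));
// -> log_2016_08_10_02
```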

id change during get query

When I call this endpoint:
http://host:port/api/models/20160204161716631962

LoopBack traces this:

-> POST http://host:port/api/model/_search?size=20&from=0
{
  "sort": [
    "id"
  ],
  "query": {
    "bool": {
      "must": [
        {
          "match": {
            "_id": "20160204161716634000"
          }
        }
      ]
    }
  }
}
<- 200
{
  "took": 10,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "failed": 0
  },
  "hits": {
    "total": 0,
    "max_score": null,
    "hits": []
  }
}

Why did my id change during the query ?
20160204161716631962 => 20160204161716634000
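The likely culprit is JavaScript number precision rather than Elasticsearch itself: the id is larger than Number.MAX_SAFE_INTEGER, so coercing it to a Number silently rounds it. A quick demonstration:

```javascript
// JavaScript numbers are IEEE-754 doubles: integers above
// Number.MAX_SAFE_INTEGER (2^53 - 1) silently lose precision when coerced.
// Keeping such ids as strings end-to-end avoids the corruption.
const id = '20160204161716631962';

console.log(Number(id) > Number.MAX_SAFE_INTEGER); // too big to represent exactly
console.log(String(Number(id)) === id);            // round-trip fails: digits rounded
```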

lng is sent instead of lon in Geo Point

A geo point in Elasticsearch takes lat, lon.

Instead, lat, lng is sent from this connector. Can we please fix it, or at least point out where it happens? I will send a PR if required.
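The needed conversion is tiny; this hypothetical helper (not the connector's actual code) illustrates the rename before indexing:

```javascript
// Elasticsearch's geo_point type expects { lat, lon }, while LoopBack's
// GeoPoint exposes { lat, lng }, so the field must be renamed on the way in.
function toEsGeoPoint(point) {
  return { lat: point.lat, lon: point.lng };
}

console.log(toEsGeoPoint({ lat: 52.37, lng: 4.89 })); // { lat: 52.37, lon: 4.89 }
```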

Add a collaborator to this project

@bajtos and @crandmck - one of my team members will be helping me maintain this repo, going forward. Can you please add @aquid as a collaborator on this project, like myself? I don't have the privileges to do so. Thank You!

If you need some sort of dev history or something, let me know and I'll provide.

@bajtos you may remember @aquid from his contribution to the angular sdk:

modelsToIgnore option support for generator (Aquid Shahwar)

strongloop/grunt-loopback-sdk-angular@5d65107

get rid of `ESConnector.prototype.removeMappings`

If removing an index also effectively removes any mappings and data, then there shouldn't be any need for an explicit method such as ESConnector.prototype.removeMappings. We need to confirm this by reading the ES docs for various versions and also running sanity tests.

If this turns out to be true, then we can clean up the code by removing ESConnector.prototype.removeMappings.

Docs say apiVersion is optional, but it is required to connect

When you leave out the apiVersion, you get a TypeError:

TypeError: Cannot create data source "elasticsearch": Cannot initialize connector "es": Cannot read property 'indexOf' of undefined
at ESConnector.connect (/diekeure/platformso/API/node_modules/loopback-connector-es/lib/esConnector.js:105:37)
at Object.module.exports.initialize (/diekeure/platformso/API/node_modules/loopback-connector-es/lib/esConnector.js:30:30)
at DataSource.setup (/diekeure/platformso/API/node_modules/loopback-datasource-juggler/lib/datasource.js:339:19)
at new DataSource (/diekeure/platformso/API/node_modules/loopback-datasource-juggler/lib/datasource.js:117:8)
at Registry.createDataSource (/diekeure/platformso/API/node_modules/loopback/lib/registry.js:355:12)
at dataSourcesFromConfig (/diekeure/platformso/API/node_modules/loopback/lib/application.js:440:19)
at EventEmitter.app.dataSource (/diekeure/platformso/API/node_modules/loopback/lib/application.js:235:14)
at /diekeure/platformso/API/node_modules/loopback-boot/lib/executor.js:190:9
at /diekeure/platformso/API/node_modules/loopback-boot/lib/executor.js:281:5
at Array.forEach (native)
at forEachKeyedObject (/diekeure/platformso/API/node_modules/loopback-boot/lib/executor.js:280:20)
at setupDataSources (/diekeure/platformso/API/node_modules/loopback-boot/lib/executor.js:180:3)
at execute (/diekeure/platformso/API/node_modules/loopback-boot/lib/executor.js:38:3)
at bootLoopBackApp (/diekeure/platformso/API/node_modules/loopback-boot/index.js:154:3)
at Object. (/diekeure/platformso/API/server/server.js:28:1)
at Module._compile (module.js:556:32)

Test for findByIds fails

I've temporarily commented out the part of the test which fails.

The original test case was available at: https://github.com/strongloop/loopback-datasource-juggler/blob/master/test/basic-querying.test.js

I simply copied and repurposed it here: https://github.com/strongloop-community/loopback-connector-elastic-search/blob/master/test/02.basic-querying.test.js

In order to reproduce the problem:

  1. un-comment the following lines:

    names.should.eql( // NOTE: order doesn't add up, is 2.ii.iii broken?
        [createdUsers[2].name, createdUsers[1].name, createdUsers[0].name]);

  2. comment-out the following lines:

    names.should.include(createdUsers[2].name);
    names.should.include(createdUsers[1].name);
    names.should.include(createdUsers[0].name);


makeId refactoring not applied to save method

It seems as if the refactoring that was done in 25e9029#diff-a27d2580e4157c6999e9327a4a358c21L446 was not applied to ESConnector.prototype.save (the relevant line is marked in the link above).

When I try to save an existing model the esConnector.js throws the following error:

TypeError: self.makeId is not a function
transform_1 |         at ESConnector.save (/usr/local/lib/node_modules/loopback-connector-es/lib/esConnector.js:1019:35)
transform_1 |         at /usr/local/lib/node_modules/loopback-datasource-juggler/lib/dao.js:2244:25
transform_1 |         at doNotify (/usr/local/lib/node_modules/loopback-datasource-juggler/lib/observer.js:98:49)
transform_1 |         at doNotify (/usr/local/lib/node_modules/loopback-datasource-juggler/lib/observer.js:98:49)
transform_1 |         at doNotify (/usr/local/lib/node_modules/loopback-datasource-juggler/lib/observer.js:98:49)
transform_1 |         at doNotify (/usr/local/lib/node_modules/loopback-datasource-juggler/lib/observer.js:98:49)
transform_1 |         at Function.ObserverMixin._notifyBaseObservers (/usr/local/lib/node_modules/loopback-datasource-juggler/lib/observer.js:121:5)
transform_1 |         at Function.ObserverMixin.notifyObserversOf (/usr/local/lib/node_modules/loopback-datasource-juggler/lib/observer.js:96:8)
transform_1 |         at Function.ObserverMixin._notifyBaseObservers (/usr/local/lib/node_modules/loopback-datasource-juggler/lib/observer.js:119:15)
transform_1 |         at Function.ObserverMixin.notifyObserversOf (/usr/local/lib/node_modules/loopback-datasource-juggler/lib/observer.js:96:8)
transform_1 |         at Function.ObserverMixin._notifyBaseObservers (/usr/local/lib/node_modules/loopback-datasource-juggler/lib/observer.js:119:15)
transform_1 |         at Function.ObserverMixin.notifyObserversOf (/usr/local/lib/node_modules/loopback-datasource-juggler/lib/observer.js:96:8)
transform_1 |         at Function.ObserverMixin._notifyBaseObservers (/usr/local/lib/node_modules/loopback-datasource-juggler/lib/observer.js:119:15)
transform_1 |         at Function.ObserverMixin.notifyObserversOf (/usr/local/lib/node_modules/loopback-datasource-juggler/lib/observer.js:96:8)
transform_1 |         at ModelConstructor.<anonymous> (/usr/local/lib/node_modules/loopback-datasource-juggler/lib/dao.js:2238:17)
transform_1 |         at ModelConstructor.trigger (/usr/local/lib/node_modules/loopback-datasource-juggler/lib/hooks.js:70:12)

I will refactor and create a PR for it

Add mapping for Loopback model name to ES type

We use a domain-specific model name for our LoopBack model, but we cannot change the ES URL (it's a 3rd-party server install). It would be helpful to map the LoopBack model name to an ES type in URLs.

For example, we want to map our Customer model to an ES URL like /index/user, i.e. Customer -> user.
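A minimal sketch of the requested mapping (hypothetical configuration, not an existing connector option):

```javascript
// Hypothetical lookup from LoopBack model name to the Elasticsearch type
// used when building URLs; unmapped models fall back to their own name.
const modelToType = { Customer: 'user' }; // Customer -> /<index>/user

function esTypeFor(modelName) {
  return modelToType[modelName] || modelName;
}

console.log(`/myindex/${esTypeFor('Customer')}`); // /myindex/user
```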

Getting error when trying to delete: self.db.deleteByQuery is not a function

I'm using Elasticsearch 2.1.1 and loopback-connector-elastic-search 1.0.5, and every time I try to delete a record by id I get this error:

{  "error": {
    "name": "TypeError",
    "status": 500,
    "message": "self.db.deleteByQuery is not a function",
    "stack": "TypeError: self.db.deleteByQuery is not a function\n    at destroyAll (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-connector-es/lib/esConnector.js:796:13)\n    at doDelete (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-datasource-juggler/lib/dao.js:1936:19)\n    at /Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-datasource-juggler/lib/dao.js:1910:9\n    at doNotify (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-datasource-juggler/lib/observer.js:98:49)\n    at doNotify (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-datasource-juggler/lib/observer.js:98:49)\n    at doNotify (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-datasource-juggler/lib/observer.js:98:49)\n    at doNotify (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-datasource-juggler/lib/observer.js:98:49)\n    at Function.ObserverMixin._notifyBaseObservers (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-datasource-juggler/lib/observer.js:121:5)\n    at Function.ObserverMixin.notifyObserversOf (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-datasource-juggler/lib/observer.js:96:8)\n    at Function.ObserverMixin._notifyBaseObservers (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-datasource-juggler/lib/observer.js:119:15)\n    at Function.ObserverMixin.notifyObserversOf (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-datasource-juggler/lib/observer.js:96:8)\n    at Function.ObserverMixin._notifyBaseObservers (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-datasource-juggler/lib/observer.js:119:15)\n    at Function.ObserverMixin.notifyObserversOf (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-datasource-juggler/lib/observer.js:96:8)\n    at 
Function.ObserverMixin._notifyBaseObservers (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-datasource-juggler/lib/observer.js:119:15)\n    at Function.ObserverMixin.notifyObserversOf (/Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-datasource-juggler/lib/observer.js:96:8)\n    at /Users/juan/Documents/Projects/bkn-buildings-api/node_modules/loopback-datasource-juggler/lib/dao.js:1908:13"
  }
}

This is an important issue, since I'm not able to remove anything.

Thank you.

Login does not work

When I use the memory connector, the accessToken looks like:

req.accessToken: {
  "id": "qGYGzUMsGsSRuG2Xwy28rDQSQOanp8dqDbYgdKdKqqqlqQtETzSNeyrdhdGlpvUA",
  "ttl": 1209600,
  "created": "2015-04-01T16:12:29.025Z",
  "userId": 2
}

But when I use the elasticsearch connector, it looks like:

req.accessToken: { 
  "id": "SEFEuxTa8XOjCtywWhAXNvKK1fVukSqawcNHBT9bHIy4DuUMBeeIEw9vIOp7iQWR", 
  "ttl": 1209600, 
  "created": "2015-04-01T16:06:03.489Z" 
}

Notice how the userId is missing for req.accessToken when elasticsearch connector is involved?

Support Parent/Child relationships

We can define relationships in Loopback, but ElasticSearch has no knowledge of them. If data was indexed in ES with relationship information it would allow for better use of ElasticSearch's features, like finding parents by their children or finding children by their parents.

With #22 it's possible to define parent/child relationships manually in datasources.json. However, this isn't enough: when you create data with parent relationships, the POST to Elasticsearch needs to be slightly different (you must specify the ID of the associated parent document).

Cannot initialize connector "es": Maximum call stack size exceeded

Process to produce this issue:

I'm using the latest Elasticsearch, which is 2.3.4.

  1. Download the examples module:
    c:> git clone https://github.com/strongloop-community/loopback-connector-elastic-search.git myEsConnector
  2. Change to examples folder
    c:> cd myEsConnector/examples
  3. Install dependencies.
    c:\examples>npm install
  4. Run the server
    c:\examples>node server\server.js

The error message in title is shown.

C:\Users\test\myEsConnector\examples>node server\server.js
loopback deprecated loopback.compress is deprecated. Use require('compression'); instead. server\server.js:10:17
C:\Users\test\myEsConnector\examples\node_modules\loopback\lib\application.js:243
throw err;
^
RangeError: Cannot create data source "elasticsearch-plain": Cannot initialize connector "es": Maximum call stack size exceeded
at Function.EventEmitter.listenerCount (events.js:397:38)
at Log.listenerCount (C:\Users\test\myEsConnector\examples\node_modules\loopback-connector-es\node_modules\elasticsearch\src\lib\log.js:68:25)
at Function.EventEmitter.listenerCount (events.js:399:20)
at Log.listenerCount (C:\Users\test\myEsConnector\examples\node_modules\loopback-connector-es\node_modules\elasticsearch\src\lib\log.js:68:25)
at Function.EventEmitter.listenerCount (events.js:399:20)
at Log.listenerCount (C:\Users\test\myEsConnector\examples\node_modules\loopback-connector-es\node_modules\elasticsearch\src\lib\log.js:68:25)
at Function.EventEmitter.listenerCount (events.js:399:20)
at Log.listenerCount (C:\Users\test\myEsConnector\examples\node_modules\loopback-connector-es\node_modules\elasticsearch\src\lib\log.js:68:25)
at Function.EventEmitter.listenerCount (events.js:399:20)
at Log.listenerCount (C:\Users\test\myEsConnector\examples\node_modules\loopback-connector-es\node_modules\elasticsearch\src\lib\log.js:68:25)
at Function.EventEmitter.listenerCount (events.js:399:20)
at Log.listenerCount (C:\Users\test\myEsConnector\examples\node_modules\loopback-connector-es\node_modules\elasticsearch\src\lib\log.js:68:25)
at Function.EventEmitter.listenerCount (events.js:399:20)
at Log.listenerCount (C:\Users\test\myEsConnector\examples\node_modules\loopback-connector-es\node_modules\elasticsearch\src\lib\log.js:68:25)
at Function.EventEmitter.listenerCount (events.js:399:20)
at Log.listenerCount (C:\Users\test\myEsConnector\examples\node_modules\loopback-connector-es\node_modules\elasticsearch\src\lib\log.js:68:25)

Workaround

The connector was only written and tested with ES v1.1, and the version of the ES client downloaded as a dependency of the connector on my computer only supported up to v1.3, but I was running v2.3.4. In addition, the example was loading version 0.3.2 of the connector rather than the latest 1.0.5.

Following are steps on how to fix it:

1. Manually delete the elasticsearch folder from the folder below:

C:\Users\test\myEsConnector\examples\node_modules\loopback-connector-es\node_modules

2. Edit the C:\Users\test\myEsConnector\examples\node_modules\loopback-connector-es\package.json file to remove the explicit reference to elasticsearch.

3. Run npm install --save --save-exact elasticsearch under the folder below:

C:\Users\test\myEsConnector\examples\node_modules\loopback-connector-es\node_modules

4. Modify the examples/server/datasources.json file as shown below:

{
  "db": {
    "name": "db",
    "connector": "memory",
    "file": "db.json"
  },
  "elasticsearch-plain": {
    "name": "elasticsearch-plain",
    "connector": "es",
    "index": "shakespeare",
    "hosts": [
      {
        "host": "localhost",
        "port": 9200
      }
    ],
    "apiVersion": "2.3",
    "log": "trace",
    "defaultSize": 50,
    "requestTimeout": 30000
  }
}

5. Edit the server/model-config.json file to replace the data source name 'elasticsearch-ssl' with 'elasticsearch-plain'.

The error should be gone. I'm having other issues; I'm not sure whether they are related to the different versions of ES and the connector. I'll keep working on them.

Connector calls don't wait until it's initialized

Any pointers to code would be appreciated.

I thought I had already accomplished the right flow via these lines of code:

dataSource.connector.connect(callback);
...
ESConnector.prototype.connect = function (callback) {
...
if(self.settings.mappings) {
  self.setupMappings(callback);
}
...

But apparently not.

hasManyThrough get relations

For some reason I am not able to get the related object on hasManyThrough relations.
POST works fine.

But GET /user/:id/related returns an empty array. This works perfectly if I switch to the mongodb connector.
